WorldWideScience

Sample records for bottom-up saliency mediates

  1. Computational versus psychophysical bottom-up image saliency: A comparative evaluation study

    NARCIS (Netherlands)

    Toet, A.

    2011-01-01

    The predictions of 13 computational bottom-up saliency models and a newly introduced Multiscale Contrast Conspicuity (MCC) metric are compared with human visual conspicuity measurements. The agreement between human visual conspicuity estimates and model saliency predictions is quantified through the…

  2. Bottom-Up Visual Saliency Estimation With Deep Autoencoder-Based Sparse Reconstruction.

    Science.gov (United States)

    Xia, Chen; Qi, Fei; Shi, Guangming

    2016-06-01

    Research on visual perception indicates that the human visual system is sensitive to center-surround (C-S) contrast in the bottom-up saliency-driven attention process. Unlike the traditional contrast computation of feature differences, models based on reconstruction have emerged to estimate saliency by starting from the original images themselves instead of seeking certain ad hoc features. However, in the existing reconstruction-based methods, the reconstruction parameters of each area are calculated independently without taking their global correlation into account. In this paper, inspired by the powerful feature learning and data reconstruction ability of deep autoencoders, we construct a deep C-S inference network and train it with data sampled randomly from the entire image to obtain a unified reconstruction pattern for the current image. In this way, global competition in the sampling and learning processes can be integrated into the nonlocal reconstruction and saliency estimation of each pixel, which achieves better detection results than models that consider local and global rarity separately. Moreover, by learning from the current scene, the proposed model can achieve feature extraction and interaction simultaneously in an adaptive way, which yields better generalization to more types of stimuli. Experimental results show that, in accordance with different inputs, the network can learn distinct basic features for saliency modeling in its code layer. Furthermore, in a comprehensive evaluation on several benchmark data sets, the proposed method outperforms existing state-of-the-art algorithms. PMID:26800552
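
    The following is a minimal, illustrative sketch of the reconstruction-error idea summarized above, not the paper's network: a linear autoencoder (truncated SVD) fitted to patches sampled across the whole image stands in for the deep center-surround inference network, and the patch size, component count and normalization are assumptions made for this example.

      # Sketch: saliency as patch reconstruction error under a basis learned
      # from patches sampled across the whole image (global competition).
      # A linear autoencoder (truncated SVD) stands in for the paper's deep
      # C-S network; patch size, component count and normalization are
      # illustrative assumptions.
      import numpy as np

      def reconstruction_saliency(img, patch=8, n_samples=2000, n_comp=16, seed=0):
          rng = np.random.default_rng(seed)
          H, W = img.shape
          ys = rng.integers(0, H - patch, n_samples)
          xs = rng.integers(0, W - patch, n_samples)
          X = np.stack([img[y:y + patch, x:x + patch].ravel() for y, x in zip(ys, xs)])
          mu = X.mean(axis=0)
          _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
          B = Vt[:n_comp]                                  # global reconstruction basis
          sal = np.zeros((H, W))
          cnt = np.full((H, W), 1e-9)
          for y in range(0, H - patch + 1, patch // 2):
              for x in range(0, W - patch + 1, patch // 2):
                  p = img[y:y + patch, x:x + patch].ravel() - mu
                  err = np.linalg.norm(p - B.T @ (B @ p))  # reconstruction error
                  sal[y:y + patch, x:x + patch] += err
                  cnt[y:y + patch, x:x + patch] += 1
          sal /= cnt
          return (sal - sal.min()) / (sal.max() - sal.min() + 1e-9)

      img = np.zeros((64, 64))
      img[24:40, 24:40] = 1.0                              # a pop-out square
      print(float(reconstruction_saliency(img).max()))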

  3. The Roles of Feature-Specific Task Set and Bottom-Up Salience in Attentional Capture: An ERP Study

    Science.gov (United States)

    Eimer, Martin; Kiss, Monika; Press, Clare; Sauter, Disa

    2009-01-01

    We investigated the roles of top-down task set and bottom-up stimulus salience for feature-specific attentional capture. Spatially nonpredictive cues preceded search arrays that included a color-defined target. For target-color singleton cues, behavioral spatial cueing effects were accompanied by cue-induced N2pc components, indicative of…

  4. Modeling Visual Exploration in Rhesus Macaques with Bottom-Up Salience and Oculomotor Statistics.

    Science.gov (United States)

    König, Seth D; Buffalo, Elizabeth A

    2016-01-01

    There is a growing interest in studying biological systems in natural settings, in which experimental stimuli are less artificial and behavior is less controlled. In primate vision research, free viewing of complex images has elucidated novel neural responses, and free viewing in humans has helped discover attentional and behavioral impairments in patients with neurological disorders. In order to fully interpret data collected from free viewing of complex scenes, it is critical to better understand what aspects of the stimuli guide viewing behavior. To this end, we have developed a novel viewing behavior model called a Biased Correlated Random Walk (BCRW) to describe free viewing behavior during the exploration of complex scenes in monkeys. The BCRW can predict fixation locations better than bottom-up salience. Additionally, we show that the BCRW can be used to test hypotheses regarding specific attentional mechanisms. For example, we used the BCRW to examine the source of the central bias in fixation locations. Our analyses suggest that the central bias may be caused by a natural tendency to reorient the eyes toward the center of the stimulus, rather than a photographer's bias to center salient items in a scene. Taken together these data suggest that the BCRW can be used to further our understanding of viewing behavior and attention, and could be useful in optimizing stimulus and task design. PMID:27445721
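
    A minimal simulation sketch of a biased correlated random walk of the kind named above, assuming a simple mixture of persistence (correlation with the previous saccade direction) and a pull toward the stimulus center (bias); the weights, step length and noise level are illustrative assumptions rather than oculomotor statistics fitted to the monkey data.

      # Sketch: a Biased Correlated Random Walk (BCRW) over gaze positions.
      # Persistence (correlation with the previous direction) is mixed with a
      # bias toward the stimulus center; the weights, step length and noise
      # are illustrative, not fitted oculomotor statistics from the study.
      import numpy as np

      def simulate_bcrw(n_steps=200, size=(800, 600), w_corr=0.6, w_bias=0.4,
                        step=40.0, noise=0.5, seed=1):
          rng = np.random.default_rng(seed)
          size = np.array(size, float)
          center = size / 2
          pos = center.copy()
          heading = rng.normal(size=2)
          heading /= np.linalg.norm(heading)
          trajectory = [pos.copy()]
          for _ in range(n_steps):
              to_center = center - pos
              dist = np.linalg.norm(to_center)
              bias = to_center / dist if dist > 1e-9 else np.zeros(2)
              d = w_corr * heading + w_bias * bias + noise * rng.normal(size=2)
              d /= np.linalg.norm(d)
              pos = np.clip(pos + step * d, 0.0, size)     # stay on the stimulus
              heading = d
              trajectory.append(pos.copy())
          return np.array(trajectory)

      fixations = simulate_bcrw()
      print(fixations.mean(axis=0))   # fixations cluster toward the center (central bias)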

  5. Modeling Visual Exploration in Rhesus Macaques with Bottom-Up Salience and Oculomotor Statistics

    Science.gov (United States)

    König, Seth D.; Buffalo, Elizabeth A.

    2016-01-01

    There is a growing interest in studying biological systems in natural settings, in which experimental stimuli are less artificial and behavior is less controlled. In primate vision research, free viewing of complex images has elucidated novel neural responses, and free viewing in humans has helped discover attentional and behavioral impairments in patients with neurological disorders. In order to fully interpret data collected from free viewing of complex scenes, it is critical to better understand what aspects of the stimuli guide viewing behavior. To this end, we have developed a novel viewing behavior model called a Biased Correlated Random Walk (BCRW) to describe free viewing behavior during the exploration of complex scenes in monkeys. The BCRW can predict fixation locations better than bottom-up salience. Additionally, we show that the BCRW can be used to test hypotheses regarding specific attentional mechanisms. For example, we used the BCRW to examine the source of the central bias in fixation locations. Our analyses suggest that the central bias may be caused by a natural tendency to reorient the eyes toward the center of the stimulus, rather than a photographer's bias to center salient items in a scene. Taken together these data suggest that the BCRW can be used to further our understanding of viewing behavior and attention, and could be useful in optimizing stimulus and task design.

  6. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantifies the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies-which are the hallmark of our human visual system-play a vital role in eye movement control, whether we know what we are looking at or not.

  7. Bottom-up and top-down mechanisms indirectly mediate interactions between benthic biotic ecosystem components

    Science.gov (United States)

    Van Colen, Carl; Thrush, Simon F.; Parkes, Samantha; Harris, Rachel; Woodin, Sally A.; Wethey, David S.; Pilditch, Conrad A.; Hewitt, Judi E.; Lohrer, Andrew M.; Vincx, Magda

    2015-04-01

    The loss or decline in population size of key species can instigate a cascade of effects that have implications for interacting species, therewith impacting biodiversity and ecosystem functioning. We examined how top-down and bottom-up interactions may mediate knock-on effects of a coastal deposit-feeding clam, Macomona liliana (hereafter Macomona), on sandflat meiobenthos densities. Therefore we manipulated densities of Macomona in combination with predator exclusion and experimental shading that was expected to alter microphytobenthos biomass. We show that Macomona regulated densities of meiobenthic (38-500 μm) nematodes, copepods, polychaetes, turbellarians, and ostracodes during the three months of incubation via indirect mechanisms. Predator pressure on Macomona by eagle rays (Myliobatis tenuicaudatus) was found to have a negative effect on densities of some meiobenthic taxa. Furthermore, experimental shading resulted in the loss of a positive relation between Macomona and microphytobenthos biomass, while concurrently increasing the density of some meiobenthic taxa. We suggest that this observation can be explained by the release from bioturbation interference effects of the cockle Austrovenus stutchburyi that was found to thrive in the presence of Macomona under non-shaded conditions. Our results highlight the importance of interactions between macrofaunal bioturbation, microphyte biomass, sediment stability, and predation pressure for the structuring of benthic communities. This experiment illustrates that manipulative field experiments may be particularly suitable to study such multiple indirect mechanisms that regulate ecosystem diversity and related functioning because such approaches may best capture the complex feedbacks and processes that determine ecosystem dynamics.

  8. Culture from the Bottom Up

    Science.gov (United States)

    Atkinson, Dwight; Sohn, Jija

    2013-01-01

    The culture concept has been severely criticized for its top-down nature in TESOL, leading arguably to its falling out of favor in the field. But what of the fact that people do "live culturally" (Ingold, 1994)? This article describes a case study of culture from the bottom up--culture as understood and enacted by its individual users.…

  9. Emergence of visual saliency from natural scenes via context-mediated probability distributions coding.

    Directory of Open Access Journals (Sweden)

    Jinhua Xu

    Full Text Available Visual saliency is the perceptual quality that makes some items in visual scenes stand out from their immediate contexts. Visual saliency plays important roles in natural vision in that saliency can direct eye movements, deploy attention, and facilitate tasks like object detection and scene understanding. A central unsolved issue is: What features should be encoded in the early visual cortex for detecting salient features in natural scenes? To explore this important issue, we propose a hypothesis that visual saliency is based on efficient encoding of the probability distributions (PDs of visual variables in specific contexts in natural scenes, referred to as context-mediated PDs in natural scenes. In this concept, computational units in the model of the early visual system do not act as feature detectors but rather as estimators of the context-mediated PDs of a full range of visual variables in natural scenes, which directly give rise to a measure of visual saliency of any input stimulus. To test this hypothesis, we developed a model of the context-mediated PDs in natural scenes using a modified algorithm for independent component analysis (ICA and derived a measure of visual saliency based on these PDs estimated from a set of natural scenes. We demonstrated that visual saliency based on the context-mediated PDs in natural scenes effectively predicts human gaze in free-viewing of both static and dynamic natural scenes. This study suggests that the computation based on the context-mediated PDs of visual variables in natural scenes may underlie the neural mechanism in the early visual cortex for detecting salient features in natural scenes.
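
    A rough sketch of the probability-coding idea described above, with deliberate simplifications: FastICA supplies the visual variables, their probability distributions are estimated with unconditional histograms rather than the paper's context-mediated (conditional) PDs, and the saliency of a patch is scored as the summed negative log-probability of its component responses. The patch size, component count and toy scenes are assumptions of this sketch.

      # Sketch: saliency as improbability of ICA-filter responses under
      # distributions learned from image patches. Simplifications: FastICA
      # supplies the visual variables, PDs are unconditional histograms (not
      # the paper's context-mediated, conditional PDs), and the toy scenes,
      # patch size and component count are assumptions of this sketch.
      import numpy as np
      from sklearn.decomposition import FastICA

      def ica_saliency(train_imgs, test_img, patch=8, n_comp=25, bins=64, seed=0):
          rng = np.random.default_rng(seed)
          # 1) learn an ICA basis from patches sampled across the training scenes
          patches = []
          for im in train_imgs:
              H, W = im.shape
              for _ in range(500):
                  y, x = rng.integers(0, H - patch), rng.integers(0, W - patch)
                  patches.append(im[y:y + patch, x:x + patch].ravel())
          X = np.array(patches)
          mu = X.mean(axis=0)
          ica = FastICA(n_components=n_comp, random_state=seed, max_iter=500)
          S = ica.fit_transform(X - mu)
          # 2) estimate each component's marginal PD with a histogram
          hists = [np.histogram(S[:, j], bins=bins, density=True) for j in range(n_comp)]
          # 3) saliency of a test patch = sum_j -log p_j(response_j)
          H, W = test_img.shape
          sal = np.zeros((H, W))
          for y in range(0, H - patch + 1, patch):
              for x in range(0, W - patch + 1, patch):
                  q = test_img[y:y + patch, x:x + patch].ravel()[None, :] - mu
                  r = ica.transform(q)[0]
                  score = 0.0
                  for j, (h, edges) in enumerate(hists):
                      k = int(np.clip(np.searchsorted(edges, r[j]) - 1, 0, bins - 1))
                      score -= np.log(h[k] + 1e-6)
                  sal[y:y + patch, x:x + patch] = score
          return sal

      rng = np.random.default_rng(1)
      scenes = [rng.random((64, 64)) for _ in range(3)]      # stand-ins for natural scenes
      test = rng.random((64, 64)); test[28:36, 28:36] = 5.0  # an improbable, hence salient, patch
      print(float(ica_saliency(scenes, test).max()))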

  10. Bottom-up approach to silicon nanoelectronics

    OpenAIRE

    Mizumita, Hiroshi; Oda, S

    2005-01-01

    Submitted on behalf of EDA Publishing Association (http://irevues.inist.fr/handle/2042/5920). This paper presents a brief review of our recent work investigating a novel bottom-up approach to realizing silicon-based nanoelectronics. We discuss the fabrication technique, electronic properties and device applications of silicon nanodots as a building block for nanoscale silicon devices.

  11. Bottom-up organic integrated circuits

    NARCIS (Netherlands)

    Smits, Edsger C. P.; Mathijssen, Simon G. J.; van Hal, Paul A.; Setayesh, Sepas; Geuns, Thomas C. T.; Mutsaers, Kees A. H. A.; Cantatore, Eugenio; Wondergem, Harry J.; Werzer, Oliver; Resel, Roland; Kemerink, Martijn; Kirchmeyer, Stephan; Muzafarov, Aziz M.; Ponomarenko, Sergei A.; de Boer, Bert; Blom, Paul W. M.; de Leeuw, Dago M.

    2008-01-01

    Self-assembly - the autonomous organization of components into patterns and structures(1) - is a promising technology for the mass production of organic electronics. Making integrated circuits using a bottom-up approach involving self-assembling molecules was proposed(2) in the 1970s. The basic b…

  12. Bottom-up holographic approach to QCD

    International Nuclear Information System (INIS)

    One of the best-known results of string theory is the idea that some strongly coupled gauge theories may have a dual description in terms of a higher dimensional weakly coupled gravitational theory — the so-called AdS/CFT correspondence or gauge/gravity correspondence. The attempts to apply this idea to real QCD are often referred to as “holographic QCD” or the “AdS/QCD approach”. One direction in this field is to start from real QCD and guess a tentative dual higher dimensional weakly coupled field model following the principles of gauge/gravity correspondence. The ensuing phenomenology can then be developed and compared with experimental data and with various theoretical results. Such a bottom-up holographic approach has turned out to be unexpectedly successful in many cases. In this short review, the technical aspects of the bottom-up holographic approach to QCD are explained, placing the main emphasis on the soft wall model.

  13. Bottom-up holographic approach to QCD

    Energy Technology Data Exchange (ETDEWEB)

    Afonin, S. S. [V. A. Fock Department of Theoretical Physics, Saint Petersburg State University, 1 ul. Ulyanovskaya, 198504 (Russian Federation)

    2016-01-22

    One of the best-known results of string theory is the idea that some strongly coupled gauge theories may have a dual description in terms of a higher dimensional weakly coupled gravitational theory — the so-called AdS/CFT correspondence or gauge/gravity correspondence. The attempts to apply this idea to real QCD are often referred to as “holographic QCD” or the “AdS/QCD approach”. One direction in this field is to start from real QCD and guess a tentative dual higher dimensional weakly coupled field model following the principles of gauge/gravity correspondence. The ensuing phenomenology can then be developed and compared with experimental data and with various theoretical results. Such a bottom-up holographic approach has turned out to be unexpectedly successful in many cases. In this short review, the technical aspects of the bottom-up holographic approach to QCD are explained, placing the main emphasis on the soft wall model.
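
    For orientation, a schematic form of the soft wall setup emphasized in the two records above, written in common textbook conventions; normalizations and sign conventions vary between papers, so this should be read as an illustrative sketch rather than the review's exact expressions:

      S \sim -\frac{1}{4 g_5^2} \int d^4x\, dz\, \sqrt{g}\, e^{-\Phi(z)}\, F_{MN} F^{MN}, \qquad \Phi(z) = c\, z^2,

      on the AdS_5 Poincaré patch; the quadratic dilaton profile yields linear radial Regge trajectories for the vector mesons, m_n^2 = 4c(n+1), n = 0, 1, 2, \ldots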

  14. A Clash of Bottom-Up and Top-Down Processes in Visual Search: The Reversed Letter Effect Revisited

    Science.gov (United States)

    Zhaoping, Li; Frith, Uta

    2011-01-01

    It is harder to find the letter "N" among its mirror reversals than vice versa, an inconvenient finding for bottom-up saliency accounts based on primary visual cortex (V1) mechanisms. However, in line with this account, we found that in dense search arrays, gaze first landed on either target equally fast. Remarkably, after first landing, gaze…

  15. Top Down Chemistry Versus Bottom up Chemistry

    Science.gov (United States)

    Oka, Takeshi; Witt, Adolf N.

    2016-06-01

    The idea of interstellar top-down chemistry (TDC), in which molecules are produced from the decomposition of larger molecules and dust, in contrast to ordinary bottom-up chemistry (BUC), in which molecules are produced synthetically from smaller molecules and atoms in the ISM, has been proposed in the chemistry of PAH and carbon chain molecules for both diffuse [a,c] and dense clouds [b,d]. A simple and natural idea, it must have occurred to many people and has been in the air for some time. The validity of this hypothesis is apparent for diffuse clouds in view of the observed low abundance of small molecules and its rapid decrease with molecular size on the one hand, and the high column densities of large carbon molecules demonstrated by the many intense diffuse interstellar bands (DIBs) on the other. The recent identification of C60^+ as the carrier of 5 near-infrared DIBs with a high column density of 2×10^13 cm^-2 by Maier and others confirms the TDC. This means that the large molecules and dust produced in the high-density, high-temperature environment of circumstellar envelopes are sufficiently stable to survive decomposition due to stellar UV radiation, cosmic rays, C-shocks, etc. for the long time (≥ 10^7 years) of their migration to diffuse clouds, which seems to disagree with the consensus in the field of interstellar grains. The stability of molecules and aggregates in the diffuse interstellar medium will be discussed. References: Duley, W. W. 2006, Faraday Discuss. 133, 415; Zhen, J., Castellanos, P., Paardekooper, D. M., Linnartz, H., Tielens, A. G. G. M. 2014, ApJL, 797, L30; Huang, J., Oka, T. 2015, Mol. Phys. 113, 2159; Guzmán, V. V., Pety, J., Goicoechea, J. R., Gerin, M., Roueff, E., Gratier, P., Öberg, K. I. 2015, ApJL, 800, L33; L. Ziurys has sent us many papers, beginning with Ziurys, L. M. 2006, PNAS 103, 12274, indicating she had long been a proponent of the idea; Campbell, E. K., Holz, M., Maier, J. P., Gerlich, D., Walker, G. A. H., Bohlender, D. 2016, ApJ, in press; Draine, B. T. 2003

  16. Bottom-up Attention Orienting in Young Children with Autism

    Science.gov (United States)

    Amso, Dima; Haas, Sara; Tenenbaum, Elena; Markant, Julie; Sheinkopf, Stephen J.

    2014-01-01

    We examined the impact of simultaneous bottom-up visual influences and meaningful social stimuli on attention orienting in young children with autism spectrum disorders (ASDs). Relative to typically-developing age and sex matched participants, children with ASDs were more influenced by bottom-up visual scene information regardless of whether…

  17. Bottom-up Initiatives for Photovoltaic: Incentives and Barriers

    Directory of Open Access Journals (Sweden)

    Kathrin Reinsberger

    2014-06-01

    Full Text Available When facing the challenge of restructuring the energy system, bottom-up initiatives can aid the diffusion of decentralized and clean energy technologies. We focused here on a bottom-up initiative of citizen-funded and citizen-operated photovoltaic power plants. The project follows a case study-based approach and examines two different community initiatives. The aim is to investigate the potential incentives and barriers relating to participation or non-participation in predefined community PV projects. Qualitative, as well as quantitative empirical research was used to examine the key factors in the further development of bottom-up initiatives as contributors to a general energy transition.

  18. Nanoelectronics: Thermoelectric Phenomena in «Bottom-Up» Approach

    OpenAIRE

    Yu.A. Kruglyak; P.A. Kondratenko; Yu.М. Lopatkin

    2014-01-01

    Thermoelectric phenomena of Seebeck and Peltier, quality indicators and thermoelectric optimization, ballistic and diffusive phonon heat current are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  19. Nanoelectronics: Thermoelectric Phenomena in «Bottom-Up» Approach

    Directory of Open Access Journals (Sweden)

    Yu.A. Kruglyak

    2014-04-01

    Full Text Available Thermoelectric phenomena of Seebeck and Peltier, quality indicators and thermoelectric optimization, ballistic and diffusive phonon heat current are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  20. The Interplay of Top-Down and Bottom-Up

    DEFF Research Database (Denmark)

    Winkler, Till; Brown, Carol V.; Ozturk, Pinar

    2014-01-01

    The exchange of patient health information across different organizations involved in healthcare delivery has potential benefits for a wide range of stakeholders. However, many governments in Europe and in the U.S. have, despite both top-down and bottom-up initiatives, experienced major barriers...... organizations (HIOs) that facilitate HIE across regional stakeholders remains an unanswered question. This research investigates the impacts of top-down and bottom-up initiatives on the evolutionary paths of HIOs in two contingent states in the U.S. (New Jersey and New York) which had different starting...

  1. Bottom-Up Analysis of Single-Case Research Designs

    Science.gov (United States)

    Parker, Richard I.; Vannest, Kimberly J.

    2012-01-01

    This paper defines and promotes the qualities of a "bottom-up" approach to single-case research (SCR) data analysis. Although "top-down" models, for example, multi-level or hierarchical linear models, are gaining momentum and have much to offer, interventionists should be cautious about analyses that are not easily understood, are not governed by…

  2. Reading Nature from a "Bottom-Up" Perspective

    Science.gov (United States)

    Magntorn, Ola; Hellden, Gustav

    2007-01-01

    This paper reports on a study of ecology teaching and learning in a Swedish primary school class (age 10-11 yrs). A teaching sequence was designed to help students read nature in a river ecosystem. The teaching sequence had a "bottom up" approach, taking as its starting point a common key organism--the freshwater shrimp. From this species and its…

  3. Two Dimensional Polymerization of Graphene Oxide: Bottom-up Approach

    OpenAIRE

    Atanasov, Victor; Russev, Stoyan; Lyutov, Lyudmil; Zagranyarski, Yulian; Dimitrova, Iglika; Avdeev, Georgy; Avramova, Ivalina; Vulcheva, Evgenia; Kirilov, Kiril; Tzonev, Atanas; Abrashev, Miroslav; Tsutsumanova, Gichka

    2012-01-01

    We demonstrate a bottom-up synthesis of structures similar to graphene oxide via a two dimensional polymerization. Experimental evidence and discussion are conveyed as well as a general framework for this two dimensional polymerization. The proposed morphologies and lattice structures of these graphene oxides are derived from aldol condensation of alternating three nucleophilic and three electrophilic centers of benzenetriol.

  4. Effects of salience are both short- and long-lived

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund; Lagerkvist, Carl Johan

    2015-01-01

    necessarily override the latter. Instead, we find that the maximum effect of salience on the likelihood of making a saccade to the target cue is delayed until about 20 saccades after stimulus onset and that the effects of salience and valence are additive rather than multiplicative. Further, we find...... that in the positive and neutral valence conditions, salience continues to exert pressure on saccadic latency, i.e. the interval between saccades to the target with high salience targets being fixated faster than low salience targets. Our findings challenge the assumption that top down and bottom up processes operate...

  5. Bottom-up effects on attention capture and choice

    DEFF Research Database (Denmark)

    Peschel, Anne; Orquin, Jacob Lund; Mueller Loose, Simone

    Attention processes and decision making are accepted to be closely linked together because only information that is attended to can be incorporated in the decision process. Little is known however, to which extent bottom-up processes of attention affect stimulus selection and therefore...... the information available to form a decision. Does changing one visual cue in the stimulus set affect attention towards this cue and what does that mean for the choice outcome? To address this, we conducted a combined eye tracking and choice experiment in a consumer choice setting with visual shelf simulations...... salient. The observed effect on attention also carries over into increased choice likelihood. From these results, we conclude that even small changes in the choice capture attention based on bottom-up processes. Also for eye tracking studies in other domains (e.g. search tasks) this means that stimulus...

  6. Bottom-up Budgeting FY 2015 Assessment: Camarines Sur

    OpenAIRE

    Maramot, Joyce Anne; Yasay, Donald B.; de Guzman, Reinier

    2015-01-01

    Bottom-up budgeting (BUB) is an adaptation of the participatory budgeting model in identifying and providing solutions to poverty at the municipal/city level. Leaders of civil society organizations engage with LGU officials in formulating a poverty alleviation plan to be considered in preparing the budget of national agencies the following fiscal year. This paper reports on how the guideline was implemented in three municipalities in Camarines Sur. The study then presents suggestions and reco...

  7. Bottom-up approaches for defining future climate mitigation commitments

    Energy Technology Data Exchange (ETDEWEB)

    Den Elzen, M.G.J.; Berk, M.M.

    2004-07-01

    This report analyses a number of alternative, bottom-up approaches: technology and performance standards; technology research and development agreements; sectoral targets (national/transnational); sector-based Clean Development Mechanism (CDM); and sustainable development policies and measures (SD-PAMs). A more bottom-up approach for defining national emission targets, the so-called Triptych approach, is also explored and compared with more top-down types of approaches (Multi-Stage and Contraction and Convergence) based on a quantitative and qualitative analysis. While bottom-up approaches are concluded to be valuable components of a future climate regime, they do not, in themselves, seem to offer a real alternative to emission reduction and limitation targets, as they provide little certainty about the overall environmental effectiveness of climate policies. In comparison with the Multi-Stage and C and C approaches, the global Triptych approach offers the opportunity of early participation by developing countries without the risk of creating large amounts of surplus emissions as in C and C; in using the approach we also avoid the need for dividing up the non-Annex I countries as in Multi-Stage. However, there will be substantial implementation problems related to the institutional and technical capabilities required. Thus it would seem better to exclude the least developed countries and have them first participate in some of the alternative bottom-up approaches.

  8. Magic for Filter Optimization in Dynamic Bottom-up Processing

    CERN Document Server

    Minnen, G

    1996-01-01

    Off-line compilation of logic grammars using Magic allows an incorporation of filtering into the logic underlying the grammar. The explicit definite clause characterization of filtering resulting from Magic compilation allows processor independent and logically clean optimizations of dynamic bottom-up processing with respect to goal-directedness. Two filter optimizations based on the program transformation technique of Unfolding are discussed which are of practical and theoretical interest.

  9. On an elementary definition of visual saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Various approaches to computational modelling of bottom-up visual attention have been proposed in the past two decades. As part of this trend, researchers have studied ways to characterize the saliency map underlying many of these models. In more recent years, several definitions based on probabilistic and information or decision theoretic considerations have been proposed. These provide experimentally successful, appealing, low-level, operational, and elementary definitions of visual saliency (see eg, Bruce, 2005 Neurocomputing 65 125 - 133). Here, I demonstrate that, in fact, all…

  10. Una implementación computacional de un modelo de atención visual Bottom-up aplicado a escenas naturales/A Computational Implementation of a Bottom-up Visual Attention Model Applied to Natural Scenes

    Directory of Open Access Journals (Sweden)

    Juan F. Ramírez Villegas

    2011-12-01

    Full Text Available The bottom-up visual attention model proposed by Itti et al., 2000 [1], has been a popular model since it exhibits certain neurobiological evidence of primate vision. This work complements the computational model of this phenomenon with the realistic dynamics of a neural network. The approach is based on the existence of topographical maps representing the saliency of the objects in the visual field, which are combined into a general representation (the saliency map); this representation is the input to a dynamic neural network with local and global collaborative and competitive interactions that converge on the main particularities (objects) of the scene.
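
    A compact sketch of the center-surround core of the Itti-style bottom-up model that the record above builds on, restricted to a single intensity channel; the dynamic neural network stage is replaced here by plain max-normalization and averaging, and the scales are illustrative assumptions.

      # Sketch: an Itti-style intensity conspicuity map from center-surround
      # differences across Gaussian scales; the dynamic neural network stage
      # of the record above is replaced by crude max-normalization and
      # averaging, and the scales are illustrative.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def center_surround_saliency(img, centers=(1, 2), deltas=(3, 4)):
          img = img.astype(float)
          maps = []
          for c in centers:
              fc = gaussian_filter(img, sigma=2 ** c)
              for d in deltas:
                  fs = gaussian_filter(img, sigma=2 ** (c + d))
                  m = np.abs(fc - fs)                  # center-surround difference
                  maps.append(m / (m.max() + 1e-9))    # crude normalization
          return sum(maps) / len(maps)                 # conspicuity (saliency) map

      img = np.zeros((128, 128))
      img[60:70, 60:70] = 1.0
      print(int(center_surround_saliency(img).argmax()))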

  11. Recent progress in backreacted bottom-up holographic QCD

    Energy Technology Data Exchange (ETDEWEB)

    Järvinen, Matti [Laboratoire de Physique Théorique, École Normale Supérieure, 24 rue Lhomond, 75231 Paris Cedex 05 (France)

    2016-01-22

    Recent progress in constructing holographic models for QCD is discussed, concentrating on the bottom-up models which implement holographically the renormalization group flow of QCD. The dynamics of gluons can be modeled by using a string-inspired model termed improved holographic QCD, and flavor can be added by introducing space filling branes in this model. The flavor fully backreacts to the glue in the Veneziano limit, giving rise to a class of models which are called V-QCD. The phase diagrams and spectra of V-QCD are in good agreement with results for QCD obtained by other methods.

  12. Bottom-up graphene nanoribbon field-effect transistors

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Patrick B. [Applied Science and Technology, University of California, Berkeley, California 94720 (United States); Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States); Pedramrazi, Zahra [Department of Physics, University of California, Berkeley, California 94720 (United States); Madani, Ali [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States); Chen, Yen-Chia; Crommie, Michael F. [Department of Physics, University of California, Berkeley, California 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratories, Berkeley, California 94720 (United States); Oteyza, Dimas G. de [Department of Physics, University of California, Berkeley, California 94720 (United States); Centro de Física de Materiales CSIC/UPV-EHU-Materials Physics Center, San Sebastián E-20018 (Spain); Chen, Chen [Department of Chemistry, University of California, Berkeley, California 94720 (United States); Fischer, Felix R. [Department of Chemistry, University of California, Berkeley, California 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratories, Berkeley, California 94720 (United States); Bokor, Jeffrey, E-mail: jbokor@eecs.berkeley.edu [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratories, Berkeley, California 94720 (United States)

    2013-12-16

    Recently developed processes have enabled bottom-up chemical synthesis of graphene nanoribbons (GNRs) with precise atomic structure. These GNRs are ideal candidates for electronic devices because of their uniformity, extremely narrow width below 1 nm, atomically perfect edge structure, and desirable electronic properties. Here, we demonstrate nano-scale chemically synthesized GNR field-effect transistors, made possible by development of a reliable layer transfer process. We observe strong environmental sensitivity and unique transport behavior characteristic of sub-1 nm width GNRs.

  13. Wikipedia: organisation from a bottom-up approach

    OpenAIRE

    Spek, Sander; Postma, Eric; Herik, H. Jaap van den

    2006-01-01

    Wikipedia can be considered an extreme form of a self-managing team, as a means of labour division. One could expect that this bottom-up approach, with the absence of top-down organisational control, would lead to chaos, but our analysis shows that this is not the case. In the Dutch Wikipedia, an integrated and coherent data structure is created, while at the same time users succeed in distributing roles by self-selection. Some users focus on an area of expertise, while others edit over ...

  14. Distinguishing Top-Down From Bottom-Up Effects

    OpenAIRE

    Shea, Nicholas

    2015-01-01

    The distinction between top-down and bottom-up effects is widely relied on in experimental psychology. However, there is an important problem with the way it is normally defined. Top-down effects are effects of previously-stored information on processing the current input. But on the face of it that includes the information that is implicit in the operation of any psychological process – in its dispositions to transition from some types of representational state to others. This paper suggests...

  15. Stability of leadership in bottom-up hierarchical organizations

    CERN Document Server

    Galam, S

    2007-01-01

    The stability of a leadership against a growing internal opposition is studied in bottom-up hierarchical organizations. Using a very simple model with bottom-up majority rule voting, the dynamics of power distribution at the various hierarchical levels is calculated within a probabilistic framework. Given a leadership at the top, the opposition weight from the hierarchy bottom is shown to fall off quickly while climbing up the hierarchy. It reaches zero after only a few hierarchical levels. Indeed the voting process is found to obey a threshold dynamics with a deterministic top outcome. Accordingly the leadership may stay stable against very large amplitude increases in the opposition at the bottom level. An opposition can thus grow steadily from a few percent up to seventy-seven percent with not a single change at the elected top level. However and in contrast, from one election to another, in the vicinity of the threshold, less than a one percent additional shift at the bottom level can drive a drastic an...
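
    A minimal sketch of the bottom-up majority-rule dynamics described above: the opposition share p at one level is mapped to the next level through local majority votes. Voting groups of size 3 are assumed here (which sidesteps tie-breaking); the sketch only reproduces the qualitative threshold behaviour, not the paper's specific figures.

      # Sketch: bottom-up majority-rule voting through hierarchy levels.
      # With voting groups of size 3 (an assumption made to avoid the
      # tie-breaking rule), an opposition share p at one level becomes
      # p^3 + 3 p^2 (1 - p) at the next; a sharp threshold at 50% and a
      # rapid fall-off while climbing the hierarchy follow.
      def climb(p0, levels=6):
          shares = [p0]
          for _ in range(levels):
              p = shares[-1]
              shares.append(p ** 3 + 3 * p ** 2 * (1 - p))
          return shares

      for p0 in (0.30, 0.45, 0.55):
          print(p0, [round(p, 3) for p in climb(p0)])
      # 0.30 and 0.45 collapse toward 0 (the leadership stays in place),
      # while 0.55 grows toward 1 (the top eventually flips).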

  16. Inverse Magnetic Catalysis in Bottom-Up Holographic QCD

    CERN Document Server

    Evans, Nick; Scott, Marc

    2016-01-01

    We explore the effect of magnetic field on chiral condensation in QCD via a simple bottom up holographic model which inputs QCD dynamics through the running of the anomalous dimension of the quark bilinear. Bottom up holography is a form of effective field theory and we use it to explore the dependence on the coefficients of the two lowest order terms linking the magnetic field and the quark condensate. In the massless theory, we identify a region of parameter space where magnetic catalysis occurs at zero temperature but inverse magnetic catalysis at temperatures of order the thermal phase transition. The model shows similar non-monotonic behaviour in the condensate with B at intermediate T as the lattice data. This behaviour is due to the separation of the meson melting and chiral transitions in the holographic framework. The introduction of quark mass raises the scale of B where inverse catalysis takes over from catalysis until the inverse catalysis lies outside the regime of validity of the effective descr...

  17. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming eXiu

    2015-04-01

    Full Text Available This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data were acquired during incidental learning of positive (‘happy’), neutral and negative (‘angry’ or ‘fearful’) faces. Dynamic Causal Modeling (DCM) was applied to the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and areas related to successful memory formation (hippocampus, superior parietal lobule, amygdala and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feedforward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially those from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex, the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  18. Fast full resolution saliency detection based on incoherent imaging system

    Science.gov (United States)

    Lin, Guang; Zhao, Jufeng; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2016-08-01

    Image saliency detection is widely applied in many tasks in the field of computer vision. In this paper, we combine saliency detection with Fourier optics to accelerate the saliency detection algorithm. An actual optical saliency detection system is constructed within the framework of an incoherent imaging system. Additionally, the application of our system to implement the bottom-up rapid pre-saliency process of primate visual saliency is discussed with a dual-resolution camera. A set of experiments on our system are conducted and discussed. We also present comparisons between our method and purely computational methods. The results show our system can produce full-resolution saliency maps faster and more effectively.

  19. Fast full resolution saliency detection based on incoherent imaging system

    Science.gov (United States)

    Lin, Guang; Zhao, Jufeng; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2016-05-01

    Image saliency detection is widely applied in many tasks in the field of computer vision. In this paper, we combine saliency detection with Fourier optics to accelerate the saliency detection algorithm. An actual optical saliency detection system is constructed within the framework of an incoherent imaging system. Additionally, the application of our system to implement the bottom-up rapid pre-saliency process of primate visual saliency is discussed with a dual-resolution camera. A set of experiments on our system are conducted and discussed. We also present comparisons between our method and purely computational methods. The results show our system can produce full-resolution saliency maps faster and more effectively.
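
    For comparison with the purely computational methods mentioned above, here is a computational frequency-domain sketch of a full-resolution pre-saliency map, obtained by discarding the amplitude spectrum and keeping only phase before smoothing; this is a generic whitening trick and not the paper's optical (incoherent imaging) implementation.

      # Sketch: a computational full-resolution pre-saliency map obtained by
      # keeping only the phase of the Fourier spectrum (a generic amplitude
      # whitening trick) and smoothing the result. This is NOT the paper's
      # optical incoherent-imaging implementation; sigma is illustrative.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def phase_saliency(img, sigma=3.0):
          F = np.fft.fft2(img.astype(float))
          phase_only = np.fft.ifft2(np.exp(1j * np.angle(F)))
          return gaussian_filter(np.abs(phase_only) ** 2, sigma)

      img = np.zeros((128, 128))
      img[40:56, 40:56] = 1.0
      print(int(phase_saliency(img).argmax()))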

  20. Bottom-up fabrication of graphene nanostructures on Ru(1010).

    Science.gov (United States)

    Song, Junjie; Zhang, Han-jie; Cai, Yiliang; Zhang, Yuxi; Bao, Shining; He, Pimo

    2016-02-01

    Investigations on the bottom-up fabrication of graphene nanostructures with 10, 10'-dibromo-9, 9'-bianthryl (DBBA) as a precursor on Ru(1010) were carried out using scanning tunnelling microscopy (STM) and density functional theory (DFT) calculations. Upon annealing the sample at submonolayer DBBA coverage, N = 7 graphene nanoribbons (GNRs) aligned along the [1210] direction form. Higher DBBA coverage and higher annealing temperature lead to the merging of GNRs into ribbon-like graphene nanoflakes with multiple orientations. These nanoflakes show different Moiré patterns, and their structures were determined by DFT simulations. The results showed that GNRs possess growth preference on the Ru(1010) substrate with a rectangular unit cell, and GNRs with armchair and zigzag boundaries are obtainable. Further DFT calculations suggest that the interaction between graphene and the substrate controls the orientations of the graphene overlayer and the growth of graphene on Ru(1010).

  1. BitCube: A Bottom-Up Cubing Engineering

    Science.gov (United States)

    Ferro, Alfredo; Giugno, Rosalba; Puglisi, Piera Laura; Pulvirenti, Alfredo

    Enhancing on-line analytical processing through efficient cube computation plays a key role in data warehouse management. Hashing, grouping and mining techniques are commonly used to improve cube pre-computation. BitCube, a fast cubing method which uses bitmaps as inverted indexes for grouping, is presented. It horizontally partitions data according to the values of one dimension and, for each resulting fragment, performs grouping following bottom-up criteria. BitCube also allows partial materialization based on iceberg conditions to treat large datasets for which a full cube pre-computation is too expensive. The space requirement of bitmaps is optimized by applying an adaptation of the WAH compression technique. Experimental analysis, on both synthetic and real datasets, shows that BitCube outperforms previous algorithms for full cube computation and is comparable on iceberg cubing.
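
    A toy sketch of the bitmap-as-inverted-index grouping idea named above: each (dimension, value) pair maps to a bitmask over row identifiers, a cuboid cell is the AND of its masks, and COUNT(*) is the popcount. The data and column names are invented for illustration, and the WAH compression and iceberg pruning discussed in the record are omitted.

      # Sketch: bitmaps as inverted indexes for bottom-up group-by counting.
      # Each (dimension, value) pair gets a bitmask over row ids; a cuboid
      # cell is the AND of its masks and COUNT(*) is the popcount. The data
      # and column names are made up; WAH compression and iceberg pruning
      # from the record above are omitted.
      rows = [("red", "A", 2010), ("red", "B", 2010),
              ("blue", "A", 2011), ("red", "A", 2011)]
      dims = ["color", "store", "year"]

      bitmaps = {}                                # one bitmap per (dimension, value)
      for i, row in enumerate(rows):
          for d, v in zip(dims, row):
              bitmaps[(d, v)] = bitmaps.get((d, v), 0) | (1 << i)

      def cell_count(*conds):
          """COUNT(*) for a cuboid cell given as ((dimension, value), ...)."""
          mask = (1 << len(rows)) - 1             # start with every row
          for cond in conds:
              mask &= bitmaps[cond]               # intersect inverted indexes
          return bin(mask).count("1")             # popcount = row count

      print(cell_count(("color", "red")))                   # 3
      print(cell_count(("color", "red"), ("year", 2010)))   # 2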

  2. Contextualised ICT4D: a Bottom-Up Approach

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Sutinen, Erkki

    2010-01-01

    The term ICT4D refers to the opportunities of Information and Communication Technology (ICT) as an agent of development. Much of the research in the field is based on evaluating the feasibility of existing technologies, mostly of Western or Asian origin, in the context of developing countries. In a certain way, this agenda can be understood as a top-down approach which transfers technology in a hierarchical way to actual users. Complementary to the traditional approach, a bottom-up approach starts by identifying communities that are ready to participate in a process to use technology to transform their own strengths to new levels by designing appropriate technologies with experts of technology and design. The bottom-up approach requires a new kind of ICT education at the undergraduate level. An example of the development of a contextualized IT degree program at Tumaini University in Tanzania shows…

  3. Making the results of bottom-up energy savings comparable

    Directory of Open Access Journals (Sweden)

    Moser Simon

    2012-01-01

    Full Text Available The Energy Service Directive (ESD has pushed forward the issue of energy savings calculations without clarifying the methodological basis. Savings achieved in the Member States are calculated with rather non-transparent and hardly comparable Bottom-up (BU methods. This paper develops the idea of parallel evaluation tracks separating the Member States’ issue of ESD verification and comparable savings calculations. Comparability is ensured by developing a standardised BU calculation kernel for different energy efficiency improvement (EEI actions which simultaneously depicts the different calculation options in a structured way (e.g. baseline definition, system boundaries, double counting. Due to the heterogeneity of BU calculations the approach requires a central database where Member States feed in input data on BU actions according to a predefined structure. The paper demonstrates the proposed approach including a concrete example of application.

  4. Bottom-Up Discrete Symmetries for Cabibbo Mixing

    CERN Document Server

    Varzielas, Ivo de Medeiros; Talbert, Jim

    2016-01-01

    We perform a bottom-up search for discrete non-Abelian symmetries capable of quantizing the Cabibbo angle that parameterizes CKM mixing. Given a particular Abelian symmetry structure in the up and down sectors, we construct representations of the associated residual generators which explicitly depend on the degrees of freedom present in our effective mixing matrix. We then discretize those degrees of freedom and utilize the Groups, Algorithms, Programming (GAP) package to close the associated finite groups. This short study is performed in the context of recent results indicating that, without resorting to special model-dependent corrections, no small-order finite group can simultaneously predict all four parameters of the three-generation CKM matrix and that only groups of $\\mathcal{O}(10^{2})$ can predict the analogous parameters of the leptonic PMNS matrix, regardless of whether neutrinos are Dirac or Majorana particles. Therefore a natural model of flavour might instead incorporate small(er) finite groups...

  5. Top-down or bottom-up forecasting?

    Directory of Open Access Journals (Sweden)

    Peter Wanke

    2007-01-01

    Full Text Available The operations literature remains inconclusive as to the most appropriate sales forecasting approach (Top-Down or Bottom-Up) for the determination of safety inventory levels. This paper presents analytical results for the variance of the sales forecasting errors during the lead time under both approaches. The forecasting method used was Simple Exponential Smoothing, and the results led to the identification of two supplementary impacts on the forecasting error variance, and consequently on safety inventory levels: the Portfolio Effect and the Anchoring Effect. The first depends on the correlation coefficient of demand between two individual items and the latter depends on the smoothing constant and on the participation of the individual item in total sales. It is also analysed under which conditions these variables would favour one forecasting approach over the other.
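
    A small simulation sketch of the comparison discussed above: two correlated demand series are forecast with Simple Exponential Smoothing either bottom-up (forecast each item, then sum) or top-down (forecast the total), and the aggregate error variances are compared. The demand process, correlation and smoothing constant are illustrative assumptions, and the paper's analytical Portfolio and Anchoring expressions are not reproduced here.

      # Sketch: compare aggregate forecast-error variance under bottom-up
      # (forecast each item, then sum) and top-down (forecast the total)
      # Simple Exponential Smoothing. The demand process, correlation and
      # alpha are illustrative; the paper's analytical Portfolio and
      # Anchoring expressions are not coded here.
      import numpy as np

      def ses(series, alpha=0.2):
          """One-step-ahead Simple Exponential Smoothing forecasts."""
          f = np.empty_like(series, dtype=float)
          f[0] = series[0]
          for t in range(1, len(series)):
              f[t] = alpha * series[t - 1] + (1 - alpha) * f[t - 1]
          return f

      rng = np.random.default_rng(0)
      rho, n = 0.5, 5000
      demand = rng.multivariate_normal([100, 50], [[1.0, rho], [rho, 1.0]], size=n)

      total = demand.sum(axis=1)
      bottom_up = ses(demand[:, 0]) + ses(demand[:, 1])
      top_down = ses(total)
      print("bottom-up aggregate error variance:", np.var(total - bottom_up))
      print("top-down  aggregate error variance:", np.var(total - top_down))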

  6. Nonplanar conductive surfaces via "bottom-up" nanostructured gold coating.

    Science.gov (United States)

    Vinod, T P; Jelinek, Raz

    2014-03-12

    Development of technologies for the construction of bent, curved, and flexible conductive surfaces is among the most important albeit challenging goals in the promising field of "flexible electronics". We present a generic solution-based "bottom-up" approach for assembling conductive gold nanostructured layers on nonplanar polymer surfaces. The simple two-step experimental scheme is based upon incubation of an amine-displaying polymer [the abundantly used poly(dimethylsiloxane) (PDMS), selected here as a proof of concept] with Au(SCN)4(-), followed by a brief treatment with a conductive polymer [poly(3,4-ethylenedioxythiophene)/poly(styrenesulfonate)] solution. Importantly, no reducing agent is co-added to the gold complex solution. The resultant surfaces are conductive and exhibit a unique "nanoribbon" gold morphology. The scheme yields conductive layers upon PDMS in varied configurations: planar, "wrinkled", and mechanically bent surfaces. The technology is simple, inexpensive, and easy to implement for varied polymer surfaces (and other substances), opening the way for practical applications in flexible electronics and related fields. PMID:24548243

  7. BUEES:a bottom-up event extraction system

    Institute of Scientific and Technical Information of China (English)

    Xiao DING; Bing QIN; Ting LIU

    2015-01-01

    Traditional event extraction systems focus mainly on event type identification and event participant extraction based on pre-specified event type paradigms and manually annotated corpora. However, different domains have different event type paradigms. When transferring to a new domain, we have to build a new event type paradigm and annotate a new corpus from scratch. This kind of conventional event extraction system requires massive human effort, and hence prevents event extraction from being widely applicable. In this paper, we present BUEES, a bottom-up event extraction system, which extracts events from the web in a completely unsupervised way. The system automatically builds an event type paradigm in the input corpus, and then proceeds to extract a large number of instance patterns of these events. Subsequently, the system extracts event arguments according to these patterns. By conducting a series of experiments, we demonstrate the good performance of BUEES and compare it to a state-of-the-art Chinese event extraction system, i.e., a supervised event extraction system. Experimental results show that BUEES performs comparably (5% higher F-measure in event type identification and 3% higher F-measure in event argument extraction), but without any human effort.

  8. Bottom-Up Synthesis and Sensor Applications of Biomimetic Nanostructures

    Directory of Open Access Journals (Sweden)

    Li Wang

    2016-01-01

    Full Text Available The combination of nanotechnology, biology, and bioengineering has greatly advanced the development of nanomaterials with unique functions and properties. Biomolecules, as nanoscale building blocks, play very important roles in the final formation of functional nanostructures. Many kinds of novel nanostructures have been created by using bioinspired self-assembly and subsequent binding with various nanoparticles. In this review, we summarize studies on the fabrication and sensor applications of biomimetic nanostructures. Strategies for creating different bottom-up nanostructures using biomolecules such as DNA, proteins, peptides, and viruses, as well as microorganisms such as bacteria and plant leaves, are introduced. In addition, the potential applications of the synthesized biomimetic nanostructures for colorimetry, fluorescence, surface plasmon resonance, surface-enhanced Raman scattering, electrical resistance, electrochemistry, and quartz crystal microbalance sensors are presented. This review will, on the one hand, promote the understanding of the relationships between biomolecules/microorganisms and functional nanomaterials and, on the other, guide the design and synthesis of biomimetic nanomaterials with unique properties in the future.

  9. Spatiochromatic Context Modeling for Color Saliency Analysis.

    Science.gov (United States)

    Zhang, Jun; Wang, Meng; Zhang, Shengping; Li, Xuelong; Wu, Xindong

    2016-06-01

    Visual saliency is one of the most noteworthy perceptual abilities of human vision. Recent progress in cognitive psychology suggests that: 1) visual saliency analysis is mainly completed by the bottom-up mechanism consisting of feedforward low-level processing in primary visual cortex (area V1) and 2) color interacts with spatial cues and is influenced by the neighborhood context, and thus it plays an important role in visual saliency analysis. From a computational perspective, most existing saliency modeling approaches exploit multiple independent visual cues, irrespective of their interactions (or their interactions are not computed explicitly), and ignore contextual influences induced by neighboring colors. In addition, the use of color is often underestimated in visual saliency analysis. In this paper, we propose a simple yet effective color saliency model that considers color as the only visual cue and mimics the color processing in V1. Our approach uses region-/boundary-defined color features with spatiochromatic filtering by considering local color-orientation interactions, and therefore captures homogeneous color elements, subtle textures within the object and the overall salient object from the color image. To account for color contextual influences, we present a divisive normalization method for chromatic stimuli through the pooling of contrary/complementary color units. We further define a color perceptual metric over the entire scene to produce saliency maps for color regions and color boundaries individually. These maps are finally globally integrated into a single saliency map. The final saliency map is produced by Gaussian blurring for robustness. We evaluate the proposed method on both synthetic stimuli and several benchmark saliency data sets from visual saliency analysis to salient object detection. The experimental results demonstrate that the use of color as a unique visual cue achieves competitive results on par with or better than 12 state…
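
    A minimal sketch of the divisive normalization step mentioned above: a chromatic response map is divided by a Gaussian pool over a contrary/complementary channel, which implements contextual suppression by neighboring colors. The pooling width and semi-saturation constant are illustrative assumptions, and this is not the paper's full spatiochromatic pipeline.

      # Sketch: divisive normalization of a chromatic response map by a
      # Gaussian pool over a contrary/complementary channel, standing in for
      # the contextual-influence step described above. sigma and the
      # semi-saturation constant k are illustrative assumptions.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def divisive_normalize(response, opponent, sigma=5.0, k=0.1):
          pool = gaussian_filter(np.abs(opponent), sigma)   # neighborhood context
          return response / (k + pool)

      red_green = np.zeros((64, 64)); red_green[30:34, 30:34] = 1.0   # toy opponent maps
      green_red = 0.2 * np.ones((64, 64))
      print(float(divisive_normalize(red_green, green_red).max()))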

  10. Towards three-dimensional visual saliency

    OpenAIRE

    Sharma, Puneet

    2014-01-01

    A salient image region is defined as an image part that is clearly different from its surround in terms of a number of attributes. In bottom-up processing, these attributes are defined as: contrast, color difference, brightness, and orientation. By measuring these attributes, visual saliency algorithms aim to predict the regions in an image that would attract our attention under free viewing conditions, i.e., when the observer is viewing an image without a specific task such as searching for ...

  11. Bottom-Up Colloidal Crystal Assembly with a Twist.

    Science.gov (United States)

    Mahynski, Nathan A; Rovigatti, Lorenzo; Likos, Christos N; Panagiotopoulos, Athanassios Z

    2016-05-24

    Globally ordered colloidal crystal lattices have broad utility in a wide range of optical and catalytic devices, for example, as photonic band gap materials. However, the self-assembly of stereospecific structures is often confounded by polymorphism. Small free-energy differences often characterize ensembles of different structures, making it difficult to produce a single morphology at will. Current techniques to handle this problem adopt one of two approaches: that of the "top-down" or "bottom-up" methodology, whereby structures are engineered starting from the largest or smallest relevant length scales, respectively. However, recently, a third approach for directing high fidelity assembly of colloidal crystals has been suggested which relies on the introduction of polymer cosolutes into the crystal phase [Mahynski, N.; Panagiotopoulos, A. Z.; Meng, D.; Kumar, S. K. Nat. Commun. 2014, 5, 4472]. By tuning the polymer's morphology to interact uniquely with the void symmetry of a single desired crystal, the entropy loss associated with polymer confinement has been shown to strongly bias the formation of that phase. However, previously, this approach has only been demonstrated in the limiting case of close-packed crystals. Here, we show how this approach may be generalized and extended to complex open crystals, illustrating the utility of this "structure-directing agent" paradigm in engineering the nanoscale structure of ordered colloidal materials. The high degree of transferability of this paradigm's basic principles between relatively simple crystals and more complex ones suggests that this represents a valuable addition to presently known self-assembly techniques. PMID:27124487

  12. Building Models from the Bottom Up: The HOBBES Project

    Science.gov (United States)

    Medellin-Azuara, J.; Sandoval Solis, S.; Lund, J. R.; Chu, W.

    2013-12-01

    Water problems are often bigger than the technical and data challenges associated with representing a water system in a model. Controversy and complexity are inherent when water is to be allocated among different uses, making it difficult to maintain coherent and productive discussions on addressing water problems. Quantification of a water supply system through models has proven helpful for improving understanding and for exploring and developing adaptable solutions to water problems. However, models often become too large and complex, and become hostage to endless discussions of their assumptions, algorithms and limitations. Data management, organization and documentation keep a model flexible and useful over time. The UC Davis HOBBES project is a new approach, building models from the bottom up. Reversing traditional model development, where data are arranged around a model algorithm, in HOBBES the data structure, organization and documentation are established first, followed by application of simulation or optimization modeling algorithms for a particular problem at hand. The HOBBES project establishes standards for storing, documenting and sharing datasets on the California water system. This allows models to be developed and modified more easily and transparently, with greater comparability. Elements in the database have a spatial definition and can aggregate several infrastructural elements into detailed to coarse representations of the water system. Elements in the database represent reservoirs, groundwater basins, pumping stations, hydropower and water treatment facilities, demand areas and conveyance infrastructure statewide. These elements also host time series, economic and other information from hydrologic, economic, climate and other models. This presentation provides an overview of the HOBBES project, its applications and prospects for California and elsewhere.

  13. Saliency computation via whitened frequency band selection.

    Science.gov (United States)

    Lv, Qi; Wang, Bin; Zhang, Liming

    2016-06-01

    Many saliency computational models have been proposed to simulate the bottom-up visual attention mechanism of the human visual system. However, most of them only deal with certain kinds of images or aim at specific applications. In fact, human beings have the ability to correctly select attentive focuses of objects with arbitrary sizes within any scene. This paper proposes a new bottom-up computational model from the perspective of the frequency domain, based on the biological discovery of the non-Classical Receptive Field (nCRF) in the retina. A saliency map can be obtained according to the idea of the Extended Classical Receptive Field. The model is composed of three major steps: first, decompose the input image into several feature maps representing different frequency bands that cover the whole frequency domain by utilizing Gabor wavelets. Second, whiten the feature maps to highlight the embedded saliency information. Third, select some optimal maps, simulating the response of the receptive field, especially the nCRF, to generate the saliency map. Experimental results show that the proposed algorithm works stably and with outstanding performance in a variety of situations, as human beings do, and is adaptive to both psychological patterns and natural images. Beyond that, the biological plausibility of the nCRF and the Gabor wavelet transform makes this approach reliable. PMID:27275381
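
    To make the three steps listed in this record concrete (band decomposition, whitening, band selection), here is a minimal Python sketch. It substitutes isotropic FFT ring filters for a full Gabor wavelet bank and a kurtosis-based sparsity score for the nCRF-inspired selection rule, so it should be read as an approximation under those assumptions rather than as the published model.

```python
# Approximate sketch of the three-step pipeline (decompose, whiten, select); ring-shaped
# FFT band filters replace the Gabor wavelet bank and a kurtosis score replaces the
# nCRF-inspired selection rule, so this is an assumption-laden stand-in, not the model.
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.stats import kurtosis

def band_maps(gray, n_bands=5):
    """Decompose an image into frequency bands via ring masks in the FFT domain."""
    F = np.fft.fftshift(np.fft.fft2(gray))
    h, w = gray.shape
    fy = np.fft.fftshift(np.fft.fftfreq(h))[:, None]
    fx = np.fft.fftshift(np.fft.fftfreq(w))[None, :]
    radius = np.sqrt(fy ** 2 + fx ** 2)
    edges = np.linspace(0.0, radius.max() + 1e-9, n_bands + 1)
    return [np.fft.ifft2(np.fft.ifftshift(F * ((radius >= lo) & (radius < hi))))
            for lo, hi in zip(edges[:-1], edges[1:])]

def whiten(band):
    """Keep the band's phase but flatten (whiten) its amplitude spectrum."""
    B = np.fft.fft2(band)
    return np.real(np.fft.ifft2(B / (np.abs(B) + 1e-8)))

def saliency(gray, n_keep=2, blur=4.0):
    whitened = [whiten(b) for b in band_maps(gray)]
    scores = [kurtosis(m.ravel()) for m in whitened]   # prefer sparse (peaky) bands
    keep = np.argsort(scores)[-n_keep:]
    sal = gaussian_filter(sum(whitened[i] ** 2 for i in keep), blur)
    return (sal - sal.min()) / (np.ptp(sal) + 1e-12)

if __name__ == "__main__":
    print(saliency(np.random.rand(128, 128)).shape)
```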

  14. Charge transport in bottom-up inorganic-organic and quantum-coherent nanostructures

    NARCIS (Netherlands)

    Makarenko, Ksenia Sergeevna

    2015-01-01

    This thesis is based on results obtained from experiments designed for a consistent study of charge transport in bottom-up inorganic-organic and quantum-coherent nanostructures. New unconventional ways to build elements of electrical circuits (like dielectrophoresis, wedging transfer and bottom-up f

  15. Visual saliency computations: mechanisms, constraints, and the effect of feedback.

    Science.gov (United States)

    Soltani, Alireza; Koch, Christof

    2010-09-22

    The primate visual system continuously selects spatially circumscribed regions, features or objects for further processing. These selection mechanisms--collectively termed selective visual attention--are guided by intrinsic, bottom-up and by task-dependent, top-down signals. While much psychophysical research has shown that overt and covert attention is partially allocated based on saliency-driven exogenous signals, it is unclear how this is accomplished at the neuronal level. Recent electrophysiological experiments in monkeys point to the gradual emergence of saliency signals when ascending the dorsal visual stream and to the influence of top-down attention on these signals. To elucidate the neural mechanisms underlying these observations, we construct a biologically plausible network of spiking neurons to simulate the formation of saliency signals in different cortical areas. We find that saliency signals are rapidly generated through lateral excitation and inhibition in successive layers of neural populations selective to a single feature. These signals can be improved by feedback from a higher cortical area that represents a saliency map. In addition, we show how top-down attention can affect the saliency signals by disrupting this feedback through its action on the saliency map. While we find that saliency computations require dominant slow NMDA currents, the signal rapidly emerges from successive regions of the network. In conclusion, using a detailed spiking network model we find biophysical mechanisms and limitations of saliency computations which can be tested experimentally. PMID:20861387

  16. Fabricating ordered functional nanostructures onto polycrystalline substrates from the bottom-up

    Energy Technology Data Exchange (ETDEWEB)

    Torres, Maria, E-mail: mtorres@drexel.edu; Pardo, Lorena; Ricote, Jesus [Instituto de Ciencia de Materiales de Madrid (Spain); Fuentes-Cobas, Luis E. [Centro de Investigacion en Materiales Avanzados (Mexico); Rodriguez, Brian J. [University College Dublin, Belfield, School of Physics (Ireland); Calzada, M. Lourdes, E-mail: lcalzada@icmm.csic.es [Instituto de Ciencia de Materiales de Madrid (Spain)

    2012-10-15

    Microemulsion-mediated synthesis has emerged as a powerful bottom-up procedure for the preparation of ferroelectric nanostructures onto substrates. However, periodical order has yet to be achieved onto polycrystalline Pt-coated Si substrates. Here, we report a new methodology that involves microemulsion-mediated synthesis and the controlled modification of the surface of the substrate by coating it with a template-layer of water-micelles. This layer modifies the surface tension of the substrate and yields a periodic arrangement of ferroelectric crystalline nanostructures. The size of the nanostructures is decreased to the sub-50 nm range and they show a hexagonal order up to the third neighbors, which corresponds to a density of 275 Gb in⁻². The structural analysis of the nanostructures by synchrotron X-ray diffraction confirms that the nanostructures have a PbTiO₃ perovskite structure, with lattice parameters of a = b = 3.890(0) Å and c = 4.056(7) Å. Piezoresponse force microscopy confirmed the ferro-piezoelectric character of the nanostructures. This simple methodology is valid for the self-assembly of other functional oxides onto polycrystalline substrates, enabling their reliable integration into micro/nano devices.

  17. Cooperation between Top-Down and Bottom-Up Theorem Provers by Subgoal Clause Transfer

    OpenAIRE

    Fuchs, Dirk

    1999-01-01

    Top-down and bottom-up theorem proving approaches have each specific advantages and disadvantages. Bottom-up provers profit from strong redundancy control and suffer from the lack of goal-orientation, whereas top-down provers are goal-oriented but have weak calculi when their proof lengths are considered. In order to integrate both approaches our method is to achieve cooperation between a top-down and a bottom-up prover: The top-down prover generates subgoal clauses, then they are processed by a ...

  18. Mapping practices of project management – merging top-down and bottom-up perspectives

    DEFF Research Database (Denmark)

    Thuesen, Christian

    2015-01-01

    This paper presents a new methodology for studying different accounts of project management practices based on network mapping and analysis. Drawing upon network mapping and visualization as an analytical strategy, top-down and bottom-up accounts of project management practice are analysed and compared. The analysis initially reveals a substantial difference between the top-down and bottom-up accounts of practice. Furthermore it identifies a soft side of project management that is central in the bottom-up account but absent from the top-down. Finally, the study shows that network mapping is a promising strategy for visualizing and analysing different accounts of project management practices.

  19. Bottom-up and top-down effects on plant communities

    DEFF Research Database (Denmark)

    Souza, Lara; Zelikova, Tamara Jane; Sanders, Nate

    2016-01-01

    Top-down effects of herbivores and bottom-up effects of nutrients shape productivity and diversity across ecosystems, yet their single and combined effects on spatial and temporal beta diversity are unknown. We established a field experiment in which the abundance of insect herbivores (top-down) and soil nitrogen (bottom-up) were manipulated over six years in an existing old-field community. We tracked plant α and β diversity - within-plot richness and among-plot biodiversity - and aboveground net primary productivity (ANPP) over the course of the experiment. We found that bottom-up factors ... herbivores did not alter plant richness (α diversity) yet consistently promoted Shannon's evenness, relative to plots where insect herbivores were present. Further, insect herbivores promoted spatial-temporal β diversity. Overall, we found that the relative importance of top-down and bottom-up controls ...

  20. A Bottom up Initiative: Meditation & Mindfulness 'Eastern' Practices in the "Western" Academia

    DEFF Research Database (Denmark)

    Singla, Rashmi

    ... a case of a bottom-up initiative, where the students themselves have demanded the inclusion of non-conventional psychosocial interventions, illustrated by meditation and mindfulness as Eastern psychological practices, thus filling the gap related to existential, spiritual approaches. The western ...

  1. Social and ethical checkpoints for bottom-up synthetic biology, or protocells.

    Science.gov (United States)

    Bedau, Mark A; Parke, Emily C; Tangen, Uwe; Hantsche-Tangen, Brigitte

    2009-12-01

    An alternative to creating novel organisms through the traditional "top-down" approach to synthetic biology involves creating them from the "bottom up" by assembling them from non-living components; the products of this approach are called "protocells." In this paper we describe how bottom-up and top-down synthetic biology differ, review the current state of protocell research and development, and examine the unique ethical, social, and regulatory issues raised by bottom-up synthetic biology. Protocells have not yet been developed, but many expect this to happen within the next five to ten years. Accordingly, we identify six key checkpoints in protocell development at which particular attention should be given to specific ethical, social and regulatory issues concerning bottom-up synthetic biology, and make ten recommendations for responsible protocell science that are tied to the achievement of these checkpoints. PMID:19816801

  2. The updated bottom up solution applied to atmospheric pressure photoionization and electrospray ionization mass spectrometry

    Science.gov (United States)

    The Updated Bottom Up Solution (UBUS) was recently applied to atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) of triacylglycerols (TAGs). This report demonstrates that the UBUS applies equally well to atmospheric pressure photoionization (APPI) MS and to electrospray ionizatio...

  3. Methane and ethane from global oil and gas production: bottom-up simulations over three decades

    OpenAIRE

    L. Höglund-Isaksson

    2016-01-01

    Existing bottom-up emission inventories of historical methane and ethane emissions from global oil and gas systems do not explain well the year-on-year variations estimated by top-down models from atmospheric measurements. This paper develops a bottom-up methodology which allows for country- and year-specific source attribution of methane and ethane emissions from global oil and natural gas production for the period 1980 to 2012. The analysis rests on country-specific simulations of associated ga...

  4. Mapping practices of project management – merging top-down and bottom-up perspectives

    OpenAIRE

    Thuesen, Christian

    2015-01-01

    This paper presents a new methodology for studying different accounts of project management practices based on network mapping and analysis. Drawing upon network mapping and visualization as an analytical strategy top-down and bottom-up accounts of project management practice are analysed and compared. The analysis initially reveals a substantial difference between the top-down and bottom-up accounts of practice. Furthermore it identifies a soft side of project management that is central in t...

  5. Bottoms Up

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    China’s high-end liquor is becoming a luxury item and a favorite among collectors. Spring Festival, the most important festival for the Chinese, is a time for celebration—and what would a celebration be without bottles of holi-

  6. Saliency-Based Fidelity Adaptation Preprocessing for Video Coding

    Institute of Scientific and Technical Information of China (English)

    Shao-Ping Lu; Song-Hai Zhang

    2011-01-01

    In this paper, we present a video coding scheme which applies the technique of visual saliency computation to adjust image fidelity before compression. To extract visually salient features, we construct a spatio-temporal saliency map by analyzing the video using a combined bottom-up and top-down visual saliency model. We then use an extended bilateral filter, in which the local intensity and spatial scales are adjusted according to visual saliency, to adaptively alter the image fidelity. Our implementation is based on the H.264 video encoder JM12.0. Besides evaluating our scheme with the H.264 reference software, we also compare it to a more traditional foreground-background segmentation-based method and a foveation-based approach which employs Gaussian blurring. Our results show that the proposed algorithm can improve the compression ratio significantly while effectively preserving perceptual visual quality.
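
    A hedged sketch of the preprocessing idea described above (reduce fidelity where saliency is low): the paper uses an extended bilateral filter whose local intensity and spatial scales follow a spatio-temporal saliency map, which is approximated here by blending a lightly and a heavily bilateral-filtered frame according to a caller-supplied saliency map. Function names, parameter values and the stand-in saliency map in the usage example are assumptions.

```python
# Hedged approximation of saliency-adaptive fidelity reduction before encoding; the blend
# of a weak and a strong bilateral filter stands in for the paper's extended bilateral
# filter, and the saliency map in the demo is a made-up placeholder.
import cv2
import numpy as np

def adaptive_prefilter(frame_bgr, saliency, d=9, weak=(20, 5), strong=(80, 20)):
    """Salient pixels keep detail (weak filtering); non-salient pixels are smoothed more."""
    weakly = cv2.bilateralFilter(frame_bgr, d, weak[0], weak[1])
    strongly = cv2.bilateralFilter(frame_bgr, d, strong[0], strong[1])
    w = np.clip(saliency, 0.0, 1.0)[..., None]                    # H x W x 1 weights
    out = w * weakly.astype(np.float32) + (1.0 - w) * strongly.astype(np.float32)
    return out.astype(np.uint8)

if __name__ == "__main__":
    frame = (np.random.rand(120, 160, 3) * 255).astype(np.uint8)  # stand-in video frame
    sal = np.zeros((120, 160), np.float32)
    sal[40:80, 60:120] = 1.0                                      # pretend-salient region
    print(adaptive_prefilter(frame, sal).shape)
```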

  7. the effect of intergroup threat and social identity salience on the belief in conspiracy theories over terrorism in indonesia: collective angst as a mediator

    Directory of Open Access Journals (Sweden)

    Ali Mashuri

    2015-01-01

    The present study tested how intergroup threat (high versus low) and social identity as a Muslim (salient versus non-salient) affected belief in conspiracy theories. Data among Indonesian Muslim students (N = 139) from this study demonstrated that intergroup threat and social identity salience interacted to influence belief in conspiracy theories. High intergroup threat triggered greater belief in conspiracy theories than low intergroup threat, more prominently in the condition in which participants’ Muslim identity was made salient. Collective angst also proved to mediate the effect of intergroup threat on the belief. However, in line with the prediction, evidence of this mediation effect of collective angst was found only in the salient social identity condition. Discussion of these research findings covers both theoretical and practical implications.

  8. Visual Saliency Computations: Mechanisms, Constraints, and the Effect of Feedback

    OpenAIRE

    Soltani, Alireza; Koch, Christof

    2010-01-01

    The primate visual system continuously selects spatially circumscribed regions, features or objects for further processing. These selection mechanisms—collectively termed selective visual attention—are guided by intrinsic, bottom-up and by task-dependent, top-down signals. While much psychophysical research has shown that overt and covert attention is partially allocated based on saliency-driven exogenous signals, it is unclear how this is accomplished at the neuronal level. Recent electrophysiolo...

  9. High thermoelectric figure of merit nanostructured pnictogen chalcogenides by bottom-up synthesis and assembly

    Science.gov (United States)

    Mehta, Rutvik J.

    Thermoelectric materials offer promise for realizing transformative environmentally friendly solid-state refrigeration technologies that could replace current technologies based on ozone-depleting liquid coolants. The fruition of this vision requires factorial enhancements in the figure of merit (ZT) of thermoelectric materials, necessitating a high Seebeck coefficient (α), high electrical conductivity (σ) and low thermal conductivity (κ). This thesis reports a novel bottom-up approach to scalably sculpt large quantities (>10 g/minute) of V₂VI₃ nanocrystals with controllable shapes and sizes, and assemble them into bulk samples to obtain both high power factors α²σ as well as unprecedentedly low κ through tunable doping and nanostructuring. The thesis demonstrates a surfactant-mediated microwave-solvothermal synthesis technique that selectively yields both n- and p-type pnictogen chalcogenide (Bi₂Te₃, Sb₂Te₃, Bi₂Se₃) nanoplates, and nanowires and nanotubes (Sb₂Se₃), that can be sintered to obtain 25-250% increases in ZT > 1 compared to their non-nanostructured and un-doped counterparts. A key result is that nanostructuring diminishes the lattice thermal conductivity κ_L to ultra-low values of 0.2-0.5 W m⁻¹ K⁻¹. Sub-atomic-percent sulfur doping and sulfurization of the pnictogen chalcogenides, induced through mercaptan-terminated organic surfactants used in the synthesis, result in large Seebeck coefficients between -240 ... nanocomposites by mixing nanoplates of different materials (e.g., S-doped Sb₂Te₃ and S-doped Bi₂Te₃) and forming heterostructures of metals and chalcogenides. The thesis finally demonstrates the extendibility of the novel synthesis and assembly approach to tailor the thermoelectric properties of other non-traditional thermoelectric materials systems.

  10. Increased performance in a bottom-up designed robot by experimentally guided redesign

    DEFF Research Database (Denmark)

    Larsen, Jørgen Christian

    2013-01-01

    Purpose – Using a bottom-up, model-free approach when building robots is often seen as a less scientific way, compared to a top-down model-based approach, because the results are not easily generalizable to other systems. The authors, however, hypothesize that this problem may be addressed by using solid experimental methods. The purpose of this paper is to show how well-known experimental methods from bio-mechanics are used to measure and locate weaknesses in a bottom-up, model-free implementation of a quadruped walker and come up with a better solution. Design/methodology/approach – To study the

  11. Social and ethical checkpoints for bottom-up synthetic biology, or protocells

    OpenAIRE

    Bedau M.A.; Parke E.C.; Tangen U.; Hantsche-Tangen B.

    2009-01-01

    An alternative to creating novel organisms through the traditional “top-down” approach to synthetic biology involves creating them from the “bottom up” by assembling them from non-living components; the products of this approach are called “protocells.” In this paper we describe how bottom-up and top-down synthetic biology differ, review the current state of protocell research and development, and examine the unique ethical, social, and regulatory issues raised by bottom-up synthetic biology....

  12. Bottom-up or top-down in dream neuroscience? A top-down critique of two bottom-up studies.

    Science.gov (United States)

    Foulkes, David; Domhoff, G William

    2014-07-01

    Recent neuroscientific studies of dreaming, specifically those in relation to waking sensory-motor impairments, but also more generally, betray a faulty understanding of the sort of process that dreaming is. They adhere to the belief that dreaming is a bottom-up phenomenon, whose form and content is dictated by sensory-motor brain stem activity, rather than a top-down process initiated and controlled by higher-level cognitive systems. But empirical data strongly support the latter alternative, and refute the conceptualization and interpretation of recent studies of dreaming in sensory-motor impairment in particular and of recent dream neuroscience in general. PMID:24905546

  13. A constraint-based bottom-up counterpart to definite clause grammars

    DEFF Research Database (Denmark)

    Christiansen, Henning

    2004-01-01

    A new grammar formalism, CHR Grammars (CHRG), is proposed that provides a constraint-solving approach to language analysis, built on top of the programming language of Constraint Handling Rules in the same way as Definite Clause Grammars (DCG) on Prolog. CHRG works bottom-up and adds the following...

  14. A novel bottom-up process to produce drug nanocrystals : Controlled crystallization during freeze-drying

    NARCIS (Netherlands)

    de Waard, H; Hinrichs, W L J; Frijlink, H W

    2008-01-01

    To improve the dissolution behavior of lipophilic drugs, a novel bottom-up process based upon freeze drying which allows for the production of nanocrystalline particles was developed: "controlled crystallization during freeze drying". This novel process could strongly increase the dissolution behavi

  15. Nanoelectronics: the Hall Effect and Measurement of Electrochemical Potentials by «Bottom-Up» Approach

    OpenAIRE

    Yu.A. Kruglyak; P.A. Kondratenko; Yu.М. Lopatkin

    2015-01-01

    Classical and quantum Hall effects, measurement of electrochemical potentials, the Landauer formulas and Buttiker formula, measurement of Hall potential, an account of magnetic field in the NEGF method, quantum Hall effect, Landau method, and edge states in graphene are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  16. Bottom-Up Molecular Tunneling Junctions Formed by Self-Assembly

    NARCIS (Netherlands)

    Zhang, Yanxi; Zhao, Zhiyuan; Fracasso, Davide; Chiechi, Ryan C

    2014-01-01

    This Minireview focuses on bottom-up molecular tunneling junctions - a fundamental component of molecular electronics - that are formed by self-assembly. These junctions are part of devices that, in part, fabricate themselves, and therefore, are particularly dependent on the chemistry of the molecul

  17. Learning affects top down and bottom up modulation of eye movements in decision making

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund; Bagger, Martin; Mueller Loose, Simone

    2013-01-01

    ... different information presentation formats. We thereby operationalized top-down and bottom-up control as the effect of individual utility levels and presentation formats on attention capture on a trial-by-trial basis. The experiment revealed an increase in top-down control of eye movements over time...

  18. An integrated top-down and bottom-up strategy for characterization protein isoforms and modifications

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Si; Tolic, Nikola; Tian, Zhixin; Robinson, Errol W.; Pasa-Tolic, Ljiljana

    2011-04-15

    Bottom-up and top-down strategies are two commonly used methods for mass spectrometry (MS) based protein identification; each method has its own advantages and disadvantages. In this chapter, we describe an integrated top-down and bottom-up approach facilitated by concurrent liquid chromatography-mass spectrometry (LC-MS) analysis and fraction collection for comprehensive high-throughput intact protein profiling. The approach employs a high resolution reversed phase (RP) LC separation coupled with LC eluent fraction collection and concurrent on-line MS with a high field (12 Tesla) Fourier-transform ion cyclotron resonance (FTICR) mass spectrometer. Protein elution profiles and tentative modified protein identifications are made using detected intact protein masses in conjunction with bottom-up protein identifications from the enzymatic digestion and analysis of corresponding LC fractions. Specific proteins of biological interest are incorporated into a target ion list for subsequent off-line gas-phase fragmentation that uses an aliquot of the original collected LC fraction, an aliquot of which was also used for bottom-up analysis.
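
    The toy Python sketch below illustrates the kind of matching step this record describes, without claiming to reproduce the authors' pipeline: observed intact masses from the top-down run are compared, within a ppm tolerance, against theoretical masses of proteins identified bottom-up in the same LC fraction, and residual mass shifts are tentatively annotated with a few common modifications. The protein names and masses in the example are invented; only the modification masses are standard monoisotopic values.

```python
# Toy illustration of matching top-down intact masses to bottom-up identifications; the
# proteins and their masses are invented, the tolerance is arbitrary, and only the
# modification masses are standard monoisotopic values.
PPM_TOL = 10.0
COMMON_MODS = {"oxidation": 15.9949, "acetylation": 42.0106, "phosphorylation": 79.9663}

def ppm(observed, theoretical):
    return 1e6 * (observed - theoretical) / theoretical

def annotate(observed_mass, candidates):
    """candidates: dict of protein name -> theoretical intact monoisotopic mass (Da)."""
    hits = []
    for name, theo in candidates.items():
        if abs(ppm(observed_mass, theo)) <= PPM_TOL:
            hits.append((name, "unmodified"))
            continue
        for mod, mod_mass in COMMON_MODS.items():
            if abs(ppm(observed_mass, theo + mod_mass)) <= PPM_TOL:
                hits.append((name, mod))
    return hits

if __name__ == "__main__":
    bottom_up_ids = {"protA": 11370.52, "protB": 14532.88}   # hypothetical proteins
    for obs in (11370.53, 14612.84):                         # hypothetical intact masses
        print(obs, annotate(obs, bottom_up_ids))
```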

  19. Ways toward a European Vocational Education and Training Space: A "Bottom-Up" Approach

    Science.gov (United States)

    Blings, Jessica; Spottl, Georg

    2008-01-01

    Purpose: This paper seeks to concentrate on bottom-up approaches in order to promote a European vocational education and training (VET) concept. The overall aim of this article is to demonstrate that sophisticated approaches still have a chance of becoming common practice in European countries. Design/methodology/approach: The centre of the…

  20. Managing Bottom up Strategizing : Collective Sensemaking of Strategic Issues in a Dutch Bank

    NARCIS (Netherlands)

    van der Steen, Martijn

    2016-01-01

    This paper discusses a bottom-up approach to strategizing in two member banks of a Dutch cooperative bank. In both banks, through a collective process of sensemaking, organisational participants evaluated their day-to-day experiences in order to identify strategic issues. The potential benefits of s

  1. Bottom-up GGM algorithm for constructing multiple layered hierarchical gene regulatory networks

    Science.gov (United States)

    Multilayered hierarchical gene regulatory networks (ML-hGRNs) are very important for understanding genetics regulation of biological pathways. However, there are currently no computational algorithms available for directly building ML-hGRNs that regulate biological pathways. A bottom-up graphic Gaus...

  2. Nanoelectronics: the Hall Effect and Measurement of Electrochemical Potentials by «Bottom-Up» Approach

    Directory of Open Access Journals (Sweden)

    Yu.A. Kruglyak

    2015-06-01

    Classical and quantum Hall effects, measurement of electrochemical potentials, the Landauer formulas and Buttiker formula, measurement of Hall potential, an account of magnetic field in the NEGF method, quantum Hall effect, Landau method, and edge states in graphene are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  3. Oriented bottom-up growth of armchair graphene nanoribbons on germanium

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, Michael Scott; Jacobberger, Robert Michael

    2016-03-15

    Graphene nanoribbon arrays, methods of growing graphene nanoribbon arrays and electronic and photonic devices incorporating the graphene nanoribbon arrays are provided. The graphene nanoribbons in the arrays are formed using a scalable, bottom-up, chemical vapor deposition (CVD) technique in which the (001) facet of the germanium is used to orient the graphene nanoribbon crystals along the [110] directions of the germanium.

  4. A proto-object-based computational model for visual saliency.

    Science.gov (United States)

    Yanulevskaya, Victoria; Uijlings, Jasper; Geusebroek, Jan-Mark; Sebe, Nicu; Smeulders, Arnold

    2013-01-01

    State-of-the-art bottom-up saliency models often assign high saliency values at or near high-contrast edges, whereas people tend to look within the regions delineated by those edges, namely the objects. To resolve this inconsistency, in this work we estimate saliency at the level of coherent image regions. According to object-based attention theory, the human brain groups similar pixels into coherent regions, which are called proto-objects. The saliency of these proto-objects is estimated and incorporated together. As usual, attention is given to the most salient image regions. In this paper we employ state-of-the-art computer vision techniques to implement a proto-object-based model for visual attention. Particularly, a hierarchical image segmentation algorithm is used to extract proto-objects. The two most powerful ways to estimate saliency, rarity-based and contrast-based saliency, are generalized to assess the saliency at the proto-object level. The rarity-based saliency assesses if the proto-object contains rare or outstanding details. The contrast-based saliency estimates how much the proto-object differs from the surroundings. However, not all image regions with high contrast to the surroundings attract human attention. We take this into account by distinguishing between external and internal contrast-based saliency. Where the external contrast-based saliency estimates the difference between the proto-object and the rest of the image, the internal contrast-based saliency estimates the complexity of the proto-object itself. We evaluate the performance of the proposed method and its components on two challenging eye-fixation datasets (Judd, Ehinger, Durand, & Torralba, 2009; Subramanian, Katti, Sebe, Kankanhalli, & Chua, 2010). The results show the importance of rarity-based and both external and internal contrast-based saliency in fixation prediction. Moreover, the comparison with state-of-the-art computational models for visual saliency demonstrates the
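
    As an illustration of the proto-object idea (not the authors' code), the sketch below uses SLIC superpixels as stand-in proto-objects and scores each region with a simple external contrast term (distance of its mean color to the global mean color) plus an internal complexity term (color variance within the region). Names, parameters and the segmentation choice are assumptions made for the example.

```python
# Illustrative sketch only: SLIC superpixels stand in for hierarchically segmented
# proto-objects; "external" contrast is approximated by distance to the global mean color
# and "internal" contrast by within-region variance. Names and parameters are assumptions.
import numpy as np
from skimage.segmentation import slic

def proto_object_saliency(rgb, n_segments=200):
    labels = slic(rgb, n_segments=n_segments, compactness=10, start_label=0)
    pixels = rgb.reshape(-1, 3).astype(np.float64)
    flat = labels.ravel()
    global_mean = pixels.mean(axis=0)
    scores = np.zeros(labels.max() + 1)
    for r in range(labels.max() + 1):
        region = pixels[flat == r]
        if region.size == 0:
            continue
        external = np.linalg.norm(region.mean(axis=0) - global_mean)  # region vs. scene
        internal = region.std(axis=0).mean()                          # complexity inside
        scores[r] = external + internal
    scores = (scores - scores.min()) / (np.ptp(scores) + 1e-12)
    return scores[labels]                                             # per-pixel saliency

if __name__ == "__main__":
    img = (np.random.rand(120, 160, 3) * 255).astype(np.uint8)
    print(proto_object_saliency(img).shape)
```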

  5. The life cycle of bottom-up ideas : case studies of the companies where the simulation game method was applied

    OpenAIRE

    Forssén, Minna

    2002-01-01

    The main aim of this thesis is to study the life cycle of incremental "bottom-up" ideas, which concern process and organizational matters. According to earlier studies, bottom-up ideas are not always successfully used and managed, and there is also a need for more study of organizational and process innovations. It is therefore useful to study this phenomenon further and gain more information about how organizations manage the development and implementation of these bottom-up ideas. ...

  6. Spectral saliency via automatic adaptive amplitude spectrum analysis

    Science.gov (United States)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. The performance of quantitative and qualitative comparisons is evaluated by three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
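
    The following Python sketch illustrates the general spectrum scale-space mechanism this record builds on: the amplitude spectrum is smoothed at several scales, each smoothed spectrum is inverted into a candidate saliency map, and one map is selected automatically. The minimum-entropy selection rule used here is a common stand-in and not the new criterion or the adaptive weighted combination proposed in the paper.

```python
# Sketch of spectrum scale-space saliency: smooth the amplitude spectrum at several
# scales and keep the map chosen by a simple minimum-entropy rule (a stand-in criterion,
# not the one proposed in the record above).
import numpy as np
from scipy.ndimage import gaussian_filter

def scale_space_saliency(gray, scales=(1, 2, 4, 8, 16), blur=4.0):
    F = np.fft.fft2(gray)
    amp, phase = np.abs(F), np.angle(F)
    candidates = []
    for s in scales:
        smoothed = gaussian_filter(amp, s)                   # suppress repeated patterns
        rec = np.fft.ifft2(smoothed * np.exp(1j * phase))
        sal = gaussian_filter(np.abs(rec) ** 2, blur)
        candidates.append((sal - sal.min()) / (np.ptp(sal) + 1e-12))

    def entropy(m, bins=64):
        hist, _ = np.histogram(m, bins=bins, range=(0.0, 1.0), density=True)
        p = hist / (hist.sum() + 1e-12)
        return -np.sum(p * np.log2(p + 1e-12))

    return min(candidates, key=entropy)                      # most "concentrated" map

if __name__ == "__main__":
    print(scale_space_saliency(np.random.rand(128, 128)).shape)
```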

  7. The generation of myricetin-nicotinamide nanococrystals by top down and bottom up technologies.

    Science.gov (United States)

    Liu, Mingyu; Hong, Chao; Li, Guowen; Ma, Ping; Xie, Yan

    2016-09-30

    Myricetin-nicotinamide (MYR-NIC) nanococrystal preparation methods were developed and optimized using both top-down and bottom-up approaches. The grinding (top-down) method successfully achieved nanococrystals, but there were some micrometer-range particles and aggregation. The key consideration of the grinding technology was to control the milling time to determine a balance between the particle size and distribution. In contrast, a modified bottom-up approach based on a solution method in conjunction with sonochemistry resulted in a uniform MYR-NIC nanococrystal that was confirmed by powder X-ray diffraction, scanning electron microscopy, dynamic light scattering, and differential scanning calorimetry, and the particle dissolution rate and amount were significantly greater than those of the MYR-NIC cocrystal. Notably, this was a simple method without the addition of any non-solvent. We anticipate our findings will provide some guidance for future nanococrystal preparation as well as its application in both the chemical and pharmaceutical areas. PMID:27535365

  8. A balance of bottom-up and top-down in linking climate policies

    Science.gov (United States)

    Green, Jessica F.; Sterner, Thomas; Wagner, Gernot

    2014-12-01

    Top-down climate negotiations embodied by the Kyoto Protocol have all but stalled, chiefly because of disagreements over targets and objections to financial transfers. To avoid those problems, many have shifted their focus to linkage of bottom-up climate policies such as regional carbon markets. This approach is appealing, but we identify four obstacles to successful linkage: different levels of ambition; competing domestic policy objectives; objections to financial transfers; and the difficulty of close regulatory coordination. Even with a more decentralized approach, overcoming the 'global warming gridlock' of the intergovernmental negotiations will require close international coordination. We demonstrate how a balance of bottom-up and top-down elements can create a path toward an effective global climate architecture.

  9. Mindfulness meditation associated with alterations in bottom-up processing: psychophysiological evidence for reduced reactivity.

    Science.gov (United States)

    van den Hurk, Paul A M; Janssen, Barbara H; Giommi, Fabio; Barendregt, Henk P; Gielen, Stan C

    2010-11-01

    Mental training by meditation has been related to changes in high-level cognitive functions that involve top-down processing. The aim of this study was to investigate whether the practice of meditation is also related to alterations in low-level, bottom-up processing. Therefore, intersensory facilitation (IF) effects in a group of mindfulness meditators (MM) were compared to IF effects in an age- and gender-matched control group. Smaller and even absent IF effects were found in the MM group, which suggests that changes in bottom-up processing are associated with MM. Furthermore, reduced interference of a visual warning stimulus with the IF effects was found, which suggests an improved allocation of attentional resources in mindfulness meditators, even across modalities.

  10. The generation of myricetin-nicotinamide nanococrystals by top down and bottom up technologies

    Science.gov (United States)

    Liu, Mingyu; Hong, Chao; Li, Guowen; Ma, Ping; Xie, Yan

    2016-09-01

    Myricetin-nicotinamide (MYR-NIC) nanococrystal preparation methods were developed and optimized using both top-down and bottom-up approaches. The grinding (top-down) method successfully achieved nanococrystals, but there were some micrometer-range particles and aggregation. The key consideration of the grinding technology was to control the milling time to determine a balance between the particle size and distribution. In contrast, a modified bottom-up approach based on a solution method in conjunction with sonochemistry resulted in a uniform MYR-NIC nanococrystal that was confirmed by powder X-ray diffraction, scanning electron microscopy, dynamic light scattering, and differential scanning calorimetry, and the particle dissolution rate and amount were significantly greater than those of the MYR-NIC cocrystal. Notably, this was a simple method without the addition of any non-solvent. We anticipate our findings will provide some guidance for future nanococrystal preparation as well as its application in both the chemical and pharmaceutical areas.

  11. A VHDL-AMS Modeling Methodology for Top-Down/Bottom-Up Design of RF Systems

    OpenAIRE

    Maehne, Torsten; Vachoux, Alain; Giroud, Frédéric; Contaldo, Matteo

    2009-01-01

    This paper presents a modelling methodology for the top-down/bottom-up design of RF systems based on systematic use of VHDL-AMS models. The model interfaces are parameterizable and pin-accurate. The designer can choose to parameterize the models using performance specifications or device parameters back-annotated from the transistor-level implementation. The abstraction level used for the description of the respective analog/digital component behavior has been chosen to achieve a good t...

  12. Transition UGent: a bottom-up initiative towards a more sustainable university

    OpenAIRE

    Block, Thomas; Van de Velde, Riet

    2016-01-01

    The vibrant think-tank ‘Transition UGent’ engaged over 250 academics, students and people from the university management in suggesting objectives and actions for the Sustainability Policy of Ghent University (Belgium). Founded in 2012, this bottom-up initiative succeeded in placing sustainability high on the policy agenda of our university. Through discussions within 9 working groups and using the transition management method, Transition UGent developed system analyses, sustainability visions a...

  13. Integrating top down policies and bottom up practices in Urban and Periurban Agriculture: an Italian dilemma

    OpenAIRE

    Cinà, Giuseppe; Di Iacovo, Francesco

    2015-01-01

    The paper deals with some relevant and contradictory aspects of urban and peri-urban agriculture in Italy: the traditional exclusion of agricultural areas from the goals of territorial planning; the separation between top-down policies and bottom-up practices; the lack of agricultural policies at local scale. In the first part the paper summarises the weak relation between urban planning and agriculture, showing how in Italy this gap has been only partially overcome by new laws and plans. Mor...

  14. Reforming the taxation of multijurisdictional enterprises in Europe: coopetition in a bottom-up federation

    OpenAIRE

    Gérard, Marcel

    2006-01-01

    This paper investigates replacing separate taxation by consolidation and formulary apportionment in a Bottom-up Federation, when a multijurisdictional firm is mobile in various respects. The reform is decided cooperatively by all the jurisdictions or by some of them, while tax rates remain within the competence of each jurisdiction. The paper sets forth the conditions for the reform to be social welfare enhancing, while not increasing tax competition. Among them, the formula should emphasize ...

  15. A computational study of liposome logic: towards cellular computing from the bottom up

    OpenAIRE

    Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron; Krasnogor, Natalio

    2010-01-01

    In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This “liposome logic” approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc. as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in “top-down” synthetic biology, particularly in the specification, design and impl...

  16. Public participation GIS to support a bottom-up approach in forest landscape planning

    OpenAIRE

    Paletto A; Lora C; Frattegiani M; De Meo I; Ferretti F

    2013-01-01

    Forest landscape planning analyses all forest aspects (economic, ecological and social) and defines long-term forest management guidelines. Various actors are involved in landscape planning; therefore the analysis needs to take into account goals and targets of the different stakeholders. The participatory process can strongly support the development of a bottom-up forest plan definition when stakeholders are involved throughout the decision-making process. In this way, management guidelines ...

  17. On the interactions between top-down anticipation and bottom-up regression

    Directory of Open Access Journals (Sweden)

    Jun Tani

    2007-11-01

    This paper discusses the importance of anticipation and regression in modeling cognitive behavior. The meanings of these cognitive functions are explained by describing our proposed neural network model which has been implemented on a set of cognitive robotics experiments. The reviews of these experiments suggest that the essences of embodied cognition may reside in the phenomena of the break-down between the top-down anticipation and the bottom-up regression and in its recovery process.

  18. The Application of Bottom-up and Top-down Processing in L2 Listening Comprehension

    Institute of Scientific and Technical Information of China (English)

    温颖茜

    2008-01-01

    Listening comprehension is one of the four basic skills for language learning and is also one of the most difficult tasks L2 learners ever experience. L2 listening comprehension is a cognitive process, in which listeners use both bottom-up and top-down processing to comprehend the aural text. The paper focuses on the application of the two approaches in L2 listening comprehension.

  19. Self-assembled nanostructured resistive switching memory devices fabricated by templated bottom-up growth

    OpenAIRE

    Ji-Min Song; Jang-Sik Lee

    2016-01-01

    Metal-oxide-based resistive switching memory device has been studied intensively due to its potential to satisfy the requirements of next-generation memory devices. Active research has been done on the materials and device structures of resistive switching memory devices that meet the requirements of high density, fast switching speed, and reliable data storage. In this study, resistive switching memory devices were fabricated with nano-template-assisted bottom up growth. The electrochemical ...

  20. Bottom-Up Cost Analysis of a High Concentration PV Module

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, Kelsey A. W.; Woodhouse, Michael; Lee, Hohyun; Smestad, Greg P.

    2016-03-31

    We present a bottom-up model of III-V multi-junction cells, as well as a high concentration PV (HCPV) module. We calculate $0.59/W(DC) manufacturing costs for our model HCPV module design with today's capabilities, and find that reducing cell costs and increasing module efficiency offer the most promising paths for future cost reductions. Cell costs could be significantly reduced via substrate reuse and improved manufacturing yields.

  1. Environmental Sustainability and Regulation: Top-Down Versus Bottom-Up Regulation

    OpenAIRE

    Mariam, Yohannes

    2001-01-01

    Environmental regulation can be broadly divided into regulations that follow the top-down approach and those that follow the bottom-up approach. The two approaches have similar objectives with respect to environmental protection and sustainability. However, the success with which each approach achieves goals of environmental protection and sustainability may vary. Moreover, the costs and benefits of each approach differ. The present study will explore the implications of environmental regulation for sustainability, costs associat...

  2. Bottom-up effects of soil quality on a coffee arthropod interaction web

    OpenAIRE

    Gonthier, DJ; Dominguez, GM; Witter, JD; Spongberg, AL; Philpott, SM

    2013-01-01

    Nutrient availability and soil quality influence herbivores through changes in plant traits and can have cascading effects on herbivore interactions. In complex systems, with many positive and negative interactions, the consequences of these bottom-up effects are still not well established. We carried out a set of studies to determine the impact of soil quality (organic compost amendments) on a hemipteran herbivore (Coccus viridis), two ant mutualists, predators, pathogens, parasitoids of C. ...

  3. Combining shape and color: a bottom-up approach to evaluate object similarities

    OpenAIRE

    PASCUCCI, ALESSIO

    2011-01-01

    The objective of the present work is to develop a bottom-up approach to estimate the similarity between two unknown objects. Given a set of digital images, we want to identify the main objects and to determine whether they are similar or not. In the last decades many object recognition and classification strategies, driven by higher-level activities, have been successfully developed. The peculiarity of this work, instead, is the attempt to work without any training phase nor a priori knowledg...

  4. Complex numerical responses to top-down and bottom-up processes in vertebrate populations.

    OpenAIRE

    A. R. E. Sinclair; Krebs, Charles J.

    2002-01-01

    Population growth rate is determined in all vertebrate populations by food supplies, and we postulate bottom-up control as the universal primary standard. But this primary control system can be overridden by three secondary controls: top-down processes from predators, social interactions within the species and disturbances. Different combinations of these processes affect population growth rates in different ways. Thus, some relationships between growth rate and density can be hyperbolic or e...

  5. Bottom-up graphene-nanoribbon fabrication reveals chiral edges and enantioselectivity.

    Science.gov (United States)

    Han, Patrick; Akagi, Kazuto; Federici Canova, Filippo; Mutoh, Hirotaka; Shiraki, Susumu; Iwaya, Katsuya; Weiss, Paul S; Asao, Naoki; Hitosugi, Taro

    2014-09-23

    We produce precise chiral-edge graphene nanoribbons on Cu{111} using self-assembly and surface-directed chemical reactions. We show that, using specific properties of the substrate, we can change the edge conformation of the nanoribbons, segregate their adsorption chiralities, and restrict their growth directions at low surface coverage. By elucidating the molecular-assembly mechanism, we demonstrate that our method constitutes an alternative bottom-up strategy toward synthesizing defect-free zigzag-edge graphene nanoribbons.

  6. Nanomaterial processing using self-assembly-bottom-up chemical and biological approaches

    International Nuclear Information System (INIS)

    Nanotechnology is touted as the next logical sequence in technological evolution. This has led to a substantial surge in research activities pertaining to the development and fundamental understanding of processes and assembly at the nanoscale. Both top-down and bottom-up fabrication approaches may be used to realize a range of well-defined nanostructured materials with desirable physical and chemical attributes. Among these, the bottom-up self-assembly process offers the most realistic solution toward the fabrication of next-generation functional materials and devices. Here, we present a comprehensive review on the physical basis behind self-assembly and the processes reported in recent years to direct the assembly of nanoscale functional blocks into hierarchically ordered structures. This paper emphasizes assembly in the synthetic domain as well in the biological domain, underscoring the importance of biomimetic approaches toward novel materials. In particular, two important classes of directed self-assembly, namely, (i) self-assembly among nanoparticle–polymer systems and (ii) external field-guided assembly are highlighted. The spontaneous self-assembling behavior observed in nature that leads to complex, multifunctional, hierarchical structures within biological systems is also discussed in this review. Recent research undertaken to synthesize hierarchically assembled functional materials have underscored the need as well as the benefits harvested in synergistically combining top-down fabrication methods with bottom-up self-assembly. (review article)

  7. A Bottom-up Trend in Research of Management of Technology

    Directory of Open Access Journals (Sweden)

    Yoko Ishino

    2014-12-01

    Management of Technology (MOT) is defined as an academic discipline of management that enables organizations to manage their technological fundamentals to create competitive advantage. MOT covers a wide range of contents including administrative strategy, R&D management, manufacturing management, technology transfer, production control, marketing, accounting, finance, business ethics, and others. For each topic, researchers have conducted their MOT research at various levels. However, the practical and pragmatic side of MOT surely affects its research trends. Finding changes in MOT research trends, or the chronological transitions of its principal subjects, can help in understanding the key concepts of current MOT. This paper studied a bottom-up trend in MOT research fields by applying a text-mining method to the conference proceedings of IAMOT (International Association for Management of Technology). First, focusing only on nouns identified several keywords that emerge more frequently over time in the IAMOT proceedings. Then, expanding the scope to other parts of speech placed the keywords in a natural context. Finally, it was found that the use of an important keyword has qualitatively and quantitatively extended over time. In conclusion, a bottom-up trend in MOT research was detected and the effects of the social situation on the trend were discussed. Keywords: Management of Technology; Text Mining; Research Trend; Bottom-up Trend; Patent
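
    As a minimal sketch of the trend detection described above, assuming the nouns have already been extracted from each year's proceedings, the code below tracks relative term frequencies per year and flags terms whose frequency grows over time using a least-squares slope. The data, threshold and function names are invented for illustration.

```python
# Invented example of detecting terms that rise over time in per-year token lists; the
# function names, the toy data and the slope threshold are all assumptions.
from collections import Counter

def rising_terms(docs_by_year, min_slope=0.001):
    """docs_by_year: dict year -> list of (pre-extracted) noun tokens for that year."""
    years = sorted(docs_by_year)
    freqs = {}
    for y in years:
        counts = Counter(docs_by_year[y])
        total = sum(counts.values()) or 1
        for term, c in counts.items():
            freqs.setdefault(term, {})[y] = c / total
    trends = {}
    for term, by_year in freqs.items():
        xs = [y for y in years if y in by_year]
        ys = [by_year[y] for y in xs]
        if len(xs) < 2:
            continue
        mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
        slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
                (sum((x - mx) ** 2 for x in xs) or 1.0)      # least-squares slope
        if slope >= min_slope:
            trends[term] = slope
    return sorted(trends.items(), key=lambda kv: -kv[1])

if __name__ == "__main__":
    data = {2010: ["innovation", "patent"],
            2012: ["patent", "patent", "strategy"],
            2014: ["patent", "patent", "patent", "strategy"]}
    print(rising_terms(data))
```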

  8. Top-down or bottom-up modelling. An application to CO2 abatement

    International Nuclear Information System (INIS)

    In four articles a comparison is made of bottom-up, or engineers' models, and top-down models, which comprise macro-econometric models, computable general equilibrium models and also models in the system dynamics tradition. In the first article the history of economic modelling is outlined. In the second article the multi-sector macro-economic Computable General Equilibrium model for the Netherlands is described. It can be used to study the long-term effects of fiscal policy measures on economic and environmental indicators, in particular the effects on the level of CO2-emissions. The aim of article 3 is to describe the structure of the electricity supply industry in the UK and how it can be represented in a bottom-up sub-model within a more general E3 sectoral model of the UK economy. The objective of the last paper (4) is mainly a methodological discussion about integrating top-down and bottom-up models, which can be used to assess the impacts of CO2 abatement policies on economic activity

  9. Piezoresistive characterization of bottom-up, n-type silicon microwires undergoing bend deformation

    Energy Technology Data Exchange (ETDEWEB)

    McClarty, Megan M.; Oliver, Derek R., E-mail: Michael.Freund@umanitoba.ca, E-mail: Derek.Oliver@umanitoba.ca [Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg R3T 5V6 (Canada); Bruce, Jared P.; Freund, Michael S., E-mail: Michael.Freund@umanitoba.ca, E-mail: Derek.Oliver@umanitoba.ca [Department of Chemistry, University of Manitoba, Winnipeg R3T 2N2 (Canada)

    2015-01-12

    The piezoresistance of silicon has been studied over the past few decades in order to characterize the material's unique electromechanical properties and investigate their wider applicability. While bulk and top-down (etched) micro- and nano-wires have been studied extensively, less work exists regarding bottom-up grown microwires. A facile method is presented for characterizing the piezoresistance of released, phosphorus-doped silicon microwires that have been grown, bottom-up, via a chemical vapour deposition, vapour-liquid-solid process. The method uses conductive tungsten probes to simultaneously make electrical measurements via direct ohmic contact and apply mechanical strain via bend deformation. These microwires display piezoresistive coefficients within an order of magnitude of those expected for bulk n-type silicon; however, they show an anomalous response at degenerate doping concentrations (∼10²⁰ cm⁻³) when compared to lower doping concentrations (∼10¹⁷ cm⁻³), with a stronger piezoresistive coefficient exhibited for the more highly doped wires. This response is postulated to be due to the different growth mechanism of bottom-up microwires as compared to top-down.

  10. BoB: Best of Both in Compiler Construction Bottom-up Parsing with Top-down Semantic Evaluation

    Directory of Open Access Journals (Sweden)

    Wolfgang Dichler

    Compilers typically use either a top-down or a bottom-up strategy for parsing as well as semantic evaluation. Both strategies have advantages and disadvantages: bottom-up parsing supports LR(k) grammars but is limited to S- or LR-attribution, while top-dow ...

  11. Elucidating the role of D4 receptors in mediating attributions of salience to incentive stimuli on Pavlovian conditioned approach and conditioned reinforcement paradigms.

    Science.gov (United States)

    Cocker, P J; Vonder Haar, C; Winstanley, C A

    2016-10-01

    The power of drug-associated cues to instigate drug 'wanting' and consequently promote drug seeking is a corner stone of contemporary theories of addiction. Gambling disorder has recently been added to the pantheon of addictive disorders due to the phenomenological similarities between the diseases. However, the neurobiological mechanism that may mediate increased sensitivity towards conditioned stimuli in addictive disorders is unclear. We have previously demonstrated using a rodent analogue of a simple slot machine that the dopamine D4 receptor is critically engaged in controlling animals' attribution of salience to stimuli associated with reward in this paradigm, and consequently may represent a target for the treatment of gambling disorder. Here, we investigated the role of acute administration of a D4 receptor agonist on animals' responsivity to conditioned stimuli on both a Pavlovian conditioned approach (autoshaping) and a conditioned reinforcement paradigm. Following training on one of the two tasks, separate cohorts of rats (male and female) were administered a dose of PD168077 shown to be maximally effective at precipitating errors in reward expectancy on the rat slot machine task (10mg/kg). However, augmenting the activity of the D4 receptors in this manner did not alter behaviour on either task. These data therefore provide novel evidence that the D4 receptor does not alter incentive motivation in response to cues on simple behavioural tasks.

  13. Visual anticipation biases conscious perception but not bottom-up visual processing

    Directory of Open Access Journals (Sweden)

    Paul F.M.J. Verschure

    2015-01-01

    Full Text Available Theories of consciousness can be grouped with respect to their stance on embodiment, sensori-motor contingencies, prediction and integration. In this list, prediction plays a key role, and it is not clear which aspects of prediction are most prominent in the conscious scene. An evolving view on the brain is that it can be seen as a prediction machine that optimizes its ability to predict states of the world and the self through the top-down propagation of predictions and the bottom-up presentation of prediction errors. There are competing views, though, on whether prediction or prediction errors dominate the conscious scene. Yet, due to the lack of efficient indirect measures, the dynamic effects of prediction on perception, decision making and consciousness have been difficult to assess and to model. We propose a novel mathematical framework and psychophysical paradigm that allows us to assess both the hierarchical structuring of perceptual consciousness, its content and the impact of predictions and/or errors on the conscious scene. Using a displacement detection task combined with reverse correlation, we reveal signatures of the use of prediction at three different levels of perception: bottom-up early saccades, top-down driven late saccades and conscious decisions. Our results suggest that the brain employs multiple parallel mechanisms at different levels of information processing to restrict the sensory field using predictions. We observe that cognitive load has a quantifiable effect on this dissociation of the bottom-up sensory and top-down predictive processes. We propose a probabilistic data association model from dynamical systems theory to model this predictive bias at different information-processing levels.
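
    As a rough illustration of the reverse-correlation logic mentioned above (a toy sketch, not the authors' actual paradigm or model), the code below simulates an observer whose yes/no displacement judgements are driven by a noisy template match, then averages the stimulus noise conditioned on the response to recover a 'classification image' of the feature the observer relied on. All stimulus dimensions and parameters are invented.

```python
# Toy reverse-correlation simulation: recover the internal template an
# observer uses for a detection judgement by averaging stimulus noise
# conditioned on the response. Parameters are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_pixels = 20000, 32

# Hypothetical internal template (a small "displacement" bump).
template = np.zeros(n_pixels)
template[12:20] = 1.0

noise = rng.normal(0.0, 1.0, size=(n_trials, n_pixels))            # stimulus noise fields
decision_var = noise @ template + rng.normal(0.0, 2.0, n_trials)   # template match + internal noise
responses = decision_var > 0                                       # "displacement seen" vs "not seen"

# Classification image: mean noise on "yes" trials minus mean noise on "no" trials.
classification_image = noise[responses].mean(axis=0) - noise[~responses].mean(axis=0)

# The recovered image should correlate strongly with the true template.
corr = np.corrcoef(classification_image, template)[0, 1]
print(f"correlation with true template: {corr:.2f}")
```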

  14. Bottom-up metamaterials with an isotropic magnetic response in the visible

    Science.gov (United States)

    Mühlig, Stefan; Dintinger, José; Cunningham, Alastair; Scharf, Toralf; Bürgi, Thomas; Rockstuhl, Carsten; Lederer, Falk

    A theoretical framework to analyze the optical properties of amorphous metamaterials made from meta-atoms that are amenable to fabrication with bottom-up technologies is introduced. The achievement of an isotropic magnetic resonance in the visible is investigated by suggesting suitable designs for the meta-atoms. Furthermore, two meta-atoms are discussed in detail that were fabricated by self-assembling plasmonic nanoparticles using techniques from the field of colloidal nanochemistry. The metamaterials are experimentally characterized by spectroscopic means and the excitation of the magnetic dipole moment is clearly revealed. Advantages and disadvantages of metamaterials made from such meta-atoms are discussed.

  15. NEMO. Netherlands Energy demand MOdel. A top-down model based on bottom-up information

    International Nuclear Information System (INIS)

    The title model links energy use to other production factors, (physical) production, energy prices, technological trends and government policies. It uses a 'putty-semiputty' vintage production structure, in which new investments, adaptations to existing capital goods (retrofit) and 'good-housekeeping' are discerned. Price elasticities are relatively large in the long term and small in the short term. Most predictions of energy use are based on either econometric models or on 'bottom-up information', i.e. disaggregated lists of technical possibilities for and costs of saving energy. Typically, one predicts more energy-efficiency improvements using bottom-up information than using econometric ('top-down') models. We bridged this so-called 'energy-efficiency gap' by designing our macro/meso model NEMO in such a way that we can use bottom-up (micro) information to estimate most model parameters. In our view, reflected in NEMO, the energy-efficiency gap arises for two reasons. The first is that firms and households use a fairly high discount rate of 15% when evaluating the profitability of energy-efficiency improvements. The second is that our bottom-up information ('ICARUS') for most economic sectors does not (as NEMO does) take account of the fact that implementation of new, energy-efficient technology in capital stock takes place only gradually. Parameter estimates for 19 sectors point to a long-term technological energy-efficiency improvement trend in the Netherlands' final energy use of 0.8% per year. The long-term price elasticity is estimated to be 0.29. These values are comparable to other studies based on time series data. Simulations of the effects of the oil price shocks in the seventies and the subsequent fall of oil prices show that NEMO's price elasticities are consistent with historical data. However, the present pace at which new technologies become available (reflected in NEMO) appears to be lower than in the seventies and eighties. This suggests that it
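
    The parameter values quoted above (an autonomous efficiency trend of 0.8% per year and a long-term price elasticity of 0.29) can be read as inputs to a reduced-form demand equation. The sketch below is a minimal illustration of such an equation, not NEMO's actual putty-semiputty specification; the elasticity is applied with a negative sign, and the baseline demand, activity growth and price path are invented.

```python
# Illustrative reduced-form energy demand projection using the two
# aggregate parameters reported for NEMO: an autonomous energy-efficiency
# improvement (AEEI) trend of 0.8%/yr and a long-term price elasticity of
# 0.29 (entering with a negative sign). Baseline values are hypothetical.

AEEI = 0.008          # autonomous efficiency improvement per year
ELASTICITY = -0.29    # long-term price elasticity of energy demand
D0 = 100.0            # base-year final energy demand (arbitrary units)
P0 = 1.0              # base-year energy price index

def demand(year, price_index, activity_growth=0.02):
    """Final energy demand `year` years after the base year."""
    activity = (1.0 + activity_growth) ** year        # economic activity effect
    efficiency = (1.0 - AEEI) ** year                 # autonomous efficiency trend
    price_effect = (price_index / P0) ** ELASTICITY   # long-term price response
    return D0 * activity * efficiency * price_effect

# Example: 15 years out, with a 50% higher real energy price.
print(f"demand after 15 years: {demand(15, 1.5):.1f}")
```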

  16. Bottom-up assembly of hydrophobic nanocrystals and graphene nanosheets into mesoporous nanocomposites.

    Science.gov (United States)

    Huang, Jijiang; Liu, Wenxian; Wang, Li; Sun, Xiaoming; Huo, Fengwei; Liu, Junfeng

    2014-04-22

    A general strategy for constructing graphene-based nanocomposites is achieved by emulsion-based bottom-up self-assembly of hydrophobic nanocrystals (NCs) into positively charged colloidal spheres, followed by the electrostatic assembly of NC colloidal spheres with negatively charged graphene oxide in an acidulous aqueous solution. With a simple heat treatment, 3D mesoporous NC spheres/graphene composites are obtained. TiO2/graphene composites typically exhibit a better rate capability and cycle performance than do the corresponding isolated TiO2 spheres. PMID:24684553

  17. Scaled CMOS Reliability and Considerations for Spacecraft Systems : Bottom-Up and Top-Down Perspectives

    Science.gov (United States)

    White, Mark

    2012-01-01

    The recently launched Mars Science Laboratory (MSL) flagship mission, named Curiosity, is the most complex rover ever built by NASA and is scheduled to touch down on the red planet in August 2012 in Gale Crater. The rover and its instruments will have to endure the harsh environments of the surface of Mars to fulfill the mission's main science objectives. Such complex systems require reliable microelectronic components coupled with adequate component and system-level design margins. Reliability aspects of these elements of the spacecraft system are presented from bottom-up and top-down perspectives.

  18. Unsupervised tattoo segmentation combining bottom-up and top-down cues

    Science.gov (United States)

    Allen, Josef D.; Zhao, Nan; Yuan, Jiangbo; Liu, Xiuwen

    2011-06-01

    Tattoo segmentation is challenging due to the complexity and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin and then distinguish tattoo from the other skin via a top-down prior in the image itself. Tattoo segmentation with an unknown number of clusters is transferred to a figure-ground segmentation. We have applied our segmentation algorithm on a tattoo dataset and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purposes.
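
    A hypothetical sketch of the split-merge idea described above (not the authors' implementation): pixels are first over-segmented bottom-up with k-means on colour, and clusters are then merged top-down using a simple skin-colour prior; clusters that do not look skin-like are kept as candidate tattoo (or background) regions. The skin-colour rule and all thresholds are invented for illustration.

```python
# Toy split-merge segmentation: bottom-up k-means over pixel colour,
# then top-down merging of "skin-like" clusters; non-skin clusters are
# treated as candidate tattoo regions. The skin heuristic and thresholds
# are purely illustrative.
import numpy as np
from sklearn.cluster import KMeans

def segment_tattoo(image_rgb, n_clusters=8):
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(float)

    # Bottom-up split: over-segment the image into colour clusters.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(pixels)

    # Top-down prior: a crude skin-colour rule on each cluster centre
    # (R > G > B and reasonably bright), a stand-in for a learned model.
    tattoo_mask = np.zeros(h * w, dtype=bool)
    for k in range(n_clusters):
        if not np.any(labels == k):
            continue
        r, g, b = pixels[labels == k].mean(axis=0)
        skin_like = (r > g > b) and r > 90
        if not skin_like:
            tattoo_mask[labels == k] = True   # candidate tattoo (or background)
    return tattoo_mask.reshape(h, w)

# Usage on a random stand-in image (replace with a real RGB array).
demo = np.random.randint(0, 255, size=(64, 64, 3), dtype=np.uint8)
print(segment_tattoo(demo).mean())
```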

  19. Co-financing of bottom-up approaches towards Broadband Infrastructure Development

    DEFF Research Database (Denmark)

    Williams, Idongesit

    2016-01-01

    networks – leading to the demise of some of these initiatives. This paper proposes co-financing of these networks as a means of sustaining the bottom-up Broadband network. The argument of this paper is anchored on two developing country cases: one in India and the other in Ghana. One survived...... with a financial injection and the other did not due to low revenue. This paper, based on these cases, proposes the utilization and the reintroduction of Universal Service funds in developing countries to aid these small networks. This is a qualitative study; the Grounded Theory approach was adopted to gather......

  20. A bottom-up approach for the synthesis of highly ordered fullerene-intercalated graphene hybrids

    Directory of Open Access Journals (Sweden)

    Dimitrios eGournis

    2015-02-01

    Full Text Available Much of the research effort on graphene focuses on its use as a building block for the development of new hybrid nanostructures with well-defined dimensions and properties suitable for applications such as gas storage, heterogeneous catalysis, gas/liquid separations, nanosensing and biomedicine. Towards this aim, here we describe a new bottom-up approach, which combines self-assembly with the Langmuir-Schaefer deposition technique to synthesize graphene-based layered hybrid materials hosting fullerene molecules within the interlayer space. Our film preparation consists of a bottom-up layer-by-layer process that proceeds via the formation of a hybrid organo-graphene oxide Langmuir film. The structure and composition of these hybrid fullerene-containing thin multilayers deposited on hydrophobic substrates were characterized by a combination of X-ray diffraction, Raman and X-ray photoelectron spectroscopies, atomic force microscopy and conductivity measurements. The latter revealed that the presence of C60 within the interlayer spacing leads to an increase in electrical conductivity of the hybrid material as compared to the organo-graphene matrix alone.

  1. Top-down (Prior Knowledge) and Bottom-up (Perceptual Modality) Influences on Spontaneous Interpersonal Synchronization.

    Science.gov (United States)

    Gipson, Christina L; Gorman, Jamie C; Hessler, Eric E

    2016-04-01

    Coordination with others is such a fundamental part of human activity that it can happen unintentionally. This unintentional coordination can manifest as synchronization and is observed in physical and human systems alike. We investigated the role of top-down influences (prior knowledge of the perceptual modality their partner is using) and bottom-up factors (perceptual modality combination) on spontaneous interpersonal synchronization. We examine this phenomenon with respect to two different theoretical perspectives that differently emphasize top-down and bottom-up factors in interpersonal synchronization: joint-action/shared cognition theories and ecological-interactive theories. In an empirical study, twelve dyads performed a finger oscillation task while attending to each other's movements through either visual, auditory, or visual and auditory perceptual modalities. Half of the participants were given prior knowledge of their partner's perceptual capabilities for coordinating across these different perceptual modality combinations. We found that the effect of top-down influence depends on the perceptual modality combination between two individuals. When people used the same perceptual modalities, top-down influence resulted in less synchronization, and when people used different perceptual modalities, top-down influence resulted in more synchronization. Furthermore, persistence in the change in behavior as a result of having perceptual information about each other ('social memory') was stronger when this top-down influence was present. PMID:27033133

  2. Formation of three-dimensional hepatic tissue by the bottom-up method using spheroids.

    Science.gov (United States)

    Okudaira, Tatsuya; Amimoto, Naoki; Mizumoto, Hiroshi; Kajiwara, Toshihisa

    2016-08-01

    Liver regenerative medicine has attracted attention as a possible alternative to organ transplantation. To address this challenge, a method has been proposed for constructing liver tissue in vitro with a high cell density and high functionality for transplantation into patients with severe liver failure. In this study, we fabricated highly functional three-dimensional hepatic tissue by a bottom-up method using spheroids. The hepatic tissue was formed by stacking hepatocyte spheroids covered with human umbilical vein endothelial cells (HUVECs). Hepatic tissue constructs were evaluated for cell survival and liver-specific functions, and were examined histologically. As a result, we identified improvements in liver-specific functions (ammonia removal and albumin secretion) and cell survival. In addition, HUVECs were regularly distributed every 100 μm within the tissue, and live cells were present within the whole tissue construct throughout the culture period. In summary, we successfully fabricated highly functional hepatic tissue by the bottom-up method using HUVEC-covered hepatocyte spheroids. PMID:26803704

  3. From bottom-up approaches to levels of organization and extended critical transitions

    Directory of Open Access Journals (Sweden)

    Giuseppe eLongo

    2012-07-01

    Full Text Available Biological thinking is structured by the notion of level of organization. We will show that this notion acquires a precise meaning in critical phenomena: they disrupt, by the appearance of infinite quantities, the mathematical (possibly equational) determination at a given level, when moving to a 'higher' one. As a result, their analysis cannot be called genuinely bottom-up, even though it remains upward in a restricted sense. At the same time, criticality and related phenomena are very common in biology. Because of this, we claim that bottom-up approaches are not sufficient, in principle, to capture biological phenomena. In the second part of this paper, following the work of Francis Bailly, we discuss a strong criterion of level transition. The core idea of the criterion is to start from the breaking of the symmetries and determination at a 'first' level in order to 'move' to the others. If biological phenomena have multiple, sustained levels of organization in this sense, then they should be interpreted as extended critical transitions.

  4. A bottom-up institutional approach to cooperative governance of risky commons

    Science.gov (United States)

    Vasconcelos, Vítor V.; Santos, Francisco C.; Pacheco, Jorge M.

    2013-09-01

    Avoiding the effects of climate change may be framed as a public goods dilemma, in which the risk of future losses is non-negligible, while the realization of the public good may lie far in the future. The limited success of existing attempts to reach global cooperation has also been associated with a lack of sanctioning institutions and mechanisms to deal with those who do not contribute to the welfare of the planet or fail to abide by agreements. Here we investigate the emergence and impact of different types of sanctioning to deter non-cooperative behaviour in climate agreements. We show that a bottom-up approach, in which parties create local institutions that punish free-riders, promotes the emergence of widespread cooperation, mostly when risk perception is low, as it is at present. On the contrary, global institutions provide, at best, marginal improvements regarding overall cooperation. Our results clearly suggest that a polycentric approach involving multiple institutions is more effective than that associated with a single, global one, indicating that such a bottom-up, self-organization approach, set up at a local scale, provides a better ground on which to attempt a solution for such a complex and global dilemma.
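
    The risky public goods dilemma described above is often formalised as a collective-risk game: each player either contributes part of an endowment or free-rides, and if the group fails to reach a contribution threshold, everyone loses the remaining endowment with some probability. The sketch below computes expected payoffs for a single group under that generic textbook formulation; it is not the authors' evolutionary model with local and global sanctioning institutions, and all parameter values are invented.

```python
# Expected payoffs in a one-shot collective-risk dilemma: N players,
# contributors pay a cost c from endowment b; if fewer than M players
# contribute, everybody loses the remaining endowment with probability
# `risk`. Generic illustration only; parameters are invented.

def expected_payoffs(n_contributors, N=6, M=3, b=1.0, c=0.1, risk=0.3):
    success = n_contributors >= M
    loss_prob = 0.0 if success else risk
    payoff_cooperator = (b - c) * (1.0 - loss_prob)
    payoff_defector = b * (1.0 - loss_prob)
    return payoff_cooperator, payoff_defector

for k in range(7):   # 0..6 contributors in a group of N = 6
    coop, defe = expected_payoffs(k)
    print(f"{k} contributors -> cooperator: {coop:.2f}, defector: {defe:.2f}")
```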

  6. Top-down and bottom-up definitions of human failure events in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down—defined as a subset of the PRA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  7. Bottom-up approach for decentralised energy planning: Case study of Tumkur district in India

    International Nuclear Information System (INIS)

    Decentralized Energy Planning (DEP) is one of the options to meet rural and small-scale energy needs in a reliable, affordable and environmentally sustainable way. The main aspect of energy planning at the decentralized level would be to prepare an area-based DEP to meet energy needs and develop alternative energy sources at least cost to the economy and environment. The present work uses a goal-programming method to analyze DEP through a bottom-up approach. This approach includes planning from the lowest scale of Tumkur district in India. The scales of analysis included the village level (Ungra), panchayat (local council) level (Yedavani), block level (Kunigal) and district level (Tumkur). The approach adopted was bottom-up (village to district) to allow a detailed description of energy services and the resulting demand for energy forms and supply technologies. Different scenarios are considered at the four decentralized scales for the year 2005 and are developed and analyzed for the year 2020. Decentralized bioenergy systems for producing biogas and electricity, using local biomass resources, are shown to promote development compared to other renewables. This is because, apart from meeting energy needs, multiple goals could be achieved, such as self-reliance, local employment and land reclamation, apart from CO2 emission reductions.
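
    Goal programming of the kind mentioned above can be illustrated with a tiny example: allocate an energy demand across two supply options while minimising weighted overshoots of a cost goal and an emissions goal. The sketch below uses scipy's linear-programming solver; the technologies, goals and coefficients are invented, and the study's actual formulation (multiple scales, resources and goals) is far richer.

```python
# Tiny goal-programming example for decentralized energy planning:
# meet a fixed energy demand from two hypothetical options (local biogas
# vs. diesel) while minimising weighted overshoots of a cost goal and a
# CO2 goal. Coefficients and goals are invented for illustration.
import numpy as np
from scipy.optimize import linprog

demand = 100.0                    # energy to supply (arbitrary units)
cost = np.array([3.0, 2.0])       # cost per unit: [biogas, diesel]
co2 = np.array([0.1, 1.0])        # emissions per unit
cost_goal, co2_goal = 260.0, 40.0
w_cost, w_co2 = 1.0, 5.0          # priority weights on goal deviations

# Decision vector: [x_biogas, x_diesel, d_cost_over, d_co2_over]
c = np.array([0.0, 0.0, w_cost, w_co2])
A_eq = np.array([[1.0, 1.0, 0.0, 0.0]])
b_eq = np.array([demand])
A_ub = np.array([
    [cost[0], cost[1], -1.0, 0.0],   # total cost - d_cost_over <= cost goal
    [co2[0], co2[1], 0.0, -1.0],     # total CO2  - d_co2_over  <= CO2 goal
])
b_ub = np.array([cost_goal, co2_goal])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=[(0, None)] * 4, method="highs")
x_biogas, x_diesel, d_cost, d_co2 = res.x
print(f"biogas: {x_biogas:.1f}, diesel: {x_diesel:.1f}, "
      f"cost overshoot: {d_cost:.1f}, CO2 overshoot: {d_co2:.1f}")
```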

  8. Views on helper/cytotoxic lineage choice from a bottom-up approach.

    Science.gov (United States)

    Taniuchi, Ichiro

    2016-05-01

    There has been speculation as to how bi-potent CD4(+)  CD8(+) double-positive precursor thymocytes choose their distinct developmental fate, becoming either CD4(+) helper or CD8(+) cytotoxic T cells. Based on the clear correlation of αβT cell receptor (TCR) specificity to major histocompatibility complex (MHC) classes with this lineage choice, various studies have attempted to resolve this question by examining the cellular signaling events initiated by TCR engagements, a strategy referred to as a 'top-down' approach. On the other hand, based on the other correlation of CD4/CD8 co-receptor expression with its selected fate, other studies have addressed this question by gradually unraveling the sequential mechanisms that control the phenotypic outcome of this fate decision, a method known as the 'bottom-up' approach. Bridging these two approaches will contribute to a more comprehensive understanding of how TCR signals are coupled with developmental programs in the nucleus. Advances made during the last two decades seemed to make these two approaches more closely linked. For instance, identification of two transcription factors, ThPOK and Runx3, which play central roles in the development of helper and cytotoxic lineages, respectively, provided significant insights into the transcriptional network that controls a CD4/CD8 lineage choice. This review summarizes achievements made using the 'bottom-up' approach, followed by a perspective on future pathways toward coupling TCR signaling with nuclear programs. PMID:27088909

  9. A bottom-up model to describe consumers’ preferences towards late season peaches

    Directory of Open Access Journals (Sweden)

    Etiénne Groot

    2015-12-01

    Full Text Available Peaches have been consumed in Mediterranean countries since ancient times. Nowadays there are few areas in Europe that produce peaches with a Protected Designation of Origin (PDO), and the Calanda area is one of them. The aim of this work is to describe consumers’ preferences towards late season PDO Calanda peaches in the city of Zaragoza, Spain, by a bottom-up model. The bottom-up model provides a greater amount of information than top-down models. In this approach, one utility function is estimated per consumer. Thus, it is not necessary to make assumptions about preference distributions and correlations across respondents. It was observed that preference distributions were neither normal nor independently distributed. If those preferences were estimated by top-down models, conclusions would be biased. This paper also explores a new way to describe preferences through individual utility functions. Results show that the largest behavioural group gathered origin-sensitive consumers. Their utility increased if the peaches were produced in the Calanda area and, especially, when peaches had the PDO Calanda brand. The second most valuable attribute for consumers was the price. Peach size and packaging were not as important in the purchase decision. Nevertheless, it is advisable to avoid trading the smallest size peaches (weighing around 160 g/fruit). Traders also have to be careful when using active packaging. It was found that a group of consumers disliked this kind of product, probably because they perceived it as less natural.

  10. Integrating the bottom-up and top-down approach to energy economy modelling. The case of Denmark

    DEFF Research Database (Denmark)

    Klinge Jacobsen, Henrik

    1998-01-01

    This paper presents results from an integration project covering Danish models based on bottom-up and top-down approaches to energy-economy modelling. The purpose of the project was to identify theoretical and methodological problems for integrating existing models for Denmark and to implement an...... integration of the models. The integration was established through a number of links between energy bottom-up modules and a macroeconomic model. In this integrated model it is possible to analyse both top-down instruments, such as taxes along with bottom-up instruments, such as regulation of technology...

  11. A bottom up approach for engineering catchments through sustainable runoff management

    Science.gov (United States)

    Wilkinson, M.; Quinn, P. F.; Jonczyk, J.; Burke, S.

    2010-12-01

    There is no doubt that our catchments are under great stress. There have been many accounts around the world of severe flood events and water quality issues within channels. As a result of these, ecological habitats in rivers are also under pressure. Within the United Kingdom, all these issues have been identified as key target areas for policy. Traditionally this has been managed by a policy-driven, top-down approach, which is usually ineffective. A ‘one size fits all’ attitude often does not work. This paper presents a case study in northern England whereby a bottom-up approach is applied to multipurpose management of catchments at the source (on the order of 1-10 km2). This includes simultaneous tackling of water quality, flooding and ecological issues by creating sustainable runoff management solutions such as storage ponds, wetlands, beaver dams and willow riparian features. In order to identify the prevailing issues in a specific catchment, full and transparent stakeholder engagement is essential, with everybody who has a vested interest in the catchment being involved from the beginning. These problems can then be dealt with through the use of a novel catchment management toolkit, which is transferable to similar-scale catchments. However, evidence collected on the ground also allows for upscaling of the toolkit. The process gathers the scientific evidence about the effectiveness of existing or new measures, which can really change the catchment functions. Still, we need to get better at communicating the science to policy makers, and policy therefore must facilitate a bottom-up approach to land and water management. We show a test site for this approach in the Belford burn catchment (6 km2), northern England. This catchment has problems with flooding and water quality. Increased sediment loads are affecting the nearby estuary, which is an important ecological zone, and numerous floods have affected the local village. A catchment engineering toolkit has been

  12. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence that verifies why this should be. Additionally, while the literature has examined the mechanics of the different approaches, there has been less attention paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but to also gain insight into what psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies

  13. Bottom-Up Cost Analysis of a High Concentration PV Module; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, K.; Woodhouse, M.; Lee, H.; Smestad, G.

    2015-04-13

    We present a bottom-up model of III-V multi-junction cells, as well as a high concentration PV (HCPV) module. We calculate $0.65/Wp(DC) manufacturing costs for our model HCPV module design with today’s capabilities, and find that reducing cell costs and increasing module efficiency offer the most promising pathways for future cost reductions. Cell costs could be significantly reduced via an increase in manufacturing scale, substrate reuse, and improved manufacturing yields. We also identify several other significant drivers of HCPV module costs, including the Fresnel lens primary optic, module housing, thermal management, and the receiver board. These costs could potentially be lowered by employing innovative module designs.
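
    The bottom-up logic of such a cost model can be illustrated by rolling up per-module component costs and dividing by the module's rated DC power. The component figures and module rating below are placeholders chosen only so the total lands near the reported $0.65/Wp(DC); they are not NREL's actual cost breakdown.

```python
# Illustrative bottom-up roll-up of HCPV module cost to $/Wp(DC).
# The component figures are hypothetical placeholders, not NREL's
# published breakdown; only the structure of the calculation matters.

component_cost_usd = {            # cost per module, US$
    "III-V multi-junction cells": 95.0,
    "Fresnel lens primary optic": 40.0,
    "module housing":             55.0,
    "thermal management":         30.0,
    "receiver board":             35.0,
    "assembly and overhead":      70.0,
}

module_power_wp_dc = 500.0        # rated module output, Wp(DC) (assumed)

total_cost = sum(component_cost_usd.values())
cost_per_wp = total_cost / module_power_wp_dc

for name, cost in component_cost_usd.items():
    print(f"{name:28s} {cost / module_power_wp_dc:5.3f} $/Wp")
print(f"{'total':28s} {cost_per_wp:5.3f} $/Wp(DC)")
```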

  14. Ion mobility tandem mass spectrometry enhances performance of bottom-up proteomics.

    Science.gov (United States)

    Helm, Dominic; Vissers, Johannes P C; Hughes, Christopher J; Hahne, Hannes; Ruprecht, Benjamin; Pachl, Fiona; Grzyb, Arkadiusz; Richardson, Keith; Wildgoose, Jason; Maier, Stefan K; Marx, Harald; Wilhelm, Mathias; Becher, Isabelle; Lemeer, Simone; Bantscheff, Marcus; Langridge, James I; Kuster, Bernhard

    2014-12-01

    One of the limiting factors in determining the sensitivity of tandem mass spectrometry using hybrid quadrupole orthogonal acceleration time-of-flight instruments is the duty cycle of the orthogonal ion injection system. As a consequence, only a fraction of the generated fragment ion beam is collected by the time-of-flight analyzer. Here we describe a method utilizing postfragmentation ion mobility spectrometry of peptide fragment ions in conjunction with mobility time synchronized orthogonal ion injection leading to a substantially improved duty cycle and a concomitant improvement in sensitivity of up to 10-fold for bottom-up proteomic experiments. This enabled the identification of 7500 human proteins within 1 day and 8600 phosphorylation sites within 5 h of LC-MS/MS time. The method also proved powerful for multiplexed quantification experiments using tandem mass tags exemplified by the chemoproteomic interaction analysis of histone deacetylases with Trichostatin A.

  15. Differential recolonization of Atlantic intertidal habitats after disturbance reveals potential bottom-up community regulation.

    Science.gov (United States)

    Petzold, Willy; Scrosati, Ricardo A

    2014-01-01

    In the spring of 2014, abundant sea ice that drifted out of the Gulf of St. Lawrence caused extensive disturbance in rocky intertidal habitats on the northern Atlantic coast of mainland Nova Scotia, Canada. To monitor recovery of intertidal communities, we surveyed two wave-exposed locations in the early summer of 2014. Barnacle recruitment and the abundance of predatory dogwhelks were low at one location (Tor Bay Provincial Park) but more than 20 times higher at the other location (Whitehead). Satellite data indicated that the abundance of coastal phytoplankton (the main food source for barnacle larvae) was consistently higher at Whitehead just before the barnacle recruitment season, when barnacle larvae were in the water column. These observations suggest bottom-up forcing of intertidal communities. The underlying mechanisms and their intensity along the NW Atlantic coast could be investigated through studies done at local and regional scales.

  16. Bottom-Up Engineering of Well-Defined 3D Microtissues Using Microplatforms and Biomedical Applications.

    Science.gov (United States)

    Lee, Geon Hui; Lee, Jae Seo; Wang, Xiaohong; Lee, Sang Hoon

    2016-01-01

    During the last decades, the engineering of well-defined 3D tissues has attracted great attention because it provides an in vivo-mimicking environment and can be a building block for the engineering of bioartificial organs. In this Review, diverse engineering methods of 3D tissues using microscale devices are introduced. Recent progress of microtechnologies has enabled the development of microplatforms for the bottom-up assembly of diversely shaped 3D tissues consisting of various cells. Micro hanging-drop plates, microfluidic chips, and arrayed microwells are the typical examples. The encapsulation of cells in hydrogel microspheres and microfibers allows the engineering of 3D microtissues with diverse shapes. Applications of 3D microtissues in biomedical fields are described, and the future direction of microplatform-based engineering of 3D microtissues is discussed.

  17. Manufacturing at Nanoscale: Top-Down, Bottom-up and System Engineering

    International Nuclear Information System (INIS)

    The current nano-technology revolution is facing several major challenges: to manufacture nanodevices below 20 nm, to fabricate three-dimensional complex nano-structures, and to heterogeneously integrate multiple functionalities. To tackle these grand challenges, the Center for Scalable and Integrated NAno-Manufacturing (SINAM), an NSF Nanoscale Science and Engineering Center, set its goal to establish a new manufacturing paradigm that integrates an array of new nano-manufacturing technologies, including plasmonic imaging lithography and ultramolding imprint lithography, aiming toward a critical resolution of 1-10 nm, and hybrid top-down and bottom-up technologies to achieve massively parallel integration of heterogeneous nanoscale components into higher-order structures and devices. Furthermore, SINAM will develop system engineering strategies to scale up the nano-manufacturing technologies. SINAM's integrated research and education platform will shed light on a broad range of potential applications in computing, telecommunication, photonics, biotechnology, health care, and national security

  18. Bottom-up synthesis of chiral covalent organic frameworks and their bound capillaries for chiral separation.

    Science.gov (United States)

    Qian, Hai-Long; Yang, Cheng-Xiong; Yan, Xiu-Ping

    2016-07-12

    Covalent organic frameworks (COFs) are a novel class of porous materials, and offer great potential for various applications. However, the applications of COFs in chiral separation and chiral catalysis are largely underexplored due to the very limited chiral COFs available and their challenging synthesis. Here we show a bottom-up strategy to construct chiral COFs and an in situ growth approach to fabricate chiral COF-bound capillary columns for chiral gas chromatography. We incorporate the chiral centres into one of the organic ligands for the synthesis of the chiral COFs. We subsequently in situ prepare the COF-bound capillary columns. The prepared chiral COFs and their bound capillary columns give high resolution for the separation of enantiomers with excellent repeatability and reproducibility. The proposed strategy provides a promising platform for the synthesis of chiral COFs and their chiral separation application.

  19. Strain Response of Hot-Mix Asphalt Overlays for Bottom-Up Reflective Cracking

    CERN Document Server

    Ghauch, Ziad G

    2011-01-01

    This paper examines the strain response of typical HMA overlays above jointed PCC slabs prone to bottom-up reflective cracking. The occurrence of reflective cracking under the combined effect of traffic and environmental loading significantly reduces the design life of the HMA overlay and can lead to its premature failure. In this context, viscoelastic material properties combined with cyclic vehicle loadings and pavement temperature distribution were implemented in a series of FE models in order to study the evolution of horizontal tensile and shear strains at the bottom of the HMA overlay. The effect of several design parameters, such as subbase and subgrade moduli, vehicle speed, overlay thickness, and temperature condition, on the horizontal and shear strain response was investigated. Results obtained show that the rates of horizontal and shear strain increase at the bottom of the HMA overlay drop with higher vehicle speed, higher subgrade modulus, and higher subbase modulus. Moreover, the rate of horizon...

  20. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    Full Text Available This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers, etc.) engage people in challenging undertakings (e.g., innovation) that require everyone’s commitment to such a degree that they would have to go beyond what could be reasonably expected in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, afinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a complementary theory to other strategic management and leading theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  1. Bottom-up synthesis of chiral covalent organic frameworks and their bound capillaries for chiral separation

    Science.gov (United States)

    Qian, Hai-Long; Yang, Cheng-Xiong; Yan, Xiu-Ping

    2016-07-01

    Covalent organic frameworks (COFs) are a novel class of porous materials, and offer great potential for various applications. However, the applications of COFs in chiral separation and chiral catalysis are largely underexplored due to the very limited chiral COFs available and their challenging synthesis. Here we show a bottom-up strategy to construct chiral COFs and an in situ growth approach to fabricate chiral COF-bound capillary columns for chiral gas chromatography. We incorporate the chiral centres into one of the organic ligands for the synthesis of the chiral COFs. We subsequently in situ prepare the COF-bound capillary columns. The prepared chiral COFs and their bound capillary columns give high resolution for the separation of enantiomers with excellent repeatability and reproducibility. The proposed strategy provides a promising platform for the synthesis of chiral COFs and their chiral separation application.

  2. Bottom-up formation of endohedral mono-metallofullerenes is directed by charge transfer

    Science.gov (United States)

    Dunk, Paul W.; Mulet-Gas, Marc; Nakanishi, Yusuke; Kaiser, Nathan K.; Rodríguez-Fortea, Antonio; Shinohara, Hisanori; Poblet, Josep M.; Marshall, Alan G.; Kroto, Harold W.

    2014-12-01

    An understanding of chemical formation mechanisms is essential to achieve effective yields and targeted products. One of the most challenging endeavors is synthesis of molecular nanocarbon. Endohedral metallofullerenes are of particular interest because of their unique properties that offer promise in a variety of applications. Nevertheless, the mechanism of formation from metal-doped graphite has largely eluded experimental study, because harsh synthetic methods are required to obtain them. Here we report bottom-up formation of mono-metallofullerenes under core synthesis conditions. Charge transfer is a principal factor that guides formation, discovered by study of metallofullerene formation with virtually all available elements of the periodic table. These results could enable production strategies that overcome long-standing problems that hinder current and future applications of metallofullerenes.

  3. Bottom-up design of 2D organic photocatalysts for visible-light driven hydrogen evolution

    International Nuclear Information System (INIS)

    To design two-dimensional (2D) organocatalysts, three series of covalent organic frameworks (COFs) are constructed using bottom-up strategies, i.e. molecular selection, tunable linkage, and functionalization. First-principles calculations are performed to confirm their photocatalytic activity under visible light. Two of our constructed 2D COF models (B1 and C3) are identified as sufficiently efficient organocatalysts for visible-light water splitting. The controllable construction of such COFs from suitable organic subunits, linkages, and functional groups paves the way for correlating band edge alignments and geometry parameters of 2D organic materials. Our theoretical prediction not only provides essential insights into designing 2D-COF photocatalysts for water splitting, but also sparks other technological applications for 2D organic materials. (paper)

  4. Bottom-Up Reconstruction Scenarios for (un)constrained MSSM Parameters at the LHC

    CERN Document Server

    Kneur, J -L

    2008-01-01

    We consider some specific inverse problem or "bottom-up" reconstruction strategies at the LHC for both general and constrained MSSM parameters, starting from a plausibly limited set of sparticle identification and mass measurements, using mainly gluinos/squarks cascade decays, plus eventually the lightest Higgs boson mass. For the three naturally separated sectors of: gaugino/Higgsino, squark/slepton, and Higgs parameters, we examine different step-by-step algorithms based on rather simple entirely analytical inverted relations between masses and basic MSSM parameters, including also radiative corrections as reasonably good approximations of the more complete available calculations. We distinguish the constraints obtained for a general MSSM from those obtained with universality assumptions in the three different sectors. Our results are compared at different stages with the determination from more standard "top-down" fit of models to data, and finally combined into a global determination of all the relevant p...

  5. Enhancing Bottom-up and Top-down Proteomic Measurements with Ion Mobility Separations

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Erin Shammel [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Burnum-Johnson, Kristin E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ibrahim, Yehia M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Orton, Daniel J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Monroe, Matthew E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kelly, Ryan T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moore, Ronald J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Xing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Theberge, Roger [Boston Univ. of Medicine, MA (United States); Costello, Catherine E. [Boston Univ. of Medicine, MA (United States); Smith, Richard D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-03

    Proteomic measurements with greater throughput, sensitivity and additional structural information enhance the in-depth characterization of complex mixtures and support targeted studies with higher confidence. While liquid chromatography separation coupled with mass spectrometry (LC-MS) measurements have provided information on thousands of proteins in different sample types, the addition of another rapid separation stage providing structural information has many benefits for analyses. Technical advances in ion funnels and multiplexing have enabled ion mobility separations to be easily and effectively coupled with LC-MS proteomics to enhance the information content of measurements. Herein, we report on applications illustrating increased sensitivity, throughput, and structural information by utilizing IMS-MS and LC-IMS-MS measurements for both bottom-up and top-down proteomics measurements.

  6. Differential recolonization of Atlantic intertidal habitats after disturbance reveals potential bottom-up community regulation

    Science.gov (United States)

    Petzold, Willy; Scrosati, Ricardo A.

    2014-01-01

    In the spring of 2014, abundant sea ice that drifted out of the Gulf of St. Lawrence caused extensive disturbance in rocky intertidal habitats on the northern Atlantic coast of mainland Nova Scotia, Canada. To monitor recovery of intertidal communities, we surveyed two wave-exposed locations in the early summer of 2014. Barnacle recruitment and the abundance of predatory dogwhelks were low at one location (Tor Bay Provincial Park) but more than 20 times higher at the other location (Whitehead). Satellite data indicated that the abundance of coastal phytoplankton (the main food source for barnacle larvae) was consistently higher at Whitehead just before the barnacle recruitment season, when barnacle larvae were in the water column. These observations suggest bottom-up forcing of intertidal communities. The underlying mechanisms and their intensity along the NW Atlantic coast could be investigated through studies done at local and regional scales. PMID:26213609

  7. A bottom-up perspective on leadership of collaborative innovation in the public sector

    DEFF Research Database (Denmark)

    Hansen, Jesper Rohr

    The thesis investigates how new forms of public leadership can contribute to solving complex problems in today’s welfare societies through innovation. A bottom-up type of leadership for collaborative innovation addressing wicked problems is theorised, displaying a social constructive process...... approach to leadership; a theoretical model emphasises that leadership emerges through social processes of recognition. Leadership is recognised by utilising the uncertainty of a wicked problem and innovation to influence collaborators’ sensemaking processes. The empirical setting is the City of Copenhagen....... A crucial condition for success is iterative leadership adaptation. In conclusion, the thesis finds that specialized professionals are indeed able to develop politically viable, innovative and collaborative solutions to wicked problems; and that such professionals are able to transform themselves...

  8. The potential of LCM to mainstream bottom-up eco-innovation and alternative thinking

    DEFF Research Database (Denmark)

    De Rosa, Michele; Ghose, Agneta

    2015-01-01

    . For this reason, under the LCM framework, a number of bottom-up eco-innovations and non-traditional approaches can be categorized, often arising in difficult economic contexts. However, it is not because of LCM that alternative solutions were found in these cases, but due to necessity. The potential of LCM and its...... to transform behaviors. Situated in a developed context, the drive of this experiment is to create an alternative framework to consumeristic society, which makes it easier for people to avoid or reduce consumption. The second case is an SME, Mitti Cool from Gujarat (west India), a developing context where...... the drive is instead limited economic resources. It is a good example of how the LCM framework can mainstream community knowledge: started in 2002 as a one-man project, Mitti Cool revisited the Jugaad concept of the pot-in-pot refrigerator to provide an affordable rural fridge suitable for dry areas that does...

  9. Unsupervised Tattoo Segmentation Combining Bottom-Up and Top-Down Cues

    Energy Technology Data Exchange (ETDEWEB)

    Allen, Josef D [ORNL

    2011-01-01

    Tattoo segmentation is challenging due to the complexity and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin and then distinguish tattoo from the other skin via top-down prior in the image itself. Tattoo segmentation with unknown number of clusters is transferred to a figure-ground segmentation. We have applied our segmentation algorithm on a tattoo dataset and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purpose.

  10. Bottom-up and top-down controls on picoplankton in the East China Sea

    Science.gov (United States)

    Guo, C.; Liu, H.; Zheng, L.; Song, S.; Chen, B.; Huang, B.

    2013-05-01

    Dynamics of picoplankton population distribution in the East China Sea (ECS), a marginal sea in the western North Pacific Ocean, were studied during two "CHOICE-C" cruises in August 2009 (summer) and January 2010 (winter). Dilution experiments were conducted during the two cruises to investigate the growth and grazing among picophytoplankton populations. Picoplankton accounted for an average of ~29% (range 2% to 88%) of community carbon biomass in the ECS, with lower percentages in the plume region than in the shelf and Kuroshio regions. Averaged growth rates (μ) for Prochlorococcus (Pro), Synechococcus (Syn) and picoeukaryotes (peuk) were 0.36, 0.89, 0.90 d-1, respectively, in summer, and 0.46, 0.58, 0.56 d-1, respectively, in winter. Seawater salinity and nutrient availability exerted significant controls on picoplankton growth rate. Averaged grazing mortality rates (m) were 0.46, 0.63, 0.68 d-1 in summer, and 0.22, 0.32, 0.22 d-1 in winter for Pro, Syn and peuk respectively. The three populations demonstrated very different distribution patterns regionally and seasonally, affected by both bottom-up and top-down controls. In summer, Pro, Syn and peuk were dominant in the Kuroshio, transitional and plume regions, respectively. Protist grazing consumed 84%, 78%, 73% and 45%, 47%, 57% of production for Pro, Syn and peuk in summer and winter respectively, suggesting more significant top-down controls in summer. In winter, all three populations tended to be distributed in offshore regions, although the area of coverage was different (peuk > Syn > Pro). Bottom-up factors can explain as much as 91.5%, 82% and 81.2% of Pro, Syn and peuk abundance variance in winter, while only 59.1% and 43.7% for Pro and peuk in summer. Regionally, Yangtze River discharge plays a significant role in affecting the intensity of top-down control, indicated by significant and negative association between salinity and grazing mortality of all three populations and higher grazing mortality to growth rate ratio
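
    Growth (μ) and grazing mortality (m) rates of the kind reported above are typically estimated from dilution experiments via the Landry-Hassett regression: the apparent growth rate in each bottle is regressed against the fraction of unfiltered seawater, with the intercept estimating μ and the negative slope estimating m. The sketch below shows that calculation on made-up bottle data; it is not the authors' dataset.

```python
# Landry-Hassett dilution analysis: regress apparent (net) growth rate
# against the fraction of unfiltered seawater. Intercept ~ intrinsic
# growth rate mu, negative slope ~ grazing mortality m. Data are made up.
import numpy as np

dilution = np.array([0.2, 0.4, 0.6, 0.8, 1.0])      # fraction of whole seawater
# Apparent growth rate k = ln(P_t / P_0) / t for each bottle (d^-1), invented:
k = np.array([0.78, 0.66, 0.52, 0.41, 0.29])

slope, intercept = np.polyfit(dilution, k, 1)
mu = intercept          # intrinsic growth rate (d^-1)
m = -slope              # grazing mortality (d^-1)

# m/mu is commonly used as an index of the proportion of production
# removed by grazers.
print(f"mu = {mu:.2f} d^-1, m = {m:.2f} d^-1, m/mu = {m / mu:.2f}")
```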

  11. Using dichotic listening to study bottom-up and top-down processing in children and adults.

    Science.gov (United States)

    Andersson, Martin; Llera, John Eric; Rimol, Lars M; Hugdahl, Kenneth

    2008-09-01

    The study examined top-down attention modulation of bottom-up processing in children and adults under conditions of varying bottom-up stimulus demands. Voiced and unvoiced consonant-vowel syllables were used in a dichotic-listening situation to manipulate the strength of the bottom-up stimulus-driven right ear advantage when subjects were instructed to focus attention on, and report, either the left or right ear stimulus. We predicted that children would differ from adults in their ability to use attention to modulate a lateralized ear advantage, and particularly when there was a conflict between the direction of the bottom-up ear advantage and the direction of the top-down attention instruction. Thirty children and 30 adults were presented with dichotic presentations of consonant-vowel syllables. The results showed that the voicing manipulation affected the strength of the ear advantage, and that the children performed significantly below the adults when the voicing parameter caused a strong conflict between bottom-up and top-down processing. Thus, children seem to lack the cognitive flexibility necessary to modulate a stimulus-driven bottom-up ear advantage, particularly in situations where right ear advantage (REA) is enhanced by the acoustic properties of the stimuli and attentional demands require a left ear shift. It is suggested that varying the stimulus demands in a dichotic-listening situation may be a novel way to study cognitive development. PMID:18608228

  12. Preferential effect of isoflurane on top-down versus bottom-up pathways in sensory cortex

    Directory of Open Access Journals (Sweden)

    Aeyal eRaz

    2014-10-01

    Full Text Available The mechanism of loss of consciousness (LOC) under anesthesia is unknown. Because consciousness depends on activity in the cortico-thalamic network, anesthetic actions on this network are likely critical for LOC. Competing theories stress the importance of anesthetic actions on bottom-up ‘core’ thalamo-cortical (TC) versus top-down cortico-cortical (CC) and matrix TC connections. We tested these models using laminar recordings in rat auditory cortex in vivo and in murine brain slices. We selectively activated bottom-up vs. top-down afferent pathways using sensory stimuli in vivo and electrical stimulation in brain slices, and compared effects of isoflurane on responses evoked via the two pathways. Auditory stimuli in vivo and core TC afferent stimulation in brain slices evoked short-latency current sinks in middle layers, consistent with activation of core TC afferents. By contrast, visual stimuli in vivo and stimulation of CC and matrix TC afferents in brain slices evoked responses mainly in superficial and deep layers, consistent with projection patterns of top-down afferents that carry visual information to auditory cortex. Responses to auditory stimuli in vivo and core TC afferents in brain slices were significantly less affected by isoflurane compared to responses triggered by visual stimuli in vivo and CC/matrix TC afferents in slices. At a just-hypnotic dose in vivo, auditory responses were enhanced by isoflurane, whereas visual responses were dramatically reduced. At a comparable concentration in slices, isoflurane suppressed both core TC and CC/matrix TC responses, but the effect on the latter responses was far greater than on core TC responses, indicating that at least part of the differential effects observed in vivo were due to local actions of isoflurane in auditory cortex. These data support a model in which disruption of top-down connectivity contributes to anesthesia-induced LOC, and have implications for understanding the neural

  13. A bottom-up approach of stochastic demand allocation in water quality modelling

    Directory of Open Access Journals (Sweden)

    E. J. M. Blokker

    2010-04-01

    Full Text Available An "all pipes" hydraulic model of a drinking water distribution system was constructed with two types of demand allocations. One is constructed with the conventional top-down approach, i.e. a demand multiplier pattern from the booster station is allocated to all demand nodes with a correction factor to account for the average water demand on that node. The other is constructed with a bottom-up approach of demand allocation, i.e., each individual home is represented by one demand node with its own stochastic water demand pattern. This was done for a drinking water distribution system of approximately 10 km of mains and serving ca. 1000 homes. The system was tested in a real life situation.

    The stochastic water demand patterns were constructed with the end-use model SIMDEUM on a per second basis and per individual home. Before applying the demand patterns in a network model, some temporal aggregation was done. The flow entering the test area was measured and a tracer test with sodium chloride was performed to determine travel times. The two models were validated on the total sum of demands and on travel times.

    The study showed that the bottom-up approach leads to realistic water demand patterns and travel times, without the need for any flow measurements or calibration. In the periphery of the drinking water distribution system it is not possible to calibrate models on pressure, because head losses are too low. The study shows that in the periphery it is also difficult to calibrate on water quality (e.g., with tracer measurements), as a consequence of the high variability between days. The stochastic approach of hydraulic modelling gives insight into the variability of travel times as an added feature beyond the conventional way of modelling.
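
    To make the contrast between the two allocation strategies concrete, the sketch below builds a toy version of both: per-home stochastic pulse patterns (a crude stand-in for SIMDEUM output, which is not reproduced here) versus a single booster-station multiplier pattern scaled by each node's average demand. The pulse statistics and the scaled-down network size are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
N_HOMES, STEPS = 100, 24 * 60   # scaled-down illustration: 100 homes, 1-min resolution

def stochastic_home_demand():
    """Toy stand-in for a SIMDEUM-style end-use pattern: a few short
    rectangular pulses (taps, shower, toilet) at random times of day."""
    demand = np.zeros(STEPS)
    for _ in range(rng.poisson(12)):                 # ~12 water-use events per day
        start = rng.integers(0, STEPS - 10)
        duration = rng.integers(1, 10)               # 1-10 minutes
        demand[start:start + duration] += rng.uniform(3.0, 12.0)  # L/min
    return demand

# Bottom-up allocation: every demand node (home) gets its own stochastic pattern.
bottom_up = np.array([stochastic_home_demand() for _ in range(N_HOMES)])

# Top-down allocation: one multiplier pattern measured at the booster station,
# scaled per node by its average demand (the correction factor).
system_pattern = bottom_up.sum(axis=0)
multiplier = system_pattern / system_pattern.mean()
avg_per_home = bottom_up.mean(axis=1, keepdims=True)
top_down = avg_per_home * multiplier                 # identical shape at every node

# The totals match, but the bottom-up patterns are far spikier per node,
# which is what changes the modelled travel times in the periphery.
print(np.allclose(bottom_up.sum(), top_down.sum()))
print(bottom_up.std(axis=1).mean(), top_down.std(axis=1).mean())
```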

  14. Achieving social-ecological fit through bottom-up collaborative governance: an empirical investigation

    Directory of Open Access Journals (Sweden)

    Angela M. Guerrero

    2015-12-01

    Full Text Available Significant benefits can arise from collaborative forms of governance that foster self-organization and flexibility. Likewise, governance systems that fit with the extent and complexity of the system under management are considered essential to our ability to solve environmental problems. However, from an empirical perspective the fundamental question of whether self-organized (bottom-up) collaborative forms of governance are able to accomplish adequate fit is unresolved. We used new theory and methodological approaches underpinned by interdisciplinary network analysis to address this gap by investigating three governance challenges that relate to the problem of fit: shared management of ecological resources, management of interconnected ecological resources, and cross-scale management. We first identified a set of social-ecological network configurations that represent the hypothesized ways in which collaborative arrangements can contribute to addressing these challenges. Using social and ecological data from a large-scale biodiversity conservation initiative in Australia, we empirically determined how well the observed patterns of stakeholder interactions reflect these network configurations. We found that stakeholders collaborate to manage individual parcels of native vegetation, but not for the management of interconnected parcels. In addition, our data show that the collaborative arrangements enable management across different scales (local, regional, supraregional). Our study provides empirical support for the ability of collaborative forms of governance to address the problem of fit, but also suggests that in some cases the establishment of bottom-up collaborative arrangements would likely benefit from specific guidance to facilitate the establishment of collaborations that better align with the ways ecological resources are interconnected across the landscape. In our case study region, this would improve the capacity of stakeholders to

  15. Top-down instead of bottom-up estimates of uncertainty in INAA results?

    International Nuclear Information System (INIS)

    The initial publication of the ISO Guide to the Expression of Uncertainty in Measurement (GUM) and many related documents has resulted in a worldwide awareness of the importance of a realistic estimate of the value reported after the +/- sign. The evaluation of uncertainty in measurement, as introduced by the GUM, is derived from the principles applied in physical measurements. Many testing laboratories have already experienced large problems in applying these principles in e.g. (bio)chemical measurements, resulting in time-consuming evaluations and costly additional experiments. Other, more pragmatic and less costly approaches have been proposed to obtain a realistic estimate of the range in which the true value of the measurement may be found with a certain degree of probability. One of these approaches, the 'top-down method', is based on the standard deviation in the results of intercomparison data. This approach is much easier for tests for which it is either difficult to establish a full measurement equation, or if e.g. matrix-matching reference materials are absent. It has been demonstrated that the GUM 'bottom-up' approach of evaluating uncertainty in measurement can easily be applied in instrumental neutron activation analysis (INAA) as all significant sources of uncertainty can be evaluated. INAA is therefore a valuable technique to test the validity of the top-down approach. In this contribution, examples of the top-down evaluation of uncertainty in INAA derived from participation in intercomparison rounds and proficiency testing schemes will be presented. The results will be compared with the bottom-up evaluation of uncertainty, and the ease of applicability, validity and usefulness of both approaches will be discussed.
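
    The two evaluation routes described above can be reduced to a few lines of arithmetic. The sketch below contrasts a GUM-style bottom-up combination of uncertainty components in quadrature with a top-down estimate taken from the spread of intercomparison results; every numerical value is an illustrative placeholder, not data from the contribution.

```python
import math

# Bottom-up (GUM): combine the individual uncertainty components of an INAA
# result in quadrature (relative standard uncertainties, illustrative values only).
components = {
    "counting statistics": 0.010,
    "neutron flux / comparator": 0.012,
    "gamma-ray efficiency": 0.008,
    "sample mass and geometry": 0.005,
}
u_bottom_up = math.sqrt(sum(u**2 for u in components.values()))

# Top-down: take the relative standard deviation of results reported by the
# laboratory for the same measurand in intercomparison / proficiency rounds.
intercomparison_results = [101.2, 98.7, 100.5, 99.1, 102.0]  # e.g. mg/kg, illustrative
mean = sum(intercomparison_results) / len(intercomparison_results)
var = sum((x - mean) ** 2 for x in intercomparison_results) / (len(intercomparison_results) - 1)
u_top_down = math.sqrt(var) / mean

print(f"bottom-up combined relative uncertainty: {u_bottom_up:.3%}")
print(f"top-down relative uncertainty:           {u_top_down:.3%}")
```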

  16. Scaling up self-assembly: bottom-up approaches to macroscopic particle organization.

    Science.gov (United States)

    Lash, M H; Fedorchak, M V; McCarthy, J J; Little, S R

    2015-07-28

    This review presents an overview of recent work in the field of non-Brownian particle self-assembly. Compared to nanoparticles that naturally self-assemble due to Brownian motion, larger, non-Brownian particles (d > 6 μm) are less prone to autonomously organize into crystalline arrays. The tendency for particle systems to experience immobilization and kinetic arrest grows with particle radius. In order to overcome this kinetic limitation, some type of external driver must be applied to act as an artificial "thermalizing force" upon non-Brownian particles, inducing particle motion and subsequent crystallization. Many groups have explored the use of various agitation methods to overcome the natural barriers preventing self-assembly to which non-Brownian particles are susceptible. The ability to create materials from a bottom-up approach with these characteristics would allow for precise control over their pore structure (size and distribution) and surface properties (topography, functionalization and area), resulting in improved regulation of key characteristics such as mechanical strength, diffusive properties, and possibly even photonic properties. This review will highlight these approaches, as well as discuss the potential impact of bottom-up macroscale particle assembly. The applications of such technology range from customizable and autonomously self-assembled niche microenvironments for drug delivery and tissue engineering to new acoustic dampening, battery, and filtration materials, among others. Additionally, crystals made from non-Brownian particles resemble naturally derived materials such as opals, zeolites, and biological tissue (i.e. bone, cartilage and lung), due to their high surface area, pore distribution, and tunable (multilevel) hierarchy. PMID:25947543

  17. An Improved Model of Producing Saliency Map for Visual Attention System

    Science.gov (United States)

    Huang, Jingang; Kong, Bin; Cheng, Erkang; Zheng, Fei

    The iLab Neuromorphic Vision Toolkit (iNVT), steadily kept up to date by the group around Laurent Itti, is one of the currently best-known attention systems. Their model of bottom-up, saliency-based visual attention, as well as its implementation, serves as a basis for many research groups. How the feature maps are finally combined into the saliency map is a key point for this kind of visual attention system. We modified the original model of Laurent Itti so that its output corresponds more closely with human perception.
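
    As a point of reference for the combination step discussed above, the sketch below implements a generic Itti-style normalization and weighted summation of feature (conspicuity) maps into a single saliency map. It is an assumed, simplified reconstruction of the standard iNVT scheme, not the authors' specific modification; the block size and weights are arbitrary.

```python
import numpy as np

def normalize_map(fmap, eps=1e-9, block=16):
    """Itti-style map normalization N(.): rescale to [0, 1], then weight the map
    by (global max - mean of local maxima)^2 so that maps with a single strong
    peak contribute more than maps with many comparable peaks."""
    fmap = (fmap - fmap.min()) / (fmap.max() - fmap.min() + eps)
    h, w = fmap.shape
    # crude local-maxima estimate: maxima of non-overlapping blocks
    local_max = [fmap[i:i + block, j:j + block].max()
                 for i in range(0, h, block) for j in range(0, w, block)]
    m_bar = np.mean(local_max)
    return fmap * (fmap.max() - m_bar) ** 2

def saliency_map(feature_maps, weights=None):
    """Combine per-feature conspicuity maps (intensity, colour, orientation, ...)
    into a single saliency map by normalization and weighted summation."""
    if weights is None:
        weights = [1.0] * len(feature_maps)
    combined = sum(w * normalize_map(f) for w, f in zip(weights, feature_maps))
    return combined / (combined.max() + 1e-9)

# toy example: three random "conspicuity maps"
rng = np.random.default_rng(1)
maps = [rng.random((128, 128)) for _ in range(3)]
S = saliency_map(maps)
print(S.shape, float(S.max()))
```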

  18. Olfaction spontaneously highlights visual saliency map.

    Science.gov (United States)

    Chen, Kepu; Zhou, Bin; Chen, Shan; He, Sheng; Zhou, Wen

    2013-10-01

    Attention is intrinsic to our perceptual representations of sensory inputs. Best characterized in the visual domain, it is typically depicted as a spotlight moving over a saliency map that topographically encodes strengths of visual features and feedback modulations over the visual scene. By introducing smells to two well-established attentional paradigms, the dot-probe and the visual-search paradigms, we find that a smell reflexively directs attention to the congruent visual image and facilitates visual search of that image without the mediation of visual imagery. Furthermore, such an effect is independent of, and can override, top-down bias. We thus propose that smell quality acts as an object feature whose presence enhances the perceptual saliency of that object, thereby guiding the spotlight of visual attention. Our discoveries provide robust empirical evidence for a multimodal saliency map that weighs not only visual but also olfactory inputs.

  19. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Directory of Open Access Journals (Sweden)

    Merger Eduard

    2012-08-01

    Full Text Available Abstract Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+, we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 per tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 per tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs), ranging between US$ 4.5 - 12.2 per tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 per tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs, in a range of US$ 0.06 – 0.11 per tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to
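
    The core arithmetic behind the opportunity-cost element is simple and is sketched below: the difference in net present value between the deforestation driver and the standing forest, divided by the CO2 released on conversion. All profits, the discount rate and the carbon stock value are invented placeholders, not figures from the Tanzanian projects.

```python
# Opportunity cost of avoided deforestation, expressed per tonne of CO2:
# (NPV of the alternative land use - NPV of keeping the forest) divided by
# the CO2 that would be emitted if the forest were converted.
# All numbers below are illustrative placeholders, not values from the study.

def npv(annual_profit_usd_per_ha, years=30, discount_rate=0.10):
    return sum(annual_profit_usd_per_ha / (1 + discount_rate) ** t
               for t in range(1, years + 1))

carbon_stock_tC_per_ha = 80.0                  # above-ground carbon at risk
co2_per_ha = carbon_stock_tC_per_ha * 44 / 12  # convert tC to tCO2

npv_agriculture = npv(annual_profit_usd_per_ha=250.0)      # driver, e.g. maize
npv_standing_forest = npv(annual_profit_usd_per_ha=40.0)   # e.g. NTFPs, fuel wood

opportunity_cost = (npv_agriculture - npv_standing_forest) / co2_per_ha
print(f"opportunity cost: {opportunity_cost:.1f} US$/tCO2")
```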

  20. How can bottom-up greenhouse gas flux quantification in urban systems be relevant to both carbon science and policy?

    Science.gov (United States)

    Gurney, K. R.

    2014-12-01

    Scientific research on spatially resolved, bottom-up quantification of greenhouse gas (GHG) emissions at urban scales has advanced considerably in the last decade. It has been primarily focused on contributing prior information to top-down approaches aimed at GHG emissions validation via atmospheric monitoring of GHG mixing ratios. However, bottom-up quantification can make a number of other contributions to both scientific and policy topics. In order to do so, it must expand beyond current capabilities. Among these needs are the quantification of both consumption- and production-based data products, utilization of remote sensing, prognostic capabilities, expansion outside of the US, and uncertainty quantification. Such advances will allow it to make significant contributions to scientific research on urban science and energy analysis. In the arena of climate change policy, spatially resolved, bottom-up quantification efforts can baseline and guide urban emissions mitigation, educate and engage the public, and offer a much more consistent and comprehensive means to compare cities across national and international domains. It can also find inconsistencies in existing reported regulatory data, as demonstrated by recent bottom-up research examining US power plant CO2 emissions. I will review the current state of bottom-up GHG emission quantification and discuss the opportunities and challenges associated with satisfying both climate change science and policy needs.

  1. Pressurized Pepsin Digestion in Proteomics: An Automatable Alternative to Trypsin for Integrated Top-down Bottom-up Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Ferrer, Daniel; Petritis, Konstantinos; Robinson, Errol W.; Hixson, Kim K.; Tian, Zhixin; Lee, Jung Hwa; Lee, Sang-Won; Tolic, Nikola; Weitz, Karl K.; Belov, Mikhail E.; Smith, Richard D.; Pasa-Tolic, Ljiljana

    2011-02-01

    Integrated top-down bottom-up proteomics combined with online digestion has great potential to improve the characterization of protein isoforms in biological systems and is amenable to high-throughput proteomics experiments. Bottom-up proteomics ultimately provides the peptide sequences derived from the tandem MS analyses of peptides after the proteome has been digested. Top-down proteomics conversely entails the MS analyses of intact proteins for more effective characterization of genetic variations and/or post-translational modifications (PTMs). Herein, we describe recent efforts towards efficient integration of bottom-up and top-down LC-MS-based proteomic strategies. Since most proteomic platforms (i.e., LC systems) operate in acidic environments, we exploited the acid compatibility of pepsin (i.e., the enzyme’s natural activity at acidic pH) for the integration of bottom-up and top-down proteomics. Pressure-enhanced pepsin digestions were successfully performed and characterized with several standard proteins in either an offline mode using a Barocycler or an online mode using a modified high-pressure LC system referred to as a fast online digestion system (FOLDS). FOLDS was tested using pepsin and a whole microbial proteome, and the results compared against traditional trypsin digestions on the same platform. Additionally, FOLDS was integrated with a RePlay configuration to demonstrate an ultra-rapid integrated bottom-up top-down proteomic strategy employing a standard mixture of proteins and a monkeypox virus proteome.

  2. Bottom-up and top-down influences at untrained conditions determine perceptual learning specificity and transfer

    Science.gov (United States)

    Xiong, Ying-Zi; Zhang, Jun-Yun; Yu, Cong

    2016-01-01

    Perceptual learning is often orientation and location specific, which may indicate neuronal plasticity in early visual areas. However, learning specificity diminishes with additional exposure of the transfer orientation or location via irrelevant tasks, suggesting that the specificity is related to untrained conditions, likely because neurons representing untrained conditions are neither bottom-up stimulated nor top-down attended during training. To demonstrate these top-down and bottom-up contributions, we applied a “continuous flash suppression” technique to suppress the exposure stimulus below conscious awareness, with additional manipulations to achieve pure bottom-up stimulation or top-down attention for the transfer condition. We found that either bottom-up or top-down influences enabled significant transfer of orientation and Vernier discrimination learning. These results suggest that learning specificity may result from under-activations of untrained visual neurons due to insufficient bottom-up stimulation and/or top-down attention during training. High-level perceptual learning thus may not functionally connect to these neurons for learning transfer. DOI: http://dx.doi.org/10.7554/eLife.14614.001 PMID:27377357

  3. Top-down and bottom-up regulation of macroalgal community structure on a Kenyan reef

    Science.gov (United States)

    Mörk, Erik; Sjöö, Gustaf Lilliesköld; Kautsky, Nils; McClanahan, Tim R.

    2009-09-01

    Top-down and bottom-up regulation in the form of grazing by herbivores and nutrient availability are important factors governing macroalgal communities in the coral reef ecosystem. Today, anthropogenic activities, such as over-harvesting of herbivorous fish and sea urchins and increased nutrient loading, are altering the interaction of these two structuring forces. The present study was conducted in Kenya and investigates the relative importance of herbivory and nutrient loading on macroalgal community dynamics, by looking at alterations in macroalgal functional groups, species diversity (H') and biomass within experimental quadrats. The experiment was conducted in situ for 42 days during the dry season. Cages excluding large herbivorous fish and sea urchins were used in the study and nutrient addition was conducted using coated, slow-release fertilizer (nitrogen and phosphorus) at a site where herbivory is generally low and nutrient levels are relatively high for the region. Nutrient addition increased tissue nutrient content in the algae, and fertilized quadrats had 24% higher species diversity. Herbivore exclusion resulted in a 77% increase in algal biomass, mainly attributable to a >1000% increase in corticated forms. These results are in accordance with similar studies in other regions, but are unique in that they indicate that, even when prevailing nutrient levels are relatively high and herbivore pressure is relatively low, continued anthropogenic disturbance results in further ecological responses and increased reef degradation.

  4. The Early Anthropogenic Hypothesis: Top-Down and Bottom-up Evidence

    Science.gov (United States)

    Ruddiman, W. F.

    2014-12-01

    Two complementary lines of evidence support the early anthropogenic hypothesis. Top-down evidence comes from comparing Holocene greenhouse-gas trends with those during equivalent intervals of previous interglaciations. The increases in CO2 and CH4 during the late Holocene are anomalous compared to the decreasing trends in a stacked average of previous interglaciations, thereby supporting an anthropogenic origin. During interglacial stage 19, the closest Holocene insolation analog, CO2 fell to 245 ppm by the time equivalent to the present, in contrast to the observed pre-industrial rise to 280-285 ppm. The 245-ppm level measured in stage 19 falls at the top of the natural range predicted by the original anthropogenic hypothesis of Ruddiman (2003). Bottom-up evidence comes from a growing list of archeological and other compilations showing major early anthropogenic transformations of Earth's surface. Key examples include: efforts by Dorian Fuller and colleagues mapping the spread of irrigated rice agriculture across southern Asia and its effects on CH4 emissions prior to the industrial era; an additional effort by Fuller showing the spread of methane-emitting domesticated livestock across Asia and Africa (coincident with the spread of fertile crescent livestock across Europe); historical compilations by Jed Kaplan and colleagues documenting very high early per-capita forest clearance in Europe, thus underpinning simulations of extensive pre-industrial clearance and large CO2 emissions; and wide-ranging studies by Erle Ellis and colleagues of early anthropogenic land transformations in China and elsewhere.

  5. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    Science.gov (United States)

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-01

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also and most importantly scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed using the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem. PMID:26709623
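
    A multi-engine workflow of the kind described can be scripted in a few lines. The sketch below is an assumed usage pattern of Ursgal's UController; the profile name, engine identifiers, parameter keys and the combination call depend on the installed Ursgal version and the external search engines available locally, so treat them as placeholders to be checked against the current documentation.

```python
# Minimal sketch of scripting a multi-engine search with Ursgal's Python API.
# Engine names, parameter keys and file paths are assumptions and depend on the
# installed Ursgal version and the external search engines available locally.
import ursgal

uc = ursgal.UController(
    profile="LTQ XL low res",                  # instrument profile (assumed name)
    params={"database": "target_decoy.fasta"}  # path to your FASTA database
)

results = []
for engine in ("xtandem_vengeance", "omssa_2_1_9", "msgfplus_v2016_09_16"):
    search_result = uc.search(input_file="sample.mzML", engine=engine)
    validated = uc.validate(input_file=search_result, engine="percolator_2_08")
    results.append(validated)

# Merge the statistically post-processed outputs of the individual engines
# (the paper's "combined FDR" / "combined PEP" idea); exact call name may differ.
combined = uc.combine_search_results(input_files=results, engine="combine_FDR_0_1")
print(combined)
```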

  6. Tracking and Removing Br during the Bottom-Up Synthesis of a Graphene Nanoribbon

    Science.gov (United States)

    Bronner, Christopher; Björk, Jonas; Tegeder, Petra

    Thermally induced, two-step bottom-up synthesis from halogen-substituted molecular precursors adsorbed at metal surfaces is an intriguing concept for obtaining graphene nanoribbons with well-defined edge structure and widths on the nanometer scale. The reaction pathways of the dissociated Br atoms have so far not been a focus of research, although they may very well interfere with the on-surface synthesis. Using temperature-programmed desorption we show that Br leaves the surface as HBr in an associative desorption process during the second reaction step, the cyclodehydrogenation. Density functional theory is employed to compare this process to the competing desorption of molecular hydrogen and furthermore shows that prior to desorption, Br is submerged under the three-dimensional intermediate reaction product, polyanthrylene. Upon exposure of this intermediate co-adsorbate to an atmosphere of molecular hydrogen, Br is removed from the surface but the cyclodehydrogenation step is still feasible, which demonstrates that Br does not influence the on-surface synthesis. Generally, the ability to remove Br by providing molecular hydrogen opens an effective way to exclude unfavorable influences of the halogen (e.g. side-products, steric effects) in on-surface coupling reactions.

  7. Peptide Hydration Phenomena through a Combined Quantum Chemical and Bottom-Up Approach

    Science.gov (United States)

    Lanza, Giuseppe; Chiacchio, Maria Assunta

    2016-09-01

    The M06-2X, TPSS, and B3PW91 density functionals and the classical ab initio MP2 method were used to study microsolvation around the protonated trialanine, Ala3H+. All adopted electronic structure approaches show the formation of wires or compact ring clusters of water molecules strongly bonded to peptidic polar groups through hydrogen bonds, with hydration energies ranging from -93 to -66 kcal mol-1. Independently of the adopted electronic structure methods, explicit water molecules favor the peptidic chain in the polyproline II (PPII) conformation; thus the electronic energy stability order of the four unfolded conformers follows the sequence PPII-PPII > β-PPII ≈ PPII-β > β-β, while entropy favors the reversed order. The delicate balance of electronic energy (or enthalpy) and entropy modulated by the temperature accounts for the change in abundance of the PPII and β conformations experimentally observed. The proposed bottom-up approach was developed by following the energetically dominant interactions between the polar groups of the peptide and the water dipoles. The intrapeptide dipole decoupling, caused by the β → PPII transformation, and the consequent greater dipole coupling with water molecules provide a rational basis to explain the energy gain due to the explicit water coordination to PPII residues.

  8. Achieving integrated urban water management: planning top-down or bottom-up?

    Science.gov (United States)

    Gabe, J; Trowsdale, S; Vale, R

    2009-01-01

    Integrated Urban Water Management (IUWM) acknowledges a broad range of environmental and socio-economic outcomes, but the link between design intentions and operational performance is not always clear. This may be due in part to a lack of shared principles that remove bias and inconsistency in assessing the operational performance of IUWM. This paper investigates the possibility of developing shared principles through examination of shared objectives and shared indicators within two logical and integrated frameworks for urban residential developments that aspire to IUWM and sustainable development. The framework method was applied using very different approaches: one a top-down urban planning process, the other a bottom-up community consultation process. Both frameworks highlight the extent to which IUWM is part of a broad social and environmental system. Core environmental performance objectives and indicators were very similar, highlighting the potential to develop shared principles in reporting and benchmarking the environmental performance of neighbourhood developments. Socio-economic indicators were highly variable due to process and likely contextual differences; it is thus unclear if the influence of IUWM on these variables can transcend the social context unless the practice of urban water management can expand its core responsibility beyond "hard" physical infrastructure. PMID:19474495

  9. Template-Free Bottom-Up Method for Fabricating Diblock Copolymer Patchy Particles.

    Science.gov (United States)

    Ye, Xianggui; Li, Zhan-Wei; Sun, Zhao-Yan; Khomami, Bamin

    2016-05-24

    Patchy particles are one of the most important building blocks for hierarchical structures because of the discrete patches on their surface. We have demonstrated a convenient, simple, and scalable bottom-up method for fabricating diblock copolymer patchy particles through both experiments and dissipative particle dynamics (DPD) simulations. The experimental method simply involves reducing the solvent quality of the diblock copolymer solution by the slow addition of a nonsolvent. Specifically, the fabrication of diblock copolymer patchy particles begins with a crew-cut soft-core micelle, where the micelle core is significantly swollen by the solvent. With water addition at an extremely slow rate, the crew-cut soft-core micelles first form a larger crew-cut micelle. With further water addition, the corona-forming blocks of the crew-cut micelles begin to aggregate and eventually form well-defined patches. Both experiments and DPD simulations indicate that the number of patches has a very strong dependence on the diblock copolymer composition: a particle has more patches on its surface when the volume fraction of the patch-forming blocks is lower. Furthermore, particles with more patches have a greater ability to assemble, and particles with fewer patches have a greater ability to merge once assembled. PMID:27109249

  10. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    Science.gov (United States)

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-01

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also and most importantly scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed using the Python scripting language using a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.

  11. Top-down silicon microcantilever with coupled bottom-up silicon nanowire for enhanced mass resolution

    International Nuclear Information System (INIS)

    A stepped cantilever composed of a bottom-up silicon nanowire coupled to a top-down silicon microcantilever, electrostatically actuated and with capacitive or optical readout, is fabricated and analyzed, both theoretically and experimentally, for mass sensing applications. The mass sensitivity at the nanowire free end and the frequency resolution considering thermomechanical noise are computed for different nanowire dimensions. The results obtained show that the coupled structure presents a very good mass sensitivity thanks to the nanowire, where the mass depositions take place, while also presenting a very good frequency resolution due to the microcantilever, where the transduction is carried out. A two-fold improvement in mass sensitivity with respect to that of the microcantilever standalone is experimentally demonstrated, and at least an order-of-magnitude improvement is theoretically predicted by changing only the nanowire length. Very close frequency resolutions are experimentally measured and theoretically predicted for a standalone microcantilever and for a microcantilever-nanowire coupled system. Thus, an improvement in mass sensing resolution of the microcantilever-nanowire stepped cantilever is demonstrated with respect to that of the microcantilever standalone. (paper)
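
    The trade-off exploited by the stepped cantilever can be illustrated with the textbook point-mass approximation for a resonant mass sensor, sketched below. The fabricated device is not modelled here; the masses, stiffnesses and noise floor are assumed round numbers chosen only to show why a low-mass nanowire free end improves the minimum detectable mass at a comparable frequency resolution.

```python
import math

# Standard point-mass-at-the-free-end approximation for a resonant mass sensor:
#   f0 = (1 / 2*pi) * sqrt(k / m_eff),   delta_f ~ -(f0 / (2 * m_eff)) * delta_m
# so the minimum detectable mass is  delta_m_min ~ delta_f_min / responsivity.
# All dimensions below are illustrative, not those of the fabricated device.

def resonator(m_eff_kg, k_N_per_m):
    f0 = math.sqrt(k_N_per_m / m_eff_kg) / (2 * math.pi)   # resonance frequency, Hz
    responsivity = f0 / (2 * m_eff_kg)                     # Hz per kg of added mass
    return f0, responsivity

for label, m_eff, k in [("microcantilever alone (~1 ng, 1 N/m)", 1e-12, 1.0),
                        ("nanowire free end (~1 pg, 1 mN/m)",    1e-15, 1e-3)]:
    f0, R = resonator(m_eff, k)
    delta_f_min = 1.0                                      # Hz, assumed equal noise floor
    delta_m_min_ag = delta_f_min / R * 1e21                # kg -> attograms
    print(f"{label}: f0 = {f0/1e3:.0f} kHz, min detectable mass ~ {delta_m_min_ag:.0f} ag")
```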

  12. Rational design of modular circuits for gene transcription: A test of the bottom-up approach

    Directory of Open Access Journals (Sweden)

    Giordano Emanuele

    2010-11-01

    Full Text Available Abstract Background Most synthetic circuits developed so far have been designed by an ad hoc approach, using a small number of components (i.e., LacI, TetR) and a trial-and-error strategy. We are at the point where an increasing number of modular, interchangeable and well-characterized components is needed to expand the construction of synthetic devices and to allow a rational approach to the design. Results We used interchangeable modular biological parts to create a set of novel synthetic devices for controlling gene transcription, and we developed a mathematical model of the modular circuits. Model parameters were identified by experimental measurements from a subset of modular combinations. The model revealed an unexpected feature of the lactose repressor system, i.e. a residual binding affinity for the operator site by induced lactose repressor molecules. Once this residual affinity was taken into account, the model properly reproduced the experimental data from the training set. The parameters identified in the training set allowed the prediction of the behavior of networks not included in the identification procedure. Conclusions This study provides new quantitative evidence that the use of independent and well-characterized biological parts and mathematical modeling, what is called a bottom-up approach to the construction of gene networks, can allow the design of new and different devices re-using the same modular parts.
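
    The residual-affinity feature reported above can be captured by a minimal Hill-type transcription model, sketched below under assumed parameter values: the induced (inducer-bound) repressor is still allowed to occupy the operator, only with a much weaker affinity, which caps the fully induced transcription rate below its theoretical maximum.

```python
# Minimal Hill-type model of a LacI-repressed promoter in which the *induced*
# (IPTG-bound) repressor retains a residual affinity for the operator, the
# feature highlighted by the study. All parameter values are illustrative.

def transcription_rate(iptg, k_max=100.0, k_basal=2.0,
                       K_iptg=40.0, n=2.0,
                       K_op_free=1.0, K_op_induced=50.0, repressor_total=10.0):
    # fraction of repressor molecules bound by the inducer
    bound = iptg**n / (K_iptg**n + iptg**n)
    free_rep, induced_rep = repressor_total * (1 - bound), repressor_total * bound
    # effective occupancy of the operator: induced repressor still binds,
    # only with a much weaker (residual) affinity, K_op_induced >> K_op_free
    occupancy = free_rep / K_op_free + induced_rep / K_op_induced
    return k_basal + (k_max - k_basal) / (1.0 + occupancy)

for iptg in (0, 10, 100, 1000):   # inducer concentration, arbitrary units
    print(f"IPTG = {iptg:5d}: transcription rate = {transcription_rate(iptg):6.1f}")
```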

  13. Optical and electronic properties study of bottom-up graphene nanoribbons for photovoltaic applications

    Science.gov (United States)

    Villegas, Cesar E. P.; Rocha, Alexandre

    2015-03-01

    Graphene nanoribbons (GNRs) turn out to be serious contenders for several optoelectronic applications due to their physical properties. Recently, bottom-up methods, using the assembly of appropriate precursor molecules, were shown to be an exciting pathway towards making precise nanoribbons. In particular, it has been demonstrated that so-called cove-shaped GNRs absorb light in the visible part of the spectrum, suggesting they could be used for photovoltaic applications. In solar cells, the key ingredient is the presence of excitons and their subsequent diffusion along a donor material. This is influenced by the character of the different excitations taking place, as well as the exciton binding energy. Thus, in this work we use many-body corrected density functional theory to simulate the optical properties of these nanoribbons. We elucidate the most important transitions occurring in these systems, and identify types of excitations that have not been previously observed in conventional nanoribbons. We also find that the exciton binding energies for all the structures we considered are in the eV range, which enhances the diffusion lengths for the particle-hole pairs. Finally, we estimate the potential of these systems as solar cells by calculating the short-circuit current. The authors thank FAPESP for financial support.

  14. Kelps across the portuguese coast: evidence of top-down and bottom-up influences

    Directory of Open Access Journals (Sweden)

    Joao N Franco

    2014-04-01

    Full Text Available Kelps (large brown seaweeds) are conspicuous elements of the Portuguese coast, although kelp abundance is declining, especially at central and southern Portugal. While many studies point out increased seawater temperature as the main factor explaining kelp decline, little attention has been given to top-down (predatory) influences. Through in situ experiments, we tested how herbivory affects the distribution and abundance of kelp recruits. We compared the abundance and survivorship of recruits, the intensity of grazing on recruits and the abundance of herbivores between Viana do Castelo (northern Portugal) and Peniche (central Portugal). In addition, through an outdoor mesocosm experiment, we disentangled the independent and interactive effects of both seawater temperature and nutrients on kelp recruit performance (mortality, growth, C, N and carbohydrate contents, and photosynthetic potential and efficiency). Our main results showed that herbivore abundance and herbivory intensity are higher in central than in northern Portugal and that the growth of juvenile sporophytes was larger under high-nutrient scenarios, particularly when temperatures were low (12, 15 and 16 °C). Although the presence and fitness of kelps across southern European waters have been exclusively linked to bottom-up influences, our results highlight the importance of top-down effects for kelp survivorship and performance.

  15. A Bottom-Up Engineered Broadband Optical Nanoabsorber for Radiometry and Energy Harnessing Applications

    Science.gov (United States)

    Kaul, Anupama B.; Coles, James B.; Megerian, Krikor G.; Eastwood, Michael; Green, Robert O.; Bandaru, Prabhakar R.

    2013-01-01

    Optical absorbers based on vertically aligned multi-walled carbon nanotubes (MWCNTs), synthesized using electric-field assisted growth, are described here that show an ultra-low reflectance, 100× lower than Au-black, over the wavelength range λ ≈ 350 nm - 2.5 μm. A bi-metallic Co/Ti layer was shown to catalyze a high site density of MWCNTs on metallic substrates and the optical properties of the absorbers were engineered by controlling the bottom-up synthesis conditions using dc plasma-enhanced chemical vapor deposition (PECVD). Reflectance measurements on the MWCNT absorbers after heating them in air to 400 °C showed negligible changes in reflectance, which remained low, approximately 0.022% at λ ≈ 2 μm. In contrast, the percolated structure of the reference Au-black samples collapsed completely after heating, causing the optical response to degrade at temperatures as low as 200 °C. The high optical absorption efficiency of the MWCNT absorbers, synthesized on metallic substrates, over a broad spectral range, coupled with their thermal ruggedness, suggests they have promise in solar energy harnessing applications, as well as in thermal detectors for radiometry.

  16. Bottom-Up Abstract Modelling of Optical Networks-on-Chip: From Physical to Architectural Layer

    Directory of Open Access Journals (Sweden)

    Alberto Parini

    2012-01-01

    Full Text Available This work presents a bottom-up abstraction procedure based on the design-flow FDTD + SystemC suitable for the modelling of optical Networks-on-Chip. In this procedure, a complex network is decomposed into elementary switching elements whose input-output behavior is described by means of scattering-parameter models. The parameters of each elementary block are then determined through 2D-FDTD simulation, and the resulting analytical models are exported within functional blocks in the SystemC environment. The inherent modularity and scalability of the S-matrix formalism are preserved inside SystemC, thus allowing the incremental composition and successive characterization of complex topologies typically out of reach for full-vectorial electromagnetic simulators. The consistency of the outlined approach is verified, in the first instance, by performing a SystemC analysis of a four-input, four-output port switch and making a comparison with the results of 2D-FDTD simulations of the same device. Finally, a further complex network encompassing 160 microrings is investigated, the losses over each routing path are calculated, and the minimum amount of power needed to guarantee an assigned BER is determined. This work is a basic step in the direction of an automatic technology-aware network-level simulation framework capable of assembling complex optical switching fabrics, while at the same time assessing the practical feasibility and effectiveness at the physical/technological level.

  17. A Bottom-Up Approach for Automatically Grouping Sensor Data Layers by their Observed Property

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-01-01

    Full Text Available The Sensor Web is a growing phenomenon where an increasing number of sensors are collecting data in the physical world, to be made available over the Internet. To help realize the Sensor Web, the Open Geospatial Consortium (OGC) has developed open standards to standardize the communication protocols for sharing sensor data. Spatial Data Infrastructures (SDIs) are systems that have been developed to access, process, and visualize geospatial data from heterogeneous sources, and SDIs can be designed specifically for the Sensor Web. However, there are problems with interoperability associated with a lack of standardized naming, even with data collected using the same open standard. The objective of this research is to automatically group similar sensor data layers based on the phenomenon they measure. Our methodology is based on a unique bottom-up approach that uses text processing, approximate string matching, and semantic string matching of data layers. We use WordNet as a lexical database to compute word pair similarities and derive a set-based dissimilarity function using those scores. Two approaches are taken to group data layers: a mapping is defined between all the data layers, and clustering is performed to group similar data layers. We evaluate the results of our methodology.
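
    The WordNet step can be prototyped directly with NLTK, as in the sketch below: word-pair Wu-Palmer similarities are folded into a simple set-based dissimilarity between layer names. The tokenization, the aggregation rule and the example layer names are assumptions for illustration and are simpler than the approximate and semantic string matching actually used in the paper.

```python
import itertools
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)   # one-off corpus download

def word_similarity(w1, w2):
    """Best Wu-Palmer similarity over all noun synsets of the two words."""
    scores = [s1.wup_similarity(s2) or 0.0
              for s1 in wn.synsets(w1, pos=wn.NOUN)
              for s2 in wn.synsets(w2, pos=wn.NOUN)]
    return max(scores, default=0.0)

def layer_dissimilarity(name_a, name_b):
    """Set-based dissimilarity: 1 - average best-match similarity of the tokens."""
    ta, tb = name_a.lower().split("_"), name_b.lower().split("_")
    best = [max(word_similarity(a, b) for b in tb) for a in ta]
    return 1.0 - sum(best) / len(best)

# illustrative layer names with non-standardized naming
layers = ["air_temperature", "water_temperature", "wind_speed", "air_temp"]
for a, b in itertools.combinations(layers, 2):
    print(f"{a:18s} vs {b:18s}: dissimilarity = {layer_dissimilarity(a, b):.2f}")
```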

  18. Visionmaker NYC: A bottom-up approach to finding shared socioeconomic pathways in New York City

    Science.gov (United States)

    Sanderson, E. W.; Fisher, K.; Giampieri, M.; Barr, J.; Meixler, M.; Allred, S. B.; Bunting-Howarth, K. E.; DuBois, B.; Parris, A. S.

    2015-12-01

    Visionmaker NYC is a free, public participatory, bottom-up web application to develop and share climate mitigation and adaptation strategies for New York City neighborhoods. The goal is to develop shared socioeconomic pathways by allowing a broad swath of community members - from schoolchildren to architects and developers to the general public - to input their concepts for a desired future. Visions are composed of climate scenarios, lifestyle choices, and ecosystem arrangements, where ecosystems are broadly defined to include built ecosystems (e.g. apartment buildings, single family homes, etc.), transportation infrastructure (e.g. highways, connector roads, sidewalks), and natural land cover types (e.g. wetlands, forests, estuary). Metrics of water flows, carbon cycling, biodiversity patterns, and population are estimated for the user's vision, for the same neighborhood today, and for that neighborhood as it existed in the pre-development state, based on the Welikia Project (welikia.org). Users can keep visions private, share them with self-defined groups of other users, or distribute them publicly. Users can also propose "challenges" - specific desired states of metrics for specific parts of the city - and others can post visions in response. Visionmaker contributes by combining scenario planning, scientific modelling, and social media to create new, wide-open possibilities for discussion, collaboration, and imagination regarding future, shared socioeconomic pathways.

  19. A Novel GBM Saliency Detection Model Using Multi-Channel MRI.

    Directory of Open Access Journals (Sweden)

    Subhashis Banerjee

    Full Text Available The automatic computerized detection of regions of interest (ROI) is an important step in the process of medical image processing and analysis. The reasons are many, and include an increasing amount of available medical imaging data, the existence of inter-observer and inter-scanner variability, and the need to improve the accuracy of automatic detection in order to assist doctors in diagnosing faster and on time. A novel algorithm, based on visual saliency, is developed here for the identification of tumor regions from MR images of the brain. The GBM saliency detection model is designed by taking a cue from the concept of visual saliency in natural scenes. A visually salient region is typically rare in an image, and contains highly discriminating information, with attention getting immediately focused upon it. Although color is typically considered as the most important feature in a bottom-up saliency detection model, we circumvent this issue in the inherently gray scale MR framework. We develop a novel pseudo-coloring scheme, based on the three MRI sequences, viz. FLAIR, T2 and T1C (contrast-enhanced with gadolinium). A bottom-up strategy, based on a new pseudo-color distance and spatial distance between image patches, is defined for highlighting the salient regions in the image. This multi-channel representation of the image and saliency detection model help in automatically and quickly isolating the tumor region, for subsequent delineation, as is necessary in medical diagnosis. The effectiveness of the proposed model is evaluated on MRI of 80 subjects from the BRATS database in terms of the saliency map values. Using ground truth of the tumor regions for both high- and low-grade gliomas, the results are compared with four widely cited saliency detection models from the literature. In all cases the AUC scores from the ROC analysis are found to be more than 0.999 ± 0.001 over different tumor grades, sizes and positions.
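
    The sketch below illustrates the two ingredients named above, pseudo-coloring and patch-based bottom-up saliency, in a deliberately simplified form: the three co-registered sequences are stacked as normalized channels, and each patch is scored by its mean pseudo-color distance to the other patches, weighted by spatial proximity. The patch size, the weighting and the random stand-in slices are assumptions; this is not the authors' exact GBM saliency model.

```python
import numpy as np

def pseudo_color(flair, t2, t1c):
    """Stack the three co-registered MR sequences into an RGB-like image,
    each channel min-max normalized to [0, 1]."""
    def norm(x):
        x = x.astype(float)
        return (x - x.min()) / (x.max() - x.min() + 1e-9)
    return np.stack([norm(flair), norm(t2), norm(t1c)], axis=-1)

def patch_saliency(img, patch=8):
    """Saliency of each patch = mean pseudo-colour distance to all other patches,
    weighted so that nearby patches count more (local contrast)."""
    h, w, _ = img.shape
    feats, centers = [], []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            feats.append(img[y:y + patch, x:x + patch].mean(axis=(0, 1)))
            centers.append((y + patch / 2, x + patch / 2))
    feats, centers = np.array(feats), np.array(centers)
    color_d = np.linalg.norm(feats[:, None] - feats[None, :], axis=-1)
    spatial_d = np.linalg.norm(centers[:, None] - centers[None, :], axis=-1)
    spatial_d /= spatial_d.max() + 1e-9
    weights = 1.0 / (1.0 + 3.0 * spatial_d)
    sal = (color_d * weights).sum(axis=1) / weights.sum(axis=1)
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-9)

# toy slices standing in for one axial slice of FLAIR / T2 / T1C
rng = np.random.default_rng(0)
slices = [rng.random((128, 128)) for _ in range(3)]
saliency = patch_saliency(pseudo_color(*slices))
print(saliency.shape)   # one saliency value per image patch
```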

  20. Bottom-up synthesis of ordered metal/oxide/metal nanodots on substrates for nanoscale resistive switching memory

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-05-01

    The bottom-up approach using self-assembled materials/processes is thought to be a promising solution for next-generation device fabrication, but it is often found to be not feasible for use in real device fabrication. Here, we report a feasible and versatile way to fabricate high-density, nanoscale memory devices by direct bottom-up filling of memory elements. An ordered array of metal/oxide/metal (copper/copper oxide/copper) nanodots was synthesized with a uniform size and thickness defined by self-organized nanotemplate mask by sequential electrochemical deposition (ECD) of each layer. The fabricated memory devices showed bipolar resistive switching behaviors confirmed by conductive atomic force microscopy. This study demonstrates that ECD with bottom-up growth has great potential to fabricate high-density nanoelectronic devices beyond the scaling limit of top-down device fabrication processes.

  1. Bottom-up synthesis of ordered metal/oxide/metal nanodots on substrates for nanoscale resistive switching memory

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-01-01

    The bottom-up approach using self-assembled materials/processes is thought to be a promising solution for next-generation device fabrication, but it is often found to be not feasible for use in real device fabrication. Here, we report a feasible and versatile way to fabricate high-density, nanoscale memory devices by direct bottom-up filling of memory elements. An ordered array of metal/oxide/metal (copper/copper oxide/copper) nanodots was synthesized with a uniform size and thickness defined by self-organized nanotemplate mask by sequential electrochemical deposition (ECD) of each layer. The fabricated memory devices showed bipolar resistive switching behaviors confirmed by conductive atomic force microscopy. This study demonstrates that ECD with bottom-up growth has great potential to fabricate high-density nanoelectronic devices beyond the scaling limit of top-down device fabrication processes. PMID:27157385

  2. The Comparative Effect of Top-down Processing and Bottom-up Processing through TBLT on Extrovert and Introvert EFL

    Directory of Open Access Journals (Sweden)

    Pezhman Nourzad Haradasht

    2013-09-01

    Full Text Available This research seeks to examine the effect of two models of reading comprehension, namely top-down and bottom-up processing, on the reading comprehension of extrovert and introvert EFL learners. To do this, 120 learners out of a total of 170 intermediate learners studying at Iran Mehr English Language School were selected, all taking a PET (Preliminary English Test) first for homogenization prior to the study. They also answered the Eysenck Personality Inventory (EPI), which categorized them into introvert and extrovert subgroups within each reading model. All in all, there were four subgroups: 30 introverts and 30 extroverts undergoing the top-down processing treatment, and 30 introverts and 30 extroverts experiencing the bottom-up processing treatment. The aforementioned PET was administered as the post-test of the study after each group was exposed to the treatment for 18 sessions over six weeks. After the instruction finished, the mean scores of all four groups on this post-test were computed and a two-way ANOVA was run to test the four hypotheses raised in this study. The results showed that while learners generally benefitted more from the bottom-up processing setting compared to the top-down processing one, the extrovert group was better off receiving top-down instruction. Furthermore, introverts outperformed extroverts in the bottom-up group, yet no difference was seen between the two personality subgroups in the top-down setting. A predictable pattern of benefitting from the teaching procedures could not be drawn for introverts, as they benefitted more than extroverts in both top-down and bottom-up settings. Keywords: Reading comprehension, top-down processing, bottom-up processing, extrovert, introvert
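
    For readers who want to reproduce the style of analysis, the sketch below runs the 2 (processing) × 2 (personality) ANOVA on synthetic post-test scores with statsmodels; the group means and spread are invented solely to mimic the reported pattern and are not the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(42)
rows = []
# invented group means chosen only to resemble the reported pattern
means = {("bottom-up", "introvert"): 32, ("bottom-up", "extrovert"): 27,
         ("top-down", "introvert"): 28, ("top-down", "extrovert"): 29}
for (processing, personality), mu in means.items():
    for score in rng.normal(mu, 4, size=30):          # 30 learners per subgroup
        rows.append({"processing": processing, "personality": personality,
                     "score": score})
df = pd.DataFrame(rows)

model = smf.ols("score ~ C(processing) * C(personality)", data=df).fit()
print(anova_lm(model, typ=2))   # main effects and the interaction term
```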

  3. The Role of Top-Down Focused Spatial Attention in Preattentive Salience Coding and Salience-based Attentional Capture.

    Science.gov (United States)

    Bertleff, Sabine; Fink, Gereon R; Weidner, Ralph

    2016-08-01

    Selective visual attention requires an efficient coordination between top-down and bottom-up attention control mechanisms. This study investigated the behavioral and neural effects of top-down focused spatial attention on the coding of highly salient distractors and their tendency to capture attention. Combining spatial cueing with an irrelevant distractor paradigm revealed bottom-up based attentional capture only when attention was distributed across the whole search display, including the distractor location. Top-down focusing spatial attention on the target location abolished attentional capture of a salient distractor outside the current attentional focus. Functional data indicated that the missing capture effect was not based on diminished bottom-up salience signals at unattended distractor locations. Irrespectively of whether salient distractors occurred at attended or unattended locations, their presence enhanced BOLD signals at their respective spatial representation in early visual areas as well as in inferior frontal, superior parietal, and medial parietal cortex. Importantly, activity in these regions reflected the presence of a salient distractor rather than attentional capture per se. Moreover, successfully inhibiting attentional capture of a salient distractor at an unattended location further increased neural responses in medial parietal regions known to be involved in controlling spatial attentional shifts. Consequently, data provide evidence that top-down focused spatial attention prevents automatic attentional capture by supporting attentional control processes counteracting a spatial bias toward a salient distractor. PMID:27054402

  4. Quantifying the uncertainties of a bottom-up emission inventory of anthropogenic atmospheric pollutants in China

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2010-11-01

    Full Text Available The uncertainties of a national, bottom-up inventory of Chinese emissions of anthropogenic SO2, NOx, and particulate matter (PM) of different size classes and carbonaceous species are comprehensively quantified, for the first time, using Monte Carlo simulation. The inventory is structured by seven dominant sectors: coal-fired electric power, cement, iron and steel, other industry (boiler combustion), other industry (non-combustion processes), transportation, and residential. For each parameter related to emission factors or activity-level calculations, the uncertainties, represented as probability distributions, are either statistically fitted using results of domestic field tests or, when these are lacking, estimated based on foreign or other domestic data. The uncertainties (i.e., 95% confidence intervals) around the central estimates of Chinese emissions of SO2, NOx, total PM, PM10, PM2.5, black carbon (BC), and organic carbon (OC) in 2005 are estimated to be −14%~12%, −10%~36%, −10%~36%, −12%~42%, −16%~52%, −23%~130%, and −37%~117%, respectively. Variations in activity levels (e.g., energy consumption or industrial production) are not the main source of emission uncertainties. Due to narrow classification of source types, large sample sizes, and relatively high data quality, the coal-fired power sector is estimated to have the smallest emission uncertainties for all species except BC and OC. Due to poorer source classifications and a wider range of estimated emission factors, considerable uncertainties of NOx and PM emissions from cement production and boiler combustion in other industries are found. The probability distributions of emission factors for biomass burning, the largest source of BC and OC, are fitted based on very limited domestic field measurements, and special caution should thus be taken when interpreting these emission uncertainties. Although Monte
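
    The Monte Carlo machinery itself is compact, as the sketch below shows for a toy inventory: each sector's emissions are the product of an activity level and an emission factor, both drawn from assumed probability distributions, and the 95% confidence interval is read off the percentiles of the simulated total. Sector names, distribution shapes and parameter values are placeholders, not the inventory's inputs.

```python
import numpy as np

# Monte Carlo propagation of emission-factor and activity uncertainties to a
# sectoral emission total: E = sum_i A_i * EF_i. Sectors, distributions and
# parameter values are illustrative placeholders, not the inventory's data.
rng = np.random.default_rng(0)
N = 100_000

sectors = {
    # name: (activity level, rel. sd of activity, emission factor, rel. sd of EF)
    "coal-fired power": (1.0e9, 0.05, 8.0, 0.10),   # units arbitrary
    "cement":           (1.1e9, 0.08, 1.5, 0.25),
    "residential":      (3.0e8, 0.15, 4.0, 0.40),
}

total = np.zeros(N)
for activity, a_sd, ef, ef_sd in sectors.values():
    A = rng.normal(activity, a_sd * activity, N)
    # emission factors are often skewed; a lognormal keeps them positive
    EF = rng.lognormal(np.log(ef), ef_sd, N)
    total += A * EF

central = np.median(total)
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"95% CI: {lo/central - 1:+.0%} to {hi/central - 1:+.0%} around the central estimate")
```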

  5. Top-down or bottom-up: Contrasting perspectives on psychiatric diagnoses

    Directory of Open Access Journals (Sweden)

    Willem MA Verhoeven

    2008-09-01

    Full Text Available Willem MA Verhoeven, Siegfried Tuinier, Ineke van der Burgt (Vincent van Gogh Institute for Psychiatry, Venray; Erasmus University Medical Centre, Rotterdam; Radboud University Medical Centre, Nijmegen, The Netherlands). Abstract: Clinical psychiatry is confronted with the expanding knowledge of medical genetics. Most of the research into the genetic underpinnings of major mental disorders as described in the categorical taxonomies, however, revealed linkage with a variety of chromosomes. This heterogeneity of results is most probably due to the assumption that the nosological categories used in these studies are disease entities with clear boundaries. If the reverse way of looking, the so-called bottom-up approach, is applied, it becomes clear that genetic abnormalities are in most cases not associated with a single psychiatric disorder but with a certain probability of developing a variety of nonspecific psychiatric symptoms. The adequacy of the categorical taxonomy, the so-called top-down approach, seems to be inversely related to the amount of empirical etiological data. This is illustrated by four rather prevalent genetic syndromes, fragile X syndrome, Prader-Willi syndrome, 22q11 deletion syndrome, and Noonan syndrome, as well as by some cases with rare chromosomal abnormalities. From these examples, it becomes clear that psychotic symptoms as well as mood, anxiety, and autistic features can be found in a great variety of different genetic syndromes. A psychiatric phenotype exists, but comprises, apart from the chance of presenting several psychiatric symptoms, elements of developmental, neurocognitive, and physical characteristics. Keywords: genetic disorders, psychiatric symptoms, phenotype, mental disorders

  6. A bottom-up control on fresh-bedrock topography under landscapes.

    Science.gov (United States)

    Rempe, Daniella M; Dietrich, William E

    2014-05-01

    The depth to unweathered bedrock beneath landscapes influences subsurface runoff paths, erosional processes, moisture availability to biota, and water flux to the atmosphere. Here we propose a quantitative model to predict the vertical extent of weathered rock underlying soil-mantled hillslopes. We hypothesize that once fresh bedrock, saturated with nearly stagnant fluid, is advected into the near surface through uplift and erosion, channel incision produces a lateral head gradient within the fresh bedrock inducing drainage toward the channel. Drainage of the fresh bedrock causes weathering through drying and permits the introduction of atmospheric and biotically controlled acids and oxidants such that the boundary between weathered and unweathered bedrock is set by the uppermost elevation of undrained fresh bedrock, Zb. The slow drainage of fresh bedrock exerts a "bottom-up" control on the advance of the weathering front. The thickness of the weathered zone is calculated as the difference between the predicted topographic surface profile (driven by erosion) and the predicted groundwater profile (driven by drainage of fresh bedrock). For the steady-state, soil-mantled case, a coupled analytical solution arises in which both profiles are driven by channel incision. The model predicts a thickening of the weathered zone upslope and, consequently, a progressive upslope increase in the residence time of bedrock in the weathered zone. Two nondimensional numbers corresponding to the mean hillslope gradient and mean groundwater-table gradient emerge, and their ratio defines the proportion of the hillslope relief that is unweathered. Field data from three sites are consistent with model predictions.
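
    The geometry of the model, weathered-zone thickness as the gap between a surface profile set by erosion and a groundwater profile set by drainage to the channel, can be illustrated with the simple stand-in profiles below. The diffusive hillslope, the Dupuit-type water table and all parameter values are assumptions for illustration; the paper's coupled analytical solution differs in detail.

```python
import numpy as np

# Illustrative realisation of the geometry: weathered-zone thickness is the
# difference between an erosion-controlled topographic profile and a
# groundwater profile draining fresh bedrock toward the channel. The profile
# forms (diffusive hillslope, Dupuit-type water table) and parameters are
# simple stand-ins, not the paper's coupled analytical solution.
L = 100.0                    # hillslope length: channel at x = 0, divide at x = L (m)
x = np.linspace(0.0, L, 201)

E, D = 1e-4, 2e-2            # erosion rate (m/yr), soil transport coefficient (m^2/yr)
z_surface = (E / (2 * D)) * (2 * L * x - x**2)            # topographic profile (m)

R, K, h0 = 2e-5, 1e-2, 0.5   # drainage flux (m/yr), conductivity (m/yr), head at channel (m)
z_water = np.sqrt(h0**2 + (R / K) * (2 * L * x - x**2))   # water-table profile, Zb (m)

thickness = z_surface - z_water                           # weathered-zone thickness
print(f"weathered zone near the channel: {thickness[x <= 10].max():.1f} m, "
      f"at the divide: {thickness[-1]:.1f} m")            # thickens upslope
```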

  7. Do top-down or bottom-up forces determine Stephanitis pyrioides abundance in urban landscapes?

    Science.gov (United States)

    Shrewsbury, Paula M; Raupp, Michael J

    2006-02-01

    This study examined the influence of habitat structural complexity on the collective effects of top-down and bottom-up forces on herbivore abundance in urban landscapes. The persistence and varying complexity of urban landscapes set them apart from ephemeral agroecosystems and natural habitats where the majority of studies have been conducted. Using surveys and manipulative experiments, we explicitly tested the effect of natural enemies (enemies hypothesis), host plant quality, and herbivore movement on the abundance of the specialist insect herbivore, Stephanitis pyrioides, in landscapes of varying structural complexity. This herbivore was extremely abundant in simple landscapes and rare in complex ones. Natural enemies were the major force influencing abundance of S. pyrioides across habitat types. Generalist predators, particularly the spider Anyphaena celer, were more abundant in complex landscapes. Predator abundance was related to greater abundance of alternative prey in those landscapes. Stephanitis pyrioides survival was lower in complex habitats when exposed to endemic natural enemy populations. Laboratory feeding trials confirmed the more abundant predators consumed S. pyrioides. Host plant quality was not a strong force influencing patterns of S. pyrioides abundance. When predators were excluded, adult S. pyrioides survival was greater on azaleas grown in complex habitats, in opposition to the observed pattern of abundance. Similarly, complexity did not affect S. pyrioides immigration and emigration rates. The complexity of urban landscapes affects the strength of top-down forces on herbivorous insect populations by influencing alternative prey and generalist predator abundance. It is possible that habitats can be manipulated to promote the suppressive effects of generalist predators.

  8. Elicited Salience and Salience-Based Level-k

    OpenAIRE

    Wolff, Irenaeus

    2016-01-01

    A level-k model based on a specific salience pattern is the only model in the literature that accounts for behaviour in hide-and-seek games. This paper presents nine different experiments designed to measure salience. The elicited salience patterns tend to be similar, but none of them is similar to the pattern needed to allow the level-k model to explain the hide-and-seek data. When based on any of the empirical salience measures, the salience-based level-k model does not fit the data well. p...

  9. Grain size engineering of bcc refractory metals: Top-down and bottom-up - Application to tungsten

    International Nuclear Information System (INIS)

    We have used two general methodologies for the production of ultrafine grained (UFG) and nanocrystalline (NC) tungsten (W) metal samples: top-down and bottom-up. In the first, equal channel angular extrusion (ECAE) coupled with warm rolling was used to fabricate UFG W, and high pressure torsion (HPT) was used to fabricate NC W. We demonstrate an abrupt shift in the deformation mechanism, particularly under dynamic compressive loading, in UFG and NC W. This novel deformation mechanism, a dramatic transition from a uniform deformation mode to that of localized shearing, is shared by other UFG and NC body-centered cubic (BCC) metals. We have also conducted a series of bottom-up experiments to consolidate powdered UFG W precursors into solid bodies. The bottom-up approach relies on rapid, high-temperature consolidation, specifically designed for UFG and NC W powders. The mechanical property results from the top-down UFG and NC W were used as minimum property benchmarks to guide and design the experimental protocols and parameters for use in the bottom-up procedures. Preliminary consolidation runs showed rapid grain growth during the consolidation cycle and did not achieve full density in the W samples. Further development of high-purity W nanopowders and appropriate grain-growth inhibitors (e.g., Zener pinning) will be required to successfully produce bulk-sized UFG and NC W samples.

  10. Leadership for Quality University Teaching: How Bottom-Up Academic Insights Can Inform Top-Down Leadership

    Science.gov (United States)

    Scott, Donald E.; Scott, Shelleyann

    2016-01-01

    This paper presents the leadership implications from a study that explored how to increase the quality of teaching in a university thereby presenting data from the bottom up--the academic perspective--to inform leadership, policies, and academic development which generally flows from the top down. We report academics' perceptions of and…

  11. Engineered Micro-Objects as Scaffolding Elements in Cellular Building Blocks for Bottom-Up Tissue Engineering Approaches

    NARCIS (Netherlands)

    Leferink, A.M.; Schipper, D.; Arts, E.; Vrij, E.J.; Rivron, N.C.; Karperien, H.B.J.; Mittmann, K.; Blitterswijk, van C.A.; Moroni, L.; Truckenmuller, R.K.

    2014-01-01

    A material-based bottom-up approach is proposed towards an assembly of cells and engineered micro-objects at the macroscale. We show how shape, size and wettability of engineered micro-objects play an important role in the behavior of cells on these objects. This approach can, among other applicatio

  12. Assessing the Gap Between Top-down and Bottom-up Measured Methane Emissions in Indianapolis, IN.

    Science.gov (United States)

    Prasad, K.; Lamb, B. K.; Cambaliza, M. O. L.; Shepson, P. B.; Stirm, B. H.; Salmon, O. E.; Lavoie, T. N.; Lauvaux, T.; Ferrara, T.; Howard, T.; Edburg, S. L.; Whetstone, J. R.

    2014-12-01

    Releases of methane (CH4) from the natural gas supply chain in the United States account for approximately 30% of the total US CH4 emissions. However, large questions remain regarding the accuracy of current emission inventories for methane emissions from natural gas usage. In this paper, we describe results from top-down and bottom-up measurements of methane emissions from the large, isolated city of Indianapolis. The top-down results are based on aircraft mass balance and tower-based inverse modeling methods, while the bottom-up results are based on direct component sampling at metering and regulating stations, surface enclosure measurements of surveyed pipeline leaks, and tracer/modeling methods for other urban sources. Mobile mapping of urban methane concentrations was also used to identify significant sources and to show a city-wide, low-level enhancement of methane levels. The residual difference between top-down and bottom-up measured emissions is large and cannot be fully explained by the uncertainties in the top-down and bottom-up emission measurements and estimates. Thus, the residual appears to be attributable, at least in part, to a significant, widespread diffuse source. Analyses are included to estimate the size and nature of this diffuse source.
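    A simple way to frame such a gap is to compare the top-down/bottom-up residual against the combined (quadrature) uncertainty of the two estimates; the sketch below uses placeholder numbers, not values from the Indianapolis study.

```python
import math

# Placeholder emission estimates and 1-sigma uncertainties (kg CH4/h);
# these are illustrative numbers, not results from the study.
top_down, sigma_td = 100.0, 15.0    # e.g. aircraft mass-balance estimate
bottom_up, sigma_bu = 60.0, 10.0    # e.g. summed component-level inventory

residual = top_down - bottom_up
sigma_res = math.hypot(sigma_td, sigma_bu)   # quadrature combination

# A residual well beyond its combined uncertainty points to a source
# missing from the bottom-up inventory (e.g. a widespread diffuse source).
print(f"residual = {residual:.0f} +/- {sigma_res:.0f} kg CH4/h")
print("exceeds 2 sigma" if residual > 2 * sigma_res else "within 2 sigma")
```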

  13. Using classic methods in a networked manner: seeing volunteered spatial information in a bottom-up fashion

    NARCIS (Netherlands)

    Carton, L.J.; Ache, P.M.

    2014-01-01

    Using new social media and ICT infrastructures for self-organization, more and more citizen networks and business sectors organize themselves voluntarily around sustainability themes. The paper traces and evaluates one emerging innovation in such a bottom-up, networked form of sustainable governance

  14. Citizenship Policy from the Bottom-Up: The Linguistic and Semiotic Landscape of a Naturalization Field Office

    Science.gov (United States)

    Loring, Ariel

    2015-01-01

    This article follows a bottom-up approach to language policy (Ramanathan, 2005; Wodak, 2006) in an analysis of citizenship in policy and practice. It compares representations of citizenship in and around a regional branch of the United States Citizenship and Immigration Services (USCIS), with a focus on citizenship swearing-in ceremonies for…

  15. Reconciling Top-Down and Bottom-Up Estimates of Oil and Gas Methane Emissions in the Barnett Shale

    Science.gov (United States)

    Hamburg, S.

    2015-12-01

    Top-down approaches that use aircraft, tower, or satellite-based measurements of well-mixed air to quantify regional methane emissions have typically estimated higher emissions from the natural gas supply chain when compared to bottom-up inventories. A coordinated research campaign in October 2013 used simultaneous top-down and bottom-up approaches to quantify total and fossil methane emissions in the Barnett Shale region of Texas. Research teams have published individual results including aircraft mass-balance estimates of regional emissions and a bottom-up, 25-county region spatially-resolved inventory. This work synthesizes data from the campaign to directly compare top-down and bottom-up estimates. A new analytical approach uses statistical estimators to integrate facility emission rate distributions from unbiased and targeted high emission site datasets, which more rigorously incorporates the fat-tail of skewed distributions to estimate regional emissions of well pads, compressor stations, and processing plants. The updated spatially-resolved inventory was used to estimate total and fossil methane emissions from spatial domains that match seven individual aircraft mass balance flights. Source apportionment of top-down emissions between fossil and biogenic methane was corroborated with two independent analyses of methane and ethane ratios. Reconciling top-down and bottom-up estimates of fossil methane emissions leads to more accurate assessment of natural gas supply chain emission rates and the relative contribution of high emission sites. These results increase our confidence in our understanding of the climate impacts of natural gas relative to more carbon-intensive fossil fuels and the potential effectiveness of mitigation strategies.

  16. A bottom-up approach to urban metabolism: the perspective of BRIDGE

    Science.gov (United States)

    Chrysoulakis, N.; Borrego, C.; San Josè, R.; Grimmond, S. B.; Jones, M. B.; Magliulo, V.; Klostermann, J.; Santamouris, M.

    2011-12-01

    Urban metabolism considers a city as a system and usually distinguishes between energy and material flows as its components. "Metabolic" studies are usually top-down approaches that assess the inputs and outputs of food, water, energy, and pollutants from a city, or that compare the changing metabolic process of several cities. In contrast, bottom-up approaches are based on quantitative estimates of urban metabolism components at local to regional scales. Such approaches consider the urban metabolism as the 3D exchange and transformation of energy and matter between a city and its environment. The city is considered as a system and the physical flows between this system and its environment are quantitatively estimated. The transformation of landscapes from primarily agricultural and forest uses to urbanized landscapes can greatly modify energy and material exchanges and it is, therefore, an important aspect of an urban area. Here we focus on the exchanges and transformation of energy, water, carbon and pollutants. Recent advances in bio-physical sciences have led to new methods and models to estimate local scale energy, water, carbon and pollutant fluxes. However, there is often poor communication of new knowledge and its implications to end-users, such as planners, architects and engineers. The FP7 Project BRIDGE (SustainaBle uRban plannIng Decision support accountinG for urban mEtabolism) aims at bridging this gap and at illustrating the advantages of considering environmental issues in urban planning. BRIDGE does not perform a complete life cycle analysis or calculate whole system urban metabolism, but rather focuses on specific metabolism components (energy, water, carbon and pollutants). Its main goal is the development of a Decision Support System (DSS) with the potential to select planning actions which better fit the goal of changing the metabolism of urban systems towards sustainability. BRIDGE evaluates how planning alternatives can modify the physical

  17. Prefrontal/accumbal catecholamine system processes high motivational salience.

    Directory of Open Access Journals (Sweden)

    Stefano ePuglisi-Allegra

    2012-06-01

    Motivational salience regulates the strength of goal seeking, the amount of risk taken, and the energy invested, from mild to extreme. Highly motivational experiences promote highly persistent memories. Although this phenomenon is adaptive in normal conditions, experiences with extremely high levels of motivational salience can promote the development of memories that can be re-experienced intrusively for a long time, resulting in maladaptive outcomes. Neural mechanisms mediating motivational salience attribution are, therefore, very important for individual and species survival and for well-being. However, these neural mechanisms could be implicated in the attribution of abnormal motivational salience to different stimuli, leading to maladaptive compulsive seeking or avoidance. We have offered the first evidence that prefrontal cortical norepinephrine transmission is a necessary condition for motivational salience attribution to highly salient stimuli, through modulation of dopamine in the nucleus accumbens, a brain area involved in all motivated behaviors. Moreover, we have shown that the prefrontal-accumbal catecholamine system determines approach or avoidance responses to both reward- and aversion-related stimuli only when the salience of the unconditioned stimulus is high enough to induce sustained catecholamine activation, thus affirming that this system selectively processes the attribution of motivational salience to highly salient events.

  18. Top-down and bottom-up control of large herbivore populations: a review of natural and human-induced influences

    OpenAIRE

    Gandiwa, E.

    2013-01-01

    The question of whether animal populations are top-down and/or bottom-up controlled has motivated a thriving body of research over the past five decades. In this review I address two questions: (1) How do top-down and bottom-up controls influence large herbivore populations? (2) How do human activities and control systems influence the top-down and bottom-up processes that affect large herbivore population dynamics? Previous studies suggest that the relative influence of top-down vs. bottom-up con...

  19. Salience and Asset Prices

    OpenAIRE

    Bordalo, Pedro; Gennaioli, Nicola; Shleifer, Andrei

    2013-01-01

    We present a simple model of asset pricing in which payoff salience drives investors' demand for risky assets. The key implication is that extreme payoffs receive disproportionate weight in the market valuation of assets. The model accounts for several puzzles in finance in an intuitive way, including preference for assets with a chance of very high payoffs, an aggregate equity premium, and countercyclical variation in stock market returns.

  20. Radiographic Evaluation of Children with Febrile Urinary Tract Infection: Bottom-Up, Top-Down, or None of the Above?

    OpenAIRE

    Prasad, Michaella M.; Cheng, Earl Y.

    2011-01-01

    The proper algorithm for the radiographic evaluation of children with febrile urinary tract infection (FUTI) is hotly debated. Three studies are commonly administered: renal-bladder ultrasound (RUS), voiding cystourethrogram (VCUG), and dimercapto-succinic acid (DMSA) scan. However, the order in which these tests are obtained depends on the methodology followed: bottom-up or top-down. Each strategy carries advantages and disadvantages, and some groups now advocate even less of a workup (no...

  1. A comprehensive estimate of recent carbon sinks in China using both top-down and bottom-up approaches

    Science.gov (United States)

    Jiang, Fei; Chen, Jing; Zhou, Linxi; Ju, Weimin; Zhang, Huifang; Machida, Toshinobu; Ciais, Philippe; Peters, Wouter; Wang, Hengmao; Chen, Baozhang; Liu, Linxin; Zhang, Chunhua; Matsueda, Hidekazu; Sawa, Yousuke

    2016-04-01

    Atmospheric inversions use measurements of atmospheric CO2 gradients to constrain regional surface fluxes. Current inversions indicate a net terrestrial CO2 sink in China between 0.16 and 0.35 PgC/yr. The uncertainty of these estimates is as large as the mean because the atmospheric network historically contained only one high altitude station in China. Here, we revisit the calculation of the terrestrial CO2 flux in China, excluding emissions from fossil fuel burning and cement production, by using two inversions with three new CO2 monitoring stations in China as well as aircraft observations over Asia. We estimate a net terrestrial CO2 uptake of 0.39-0.51 PgC/yr with a mean of 0.45 PgC/yr in 2006-2009. After considering the lateral transport of carbon in air and water and international trade, the annual mean carbon sink is adjusted to 0.35 PgC/yr. To evaluate this top-down estimate, we constructed an independent bottom-up estimate based on ecosystem data, and giving a net land sink of 0.33 PgC/yr. This demonstrates closure between the top-down and bottom-up estimates. Both top-down and bottom-up estimates give a higher carbon sink than previous estimates made for the 1980s and 1990s, suggesting a trend towards increased uptake by land ecosystems in China.

  2. Sponge communities on Caribbean coral reefs are structured by factors that are top-down, not bottom-up.

    Directory of Open Access Journals (Sweden)

    Joseph R Pawlik

    Caribbean coral reefs have been transformed in the past few decades with the demise of reef-building corals, and sponges are now the dominant habitat-forming organisms on most reefs. Competing hypotheses propose that sponge communities are controlled primarily by predatory fishes (top-down) or by the availability of picoplankton to suspension-feeding sponges (bottom-up). We tested these hypotheses on Conch Reef, off Key Largo, Florida, by placing sponges inside and outside predator-excluding cages at sites with lower and higher planktonic food availability (15 m vs. 30 m depth). There was no evidence of a bottom-up effect on the growth of any of 5 sponge species, and 2 of 5 species grew more when caged at the shallow site with lower food abundance. There was, however, a strong effect of predation by fishes on sponge species that lacked chemical defenses. Sponges with chemical defenses grew slower than undefended species, demonstrating a resource trade-off between growth and the production of secondary metabolites. Surveys of the benthic community on Conch Reef similarly did not support a bottom-up effect, with higher sponge cover at the shallower depth. We conclude that the structure of sponge communities on Caribbean coral reefs is primarily top-down, and predict that removal of sponge predators by overfishing will shift communities toward faster-growing, undefended species that better compete for space with threatened reef-building corals.

  3. Fusion of multi-sensory saliency maps for automated perception and control

    Science.gov (United States)

    Huber, David J.; Khosla, Deepak; Dow, Paul A.

    2009-05-01

    In many real-world situations and applications that involve humans or machines (e.g., situation awareness, scene understanding, driver distraction, workload reduction, assembly, robotics, etc.), multiple sensory modalities (e.g., vision, audition, touch, etc.) are used. The incoming sensory information can overwhelm the processing capabilities of both humans and machines. An approach for estimating what is most important in our sensory environment (bottom-up or goal-driven) and using that as a basis for workload reduction or taking an action could be of great benefit in applications involving humans, machines or human-machine interactions. In this paper, we describe a novel approach for determining high saliency stimuli in multi-modal sensory environments, e.g., vision, sound, touch, etc. In such environments, the high saliency stimuli could be a visual object, a sound source, a touch event, etc. The high saliency stimuli are important and should be attended to from a perception, cognition and/or action perspective. The system accomplishes this by the fusion of saliency maps from multiple sensory modalities (e.g., visual and auditory) into a single, fused multimodal saliency map that is represented in a common, higher-level coordinate system. This paper describes the computational model and method for generating a multi-modal or fused saliency map. The fused saliency map can be used to determine primary and secondary foci of attention as well as for active control of a hardware/device. Such a computational model of a fused saliency map would be immensely useful for a machine-based or robot-based application in a multi-sensory environment. We describe the approach, system and present preliminary results on a real-robotic platform.
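    The fusion step can be illustrated with a minimal sketch: per-modality saliency maps, already registered to a common coordinate frame, are normalized and combined by a weighted sum (one of several possible fusion rules); the weights, grid size, and random maps below are assumptions for illustration, not the authors' model parameters.

```python
import numpy as np

def fuse_saliency_maps(maps, weights=None):
    """Fuse per-modality saliency maps of equal shape, assumed to be already
    registered to a common coordinate frame, into one multimodal map."""
    maps = [m / (m.max() + 1e-12) for m in maps]          # normalize each modality
    if weights is None:
        weights = [1.0 / len(maps)] * len(maps)
    fused = sum(w * m for w, m in zip(weights, maps))     # weighted combination
    return fused / (fused.max() + 1e-12)

# Toy example: random 'visual' and 'auditory' maps on a shared 64x64 grid.
rng = np.random.default_rng(0)
visual = rng.random((64, 64))
auditory = rng.random((64, 64))
fused = fuse_saliency_maps([visual, auditory], weights=[0.6, 0.4])

# Primary focus of attention = location of the global maximum of the fused map.
primary_focus = np.unravel_index(np.argmax(fused), fused.shape)
print("primary focus of attention at grid location:", primary_focus)
```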

  4. A convenient two-step bottom-up approach for developing Au/Fe{sub 3}O{sub 4} nanocomposites with useful optical and magnetic properties

    Energy Technology Data Exchange (ETDEWEB)

    Amala Jayanthi, S. [Department of Physics, Government Arts College (Autonomous), Nandanam, Chennai 600035 (India); Manovah David, T. [Department of Chemistry, Madras Christian College (Autonomous), Chennai 600059 (India); Jayashainy, J.; Muthu Gnana Theresa Nathan, D. [Department of Physics, Loyola College (Autonomous), Chennai 600034 (India); Sagayaraj, P., E-mail: psagayaraj@hotmail.com [Department of Physics, Loyola College (Autonomous), Chennai 600034 (India)

    2014-09-01

    Graphical abstract: Au/Fe{sub 3}O{sub 4} nanocomposites were successfully synthesized using a two-step bottom-up approach, co-precipitation followed by solvothermal synthesis, without using capping agents or additives. TEM results indicate that nanocomposites with less agglomeration and high monodispersion can be obtained even in the absence of additives or capping ligands. - Highlights: • Au/Fe{sub 3}O{sub 4} nanocomposites without using additives, mediators or capping ligands. • Surface morphology study reveals the uniform AuNP coating in the nanocomposites. • Soft ferromagnetic behavior with larger M{sub s} values is observed at room temperature. - Abstract: A convenient two-step bottom-up approach is reported for the preparation of Au/Fe{sub 3}O{sub 4} nanocomposites. The synthesis of Fe{sub 3}O{sub 4} was achieved by the co-precipitation method, and a rapid synthesis procedure was adopted for forming the Au nanoparticles. The solutions containing the Fe{sub 3}O{sub 4} and Au nanoparticles were mixed in two different ratios and then solvothermally treated to obtain the Au/Fe{sub 3}O{sub 4} nanocomposites. The structural and optical properties of the nanocomposites were investigated by powder X-ray diffraction and optical absorption spectroscopic techniques. The field emission scanning electron microscopy pictures illustrate the surface morphology of the as-prepared nanocomposites. The energy dispersive X-ray analysis spectrum was taken to estimate the exact percentage of elemental composition of the nanopowder. The transmission electron microscopy analysis of the nanocomposites confirmed the presence and morphology of the Au and Fe{sub 3}O{sub 4} nanoparticles. The Au/Fe{sub 3}O{sub 4} nanocomposites were found to exhibit soft ferromagnetic behavior.

  5. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, Nicholas J. H.; Noid, W. G., E-mail: wnoid@chem.psu.edu [Department of Chemistry, The Pennsylvania State University, University Park, Pennsylvania 16802 (United States)

    2015-12-28

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed “pressure-matching” variational principle to determine a volume-dependent contribution to the potential, U{sub V}(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing U{sub V}, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that U{sub V} accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the “simplicity” of the model.

  6. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    Science.gov (United States)

    Dunn, Nicholas J. H.; Noid, W. G.

    2015-12-01

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed "pressure-matching" variational principle to determine a volume-dependent contribution to the potential, UV(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing UV, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that UV accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the "simplicity" of the model.
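    The role of the volume-dependent term can be summarized schematically. The sketch below is a generic form consistent with the abstract rather than the authors' exact equations: the CG potential is a sum of pair terms plus UV(V), and the volume-dependent term contributes -dUV/dV to the pressure, which is what the self-consistent optimization of UV adjusts so that the CG pressure matches the all-atom average.

```latex
% Schematic CG potential with a volume-dependent correction (generic sketch,
% not the paper's equations); f(R) = -du/dR is the pair force.
U_{\mathrm{CG}}(\mathbf{R},V) = \sum_{i<j} u(R_{ij}) + U_V(V),
\qquad
P_{\mathrm{CG}} = \frac{N k_B T}{V}
  + \frac{1}{3V}\sum_{i<j} R_{ij}\, f(R_{ij})
  - \frac{\mathrm{d}U_V}{\mathrm{d}V}.
```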

  7. Visually guided pointing movements are driven by the salience map.

    Science.gov (United States)

    Zehetleitner, Michael; Hegenloh, Michael; Müller, Hermann J

    2011-01-01

    Visual salience maps are assumed to mediate target selection decisions in a motor-unspecific manner; accordingly, modulations of salience influence yes/no target detection or left/right localization responses in manual key-press search tasks, as well as ocular or skeletal movements to the target. Although widely accepted, this core assumption is based on little psychophysical evidence. At least four modulations of salience are known to influence the speed of visual search for feature singletons: (i) feature contrast, (ii) cross-trial dimension sequence, (iii) semantic pre-cueing of the target dimension, and (iv) dimensional target redundancy. If salience also guides manual pointing movements, their initiation latencies (and durations) should be affected by the same four manipulations of salience. Four experiments, each examining one of these manipulations, revealed this to be the case. Thus, these effects are seen independently of the motor response required to signal the perceptual decision (e.g., directed manual pointing as well as simple yes/no detection responses). This supports the notion of a motor-unspecific salience map, which guides covert attention as well as overt eye and hand movements. PMID:21282341

  8. Benchmarking energy scenarios for China: perspectives from top-down, economic and bottom-up, technical modelling

    OpenAIRE

    Mischke, Peggy

    2014-01-01

    This study uses a soft-linking methodology to harmonise two complex global top-down and bottom-up models with a regional China focus. The baseline follows the GDP and demographic trends of the Shared Socio-economic Pathways (SSP2) scenario, down-scaled for China, while the carbon tax scenario follows the pathway of the Asia Modelling Exercise. We find that soft-linking allows "bridging the gap" and reducing uncertainty between these models. Without soft-linking, baseline result ranges for Chin...

  9. FROM A COMPARISON OF "TOP-DOWN" AND "BOTTOM-UP" APPROACHES TO THE APPLICATION OF THE "INTERACTIVE" APPROACH

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper introduces three models of reading. Then it analyzes the data gathered from an experiment on the comparison of the "top-down" and the "bottom-up" approaches and accordingly draws the conclusion that the former approach is helpful in improving students' reading comprehension while the latter is useful in developing their writing skills as well as their knowledge of vocabulary and sentence structure. Finally, this paper presents a procedure for the application of the "interactive approach", which proves to be productive in teaching college English intensive reading.

  10. Bottom-Up Nano-heteroepitaxy of Wafer-Scale Semipolar GaN on (001) Si

    KAUST Repository

    Hus, Jui Wei

    2015-07-15

    Semipolar {101¯1} InGaN quantum wells are grown on (001) Si substrates with an Al-free buffer and wafer-scale uniformity. The novel structure is achieved by a bottom-up nano-heteroepitaxy employing self-organized ZnO nanorods as the strain-relieving layer. This ZnO nanostructure circumvents the problems encountered with the conventional AlN-based buffer, which grows slowly and contaminates the growth chamber. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. A statistical mixture method to reveal bottom-up and top-down factors guiding the eye-movements

    OpenAIRE

    Couronné, Thomas; Guérin-Dugué, Anne; Michel DUBOIS; Faye, Pauline; MARENDAZ, Christian

    2010-01-01

    When people gaze at real scenes, their visual attention is driven both by bottom-up processes arising from the signal properties of the scene and by top-down effects such as the task, the affective state, prior knowledge, or the semantic context. The context of this study is the assessment of manufactured objects (here, a car cab interior). Within this dedicated context, this work describes a set of methods to analyze the eye-movements during the visual scene evaluation. But these me...

  12. A bottom-up approach for optimization of friction stir processing parameters; a study on aluminium 2024-T3 alloy

    International Nuclear Information System (INIS)

    Highlights: • An experimental bottom-up approach has been developed for optimizing the process parameters for friction stir processing. • Samples processed with the optimum parameters were tested and characterized in detail. • Ultimate tensile strength of 1.3 times the base metal strength was obtained. • Residual stresses on the processed surface were only 10% of the yield strength of the base metal. • Microstructure observations revealed fine equi-axed grains with precipitate particles at the grain boundaries. - Abstract: Friction stir processing (FSP) is emerging as one of the most capable severe plastic deformation (SPD) methods for producing bulk ultra-fine grained materials with improved properties. Optimizing the process parameters for a defect-free process is one of the challenging aspects of FSP on the path to commercial use. For the commercial aluminium alloy 2024-T3 plate of 6 mm thickness, a bottom-up approach has been attempted to optimize major independent parameters of the process such as plunge depth, tool rotation speed and traverse speed. Tensile properties of the optimum friction stir processed sample were correlated with the microstructural characterization done using Scanning Electron Microscopy (SEM) and Electron Back-Scattered Diffraction (EBSD). Optimum parameters from the bottom-up approach have led to a defect-free FSP having a maximum strength of 93% of the base material strength. Micro tensile testing of the samples taken from the center of the processed zone has shown an increased strength of 1.3 times the base material. The measured maximum longitudinal residual stress on the processed surface was only 30 MPa, which was attributed to the solid state nature of FSP. Microstructural observation reveals significant grain refinement with less variation in the grain size across the thickness and a large amount of grain boundary precipitation compared to the base metal. The proposed experimental bottom-up approach can be applied as an effective method for

  13. [Diversity in thalamic relay neurons: evidence for "bottom-up" and "top-down" information flow in thalamocortical pathways].

    Science.gov (United States)

    Clascá, Francisco; Rubio-Garrido, Pablo; Galazo, María J; Porrero, César

    2009-01-01

    Thalamocortical (TC) pathways are still mainly understood as the gateway for ascending sensory-motor information into the cortex. However, it is now clear that a great many TC cells are involved in interactions between cortical areas via the thalamus. We review recent data, including our own, which demonstrate the generalized presence in rodent thalamus of two major TC cell types characterized, among other features, by their axon development, arborization and laminar targeting in the cortex. Such duality may allow inputs from thalamus to access cortical circuits via "bottom-up"-wired axon arbors or via "top-down"-wired axon arbors.

  14. Bottom-Up Catalytic Approach towards Nitrogen-Enriched Mesoporous Carbons/Sulfur Composites for Superior Li-S Cathodes

    OpenAIRE

    Fugen Sun; Jitong Wang; Huichao Chen; Wenming Qiao; Licheng Ling; Donghui Long

    2013-01-01

    We demonstrate a sustainable and efficient approach to produce high-performance sulfur/carbon composite cathodes via a bottom-up catalytic approach. The selective oxidation of H2S by a nitrogen-enriched mesoporous carbon catalyst can produce elemental sulfur as a by-product, which deposits in situ onto the carbon framework. Due to the metal-free catalytic characteristic and high catalytic selectivity, the resulting sulfur/carbon composites have almost no impurities and thus can be used as cath...

  15. Preference for Well-Balanced Saliency in Details Cropped from Photographs

    Directory of Open Access Journals (Sweden)

    Jonas eAbeln

    2016-01-01

    Photographic cropping is the act of selecting part of a photograph to enhance its aesthetic appearance or visual impact. It is common practice with both professional (expert) and amateur (non-expert) photographers. In a psychometric study, McManus et al. (2011b) showed that participants cropped photographs confidently and reliably. Experts tended to select details from a wider range of positions than non-experts, but other croppers did not generally prefer details that were selected by experts. It remained unclear, however, on what grounds participants selected particular details from a photograph while avoiding other details. One of the factors contributing to cropping decisions may be visual saliency. Indeed, various saliency-based computer algorithms are available for the automatic cropping of photographs. However, careful experimental studies on the relation between saliency and cropping are lacking to date. In the present study, we re-analyzed the data from the studies by McManus et al. (2011a,b), focusing on statistical image properties. We calculated saliency-based measures for details selected and details avoided during cropping. As expected, we found that selected details contain regions of higher saliency than avoided details on average. Moreover, the saliency center-of-mass was closer to the geometrical center in selected details than in avoided details. Results were confirmed in an eye tracking study with the same dataset of images. Interestingly, the observed regularities in cropping behavior were less pronounced for experts than for non-experts. In summary, our results suggest that, during cropping, participants tend to select salient regions and place them in an image composition that is well-balanced with respect to the distribution of saliency. Our study contributes to the knowledge of perceptual bottom-up features that are germane to aesthetic decisions in photography and their variability in non-experts and experts.

  16. Preference for Well-Balanced Saliency in Details Cropped from Photographs.

    Science.gov (United States)

    Abeln, Jonas; Fresz, Leonie; Amirshahi, Seyed Ali; McManus, I Chris; Koch, Michael; Kreysa, Helene; Redies, Christoph

    2015-01-01

    Photographic cropping is the act of selecting part of a photograph to enhance its aesthetic appearance or visual impact. It is common practice with both professional (expert) and amateur (non-expert) photographers. In a psychometric study, McManus et al. (2011b) showed that participants cropped photographs confidently and reliably. Experts tended to select details from a wider range of positions than non-experts, but other croppers did not generally prefer details that were selected by experts. It remained unclear, however, on what grounds participants selected particular details from a photograph while avoiding other details. One of the factors contributing to cropping decision may be visual saliency. Indeed, various saliency-based computer algorithms are available for the automatic cropping of photographs. However, careful experimental studies on the relation between saliency and cropping are lacking to date. In the present study, we re-analyzed the data from the studies by McManus et al. (2011a,b), focusing on statistical image properties. We calculated saliency-based measures for details selected and details avoided during cropping. As expected, we found that selected details contain regions of higher saliency than avoided details on average. Moreover, the saliency center-of-mass was closer to the geometrical center in selected details than in avoided details. Results were confirmed in an eye tracking study with the same dataset of images. Interestingly, the observed regularities in cropping behavior were less pronounced for experts than for non-experts. In summary, our results suggest that, during cropping, participants tend to select salient regions and place them in an image composition that is well-balanced with respect to the distribution of saliency. Our study contributes to the knowledge of perceptual bottom-up features that are germane to aesthetic decisions in photography and their variability in non-experts and experts. PMID:26793086

  17. Bottom-up and top-down herbivore regulation mediated by glucosinolates in Brassica oleracea var. acephala

    OpenAIRE

    Santolamazza Carbone, Serena; Velasco Pazos, Pablo; Soengas Fernández, María del Pilar; Cartea González, María Elena

    2014-01-01

    Quantitative differences in plant defence metabolites, such as glucosinolates, may directly affect herbivore preference and performance, and indirectly affect natural enemy pressure. By assessing insect abundance and leaf damage rate, we studied the responses of insect herbivores to six genotypes of Brassica oleracea var. acephala, selected from the same cultivar for having high or low foliar content of sinigrin, glucoiberin and glucobrassicin. We also investigated whether the natural parasit...

  18. A Bottom-up Route to a Chemically End-to-End Assembly of Nanocellulose Fibers.

    Science.gov (United States)

    Yang, Han; van de Ven, Theo G M

    2016-06-13

    In this work, we take advantage of the rod-like structure of electrosterically stabilized nanocrystalline cellulose (ENCC, with a width of about 7 nm and a length of about 130 nm), which has dicarboxylated cellulose (DCC) chains protruding from both ends that provide electrosteric stability for the ENCC particles, to chemically assemble these particles end-to-end into nanocellulose fibers. ENCC with shorter DCC chains can be obtained by a mild hydrolysis of ENCC with HCl, and subsequently the hydrolyzed ENCC (HENCC, with a width of about 6 nm and a length of about 120 nm) is suitable to be assembled into high-aspect-ratio nanofibers by chemically cross-linking HENCC from one end to another. Two sets of HENCC were prepared by carbodiimide-mediated formation of an alkyne and an azide derivative, respectively. Cross-linking these two sets of HENCC was performed by a click reaction. HENCCs were also end-to-end cross-linked by a bioconjugation reaction with a diamine. From atomic force microscopy (AFM) images, about ten HENCC nanoparticles were cross-linked and formed high-aspect-ratio nanofibers with a width of about 6 nm and a length of more than 1 μm. PMID:27211496

  19. Smart city planning from a bottom-up approach: local communities' intervention for a smarter urban environment

    Science.gov (United States)

    Alverti, Maroula; Hadjimitsis, Diofantos; Kyriakidis, Phaedon; Serraos, Konstantinos

    2016-08-01

    The aim of this paper is to explore the concept of "smart" cities from the perspective of inclusive community participation and Geographical Information Systems (GIS). The concept of a smart city is critically analyzed, focusing on the power/knowledge implications of a "bottom-up" approach in planning and on how GIS could encourage community participation in smart urban planning. The paper commences with a literature review of what it means for cities to be "smart". It draws supporting definitions and critical insights into smart cities with respect to the built environment and the human factor. The second part of the paper analyzes the "bottom-up" approach in urban planning, focusing on community participation and reviewing its forms and expressions through good practices from European cities. The third part of the paper includes a debate on how smart urban city policies and community participation interact and influence each other. Finally, the paper closes with a discussion of the insights that were found and offers recommendations on how this debate could be addressed by Information and Communication Technologies, and GIS in particular.

  20. Source attribution of methane emissions from global oil and gas production: results of bottom-up simulations over three decades

    Science.gov (United States)

    Höglund-Isaksson, Lena

    2016-04-01

    Existing bottom-up emission inventories of historical methane and ethane emissions from global oil and gas systems do not explain well the year-on-year variations estimated by top-down models from atmospheric measurements. This paper develops a bottom-up methodology which allows for country- and year-specific source attribution of methane and ethane emissions from global oil and natural gas production for the period 1980 to 2012. The analysis rests on country-specific simulations of associated gas flows, which are converted into methane and ethane emissions. The associated gas flows are constructed from country-specific information on oil and gas production and associated gas generation and recovery, coupled with generic assumptions to bridge regional information gaps on the fraction of unrecovered associated gas that is vented instead of flared. Summing emissions from associated gas flows with global estimates of emissions from unintended leakage and natural gas transmission and distribution, the resulting global emissions of methane and ethane from oil and gas systems are reasonably consistent with corresponding estimates from top-down models. The analysis also reveals that the fall of the Soviet Union in 1990 had a significant impact on methane and ethane emissions from global oil and gas systems.
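    The core accounting step can be sketched as a simple mass balance: unrecovered associated gas is split between venting and flaring, and each stream is converted to CH4 and ethane using an assumed gas composition and flare destruction efficiency. The function name and all parameter values below are illustrative assumptions, not the paper's country-specific figures.

```python
def associated_gas_emissions(gas_generated, gas_recovered, vented_fraction,
                             ch4_fraction=0.8, c2h6_fraction=0.05,
                             flare_destruction_eff=0.98, gas_density=0.7):
    """Rough mass-balance sketch converting unrecovered associated gas
    (million m3) into CH4 and ethane emissions (kt).
    All defaults are illustrative assumptions."""
    unrecovered = max(gas_generated - gas_recovered, 0.0)    # million m3
    vented = unrecovered * vented_fraction
    flared = unrecovered - vented
    slip = flared * (1.0 - flare_destruction_eff)            # gas surviving the flare
    released = vented + slip
    ch4 = released * ch4_fraction * gas_density              # million m3 * kg/m3 = kt
    c2h6 = released * c2h6_fraction * gas_density
    return ch4, c2h6

ch4, c2h6 = associated_gas_emissions(gas_generated=500.0, gas_recovered=300.0,
                                     vented_fraction=0.2)
print(f"CH4: {ch4:.1f} kt, ethane: {c2h6:.1f} kt")
```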

  1. Optimal Environmental Conditions and Anomalous Ecosystem Responses: Constraining Bottom-up Controls of Phytoplankton Biomass in the California Current System

    Science.gov (United States)

    Jacox, Michael G.; Hazen, Elliott L.; Bograd, Steven J.

    2016-06-01

    In Eastern Boundary Current systems, wind-driven upwelling drives nutrient-rich water to the ocean surface, making these regions among the most productive on Earth. Regulation of productivity by changing wind and/or nutrient conditions can dramatically impact ecosystem functioning, though the mechanisms are not well understood beyond broad-scale relationships. Here, we explore bottom-up controls during the California Current System (CCS) upwelling season by quantifying the dependence of phytoplankton biomass (as indicated by satellite chlorophyll estimates) on two key environmental parameters: subsurface nitrate concentration and surface wind stress. In general, moderate winds and high nitrate concentrations yield maximal biomass near shore, while offshore biomass is positively correlated with subsurface nitrate concentration. However, due to nonlinear interactions between the influences of wind and nitrate, bottom-up control of phytoplankton cannot be described by either one alone, nor by a combined metric such as nitrate flux. We quantify optimal environmental conditions for phytoplankton, defined as the wind/nitrate space that maximizes chlorophyll concentration, and present a framework for evaluating ecosystem change relative to environmental drivers. The utility of this framework is demonstrated by (i) elucidating anomalous CCS responses in 1998–1999, 2002, and 2005, and (ii) providing a basis for assessing potential biological impacts of projected climate change.
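    One way to operationalize "optimal environmental conditions" is to bin chlorophyll observations in the two-dimensional wind/nitrate space and locate the bin with the highest mean chlorophyll, as in the sketch below; the synthetic data, bin edges, and response shape are purely illustrative, not the study's values.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5000
wind = rng.uniform(0.0, 0.3, n)        # surface wind stress (N m-2), synthetic
nitrate = rng.uniform(0.0, 30.0, n)    # subsurface nitrate (mmol m-3), synthetic
# Synthetic chlorophyll response peaking at moderate wind and high nitrate.
chl = np.exp(-((wind - 0.12) / 0.05) ** 2) * (nitrate / 30.0) + 0.1 * rng.random(n)

wind_edges = np.linspace(0.0, 0.3, 13)
nit_edges = np.linspace(0.0, 30.0, 13)
mean_chl = np.full((12, 12), np.nan)
for i in range(12):
    for j in range(12):
        sel = ((wind >= wind_edges[i]) & (wind < wind_edges[i + 1]) &
               (nitrate >= nit_edges[j]) & (nitrate < nit_edges[j + 1]))
        if sel.any():
            mean_chl[i, j] = chl[sel].mean()

# "Optimal environmental conditions" = bin that maximizes mean chlorophyll.
i_opt, j_opt = np.unravel_index(np.nanargmax(mean_chl), mean_chl.shape)
print(f"optimal wind stress: {wind_edges[i_opt]:.2f}-{wind_edges[i_opt + 1]:.2f} N m-2")
print(f"optimal nitrate:     {nit_edges[j_opt]:.0f}-{nit_edges[j_opt + 1]:.0f} mmol m-3")
```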

  2. Thousand and one ways to quantify and compare protein abundances in label-free bottom-up proteomics.

    Science.gov (United States)

    Blein-Nicolas, Mélisande; Zivy, Michel

    2016-08-01

    How to process and analyze MS data to quantify and statistically compare protein abundances in bottom-up proteomics has been an open debate for nearly fifteen years. Two main approaches are generally used: the first is based on spectral data generated during the process of identification (e.g. peptide counting, spectral counting), while the second makes use of extracted ion currents to quantify chromatographic peaks and infer protein abundances based on peptide quantification. These two approaches actually refer to multiple methods which have been developed during the last decade but were subjected to thorough evaluations only recently. In this paper, we compiled these different methods as exhaustively as possible. We also summarized the way they address the different problems raised by bottom-up protein quantification, such as normalization, the presence of shared peptides, unequal peptide measurability, and missing data. This article is part of a Special Issue entitled: Plant Proteomics - a bridge between fundamental processes and crop production, edited by Dr. Hans-Peter Mock. PMID:26947242
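    As one concrete example from the spectral-data family of methods, the normalized spectral abundance factor (NSAF) divides each protein's spectral count by its length and renormalizes across the sample; the protein names, counts, and lengths below are toy values for illustration.

```python
def nsaf(spectral_counts, lengths):
    """Normalized Spectral Abundance Factor: (SpC / L) / sum(SpC / L).
    One common spectral-counting metric for label-free quantification."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

# Toy example: three hypothetical proteins (spectral counts, lengths in residues).
counts = {"P1": 120, "P2": 45, "P3": 300}
lengths = {"P1": 400, "P2": 150, "P3": 1200}
for protein, value in nsaf(counts, lengths).items():
    print(f"{protein}: NSAF = {value:.3f}")
```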

  3. The drastic outcomes from voting alliances in three-party bottom-up democratic voting (1990 $\\rightarrow$ 2013)

    CERN Document Server

    Galam, Serge

    2013-01-01

    The drastic effect of local alliances in three-party competition is investigated in democratic hierarchical bottom-up voting. The results are obtained analytically using a model which extends a sociophysics frame introduced in 1986 \cite{psy} and 1990 \cite{lebo} to study two-party systems and the spontaneous formation of democratic dictatorship. It is worth stressing that the 1990 paper was published in the Journal of Statistical Physics, the first paper of its kind in the journal. It was shown how a minority in power can preserve its leadership using bottom-up democratic elections. However, such a bias holds only down to some critical value of minimum support. The results were used later to explain the sudden collapse of European communist parties in the nineties. The extension to three-party competition reveals the mechanisms by which a very small minority party can get a substantial representation at higher levels of the hierarchy when the other two competing parties are big. Additional surprising results...
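    The hierarchical mechanism underlying this family of models can be illustrated with a minimal two-party Monte Carlo sketch (the paper's three-party extension with local alliances is more involved); the group size, number of levels, and tie-breaking rule below are illustrative choices, not the paper's exact setup.

```python
import random

def bottom_up_election(support, levels=5, group_size=4, seed=0):
    """Two-party hierarchical voting sketch: agents vote in groups of group_size;
    each group elects a representative by majority, with exact ties resolved in
    favor of party A (the party in power). Returns 1.0 if A holds the top seat."""
    rng = random.Random(seed)
    votes = [rng.random() < support for _ in range(group_size ** levels)]  # True = A
    for _ in range(levels):
        # Majority rule per group; a tie counts for the incumbent party A.
        votes = [2 * sum(votes[i:i + group_size]) >= group_size
                 for i in range(0, len(votes), group_size)]
    return sum(votes) / len(votes)

for p in (0.15, 0.25, 0.35, 0.45):
    wins = sum(bottom_up_election(p, seed=s) for s in range(200)) / 200
    print(f"bottom-level support {p:.2f} -> probability A keeps the top: {wins:.2f}")
```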

  4. Carbon balance: the top-down and bottom-up emissions accounting methodologies; Balanco de carbono: a contabilidade das emissoes nas metodologias 'Top-Down' estendida ('Top-Bottom') e 'Bottom-Up'

    Energy Technology Data Exchange (ETDEWEB)

    Alvim, Carlos Feu; Eidelman, Frida; Ferreira, Omar Campos

    2005-08-15

    The Economy and Energy Organization, together with the Ministry of Science and Technology, has carried out a study on the carbon balance of energy use and transformation. Its results were published in issues 48 and 50 of the e and e periodical. In the present issue we publish the results corresponding to the extended Top-Down accounting process and those corresponding to the use of the coefficients calculated for the Brazilian inventory from 1990 to 1994, using the Bottom-Up process, to estimate the emissions from 1970 to 2002. Comparing the two results makes it possible to evaluate their deficiencies and possible inconsistencies between the two methodologies. (author)
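    The top-down (fuel-based) side of such an accounting can be sketched as apparent fuel consumption multiplied by carbon emission factors and oxidized fractions, summed over fuels; the fuel categories and factor values below are placeholders, not the coefficients used in the study.

```python
# Generic top-down (fuel-based) CO2 accounting sketch; all numbers are
# placeholders, not the study's coefficients.
apparent_consumption_tj = {"coal": 1200.0, "oil": 2500.0, "natural_gas": 900.0}
emission_factor_t_per_tj = {"coal": 96.0, "oil": 73.0, "natural_gas": 56.0}
oxidized_fraction = {"coal": 0.98, "oil": 0.99, "natural_gas": 0.995}

total_t_co2 = sum(apparent_consumption_tj[f]
                  * emission_factor_t_per_tj[f]
                  * oxidized_fraction[f]
                  for f in apparent_consumption_tj)
print(f"top-down estimate: {total_t_co2 / 1e3:.1f} kt CO2")
```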

  5. Atomic layer deposition-Sequential self-limiting surface reactions for advanced catalyst "bottom-up" synthesis

    Science.gov (United States)

    Lu, Junling; Elam, Jeffrey W.; Stair, Peter C.

    2016-06-01

    Catalyst synthesis with precise control over the structure of catalytic active sites at the atomic level is of essential importance for the scientific understanding of reaction mechanisms and for rational design of advanced catalysts with high performance. Such precise control is achievable using atomic layer deposition (ALD). ALD is similar to chemical vapor deposition (CVD), except that the deposition is split into a sequence of two self-limiting surface reactions between gaseous precursor molecules and a substrate. The unique self-limiting feature of ALD allows conformal deposition of catalytic materials on a high surface area catalyst support at the atomic level. The deposited catalytic materials can be precisely constructed on the support by varying the number and type of ALD cycles. As an alternative to the wet-chemistry based conventional methods, ALD provides a cycle-by-cycle "bottom-up" approach for nanostructuring supported catalysts with near atomic precision. In this review, we summarize recent attempts to synthesize supported catalysts with ALD. Nucleation and growth of metals by ALD on oxides and carbon materials for precise synthesis of supported monometallic catalysts are reviewed. The capability of achieving precise control over the particle size of monometallic nanoparticles by ALD is emphasized. The resulting metal catalysts with high dispersions and uniformity often show comparable or remarkably higher activity than those prepared by conventional methods. For supported bimetallic catalyst synthesis, we summarize the strategies for controlling the deposition of the secondary metal selectively on the primary metal nanoparticle but not on the support, to exclude monometallic formation. Reviewing the surface chemistry and growth behavior of metal ALD on metal surfaces, we demonstrate ways to precisely tune the size, composition, and structure of bimetallic nanoparticles. The cycle-by-cycle "bottom-up" construction of bimetallic (or multiple

  6. The top-down, middle-down, and bottom-up mass spectrometry approaches for characterization of histone variants and their post-translational modifications.

    Science.gov (United States)

    Moradian, Annie; Kalli, Anastasia; Sweredoski, Michael J; Hess, Sonja

    2014-03-01

    Epigenetic regulation of gene expression is, at least in part, mediated by histone modifications. PTMs of histones change chromatin structure and regulate gene transcription, DNA damage repair, and DNA replication. Thus, studying histone variants and their modifications not only elucidates their functional mechanisms in chromatin regulation, but also provides insights into phenotypes and diseases. A challenge in this field is to determine the best approach(es) to identify histone variants and their PTMs using a robust high-throughput analysis. The large number of histone variants and the enormous diversity that can be generated through combinatorial modifications, also known as histone code, makes identification of histone PTMs a laborious task. MS has been proven to be a powerful tool in this regard. Here, we focus on bottom-up, middle-down, and top-down MS approaches, including CID and electron-capture dissociation/electron-transfer dissociation based techniques for characterization of histones and their PTMs. In addition, we discuss advances in chromatographic separation that take advantage of the chemical properties of the specific histone modifications. This review is also unique in its discussion of current bioinformatic strategies for comprehensive histone code analysis.

  7. Direct and indirect bottom-up and top-down forces shape the abundance of the orb-web spider Argiope bruennichi

    OpenAIRE

    Bruggisser, Odile T; Sandau, Nadine; Blandenier, Gilles; Fabian, Yvonne; Kehrli, Patrik; Aebi, Alex; Russell E Naisbit; Bersier, Louis-Félix

    2014-01-01

    Species abundance in local communities is determined by bottom-up and top-down processes, which can act directly and indirectly on the focal species. Studies examining these effects simultaneously are rare. Here we explore the direct top-down and direct and indirect bottom-up forces regulating the abundance and predation success of an intermediate predator, the web-building spider Argiope bruennichi (Araneae: Araneidae). We manipulated plant diversity (2, 6, 12 or 20 sown species) in 9 wildfl...

  8. The influence of top-down, bottom-up and abiotic factors on the moose (Alces alces) population of Isle Royale.

    OpenAIRE

    Vucetich, John A.; Rolf O Peterson

    2004-01-01

    Long-term, concurrent measurements of population dynamics and associated top-down and bottom-up processes are rare for unmanipulated, terrestrial systems. Here, we analyse populations of moose, their predators (wolves, Canis lupus), their primary winter forage (balsam fir, Abies balsamea) and several climatic variables that were monitored for 40 consecutive years in Isle Royale National Park (544 km2), Lake Superior, USA. We judged the relative importance of top-down, bottom-up and abiotic fac...

  9. Bottom-up fabrication of paper-based microchips by blade coating of cellulose microfibers on a patterned surface.

    Science.gov (United States)

    Gao, Bingbing; Liu, Hong; Gu, Zhongze

    2014-12-23

    We report a method for the bottom-up fabrication of paper-based capillary microchips by the blade coating of cellulose microfibers on a patterned surface. The fabrication process is similar to the paper-making process in which an aqueous suspension of cellulose microfibers is used as the starting material and is blade-coated onto a polypropylene substrate patterned using an inkjet printer. After water evaporation, the cellulose microfibers form a porous, hydrophilic, paperlike pattern that wicks aqueous solution by capillary action. This method enables simple, fast, inexpensive fabrication of paper-based capillary channels with both width and height down to about 10 μm. Using this method, a capillary microfluidic chip for the colorimetric detection of glucose and total protein was fabricated; the assay requires only 0.30 μL of sample, 240 times less than paper devices fabricated using photolithography.

  10. Benchmarking energy scenarios for China: perspectives from top-down, economic and bottom-up, technical modelling

    DEFF Research Database (Denmark)

    This study uses a soft-linking methodology to harmonise two complex global top-down and bottom-up models with a regional China focus. The baseline follows the GDP and demographic trends of the Shared Socio-economic Pathways (SSP2) scenario, down-scaled for China, while the carbon tax scenario follows the pathway of the Asia Modelling Exercise. We find that soft-linking allows "bridging the gap" and reducing uncertainty between these models. Without soft-linking, baseline result ranges for China in 2050 are 240-260 EJ in primary energy, 180-200 EJ in final energy, 8-10 GWh in electricity production and 15-18 Gt in carbon dioxide emissions. The highest uncertainty in modelling results can be mapped for China's future coal use in 2050, in particular in electricity production. Sub-regional China features, when incorporated into complex global models, do not increase uncertainty in China...

  11. The faith of a physicist: reflections of a bottom-up thinker: the Gifford lectures for 1993-4

    CERN Document Server

    Polkinghorne, John C

    1994-01-01

    Is it possible to think like a scientist and yet have the faith of a Christian? Although many Westerners might say no, there are also many critically minded individuals who entertain what John Polkinghorne calls a "wistful wariness" toward religion--they feel unable to accept religion on rational grounds yet cannot dismiss it completely. Polkinghorne, both a particle physicist and Anglican priest, here explores just what rational grounds there could be for Christian beliefs, maintaining that the quest for motivated understanding is a concern shared by scientists and religious thinkers alike. Anyone who assumes that religion is based on unquestioning certainties, or that it need not take into account empirical knowledge, will be challenged by Polkinghorne's bottom-up examination of Christian beliefs about events ranging from creation to the resurrection. The author organizes his inquiry around the Nicene Creed, an early statement that continues to summarize Christian beliefs. He applies to each of its tenets ...

  12. Construction of membrane-bound artificial cells using microfluidics: a new frontier in bottom-up synthetic biology

    Science.gov (United States)

    Elani, Yuval

    2016-01-01

    The quest to construct artificial cells from the bottom-up using simple building blocks has received much attention over recent decades and is one of the grand challenges in synthetic biology. Cell mimics that are encapsulated by lipid membranes are a particularly powerful class of artificial cells due to their biocompatibility and the ability to reconstitute biological machinery within them. One of the key obstacles in the field centres on the following: how can membrane-based artificial cells be generated in a controlled way and in high-throughput? In particular, how can they be constructed to have precisely defined parameters including size, biomolecular composition and spatial organization? Microfluidic generation strategies have proved instrumental in addressing these questions. This article will outline some of the major principles underpinning membrane-based artificial cells and their construction using microfluidics, and will detail some recent landmarks that have been achieved. PMID:27284034

  13. Construction of membrane-bound artificial cells using microfluidics: a new frontier in bottom-up synthetic biology.

    Science.gov (United States)

    Elani, Yuval

    2016-06-15

    The quest to construct artificial cells from the bottom-up using simple building blocks has received much attention over recent decades and is one of the grand challenges in synthetic biology. Cell mimics that are encapsulated by lipid membranes are a particularly powerful class of artificial cells due to their biocompatibility and the ability to reconstitute biological machinery within them. One of the key obstacles in the field centres on the following: how can membrane-based artificial cells be generated in a controlled way and in high-throughput? In particular, how can they be constructed to have precisely defined parameters including size, biomolecular composition and spatial organization? Microfluidic generation strategies have proved instrumental in addressing these questions. This article will outline some of the major principles underpinning membrane-based artificial cells and their construction using microfluidics, and will detail some recent landmarks that have been achieved.

  14. Methodology to characterize a residential building stock using a bottom-up approach: a case study applied to Belgium

    Directory of Open Access Journals (Sweden)

    Samuel Gendebien

    2014-06-01

    In the last ten years, the development and implementation of measures to mitigate climate change have become of major importance. In Europe, the residential sector accounts for 27% of the final energy consumption [1], and therefore contributes significantly to CO2 emissions. Roadmaps towards energy-efficient buildings have been proposed [2]. In such a context, the detailed characterization of residential building stocks in terms of age, type of construction, insulation level, energy vector, and evolution prospects appears to be a useful contribution to the assessment of the impact of implementing energy policies. In this work, a methodology to develop a tree structure characterizing a residential building stock is presented within the framework of a bottom-up approach that aims to model and simulate domestic energy use. The methodology is applied to the Belgian case for the current situation and up to the 2030 horizon. The potential applications of the developed tool are outlined.
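
    At its core, a bottom-up stock model of this kind multiplies the number of dwellings in each leaf of the characterization tree by a unit consumption and sums the results. The sketch below illustrates only that aggregation step; the segment tree and consumption figures are invented placeholders, not the Belgian data.

      # Minimal sketch of bottom-up aggregation over a (toy) building-stock tree:
      # total energy use = sum over segments of dwellings x unit consumption.
      # The segments and figures below are invented placeholders, not Belgian data.
      stock = {
          "detached":  {"pre-1970": (1.0e6, 25.0), "post-1990": (0.4e6, 12.0)},
          "apartment": {"pre-1970": (0.8e6, 15.0), "post-1990": (0.5e6, 8.0)},
      }  # (number of dwellings, MWh per dwelling per year)

      total_gwh = sum(n * use                       # dwellings x unit consumption
                      for ages in stock.values()
                      for n, use in ages.values()) / 1e3
      print(f"{total_gwh:.0f} GWh per year")        # -> 45800 GWh per year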

  15. Cyclization of the N-Terminal X-Asn-Gly Motif during Sample Preparation for Bottom-Up Proteomics

    DEFF Research Database (Denmark)

    Zhang, Xumin; Højrup, Peter

    2010-01-01

    We herein report a novel -17 Da peptide modification corresponding to an N-terminal cyclization of peptides possessing the N-terminal motif X-Asn-Gly. The cyclization occurs spontaneously during sample preparation for bottom-up proteomics studies. Distinct from the two well-known N-terminal cyclizations, cyclization of N-terminal glutamine and S-carbamoylmethylcysteine, it is dependent on pH instead of [NH(4)(+)]. The data set from our recent study on large-scale N(α)-modified peptides revealed a sequence requirement for the cyclization event similar to the well-known deamidation of Asn to isoAsp and Asp. Detailed analysis using synthetic peptides confirmed that the cyclization forms between the N-terminus and its neighboring Asn residue, and the reaction shares the same succinimide intermediate with the Asn deamidation event. As a result, we here propose a molecular mechanism for this specific...

  16. Radiographic Evaluation of Children with Febrile Urinary Tract Infection: Bottom-Up, Top-Down, or None of the Above?

    Directory of Open Access Journals (Sweden)

    Michaella M. Prasad

    2012-01-01

    The proper algorithm for the radiographic evaluation of children with febrile urinary tract infection (FUTI) is hotly debated. Three studies are commonly administered: renal-bladder ultrasound (RUS), voiding cystourethrogram (VCUG), and dimercapto-succinic acid (DMSA) scan. However, the order in which these tests are obtained depends on the methodology followed: bottom-up or top-down. Each strategy carries advantages and disadvantages, and some groups now advocate even less of a workup (none of the above) due to the current controversies about treatment when abnormalities are diagnosed. New technology is available and still under investigation, but it may help to clarify the interplay between vesicoureteral reflux, renal scarring, and dysfunctional elimination in the future.

  17. Mass Spectrometry Applied to Bottom-Up Proteomics: Entering the High-Throughput Era for Hypothesis Testing

    Science.gov (United States)

    Gillet, Ludovic C.; Leitner, Alexander; Aebersold, Ruedi

    2016-06-01

    Proteins constitute a key class of molecular components that perform essential biochemical reactions in living cells. Whether the aim is to extensively characterize a given protein or to perform high-throughput qualitative and quantitative analysis of the proteome content of a sample, liquid chromatography coupled to tandem mass spectrometry has become the technology of choice. In this review, we summarize the current state of mass spectrometry applied to bottom-up proteomics, the approach that focuses on analyzing peptides obtained from proteolytic digestion of proteins. With the recent advances in instrumentation and methodology, we show that the field is moving away from providing qualitative identification of long lists of proteins to delivering highly consistent and accurate quantification values for large numbers of proteins across large numbers of samples. We believe that this shift will have a profound impact for the field of proteomics and life science research in general.

  18. Identifying robust clusters and multi-community nodes by combining top-down and bottom-up approaches to clustering

    CERN Document Server

    Gaiteri, Chris; Szymanski, Boleslaw; Kuzmin, Konstantin; Xie, Jierui; Lee, Changkyu; Blanche, Timothy; Neto, Elias Chaibub; Huang, Su-Chun; Grabowski, Thomas; Madhyastha, Tara; Komashko, Vitalina

    2015-01-01

    Biological functions are often realized by groups of interacting molecules or cells. Membership in these groups may overlap when molecules or cells are reused in multiple functions. Traditional clustering methods assign each component to one group. Noisy measurements are common in high-throughput biological datasets. These two limitations reduce our ability to accurately define clusters in biological datasets and to interpret their biological functions. To address these limitations, we designed an algorithm called SpeakEasy, which detects overlapping or non-overlapping communities in biological networks. Input to SpeakEasy can be physical networks, such as molecular interactions, or inferred networks, such as gene coexpression networks. The networks can be directed or undirected, and may contain negative links. SpeakEasy combines traditional bottom-up and top-down approaches to clustering, by creating competition between clusters. Nodes that oscillate between multiple clusters in this competition are classifi...

  19. Integrating top-down and bottom-up nanomanufacturing: Design of nucleation and growth processes from electrolytes

    Science.gov (United States)

    Kitayaporn, Sathana

    2011-07-01

    The integration of self-propagating material growth (bottom-up) with tool-directed patterning (top-down) has great potential for minimizing the cost and reducing the time needed for manufacturing nanoscale products. This requires new molecules, algorithms, and growth processes. We describe a process called "orchestrated structure evolution" (OSE) in which one "seeds" specific locations and allows a material to spontaneously grow from these sites into the desired final pattern. Software-reconfigurable seed patterning is ideal for manufacturing flexibility, but direct-write tools are often slow: combining them with bottom-up growth is a strategy for reducing patterning times. Seeds are any nucleation initiator (nanoelectrodes, proteins, catalyst, etc.) that can be patterned using tools such as electron-beam lithography (EBL) or dip-pen nanolithography. Here, we explore the OSE concept using nanoelectrode seeds patterned with EBL and engineered Thioredoxin A (TrxA) as protein seeds. For the case of nanoelectrode seeds, we use electrodeposition to initiate copper and nickel growth that propagates into a continuous patterned film. We evaluate the trade-off between reduced pattern time and pattern degradation, and predict seed-scale interactions governing growth rate and composition using Voronoi diagrams and modified Green's function calculations. The work combines experiments and theory for a wide range of pattern length scales, driving forces, seed densities, compositions and geometries. For the case of protein seeds, we use ZnO-binding derivatives of TrxA to understand how proteins may serve as nucleation initiators for ZnO crystal growth. Our studies include thermodynamics prediction of zinc-compatible biological buffers, adsorption isotherms, and electrodeposition of protein-modified ZnO. We show that electrolyte engineering is a critical part of the process, and that the electrolyte stability and prevalence of key species must be matched with protein stability

  20. Biochemistry-directed hollow porous microspheres: bottom-up self-assembled polyanion-based cathodes for sodium ion batteries.

    Science.gov (United States)

    Lin, Bo; Li, Qiufeng; Liu, Baodong; Zhang, Sen; Deng, Chao

    2016-04-21

    Biochemistry-directed synthesis of functional nanomaterials has attracted great interest in energy storage, catalysis and other applications. The unique ability of biological systems to guide molecule self-assembling facilitates the construction of distinctive architectures with desirable physicochemical characteristics. Herein, we report a biochemistry-directed "bottom-up" approach to construct hollow porous microspheres of polyanion materials for sodium ion batteries. Two kinds of polyanions, i.e. Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2, are employed as cases in this study. The microalgae cell realizes the formation of a spherical "bottom" bio-precursor. Its tiny core is subjected to destruction and its tough shell tends to carbonize upon calcination, resulting in the hollow porous microspheres for the "top" product. The nanoscale crystals of the polyanion materials are tightly enwrapped by the highly-conductive framework in the hollow microsphere, resulting in the hierarchical nano-microstructure. The whole formation process is disclosed as a "bottom-up" mechanism. Moreover, the biochemistry-directed self-assembly process is confirmed to play a crucial role in the construction of the final architecture. Taking advantage of the well-defined hollow-microsphere architecture, the abundant interior voids and the highly-conductive framework, polyanion materials show favourable sodium-intercalation kinetics. Both materials are capable of high-rate long-term cycling. After five hundred cycles at 20 C and 10 C, Na3V2(PO4)3 and Na(3.12)Fe2.44(P2O7)2 retain 96.2% and 93.1% of the initial capacity, respectively. Therefore, the biochemistry-directed technique provides a low-cost, highly-efficient and widely applicable strategy to produce high-performance polyanion-based cathodes for sodium ion batteries.

  1. Reconciling Long-Term Trends in Air Quality with Bottom-up Emission Inventories for Los Angeles

    Science.gov (United States)

    Mcdonald, B. C.; Kim, S. W.; Frost, G. J.; Harley, R.; Trainer, M.

    2014-12-01

    Significant long-term changes in air quality have been observed in the United States over several decades. However, reconciling ambient observations with bottom-up emission inventories has proved challenging. In this study, we perform WRF-Chem modeling in the Los Angeles basin for carbon monoxide (CO), nitrogen oxides (NOx), volatile organic compounds (VOCs), and ozone (O3) over a long time period (1987-2010). To improve reconciliation of emission inventories with atmospheric observations, we incorporate new high-resolution emissions maps of a major to dominant source of urban air pollution, motor vehicles. A fuel-based approach is used to estimate motor vehicle emissions utilizing annual fuel sales reports, traffic count data that capture spatial and temporal patterns of vehicle activity, and pollutant emission factors measured from roadway studies performed over the last twenty years. We also update emissions from stationary sources using Continuous Emissions Monitoring Systems (CEMS) data when available, and use emission inventories developed by the South Coast Air Quality Management District (SCAQMD) and California Air Resources Board (ARB) for other important emission source categories. WRF-Chem modeling is performed for three years in which field-intensive measurements were made: 1987 (SCAQS: Southern California Air Quality Study), 2002 (ITCT: Intercontinental Transport and Chemical Transformation Study), and 2010 (CALNEX). We assess the ability of the improved bottom-up emissions inventory to predict long-term changes in ambient levels of CO, NOx, and O3, which are known to have occurred over this time period. We also assess changing spatial and temporal patterns of primary (CO and NOx) and secondary (O3) pollutant concentrations across the Los Angeles basin, which have important implications for human health.
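
    The fuel-based inventory logic scales a fuel-normalized emission factor by the amount of fuel sold. A minimal sketch of that calculation, with made-up inputs rather than the study's data, is:

      # Fuel-based inventory sketch: emissions = fuel burned x emission factor per
      # unit of fuel. The fuel sales and CO emission factor below are placeholders.
      def fuel_based_emissions(fuel_sold_kg, ef_g_per_kg_fuel):
          """Return annual emissions in tonnes of pollutant."""
          return fuel_sold_kg * ef_g_per_kg_fuel / 1e6

      print(fuel_based_emissions(fuel_sold_kg=5e9, ef_g_per_kg_fuel=10.0))  # -> 50000.0 t CO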

  2. Fixation and saliency during search of natural scenes: the case of visual agnosia.

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2009-07-01

    Models of eye movement control in natural scenes often distinguish between stimulus-driven processes (which guide the eyes to visually salient regions) and those based on task and object knowledge (which depend on expectations or identification of objects and scene gist). In the present investigation, the eye movements of a patient with visual agnosia were recorded while she searched for objects within photographs of natural scenes and compared to those made by students and age-matched controls. Agnosia is assumed to disrupt the top-down knowledge available in this task, and so may increase the reliance on bottom-up cues. The patient's deficit in object recognition was seen in poor search performance and inefficient scanning. The low-level saliency of target objects had an effect on responses in visual agnosia, and the most salient region in the scene was more likely to be fixated by the patient than by controls. An analysis of model-predicted saliency at fixation locations indicated a closer match between fixations and low-level saliency in agnosia than in controls. These findings are discussed in relation to saliency-map models and the balance between high and low-level factors in eye guidance.

  3. Hybrid bottom-up/top-down energy and economy outlooks: a survey of the IMACLIM-S experiments

    Directory of Open Access Journals (Sweden)

    Frédéric eGhersi

    2015-11-01

    In this paper we survey the research undertaken at the Centre International de Recherche sur l’Environnement et le Développement (CIRED) on the combination of the IMACLIM-S macroeconomic model with ‘bottom-up’ energy modeling, with a view to associating the strengths and circumventing the limitations of both approaches to energy-economy-environment (E3) prospective modeling. We start by presenting the two methodological avenues of coupling IMACLIM-S with detailed energy systems models pursued at CIRED since the late 1990s: (1) the calibration of the behavioral functions of IMACLIM-S that represent the producers’ and consumers’ trade-offs between inputs or consumptions, on a large set of bottom-up modeling results; (2) the coupling of IMACLIM-S to some bottom-up model through the iterative exchange of some of each model’s outputs as the other model’s inputs until convergence of the exchanged data, comprising the main macroeconomic drivers and energy systems variables. In the following section, we turn to numerical application and address the prerequisite of harmonizing national accounts, energy balance and energy price data to produce consistent hybrid input-output matrices as a basis for scenario exploration. We highlight how this data treatment step reveals the discrepancies and biases induced by sticking to the conventional modeling usage of uniform pricing of homogeneous goods. IMACLIM-S instead calibrates agent-specific margins, which we introduce and comment upon. In a further section we sum up the results of four IMACLIM-S experiments, insisting upon the value added of hybrid modeling. These varied experiments concern international climate policy burden sharing; the more general numerical consequences of shifting from a biased standard CGE model perspective to the hybrid IMACLIM approach; the macroeconomic consequences of a strong development of electric mobility in the European Union; and the resilience of public debts to energy shocks
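
    The second coupling avenue is, in essence, a fixed-point iteration between the two models. A minimal sketch of such an iterative soft-link, using toy stand-in functions rather than IMACLIM-S or any real energy-system model, could look like this:

      # Iterative soft-linking sketch: the bottom-up (BU) model turns macro drivers
      # into energy-system variables, the top-down (TD) model turns those back into
      # macro drivers, and the loop repeats until the exchanged data converge.
      def soft_link(td_model, bu_model, macro0, tol=1e-6, max_iter=100):
          macro = macro0
          for i in range(max_iter):
              energy = bu_model(macro)
              new_macro = td_model(energy)
              if abs(new_macro - macro) < tol:
                  return new_macro, energy, i + 1
              macro = new_macro
          raise RuntimeError("soft-linking did not converge")

      # Toy stand-ins: energy demand scales with GDP; GDP falls slightly with energy use.
      gdp, energy, iters = soft_link(td_model=lambda e: 100.0 - 0.05 * e,
                                     bu_model=lambda g: 0.8 * g, macro0=100.0)
      print(round(gdp, 2), round(energy, 2), iters)   # converges to ~96.15, ~76.92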

  4. Intentional action processing results from automatic bottom-up attention: An EEG-investigation into the Social Relevance Hypothesis using hypnosis.

    Science.gov (United States)

    Neufeld, Eleonore; Brown, Elliot C; Lee-Grimm, Sie-In; Newen, Albert; Brüne, Martin

    2016-05-01

    Social stimuli grab our attention. However, it has rarely been investigated how variations in attention affect the processing of social stimuli, although the answer could help us uncover details of social cognition processes such as action understanding. In the present study, we examined how changes to bottom-up attention affect neural EEG responses associated with intentional action processing. We induced an increase in bottom-up attention by using hypnosis. We recorded the electroencephalographic μ-wave suppression of hypnotized participants when presented with intentional actions in first and third person perspective in a video-clip paradigm. Previous studies have shown that the μ-rhythm is selectively suppressed both when executing and observing goal-directed motor actions; hence it can be used as a neural signal for intentional action processing. Our results show that neutral hypnotic trance increases μ-suppression in highly suggestible participants when they observe intentional actions. This suggests that social action processing is enhanced when bottom-up attentional processes are predominant. Our findings support the Social Relevance Hypothesis, according to which social action processing is a bottom-up driven attentional process, and can thus be altered as a function of bottom-up processing devoted to a social stimulus. PMID:26998562

  5. Analysis of the Economic Impact of Large-Scale Deployment of Biomass Resources for Energy and Materials in the Netherlands. Appendix 1. Bottom-up Scenarios

    International Nuclear Information System (INIS)

    The Bio-based Raw Materials Platform (PGG), part of the Energy Transition in The Netherlands, commissioned the Agricultural Economics Research Institute (LEI) and the Copernicus Institute of Utrecht University to conduct research on the macro-economic impact of large scale deployment of biomass for energy and materials in the Netherlands. Two model approaches were applied based on a consistent set of scenario assumptions: a bottom-up study including techno-economic projections of fossil and bio-based conversion technologies and a top-down study including macro-economic modelling of (global) trade of biomass and fossil resources. The results of the top-down and bottom-up modelling work are reported separately. The results of the synthesis of the modelling work are presented in the main report. This report (part 1) presents scenarios for future biomass use for energy and materials, and analyses the consequences for energy supply, chemical production, costs and greenhouse gas (GHG) emissions with a bottom-up approach. The bottom-up projections, as presented in this report, form the basis for modelling work using the top-down macro-economic model (LEITAP) to assess the economic impact of substituting fossil-based energy carriers with biomass in the Netherlands. The results of the macro-economic modelling work, and the linkage between the results of the bottom-up and top-down work, will be presented in the top-down economic part and synthesis report of this study.

  6. Top-down versus bottom-up estimates of methane fluxes over the East Siberian Arctic Shelf

    Science.gov (United States)

    Shakhova, N. E.; Semiletov, I. P.; Repina, I.; Salyuk, A.; Kosmach, D.; Chernykh, D.; Aniferov, A.

    2014-12-01

    Global methane (CH4) emissions are currently quantified from statistical data without testing the results against either the distribution of actual atmospheric CH4 concentrations observed in different parts of the globe or the regional dynamics of these concentrations. Measurement methods, despite having improved remarkably in the past few years, especially with the advent of new optical and satellite-derived methods, are limited in their applicability in the Arctic. Modeling methodologies are still under development and cannot yet refine the very coarse global-scale understanding of CH4 sources to the resolution of regional-scale emissions. As a result, the contribution of Arctic sources to the global CH4 budget is yet to be quantified adequately. We used a decadal observational data set collected from the water column and from the atmospheric boundary layer (ABL) over the East Siberian Arctic Shelf (ESAS), which is the largest continental shelf, to determine the minimum source strength required to explain the observed seasonal increase in CH4 concentration in the ABL. The results of top-down modeling performed by implementing a simple box model show good agreement with the results of bottom-up estimates made using interpretation of in-situ calibrated sonar data.
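
    As an illustration of the top-down box-model logic only (not the authors' actual model), the flux needed to explain a seasonal rise in ABL CH4 can be back-calculated from the concentration increase, the boundary-layer height and the accumulation time; every input value below is a placeholder:

      # Single-box, back-of-the-envelope estimate of the surface CH4 flux required
      # to raise the ABL mixing ratio by delta_ppb over a given number of days,
      # assuming a well-mixed box of height h_abl with no horizontal exchange or
      # chemical loss. All input values are illustrative placeholders.
      R = 8.314          # J mol-1 K-1
      M_CH4 = 16.04e-3   # kg mol-1

      def box_model_flux(delta_ppb, days, h_abl=500.0, pressure=101325.0, temp=263.0):
          mol_air_per_m3 = pressure / (R * temp)               # mol of air per m3
          d_mass = delta_ppb * 1e-9 * mol_air_per_m3 * M_CH4   # kg CH4 added per m3
          flux_kg_m2_s = d_mass * h_abl / (days * 86400.0)
          return flux_kg_m2_s * 1e6 * 86400.0                  # mg CH4 per m2 per day

      print(round(box_model_flux(delta_ppb=50.0, days=30.0), 2))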

  7. Encouraging the pursuit of advanced degrees in science and engineering: Top-down and bottom-up methodologies

    Science.gov (United States)

    Maddox, Anthony B.; Smith-Maddox, Renee P.; Penick, Benson E.

    1989-01-01

    The MassPEP/NASA Graduate Research Development Program (GRDP) whose objective is to encourage Black Americans, Mexican Americans, American Indians, Puerto Ricans, and Pacific Islanders to pursue graduate degrees in science and engineering is described. The GRDP employs a top-down or goal driven methodology through five modules which focus on research, graduate school climate, technical writing, standardized examinations, and electronic networking. These modules are designed to develop and reinforce some of the skills necessary to seriously consider the goal of completing a graduate education. The GRDP is a community-based program which seeks to recruit twenty participants from a pool of Boston-area undergraduates enrolled in engineering and science curriculums and recent graduates with engineering and science degrees. The program emphasizes that with sufficient information, its participants can overcome most of the barriers perceived as preventing them from obtaining graduate science and engineering degrees. Experience has shown that the top-down modules may be complemented by a more bottom-up or event-driven methodology. This approach considers events in the academic and professional experiences of participants in order to develop the personal and leadership skills necessary for graduate school and similar endeavors.

  8. Proteome digestion specificity analysis for rational design of extended bottom-up and middle-down proteomics experiments.

    Science.gov (United States)

    Laskay, Ünige A; Lobas, Anna A; Srzentić, Kristina; Gorshkov, Mikhail V; Tsybin, Yury O

    2013-12-01

    Mass spectrometry (MS)-based bottom-up proteomics (BUP) is currently the method of choice for large-scale identification and characterization of proteins present in complex samples, such as cell lysates, body fluids, or tissues. Technically, BUP relies on MS analysis of complex mixtures of small peptides. Recently, approaches such as middle-down proteomics (MDP, addressing up to 15 kDa peptides) and top-down proteomics (TDP, addressing proteins exceeding 15 kDa) have been gaining particular interest. Here we report on the bioinformatics study of both common and less frequently employed digestion procedures for complex protein mixtures specifically targeting the MDP approach. The aim of this study was to maximize the yield of protein structure information from MS data by optimizing peptide size distribution and sequence specificity. We classified peptides into four categories based on molecular weight: 0.6-3 (classical BUP), 3-7 (extended BUP), 7-15 kDa (MDP), and >15 kDa (TDP). Because of instrumentation-related considerations, we first advocate for the extended BUP approach as the potential near-future improvement of BUP. Therefore, we chose to optimize the number of unique peptides in the 3-7 kDa range while maximizing the number of represented proteins. The present study considers human, yeast, and bacterial proteomes. Results of the study can be further used for designing extended BUP or MDP experimental workflows. PMID:24171472
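
    The four mass classes translate directly into a simple binning rule; a toy sketch (class boundaries taken from the abstract, example masses invented) is:

      # Bin a peptide by molecular weight (kDa) into the four classes discussed in
      # the abstract. Thresholds come from the text; the test masses are made up.
      def classify_peptide(mass_kda):
          if mass_kda < 0.6:
              return "below classical BUP range"
          if mass_kda <= 3.0:
              return "classical BUP (0.6-3 kDa)"
          if mass_kda <= 7.0:
              return "extended BUP (3-7 kDa)"
          if mass_kda <= 15.0:
              return "MDP (7-15 kDa)"
          return "TDP (>15 kDa)"

      for mass in (1.2, 4.8, 9.5, 22.0):
          print(mass, "->", classify_peptide(mass))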

  9. Estimation of Emissions from Sugarcane Field Burning in Thailand Using Bottom-Up Country-Specific Activity Data

    Directory of Open Access Journals (Sweden)

    Wilaiwan Sornpoon

    2014-09-01

    Open burning in sugarcane fields is recognized as a major source of air pollution. However, in many regions of the world the assessment of its emission intensity still lacks information, especially regarding country-specific activity data such as biomass fuel load and combustion factor. A site survey was conducted covering 13 sugarcane plantations subject to different farm management practices and climatic conditions. The results showed that pre-harvest and post-harvest burnings are the two main practices followed in Thailand. In 2012, the total production of sugarcane biomass fuel, i.e., dead, dry and fresh leaves, amounted to 10.15 million tonnes, which is equivalent to a fuel density of 0.79 kg∙m−2. The average combustion factor for the pre-harvest and post-harvest burning systems was determined to be 0.64 and 0.83, respectively. Emissions from sugarcane field burning were estimated using the bottom-up country-specific values from the site survey of this study and the results compared with those obtained using default values from the 2006 IPCC Guidelines. The comparison showed that the use of default values leads to underestimating the overall emissions by up to 30%, because emissions from post-harvest burning, the second most common practice in Thailand, are not accounted for.
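
    The bottom-up estimate follows the generic IPCC field-burning relation, emissions = area burnt × fuel load × combustion factor × emission factor. A worked sketch using the fuel load and combustion factors reported in the abstract (the burnt area and the PM2.5 emission factor are placeholders) is:

      # Generic field-burning emission estimate (2006 IPCC Guidelines form):
      #   emissions [t] = A [ha] x M_B [t/ha] x C_f [-] x G_ef [g/kg dry matter] x 1e-3
      # Fuel load (0.79 kg m-2 = 7.9 t ha-1) and combustion factors (0.64 pre-harvest,
      # 0.83 post-harvest) are taken from the abstract; the burnt area and the PM2.5
      # emission factor below are illustrative placeholders only.
      def field_burning_emission(area_ha, fuel_t_per_ha, combustion_factor, ef_g_per_kg):
          return area_ha * fuel_t_per_ha * combustion_factor * ef_g_per_kg * 1e-3

      pre  = field_burning_emission(1000.0, 7.9, 0.64, 8.3)
      post = field_burning_emission(1000.0, 7.9, 0.83, 8.3)
      print(round(pre, 1), round(post, 1))   # tonnes of PM2.5 per 1000 ha burnt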

  10. Conservative and dissipative force field for simulation of coarse-grained alkane molecules: a bottom-up approach.

    Science.gov (United States)

    Trément, Sébastien; Schnell, Benoît; Petitjean, Laurent; Couty, Marc; Rousseau, Bernard

    2014-04-01

    We apply operational procedures available in the literature to the construction of coarse-grained conservative and friction forces for use in dissipative particle dynamics (DPD) simulations. The full procedure relies on a bottom-up approach: large molecular dynamics trajectories of n-pentane and n-decane modeled with an anisotropic united atom model serve as input for the force field generation. As a consequence, the coarse-grained model is expected to reproduce at least semi-quantitatively structural and dynamical properties of the underlying atomistic model. Two different coarse-graining levels are studied, corresponding to five and ten carbon atoms per DPD bead. The influence of the coarse-graining level on the generated force field contributions, namely, the conservative and the friction part, is discussed. It is shown that the coarse-grained model of n-pentane correctly reproduces self-diffusion and viscosity coefficients of real n-pentane, while the fully coarse-grained model for n-decane at ambient temperature over-predicts diffusion by a factor of 2. However, when the n-pentane coarse-grained model is used as a building block for a larger molecule (e.g., n-decane as a two-blob model), a much better agreement with experimental data is obtained, suggesting that the force field constructed is transferable to large macro-molecular systems. PMID:24712786
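
    For orientation, the generic Groot-Warren form of the DPD pair forces that such a procedure parameterizes (conservative, dissipative and random parts linked by the fluctuation-dissipation relation σ² = 2γk_BT) can be sketched as below; this is the standard textbook form, not the specific functions fitted by the authors from their atomistic trajectories.

      import numpy as np

      # Standard DPD pair force (Groot-Warren form): conservative + dissipative +
      # random contributions, with sigma^2 = 2*gamma*kB*T (fluctuation-dissipation).
      # Generic textbook form only, not the force field fitted in the paper.
      def dpd_pair_force(r_i, r_j, v_i, v_j, a_ij, gamma, kBT, r_c=1.0, dt=0.01, rng=None):
          rng = rng or np.random.default_rng(0)
          r_vec = np.asarray(r_i, float) - np.asarray(r_j, float)
          r = np.linalg.norm(r_vec)
          if r == 0.0 or r >= r_c:
              return np.zeros(3)
          e = r_vec / r
          w = 1.0 - r / r_c                              # weight function w(r)
          v_rel = np.asarray(v_i, float) - np.asarray(v_j, float)
          f_c = a_ij * w * e                             # conservative
          f_d = -gamma * w**2 * np.dot(e, v_rel) * e     # dissipative (w_D = w^2)
          f_r = np.sqrt(2.0 * gamma * kBT) * w * rng.standard_normal() / np.sqrt(dt) * e
          return f_c + f_d + f_r

      print(dpd_pair_force([0, 0, 0], [0.5, 0, 0], [0.1, 0, 0], [0, 0, 0],
                           a_ij=25.0, gamma=4.5, kBT=1.0))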

  11. Understanding agent-based models of financial markets: A bottom-up approach based on order parameters and phase diagrams

    Science.gov (United States)

    Lye, Ribin; Tan, James Peng Lung; Cheong, Siew Ann

    2012-11-01

    We describe a bottom-up framework, based on the identification of appropriate order parameters and determination of phase diagrams, for understanding progressively refined agent-based models and simulations of financial markets. We illustrate this framework by starting with a deterministic toy model, whereby N independent traders buy and sell M stocks through an order book that acts as a clearing house. The price of a stock increases whenever it is bought and decreases whenever it is sold. Price changes are updated by the order book before the next transaction takes place. In this deterministic model, all traders based their buy decisions on a call utility function, and all their sell decisions on a put utility function. We then make the agent-based model more realistic, by either having a fraction fb of traders buy a random stock on offer, or a fraction fs of traders sell a random stock in their portfolio. Based on our simulations, we find that it is possible to identify useful order parameters from the steady-state price distributions of all three models. Using these order parameters as a guide, we find three phases: (i) the dead market; (ii) the boom market; and (iii) the jammed market in the phase diagram of the deterministic model. Comparing the phase diagrams of the stochastic models against that of the deterministic model, we realize that the primary effect of stochasticity is to eliminate the dead market phase.
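
    A minimal sketch of an order-book toy market in this spirit, with an invented decision rule and tick size rather than the authors' call/put utility functions, might look like:

      import random

      # Toy market sketch: N traders trade M stocks through a clearing house; a
      # stock's price ticks up when bought and down when sold. A fraction f_buy of
      # decisions are random ("noise") buys; the deterministic rule here (sell the
      # most expensive stock held) is a placeholder, not the authors' utility functions.
      def simulate(n_traders=50, m_stocks=10, steps=1000, f_buy=0.1, tick=0.01, seed=0):
          rng = random.Random(seed)
          prices = [1.0] * m_stocks
          holdings = [[1] * m_stocks for _ in range(n_traders)]
          for _ in range(steps):
              for t in range(n_traders):
                  if rng.random() < f_buy:                       # random buy
                      s = rng.randrange(m_stocks)
                      prices[s] += tick
                      holdings[t][s] += 1
                  else:                                          # deterministic sell
                      held = [k for k in range(m_stocks) if holdings[t][k] > 0]
                      if not held:
                          continue
                      s = max(held, key=lambda k: prices[k])
                      prices[s] = max(tick, prices[s] - tick)
                      holdings[t][s] -= 1
          return prices   # inspect the price distribution for candidate order parameters

      print([round(p, 2) for p in simulate()])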

  12. Beyond Defining the Smart City. Meeting Top-Down and Bottom-Up Approaches in the Middle

    Directory of Open Access Journals (Sweden)

    Jonas Breuer

    2014-05-01

    This paper aims to better frame the discussion and the various, divergent operationalisations and interpretations of the Smart City concept. We start by explicating top-down approaches to the Smart City, followed by what purely bottom-up initiatives can look like. We provide a clear overview of stakeholders’ different viewpoints on the city of tomorrow. Particularly the consequences and potential impacts of these differing interpretations and approaches should be of specific interest to researchers, policy makers, city administrations, private actors and anyone involved and concerned with life in cities. Therefore the goal of this article is not so much answering the question of what the Smart City is, but rather what the concept can mean for different stakeholders as well as the consequences of their interpretation. We do this by assembling an eclectic overview, bringing together definitions, examples and operationalisations from academia, policy and industry as well as identifying major trends and approaches to realizing the Smart City. We add to the debate by proposing a different approach that starts from the collective, collaboration and context when researching Smart City initiatives.

  13. Estimation of the measurement uncertainty by the bottom-up approach for the determination of methamphetamine and amphetamine in urine.

    Science.gov (United States)

    Lee, Sooyeun; Choi, Hyeyoung; Kim, Eunmi; Choi, Hwakyung; Chung, Heesun; Chung, Kyu Hyuck

    2010-05-01

    The measurement uncertainty (MU) of methamphetamine (MA) and amphetamine (AP) was estimated in an authentic urine sample with a relatively low concentration of MA and AP using the bottom-up approach. A cause and effect diagram was deduced; the amount of MA or AP in the sample, the volume of the sample, method precision, and sample effect were considered uncertainty sources. The concentrations of MA and AP in the urine sample with their expanded uncertainties were 340.5 +/- 33.2 ng/mL and 113.4 +/- 15.4 ng/mL, respectively, corresponding to relative expanded uncertainties of 9.7% and 13.6% of the concentration, respectively. The largest uncertainty originated from sample effect and method precision in MA and AP, respectively, but the uncertainty of the volume of the sample was minimal in both. The MU needs to be determined during the method validation process to assess test reliability. Moreover, the identification of the largest and/or smallest uncertainty source can help improve experimental protocols.
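
    A bottom-up (GUM-style) budget combines the standard uncertainties of the individual sources in quadrature and multiplies by a coverage factor. A generic sketch with invented component values (not the paper's actual budget) is:

      import math

      # Bottom-up (GUM-style) combination of relative standard uncertainties in
      # quadrature, expanded with coverage factor k (k = 2 ~ 95 % coverage).
      # The component values below are invented placeholders, not the paper's budget.
      def expanded_uncertainty(value, rel_components, k=2.0):
          u_rel = math.sqrt(sum(u ** 2 for u in rel_components))
          return k * u_rel * value

      conc = 340.5   # ng/mL (MA concentration reported in the abstract)
      U = expanded_uncertainty(conc, rel_components=[0.030, 0.025, 0.020])
      print(f"{conc:.1f} +/- {U:.1f} ng/mL")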

  14. Top-down and Bottom-up Approaches in Production of Aqueous Nanocolloids of Low Soluble Drug Paclitaxel

    Science.gov (United States)

    Pattekari, P.; Zheng, Z.; Zhang, X.; Levchenko, T.; Torchilin, V.

    2015-01-01

    Nano-encapsulation of a poorly soluble anticancer drug was developed with sonication-assisted layer-by-layer polyelectrolyte coating (SLbL). We changed the strategy of LbL-encapsulation from making microcapsules with many layers in the walls for encasing highly soluble materials to using very thin polycation/polyanion coatings on poorly soluble nanoparticles to provide good colloidal stability. SLbL encapsulation of paclitaxel resulted in stable 100-200 nm diameter colloids with a high surface ζ-potential (-45 mV) and a drug content of 90 wt% in the nanoparticles. In the top-down approach, nanocolloids were prepared by rupturing paclitaxel powder using ultrasonication and simultaneous sequential adsorption of oppositely charged biocompatible polyelectrolytes. In the bottom-up approach, paclitaxel was dissolved in an organic solvent (ethanol or acetone), and drug nucleation was initiated by gradually worsening the solvent quality through the addition of aqueous polyelectrolyte, assisted by ultrasonication. Paclitaxel release from such nanocapsules was controlled by assembling multilayer shells of variable thickness, with release times in the range of 10-20 hours. PMID:21442095

  15. D-Branes at Singularities: A Bottom-Up Approach to the String Embedding of the Standard Model

    CERN Document Server

    Aldazabal, G; Quevedo, Fernando; Uranga, Angel M

    2000-01-01

    We propose a bottom-up approach to the building of particle physics models from string theory. Our building blocks are Type II D-branes which we combine appropriately to reproduce desirable features of a particle theory model: 1) Chirality; 2) Standard Model group; 3) N=1 or N=0 supersymmetry; 4) Three quark-lepton generations. We start such a program by studying configurations of D=10, Type IIB D3-branes located at singularities. We study in detail the case of Z_N, N=1,0 supersymmetric orbifold singularities leading to the SM group or some left-right symmetric extension. In general, tadpole cancellation conditions require the presence of additional branes, e.g. D7-branes. For the N=1 supersymmetric case the unique twist leading to three quark-lepton generations is Z_3, predicting $\sin^2\theta_W=3/14=0.21$. The models obtained are the simplest semirealistic string models ever built. In the non-supersymmetric case there is a three-generation model for each Z_N, N>4, but the Weinberg angle is in general too ...

  16. 2D FT-ICR MS of Calmodulin: A Top-Down and Bottom-Up Approach

    Science.gov (United States)

    Floris, Federico; van Agthoven, Maria; Chiron, Lionel; Soulby, Andrew J.; Wootton, Christopher A.; Lam, Yuko P. Y.; Barrow, Mark P.; Delsuc, Marc-André; O'Connor, Peter B.

    2016-09-01

    Two-dimensional Fourier transform ion cyclotron resonance mass spectrometry (2D FT-ICR MS) allows data-independent fragmentation of all ions in a sample and correlation of fragment ions to their precursors through the modulation of precursor ion cyclotron radii prior to fragmentation. Previous results show that implementation of 2D FT-ICR MS with infrared multi-photon dissociation (IRMPD) and electron capture dissociation (ECD) has turned this method into a useful analytical tool. In this work, IRMPD tandem mass spectrometry of calmodulin (CaM) has been performed both in one-dimensional and two-dimensional FT-ICR MS using a top-down and bottom-up approach. 2D IRMPD FT-ICR MS is used to achieve extensive inter-residue bond cleavage and assignment for CaM, using its unique features for fragment identification in an experiment that consumes less time and sample than sequential MS/MS experiments.

  17. A bottom-up method for module-based product platform development through mapping, clustering and matching analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Meng; LI Guo-xi; CAO Jian-ping; GONG Jing-zhong; WU Bao-zhong

    2016-01-01

    Designing a product platform can be an effective and efficient solution for manufacturing firms. Product platforms enable firms to provide increased product variety for the marketplace with as little variety between products as possible. Consumer products and modules already developed within a firm can be further investigated to assess the possibility of creating a product platform. A bottom-up method for module-based product platform development is proposed, based on mapping, clustering and matching analysis. The framework and the parametric model of the method are presented, which consist of three steps: (1) mapping parameters from existing product families to functional modules, (2) clustering the modules within existing module families based on their parameters so as to generate module clusters, and selecting the satisfactory module clusters based on commonality, and (3) matching the parameters of the module clusters to the functional modules in order to capture platform elements. In addition, a parameter matching criterion and a mismatching treatment are put forward to ensure the effectiveness of the platform process, while standardization and serialization of the platform elements are presented. A design case of a belt conveyor is studied to demonstrate the feasibility of the proposed method.

  18. Identifying robust communities and multi-community nodes by combining top-down and bottom-up approaches to clustering.

    Science.gov (United States)

    Gaiteri, Chris; Chen, Mingming; Szymanski, Boleslaw; Kuzmin, Konstantin; Xie, Jierui; Lee, Changkyu; Blanche, Timothy; Chaibub Neto, Elias; Huang, Su-Chun; Grabowski, Thomas; Madhyastha, Tara; Komashko, Vitalina

    2015-11-09

    Biological functions are carried out by groups of interacting molecules, cells or tissues, known as communities. Membership in these communities may overlap when biological components are involved in multiple functions. However, traditional clustering methods detect non-overlapping communities. These detected communities may also be unstable and difficult to replicate, because traditional methods are sensitive to noise and parameter settings. These aspects of traditional clustering methods limit our ability to detect biological communities, and therefore our ability to understand biological functions. To address these limitations and detect robust overlapping biological communities, we propose an unorthodox clustering method called SpeakEasy which identifies communities using top-down and bottom-up approaches simultaneously. Specifically, nodes join communities based on their local connections, as well as global information about the network structure. This method can quantify the stability of each community, automatically identify the number of communities, and quickly cluster networks with hundreds of thousands of nodes. SpeakEasy shows top performance on synthetic clustering benchmarks and accurately identifies meaningful biological communities in a range of datasets, including: gene microarrays, protein interactions, sorted cell populations, electrophysiology and fMRI brain imaging.

  19. Bottom-Up Fabrication of Activated Carbon Fiber for All-Solid-State Supercapacitor with Excellent Electrochemical Performance.

    Science.gov (United States)

    Ma, Wujun; Chen, Shaohua; Yang, Shengyuan; Chen, Wenping; Weng, Wei; Zhu, Meifang

    2016-06-15

    Activated carbon (AC) is the most extensively used electrode material for commercial electric double layer capacitors (EDLC) given its high specific surface area (SSA) and moderate cost. However, AC is primarily used in powder form, and developing AC powders into continuous fibers remains a major challenge. If AC powders can be processed into fibers, they may be scaled up for practical supercapacitor (SC) applications and meet the needs of rapidly developing flexible electronics. Herein, we report a bottom-up method to fabricate AC fiber employing graphene oxide (GO) as both dispersant and binder. After chemical reduction, the fiber has high electrical conductivity (185 S m(-1)), high specific surface area (1476.5 m(2) g(-1)), and good mechanical flexibility. An all solid-state flexible SC was constructed using the prepared fiber as electrode, which is free of binder, conducting additive, and additional current collector. The fiber-shaped SC shows high capacitance (27.6 F cm(-3) or 43.8 F g(-1), normalized to the two-electrode volume), superior cyclability (90.4% retention after 10 000 cycles), and good bendability (96.8% retention after bending 1000 times). PMID:27239680

  20. Temporal shifts in top-down vs. bottom-up control of epiphytic algae in a seagrass ecosystem

    Science.gov (United States)

    Whalen, Matthew A.; Duffy, J. Emmett; Grace, James B.

    2013-01-01

    In coastal marine food webs, small invertebrate herbivores (mesograzers) have long been hypothesized to occupy an important position facilitating dominance of habitat-forming macrophytes by grazing competitively superior epiphytic algae. Because of the difficulty of manipulating mesograzers in the field, however, their impacts on community organization have rarely been rigorously documented. Understanding mesograzer impacts has taken on increased urgency in seagrass systems due to declines in seagrasses globally, caused in part by widespread eutrophication favoring seagrass overgrowth by faster-growing algae. Using cage-free field experiments in two seasons (fall and summer), we present experimental confirmation that mesograzer reduction and nutrients can promote blooms of epiphytic algae growing on eelgrass (Zostera marina). In this study, nutrient additions increased epiphytes only in the fall following natural decline of mesograzers. In the summer, experimental mesograzer reduction stimulated a 447% increase in epiphytes, appearing to exacerbate seasonal dieback of eelgrass. Using structural equation modeling, we illuminate the temporal dynamics of complex interactions between macrophytes, mesograzers, and epiphytes in the summer experiment. An unexpected result emerged from investigating the interaction network: drift macroalgae indirectly reduced epiphytes by providing structure for mesograzers, suggesting that the net effect of macroalgae on seagrass depends on macroalgal density. Our results show that mesograzers can control proliferation of epiphytic algae, that top-down and bottom-up forcing are temporally variable, and that the presence of macroalgae can strengthen top-down control of epiphytic algae, potentially contributing to eelgrass persistence.

  1. A Nonminimal SO(10) x U(1)-F SUSY GUT model obtained from a bottom up approach

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Carl H.

    1996-08-01

    Many of the ingredients are explored which are needed to develop a supersymmetric SO(10) x U(1)_F grand unified model based on the Yukawa structure of a model previously constructed in collaboration with S. Nandi to explain the quark and lepton masses and mixings in a particular neutrino scenario. The U(1)_F family symmetry can be made anomaly-free with the introduction of one conjugate pair of SO(10)-singlet neutrinos with the same U(1)_F charge. Due to a plethora of conjugate pairs of supermultiplets, the model develops a Landau singularity within a factor of 1.5 above the GUT scale. With the imposition of a Z_2 discrete symmetry and under certain conditions, all higgsino triplets can be made superheavy while just one pair of higgsino doublets remains light and results in mass matrix textures previously obtained from the bottom-up approach. Diametrically opposite splitting of the first and third family scalar quark and lepton masses away from the second family ones results from the nonuniversal D-term contributions.

  2. Conservative and dissipative force field for simulation of coarse-grained alkane molecules: A bottom-up approach

    Energy Technology Data Exchange (ETDEWEB)

    Trément, Sébastien; Rousseau, Bernard, E-mail: bernard.rousseau@u-psud.fr [Laboratoire de Chimie-Physique, UMR 8000 CNRS, Université Paris-Sud, Orsay (France); Schnell, Benoît; Petitjean, Laurent; Couty, Marc [Manufacture Française des Pneumatiques MICHELIN, Centre de Ladoux, 23 place des Carmes, 63000 Clermont-Ferrand (France)

    2014-04-07

    We apply operational procedures available in the literature to the construction of coarse-grained conservative and friction forces for use in dissipative particle dynamics (DPD) simulations. The full procedure relies on a bottom-up approach: large molecular dynamics trajectories of n-pentane and n-decane modeled with an anisotropic united atom model serve as input for the force field generation. As a consequence, the coarse-grained model is expected to reproduce at least semi-quantitatively structural and dynamical properties of the underlying atomistic model. Two different coarse-graining levels are studied, corresponding to five and ten carbon atoms per DPD bead. The influence of the coarse-graining level on the generated force field contributions, namely, the conservative and the friction part, is discussed. It is shown that the coarse-grained model of n-pentane correctly reproduces self-diffusion and viscosity coefficients of real n-pentane, while the fully coarse-grained model for n-decane at ambient temperature over-predicts diffusion by a factor of 2. However, when the n-pentane coarse-grained model is used as a building block for a larger molecule (e.g., n-decane as a two-blob model), a much better agreement with experimental data is obtained, suggesting that the force field constructed is transferable to large macro-molecular systems.

  3. Pitch and spectral resolution: A systematic comparison of bottom-up cues for top-down repair of degraded speech.

    Science.gov (United States)

    Clarke, Jeanne; Başkent, Deniz; Gaudrain, Etienne

    2016-01-01

    The brain is capable of restoring missing parts of speech, a top-down repair mechanism that enhances speech understanding in noisy environments. This enhancement can be quantified using the phonemic restoration paradigm, i.e., the improvement in intelligibility when silent interruptions of interrupted speech are filled with noise. Benefit from top-down repair of speech differs between cochlear implant (CI) users and normal-hearing (NH) listeners. This difference could be due to poorer spectral resolution and/or weaker pitch cues inherent to CI-transmitted speech. In CIs, those two degradations cannot be teased apart because spectral degradation leads to weaker pitch representation. A vocoding method was developed to evaluate independently the roles of pitch and spectral resolution for restoration in NH individuals. Sentences were resynthesized with different spectral resolutions, either retaining the original pitch cues or discarding them entirely. The addition of pitch significantly improved restoration only at six-band spectral resolution. However, overall intelligibility of interrupted speech was improved both with the addition of pitch and with the increase in spectral resolution. This improvement may be due to better discrimination of speech segments from the filler noise, better grouping of speech segments together, and/or better bottom-up cues available in the speech segments. PMID:26827034
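
    To make the spectral-resolution manipulation concrete, a crude noise vocoder (band-split, envelope extraction, noise carrier) can be sketched as below, assuming NumPy and SciPy are available; this illustrates the general technique only and is not the pitch-preserving vocoding method developed in the study.

      import numpy as np
      from scipy.signal import butter, sosfiltfilt

      # Crude n-band noise vocoder: split the signal into log-spaced bands, extract
      # each band's envelope (rectify + low-pass), and re-impose the envelopes on
      # band-limited noise. Illustrates spectral degradation only; it is not the
      # pitch-preserving vocoder used in the study.
      def noise_vocode(x, fs, n_bands=6, f_lo=100.0, f_hi=7000.0, env_cut=30.0):
          edges = np.geomspace(f_lo, f_hi, n_bands + 1)
          carrier = np.random.default_rng(0).standard_normal(len(x))
          env_sos = butter(2, env_cut, btype="lowpass", fs=fs, output="sos")
          out = np.zeros(len(x))
          for lo, hi in zip(edges[:-1], edges[1:]):
              band_sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
              env = sosfiltfilt(env_sos, np.abs(sosfiltfilt(band_sos, x)))
              out += sosfiltfilt(band_sos, carrier) * np.clip(env, 0.0, None)
          return out / (np.max(np.abs(out)) + 1e-12)

      fs = 16000
      t = np.arange(fs) / fs
      harmonics = sum(np.sin(2 * np.pi * 150 * k * t) / k for k in range(1, 5))
      degraded = noise_vocode(harmonics, fs, n_bands=6)   # fewer bands = coarser spectrum
      print(degraded.shape)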

  4. Integration of bottom-up and top-down models for the energy system. A practical case for Denmark

    International Nuclear Information System (INIS)

    The main objective of the project was to integrate the Danish macroeconomic model ADAM with elements from the energy simulation model BRUS, developed at Risoe. The project has been carried out by Risoe National Laboratory with assistance from the Ministry of Finance. A theoretical part focuses on the differences between top-down and bottom-up modelling of the energy-economy interaction. A combined hybrid model seems a relevant alternative to the two traditional approaches. The hybrid model developed is called Hybris and includes models for: supply of electricity and heat, household demand for electricity, and household demand for heat. These three models interact in an iterative procedure with the macroeconomic model ADAM through a number of links. A reference case, as well as a number of scenarios illustrating the capabilities of the model, has been set up. Hybris is a simulation model which is capable of analyzing combined CO2 reduction initiatives, such as regulation of the energy supply system and a CO2 tax, in an integrated and consistent way. (au) 32 tabs., 98 ills., 55 refs

  5. Bottom-up estimation of joint moments during manual lifting using orientation sensors instead of position sensors.

    Science.gov (United States)

    Faber, Gert S; Kingma, Idsart; van Dieën, Jaap H

    2010-05-01

    L5/S1, hip and knee moments during manual lifting tasks are, in a laboratory environment, frequently established by bottom-up inverse dynamics, using force plates to measure ground reaction forces (GRFs) and an optoelectronic system to measure segment positions and orientations. For field measurements, alternative measurement systems are being developed. One alternative is the use of small body-mounted inertial/magnetic sensors (IMSs) and instrumented force shoes to measure segment orientation and GRFs, respectively. However, because IMSs measure segment orientations only, the positions of segments relative to each other and relative to the GRFs have to be determined by linking them, assuming fixed segment lengths and zero joint translation. This will affect the estimated joint positions and joint moments. This study investigated the effect of using segment orientations only (orientation-based method) instead of using orientations and positions (reference method) on three-dimensional joint moments. To compare analysis methods (and not measurement methods), GRFs were measured with a force plate and segment positions and/or orientations were measured using optoelectronic marker clusters for both analysis methods. Eleven male subjects lifted a box from floor level using three lifting techniques: a stoop, a semi-squat and a squat technique. The difference between the two analysis methods remained small for the knee moments, and joint moments could be estimated with reasonable accuracy at the hip and L5/S1 joints using segment orientation and GRF data only.
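
    For intuition, the bottom-up step for a single joint in a 2D, quasi-static case reduces to summing the moments of the ground reaction force and of the segment weight about the joint centre. The geometry and forces in the sketch below are invented placeholders, not the study's data.

      # 2D, quasi-static bottom-up joint moment: moment of the ground reaction force
      # (applied at the centre of pressure) about the joint, plus the moment of the
      # segment weight, with inertial terms ignored. All numbers are placeholders.
      def cross2(a, b):
          return a[0] * b[1] - a[1] * b[0]          # z-component of a 2D cross product

      def joint_moment_2d(joint, cop, grf, seg_com, seg_mass, g=9.81):
          m_grf = cross2((cop[0] - joint[0], cop[1] - joint[1]), grf)
          m_weight = cross2((seg_com[0] - joint[0], seg_com[1] - joint[1]),
                            (0.0, -seg_mass * g))
          return m_grf + m_weight                   # N*m, positive = counter-clockwise

      # Example: ankle moment of a foot segment loaded by a 400 N vertical GRF.
      print(round(joint_moment_2d(joint=(0.00, 0.10), cop=(0.05, 0.00),
                                  grf=(0.0, 400.0), seg_com=(0.03, 0.05),
                                  seg_mass=1.0), 2))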

  6. Bottom-up catalytic approach towards nitrogen-enriched mesoporous carbons/sulfur composites for superior Li-S cathodes.

    Science.gov (United States)

    Sun, Fugen; Wang, Jitong; Chen, Huichao; Qiao, Wenming; Ling, Licheng; Long, Donghui

    2013-01-01

    We demonstrate a sustainable and efficient route to high-performance sulfur/carbon composite cathodes via a bottom-up catalytic approach. The selective oxidation of H2S by a nitrogen-enriched mesoporous carbon catalyst can produce elemental sulfur as a by-product that deposits in situ onto the carbon framework. Due to the metal-free catalytic character and high catalytic selectivity, the resulting sulfur/carbon composites contain almost no impurities and can thus be used as cathode materials without compromising battery performance. The layer-by-layer sulfur deposition allows atomic sulfur to bind strongly with the carbon framework, providing efficient immobilization of sulfur. The nitrogen atoms doped on the carbon framework can increase the surface interactions with polysulfides, improving the trapping of polysulfides. Thus, the composites exhibit a reversible capacity of 939 mAh g(-1) after 100 cycles at 0.2 C and an excellent rate capability of 527 mAh g(-1) at 5 C after 70 cycles.

  7. A novel bottom-up process to produce nanoparticles containing protein and peptide for suspension in hydrofluoroalkane propellants.

    Science.gov (United States)

    Tan, Yinhe; Yang, Zhiwen; Peng, Xinsheng; Xin, Feng; Xu, Yuehong; Feng, Min; Zhao, Chunshun; Hu, Haiyan; Wu, Chuanbin

    2011-07-15

    To overcome the disadvantages of microemulsion and nanoprecipitation methods to produce protein-containing nanoparticles, a novel bottom-up process was developed to produce nanoparticles containing the model protein lysozyme. The nanoparticles were generated by freeze-drying a solution of lysozyme, lecithin and lactose in a tert-butyl alcohol (TBA)/water co-solvent system and washing off excess lecithin in the lyophilizate by centrifugation. Formulation parameters such as lecithin concentration in the organic phase, water content in the TBA/water co-solvent, and lactose concentration in water were optimized so as to obtain the desired nanoparticles with retention of the bioactivity of lysozyme. Based on the results, 24.0% (w/v) of lecithin, 37.5% (v/v) of water content, and 0.56% (w/v) of lactose concentration were selected to generate spherical nanoparticles with a mean size of approximately 200 nm, a polydispersity index (PI) of 0.1, and 99% retained bioactivity of lysozyme. These nanoparticles, rinsed with ethanol containing dipalmitoylphosphatidylcholine (DPPC), Span 85 or oleic acid (3%, w/v), could readily be dispersed in HFA 134a to form a stable suspension with good redispersibility and 98% retained bioactivity of lysozyme. The study indicates there is a potential to produce pressurized metered dose inhaler (pMDI) formulations containing therapeutic protein and peptide nanoparticles.

  8. Top-Down and Bottom-Up Approaches in 3D Printing Technologies for Drug Delivery Challenges.

    Science.gov (United States)

    Katakam, Prakash; Dey, Baishakhi; Assaleh, Fathi H; Hwisa, Nagiat Tayeb; Adiki, Shanta Kumari; Chandu, Babu Rao; Mitra, Analava

    2015-01-01

    3-Dimensional printing (3DP) constitutes a raft of technologies, based on different physical mechanisms, that generate a 3-dimensional physical object from a digital model. Because of its rapid fabrication and precise geometry, 3DP has gained a prominent focus in biomedical and nanobiomaterials research. Despite advancements in targeted, controlled, and pulsatile drug delivery, the achievement of site-specific and disease-responsive drug release and stringent control over in vivo biodistribution are still some of the important, challenging areas for pharmaceutical research and development and existing drug delivery techniques. Microelectronic industries are capable of generating nano-/microdrug delivery devices at high throughputs with a highly precise control over design. Successful miniaturizations of micro-pumps with multireservoir architectures for delivery of pharmaceuticals developed by micro-electromechanical systems technology were more acceptable than implantable devices. Inkjet printing technologies, which dispense a precise amount of polymer ink solutions, find applications in controlled drug delivery. Bioelectronic products have revolutionized drug delivery technologies. Designing nanoparticles by nanoimprint lithography showed a controlled drug release pattern, biodistribution, and in vivo transport. This review highlights the "top-down" and "bottom-up" approaches of the most promising 3DP technologies and their broader applications in biomedical and therapeutic drug delivery, with critical assessment of their merits, demerits, and intellectual property rights challenges. PMID:25746205

  9. Biochemistry-directed hollow porous microspheres: bottom-up self-assembled polyanion-based cathodes for sodium ion batteries

    Science.gov (United States)

    Lin, Bo; Li, Qiufeng; Liu, Baodong; Zhang, Sen; Deng, Chao

    2016-04-01

    Biochemistry-directed synthesis of functional nanomaterials has attracted great interest in energy storage, catalysis and other applications. The unique ability of biological systems to guide molecule self-assembling facilitates the construction of distinctive architectures with desirable physicochemical characteristics. Herein, we report a biochemistry-directed "bottom-up" approach to construct hollow porous microspheres of polyanion materials for sodium ion batteries. Two kinds of polyanions, i.e. Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2, are employed as cases in this study. The microalgae cell realizes the formation of a spherical "bottom" bio-precursor. Its tiny core is subjected to destruction and its tough shell tends to carbonize upon calcination, resulting in the hollow porous microspheres for the "top" product. The nanoscale crystals of the polyanion materials are tightly enwrapped by the highly-conductive framework in the hollow microsphere, resulting in the hierarchical nano-microstructure. The whole formation process is disclosed as a "bottom-up" mechanism. Moreover, the biochemistry-directed self-assembly process is confirmed to play a crucial role in the construction of the final architecture. Taking advantage of the well-defined hollow-microsphere architecture, the abundant interior voids and the highly-conductive framework, polyanion materials show favourable sodium-intercalation kinetics. Both materials are capable of high-rate long-term cycling. After five hundred cycles at 20 C and 10 C, Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2 retain 96.2% and 93.1% of the initial capacity, respectively. Therefore, the biochemistry-directed technique provides a low-cost, highly-efficient and widely applicable strategy to produce high-performance polyanion-based cathodes for sodium ion batteries.

  10. Bioenergy decision-making of farms in Northern Finland: Combining the bottom-up and top-down perspectives

    International Nuclear Information System (INIS)

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues favoring large-scale energy visions, additional investment support for agriculture will stay modest. To utilize fully the energy potential in farms, we analyze the farmers' decision-making environment. First, we present an overview of the Finnish energy policy and economy and their effect on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided in three different groups - investors, entrepreneurs and hobbyists - that have different levels of commitment to their energy businesses. This further stresses the importance of getting quality business services from numerous service providers.

  11. Bioenergy decision-making of farms in Northern Finland: Combining the bottom-up and top-down perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Snaekin, Juha-Pekka, E-mail: juhapekkasnakin@luukku.co [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland); Muilu, Toivo; Pesola, Tuomo [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland)

    2010-10-15

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues favoring large-scale energy visions, additional investment support for agriculture will stay modest. To utilize fully the energy potential in farms, we analyze the farmers' decision-making environment. First, we present an overview of the Finnish energy policy and economy and their effect on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided in three different groups - investors, entrepreneurs and hobbyists - that have different levels of commitment to their energy businesses. This further stresses the importance of getting quality business services from numerous service providers.

  12. Bioenergy decision-making of farms in Northern Finland. Combining the bottom-up and top-down perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Snaekin, Juha-Pekka; Muilu, Toivo; Pesola, Tuomo [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland)

    2010-10-15

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues favoring large-scale energy visions, additional investment support for agriculture will stay modest. To utilize fully the energy potential in farms, we analyze the farmers' decision-making environment. First, we present an overview of the Finnish energy policy and economy and their effect on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided in three different groups - investors, entrepreneurs and hobbyists - that have different levels of commitment to their energy businesses. This further stresses the importance of getting quality business services from numerous service providers. (author)

  13. Competitive Brand Salience

    OpenAIRE

    Ralf van der Lans; Rik Pieters; Michel Wedel

    2008-01-01

    Brand salience—the extent to which a brand visually stands out from its competitors—is vital in competing on the shelf, yet is not easy to achieve in practice. This study proposes a methodology to determine the competitive salience of brands, based on a model of visual search and eye-movement recordings collected during a brand search experiment. We estimate brand salience at the point of purchase, based on perceptual features (color, luminance, edges) and how these are influenced by consumer...

  14. A bottom-up approach to derive the closure relation for modelling hydrological fluxes at the watershed scale

    Science.gov (United States)

    Vannametee, Ekkamol; Karssenberg, Derek; Hendriks, Martin; Bierkens, Marc

    2014-05-01

    Physically-based hydrological modelling could be considered an ideal approach for predictions in ungauged basins because observable catchment characteristics can be used to parameterize the model, avoiding model calibration using discharge data, which are not available. Lumped physically-based modelling at the watershed scale is possible with the Representative Elementary Watershed (REW) approach. A key to successful application of this approach is to find a reliable way of developing closure relations to calculate fluxes from different hydrological compartments in the REWs. Here, we present a bottom-up approach as a generic framework to identify the closure relations for particular hydrological processes that are scale-independent and can be directly parameterized using the local-scale observable REW characteristics. The approach is illustrated using the Hortonian runoff as an example. This approach starts from developing a physically-based high-resolution model describing the Hortonian runoff mechanism based on physically-based infiltration theory and runoff generation processes at a local scale. This physically-based model is used to generate a synthetic discharge data set of hypothetical rainfall events and HRUs (6×10⁵ scenarios) as a surrogate for real-world observations. The Hortonian runoff closure relation is developed as a lumped process-based model, consisting of the Green-Ampt equation, a time-lagged linear reservoir model, and three scale-transfer parameters representing the processes within REWs. These scale-transfer parameters are identified by calibrating the closure relations against the synthetic discharge data set for each scenario run, which are, in turn, empirically related to their corresponding observable REW properties and rainstorm characteristics. This results in a parameter library, which allows direct estimation of the scaling parameters for arbitrary REWs based on their local-scale observable properties and rainfall characteristics.
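
    A minimal sketch of the closure-relation structure described above is given below, assuming simple Green-Ampt infiltration feeding a time-lagged linear reservoir; the parameter names (alpha, k, tau) stand in for the three scale-transfer parameters, and all numerical values are illustrative assumptions rather than the study's calibrated results.

    ```python
    # Toy reconstruction of a Hortonian-runoff closure relation of the kind described:
    # Green-Ampt infiltration feeding a time-lagged linear reservoir. The three
    # scale-transfer parameters are represented here by (alpha, k, tau); names and
    # values are assumptions, not the calibrated parameters of the study.

    import numpy as np

    def green_ampt_capacity(F, Ks, psi, dtheta):
        """Infiltration capacity [mm/h] at cumulative infiltration F [mm]."""
        return Ks * (1.0 + psi * dtheta / max(F, 1e-6))

    def hortonian_closure(rain, dt, Ks, psi, dtheta, alpha, k, tau):
        """REW-scale runoff [mm/h] for a rainfall series rain [mm/h] at time step dt [h]."""
        F, S = 0.0, 0.0                      # cumulative infiltration, reservoir storage
        excess = []
        for r in rain:
            fc = green_ampt_capacity(F, alpha * Ks, psi, dtheta)   # alpha: scale-transfer factor
            infil = min(r, fc)
            F += infil * dt
            excess.append(r - infil)         # infiltration-excess (Hortonian) runoff
        lag = int(round(tau / dt))           # tau: pure time lag [h]
        delayed = [0.0] * lag + excess
        q = []
        for q_in in delayed[:len(rain)]:     # k: linear-reservoir constant [h]
            S += (q_in - S / k) * dt
            q.append(S / k)
        return np.asarray(q)

    # Example: a 2-hour, 30 mm/h storm sampled every 6 minutes
    dt = 0.1
    rain = np.where(np.arange(0.0, 6.0, dt) < 2.0, 30.0, 0.0)
    q = hortonian_closure(rain, dt, Ks=10.0, psi=110.0, dtheta=0.3, alpha=0.8, k=0.5, tau=0.3)
    print(f"peak runoff {q.max():.1f} mm/h, event total {q.sum() * dt:.1f} mm")
    ```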

  15. The benefits of China's efforts on gaseous pollutant control indicated by the bottom-up emissions and satellite observation

    Science.gov (United States)

    Xia, Y.; Zhao, Y.

    2015-12-01

    To evaluate the effectiveness of national air pollution control policies, the emissions of SO2, NOX, CO and CO2 in China are estimated with a bottom-up method from 2000 to 2014, and vertical column densities (VCDs) from satellite observation are used to evaluate the inter-annual trends and spatial distribution of emissions and the temporal and spatial patterns of ambient levels of gaseous pollutants across the country. In particular, an additional emission case named the STD case, which incorporates the most recently issued emission standards for specific industrial sources, is developed for 2012-2014. The inter-annual trends in emissions and VCDs match well except for SO2, and the revised emissions in the STD case improve the comparison, implying the benefits of emission control in the most recent years. Satellite retrieval error, underestimation of emission reductions and improved atmospheric oxidization caused the differences between the emission and VCD trends of SO2. Coal-fired power plants play key roles in SO2 and NOX emission reduction. As suggested by the VCDs and the emission inventory, the control of CO in the 11th Five-Year Plan (FYP) period was more effective than in the 12th FYP period, while the opposite was true for SO2. As a new control target added in the 12th FYP, NOX emissions clearly decreased by 4.3 Mt from 2011 to 2014, in contrast to the fast growth before 2011. The inter-annual trend in NO2 VCDs has the poorest correlation with vehicle ownership (R=0.796), due to the staged emission standards for vehicles. In developed regions, transportation has become the main source of pollutant emissions, which we support by comparing the VCDs of NO2 with those of SO2. Moreover, air quality in mega cities has been evaluated based on satellite observations and emissions, and the results indicate that Beijing suffered heavily from emissions from Hebei and Tianjin, while local emissions tend to dominate in Shanghai.
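
    A bottom-up inventory of this kind sums, over sources, activity data multiplied by emission factors and corrected for control measures. The sketch below shows only the arithmetic; the sector names, activity levels, emission factors and control shares are hypothetical placeholders, not values from the study.

    ```python
    # Bottom-up emission estimate: E = sum over sources of activity * EF, with the
    # controlled share of each source abated by its removal efficiency.
    # All numbers below are illustrative placeholders.

    SOURCES = [
        # (sector, activity [PJ], SO2 emission factor [t/PJ], controlled share, removal efficiency)
        ("coal power",         8000.0, 250.0, 0.90, 0.95),
        ("industrial boilers", 5000.0, 300.0, 0.30, 0.80),
        ("residential coal",   1500.0, 280.0, 0.00, 0.00),
    ]

    def so2_total_kt(sources):
        total_t = 0.0
        for _sector, activity, ef, ctrl_share, removal in sources:
            controlled   = activity * ctrl_share * ef * (1.0 - removal)
            uncontrolled = activity * (1.0 - ctrl_share) * ef
            total_t += controlled + uncontrolled
        return total_t / 1e3          # tonnes -> kilotonnes

    print(f"SO2 emissions: {so2_total_kt(SOURCES):.0f} kt")
    ```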

  16. Using the Hestia bottom-up FFCO2 emissions estimation to identify drivers and hotspots in urban areas

    Science.gov (United States)

    Rao, P.; Patarasuk, R.; Gurney, K. R.; o'Keefe, D.; Song, Y.; Huang, J.; Buchert, M.; Lin, J. C.; Mendoza, D. L.; Ehleringer, J. R.; Eldering, A.; Miller, C. E.; Duren, R. M.

    2015-12-01

    Urban areas occupy 3% of the earth's land surface and generate 75% of the fossil fuel carbon dioxide (FFCO2) emissions. We report on the application of the Hestia Project to the Salt Lake County (SLC) and Los Angeles (LA) domains. Hestia quantifies FFCO2 in fine space-time detail across urban domains using a scientific "bottom-up" approach. We explore the utility of Hestia to inform both urbanization science and greenhouse gas (GHG) mitigation policy. We focus on the residential sector in SLC and the onroad sector in LA as these sectors are large emissions contributors in each locale, and local governments have some authority and policy levers to mitigate these emissions. Multiple regression using sociodemographic data across SLC census block-groups shows that per capita income exhibits a positive relationship with FFCO2 emissions while household size exhibits a negative relationship, after controlling for total population. Housing units per area (i.e., compact development) has little effect on FFCO2 emissions. Rising income in the high-income group has twice as much impact on the emissions as in the low-income group. Household size in the low-income group has four times the impact on the emissions as in the high-income group. In LA, onroad FFCO2 emissions account for 49% of total emissions, of which 41% is from arterials (the intermediate road class). Arterials also have the largest carbon emission intensity (FFCO2 per vehicle distance travelled, VKT), possibly from high traffic congestion and fleet composition. Non-interstate hotspot emissions (> 419 tC lane-km-1) are equally dominated by particular arterials and collectors (the lowest road class), though collectors have a higher VKT. These hotspots occur largely in LA (67%) and Orange (18%) counties and provide targeted information for onroad emissions reduction. Using Hestia to identify FFCO2 emissions drivers and hotspots can aid state and local policy makers in planning the most effective GHG reductions.
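
    The block-group regression reported above can be illustrated with a small sketch: per-capita residential FFCO2 regressed on per-capita income, household size and total population. The data below are synthetic and the coefficients are arbitrary; the point is only the form of the model, not the study's estimates.

    ```python
    # Ordinary least squares on synthetic block-group data, mirroring the reported
    # model form: FFCO2 ~ income + household size + population. Values are made up.

    import numpy as np
    import statsmodels.api as sm   # assumes statsmodels is available

    rng = np.random.default_rng(0)
    n = 500                                      # hypothetical census block-groups
    income  = rng.normal(35.0, 10.0, n)          # per-capita income [k$]
    hh_size = rng.normal(3.0, 0.8, n)            # persons per household
    pop     = rng.normal(1500.0, 400.0, n)       # residents per block-group
    ffco2   = 2.0 + 0.08 * income - 0.9 * hh_size + 0.004 * pop + rng.normal(0.0, 1.5, n)

    X = sm.add_constant(np.column_stack([income, hh_size, pop]))
    fit = sm.OLS(ffco2, X).fit()
    print(fit.params)   # expected sign pattern: +income, -household size, +population
    ```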

  17. Comparing bottom-up and top-down parameterisations of a process-based runoff generation model tailored on floods

    Science.gov (United States)

    Antonetti, Manuel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-04-01

    Information about the spatial distribution of dominant runoff processes (DRPs) can improve flood predictions on ungauged basins, where conceptual rainfall-runoff models usually appear to be limited due to the need for calibration. For example, hydrological classifications based on DRPs can be used as regionalisation tools assuming that, once a model structure and its parameters have been identified for each DRP, they can be transferred to other areas where the same DRP occurs. Here we present a process-based runoff generation model as an event-based spin-off of the conceptual hydrological model PREVAH. The model is grid-based and consists of a specific storage system for each DRP. To unbind the parameter values from catchment-related characteristics, the runoff concentration and the flood routing are uncoupled from the runoff generation routine and simulated separately. For the model parameterisation, two contrasting approaches are applied. First, in a bottom-up approach, the parameters of the runoff generation routine are determined a priori based on the results of sprinkling experiments on 60-100 m2 hillslope plots at several grassland locations in Switzerland. The model is, then, applied on a small catchment (0.5 km2) on the Swiss Plateau, and the parameters linked to the runoff concentration are calibrated on a single heavy rainfall-runoff event. The whole system is finally verified on several nearby catchments of larger sizes (up to 430 km2) affected by different heavy rainfall events. In a second attempt, following a top-down approach, all the parameters are calibrated on the largest catchment under investigation and successively verified on three sub-catchments. Simulation results from both parameterisation techniques are finally compared with results obtained with the traditional PREVAH.

  18. Bottom-up fabrication of artery-mimicking tubular co-cultures in collagen-based microchannel scaffolds.

    Science.gov (United States)

    Tan, A; Fujisawa, K; Yukawa, Y; Matsunaga, Y T

    2016-10-20

    We developed a robust bottom-up approach to construct open-ended, tubular co-culture constructs that simulate the human vascular morphology and microenvironment. By design, these three-dimensional artificial vessels mimic the basic architecture of an artery: a collagen-rich extracellular matrix (as the tunica externa), smooth muscle cells (SMCs) (as the tunica media), and an endothelial cell (EC) lining (as the tunica interna). A versatile needle-based fabrication technique was employed to achieve controllable arterial layouts within a PDMS-hosted collagen microchannel scaffold (330 ± 10 μm in diameter): (direct co-culture) a SMC/EC bilayer to follow the structure of an arteriole-like segment; and (encapsulated co-culture) a lateral SMC multilayer covered by an EC monolayer lining to simulate the architecture of a larger artery. Optical and fluorescence microscopy images clearly evidenced the progressive cell elongation and sprouting behavior of SMCs and ECs along the collagen gel contour and within the gel matrix under static co-culture conditions. The progressive cell growth patterns effectively led to the formation of a tubular co-culture with an internal endothelial lining expressing prominent CD31 (cluster of differentiation 31) intercellular junction markers. During a 4-day static maturation period, the artery constructs showed modest alteration in the luminal diameters (i.e. less than 10% changes from the initial measurements). This argues in favor of stable and predictable arterial architecture achieved via the proposed fabrication protocols. Both co-culture models showed a high glucose metabolic rate during the initial proliferation phase, followed by a temporary quiescent (and thus, mature) stage. These proof-of-concept models with a controllable architecture create an important foundation for advanced vessel manipulations such as the integration of relevant physiological functionality or remodeling into a vascular disease-mimicking tissue. PMID

  19. Regime shift from phytoplankton to macrophyte dominance in a large river: Top-down versus bottom-up effects

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Carles, E-mail: carles.ibanez@irta.cat [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alcaraz, Carles; Caiola, Nuno; Rovira, Albert; Trobajo, Rosa [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alonso, Miguel [United Research Services S.L., Urgell 143, 08036 Barcelona, Catalonia (Spain); Duran, Concha [Confederacion Hidrografica del Ebro, Sagasta 24-26, 50071 Zaragoza, Aragon (Spain); Jimenez, Pere J. [Grup Natura Freixe, Major 56, 43750 Flix, Catalonia (Spain); Munne, Antoni [Agencia Catalana de l' Aigua, Provenca 204-208, 08036 Barcelona, Catalonia (Spain); Prat, Narcis [Departament d' Ecologia, Universitat de Barcelona, Diagonal 645, 08028 Barcelona Catalonia (Spain)

    2012-02-01

    The lower Ebro River (Catalonia, Spain) has recently undergone a regime shift from a phytoplankton-dominated to a macrophyte-dominated system. This shift is well known in shallow lakes but apparently it has never been documented in rivers. Two initial hypotheses to explain the collapse of the phytoplankton were considered: a) the diminution of nutrients (bottom-up); b) the filtering effect due to the colonization of the zebra mussel (top-down). Data on water quality, hydrology and biological communities (phytoplankton, macrophytes and zebra mussel) were obtained both from existing data sets and new surveys. Results clearly indicate that the decrease in phosphorus is the main cause of a dramatic decrease in chlorophyll and a large increase in water transparency, triggering the subsequent colonization of macrophytes in the river bed. A Generalized Linear Model analysis showed that the decrease in dissolved phosphorus had a relative importance 14 times higher than the increase in zebra mussel density to explain the variation of total chlorophyll. We suggest that the described changes in the lower Ebro River can be considered a novel ecosystem shift. This shift is triggering remarkable changes in the biological communities beyond the decrease of phytoplankton and the proliferation of macrophytes, such as massive colonization of Simuliidae (black fly) and other changes in the benthic invertebrate communities that are currently investigated. - Highlights: • We show a regime shift in a large river from phytoplankton to macrophyte dominance. • Two main hypotheses are considered: nutrient decrease and zebra mussel grazing. • Phosphorus depletion is found to be the main cause of the phytoplankton decline. • We conclude that oligotrophication triggered the colonization of macrophytes. • This new regime shift in a river is similar to that described

  20. Motivation and drives in bottom-up developments in natural hazards management: multiple-use of adaptation strategies in Austria

    Science.gov (United States)

    Thaler, Thomas; Fuchs, Sven

    2015-04-01

    Losses from extreme hydrological events, such as those recently experienced in Europe, have focused the attention of policymakers as well as researchers on vulnerability to natural hazards. In parallel, the context of changing flood risks under climate and societal change is driving a transformation in the role of the state in responsibility sharing and in individual responsibilities for risk management and precaution. The new policy agenda enhances the responsibilities of local authorities and private citizens in hazard management and reduces the role of central governments. The objective is to place added responsibility on local organisations and citizens to determine locally based strategies for risk reduction. A major challenge of modelling adaptation is to represent the complexity of coupled human-environmental systems and particularly the feedback loops between environmental dynamics and human decision-making processes on different scales. This paper focuses on bottom-up initiatives in flood risk management which are, by definition, different from the mainstream. These initiatives are clearly influenced (positively or negatively) by a number of factors, where the combination of these interdependences can create specific conditions that alter the opportunity for effective governance arrangements in a local scheme approach. In total, this study identified six general drivers which encourage the implementation of flood storages, such as a direct relation to recent major flood frequency and history, the initiative of individual stakeholders (promoters), political pressure from outside (e.g. business companies, private households) and a strong solidarity attitude of the municipalities and stakeholders involved. Although the partnership approach may be seen as an 'optimal' solution for flood risk management, in practice there are many limitations and barriers to establishing these collaborations and making them effective (especially in the long term) with the consequences

  1. Independent effects of bottom-up temporal expectancy and top-down spatial attention. An audiovisual study using rhythmic cueing.

    Directory of Open Access Journals (Sweden)

    Alexander Jones

    2015-01-01

    Full Text Available Selective attention to a spatial location has been shown to enhance perception and facilitate behaviour for events at attended locations. However, selection relies not only on where but also on when an event occurs. Recently, interest has turned to how intrinsic neural oscillations in the brain entrain to rhythms in our environment, and stimuli appearing in or out of sync with a rhythm have been shown to modulate perception and performance. Temporal expectations created by rhythms and spatial attention are two processes which have independently been shown to affect stimulus processing, but it remains largely unknown how, and if, they interact. In four separate tasks, this study investigated the effects of voluntary spatial attention and bottom-up temporal expectations created by rhythms in both unimodal and crossmodal conditions. In each task the participant used an informative cue, either colour or pitch, to direct their covert spatial attention to the left or right, and responded as quickly as possible to a target. The lateralized target (visual or auditory) was then presented on the attended or unattended side. Importantly, although not task relevant, the cue was a rhythm of either flashes or beeps. The target was presented in or out of sync (early or late) with the rhythmic cue. The results showed participants were faster at responding to spatially attended than to unattended targets in all tasks. Moreover, there was an effect of rhythmic cueing upon response times in both unimodal and crossmodal conditions. Responses were faster to targets presented in sync with the rhythm compared to when they appeared too early, in both crossmodal tasks. That is, rhythmic stimuli in one modality influenced the temporal expectancy in the other modality, suggesting temporal expectancies created by rhythms are crossmodal. Interestingly, there was no interaction between top-down spatial attention and rhythmic cueing in any task, suggesting these two processes largely influenced

  2. Bottom-up engineering of biological systems through standard bricks: a modularity study on basic parts and devices.

    Directory of Open Access Journals (Sweden)

    Lorenzo Pasotti

    Full Text Available BACKGROUND: Modularity is a crucial issue in the engineering world, as it enables engineers to achieve predictable outcomes when different components are interconnected. Synthetic Biology aims to apply key concepts of engineering to design and construct new biological systems that exhibit a predictable behaviour. Even if physical and measurement standards have been recently proposed to facilitate the assembly and characterization of biological components, real modularity is still a major research issue. The success of the bottom-up approach strictly depends on the clear definition of the limits in which biological functions can be predictable. RESULTS: The modularity of transcription-based biological components has been investigated in several conditions. First, the activity of a set of promoters was quantified in Escherichia coli via different measurement systems (i.e., different plasmids, reporter genes, ribosome binding sites) relative to an in vivo reference promoter. Second, promoter activity variation was measured when two independent gene expression cassettes were assembled in the same system. Third, the interchangeability of input modules (a set of constitutive promoters and two regulated promoters) connected to a fixed output device (a logic inverter expressing GFP) was evaluated. The three input modules provide tunable transcriptional signals that drive the output device. If modularity persists, identical transcriptional signals trigger identical GFP outputs. To verify this, all the input devices were individually characterized and then the input-output characteristic of the logic inverter was derived in the different configurations. CONCLUSIONS: Promoter activities (referred to a standard promoter) can vary when they are measured via different reporter devices (up to 22%), when they are used within a two-expression-cassette system (up to 35%), and when they drive another device in a functionally interconnected circuit (up to 44%). This paper
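
    The interconnection test described above hinges on the static input-output characteristic of the output device. As a toy illustration (not the paper's measured data), the snippet below models a transcriptional NOT gate as a Hill repression curve in relative promoter units (RPU); all parameter values are assumptions.

    ```python
    # Static transfer curve of a logic inverter (repressible promoter driving GFP),
    # modelled as a Hill repression function. Parameters are illustrative only.

    def inverter_output(p_in, y_max=1.0, y_min=0.05, K=0.3, n=2.0):
        """Map input transcriptional activity p_in [RPU] to GFP synthesis rate [RPU]."""
        return y_min + (y_max - y_min) * K**n / (K**n + p_in**n)

    for rpu in (0.01, 0.1, 0.3, 1.0, 3.0):
        print(f"input {rpu:4.2f} RPU -> output {inverter_output(rpu):.3f} RPU")

    # If modularity held perfectly, any input device delivering the same RPU level
    # would produce the same output; the deviations reported above (up to ~44%)
    # quantify how far real circuits depart from that ideal.
    ```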

  3. Research on ethics in two large Human Biomonitoring projects ECNIS and NewGeneris: a bottom up approach

    Directory of Open Access Journals (Sweden)

    Casteleyn Ludwine

    2008-01-01

    Full Text Available Abstract Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research including human subjects. Ethical reference values are often extrapolated from clinical settings, where the emphasis lies on decisional autonomy and protection of the individual's privacy. The question arises whether this set of values used in clinical research can be considered a relevant reference for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectations on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against the public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in the ethical assessment of similar research activities between different countries and even within one country. All this causes delay and puts researchers in situations in which it is unclear how to act in accordance with the necessary legal requirements. Therefore, an analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed with a bottom-up approach, based on a methodology for comparative analysis of determinants in ethical reasoning, which allows different social, cultural, political and historical traditions to be taken into account, in view of safeguarding common EU values. Based on information collected in real-life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly explore possible obstacles and to identify options for improvement in regulation. The material collected will allow the development of an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach

  4. Assisted editing of SensorML with EDI. A bottom-up scenario towards the definition of sensor profiles.

    Science.gov (United States)

    Oggioni, Alessandro; Tagliolato, Paolo; Fugazza, Cristiano; Bastianini, Mauro; Pavesi, Fabio; Pepe, Monica; Menegon, Stefano; Basoni, Anna; Carrara, Paola

    2015-04-01

    A by-product of this ongoing work currently constitutes an archive of predefined sensor descriptions. This information is being collected in order to further ease metadata creation in the next phase of the project. Users will be able to choose among a number of sensor and sensor platform prototypes: these will be specific instances on which it will be possible to define, in a bottom-up approach, "sensor profiles". We report on the outcome of this activity.

  5. Research on ethics in two large Human Biomonitoring projects ECNIS and NewGeneris: a bottom up approach.

    Science.gov (United States)

    Dumez, Birgit; Van Damme, Karel; Casteleyn, Ludwine

    2008-01-01

    Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research including human subjects. Ethical reference values are often extrapolated from clinical settings, where the emphasis lies on decisional autonomy and protection of the individual's privacy. The question arises whether this set of values used in clinical research can be considered a relevant reference for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectations on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against the public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in the ethical assessment of similar research activities between different countries and even within one country. All this causes delay and puts researchers in situations in which it is unclear how to act in accordance with the necessary legal requirements. Therefore, an analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed with a bottom-up approach, based on a methodology for comparative analysis of determinants in ethical reasoning, which allows different social, cultural, political and historical traditions to be taken into account, in view of safeguarding common EU values. Based on information collected in real-life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly explore possible obstacles and to identify options for improvement in regulation. The material collected will allow the development of an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach. This will not only increase

  6. Novelty seeking, incentive salience and acquisition of cocaine self-administration in the rat.

    Science.gov (United States)

    Beckmann, Joshua S; Marusich, Julie A; Gipson, Cassandra D; Bardo, Michael T

    2011-01-01

    It has been suggested that incentive salience plays a major role in drug abuse and the development of addiction. Additionally, novelty seeking has been identified as a significant risk factor for drug abuse. However, how differences in the readiness to attribute incentive salience relate to novelty seeking and drug abuse vulnerability has not been explored. The present experiments examined how individual differences in incentive salience attribution relate to novelty seeking and acquisition of cocaine self-administration in a preclinical model. Rats were first assessed in an inescapable novelty task and a novelty place preference task (measures of novelty seeking), followed by a Pavlovian conditioned approach task for food (a measure of incentive salience attribution). Rats then were trained to self-administer cocaine (0.3 or 1.0 mg/kg/infusion) using an autoshaping procedure. The results demonstrate that animals that attributed incentive salience to a food-associated cue were higher novelty seekers and acquired cocaine self-administration more quickly at the lower dose. The results suggest that novelty-seeking behavior may be a mediator of incentive salience attribution and that incentive salience magnitude may be an indicator of drug reward.

  7. Salience-Affected Neural Networks

    CERN Document Server

    Remmelzwaal, Leendert A; Ellis, George F R

    2010-01-01

    We present a simple neural network model which combines a locally-connected feedforward structure, as is traditionally used to model inter-neuron connectivity, with a layer of undifferentiated connections which model the diffuse projections from the human limbic system to the cortex. This new layer makes it possible to model global effects such as salience, at the same time as the local network processes task-specific or local information. This simple combination network displays interactions between salience and regular processing which correspond to known effects in the developing brain, such as enhanced learning as a result of heightened affect. The cortex biases neuronal responses to affect both learning and memory, through the use of diffuse projections from the limbic system to the cortex. Standard ANNs do not model this non-local flow of information represented by the ascending systems, which are a significant feature of the structure of the brain, and although they do allow associational learning with...
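
    One way to picture the idea of a global salience signal acting alongside local connectivity is a toy layer whose weight updates are gated by a single scalar. The sketch below is a loose reconstruction under that assumption, not the authors' network or training rule; all parameter values are made up.

    ```python
    # Toy salience-gated Hebbian layer: a global scalar s (mimicking diffuse
    # limbic-to-cortex projections) multiplies every weight update, so heightened
    # "affect" during learning leaves a stronger memory trace. Illustrative only.

    import numpy as np

    rng = np.random.default_rng(1)

    def train_layer(pattern, salience, steps=50, lr=0.05):
        W = rng.normal(0.0, 0.1, (4, pattern.size))     # 4 units, dense for simplicity
        for _ in range(steps):
            y = np.tanh(W @ pattern)
            W += lr * salience * np.outer(y, pattern)   # salience gates plasticity
            W *= 0.999                                  # mild decay keeps weights bounded
        return np.linalg.norm(np.tanh(W @ pattern))     # response strength after learning

    pattern = rng.normal(0.0, 1.0, 8)
    for s in (0.1, 1.0):                                # low vs high salience at encoding
        print(f"salience {s}: learned response strength {train_layer(pattern, s):.3f}")
    ```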

  8. Top-Down and Bottom-Up Approaches in Engineering 1T Phase Molybdenum Disulfide (MoS2): Towards Highly Catalytically Active Materials.

    Science.gov (United States)

    Chua, Chun Kiang; Loo, Adeline Huiling; Pumera, Martin

    2016-09-26

    The metallic 1T phase of MoS2 has been widely identified to be responsible for the improved performances of MoS2 in applications including hydrogen evolution reactions and electrochemical supercapacitors. To this aim, various synthetic methods have been reported to obtain 1T phase-rich MoS2. Here, the aim is to evaluate the efficiencies of the bottom-up (hydrothermal reaction) and top-down (chemical exfoliation) approaches in producing 1T phase MoS2. It is established in this study that the 1T phase MoS2 produced through the bottom-up approach contains a high proportion of 1T phase and demonstrates excellent electrochemical and electrical properties. Its performance in the hydrogen evolution reaction and electrochemical supercapacitors also surpassed that of 1T phase MoS2 produced through a top-down approach.

  9. Energy-environment policy modeling of endogenous technological change with personal vehicles. Combining top-down and bottom-up methods

    International Nuclear Information System (INIS)

    The transportation sector offers substantial potential for greenhouse gas (GHG) emission abatement, but widely divergent cost estimates complicate policy making; energy-economy policy modelers apply top-down and bottom-up cost definitions and different assumptions about future technologies and the preferences of firms and households. Our hybrid energy-economy policy model is technology-rich, like a bottom-up model, but has empirically estimated behavioral parameters for risk and technology preferences, like a top-down model. Unlike typical top-down models, however, it simulates technological change endogenously with functions that relate the financial costs of technologies to cumulative production and adjust technology preferences as market shares change. We apply it to the choice of personal vehicles to indicate, first, the effect on cost estimates of divergent cost definitions and, second, the possible response to policies that require a minimum market share for low emission vehicles
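
    The two mechanisms named here, behaviourally parameterised technology choice and costs that decline with cumulative production, can be sketched in a few lines. The market-share function, progress ratio, cost figures and intangible-cost term below are illustrative assumptions in the spirit of such hybrid models, not the model's actual equations or data.

    ```python
    # Toy hybrid-model core: (1) a market-share function with a heterogeneity
    # parameter v and an intangible-cost premium on the new technology, and
    # (2) capital costs that fall endogenously with cumulative production.

    import numpy as np

    def market_shares(perceived_cost, v=8.0):
        """Higher v -> buyers cluster on the cheapest option (logit-like weighting)."""
        w = perceived_cost ** (-v)
        return w / w.sum()

    def simulate(years=20, progress_ratio=0.85, sales_per_year=1e6):
        cap    = np.array([25_000.0, 40_000.0])   # conventional vs low-emission vehicle [$]
        intang = np.array([0.0, 6_000.0])         # perceived extra cost of the new option [$]
        cum_q  = np.array([1e7, 1e4])             # cumulative units produced
        b = -np.log2(progress_ratio)              # learning-by-doing exponent
        for _ in range(years):
            shares = market_shares(cap + intang)
            q = shares * sales_per_year
            cap *= ((cum_q + q) / cum_q) ** (-b)  # endogenous cost decline
            cum_q += q
        return shares, cap

    shares, cap = simulate()
    print("final market shares:", np.round(shares, 3))
    print("final capital costs [$]:", np.round(cap, 0))
    ```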

  10. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    OpenAIRE

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Background: Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose: To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results: In the first step, a diagnostic study was conducted with face-to-face interviews to g...

  11. Diagnostic, design and implementation of an integrated model of care in France: a bottom-up process with a continuous leadership

    OpenAIRE

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Purpose To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Context Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Case description In the first step, a diagnostic study was conducted with face-to-face interviews ...

  12. "Disorganized in time": Impact of bottom-up and top-down negative emotion generation on memory formation among healthy and traumatized adolescents.

    OpenAIRE

    Guillery-Girard, Bérengère; Clochon, Patrice; Giffard, Bénédicte; Viard, Armelle; Egler, Pierre-Jean; Baleyte, Jean-Marc; Eustache, Francis; Dayan, Jacques

    2013-01-01

    "Travelling in time," a central feature of episodic memory, is severely affected among individuals with Post Traumatic Stress Disorder (PTSD), with two opposite effects: vivid traumatic memories are unorganized in temporality (bottom-up processes), while non-traumatic personal memories tend to lack spatio-temporal details and false recognitions occur more frequently than in the general population (top-down processes). To test the effect of these two types of processes (i.e. ...

  13. Modeling technical change in energy system analysis: analyzing the introduction of learning-by-doing in bottom-up energy models

    International Nuclear Information System (INIS)

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aimed at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research, and embeds important policy implications, not the least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options-which is absent in many top-down models-they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they often fail in capturing strategic technology diffusion behavior in the energy sector as well as the energy sector's endogenous responses to policy, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). Some suggestions on how innovation and diffusion modeling in bottom-up analysis can be improved are put forward
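
    The learning curve these models embed has a standard single-factor form: unit cost falls by a fixed fraction for every doubling of cumulative capacity. The snippet below states that relation with illustrative numbers; the progress ratio and costs are not taken from the survey.

    ```python
    import math

    def specific_cost(Q, C0=1000.0, Q0=1.0, progress_ratio=0.82):
        """Unit cost after cumulative capacity Q: each doubling of Q multiplies
        the cost by the progress ratio (here 0.82, i.e. an 18% learning rate)."""
        b = -math.log2(progress_ratio)      # learning-by-doing exponent
        return C0 * (Q / Q0) ** (-b)

    for d in range(5):
        print(f"cumulative capacity {2**d:2d} -> unit cost {specific_cost(2**d):7.1f} $/kW")
    ```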

  14. Modeling Technical Change in Energy System Analysis: Analyzing the Introduction of Learning-by-Doing in Bottom-up Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Berglund, Christer; Soederholm, Patrik [Luleaa Univ. of Technology (Sweden). Div. of Economics

    2005-02-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aiming at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research, and embeds important policy implications, not the least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options - which is absent in many top-down models - they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they fail in capturing strategic technology diffusion behavior in the energy sector, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). For these reasons bottom-up and top-down models with induced technical change should not be viewed as substitutes but rather as complements.

  15. Tax Salience, Voting, and Deliberation

    DEFF Research Database (Denmark)

    Sausgruber, Rupert; Tyran, Jean-Robert

    Tax incentives can be more or less salient, i.e. noticeable or cognitively easy to process. Our hypothesis is that taxes on consumers are more salient to consumers than equivalent taxes on sellers because consumers underestimate the extent of tax shifting in the market. We show that tax salience biases consumers' voting on tax regimes, and that experience is an effective de-biasing mechanism in the experimental laboratory. Pre-vote deliberation makes initially held opinions more extreme rather than correct and does not eliminate the bias in the typical committee. Yet, if voters can discuss their experience with the tax regimes they are less likely to be biased.

  16. Stakeholder salience in humanitarian supply chain management

    OpenAIRE

    Schiffling, Sarah

    2013-01-01

    Mitchell et al. (1997) developed a framework for assessing the salience of stakeholder groups based on their power, urgency and the legitimacy of their claim. This has been applied to illustrate the complexities of stakeholder interactions in humanitarian supply chains and to provide insights for their management and further research. Keywords: Supply chain management, Humanitarian logistics, Stakeholder salience

  17. Salience of Alcohol Expectancies and Drinking Outcomes.

    Science.gov (United States)

    Reese, Finetta L.

    1997-01-01

    Investigated whether the prediction of drinking might be enhanced by considering salience of alcohol expectancies rather than mere endorsement. Hierarchical regression analyses demonstrated that expectancy salience significantly improved the prediction of total alcohol consumption above and beyond the effects of expectancy endorsement. Expectancy…

  18. Exploring the Life Expectancy Increase in Poland in the Context of CVD Mortality Fall: The Risk Assessment Bottom-Up Approach, From Health Outcome to Policies.

    Science.gov (United States)

    Kobza, Joanna; Geremek, Mariusz

    2015-01-01

    Life expectancy at birth is considered the best mortality-based summary indicator of the health status of the population and is useful for measuring long-term health changes. The objective of this article was to present the concept of the bottom-up policy risk assessment approach, developed to identify challenges involved in analyzing risk factor reduction policies and in assessing how the related health indicators have changed over time. This article focuses on the reasons for the significant life expectancy prolongation in Poland over the past 2 decades, and thus includes the policy context. The methodology details a bottom-up risk assessment approach, a chain of relations between the health outcome, risk factors, and health policy, based on Risk Assessment From Policy to Impact Dimension project guidance. A decline in cardiovascular disease mortality was a key factor that followed life expectancy prolongation. Among basic factors, tobacco and alcohol consumption, diet, physical activity, and new treatment technologies were identified. Poor health outcomes of the Polish population at the beginning of the 1990s highlighted the need for the implementation of various health promotion programs, legal acts, and more effective public health policies. Evidence-based public health policy requires the translation of scientific research into policy and practice. The bottom-up case study template can be one of the focal tools in this process. Accountability for the health impact of policies and programs and legitimization of the decisions of policy makers has become one of the key questions nowadays in European countries' decision-making process and in EU public health strategy. PMID:26546595

  19. Exploring the Life Expectancy Increase in Poland in the Context of CVD Mortality Fall: The Risk Assessment Bottom-Up Approach, From Health Outcome to Policies.

    Science.gov (United States)

    Kobza, Joanna; Geremek, Mariusz

    2015-01-01

    Life expectancy at birth is considered the best mortality-based summary indicator of the health status of the population and is useful for measuring long-term health changes. The objective of this article was to present the concept of the bottom-up policy risk assessment approach, developed to identify challenges involved in analyzing risk factor reduction policies and in assessing how the related health indicators have changed over time. This article focuses on the reasons for the significant life expectancy prolongation in Poland over the past 2 decades, and thus includes the policy context. The methodology details a bottom-up risk assessment approach, a chain of relations between the health outcome, risk factors, and health policy, based on Risk Assessment From Policy to Impact Dimension project guidance. A decline in cardiovascular disease mortality was a key factor that followed life expectancy prolongation. Among basic factors, tobacco and alcohol consumption, diet, physical activity, and new treatment technologies were identified. Poor health outcomes of the Polish population at the beginning of the 1990s highlighted the need for the implementation of various health promotion programs, legal acts, and more effective public health policies. Evidence-based public health policy requires the translation of scientific research into policy and practice. The bottom-up case study template can be one of the focal tools in this process. Accountability for the health impact of policies and programs and legitimization of the decisions of policy makers has become one of the key questions nowadays in European countries' decision-making process and in EU public health strategy.

  20. Bottom-Up Enhancement of g-C3N4 Photocatalytic H2 Evolution Utilising Disordering Intermolecular Interactions of Precursor

    Directory of Open Access Journals (Sweden)

    Xue Lu Wang

    2014-01-01

    Full Text Available A carbon nitride precursor with disordered intermolecular interactions, prepared by water-assisted grinding of dicyandiamide, was used for the synthesis of g-C3N4. The final sample possesses a much looser structure and provides a broadened optical window for effective light harvesting and charge separation, and it exhibits significantly improved H2 evolution by photocatalytic water splitting. The bottom-up mechanochemistry method opens new vistas towards the potential applications of weak interactions in the photocatalysis field and may also stimulate novel ideas, completely different from traditional ones, for the design and optimization of photocatalysts.

  1. Assessing the construct validity of aberrant salience

    Directory of Open Access Journals (Sweden)

    Kristin Schmidt

    2009-12-01

    Full Text Available We sought to validate the psychometric properties of a recently developed paradigm that aims to measure salience attribution processes proposed to contribute to positive psychotic symptoms, the Salience Attribution Test (SAT). The “aberrant salience” measure from the SAT showed good face validity in previous results, with elevated scores both in high-schizotypy individuals and in patients with schizophrenia suffering from delusions. Exploring the construct validity of salience attribution variables derived from the SAT is important, since other factors, including latent inhibition/learned irrelevance, attention, probabilistic reward learning, sensitivity to probability, general cognitive ability and working memory, could influence these measures. Fifty healthy participants completed schizotypy scales, the SAT, a learned irrelevance task, and a number of other cognitive tasks tapping into potentially confounding processes. Behavioural measures of interest from each task were entered into a principal components analysis, which yielded a five-factor structure accounting for ~75% of the variance in behaviour. Implicit aberrant salience was found to load onto its own factor, which was associated with elevated “Introvertive Anhedonia” schizotypy, replicating our previous finding. Learned irrelevance loaded onto a separate factor, which also included implicit adaptive salience, but was not associated with schizotypy. Explicit adaptive and aberrant salience, along with a measure of probabilistic learning, loaded onto a further factor, though this also did not correlate with schizotypy. These results suggest that the measures of learned irrelevance and implicit adaptive salience might be based on similar underlying processes, which are dissociable both from implicit aberrant salience and explicit measures of salience.
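
    The factor structure reported here comes from a principal components analysis over standardised task measures. A minimal sketch of that analysis step is given below on synthetic data; the variable names and loadings are placeholders and do not reproduce the study's measures or results.

    ```python
    # PCA over standardised behavioural measures, as a schematic of the analysis step.
    # Data are synthetic: two latent processes plus an unrelated control measure.

    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(42)
    n = 50                                                  # participants
    latent = rng.normal(size=(n, 2))
    measures = np.column_stack([
        latent[:, 0] + 0.3 * rng.normal(size=n),            # e.g. implicit aberrant salience
        latent[:, 0] + 0.3 * rng.normal(size=n),            # e.g. a schizotypy subscale
        latent[:, 1] + 0.3 * rng.normal(size=n),            # e.g. learned irrelevance
        latent[:, 1] + 0.3 * rng.normal(size=n),            # e.g. implicit adaptive salience
        rng.normal(size=n),                                 # unrelated control task
    ])

    Z = StandardScaler().fit_transform(measures)
    pca = PCA(n_components=3).fit(Z)
    print("variance explained:", np.round(pca.explained_variance_ratio_, 2))
    print("loadings (components x measures):")
    print(np.round(pca.components_, 2))
    ```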

  2. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    Directory of Open Access Journals (Sweden)

    Matthieu de Stampa

    2010-02-01

    Full Text Available Background: Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose: To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results: In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out by a continuous and flexible leadership throughout the process, initially with a mixed leadership (clinician and researcher) followed by a double one (clinician and managers of services) in the implementation phase. Conclusion: The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the changes in practices and services arrangements

  3. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    Directory of Open Access Journals (Sweden)

    Matthieu de Stampa

    2010-02-01

    Full Text Available Background: Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose: To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results: In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out by a continuous and flexible leadership throughout the process, initially with a mixed leadership (clinician and researcher) followed by a double one (clinician and managers of services) in the implementation phase. Conclusion: The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the changes in practices and services arrangements

  4. Increased water salinity applied to tomato plants accelerates the development of the leaf miner Tuta absoluta through bottom-up effects.

    Science.gov (United States)

    Han, Peng; Wang, Zhi-Jian; Lavoir, Anne-Violette; Michel, Thomas; Seassau, Aurélie; Zheng, Wen-Yan; Niu, Chang-Ying; Desneux, Nicolas

    2016-01-01

    Variation in resource inputs to plants may trigger bottom-up effects on herbivorous insects. We examined the effects of water input: optimal water vs. limited water; water salinity: with vs. without addition of 100 mM NaCl; and their interactions on tomato plants (Solanum lycopersicum), and consequently, the bottom-up effects on the tomato leaf miner, Tuta absoluta (Meyrick) (Lepidoptera: Gelechiidae). Plant growth was significantly impeded by limited water input and NaCl addition. In terms of leaf chemical defense, the production of tomatidine significantly increased with limited water and NaCl addition, and a similar but non-significant trend was observed for the other glycoalkaloids. Tuta absoluta survival did not vary with the water and salinity treatments, but the treatment "optimal water-high salinity" increased the development rate without lowering pupal mass. Our results suggest that caution should be used in the IPM program against T. absoluta when irrigating tomato crops with saline water. PMID:27619473

  5. Bottom-Up Application Layer Multicast Tree Reconstruction Algorithm

    Institute of Scientific and Technical Information of China (English)

    邓正伟; 李锋

    2011-01-01

    Based on an analysis of the shortcomings of traditional application layer multicast tree reconstruction algorithms, and combined with a proactive reconstruction technique, a bottom-up application layer multicast tree reconstruction algorithm is proposed. The algorithm employs a bottom-up strategy that combines local and global selection strategies for the choice of backup parent node. Simulation results show that the algorithm improves the recovery delay of the multicast tree, the quality of the reconstructed tree and the control overhead of tree reconstruction.
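
    The reconstruction strategy summarized above can be illustrated with a short sketch, assuming (hypothetically) that every node keeps a measured delay and a precomputed local backup parent, and falls back to a global search over the surviving tree when that backup is lost; none of the names below come from the paper.

        # Illustrative bottom-up tree repair step (hypothetical, not the paper's code).
        # Each node stores a proactively chosen local backup parent; if that backup is
        # also lost, a global selection scans the surviving tree for the lowest-delay node.
        class Node:
            def __init__(self, name, delay=0.0):
                self.name = name
                self.delay = delay          # assumed delay metric to this node
                self.parent = None
                self.children = []
                self.local_backup = None    # proactively chosen backup parent

        def reconnect_bottom_up(orphans, surviving):
            """Reattach orphaned nodes, deepest (largest delay) first."""
            for node in sorted(orphans, key=lambda n: n.delay, reverse=True):
                backup = node.local_backup
                new_parent = backup if backup in surviving else min(surviving, key=lambda n: n.delay)
                node.parent = new_parent
                new_parent.children.append(node)
                surviving.add(node)         # recovered node can now adopt others

        root, a, b = Node("root"), Node("a", 10), Node("b", 25)
        a.parent = root; root.children.append(a)
        b.local_backup = a                  # local (proactive) backup choice
        reconnect_bottom_up([b], {root, a})
        print(b.parent.name)                # -> "a"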

  6. Increased water salinity applied to tomato plants accelerates the development of the leaf miner Tuta absoluta through bottom-up effects

    Science.gov (United States)

    Han, Peng; Wang, Zhi-jian; Lavoir, Anne-Violette; Michel, Thomas; Seassau, Aurélie; Zheng, Wen-yan; Niu, Chang-ying; Desneux, Nicolas

    2016-01-01

    Variation in resource inputs to plants may trigger bottom-up effects on herbivorous insects. We examined the effects of water input: optimal water vs. limited water; water salinity: with vs. without addition of 100 mM NaCl; and their interactions on tomato plants (Solanum lycopersicum), and consequently, the bottom-up effects on the tomato leaf miner, Tuta absoluta (Meyrick) (Lepidoptera: Gelechiidae). Plant growth was significantly impeded by limited water input and NaCl addition. In terms of leaf chemical defense, the production of tomatidine significantly increased with limited water and NaCl addition, and a similar but non-significant trend was observed for the other glycoalkaloids. Tuta absoluta survival did not vary with the water and salinity treatments, but the treatment “optimal water-high salinity” increased the development rate without lowering pupal mass. Our results suggest that caution should be used in the IPM program against T. absoluta when irrigating tomato crops with saline water. PMID:27619473

  7. Objective correlates of pitch salience using pupillometry

    DEFF Research Database (Denmark)

    Bianchi, Federica; Santurette, Sébastien; Wendt, Dorothea;

    2014-01-01

    [...] with increasing effort in performing the task and thus with decreasing pitch salience. A group of normal-hearing listeners first performed a behavioral pitch-discrimination experiment, where fundamental frequency difference limens (F0 DLs) were measured as a function of F0. Results showed that pitch salience [...] the frequency region and F0, were considered. Pupil size was measured for each condition, while the subjects' task was to detect the deviants by pressing a response button. The expected trend was that pupil size would increase with decreasing salience. Results for musically trained listeners showed [...] the expected trend, whereby pupil size significantly increased with decreasing salience of the stimuli. Non-musically trained listeners showed, however, a smaller pupil size for the least salient condition as compared to a medium salient condition, probably due to a too demanding task [...]

  8. Effects of Communication Mode and Salience on Recasts: A First Exposure Study

    Science.gov (United States)

    Yilmaz, Yucel; Yuksel, Dogan

    2011-01-01

    This article reports on a study that investigated whether the extent to which learners benefit from recasts on two Turkish morphemes differs depending on communication mode--i.e. Face-to-Face Communication (F2FC) and text-based Synchronous Computer-Mediated Communication (SCMC)--and/or the salience of the target structure (i.e. salient and…

  9. Isolating the Incentive Salience of Reward-Associated Stimuli: Value, Choice, and Persistence

    Science.gov (United States)

    Beckmann, Joshua S.; Chow, Jonathan J.

    2015-01-01

    Sign- and goal-tracking are differentially associated with drug abuse-related behavior. Recently, it has been hypothesized that sign- and goal-tracking behavior are mediated by different neurobehavioral valuation systems, including differential incentive salience attribution. Herein, we used different conditioned stimuli to preferentially elicit…

  10. Quantifying object salience by equating distractor effects

    OpenAIRE

    Huang, L Q; Pashler, Harold

    2005-01-01

    It is commonly believed that objects viewed in certain contexts may be more or less salient. Measurements of salience have usually relied on asking observers "How much does this object stand out against the background?". In this study, we measured the salience of objects by assessing the distraction they produce for subjects searching for a different, pre-specified target. Distraction was measured through response times, but changes in response times were not assumed to be a linear measure of...

  11. Discrimination learning with variable stimulus 'salience'

    OpenAIRE

    2011-01-01

    Background: In nature, sensory stimuli are organized in heterogeneous combinations. Salient items from these combinations 'stand out' from their surroundings and determine what and how we learn. Yet, the relationship between varying stimulus salience and discrimination learning remains unclear. Presentation of the hypothesis: A rigorous formulation of the problem of discrimination learning should account for varying salience effects. We hypothesize that structural variations in the environment ...

  12. Language-experience plasticity in neural representation of changes in pitch salience.

    Science.gov (United States)

    Krishnan, Ananthanarayan; Gandour, Jackson T; Suresh, Chandan H

    2016-04-15

    Neural representation of pitch-relevant information at the brainstem and cortical levels of processing is influenced by language experience. A well-known attribute of pitch is its salience. Brainstem frequency following responses and cortical pitch specific responses, recorded concurrently, were elicited by a pitch salience continuum spanning weak to strong pitch of a dynamic, iterated rippled noise pitch contour-homolog of a Mandarin tone. Our aims were to assess how language experience (Chinese, English) affects i) enhancement of neural activity associated with pitch salience at brainstem and cortical levels, ii) the presence of asymmetry in cortical pitch representation, and iii) patterns of relative changes in magnitude along the pitch salience continuum. Peak latency (Fz: Na, Pb, and Nb) was shorter in the Chinese than the English group across the continuum. Peak-to-peak amplitude (Fz: Na-Pb, Pb-Nb) of the Chinese group grew larger with increasing pitch salience, but an experience-dependent advantage was limited to the Na-Pb component. At temporal sites (T7/T8), the larger amplitude of the Chinese group across the continuum was both limited to the Na-Pb component and the right temporal site. At the brainstem level, F0 magnitude grew larger with increasing pitch salience and likewise showed an advantage for the Chinese group. A direct comparison of cortical and brainstem responses for the Chinese group reveals different patterns of relative changes in magnitude along the pitch salience continuum. Such differences may point to a transformation in pitch processing at the cortical level presumably mediated by local sensory and/or extrasensory influence overlaid on the brainstem output. PMID:26903418

  13. Toward consistency between bottom-up CO2 emissions trends and top-down atmospheric measurements in the Los Angeles megacity

    Science.gov (United States)

    Newman, S.; Xu, X.; Gurney, K. R.; Hsu, Y.-K.; Li, K.-F.; Jiang, X.; Keeling, R.; Feng, S.; O'Keefe, D.; Patarasuk, R.; Wong, K. W.; Rao, P.; Fischer, M. L.; Yung, Y. L.

    2015-10-01

    Large urban emissions of greenhouse gases result in large atmospheric enhancements relative to background that are easily measured. Using CO2 mole fractions and Δ14C and δ13C values of CO2 in the Los Angeles megacity observed in inland Pasadena (2006-2013) and coastal Palos Verdes peninsula (autumn 2009-2013), we have determined time series for CO2 contributions from fossil fuel combustion for both sites and broken those down into contributions from petroleum/gasoline and natural gas burning for Pasadena. We find a 10 % reduction in Pasadena CO2 emissions from fossil fuel combustion during the Great Recession of 2008-2010, which is consistent with the bottom-up inventory determined by the California Air Resources Board. The isotopic variations and total atmospheric CO2 from our observations are used to infer seasonality of natural gas and petroleum combustion. For natural gas, inferred emissions are out of phase with the seasonal cycle of total natural gas combustion seasonal patterns in bottom-up inventories but are consistent with the seasonality of natural gas usage by the area's electricity generating power plants. For petroleum, the inferred seasonality of CO2 emissions from burning petroleum is delayed by several months relative to usage indicated by statewide gasoline taxes. Using the high-resolution Hestia-LA data product to compare emissions from parts of the basin sampled by winds at different times of year, we find that variations in observed fossil fuel CO2 reflect seasonal variations in wind direction. The seasonality of the local CO2 excess from fossil fuel combustion along the coast, on Palos Verdes peninsula, is higher in fall and winter than spring and summer, almost completely out of phase with that from Pasadena, also because of the annual variations of winds in the region. Variations in fossil fuel CO2 signals are consistent with sampling the bottom-up Hestia-LA fossil CO2 emissions product for sub-city source regions in the LA megacity domain
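
    For background, attribution of the fossil fuel component of observed CO2 in studies of this kind generally rests on a Delta-14C mass balance. A commonly used simplified form (not necessarily the exact formulation of this paper, and omitting small correction terms such as nuclear and biospheric disequilibrium fluxes) is:

        % Simplified 14C mass balance for the fossil-fuel CO2 contribution (general form).
        % C denotes a CO2 mole fraction, Delta a Delta-14C signature; fossil carbon is 14C-free.
        \[
          C_{\mathrm{obs}} = C_{\mathrm{bg}} + C_{\mathrm{ff}} + C_{\mathrm{other}},
          \qquad
          C_{\mathrm{ff}} \approx
          \frac{C_{\mathrm{obs}}\left(\Delta_{\mathrm{bg}} - \Delta_{\mathrm{obs}}\right)}
               {\Delta_{\mathrm{bg}} - \Delta_{\mathrm{ff}}},
          \qquad
          \Delta_{\mathrm{ff}} = -1000~\text{per mil}.
        \]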

  14. Toward consistency between trends in bottom-up CO2 emissions and top-down atmospheric measurements in the Los Angeles megacity

    Science.gov (United States)

    Newman, Sally; Xu, Xiaomei; Gurney, Kevin R.; Kuang Hsu, Ying; Li, King Fai; Jiang, Xun; Keeling, Ralph; Feng, Sha; O'Keefe, Darragh; Patarasuk, Risa; Weng Wong, Kam; Rao, Preeti; Fischer, Marc L.; Yung, Yuk L.

    2016-03-01

    Large urban emissions of greenhouse gases result in large atmospheric enhancements relative to background that are easily measured. Using CO2 mole fractions and Δ14C and δ13C values of CO2 in the Los Angeles megacity observed in inland Pasadena (2006-2013) and coastal Palos Verdes peninsula (autumn 2009-2013), we have determined time series for CO2 contributions from fossil fuel combustion (Cff) for both sites and broken those down into contributions from petroleum and/or gasoline and natural gas burning for Pasadena. We find a 10 % reduction in Pasadena Cff during the Great Recession of 2008-2010, which is consistent with the bottom-up inventory determined by the California Air Resources Board. The isotopic variations and total atmospheric CO2 from our observations are used to infer seasonality of natural gas and petroleum combustion. The trend of CO2 contributions to the atmosphere from natural gas combustion is out of phase with the seasonal cycle of total natural gas combustion seasonal patterns in bottom-up inventories but is consistent with the seasonality of natural gas usage by the area's electricity generating power plants. For petroleum, the inferred seasonality of CO2 contributions from burning petroleum is delayed by several months relative to usage indicated by statewide gasoline taxes. Using the high-resolution Hestia-LA data product to compare Cff from parts of the basin sampled by winds at different times of year, we find that variations in observed fossil fuel CO2 reflect seasonal variations in wind direction. The seasonality of the local CO2 excess from fossil fuel combustion along the coast, on Palos Verdes peninsula, is higher in autumn and winter than spring and summer, almost completely out of phase with that from Pasadena, also because of the annual variations of winds in the region. Variations in fossil fuel CO2 signals are consistent with sampling the bottom-up Hestia-LA fossil CO2 emissions product for sub-city source regions in the LA

  15. Relative Influence of Top-Down and Bottom-Up Controls on Mixed Severity Burn Patterns in Yosemite National Park, California, USA

    Science.gov (United States)

    Kane, V. R.; Povak, N.; Brooks, M.; Collins, B.; Smith, D.; Churchill, D.

    2015-12-01

    In western North America, recent and projected increases in the frequency and severity of large wildfires have elevated the need to understand the key drivers of fire regimes across landscapes so that managers can predict where fires will have the greatest ecological impact, and anticipate changes under future climate change. Yosemite National Park offers a unique opportunity to study potential biophysical controls on fire severity patterns - fire management in this area has allowed many fires to burn since the 1970s, re-establishing a mixed severity fire regime. Previous studies within the park showed a high level of control from a variety of bottom-up (e.g., fire history, topography) and top-down (e.g., climate) variables on fire severity within a portion of the current study area, and found some evidence that controls may break down for the largest fires. In the current study, we sought to identify (1) controls on fire severity across all fires that burned within Yosemite (1984-2013), (2) differences in controls across fire sizes, (3) the contributions of topographic, climatic, and fire history variables to total variance explained, and (4) the influence of spatial autocorrelation on model results. Our study includes 147 fires that burned over 78,500 ha within Yosemite. Modeling results suggested that fire size and shape, topography, and localized climate variables explained fire severity patterns. Fires responded to inter-annual climate variability (top-down) plus local variation in water balance, past fire history, and local topographic variability (bottom-up). Climate-only models led to the highest level of pure variance explained, followed by fire history and topography models. Climate variables had distinctly non-linear relationships with fire severity, and key drivers were related to winter conditions. Fire severity was positively correlated with fire size, and severity increased towards fire interiors. Steeper and more complex topographies were associated

  16. HCFC-142b emissions in China: An inventory for 2000 to 2050 based on bottom-up and top-down methods

    Science.gov (United States)

    Han, Jiarui; Li, Li; Su, Shenshen; Hu, Jianxin; Wu, Jing; Wu, Yusheng; Fang, Xuekun

    2014-05-01

    1-Chloro-1,1-difluoroethane (HCFC-142b) is both an ozone-depleting substance included in the Montreal Protocol on Substances that Deplete the Ozone Layer (Montreal Protocol) and a potent greenhouse gas with a high global warming potential. As one of the major HCFC-142b consuming and producing countries in the world, China's control actions will contribute both to mitigating climate change and to protecting the ozone layer. Estimating China's HCFC-142b emissions is a crucial step for understanding its emission status, drawing up a phase-out plan and evaluating mitigation effects. Both bottom-up and top-down methods were adopted in this research to estimate HCFC-142b emissions from China, and the results obtained with the different methods were compared to test their effectiveness and validate the inventory's reliability. First, a national bottom-up emission inventory of HCFC-142b for China during 2000-2012 was established based on the 2006 IPCC Guidelines for National Greenhouse Gas Inventories and the Montreal Protocol, showing that, in contrast to the downward trend revealed by existing results, HCFC-142b emissions kept increasing from 0.1 kt/yr in 2000 to a peak of 14.4 kt/yr in 2012. Meanwhile, a top-down emission estimate was also developed using the interspecies correlation method. By correlating atmospheric mixing ratio data of HCFC-142b and the reference substance HCFC-22 sampled in four representative cities (Beijing, Hangzhou, Lanzhou and Guangzhou, for northern, eastern, western and southern China, respectively), China's HCFC-142b emission in 2012 was calculated to be 16.24 (13.90-18.58) kt, equivalent to 1.06 kt ODP and 37 Tg CO2-eq, accounting for 9.78% (ODP) of total HCFC emissions in China or 30.5% of global HCFC-142b emissions. This result was 12.7% higher than the bottom-up inventory; possible explanations are discussed. The consistency of the two results lends credibility to the effectiveness of the methods and the reliability of the results. Finally, future HCFC-142b emissions were projected to 2050
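
    The interspecies (tracer) correlation method mentioned above scales a reference emission by the slope of the co-measured enhancements and the molar mass ratio. A minimal sketch with placeholder enhancements and a placeholder HCFC-22 emission (not values from the paper) is:

        # Tracer-ratio (interspecies correlation) sketch with placeholder numbers.
        # E_target = E_reference * slope(target vs. reference enhancements) * (M_target / M_reference)
        import numpy as np

        def tracer_ratio_emission(d_target, d_reference, e_reference_kt, m_target, m_reference):
            slope = np.polyfit(d_reference, d_target, 1)[0]      # mole-fraction enhancement ratio
            return e_reference_kt * slope * (m_target / m_reference)

        d_ref = np.array([10.0, 25.0, 40.0, 60.0])               # HCFC-22 enhancements (ppt), placeholder
        d_tgt = 0.12 * d_ref + np.array([0.1, -0.2, 0.15, 0.0])  # HCFC-142b enhancements (ppt), placeholder
        estimate = tracer_ratio_emission(d_tgt, d_ref, e_reference_kt=100.0,   # placeholder HCFC-22 emission
                                         m_target=100.5, m_reference=86.5)     # molar masses, g/mol
        print(f"top-down HCFC-142b estimate: {estimate:.1f} kt/yr")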

  17. Object recognition with hierarchical discriminant saliency networks.

    Science.gov (United States)

    Han, Sunhyoung; Vasconcelos, Nuno

    2014-01-01

    The benefits of integrating attention and object recognition are investigated. While attention is frequently modeled as a pre-processor for recognition, we investigate the hypothesis that attention is an intrinsic component of recognition and vice-versa. This hypothesis is tested with a recognition model, the hierarchical discriminant saliency network (HDSN), whose layers are top-down saliency detectors, tuned for a visual class according to the principles of discriminant saliency. As a model of neural computation, the HDSN has two possible implementations. In a biologically plausible implementation, all layers comply with the standard neurophysiological model of visual cortex, with sub-layers of simple and complex units that implement a combination of filtering, divisive normalization, pooling, and non-linearities. In a convolutional neural network implementation, all layers are convolutional and implement a combination of filtering, rectification, and pooling. The rectification is performed with a parametric extension of the now popular rectified linear units (ReLUs), whose parameters can be tuned for the detection of target object classes. This enables a number of functional enhancements over neural network models that lack a connection to saliency, including optimal feature denoising mechanisms for recognition, modulation of saliency responses by the discriminant power of the underlying features, and the ability to detect both feature presence and absence. In either implementation, each layer has a precise statistical interpretation, and all parameters are tuned by statistical learning. Each saliency detection layer learns more discriminant saliency templates than its predecessors and higher layers have larger pooling fields. This enables the HDSN to simultaneously achieve high selectivity to target object classes and invariance. The performance of the network in saliency and object recognition tasks is compared to those of models from the biological and

  18. A Local Texture-Based Superpixel Feature Coding for Saliency Detection Combined with Global Saliency

    Directory of Open Access Journals (Sweden)

    Bingfei Nan

    2015-12-01

    Full Text Available Because saliency can be used as the prior knowledge of image content, saliency detection has been an active research area in image segmentation, object detection, image semantic understanding and other relevant image-based applications. In the case of saliency detection from cluster scenes, the salient object/region detected needs to not only be distinguished clearly from the background, but, preferably, to also be informative in terms of complete contour and local texture details to facilitate the successive processing. In this paper, a Local Texture-based Region Sparse Histogram (LTRSH) model is proposed for saliency detection from cluster scenes. This model uses a combination of local texture patterns and color distribution as well as contour information to encode the superpixels to characterize the local features of the image for region contrast computing. Combining the region contrast as computed with the global saliency probability, a full-resolution salient map, in which the salient object/region detected adheres more closely to its inherent feature, is obtained on the basis of the corresponding high-level saliency spatial distribution as well as on the pixel-level saliency enhancement. Quantitative comparisons with five state-of-the-art saliency detection methods on benchmark datasets are carried out, and the comparative results show that the method we propose improves the detection performance in terms of corresponding measurements.
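
    The combination step described above, fusing a per-superpixel local contrast score with a global saliency probability and back-projecting to pixel resolution, can be sketched as follows; the weights, scores and label map are illustrative only, not the LTRSH implementation.

        # Combine per-superpixel local contrast with a global saliency probability and
        # back-project to a full-resolution map. Scores, weight and labels are illustrative.
        import numpy as np

        local_contrast = np.array([0.2, 0.9, 0.4])   # region-contrast score per superpixel
        global_prob    = np.array([0.3, 0.8, 0.1])   # global saliency probability per superpixel
        labels = np.array([[0, 0, 1],
                           [0, 1, 1],
                           [2, 2, 1]])               # superpixel label of each pixel

        alpha = 0.6                                  # relative weight of the local term (assumed)
        superpixel_saliency = alpha * local_contrast + (1 - alpha) * global_prob
        saliency_map = superpixel_saliency[labels]   # full-resolution saliency map
        print(saliency_map)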

  19. Aposematism: balancing salience and camouflage.

    Science.gov (United States)

    Barnett, James B; Scott-Samuel, Nicholas E; Cuthill, Innes C

    2016-08-01

    Aposematic signals are often characterized by high conspicuousness. Larger and brighter signals reinforce avoidance learning, distinguish defended from palatable prey and are more easily memorized by predators. Conspicuous signalling, however, has costs: encounter rates with naive, specialized or nutritionally stressed predators are likely to increase. It has been suggested that intermediate levels of aposematic conspicuousness can evolve to balance deterrence and detectability, especially for moderately defended species. The effectiveness of such signals, however, has not yet been experimentally tested under field conditions. We used dough caterpillar-like baits to test whether reduced levels of aposematic conspicuousness can have survival benefits when predated by wild birds in natural conditions. Our results suggest that, when controlling for the number and intensity of internal contrast boundaries (stripes), a reduced-conspicuousness aposematic pattern can have a survival advantage over more conspicuous signals, as well as cryptic colours. Furthermore, we find a survival benefit from the addition of internal contrast for both high and low levels of conspicuousness. This adds ecological validity to evolutionary models of aposematic saliency and the evolution of honest signalling. PMID:27484645

  20. Chitosan microspheres with an extracellular matrix-mimicking nanofibrous structure as cell-carrier building blocks for bottom-up cartilage tissue engineering

    Science.gov (United States)

    Zhou, Yong; Gao, Huai-Ling; Shen, Li-Li; Pan, Zhao; Mao, Li-Bo; Wu, Tao; He, Jia-Cai; Zou, Duo-Hong; Zhang, Zhi-Yuan; Yu, Shu-Hong

    2015-12-01

    Scaffolds for tissue engineering (TE) which closely mimic the physicochemical properties of the natural extracellular matrix (ECM) have been proven to advantageously favor cell attachment, proliferation, migration and new tissue formation. Recently, as a valuable alternative, a bottom-up TE approach utilizing cell-loaded micrometer-scale modular components as building blocks to reconstruct a new tissue in vitro or in vivo has been proved to demonstrate a number of desirable advantages compared with the traditional bulk scaffold based top-down TE approach. Nevertheless, micro-components with an ECM-mimicking nanofibrous structure are still very scarce and highly desirable. Chitosan (CS), an accessible natural polymer, has demonstrated appealing intrinsic properties and promising application potential for TE, especially the cartilage tissue regeneration. According to this background, we report here the fabrication of chitosan microspheres with an ECM-mimicking nanofibrous structure for the first time based on a physical gelation process. By combining this physical fabrication procedure with microfluidic technology, uniform CS microspheres (CMS) with controlled nanofibrous microstructure and tunable sizes can be facilely obtained. Especially, no potentially toxic or denaturizing chemical crosslinking agent was introduced into the products. Notably, in vitro chondrocyte culture tests revealed that enhanced cell attachment and proliferation were realized, and a macroscopic 3D geometrically shaped cartilage-like composite can be easily constructed with the nanofibrous CMS (NCMS) and chondrocytes, which demonstrate significant application potential of NCMS as the bottom-up cell-carrier components for cartilage tissue engineering.

  1. Bottom-up processing of thermoelectric nanocomposites from colloidal nanocrystal building blocks: the case of Ag2Te-PbTe

    Energy Technology Data Exchange (ETDEWEB)

    Cadavid, Doris [Catalonia Institute for Energy Research, IREC (Spain); Ibanez, Maria [Universitat de Barcelona, Departament d'Electronica (Spain); Gorsse, Stephane [Universite de Bordeaux, ICMCB, CNRS (France); Lopez, Antonio M. [Universitat Politecnica de Catalunya, Departament d'Enginyeria Electronica (Spain); Cirera, Albert [Universitat de Barcelona, Departament d'Electronica (Spain); Morante, Joan Ramon; Cabot, Andreu, E-mail: acabot@irec.cat [Catalonia Institute for Energy Research, IREC (Spain)

    2012-12-15

    Nanocomposites are highly promising materials to enhance the efficiency of current thermoelectric devices. A straightforward and at the same time highly versatile and controllable approach to produce nanocomposites is the assembly of solution-processed nanocrystal building blocks. The convenience of this bottom-up approach to produce nanocomposites with homogeneous phase distributions and adjustable composition is demonstrated here by blending Ag2Te and PbTe colloidal nanocrystals to form Ag2Te-PbTe bulk nanocomposites. The thermoelectric properties of these nanocomposites are analyzed in the temperature range from 300 to 700 K. The evolution of their electrical conductivity and Seebeck coefficient is discussed in terms of the blend composition and the characteristics of the constituent materials.

  2. Integration scheme of nanoscale resistive switching memory using bottom-up processes at room temperature for high-density memory applications

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-01-01

    A facile and versatile scheme is demonstrated to fabricate nanoscale resistive switching memory devices that exhibit reliable bipolar switching behavior. A solution process is used to synthesize the copper oxide layer into 250-nm via-holes that had been patterned in Si wafers. Direct bottom-up filling of copper oxide can facilitate fabrication of nanoscale memory devices without using vacuum deposition and etching processes. In addition, all materials and processes are CMOS compatible, and especially, the devices can be fabricated at room temperature. Nanoscale memory devices synthesized on wafers having 250-nm via-holes showed reproducible resistive switching programmable memory characteristics with reasonable endurance and data retention properties. This integration strategy provides a solution to overcome the scaling limit of current memory device fabrication methods. PMID:27364856

  3. The synthesis of bottom-up and top-down approaches to climate policy modeling: Electric power technology detail in a social accounting framework

    International Nuclear Information System (INIS)

    "Hybrid" climate policy simulations have sought to bridge the gap between "bottom-up" engineering and "top-down" macroeconomic models by integrating the former's energy technology detail into the latter's macroeconomic framework. Construction of hybrid models is complicated by the need to numerically calibrate them to multiple, incommensurate sources of economic and engineering data. I develop a solution to this problem following Howitt's [Howitt, R.E., 1995. Positive Mathematical Programming, American Journal of Agricultural Economics 77: 329-342] positive mathematical programming approach. Using data for the U.S., I illustrate how the inputs to the electricity sector in a social accounting matrix may be allocated among discrete types of generation so as to be consistent with both technologies' input shares from engineering cost estimates, and the zero profit and market clearance conditions of the sector's macroeconomic production structure. (author)
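
    The accounting core of this calibration problem can be illustrated with a toy example (hypothetical input shares and sector totals, and plain least squares rather than Howitt's positive mathematical programming procedure): choose technology output levels so that engineering input shares reproduce the electricity sector's input column in the social accounting matrix.

        # Toy calibration target: find generation levels x so that engineering input shares
        # reproduce the electricity sector's input totals in the social accounting matrix.
        # Hypothetical numbers; this is the accounting core only, not the full PMP method.
        import numpy as np

        # rows: inputs (fuel, capital, labour); columns: technologies (coal, gas, nuclear)
        eng_input_shares = np.array([[0.55, 0.60, 0.10],
                                     [0.30, 0.25, 0.70],
                                     [0.15, 0.15, 0.20]])   # each column sums to 1 (zero profit)
        sam_input_totals = np.array([48.0, 35.0, 17.0])     # sector input column, hypothetical units

        x, *_ = np.linalg.lstsq(eng_input_shares, sam_input_totals, rcond=None)
        print("technology output levels:", np.round(x, 2))
        print("implied input totals:", np.round(eng_input_shares @ x, 2))  # matches the SAM column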

  4. A neural computational model of incentive salience.

    Directory of Open Access Journals (Sweden)

    Jun Zhang

    2009-07-01

    Full Text Available Incentive salience is a motivational property with 'magnet-like' qualities. When attributed to reward-predicting stimuli (cues), incentive salience triggers a pulse of 'wanting' and an individual is pulled toward the cues and reward. A key computational question is how incentive salience is generated during a cue re-encounter, which combines both learning and the state of limbic brain mechanisms. Learning processes, such as temporal-difference models, provide one way for stimuli to acquire cached predictive values of rewards. However, empirical data show that subsequent incentive values are also modulated on the fly by dynamic fluctuation in physiological states, altering cached values in ways requiring additional motivation mechanisms. Dynamic modulation of incentive salience for a Pavlovian conditioned stimulus (CS) or cue occurs during certain states, without necessarily requiring (re)learning about the cue. In some cases, dynamic modulation of cue value occurs during states that are quite novel, never having been experienced before, and even prior to experience of the associated unconditioned reward in the new state. Such cases can include novel drug-induced mesolimbic activation and addictive incentive-sensitization, as well as natural appetite states such as salt appetite. Dynamic enhancement specifically raises the incentive salience of an appropriate CS, without necessarily changing that of other CSs. Here we suggest a new computational model that modulates incentive salience by integrating changing physiological states with prior learning. We support the model with behavioral and neurobiological data from empirical tests that demonstrate dynamic elevations in cue-triggered motivation (involving natural salt appetite, and drug-induced intoxication and sensitization). Our data call for a dynamic model of incentive salience, such as presented here. Computational models can adequately capture fluctuations in cue-triggered 'wanting' only by
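
    One minimal way to express the dynamic modulation described above, purely as an illustration and not the authors' published equations, is to apply a state-dependent gain to the cached, learned value of a cue at the moment of re-encounter:

        # Illustration only (not the authors' published equations): a physiological-state
        # gain kappa modulates the cached, learned value of a cue at re-encounter, so
        # cue-triggered 'wanting' can change on the fly without new learning.
        def incentive_salience(cached_value, kappa):
            return kappa * cached_value

        cached_value = 0.4   # value learned under a normal physiological state
        for state, kappa in [("normal", 1.0), ("salt appetite", 4.0), ("sensitized", 2.5)]:
            print(f"{state}: wanting = {incentive_salience(cached_value, kappa):.2f}")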

  5. "Disorganized in time": impact of bottom-up and top-down negative emotion generation on memory formation among healthy and traumatized adolescents.

    Science.gov (United States)

    Guillery-Girard, Bérengère; Clochon, Patrice; Giffard, Bénédicte; Viard, Armelle; Egler, Pierre-Jean; Baleyte, Jean-Marc; Eustache, Francis; Dayan, Jacques

    2013-09-01

    "Travelling in time," a central feature of episodic memory is severely affected among individuals with Post Traumatic Stress Disorder (PTSD) with two opposite effects: vivid traumatic memories are unorganized in temporality (bottom-up processes), non-traumatic personal memories tend to lack spatio-temporal details and false recognitions occur more frequently that in the general population (top-down processes). To test the effect of these two types of processes (i.e. bottom-up and top-down) on emotional memory, we conducted two studies in healthy and traumatized adolescents, a period of life in which vulnerability to emotion is particularly high. Using negative and neutral images selected from the international affective picture system (IAPS), stimuli were divided into perceptual images (emotion generated by perceptual details) and conceptual images (emotion generated by the general meaning of the material). Both categories of stimuli were then used, along with neutral pictures, in a memory task with two phases (encoding and recognition). In both populations, we reported a differential effect of the emotional material on encoding and recognition. Negative perceptual scenes induced an attentional capture effect during encoding and enhanced the recollective distinctiveness. Conversely, the encoding of conceptual scenes was similar to neutral ones, but the conceptual relatedness induced false memories at retrieval. However, among individuals with PTSD, two subgroups of patients were identified. The first subgroup processed the scenes faster than controls, except for the perceptual scenes, and obtained similar performances to controls in the recognition task. The second subgroup group desmonstrated an attentional deficit in the encoding task with no benefit from the distinctiveness associated with negative perceptual scenes on memory performances. These findings provide a new perspective on how negative emotional information may have opposite influences on memory in

  6. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories.

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G; Quinn, Paul C; Hu, Chao S; Qian, Miao; Fu, Genyue; Lee, Kang

    2015-02-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese, Caucasian, and racially ambiguous faces. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. PMID:25497461

  7. A bottom-up model to estimate the energy efficiency improvement and CO2 emission reduction potentials in the Chinese iron and steel industry

    International Nuclear Information System (INIS)

    China's annual crude steel production in 2010 was 638.7 Mt, accounting for nearly half of the world's annual crude steel production in the same year. Around 461 TWh of electricity and 14,872 PJ of fuel were consumed to produce this quantity of steel. We identified and analyzed 23 energy efficiency technologies and measures applicable to the processes in China's iron and steel industry. Using a bottom-up electricity CSC (Conservation Supply Curve) model, the cumulative cost-effective electricity savings potential for the Chinese iron and steel industry for 2010–2030 is estimated to be 251 TWh, and the total technical electricity saving potential is 416 TWh. The CO2 emissions reduction associated with cost-effective electricity savings is 139 Mt CO2 and the CO2 emission reduction associated with technical electricity saving potential is 237 Mt CO2. The FCSC (Fuel CSC) model for the Chinese iron and steel industry shows cumulative cost-effective fuel savings potential of 11,999 PJ, and the total technical fuel saving potential is 12,139 PJ. The CO2 emissions reduction associated with cost-effective and technical fuel savings is 1191 Mt CO2 and 1205 Mt CO2, respectively. In addition, a sensitivity analysis with respect to the discount rate used is conducted. - Highlights: ► Estimation of energy saving potential in the entire Chinese steel industry. ► Development of the bottom-up technology-rich Conservation Supply Curve models. ► Discussion of different approaches for developing Conservation Supply Curves. ► Primary energy saving over 20 years equal to 72% of primary energy of Latin America
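
    The conservation-supply-curve logic behind these estimates can be sketched with made-up measures: compute a cost of conserved energy for each measure, rank measures by that cost, and count as cost-effective the cumulative savings of the measures whose cost falls below the energy price.

        # Conservation supply curve sketch with made-up measures.
        # Cost of conserved energy (CCE) = annualized capital cost / annual energy savings,
        # with capital recovery factor q = d / (1 - (1 + d)**-n).
        def cce(capital_cost, annual_savings_kwh, d=0.10, lifetime=15):
            q = d / (1 - (1 + d) ** -lifetime)
            return capital_cost * q / annual_savings_kwh        # $/kWh saved

        measures = [                       # (name, capital $, savings kWh/yr) - hypothetical
            ("waste heat recovery", 9.0e6, 4.0e7),
            ("variable speed drives", 2.0e6, 1.2e7),
            ("advanced process control", 5.0e6, 8.0e6),
        ]
        energy_price = 0.06                # $/kWh, hypothetical

        cumulative = cost_effective = 0.0
        for cost, name, savings in sorted((cce(c, s), n, s) for n, c, s in measures):
            cumulative += savings
            if cost <= energy_price:
                cost_effective += savings
            print(f"{name}: CCE = {cost:.3f} $/kWh, cumulative savings = {cumulative:.2e} kWh/yr")
        print(f"cost-effective potential: {cost_effective:.2e} kWh/yr")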

  8. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Directory of Open Access Journals (Sweden)

    W. J. F. Acton

    2015-10-01

    Full Text Available This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton transfer reaction-mass spectrometer (PTR-MS) and a proton transfer reaction-time of flight-mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (PTR-MS) and eddy covariance (PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean day-time flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28-day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight overestimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the MEGAN isoprene emissions algorithms (Guenther et al., 2006). A detailed tree species distribution map for the site enabled the leaf-level emissions of isoprene and monoterpenes recorded using GC-MS to be scaled up to produce a "bottom-up" canopy-scale flux. This was compared with the "top-down" canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.

  9. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Directory of Open Access Journals (Sweden)

    Sebastian McBride

    Full Text Available Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1 conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2 implementation and validation of the model into robotic hardware (as a representative of an active vision system. Seven computational requirements were identified: 1 transformation of retinotopic to egocentric mappings, 2 spatial memory for the purposes of medium-term inhibition of return, 3 synchronization of 'where' and 'what' information from the two visual streams, 4 convergence of top-down and bottom-up information to a centralized point of information processing, 5 a threshold function to elicit saccade action, 6 a function to represent task relevance as a ratio of excitation and inhibition, and 7 derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.

  10. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Science.gov (United States)

    McBride, Sebastian; Huelse, Martin; Lee, Mark

    2013-01-01

    Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.

  11. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    Science.gov (United States)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat

  12. Object recognition with hierarchical discriminant saliency networks

    Directory of Open Access Journals (Sweden)

    Sunhyoung eHan

    2014-09-01

    Full Text Available The benefits of integrating attention and object recognition are investigated. While attention is frequently modeled as a pre-processor for recognition, we investigate the hypothesis that attention is an intrinsic component of recognition and vice-versa. This hypothesis is tested with a recognition model, the hierarchical discriminant saliency network (HDSN), whose layers are top-down saliency detectors, tuned for a visual class according to the principles of discriminant saliency. The HDSN has two possible implementations. In a biologically plausible implementation, all layers comply with the standard neurophysiological model of visual cortex, with sub-layers of simple and complex units that implement a combination of filtering, divisive normalization, pooling, and non-linearities. In a neural network implementation, all layers are convolutional and implement a combination of filtering, rectification, and pooling. The rectification is performed with a parametric extension of the now popular rectified linear units (ReLUs), whose parameters can be tuned for the detection of target object classes. This enables a number of functional enhancements over neural network models that lack a connection to saliency, including optimal feature denoising mechanisms for recognition, modulation of saliency responses by the discriminant power of the underlying features, and the ability to detect both feature presence and absence. In either implementation, each layer has a precise statistical interpretation, and all parameters are tuned by statistical learning. Each saliency detection layer learns more discriminant saliency templates than its predecessors and higher layers have larger pooling fields. This enables the HDSN to simultaneously achieve high selectivity to target object classes and invariance. The resulting performance demonstrates benefits for all the functional enhancements of the HDSN.

  13. Comparing bottom-up and top-down approaches at the landscape scale, including agricultural activities and water systems, at the Roskilde Fjord, Denmark

    Science.gov (United States)

    Lequy, Emeline; Ibrom, Andreas; Ambus, Per; Massad, Raia-Silvia; Markager, Stiig; Asmala, Eero; Garnier, Josette; Gabrielle, Benoit; Loubet, Benjamin

    2015-04-01

    The greenhouse gas nitrous oxide (N2O) mainly originates in direct emissions from agricultural soils due to microbial reactions stimulated by the use of nitrogen fertilisers. Indirect N2O emissions from water systems due to nitrogen leaching and deposition from crop fields range between 26 and 37% of direct agricultural emissions, indicating their potential importance and uncertainty (Reay et al. 2012). The study presented here couples a top-down approach with eddy covariance (EC) and a bottom-up approach using different models and measurements. A QCL sensor at 96-m height on a tall tower measures the emissions of N2O from 1100 ha of crop fields and from the south part of the Roskilde fjord, in a 5-km radius area around the tall tower at Roskilde, Denmark. The bottom-up approach includes ecosystem modelling with CERES-EGC for the crops and PaSIM for the grasslands, and the N2O fluxes from the Roskilde fjord are derived from N2O sea water concentration measurements. EC measurements are now available from July to December 2014, and indicate a magnitude of the emissions from the crop fields around 0.2 mg N2O-N m-2 day-1 (range -9 to 5) which is consistent with the CERES-EGC simulations and calculations using IPCC emission factors. N2O fluxes from the Roskilde fjord in May and July indicated quite constant N2O concentrations around 0.1 µg N L-1 despite variations of nitrate and ammonium in the fjord. The calculated fluxes from these concentrations and the tall tower measurements consistently ranged between -7 and 6 mg N2O-N m-2 day-1. The study site also contains a waste water treatment plant, whose direct emissions will be measured in early 2015 using a dynamic plume tracer dispersion method (Mønster et al. 2014). A refined source attribution methodology together with more measurements and simulations of the N2O fluxes from the different land uses in this study site will provide a clearer view of the dynamics and budgets of N2O at the regional scale. The

  14. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Science.gov (United States)

    Acton, W. Joe F.; Schallhart, Simon; Langford, Ben; Valach, Amy; Rantala, Pekka; Fares, Silvano; Carriero, Giulia; Tillmann, Ralf; Tomlinson, Sam J.; Dragosits, Ulrike; Gianelle, Damiano; Hewitt, C. Nicholas; Nemitz, Eiko

    2016-06-01

    This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton-transfer-reaction mass spectrometer (PTR-MS) and a proton-transfer-reaction time-of-flight mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (using PTR-MS) and eddy covariance (using PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean daytime flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28-day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight overestimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the Model of Emissions of Gases and Aerosols from Nature (MEGAN) isoprene emission algorithms (Guenther et al., 2006). A detailed tree-species distribution map for the site enabled the leaf-level emission of isoprene and monoterpenes recorded using gas-chromatography mass spectrometry (GC-MS) to be scaled up to produce a bottom-up canopy-scale flux. This was compared with the top-down canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant-species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.
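
    The bottom-up scaling described here amounts to weighting species-specific leaf-level emission factors by their foliar abundance in the flux footprint and applying an environmental activity factor. The sketch below uses placeholder species data and a crude exponential temperature response instead of the full MEGAN gamma algorithms.

        # Simplified bottom-up canopy flux: species emission factors weighted by foliar
        # density and footprint share, times a crude temperature activity factor.
        # Placeholder numbers; the full MEGAN algorithms are not reproduced here.
        import math

        species = {                         # emission factor (ug g-1 h-1), foliar density (g m-2)
            "Quercus robur":    (60.0, 110.0),
            "Carpinus betulus": (0.1, 160.0),
        }
        footprint_share = {"Quercus robur": 0.45, "Carpinus betulus": 0.55}

        def activity_factor(temp_c, t_ref=30.0, beta=0.09):
            """Crude exponential temperature response standing in for MEGAN's gamma factors."""
            return math.exp(beta * (temp_c - t_ref))

        def canopy_flux_mg(temp_c):
            gamma = activity_factor(temp_c)
            flux_ug = sum(ef * fd * footprint_share[sp] * gamma
                          for sp, (ef, fd) in species.items())
            return flux_ug / 1000.0         # mg m-2 h-1

        print(f"bottom-up flux at 28 C: {canopy_flux_mg(28.0):.2f} mg m-2 h-1")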

  15. Linking top-down and bottom-up approaches for assessing the vulnerability of a 100 % renewable energy system in Northern-Italy

    Science.gov (United States)

    Borga, Marco; Francois, Baptiste; Hingray, Benoit; Zoccatelli, Davide; Creutin, Jean-Dominique; Brown, Casey

    2016-04-01

    Due to their variable and uncontrollable features, integration of Variable Renewable Energies (e.g. solar-power, wind-power and hydropower, denoted as VRE) into the electricity network implies higher production variability and increased risk of not meeting demand. Two approaches are commonly used for assessing this risk, and especially its evolution in a global change context (i.e. climate and societal changes): top-down and bottom-up approaches. The general idea of a top-down approach is to drive analysis of global change, or of some key aspects of global change (e.g., the effects of the COP 21, of the deployment of Smart Grids, or of climate change), on the systems of interest with chains of loosely linked simulation models within a predictive framework. The bottom-up approach aims to improve understanding of the dependencies between the vulnerability of regional systems and large-scale phenomena from knowledge gained through detailed exploration of the response to change of the system of interest, which may reveal vulnerability thresholds, tipping points as well as potential opportunities. Brown et al. (2012) defined an analytical framework to merge these two approaches. The objective is to build a set of Climate Response Functions (CRFs) putting in perspective 1) indicators of desired states ("success") and undesired states ("failure") of a system, as defined in collaboration with stakeholders, 2) exhaustive exploration of the effects of uncertain forcings and imperfect system understanding on the response of the system itself to a plausible set of possible changes, implemented with a multi-dimensionally consistent "stress test" algorithm, and 3) a set of "ex post" hydroclimatic and socioeconomic scenarios that provide insight into the differential effectiveness of alternative policies and serve as entry points for the provision of climate information to inform policy evaluation and choice. We adapted this approach for analyzing a 100 % renewable energy system within a region

  16. A two-step combination of top-down and bottom-up fire emission estimates at regional and global scales: strengths and main uncertainties

    Science.gov (United States)

    Sofiev, Mikhail; Soares, Joana; Kouznetsov, Rostislav; Vira, Julius; Prank, Marje

    2016-04-01

    Top-down emission estimation via inverse dispersion modelling is used for various problems where bottom-up approaches are difficult or highly uncertain. One such area is the estimation of emissions from wild-land fires. In combination with dispersion modelling, satellite and/or in-situ observations can, in principle, be used to efficiently constrain the emission values. This is the main strength of the approach: the a priori values of the emission factors (based on laboratory studies) are refined for real-life situations using the inverse-modelling technique. However, the approach also has major uncertainties, which are illustrated here with a few examples of the Integrated System for wild-land Fires (IS4FIRES). IS4FIRES generates the smoke emission and injection profile from MODIS and SEVIRI active-fire radiative energy observations. The emission calculation includes two steps: (i) initial top-down calibration of emission factors via inverse dispersion problem solution that is made once using a training dataset from the past, (ii) application of the obtained emission coefficients to individual-fire radiative energy observations, thus leading to bottom-up emission compilation. For such a procedure, the major classes of uncertainties include: (i) imperfect information on fires, (ii) simplifications in the fire description, (iii) inaccuracies in the smoke observations and modelling, (iv) inaccuracies of the inverse problem solution. Using examples of the fire seasons 2010 in Russia, 2012 in Eurasia, 2007 in Australia, etc., it is pointed out that the top-down system calibration performed for a limited number of comparatively moderate cases (often the best-observed ones) may lead to errors in application to extreme events. For instance, the total emission of 2010 Russian fires is likely to be over-estimated by up to 50% if the calibration is based on the season 2006 and the fire description is simplified. Longer calibration period and more sophisticated parameterization
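
    The two-step structure (top-down calibration of emission coefficients against a training set of observations, then bottom-up application to individual fire radiative energy detections) can be sketched as follows; the land-cover classes, FRP totals and observed smoke values are hypothetical.

        # Step 1: top-down calibration - solve for per-class emission coefficients alpha
        # so that alpha * FRP best matches the observed smoke over a training set.
        # Step 2: bottom-up application of alpha to new fire radiative power detections.
        import numpy as np

        frp_by_class = np.array([[120.0, 30.0],        # rows: training episodes,
                                 [ 80.0, 90.0],        # columns: FRP totals (forest, grass), MW
                                 [200.0, 10.0]])
        observed_smoke = np.array([14.0, 13.1, 21.2])  # e.g. column-integrated PM, arbitrary units
        alpha, *_ = np.linalg.lstsq(frp_by_class, observed_smoke, rcond=None)

        class_index = {"forest": 0, "grass": 1}
        new_fires = [("forest", 55.0), ("grass", 140.0)]           # hypothetical detections
        emission = sum(alpha[class_index[c]] * frp for c, frp in new_fires)
        print("calibrated coefficients:", np.round(alpha, 3), "-> emission estimate:", round(emission, 2))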

  17. The influence of multiple primes on bottom-up and top-down regulation during meaning retrieval: evidence for 2 distinct neural networks.

    Science.gov (United States)

    Whitney, Carin; Grossman, Murray; Kircher, Tilo T J

    2009-11-01

    Meaning retrieval of a word can proceed fast and effortlessly or can be characterized by a controlled search for candidate lexical items and a subsequent selection process. In the current study, we facilitated meaning retrieval by increasing the number of words that were related to the final target word in a triplet (e.g., lion-stripes-tiger). To induce higher search and selection demands, we presented ambiguous words as targets (i.e., homonyms like ball) in half of the trials. Hereby, the dominant (game), low-frequent (dance), or both meanings of the homonym were primed. Participants performed a relatedness judgment during functional magnetic resonance imaging. Activation in a bilateral network (angular gyrus, rostromedial prefrontal cortex) increased linearly with multiple related primes, whereas the posterior left inferior prefrontal cortex (pLIPC) showed the reverse activation pattern for unambiguous trials. When homonyms served as targets, pLIPC responded strongest when both meanings or low-frequent concepts were addressed. Additional anterior left inferior prefrontal cortex activation was observed for the latter trials only. The data support an interaction between 2 distinct cerebral networks that can be linked to automatic bottom-up support and top-down control during meaning retrieval. They further imply a functional specialization of the LIPC along an anterior-posterior dimension.

  18. Bottom-up coarse-grained models with predictive accuracy and transferability for both structural and thermodynamic properties of heptane-toluene mixtures

    Science.gov (United States)

    Dunn, Nicholas J. H.; Noid, W. G.

    2016-05-01

    This work investigates the promise of a "bottom-up" extended ensemble framework for developing coarse-grained (CG) models that provide predictive accuracy and transferability for describing both structural and thermodynamic properties. We employ a force-matching variational principle to determine system-independent, i.e., transferable, interaction potentials that optimally model the interactions in five distinct heptane-toluene mixtures. Similarly, we employ a self-consistent pressure-matching approach to determine a system-specific pressure correction for each mixture. The resulting CG potentials accurately reproduce the site-site radial distribution functions (RDFs), the volume fluctuations, and the pressure equations of state that are determined by all-atom (AA) models for the five mixtures. Furthermore, we demonstrate that these CG potentials provide similar accuracy for additional heptane-toluene mixtures that were not included in their parameterization. Surprisingly, the extended ensemble approach improves not only the transferability but also the accuracy of the calculated potentials. Additionally, we observe that the required pressure corrections strongly correlate with the intermolecular cohesion of the system-specific CG potentials. Moreover, this cohesion correlates with the relative "structure" within the corresponding mapped AA ensemble. Finally, the appendix demonstrates that the self-consistent pressure-matching approach corresponds to minimizing an appropriate relative entropy.
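
    As a rough illustration of the force-matching idea (not the authors' multiscale coarse-graining code), the sketch below fits the parameters of a CG pair force, expressed as a linear combination of basis functions of the pair distance, by least squares against forces mapped from an atomistic trajectory. The data and basis are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(1)

      # Invented "mapped atomistic" data: pair distances r and the corresponding
      # net forces f_AA projected on the pair axis, for many sampled configurations.
      r = rng.uniform(0.4, 1.2, 5000)                      # nm, invented range
      f_AA = 24.0 * (2.0 * 0.5**12 / r**13 - 0.5**6 / r**7) + rng.normal(0, 5.0, r.size)

      # CG pair force represented on a simple basis: piecewise-linear hat functions.
      grid = np.linspace(0.4, 1.2, 15)

      def hat_basis(r, grid):
          """Piecewise-linear 'hat' basis functions evaluated at distances r."""
          phi = np.zeros((r.size, grid.size))
          dr = grid[1] - grid[0]
          for k, gk in enumerate(grid):
              phi[:, k] = np.clip(1.0 - np.abs(r - gk) / dr, 0.0, None)
          return phi

      # Force matching: minimize || phi @ c - f_AA ||^2 over basis coefficients c.
      phi = hat_basis(r, grid)
      coeffs, *_ = np.linalg.lstsq(phi, f_AA, rcond=None)

      # The fitted CG force at arbitrary distances:
      r_test = np.linspace(0.45, 1.15, 8)
      f_CG = hat_basis(r_test, grid) @ coeffs
      print(np.round(f_CG, 2))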

  19. Evaluating vehicle re-entrained road dust and its potential to deposit to Lake Tahoe: a bottom-up inventory approach.

    Science.gov (United States)

    Zhu, Dongzi; Kuhns, Hampden D; Gillies, John A; Gertler, Alan W

    2014-01-01

    Identifying hotspot areas impacted by emissions of dust from roadways is an essential step for mitigation. This paper develops a detailed road dust PM₁₀ emission inventory using a bottom-up approach and evaluates the potential for the dust to deposit to Lake Tahoe, where it can affect water clarity. Previous studies have estimated quantities of atmospheric deposition of fine sediment particles ("FSP") to the lake. Here, dust emission factors, five years of meteorological data, a traffic demand model and GIS analysis were used to estimate the near-field atmospheric deposition of airborne particulate matter to the lake. Approximately 20 Mg year(-1) of PM₁₀ and 36 Mg year(-1) of Total Suspended Particulate (TSP) from roadway dust emissions are estimated to reach the lake. We estimate that the atmospheric dry deposition of particles to the lake attributable to vehicle travel on paved roads is approximately 0.6% of the Total Maximum Daily Load (TMDL) of FSP that the lake can receive and still meet water quality standards.

  20. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems, Using a Bottom-Up Approach and Installer Survey - Second Edition

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, B.; Ardani, K.; Feldman, D.; Citron, R.; Margolis, R.; Zuboy, J.

    2013-10-01

    This report presents results from the second U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs -- often referred to as 'business process' or 'soft' costs -- for U.S. residential and commercial photovoltaic (PV) systems. In service to DOE's SunShot Initiative, annual expenditure and labor-hour-productivity data are analyzed to benchmark 2012 soft costs related to (1) customer acquisition and system design and (2) permitting, inspection, and interconnection (PII). We also include an in-depth analysis of costs related to financing, overhead, and profit. Soft costs are both a major challenge and a major opportunity for reducing PV system prices and stimulating SunShot-level PV deployment in the United States. The data and analysis in this series of benchmarking reports are a step toward the more detailed understanding of PV soft costs required to track and accelerate these price reductions.

  1. Evolutionary Steps in the Emergence of Life Deduced from the Bottom-Up Approach and GADV Hypothesis (Top-Down Approach).

    Science.gov (United States)

    Ikehara, Kenji

    2016-01-26

    It is no doubt quite difficult to solve the riddle of the origin of life. So, firstly, I would like to point out the kinds of obstacles there are in solving this riddle and how we should tackle these difficult problems, reviewing the studies that have been conducted so far. After that, I will propose that the consecutive evolutionary steps in a timeline can be rationally deduced by using a common event as a juncture, which is obtained by two counter-directional approaches: one is the bottom-up approach through which many researchers have studied the origin of life, and the other is the top-down approach, through which I established the [GADV]-protein world hypothesis or GADV hypothesis on the origin of life starting from a study on the formation of entirely new genes in extant microorganisms. Last, I will describe the probable evolutionary process from the formation of Earth to the emergence of life, which was deduced by using a common event-the establishment of the first genetic code encoding [GADV]-amino acids-as a juncture for the results obtained from the two approaches.

  2. Employment impacts of EU biofuels policy. Combining bottom-up technology information and sectoral market simulations in an input-output framework

    International Nuclear Information System (INIS)

    This paper analyses the employment consequences of policies aimed to support biofuels in the European Union. The promotion of biofuel use has been advocated as a means to promote the sustainable use of natural resources and to reduce greenhouse gas emissions originating from transport activities on the one hand, and to reduce dependence on imported oil and thereby increase security of the European energy supply on the other hand. The employment impacts of increasing biofuels shares are calculated by taking into account a set of elements comprising the demand for capital goods required to produce biofuels, the additional demand for agricultural feedstock, higher fuel prices or reduced household budget in the case of price subsidisation, price effects ensuing from a hypothetical world oil price reduction linked to substitution in the EU market, and price impacts on agro-food commodities. The calculations refer to scenarios for the year 2020 targets as set out by the recent Renewable Energy Roadmap. Employment effects are assessed in an input-output framework taking into account bottom-up technology information to specify biofuels activities and linked to partial equilibrium models for the agricultural and energy sectors. The simulations suggest that biofuels targets on the order of 10-15% could be achieved without adverse net employment effects. (author)
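
    The input-output logic used to translate a biofuels demand shock into employment effects can be sketched with a toy Leontief model. The sectors, coefficients and demand changes below are invented; in the study the technical coefficients come from EU input-output tables augmented with bottom-up biofuel technology data and linked partial equilibrium models.

      import numpy as np

      # Toy 3-sector economy: agriculture, fuels, rest of economy (invented numbers).
      A = np.array([[0.10, 0.20, 0.02],   # technical coefficients: inputs per unit output
                    [0.05, 0.10, 0.05],
                    [0.15, 0.25, 0.20]])
      labour_per_output = np.array([8.0, 2.0, 5.0])   # jobs per million EUR of output

      # Demand shock from a biofuels target: extra feedstock and fuel demand,
      # partly offset by reduced household spending elsewhere (all invented).
      delta_demand = np.array([+120.0, +80.0, -60.0])  # million EUR

      # Leontief inverse gives the total (direct + indirect) output change.
      leontief_inverse = np.linalg.inv(np.eye(3) - A)
      delta_output = leontief_inverse @ delta_demand

      # Employment impact per sector and the net effect.
      delta_jobs = labour_per_output * delta_output
      print("output change per sector:", np.round(delta_output, 1))
      print("net employment change   :", round(delta_jobs.sum(), 1), "jobs")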

  3. Middle-Out Approaches to Reform of University Teaching and Learning: Champions striding between the top-down and bottom-up approaches

    Directory of Open Access Journals (Sweden)

    Rick Cummings

    2005-03-01

    In recent years, Australian universities have been driven by a diversity of external forces, including funding cuts, massification of higher education, and changing student demographics, to reform their relationship with students and improve teaching and learning, particularly for those studying off-campus or part-time. Many universities have responded to these forces either through formal strategic plans developed top-down by executive staff or through organic developments arising from staff in a bottom-up approach. By contrast, much of Murdoch University’s response has been led by a small number of staff who have middle management responsibilities and who have championed the reform of key university functions, largely in spite of current policy or accepted practice. This paper argues that the ‘middle-out’ strategy has both a basis in change management theory and practice, and a number of strengths, including low risk, low cost, and high sustainability. Three linked examples of middle-out change management in teaching and learning at Murdoch University are described and the outcomes analyzed to demonstrate the benefits and pitfalls of this approach.

  4. Bottom-up derivation of conservative and dissipative interactions for coarse-grained molecular liquids with the conditional reversible work method

    Energy Technology Data Exchange (ETDEWEB)

    Deichmann, Gregor; Marcon, Valentina; Vegt, Nico F. A. van der, E-mail: vandervegt@csi.tu-darmstadt.de [Center of Smart Interfaces, Technische Universität Darmstadt, Alarich-Weiss-Straße 10, 64287 Darmstadt (Germany)

    2014-12-14

    Molecular simulations of soft matter systems have been performed in recent years using a variety of systematically coarse-grained models. With these models, structural or thermodynamic properties can be quite accurately represented while the prediction of dynamic properties remains difficult, especially for multi-component systems. In this work, we use constraint molecular dynamics simulations for calculating dissipative pair forces which are used together with conditional reversible work (CRW) conservative forces in dissipative particle dynamics (DPD) simulations. The combined CRW-DPD approach aims to extend the representability of CRW models to dynamic properties and uses a bottom-up approach. Dissipative pair forces are derived from fluctuations of the direct atomistic forces between mapped groups. The conservative CRW potential is obtained from a similar series of constraint dynamics simulations and represents the reversible work performed to couple the direct atomistic interactions between the mapped atom groups. Neopentane, tetrachloromethane, cyclohexane, and n-hexane have been considered as model systems. These molecular liquids are simulated with atomistic molecular dynamics, coarse-grained molecular dynamics, and DPD. We find that the CRW-DPD models reproduce the liquid structure and diffusive dynamics of the liquid systems in reasonable agreement with the atomistic models when using single-site mapping schemes with beads containing five or six heavy atoms. For a two-site representation of n-hexane (3 carbons per bead), time scale separation can no longer be assumed and the DPD approach consequently fails to reproduce the atomistic dynamics.
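
    For readers unfamiliar with DPD, the sketch below shows the standard pairwise force structure used in such simulations: a conservative term (here a generic soft repulsion standing in for the gradient of a CRW-type potential), a dissipative term proportional to the relative velocity, and a random term satisfying the fluctuation-dissipation relation sigma^2 = 2*gamma*kB*T. This is a generic single-pair illustration in reduced units, not the CRW-DPD code of the paper.

      import numpy as np

      rng = np.random.default_rng(2)

      def dpd_pair_force(r_ij, v_ij, a=25.0, gamma=4.5, kBT=1.0, rc=1.0, dt=0.01):
          """Total DPD force on bead i from bead j for one time step (reduced units)."""
          r = np.linalg.norm(r_ij)
          if r >= rc or r == 0.0:
              return np.zeros(3)
          e = r_ij / r                          # unit vector from j to i
          w = 1.0 - r / rc                      # standard DPD weight function
          sigma = np.sqrt(2.0 * gamma * kBT)    # fluctuation-dissipation relation

          f_cons = a * w * e                                    # conservative (soft repulsion)
          f_diss = -gamma * w**2 * np.dot(e, v_ij) * e          # dissipative (friction)
          f_rand = sigma * w * rng.normal() / np.sqrt(dt) * e   # random (thermostat)
          return f_cons + f_diss + f_rand

      # Example: two beads approaching each other.
      r_ij = np.array([0.6, 0.0, 0.0])    # displacement r_i - r_j
      v_ij = np.array([-0.2, 0.1, 0.0])   # relative velocity v_i - v_j
      print(dpd_pair_force(r_ij, v_ij))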

  5. Fabrication of electrodes for transport control and alignment at micro- and nanoscales using bottom-up and top-down techniques

    Directory of Open Access Journals (Sweden)

    Darwin Rodríguez

    2014-12-01

    The continuing advance of applications in self-assembly, positioning, sensing and actuation devices, and in devices that allow the controlled manipulation of micro- and nanostructures, has generated broad interest in developing methodologies to optimize the fabrication of devices for control and manipulation at micro- and nanoscales. This project explores electrode fabrication techniques with the aim of finding an optimal and reproducible technique. The performance of each technique is compared, and cleaning and safety protocols are described. Three geometries are designed and implemented to mobilize and position iron micro- and nanoparticles in a natural-oil solution. Finally, electric fields are generated for electrophoresis in order to find the curve that describes particle displacement as a function of the applied potential. These results have a strong impact on current bottom-up fabrication efforts (using fields to control location and mobility in electronic devices). Fabricating planar geometries with electrodes opens the possibility of integrating particle motion into the integrated circuits manufactured today.

  6. When top-down becomes bottom up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Directory of Open Access Journals (Sweden)

    Gabriela Orihuela

    Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto) where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana) offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviors are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment.

  7. Synthesis of a Cementitious Material Nanocement Using Bottom-Up Nanotechnology Concept: An Alternative Approach to Avoid CO2 Emission during Production of Cement

    Directory of Open Access Journals (Sweden)

    Byung Wan Jo

    2014-01-01

    There is an increasing worldwide need to develop smart and sustainable construction materials that generate minimal climate-changing gas during their production. Bottom-up nanotechnology has established itself as a promising alternative technique for the production of cementitious material. The present investigation deals with the chemical synthesis of cementitious material using nanosilica, sodium aluminate, sodium hydroxide, and calcium nitrate as reacting phases. The characteristic properties of the chemically synthesized nanocement were verified by chemical composition analysis, setting time measurement, particle size distribution, fineness analysis, and SEM and XRD analyses. Finally, the performance of the nanocement was ensured by the fabrication and characterization of nanocement-based mortar. Comparing the results with commercially available cement, it is demonstrated that the chemically synthesized nanocement not only shows better physical and mechanical performance, but also brings several encouraging impacts to society, including the reduction of CO2 emission and the development of sustainable construction material. A plausible reaction scheme has been proposed to explain the synthesis and the overall performance of the nanocement.

  8. The bottom-up approach to defining life: deciphering the functional organization of biological cells via multi-objective representation of biological complexity from molecules to cells

    Directory of Open Access Journals (Sweden)

    Sathish Periyasamy

    2013-12-01

    In silico representation of cellular systems needs to represent the adaptive dynamics of biological cells, recognizing a cell’s multi-objective topology formed by spatially and temporally cohesive intracellular structures. The design of these models needs to address the hierarchical and concurrent nature of cellular functions and incorporate the ability to self-organise in response to transitions between healthy and pathological phases, and adapt accordingly. The functions of biological systems are constantly evolving due to the ever-changing demands of their environment. Biological systems meet these demands by pursuing objectives, aided by their constituents, giving rise to biological functions. A biological cell is organised into an objective/task hierarchy. This objective hierarchy corresponds to the nested nature of temporally cohesive structures, and representing it will facilitate the study of pleiotropy and polygeny by modeling causalities propagating across multiple interconnected intracellular processes. Although biological adaptations occur on physiological, developmental and reproductive timescales, the paper focuses on adaptations that occur within physiological timescales, where the biomolecular activities contributing to functional organisation play a key role in cellular physiology. The paper proposes a multi-scale and multi-objective modelling approach from the bottom-up by representing temporally cohesive structures for multi-tasking of intracellular processes. Further, the paper characterises the properties and constraints that are consequential to the organisational and adaptive dynamics in biological cells.

  9. When top-down becomes bottom up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Science.gov (United States)

    Orihuela, Gabriela; Terborgh, John; Ceballos, Natalia; Glander, Kenneth

    2014-01-01

    Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto) where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana) offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviors are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment. PMID:24743575

  10. Toward improved prediction of the bedrock depth underneath hillslopes: Bayesian inference of the bottom-up control hypothesis using high-resolution topographic data

    Science.gov (United States)

    Gomes, Guilherme J. C.; Vrugt, Jasper A.; Vargas, Eurípedes A.

    2016-04-01

    The depth to bedrock controls a myriad of processes by influencing subsurface flow paths, erosion rates, soil moisture, and water uptake by plant roots. As hillslope interiors are very difficult and costly to illuminate and access, the topography of the bedrock surface is largely unknown. This essay is concerned with the prediction of spatial patterns in the depth to bedrock (DTB) using high-resolution topographic data, numerical modeling, and Bayesian analysis. Our DTB model builds on the bottom-up control on fresh-bedrock topography hypothesis of Rempe and Dietrich (2014) and includes a mass movement and bedrock-valley morphology term to extend the usefulness and general applicability of the model. We reconcile the DTB model with field observations using Bayesian analysis with the DREAM algorithm. We investigate explicitly the benefits of using spatially distributed parameter values to account implicitly, and in a relatively simple way, for rock mass heterogeneities that are very difficult, if not impossible, to characterize adequately in the field. We illustrate our method using an artificial data set of bedrock depth observations and then evaluate our DTB model with real-world data collected at the Papagaio river basin in Rio de Janeiro, Brazil. Our results demonstrate that the DTB model predicts accurately the observed bedrock depth data. The posterior mean DTB simulation is shown to be in good agreement with the measured data. The posterior prediction uncertainty of the DTB model can be propagated forward through hydromechanical models to derive probabilistic estimates of factors of safety.
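
    The Bayesian reconciliation of a depth-to-bedrock model with observations can be illustrated with a much simpler sampler than DREAM. The sketch below calibrates two invented parameters of a toy DTB predictor with a random-walk Metropolis algorithm; it is meant only to convey the inference logic, not the model or algorithm of the paper.

      import numpy as np

      rng = np.random.default_rng(3)

      # Invented data: hillslope positions x and "observed" depth to bedrock (m).
      x = np.linspace(0.0, 1.0, 40)
      dtb_obs = 2.0 + 3.0 * x * (1.0 - x) + rng.normal(0.0, 0.2, x.size)

      def dtb_model(theta, x):
          """Toy DTB predictor: base depth plus a hillslope-curvature term."""
          base, curv = theta
          return base + curv * x * (1.0 - x)

      def log_posterior(theta, sigma=0.2):
          if theta[0] < 0.0:                          # depths must be non-negative
              return -np.inf
          resid = dtb_obs - dtb_model(theta, x)
          return -0.5 * np.sum((resid / sigma) ** 2)  # flat priors otherwise

      # Random-walk Metropolis sampler.
      theta = np.array([1.0, 1.0])
      logp = log_posterior(theta)
      samples = []
      for _ in range(5000):
          prop = theta + rng.normal(0.0, 0.05, 2)
          logp_prop = log_posterior(prop)
          if np.log(rng.uniform()) < logp_prop - logp:
              theta, logp = prop, logp_prop
          samples.append(theta.copy())

      samples = np.array(samples[1000:])              # discard burn-in
      print("posterior mean parameters:", samples.mean(axis=0))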

  11. Structural and optical nanoscale analysis of GaN core-shell microrod arrays fabricated by combined top-down and bottom-up process on Si(111)

    Science.gov (United States)

    Müller, Marcus; Schmidt, Gordon; Metzner, Sebastian; Veit, Peter; Bertram, Frank; Krylyuk, Sergiy; Debnath, Ratan; Ha, Jong-Yoon; Wen, Baomei; Blanchard, Paul; Motayed, Abhishek; King, Matthew R.; Davydov, Albert V.; Christen, Jürgen

    2016-05-01

    Large arrays of GaN core-shell microrods were fabricated on Si(111) substrates applying a combined bottom-up and top-down approach which includes inductively coupled plasma (ICP) etching of patterned GaN films grown by metal-organic vapor phase epitaxy (MOVPE) and selective overgrowth of obtained GaN/Si pillars using hydride vapor phase epitaxy (HVPE). The structural and optical properties of individual core-shell microrods have been studied with a nanometer scale spatial resolution using low-temperature cathodoluminescence spectroscopy (CL) directly performed in a scanning electron microscope (SEM) and in a scanning transmission electron microscope (STEM). SEM, TEM, and CL measurements reveal the formation of distinct growth domains during the HVPE overgrowth. A high free-carrier concentration observed in the non-polar {1-100} HVPE shells is assigned to in-diffusion of silicon atoms from the substrate. In contrast, the HVPE shells directly grown on top of the c-plane of the GaN pillars reveal a lower free-carrier concentration.

  12. Emotional Intelligence, Identity Salience, and Metaphors. Symposium.

    Science.gov (United States)

    2002

    This document contains three papers from a symposium on emotional intelligence, identity salience, and metaphors in human resource development (HRD). "Applying Client and Consultant Generated Metaphors in HRD: Lessons from Psychotherapy" (Darren Short) reviews some techniques that psychotherapists have devised for using their own metaphors and the…

  13. Visualization of neural networks using saliency maps

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.; Kjems, Ulrik; Hansen, Lars Kai;

    1995-01-01

    The saliency map is proposed as a new method for understanding and visualizing the nonlinearities embedded in feedforward neural networks, with emphasis on the ill-posed case, where the dimensionality of the input-field by far exceeds the number of examples. Several levels of approximations...
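
    A minimal sketch of the underlying idea (the sensitivity of a trained feedforward network's output to each input) is given below; it is a generic input-gradient illustration, not the authors' formulation for the ill-posed many-inputs/few-examples case. Saliency here is simply the gradient of the network output with respect to each input, computed analytically for a one-hidden-layer network with invented weights.

      import numpy as np

      rng = np.random.default_rng(4)

      # A tiny feedforward network with invented weights:
      # 100 inputs -> 10 hidden tanh units -> 1 linear output.
      W1 = rng.normal(0.0, 0.1, (10, 100))
      b1 = np.zeros(10)
      w2 = rng.normal(0.0, 0.3, 10)

      def forward(x):
          h = np.tanh(W1 @ x + b1)
          return w2 @ h

      def saliency(x):
          """Gradient of the output w.r.t. each input component."""
          h = np.tanh(W1 @ x + b1)
          dh = 1.0 - h**2                 # derivative of tanh
          return (w2 * dh) @ W1           # chain rule: dy/dx

      x = rng.normal(size=100)
      s = saliency(x)
      print("most influential inputs:", np.argsort(np.abs(s))[-5:])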

  14. Top-down/bottom-up description of electricity sector for Switzerland using the GEM-E3 computable general equilibrium model

    International Nuclear Information System (INIS)

    Participation of the Paul Scherrer Institute (PSI) in the advancement and extension of the multi-region, Computable General Equilibrium (CGE) model GEM-E3 (CES/KUL, 2002) focused primarily on two top-level facets: a) extension of the model database and model calibration, particularly as related to the second component of this study, which is b) advancement of the dynamics of innovation and investment, primarily through the incorporation of Exogenous Technical Learning (ETL) into the Bottom-Up (BU, technology-based) part of the dynamic upgrade; this latter activity also included the completion of the dynamic coupling of the BU description of the electricity sector with the 'Top-Down' (TD, econometric) description of the economy inherent to the GEM-E3 CGE model. The results of this two-component study are described in two parts that have been combined in this single summary report: Part I describes the methodology and gives illustrative results from the BUTD integration, as well as describing the approach to and giving preliminary results from incorporating an ETL description into the BU component of the overall model; Part II reports on the calibration component of the task in terms of: a) formulating a BU technology database for Switzerland based on previous work; incorporation of that database into the GEM-E3 model; and calibrating the BU database with the TD database embodied in the (Swiss) Social Accounting Matrix (SAM). The BUTD coupling along with the ETL incorporation described in Part I represent the major effort embodied in this investigation, but this effort could not be completed without the calibration preamble reported herein as Part II. A brief summary of the scope of each of these key study components is given. (author)

  15. Top-down model estimates, bottom-up inventories, and future projections of global natural and anthropogenic emissions of nitrous oxide

    Science.gov (United States)

    Davidson, E. A.; Kanter, D.

    2013-12-01

    Nitrous oxide (N2O) is the third most abundantly emitted greenhouse gas and the largest remaining emitted ozone depleting substance. It is a product of nitrifying and denitrifying bacteria in soils, sediments and water bodies. Humans began to disrupt the N cycle in the preindustrial era as they expanded agricultural land, used fire for land clearing and management, and cultivated leguminous crops that carry out biological N fixation. This disruption accelerated after the industrial revolution, especially as the use of synthetic N fertilizers became common after 1950. Here we present findings from a new United Nations Environment Programme report, in which we constrain estimates of the anthropogenic and natural emissions of N2O and consider scenarios for future emissions. Inventory-based estimates of natural emissions from terrestrial, marine and atmospheric sources range from 10 to 12 Tg N2O-N/yr. Similar values can be derived for global N2O emissions that were predominantly natural before the industrial revolution. While there was inter-decadal variability, there was little or no consistent trend in atmospheric N2O concentrations between 1730 and 1850, allowing us to assume near steady state. Assuming an atmospheric lifetime of 120 years, the 'top-down' estimate of pre-industrial emissions of 11 Tg N2O-N/yr is consistent with the bottom-up inventories for natural emissions, although the former includes some modest pre-industrial anthropogenic effects. The N2O source associated with future demand for biofuels is highly uncertain, ranging from trivial to the most significant N2O source to date, depending on the types of plants, their nutrient management, the amount of land used for their cultivation, and the fates of their waste products.

  16. The control of automatic imitation based on bottom-up and top-down cues to animacy: insights from brain and behavior.

    Science.gov (United States)

    Klapper, André; Ramsey, Richard; Wigboldus, Daniël; Cross, Emily S

    2014-11-01

    Humans automatically imitate other people's actions during social interactions, building rapport and social closeness in the process. Although the behavioral consequences and neural correlates of imitation have been studied extensively, little is known about the neural mechanisms that control imitative tendencies. For example, the degree to which an agent is perceived as human-like influences automatic imitation, but it is not known how perception of animacy influences brain circuits that control imitation. In the current fMRI study, we examined how the perception and belief of animacy influence the control of automatic imitation. Using an imitation-inhibition paradigm that involves suppressing the tendency to imitate an observed action, we manipulated both bottom-up (visual input) and top-down (belief) cues to animacy. Results show divergent patterns of behavioral and neural responses. Behavioral analyses show that automatic imitation is equivalent when one or both cues to animacy are present but reduces when both are absent. By contrast, right TPJ showed sensitivity to the presence of both animacy cues. Thus, we demonstrate that right TPJ is biologically tuned to control imitative tendencies when the observed agent both looks like and is believed to be human. The results suggest that right TPJ may be involved in a specialized capacity to control automatic imitation of human agents, rather than a universal process of conflict management, which would be more consistent with generalist theories of imitative control. Evidence for specialized neural circuitry that "controls" imitation offers new insight into developmental disorders that involve atypical processing of social information, such as autism spectrum disorders. PMID:24742157

  17. A comparison of top-down and bottom-up carbon dioxide fluxes in the UK using a multi-platform measurement network.

    Science.gov (United States)

    White, Emily; Rigby, Matt; O'Doherty, Simon; Stavert, Ann; Lunt, Mark; Nemitz, Eiko; Helfter, Carole; Allen, Grant; Pitt, Joe; Bauguitte, Stéphane; Levy, Pete; van Oijen, Marcel; Williams, Mat; Smallman, Luke; Palmer, Paul

    2016-04-01

    Having a comprehensive understanding, on a countrywide scale, of both biogenic and anthropogenic CO2 emissions is essential for knowing how best to reduce anthropogenic emissions and for understanding how the terrestrial biosphere is responding to global fossil fuel emissions. Whilst anthropogenic CO2 flux estimates are fairly well constrained, fluxes from biogenic sources are not. This work will help to verify existing anthropogenic emissions inventories and give a better understanding of biosphere-atmosphere CO2 exchange. Using an innovative top-down inversion scheme (a hierarchical Bayesian Markov Chain Monte Carlo approach with reversible-jump "trans-dimensional" basis function selection), we aim to find emissions estimates for biogenic and anthropogenic sources simultaneously. Our approach allows flux uncertainties to be derived more comprehensively than previous methods, and allows the resolved spatial scales in the solution to be determined using the data. We use atmospheric CO2 mole fraction data from the UK Deriving Emissions related to Climate Change (DECC) and Greenhouse gAs UK and Global Emissions (GAUGE) projects. The network comprises 6 tall tower sites, flight campaigns and a ferry transect along the east coast, and enables us to derive high-resolution monthly flux estimates across the UK and Ireland for the period 2013-2015. We have derived UK total fluxes of 675 ± 78 Tg/yr during January 2014 (seasonal maximum) and 23 ± 96 Tg/yr during May 2014 (seasonal minimum). Our disaggregated anthropogenic and biogenic flux estimates are compared to a new high-resolution, time-resolved anthropogenic inventory that will underpin future UNFCCC reports by the UK, and to the DALEC carbon cycle model. This allows us to identify where significant differences exist between these "bottom-up" and "top-down" flux estimates and suggest reasons for the discrepancies. We will highlight the strengths and limitations of the UK's CO2 emissions verification infrastructure at
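
    The core of any such top-down flux estimate is an atmospheric inversion of the form y = H x + e, where y are mole fraction observations, H is a transport (footprint) operator and x are the unknown fluxes. The study uses a hierarchical, trans-dimensional MCMC scheme; the sketch below shows only the simpler analytic Gaussian special case, with all matrices and magnitudes invented, to convey the structure of the calculation.

      import numpy as np

      rng = np.random.default_rng(5)

      n_flux, n_obs = 20, 200                        # flux regions, observations (invented)
      H = rng.uniform(0.0, 1.0, (n_obs, n_flux))     # transport/footprint operator
      x_true = rng.normal(5.0, 2.0, n_flux)          # "true" regional fluxes
      y = H @ x_true + rng.normal(0.0, 1.0, n_obs)   # synthetic observations

      # Prior and observation error covariances (diagonal, invented magnitudes).
      x_prior = np.full(n_flux, 4.0)
      B = np.diag(np.full(n_flux, 4.0))              # prior flux uncertainty
      R = np.diag(np.full(n_obs, 1.0))               # observation error

      # Analytic Gaussian posterior (Bayesian synthesis / Kalman-type update):
      #   x_post = x_prior + K (y - H x_prior),  K = B H^T (H B H^T + R)^-1
      K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
      x_post = x_prior + K @ (y - H @ x_prior)
      P_post = (np.eye(n_flux) - K @ H) @ B          # posterior covariance

      print("posterior total flux:", round(x_post.sum(), 1),
            "+/-", round(float(np.sqrt(P_post.sum())), 1))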

  18. Reducing energy consumption and CO2 emissions by energy efficiency measures and international trading: A bottom-up modeling for the U.S. iron and steel sector

    International Nuclear Information System (INIS)

    Highlights: • Use ISEEM to evaluate energy and emission reduction in the U.S. iron and steel sector. • ISEEM is a new bottom-up optimization model for industry sector energy planning. • Energy and emission reduction includes efficiency measures and international trading. • International trading includes commodity and carbon among U.S., China and India. • Project annual energy use, CO2 emissions, production, and costs from 2010 to 2050. - Abstract: Using the ISEEM modeling framework, we analyzed the roles of energy efficiency measures, steel commodity and international carbon trading in achieving specific CO2 emission reduction targets in the U.S. iron and steel sector from 2010 to 2050. We modeled how steel demand is balanced under three alternative emission reduction scenarios designed to include national energy efficiency measures, commodity trading, and international carbon trading as key instruments to meet a particular emission restriction target in the U.S. iron and steel sector; and how production, process structure, energy supply, and system costs change with those scenarios. The results advance our understanding of long-term impacts of different energy policy options designed to reduce energy consumption and CO2 emissions for the U.S. iron and steel sector, and generate insight into policy implications for the sector’s environmentally and economically sustainable development. The alternative scenarios associated with a 20% emission-reduction target are projected to result in approximately 11–19% annual energy reduction in the medium term (i.e., 2030) and 9–20% annual energy reduction in the long term (i.e., 2050) compared to the Base scenario

  19. Top-down/bottom-up description of electricity sector for Switzerland using the GEM-E3 computable general equilibrium model

    Energy Technology Data Exchange (ETDEWEB)

    Krakowski, R. A

    2006-06-15

    Participation of the Paul Scherrer Institute (PSI) in the advancement and extension of the multi-region, Computable General Equilibrium (CGE) model GEM-E3 (CES/KUL, 2002) focused primarily on two top-level facets: a) extension of the model database and model calibration, particularly as related to the second component of this study, which is b) advancement of the dynamics of innovation and investment, primarily through the incorporation of Exogenous Technical Learning (ETL) into the Bottom-Up (BU, technology-based) part of the dynamic upgrade; this latter activity also included the completion of the dynamic coupling of the BU description of the electricity sector with the 'Top-Down' (TD, econometric) description of the economy inherent to the GEM-E3 CGE model. The results of this two-component study are described in two parts that have been combined in this single summary report: Part I describes the methodology and gives illustrative results from the BUTD integration, as well as describing the approach to and giving preliminary results from incorporating an ETL description into the BU component of the overall model; Part II reports on the calibration component of the task in terms of: a) formulating a BU technology database for Switzerland based on previous work; incorporation of that database into the GEM-E3 model; and calibrating the BU database with the TD database embodied in the (Swiss) Social Accounting Matrix (SAM). The BUTD coupling along with the ETL incorporation described in Part I represent the major effort embodied in this investigation, but this effort could not be completed without the calibration preamble reported herein as Part II. A brief summary of the scope of each of these key study components is given. (author)

  20. Energetic Bottom-up in the Low Countries. Energy transition from the bottom-up. On happy energetic citizens, solar and wind cooperatives, and new utility companies

    Energy Technology Data Exchange (ETDEWEB)

    Schwencke, A.M.

    2012-08-15

    This essay is an outline of the 'energy transition from the bottom-up'. Its leading questions are: (1) what are the actual initiatives; (2) who is involved; (3) how does one work (organization, business models); (4) why are people active in this field; (5) what good is it; and (6) what is the aim? The essay is based on public information sources (websites, blogs, publications) and interviews with people involved.

  1. Fused methods for visual saliency estimation

    Science.gov (United States)

    Danko, Amanda S.; Lyu, Siwei

    2015-02-01

    In this work, we present a new model of visual saliency by combining results from existing methods, improving upon their performance and accuracy. By fusing pre-attentive and context-aware methods, we highlight the abilities of state-of-the-art models while compensating for their deficiencies. We put this theory to the test in a series of experiments, comparatively evaluating the visual saliency maps and employing them for content-based image retrieval and thumbnail generation. We find that on average our model yields definitive improvements upon recall and f-measure metrics with comparable precisions. In addition, we find that all image searches using our fused method return more correct images and additionally rank them higher than the searches using the original methods alone.
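
    A generic way to fuse saliency maps from different detectors, in the spirit described above though not the authors' exact scheme, is to normalize each map and combine them, e.g. with a weighted sum or an element-wise product. A minimal sketch with invented maps:

      import numpy as np

      rng = np.random.default_rng(6)

      def normalize(m):
          """Rescale a saliency map to [0, 1]."""
          m = m.astype(float)
          return (m - m.min()) / (m.max() - m.min() + 1e-12)

      def fuse(maps, weights=None, mode="sum"):
          """Fuse several saliency maps of equal size into one."""
          maps = [normalize(m) for m in maps]
          if mode == "product":                    # emphasizes consensus regions
              fused = np.prod(np.stack(maps), axis=0)
          else:                                    # weighted average
              if weights is None:
                  weights = np.ones(len(maps)) / len(maps)
              fused = sum(w * m for w, m in zip(weights, maps))
          return normalize(fused)

      # Invented stand-ins for a pre-attentive map and a context-aware map.
      pre_attentive = rng.random((60, 80))
      context_aware = rng.random((60, 80))

      fused_map = fuse([pre_attentive, context_aware], weights=[0.4, 0.6])
      print(fused_map.shape, fused_map.min(), fused_map.max())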

  2. Greenhouse Gas Emission Accounting. Preliminary study as input to a joint Int. IPCC Expert Meeting / CKO-CCB Workshop on Comparison of Top-down versus Bottom-up Emission Estimates.

    NARCIS (Netherlands)

    Amstel, van A.; Kroeze, C.; Janssen, L.J.H.M.; Olivier, J.G.J.; Wal, van der J.T.

    1997-01-01

    Bottom-up data for carbon dioxide, methane and nitrous oxide from the official national inventories (National Communications) were compared with data from EDGAR (Emission Database for Global Atmospheric Research) and top-down emission estimates, based on the results of dispersion and climate models

  3. Greenhouse Gas Emission Accounting: preliminary study as input to a joint International IPCC Expert Meeting/CKO-CCB Workshop on Comparison of Top-down versus Bottom-up Emission Estimates

    NARCIS (Netherlands)

    Amstel AR van; Kroeze C; Janssen LHJM; Olivier JGJ; Wal JT van der; LLO; LUW/WIMEK

    1997-01-01

    Bottom-up data for carbon dioxide, methane and nitrous oxide from the official national inventories (National Communications) were compared with data from EDGAR (Emission Database for Global Atmospheric Research) and top-down emission estimates, based on the results of dispersion and climate models

  4. The impact of napping on memory for future-relevant stimuli: Prioritization among multiple salience cues.

    Science.gov (United States)

    Bennion, Kelly A; Payne, Jessica D; Kensinger, Elizabeth A

    2016-06-01

    Prior research has demonstrated that sleep enhances memory for future-relevant information, including memory for information that is salient due to emotion, reward, or knowledge of a later memory test. Although sleep has been shown to prioritize information with any of these characteristics, the present study investigates the novel question of how sleep prioritizes information when multiple salience cues exist. Participants encoded scenes that were future-relevant based on emotion (emotional vs. neutral), reward (rewarded vs. unrewarded), and instructed learning (intentionally vs. incidentally encoded), preceding a delay consisting of a nap, an equivalent time period spent awake, or a nap followed by wakefulness (to control for effects of interference). Recognition testing revealed that when multiple dimensions of future relevance co-occur, sleep prioritizes top-down, goal-directed cues (instructed learning, and to a lesser degree, reward) over bottom-up, stimulus-driven characteristics (emotion). Further, results showed that these factors interact; the effect of a nap on intentionally encoded information was especially strong for neutral (relative to emotional) information, suggesting that once one cue for future relevance is present, there are diminishing returns with additional cues. Sleep may binarize information based on whether it is future-relevant or not, preferentially consolidating memory for the former category. Potential neural mechanisms underlying these selective effects and the implications of this research for educational and vocational domains are discussed. (PsycINFO Database Record) PMID:27214500

  5. A Statistical Method for Estimating Missing GHG Emissions in Bottom-Up Inventories: The Case of Fossil Fuel Combustion in Industry in the Bogota Region, Colombia

    Science.gov (United States)

    Jimenez-Pizarro, R.; Rojas, A. M.; Pulido-Guio, A. D.

    2012-12-01

    The development of environmentally, socially and financially suitable greenhouse gas (GHG) mitigation portfolios requires detailed disaggregation of emissions by activity sector, preferably at the regional level. Bottom-up (BU) emission inventories are intrinsically disaggregated, but although detailed, they are frequently incomplete. Missing and erroneous activity data are rather common in emission inventories of GHG, criteria and toxic pollutants, even in developed countries. The fraction of missing and erroneous data can be rather large in developing country inventories. In addition, the cost and time for obtaining or correcting this information can be prohibitive or can delay the inventory development. This is particularly true for regional BU inventories in the developing world. Moreover, a rather common practice is to disregard or to arbitrarily impute low default activity or emission values to missing data, which typically leads to significant underestimation of the total emissions. Our investigation focuses on GHG emissions by fossil fuel combustion in industry in the Bogota Region, composed of Bogota and its adjacent, semi-rural area of influence, the Province of Cundinamarca. We found that the BU inventories for this sub-category substantially underestimate emissions when compared to top-down (TD) estimations based on sub-sector specific national fuel consumption data and regional energy intensities. Although both BU inventories have a substantial number of missing and evidently erroneous entries, i.e. information on fuel consumption per combustion unit per company, the validated energy use and emission data display clear and smooth frequency distributions, which can be adequately fitted to bimodal log-normal distributions. This is not unexpected as industrial plant sizes are typically log-normally distributed. Moreover, our statistical tests suggest that industrial sub-sectors, as classified by the International Standard Industrial Classification (ISIC
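
    The imputation idea can be sketched as follows: fit a two-component log-normal mixture to the validated fuel-consumption entries and draw the missing entries from the fitted distribution instead of assuming zero or a low default. The sketch below uses a simple EM fit on invented data; it illustrates the statistical approach in general, not the authors' code or dataset.

      import numpy as np

      rng = np.random.default_rng(7)

      # Invented "validated" fuel-consumption data (TJ/yr per plant): bimodal log-normal.
      data = np.concatenate([rng.lognormal(1.0, 0.4, 300),    # many small plants
                             rng.lognormal(3.5, 0.5, 100)])   # fewer large plants
      z = np.log(data)

      # EM fit of a two-component Gaussian mixture to the log-values.
      w = np.array([0.5, 0.5])            # mixture weights
      mu = np.array([z.min(), z.max()])   # crude initialisation
      sd = np.array([1.0, 1.0])
      for _ in range(200):
          # E-step: responsibilities of each component for each observation.
          dens = np.stack([w[k] / sd[k] * np.exp(-0.5 * ((z - mu[k]) / sd[k])**2)
                           for k in range(2)])
          resp = dens / dens.sum(axis=0)
          # M-step: update weights, means and standard deviations.
          n_k = resp.sum(axis=1)
          w = n_k / z.size
          mu = (resp * z).sum(axis=1) / n_k
          sd = np.sqrt((resp * (z - mu[:, None])**2).sum(axis=1) / n_k)

      # Impute, say, 50 missing plants by sampling from the fitted mixture.
      comp = rng.choice(2, size=50, p=w)
      imputed = np.exp(rng.normal(mu[comp], sd[comp]))
      print("fitted log-means:", np.round(mu, 2),
            "imputed total:", round(imputed.sum(), 1), "TJ/yr")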

  6. Bottom-up workflow modeling approach for business changes

    Institute of Scientific and Technical Information of China (English)

    严志民; 徐玮

    2011-01-01

    To meet the adaptability requirements of workflow in a complicated and rapidly changing business environment, a new data-centric, declarative modeling method named Declarative ARTifact-centric workflow (DART) was proposed. The business process is analyzed in a bottom-up manner so that its building blocks, such as artifacts, activities and business policies, are extracted, and the representation of business components is separated from the representation of business changes. In terms of execution semantics, DART uses Finite State Automata (FSA) to describe the lifecycle of a single artifact and Labeled Transition Systems (LTS) to describe the workflow and the interactions among multiple artifacts. In addition, the path from the DART modeling method to a deployable workflow implementation is discussed. The method was tested against the actual workflows of the Hangzhou real estate administration bureau, and its practical application is described.
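
    The artifact-lifecycle idea can be illustrated with a tiny finite state automaton in code. The states, events and artifact name below are invented examples in the spirit of artifact-centric workflow modeling, not the actual DART definitions.

      # Minimal finite-state-automaton sketch of a single artifact's lifecycle,
      # in the spirit of artifact-centric workflow modeling (invented example).

      class ArtifactFSA:
          def __init__(self, initial, transitions, final):
              self.state = initial
              self.transitions = transitions   # {(state, event): next_state}
              self.final = final

          def fire(self, event):
              key = (self.state, event)
              if key not in self.transitions:
                  raise ValueError(f"event '{event}' not allowed in state '{self.state}'")
              self.state = self.transitions[key]
              return self.state

          def is_complete(self):
              return self.state in self.final

      # Hypothetical "work order" artifact for a property-registration process.
      work_order = ArtifactFSA(
          initial="created",
          transitions={
              ("created", "submit"): "under_review",
              ("under_review", "approve"): "approved",
              ("under_review", "reject"): "rejected",
              ("approved", "archive"): "closed",
          },
          final={"closed", "rejected"},
      )

      for event in ["submit", "approve", "archive"]:
          print(event, "->", work_order.fire(event))
      print("complete:", work_order.is_complete())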

  7. Benefits of China's efforts in gaseous pollutant control indicated by the bottom-up emissions and satellite observations 2000-2014

    Science.gov (United States)

    Xia, Yinmin; Zhao, Yu; Nielsen, Chris P.

    2016-07-01

    To evaluate the effectiveness of national air pollution control policies, the emissions of SO2, NOX, CO and CO2 in China are estimated using bottom-up methods for the most recent 15-year period (2000-2014). Vertical column densities (VCDs) from satellite observations are used to test the temporal and spatial patterns of emissions and to explore the ambient levels of gaseous pollutants across the country. The inter-annual trends in emissions and VCDs match well except for SO2. Such a comparison is improved with the optimistic assumption in the emission estimation that the emission standards for given industrial sources issued after 2010 have been fully enforced. Underestimation of emission abatement and enhanced atmospheric oxidization likely contribute to the discrepancy between SO2 emissions and VCDs. As suggested by VCDs and emissions estimated under the assumption of full implementation of emission standards, the control of SO2 in the 12th Five-Year Plan period (12th FYP, 2011-2015) is estimated to be more effective than that in the 11th FYP period (2006-2010), attributed to improved use of flue gas desulfurization in the power sector and implementation of new emission standards in key industrial sources. The opposite was true for CO, as energy efficiency improved more significantly from 2005 to 2010 due to closures of small industrial plants. Iron & steel production is estimated to have had a particularly strong influence on the temporal and spatial patterns of CO. In contrast to fast growth before 2011 driven by increased coal consumption and limited controls, NOX emissions decreased from 2011 to 2014 due to the penetration of selective catalytic/non-catalytic reduction systems in the power sector. This led to reduced NO2 VCDs, particularly in relatively highly polluted areas such as the eastern China and Pearl River Delta regions. In developed areas, transportation is playing an increasingly important role in air pollution, as suggested by the increased ratio of NO2 to SO

  8. A bottom-up, vulnerability-based framework for identifying the adaptive capacity of water resources systems in a changing climate

    Science.gov (United States)

    Culley, Sam; Noble, Stephanie; Timbs, Michael; Yates, Adam; Giuliani, Matteo; Castelletti, Andrea; Maier, Holger; Westra, Seth

    2015-04-01

    Water resource system infrastructure and operating policies are commonly designed on the assumption that the statistics of future rainfall, temperature and other hydrometeorological variables are equal to those of the historical record. There is now substantial evidence demonstrating that this assumption is no longer valid, and that climate change will significantly impact water resources systems worldwide. Under different climatic inputs, the performance of these systems may degrade to a point where they become unable to meet the primary objectives for which they were built. In such a changing context, using existing infrastructure more efficiently - rather than planning additional infrastructure - becomes key to restoring the system's performance to acceptable levels and minimizing financial investments and associated risk. The traditional top-down approach for assessing climate change impacts relies on the use of a cascade of models from the global to the local scale. However, it is often difficult to utilize this top-down approach in a decision-making procedure, as there is disparity amongst various climate projections, arising from incomplete scientific understanding of the complicated processes and feedbacks within the climate system, and model limitations in reproducing those relationships. In contrast with this top-down approach, this study contributes a framework to identify the adaptive capacity of water resource systems under changing climatic conditions adopting a bottom-up, vulnerability-based approach. The performance of the current system management is first assessed for a comprehensive range of climatic conditions, which are independent of climate model forecasts. The adaptive capacity of the system is then estimated by re-evaluating the performance of a set of adaptive operating policies, which are optimized for each climatic condition under which the system is simulated. The proposed framework reverses the perspective by identifying water system

  9. DISC: Deep Image Saliency Computing via Progressive Representation Learning.

    Science.gov (United States)

    Chen, Tianshui; Lin, Liang; Liu, Lingbo; Luo, Xiaonan; Li, Xuelong

    2016-06-01

    Salient object detection increasingly receives attention as an important component or step in several pattern recognition and image processing tasks. Although a variety of powerful saliency models have been intensively proposed, they usually involve heavy feature (or model) engineering based on priors (or assumptions) about the properties of objects and backgrounds. Inspired by the effectiveness of recently developed feature learning, we provide a novel deep image saliency computing (DISC) framework for fine-grained image saliency computing. In particular, we model the image saliency from both the coarse- and fine-level observations, and utilize the deep convolutional neural network (CNN) to learn the saliency representation in a progressive manner. In particular, our saliency model is built upon two stacked CNNs. The first CNN generates a coarse-level saliency map by taking the overall image as the input, roughly identifying saliency regions in the global context. Furthermore, we integrate superpixel-based local context information in the first CNN to refine the coarse-level saliency map. Guided by the coarse saliency map, the second CNN focuses on the local context to produce fine-grained and accurate saliency map while preserving object details. For a testing image, the two CNNs collaboratively conduct the saliency computing in one shot. Our DISC framework is capable of uniformly highlighting the objects of interest from complex background while preserving well object details. Extensive experiments on several standard benchmarks suggest that DISC outperforms other state-of-the-art methods and it also generalizes well across data sets without additional training. The executable version of DISC is available online: http://vision.sysu.edu.cn/projects/DISC.
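
    The two-stage, coarse-to-fine structure described above can be sketched schematically in PyTorch. The layer sizes and architecture below are invented stand-ins (the actual DISC networks are much deeper and are trained on saliency data); the sketch only shows how a coarse map produced from the whole image can be fed, together with the image, into a second network that refines it.

      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class CoarseNet(nn.Module):
          """First stage: whole image in, low-resolution saliency map out (toy-sized)."""
          def __init__(self):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
                  nn.MaxPool2d(2),
                  nn.Conv2d(32, 1, 1),
              )

          def forward(self, img):
              return torch.sigmoid(self.features(img))   # coarse map at 1/4 resolution

      class FineNet(nn.Module):
          """Second stage: image + upsampled coarse map in, refined saliency map out."""
          def __init__(self):
              super().__init__()
              self.refine = nn.Sequential(
                  nn.Conv2d(4, 16, 3, padding=1), nn.ReLU(),
                  nn.Conv2d(16, 1, 3, padding=1),
              )

          def forward(self, img, coarse):
              coarse_up = F.interpolate(coarse, size=img.shape[-2:],
                                        mode="bilinear", align_corners=False)
              return torch.sigmoid(self.refine(torch.cat([img, coarse_up], dim=1)))

      img = torch.rand(1, 3, 128, 128)          # a dummy input image
      coarse = CoarseNet()(img)
      fine = FineNet()(img, coarse)
      print(coarse.shape, fine.shape)           # (1,1,32,32) and (1,1,128,128)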

  10. DISC: Deep Image Saliency Computing via Progressive Representation Learning.

    Science.gov (United States)

    Chen, Tianshui; Lin, Liang; Liu, Lingbo; Luo, Xiaonan; Li, Xuelong

    2016-06-01

    Salient object detection increasingly receives attention as an important component or step in several pattern recognition and image processing tasks. Although a variety of powerful saliency models have been intensively proposed, they usually involve heavy feature (or model) engineering based on priors (or assumptions) about the properties of objects and backgrounds. Inspired by the effectiveness of recently developed feature learning, we provide a novel deep image saliency computing (DISC) framework for fine-grained image saliency computing. In particular, we model the image saliency from both the coarse- and fine-level observations, and utilize the deep convolutional neural network (CNN) to learn the saliency representation in a progressive manner. In particular, our saliency model is built upon two stacked CNNs. The first CNN generates a coarse-level saliency map by taking the overall image as the input, roughly identifying saliency regions in the global context. Furthermore, we integrate superpixel-based local context information in the first CNN to refine the coarse-level saliency map. Guided by the coarse saliency map, the second CNN focuses on the local context to produce fine-grained and accurate saliency map while preserving object details. For a testing image, the two CNNs collaboratively conduct the saliency computing in one shot. Our DISC framework is capable of uniformly highlighting the objects of interest from complex background while preserving well object details. Extensive experiments on several standard benchmarks suggest that DISC outperforms other state-of-the-art methods and it also generalizes well across data sets without additional training. The executable version of DISC is available online: http://vision.sysu.edu.cn/projects/DISC. PMID:26742147

  11. Salience Effects in the North-West of England

    OpenAIRE

    Sandra Jansen

    2014-01-01

    The question of how we can define salience, what properties it includes and how we can quantify it have been discussed widely over the past thirty years but we still have more questions than answers about this phenomenon, e.g. not only how salience arises, but also how we can define it. However, despite the lack of a clear definition, salience is often taken into account as an explanatory factor in language change. The scientific discourse on salience has in most cases revolved around phonet...

  12. Predicting Subjective Affective Salience from Cortical Responses to Invisible Object Stimuli.

    Science.gov (United States)

    Schmack, Katharina; Burk, Julia; Haynes, John-Dylan; Sterzer, Philipp

    2016-08-01

    The affective value of a stimulus substantially influences its potency to gain access to awareness. Here, we sought to elucidate the neural mechanisms underlying such affective salience in a combined behavioral and fMRI experiment. Healthy individuals with varying degrees of spider phobia were presented with pictures of spiders and flowers suppressed from view by continuous flash suppression. Applying multivoxel pattern analysis, we found that the average time that spider stimuli took relative to flowers to gain access to awareness in each participant could be decoded from fMRI signals evoked by suppressed spider versus flower stimuli in occipitotemporal and orbitofrontal cortex. Our results indicate neural signals during unconscious processing of complex visual information in orbitofrontal and ventral visual areas predict access to awareness of this information, suggesting a crucial role for these higher-level cortical regions in mediating affective salience. PMID:26232987

  13. What is the role of dopamine in reward: hedonic impact, reward learning, or incentive salience?

    Science.gov (United States)

    Berridge, K C; Robinson, T E

    1998-12-01

    What roles do mesolimbic and neostriatal dopamine systems play in reward? Do they mediate the hedonic impact of rewarding stimuli? Do they mediate hedonic reward learning and associative prediction? Our review of the literature, together with results of a new study of residual reward capacity after dopamine depletion, indicates the answer to both questions is 'no'. Rather, dopamine systems may mediate the incentive salience of rewards, modulating their motivational value in a manner separable from hedonia and reward learning. In a study of the consequences of dopamine loss, rats were depleted of dopamine in the nucleus accumbens and neostriatum by up to 99% using 6-hydroxydopamine. In a series of experiments, we applied the 'taste reactivity' measure of affective reactions (gapes, etc.) to assess the capacity of dopamine-depleted rats for: 1) normal affect (hedonic and aversive reactions), 2) modulation of hedonic affect by associative learning (taste aversion conditioning), and 3) hedonic enhancement of affect by non-dopaminergic pharmacological manipulation of palatability (benzodiazepine administration). We found normal hedonic reaction patterns to sucrose vs. quinine, normal learning of new hedonic stimulus values (a change in palatability based on predictive relations), and normal pharmacological hedonic enhancement of palatability. We discuss these results in the context of hypotheses and data concerning the role of dopamine in reward. We review neurochemical, electrophysiological, and other behavioral evidence. We conclude that dopamine systems are not needed either to mediate the hedonic pleasure of reinforcers or to mediate predictive associations involved in hedonic reward learning. We conclude instead that dopamine may be more important to incentive salience attributions to the neural representations of reward-related stimuli. Incentive salience, we suggest, is a distinct component of motivation and reward. In other words, dopamine systems are necessary

  14. Color edge saliency boosting using natural image statistics

    NARCIS (Netherlands)

    D. Rojas Vigo; J. van de Weijer; T. Gevers

    2010-01-01

    State of the art methods for image matching, content-based retrieval and recognition use local features. Most of these still exploit only the luminance information for detection. The color saliency boosting algorithm has provided an efficient method to exploit the saliency of color edges based on in

  15. The Aberrant Salience Inventory: A New Measure of Psychosis Proneness

    Science.gov (United States)

    Cicero, David C.; Kerns, John G.; McCarthy, Denis M.

    2010-01-01

    Aberrant salience is the unusual or incorrect assignment of salience, significance, or importance to otherwise innocuous stimuli and has been hypothesized to be important for psychosis and psychotic disorders such as schizophrenia. Despite the importance of this concept in psychosis research, no questionnaire measures are available to assess…

  16. Greenhouse Gas Emission Accounting. Preliminary study as input to a joint Int. IPCC Expert Meeting / CKO-CCB Workshop on Comparison of Top-down versus Bottom-up Emission Estimates.

    OpenAIRE

    Amstel, van, R.; Kroeze, C.; Janssen, L.J.H.M.; Olivier, J. G. J.; Wal, van der, M.F.

    1997-01-01

    Bottom-up data for carbon dioxide, methane and nitrous oxide from the official national inventories (National Communications) were compared with data from EDGAR (Emission Database for Global Atmospheric Research) and top-down emission estimates, based on the results of dispersion and climate models using measured concentrations of greenhouse gases in the atmosphere. The aims of this preliminary study were to investigate the possibilities of comparing different types of emission inventories, t...

  17. Moving object detection in aerial video based on spatiotemporal saliency

    Institute of Scientific and Technical Information of China (English)

    Shen Hao; Li Shuxiao; Zhu Chengfei; Chang Hongxing; Zhang Jinglan

    2013-01-01

    In this paper, the problem of moving object detection in aerial video is addressed. While motion cues have been extensively exploited in the literature, how to use spatial information is still an open problem. To deal with this issue, we propose a novel hierarchical moving target detection method based on spatiotemporal saliency. Temporal saliency is used to get a coarse segmentation, and spatial saliency is extracted to obtain the object's appearance details in candidate motion regions. Finally, by combining temporal and spatial saliency information, we can get refined detection results. Additionally, in order to give a full description of the object distribution, spatial saliency is detected in both pixel and region levels based on local contrast. Experiments conducted on the VIVID dataset show that the proposed method is efficient and accurate.
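    A rough sketch of the temporal-then-spatial scheme described above might look as follows; the frame-differencing and local-contrast operators are generic stand-ins rather than the detectors used in the paper.

      # Generic temporal + spatial saliency fusion for moving-object detection
      # (stand-in operators, for illustration only).
      import numpy as np
      from scipy.ndimage import gaussian_filter, uniform_filter

      def temporal_saliency(prev_frame, frame):
          """Coarse motion map from a smoothed absolute frame difference."""
          return gaussian_filter(np.abs(frame - prev_frame), sigma=2.0)

      def spatial_saliency(frame, win=15):
          """Local-contrast map: deviation of each pixel from its neighborhood mean."""
          return np.abs(frame - uniform_filter(frame, size=win))

      def detect_moving_objects(prev_frame, frame, t_thresh=0.1):
          candidates = temporal_saliency(prev_frame, frame) > t_thresh   # coarse segmentation
          refined = candidates * spatial_saliency(frame)                 # appearance detail inside motion regions
          return refined / (refined.max() + 1e-8)

      # Toy usage with two synthetic frames:
      rng = np.random.default_rng(1)
      f0 = rng.random((120, 160))
      f1 = f0.copy(); f1[40:60, 70:90] += 0.5       # a "moving" bright patch
      detection_map = detect_moving_objects(f0, f1)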

  18. Improving Saliency Models by Predicting Human Fixation Patches

    KAUST Repository

    Dubey, Rachit

    2015-04-16

    There is growing interest in studying the Human Visual System (HVS) to supplement and improve the performance of computer vision tasks. A major challenge for current visual saliency models is predicting saliency in cluttered scenes (i.e. high false positive rate). In this paper, we propose a fixation patch detector that predicts image patches that contain human fixations with high probability. Our proposed model detects sparse fixation patches with an accuracy of 84 % and eliminates non-fixation patches with an accuracy of 84 % demonstrating that low-level image features can indeed be used to short-list and identify human fixation patches. We then show how these detected fixation patches can be used as saliency priors for popular saliency models, thus, reducing false positives while maintaining true positives. Extensive experimental results show that our proposed approach allows state-of-the-art saliency methods to achieve better prediction performance on benchmark datasets.
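    The way such detected patches could serve as a prior for an existing saliency model can be sketched as a simple reweighting; the patch detector itself is assumed given, so the binary patch mask below is a hypothetical input.

      # Sketch: damp a saliency map outside predicted fixation patches to reduce
      # false positives (the fixation patch detector is assumed given).
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def apply_fixation_prior(saliency_map, patch_mask, floor=0.3, sigma=5.0):
          """patch_mask: binary map of predicted fixation patches (hypothetical input)."""
          prior = gaussian_filter(patch_mask.astype(float), sigma=sigma)   # soften patch borders
          prior = floor + (1.0 - floor) * prior / (prior.max() + 1e-8)     # keep a small floor elsewhere
          reweighted = saliency_map * prior
          return reweighted / (reweighted.max() + 1e-8)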

  19. Salience Effects in the North-West of England

    Directory of Open Access Journals (Sweden)

    Sandra Jansen

    2014-06-01

    Full Text Available The questions of how we can define salience, what properties it includes and how we can quantify it have been discussed widely over the past thirty years, but we still have more questions than answers about this phenomenon, e.g. not only how salience arises, but also how it should be defined. However, despite the lack of a clear definition, salience is often taken into account as an explanatory factor in language change. The scientific discourse on salience has in most cases revolved around phonetic features, while hardly any variables on other linguistic levels have been investigated in terms of their salience. Hence, one goal of this paper is to argue for an expanded view of salience in the sociolinguistic context. This article investigates the variation and change of two groups of variables in Carlisle, an urban speech community in the north-west of England. I analyse the variable (th) and in particular the replacement of /θ/ with [f], which is widely known as th-fronting. The use of three discourse markers is also examined. Both groups of features will then be discussed in the light of sociolinguistic salience.

  20. Invariant Spectral Hashing of Image Saliency Graph

    CERN Document Server

    Taquet, Maxime; De Vleeschouwer, Christophe; Macq, Benoit

    2010-01-01

    Image hashing is the process of associating a short vector of bits to an image. The resulting summaries are useful in many applications including image indexing, image authentication and pattern recognition. These hashes need to be invariant under transformations of the image that result in similar visual content, but should drastically differ for conceptually distinct contents. This paper proposes an image hashing method that is invariant under rotation, scaling and translation of the image. The gist of our approach relies on the geometric characterization of salient point distribution in the image. This is achieved by the definition of a "saliency graph" connecting these points jointly with an image intensity function on the graph nodes. An invariant hash is then obtained by considering the spectrum of this function in the eigenvector basis of the Laplacian graph, that is, its graph Fourier transform. Interestingly, this spectrum is invariant under any relabeling of the graph nodes. The graph reveals geomet...
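    The core idea, an intensity function on salient points expressed in the eigenbasis of a graph Laplacian so that the descriptor is invariant to node relabeling, can be sketched as below. The Gaussian edge weighting and the number of retained coefficients are assumptions for illustration, and salient point detection itself is not reproduced.

      # Sketch of a relabeling-invariant descriptor from a "saliency graph":
      # nodes are salient points, edges are distance-weighted, and the node
      # intensities are projected onto the Laplacian eigenbasis (a graph
      # Fourier transform).
      import numpy as np

      def saliency_graph_hash(points, intensities, sigma=0.1, n_coeffs=16):
          """points: (N, 2) normalized salient point coordinates; intensities: (N,)."""
          d2 = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
          W = np.exp(-d2 / (2.0 * sigma ** 2))          # Gaussian-weighted complete graph
          np.fill_diagonal(W, 0.0)
          L = np.diag(W.sum(axis=1)) - W                # combinatorial graph Laplacian
          _, eigvecs = np.linalg.eigh(L)
          coeffs = eigvecs.T @ intensities              # graph Fourier transform of the intensities
          # Coefficient magnitudes are unchanged by any permutation of the node labels.
          return np.abs(coeffs[:n_coeffs])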

  1. A computational substrate for incentive salience.

    Science.gov (United States)

    McClure, Samuel M; Daw, Nathaniel D; Montague, P Read

    2003-08-01

    Theories of dopamine function are at a crossroads. Computational models derived from single-unit recordings capture changes in dopaminergic neuron firing rate as a prediction error signal. These models employ the prediction error signal in two roles: learning to predict future rewarding events and biasing action choice. Conversely, pharmacological inhibition or lesion of dopaminergic neuron function diminishes the ability of an animal to motivate behaviors directed at acquiring rewards. These lesion experiments have raised the possibility that dopamine release encodes a measure of the incentive value of a contemplated behavioral act. The most complete psychological idea that captures this notion frames the dopamine signal as carrying 'incentive salience'. On the surface, these two competing accounts of dopamine function seem incommensurate. To the contrary, we demonstrate that both of these functions can be captured in a single computational model of the involvement of dopamine in reward prediction for the purpose of reward seeking.
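    The prediction-error signal these models are built around is the standard temporal-difference (TD) error; a minimal tabular sketch on a reward chain, not the authors' specific model, is:

      # Minimal tabular TD(0) sketch: the TD error delta drives value learning,
      # and in actor-critic variants the same signal can bias action choice.
      import numpy as np

      def td_learning(rewards, alpha=0.1, gamma=0.95, n_sweeps=200):
          """rewards[s] is the reward received on entering state s of a simple chain."""
          V = np.zeros(len(rewards))
          for _ in range(n_sweeps):
              for s in range(len(rewards) - 1):
                  delta = rewards[s + 1] + gamma * V[s + 1] - V[s]   # prediction error
                  V[s] += alpha * delta
          return V

      values = td_learning(np.array([0.0, 0.0, 0.0, 1.0]))   # reward only at the end of the chain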

  2. On the Salience-Based Level-k Model

    OpenAIRE

    Wolff, Irenaeus

    2015-01-01

    In the current literature, there is a lively debate about whether a level-k model can be based on salience to explain behaviour in games with distinctive action labels such as hide-and-seek or discoordination games. This study presents six different experiments designed to measure salience. When based on any of these empirical salience measures, the standard level-k model does not explain hide-and-seek behaviour. Modifying the model such that players follow salience when payoffs are equal, the...

  3. Development Of A Web Service And Android 'APP' For The Distribution Of Rainfall Data. A Bottom-Up Remote Sensing Data Mining And Redistribution Project In The Age Of The 'Web 2.0'

    Science.gov (United States)

    Mantas, Vasco M.; Pereira, A. J. S. C.; Liu, Zhong

    2013-12-01

    A project was devised to develop a set of freely available applications and web services that can (1) simplify access from Mobile Devices to TOVAS data and (2) support the development of new datasets through data repackaging and mash-up. The bottom-up approach enables the multiplication of new services, often of limited direct interest to the organizations that produce the original, global datasets, but significant to small, local users. Through this multiplication of services, the development cost is transferred to the intermediate or end users and the entire process is made more efficient, even allowing new players to use the data in innovative ways.

  4. Mortality salience increases personal relevance of the norm of reciprocity.

    Science.gov (United States)

    Schindler, Simon; Reinhard, Marc-André; Stahlberg, Dagmar

    2012-10-01

    Research on terror management theory found evidence that people under mortality salience strive to live up to salient cultural norms and values, like egalitarianism, pacifism, or helpfulness. A basic, strongly internalized norm in most human societies is the norm of reciprocity: people should support those who supported them (i.e., positive reciprocity), and people should injure those who injured them (i.e., negative reciprocity), respectively. In an experiment (N = 98; 47 women, 51 men), mortality salience overall significantly increased personal relevance of the norm of reciprocity (M = 4.45, SD = 0.65) compared to a control condition (M = 4.19, SD = 0.59). Specifically, under mortality salience there was higher motivation to punish those who treated them unfavourably (negative norm of reciprocity). Unexpectedly, relevance of the norm of positive reciprocity remained unaffected by mortality salience. Implications and limitations are discussed.

  5. A ventral salience network in the macaque brain.

    Science.gov (United States)

    Touroutoglou, Alexandra; Bliss-Moreau, Eliza; Zhang, Jiahe; Mantini, Dante; Vanduffel, Wim; Dickerson, Bradford C; Barrett, Lisa Feldman

    2016-05-15

    Successful navigation of the environment requires attending and responding efficiently to objects and conspecifics with the potential to benefit or harm (i.e., that have value). In humans, this function is subserved by a distributed large-scale neural network called the "salience network". We have recently demonstrated that there are two anatomically and functionally dissociable salience networks anchored in the dorsal and ventral portions of the human anterior insula (Touroutoglou et al., 2012). In this paper, we test the hypothesis that these two subnetworks exist in rhesus macaques (Macaca mulatta). We provide evidence that a homologous ventral salience network exists in macaques, but that the connectivity of the dorsal anterior insula in macaques is not sufficiently developed as a dorsal salience network. The evolutionary implications of these findings are considered. PMID:26899785

  6. Motion saliency detection using a temporal fourier transform

    Science.gov (United States)

    Chen, Zhe; Wang, Xin; Sun, Zhen; Wang, Zhijian

    2016-06-01

    Motion saliency detection aims at detecting the dynamic semantic regions in a video sequence. It is very important for many vision tasks. This paper proposes a new type of motion saliency detection method, Temporal Fourier Transform, for fast motion saliency detection. Different from conventional motion saliency detection methods that use complex mathematical models or features, variations in the phase spectrum of consecutive frames are identified and extracted as the key to obtaining the location of salient motion. As all the calculation is made on the temporal frequency spectrum, our model is independent of features, background models, or other forms of prior knowledge about scenes. The benefits of the proposed approach are evaluated for various videos where the number of moving objects, illumination, and background are all different. Compared with some state-of-the-art methods, our method achieves both good accuracy and fast computation.
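    One way to operationalize "variations in the phase spectrum of consecutive frames" is a phase-only reconstruction along the temporal axis, sketched below; this is a generic reading for illustration and not necessarily the formulation used in the paper.

      # Phase-only temporal Fourier sketch: keep only the phase of each pixel's
      # temporal spectrum and reconstruct; pixels with abrupt temporal changes
      # retain high energy.
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def temporal_phase_saliency(frames):
          """frames: array of shape (T, H, W) holding a short video clip."""
          spectrum = np.fft.fft(frames, axis=0)
          phase_only = np.exp(1j * np.angle(spectrum))      # discard magnitude, keep phase
          recon = np.abs(np.fft.ifft(phase_only, axis=0))
          return gaussian_filter(recon.mean(axis=0), sigma=2.0)   # temporal energy per pixel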

  7. Properties of V1 neurons tuned to conjunctions of visual features: application of the V1 saliency hypothesis to visual search behavior.

    Directory of Open Access Journals (Sweden)

    Li Zhaoping

    Full Text Available From a computational theory of V1, we formulate an optimization problem to investigate neural properties in the primary visual cortex (V1) from human reaction times (RTs) in visual search. The theory is the V1 saliency hypothesis that the bottom-up saliency of any visual location is represented by the highest V1 response to it relative to the background responses. The neural properties probed are those associated with the less known V1 neurons tuned simultaneously or conjunctively in two feature dimensions. The visual search is to find a target bar unique in color (C), orientation (O), motion direction (M), or redundantly in combinations of these features (e.g., CO, MO, or CM) among uniform background bars. A feature singleton target is salient because its evoked V1 response largely escapes the iso-feature suppression on responses to the background bars. The responses of the conjunctively tuned cells are manifested in the shortening of the RT for a redundant feature target (e.g., a CO target) from that predicted by a race between the RTs for the two corresponding single feature targets (e.g., C and O targets). Our investigation enables the following testable predictions. Contextual suppression on the response of a CO-tuned or MO-tuned conjunctive cell is weaker when the contextual inputs differ from the direct inputs in both feature dimensions, rather than just one. Additionally, CO-tuned cells and MO-tuned cells are often more active than the single feature tuned cells in response to the redundant feature targets, and this occurs more frequently for the MO-tuned cells such that the MO-tuned cells are no less likely than either the M-tuned or O-tuned neurons to be the most responsive neuron to dictate saliency for an MO target.

  8. Properties of V1 neurons tuned to conjunctions of visual features: application of the V1 saliency hypothesis to visual search behavior.

    Science.gov (United States)

    Zhaoping, Li; Zhe, Li

    2012-01-01

    From a computational theory of V1, we formulate an optimization problem to investigate neural properties in the primary visual cortex (V1) from human reaction times (RTs) in visual search. The theory is the V1 saliency hypothesis that the bottom-up saliency of any visual location is represented by the highest V1 response to it relative to the background responses. The neural properties probed are those associated with the less known V1 neurons tuned simultaneously or conjunctively in two feature dimensions. The visual search is to find a target bar unique in color (C), orientation (O), motion direction (M), or redundantly in combinations of these features (e.g., CO, MO, or CM) among uniform background bars. A feature singleton target is salient because its evoked V1 response largely escapes the iso-feature suppression on responses to the background bars. The responses of the conjunctively tuned cells are manifested in the shortening of the RT for a redundant feature target (e.g., a CO target) from that predicted by a race between the RTs for the two corresponding single feature targets (e.g., C and O targets). Our investigation enables the following testable predictions. Contextual suppression on the response of a CO-tuned or MO-tuned conjunctive cell is weaker when the contextual inputs differ from the direct inputs in both feature dimensions, rather than just one. Additionally, CO-tuned cells and MO-tuned cells are often more active than the single feature tuned cells in response to the redundant feature targets, and this occurs more frequently for the MO-tuned cells such that the MO-tuned cells are no less likely than either the M-tuned or O-tuned neurons to be the most responsive neuron to dictate saliency for an MO target. PMID:22719829
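    The race-model baseline described above can be illustrated with a short Monte Carlo sketch: the predicted RT for a redundant (e.g., CO) target is the minimum of independently sampled single-feature RTs, and observed redundant-target RTs reliably below this prediction are the signature attributed to conjunctively tuned cells. The RT distributions below are hypothetical.

      # Monte Carlo race-model sketch with hypothetical single-feature RT
      # distributions (milliseconds).
      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000
      rt_color = rng.lognormal(mean=6.0, sigma=0.25, size=n)         # C-target RTs
      rt_orientation = rng.lognormal(mean=6.1, sigma=0.25, size=n)   # O-target RTs

      rt_race = np.minimum(rt_color, rt_orientation)                 # race between the two detections
      print(f"mean C: {rt_color.mean():.0f} ms, mean O: {rt_orientation.mean():.0f} ms, "
            f"race prediction for CO: {rt_race.mean():.0f} ms")
      # A measured mean CO RT clearly below the race prediction would indicate an
      # extra contribution, e.g., from CO-tuned conjunctive cells.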

  9. Multi-scale saliency search in image analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Campisi, Anthony; Backer, Alejandro

    2005-10-01

    Saliency detection in images is an important outstanding problem both in machine vision design and the understanding of human vision mechanisms. Recently, seminal work by Itti and Koch resulted in an effective saliency-detection algorithm. We reproduce the original algorithm in a software application Vision and explore its limitations. We propose extensions to the algorithm that promise to improve performance in the case of difficult-to-detect objects.
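    The center-surround mechanism at the core of the Itti and Koch model can be sketched compactly; this is a simplified single-channel (intensity) version for illustration, not the full multi-feature, multi-scale implementation reproduced in the report.

      # Simplified intensity-channel center-surround saliency in the spirit of
      # Itti and Koch (one feature, a few scales; not the full model).
      import numpy as np
      from scipy.ndimage import gaussian_filter

      def center_surround_saliency(image, center_sigmas=(1, 2, 4), surround_ratio=4):
          saliency = np.zeros_like(image, dtype=float)
          for sigma in center_sigmas:
              center = gaussian_filter(image, sigma=sigma)
              surround = gaussian_filter(image, sigma=sigma * surround_ratio)
              saliency += np.abs(center - surround)     # center-surround difference at this scale
          return saliency / (saliency.max() + 1e-8)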

  10. Toward a Theory of Stakeholder Salience in Family Firms

    OpenAIRE

    Mitchell, Ronald K.; Agle, Bradley; Chrisman, James; Spence, Laura

    2011-01-01

    The notion of stakeholder salience based on attributes (e.g., power, legitimacy, urgency) is applied in the family business setting. We argue that where principal institutions intersect (i.e., family and business), managerial perceptions of stakeholder salience will be different and more complex than where institutions are based on a single dominant logic. We propose that (1) whereas utilitarian power is more likely in the general business case, normative power is more typical in family ...

  11. Multi-polarimetric textural distinctiveness for outdoor robotic saliency detection

    Science.gov (United States)

    Haider, S. A.; Scharfenberger, C.; Kazemzadeh, F.; Wong, A.; Clausi, D. A.

    2015-01-01

    Mobile robots that rely on vision, for navigation and object detection, use saliency approaches to identify a set of potential candidates to recognize. The state of the art in saliency detection for mobile robotics often relies upon visible light imaging, using conventional camera setups, to distinguish an object against its surroundings based on factors such as feature compactness, heterogeneity and/or homogeneity. We demonstrate a novel multi-polarimetric saliency detection approach which uses multiple measured polarization states of a scene. We leverage the light-material interaction known as Fresnel reflections to extract rotationally invariant multi-polarimetric textural representations to then train a high dimensional sparse texture model. The multi-polarimetric textural distinctiveness is characterized using a conditional probability framework based on the sparse texture model which is then used to determine the saliency at each pixel of the scene. It was observed that through the inclusion of additional polarized states into the saliency analysis, we were able to compute noticeably improved saliency maps in scenes where objects are difficult to distinguish from their background due to color intensity similarities between the object and its surroundings.

  12. What drives farmers to make top-down or bottom-up adaptation to climate change and fluctuations? A comparative study on 3 cases of apple farming in Japan and South Africa.

    Science.gov (United States)

    Fujisawa, Mariko; Kobayashi, Kazuhiko; Johnston, Peter; New, Mark

    2015-01-01

    Agriculture is one of the most vulnerable sectors to climate change. Farmers have been exposed to multiple stressors including climate change, and they have managed to adapt to those risks. The adaptation actions undertaken by farmers and their decision making are, however, only poorly understood. By studying adaptation practices undertaken by apple farmers in three regions: Nagano and Kazuno in Japan and Elgin in South Africa, we categorize the adaptation actions into two types: farmer-initiated bottom-up adaptation and institution-led top-down adaptation. We found that the driver which differentiates the type of adaptation likely adopted was strongly related to the farmers' characteristics, particularly their dependence on institutions, e.g. the farmers' cooperative, in selling their products. The farmers who rely on the farmers' cooperative for their sales are likely to adopt the institution-led adaptation, whereas the farmers who have established their own sales channels tend to initiate innovative actions from the bottom up. We further argue that even though the two types have contrasting features, combinations of both types of adaptation could lead to more successful adaptation, particularly in agriculture. This study also emphasizes that more farm-level studies for various crops and regions are warranted to provide substantial feedback to adaptation policy.

  13. What drives farmers to make top-down or bottom-up adaptation to climate change and fluctuations? A comparative study on 3 cases of apple farming in Japan and South Africa.

    Directory of Open Access Journals (Sweden)

    Mariko Fujisawa

    Full Text Available Agriculture is one of the most vulnerable sectors to climate change. Farmers have been exposed to multiple stressors including climate change, and they have managed to adapt to those risks. The adaptation actions undertaken by farmers and their decision making are, however, only poorly understood. By studying adaptation practices undertaken by apple farmers in three regions: Nagano and Kazuno in Japan and Elgin in South Africa, we categorize the adaptation actions into two types: farmer-initiated bottom-up adaptation and institution-led top-down adaptation. We found that the driver which differentiates the type of adaptation likely adopted was strongly related to the farmers' characteristics, particularly their dependence on institutions, e.g. the farmers' cooperative, in selling their products. The farmers who rely on the farmers' cooperative for their sales are likely to adopt the institution-led adaptation, whereas the farmers who have established their own sales channels tend to initiate innovative actions from the bottom up. We further argue that even though the two types have contrasting features, combinations of both types of adaptation could lead to more successful adaptation, particularly in agriculture. This study also emphasizes that more farm-level studies for various crops and regions are warranted to provide substantial feedback to adaptation policy.

  14. A framework for assessing inter-individual variability in pharmacokinetics using virtual human populations and integrating general knowledge of physical chemistry, biology, anatomy, physiology and genetics: A tale of 'bottom-up' vs 'top-down' recognition of covariates.

    Science.gov (United States)

    Jamei, Masoud; Dickinson, Gemma L; Rostami-Hodjegan, Amin

    2009-01-01

    An increasing number of failures in clinical stages of drug development have been related to the effects of candidate drugs in a sub-group of patients rather than the 'average' person. Expectation of extreme effects or lack of therapeutic effects in some subgroups following administration of similar doses requires a full understanding of the issue of variability and the importance of identifying covariates that determine the exposure to the drug candidates in each individual. In any drug development program the earlier these covariates are known the better. An important component of the drive to decrease this failure rate in drug development involves attempts to use physiologically-based pharmacokinetics 'bottom-up' modeling and simulation to optimize molecular features with respect to the absorption, distribution, metabolism and elimination (ADME) processes. The key element of this approach is the separation of information on the system (i.e. human body) from that of the drug (e.g. physicochemical characteristics determining permeability through membranes, partitioning to tissues, binding to plasma proteins or affinities toward certain enzymes and transporter proteins) and the study design (e.g. dose, route and frequency of administration, concomitant drugs and food). In this review, the classical 'top-down' approach in covariate recognition is compared with the 'bottom-up' paradigm. The determinants and sources of inter-individual variability in different stages of drug absorption, distribution, metabolism and excretion are discussed in detail. Further, the commonly known tools for simulating ADME properties are introduced.

  15. The Culture War and Issue Salience

    DEFF Research Database (Denmark)

    Wroe, Andrew; Ashbee, Edward; Gosling, Amanda

    2014-01-01

    Despite much talk of a culture war, scholars continue to argue over whether the American public is divided on cultural and social issues. Some of the most prominent work in this area, such as Fiorina's Culture War?, has rejected the idea. However, this work has in turn been criticized for focussing only on the distribution of attitudes within the American public and ignoring the possibility that the culture war may also be driven by the increasing strength with which sections of the population hold their opinions. This paper tests the strength, or saliency, hypothesis using individual-level ... culture. Both findings enhance but also complicate our theoretical understanding of the culture war, and have important real-world consequences for American politics.

  16. Visual salience guided feature-aware shape simplification

    Institute of Scientific and Technical Information of China (English)

    Yong-wei MIAO; Fei-xia HU; Min-yan CHEN; Zhen LIU; Hua-hao SHOU

    2014-01-01

    In the areas of 3D digital engineering and 3D digital geometry processing, shape simplification is an important task for reducing the large memory requirements and high time complexity of detailed models. By incorporating the content-aware visual salience measure of a polygonal mesh into the simplification operation, a novel feature-aware shape simplification approach is presented in this paper. Owing to the robust extraction of relief heights on 3D highly detailed meshes, our visual salience measure is defined by a center-surround operator on Gaussian-weighted relief heights in a scale-dependent manner. Guided by our visual salience map, the feature-aware shape simplification algorithm can be performed by weighting the high-dimensional feature space quadric error metric of vertex pair contractions with the weight map derived from our visual salience map. The weighted quadric error metric is calculated in a six-dimensional feature space by combining the position and normal information of mesh vertices. Experimental results demonstrate that our visual salience guided shape simplification scheme can adaptively and effectively re-sample the underlying models in a feature-aware manner, which can account for the visually salient features of complex shapes and thus yield better visual fidelity.

  17. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    Science.gov (United States)

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset.
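    The expected net gain of sampling calculation behind this kind of analysis can be illustrated with a simplified Monte Carlo sketch for a single proposed design, using a normal prior on incremental net benefit and a normal sampling model; all numbers are invented for illustration and the study's actual bivariate model is not reproduced.

      # Monte Carlo sketch of expected value of sample information (EVSI) and
      # expected net gain of sampling (ENGS) for one hypothetical design.
      import numpy as np

      rng = np.random.default_rng(3)
      mu0, sd0 = 500.0, 1500.0        # prior mean / sd of incremental net benefit (GBP)
      sample_sd, n = 6000.0, 200      # per-patient sampling sd and proposed sample size
      population, cost = 10_000, 50_000

      theta = rng.normal(mu0, sd0, size=200_000)             # draws from the prior
      xbar = rng.normal(theta, sample_sd / np.sqrt(n))        # predicted trial means (preposterior)
      shrink = sd0**2 / (sd0**2 + sample_sd**2 / n)
      post_mean = mu0 + shrink * (xbar - mu0)                 # posterior mean given each predicted trial

      evsi = np.mean(np.maximum(post_mean, 0.0)) - max(mu0, 0.0)
      engs = evsi * population - cost
      print(f"EVSI per patient: {evsi:.1f} GBP; ENGS: {engs:,.0f} GBP")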

  18. Bridging the Gap between the Nanometer-Scale Bottom-Up and Micrometer-Scale Top-Down Approaches for Site-Defined InP/InAs Nanowires.

    Science.gov (United States)

    Zhang, Guoqiang; Rainville, Christophe; Salmon, Adrian; Takiguchi, Masato; Tateno, Kouta; Gotoh, Hideki

    2015-11-24

    This work presents a method that bridges the gap between the nanometer-scale bottom-up and micrometer-scale top-down approaches for site-defined nanostructures, which has long been a significant challenge for applications that require low-cost and high-throughput manufacturing processes. We realized the bridging by controlling the seed indium nanoparticle position through a self-assembly process. Site-defined InP nanowires were then grown from the indium-nanoparticle array in the vapor-liquid-solid mode through a "seed and grow" process. The nanometer-scale indium particles do not always occupy the same locations within the micrometer-scale open window of an InP exposed substrate due to the scale difference. We developed a technique for aligning the nanometer-scale indium particles on the same side of the micrometer-scale window by structuring the surface of a misoriented InP (111)B substrate. Finally, we demonstrated that the developed method can be used to grow a uniform InP/InAs axial-heterostructure nanowire array. The ability to form a heterostructure nanowire array with this method makes it possible to tune the emission wavelength over a wide range by employing the quantum confinement effect and thus expand the application of this technology to optoelectronic devices. Successfully pairing a controllable bottom-up growth technique with a top-down substrate preparation technique greatly improves the potential for the mass-production and widespread adoption of this technology. PMID:26348087

  19. A Brief Talk about Bottom-up Shell Script Programming and Efficiency Optimization

    Institute of Scientific and Technical Information of China (English)

    江松波; 倪子伟

    2011-01-01

    Shell scripts run in interpreter mode, which has always been inefficient, and poor design further degrades their runtime performance. This paper analyzes the characteristics of the Shell language and its applications and, from the perspective of layered design, proposes a "bottom-up Shell scripting" approach. It also puts forward a comprehensive method for mastering Shell utilities, working from the "external system environment" to the "internal execution model". A case study demonstrates that the bottom-up Shell scripting approach and methods can effectively improve script execution efficiency.

  20. Fast and Conspicuous? Quantifying Salience With the Theory of Visual Attention

    Science.gov (United States)

    Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid

    2016-01-01

    Particular differences between an object and its surrounding cause salience, guide attention, and improve performance in various tasks. While much research has been dedicated to identifying which feature dimensions contribute to salience, much less regard has been paid to the quantitative strength of the salience caused by feature differences. Only a few studies systematically related salience effects to a common salience measure, and they are partly outdated in the light of new findings on the time course of salience effects. We propose Bundesen’s Theory of Visual Attention (TVA) as a theoretical basis for measuring salience and introduce an empirical and modeling approach to link this theory to data retrieved from temporal-order judgments. With this procedure, TVA becomes applicable to a broad range of salience-related stimulus material. Three experiments with orientation pop-out displays demonstrate the feasibility of the method. A 4th experiment substantiates its applicability to the luminance dimension. PMID:27168868
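    The link between salience and temporal-order judgments can be illustrated with a Monte Carlo sketch in the spirit of TVA: each stimulus is encoded after an exponentially distributed time whose rate reflects its processing speed, and a salient item (higher rate) is reported first more often at the same stimulus onset asynchrony. The rates and SOA below are invented, and this is not the authors' fitting procedure.

      # Monte Carlo sketch of a TVA-style exponential race behind a
      # temporal-order judgment (illustrative parameters only).
      import numpy as np

      def prob_salient_reported_first(v_salient, v_reference, soa_ms, n=200_000, seed=4):
          """soa_ms > 0: the reference stimulus appears soa_ms before the salient one."""
          rng = np.random.default_rng(seed)
          t_salient = soa_ms + rng.exponential(1000.0 / v_salient, size=n)    # encoding times in ms
          t_reference = rng.exponential(1000.0 / v_reference, size=n)
          return float(np.mean(t_salient < t_reference))

      # A salient item (40 items/s) overcomes a 30 ms head start of the reference
      # item more often than a non-salient one (20 items/s):
      print(prob_salient_reported_first(40, 20, soa_ms=30),
            prob_salient_reported_first(20, 20, soa_ms=30))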

  1. Salience and Strategy Choice in 2 × 2 Games

    Directory of Open Access Journals (Sweden)

    Jonathan W. Leland

    2015-10-01

    Full Text Available We present a model of boundedly rational play in single-shot 2 × 2 games. Players choose strategies based on the perceived salience of their own payoffs and, if own-payoff salience is uninformative, on the perceived salience of their opponent’s payoffs. When own payoffs are salient, the model’s predictions correspond to those of Level-1 players in a cognitive hierarchy model. When it is the other player’s payoffs that are salient, the predictions of the model correspond to those of traditional game theory. The model provides unique predictions for the entire class of 2 × 2 games. It identifies games where a Nash equilibrium will always occur, ones where it will never occur, and ones where it will occur only for certain payoff values. It also predicts the outcome of games for which there are no pure Nash equilibria. Experimental results supporting these predictions are presented.

  2. Moving Foreground Detection Based On Spatio-temporal Saliency

    Directory of Open Access Journals (Sweden)

    Yang Xia

    2013-01-01

    Full Text Available Detection of moving foreground in video is very important for many applications, such as visual surveillance and object-based video coding. When objects move with different speeds and under illumination changes, the robustness of the moving object detection methods proposed so far is still not satisfactory. In this paper, we use semantic information, obtained from a spatial saliency map based on a Gaussian mixture model (GMM) in luma space and a temporal saliency map obtained by background subtraction, to adjust the pixel-wise learning rate adaptively for more robust detection performance. In addition, we design a two-pass background estimation framework, in which the initial estimate is used for temporal saliency estimation, and the other pass is used to detect the foreground and update model parameters. The experimental results show that our method can achieve better moving object extraction performance than the existing background subtraction method based on GMM.
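    A stripped-down version of the pixel-wise adaptive learning rate idea is sketched below using a simple running-average background model in place of the GMM: the update rate is lowered where a (given) saliency map is high, so salient foreground is absorbed into the background more slowly. This is a stand-in, not the authors' implementation.

      # Background model with a pixel-wise learning rate modulated by saliency
      # (running average stands in for the GMM).
      import numpy as np

      def update_background(background, frame, saliency, base_rate=0.05):
          """saliency in [0, 1]; higher saliency -> smaller per-pixel learning rate."""
          rate = base_rate * (1.0 - saliency)
          return (1.0 - rate) * background + rate * frame

      def foreground_mask(background, frame, thresh=0.1):
          return np.abs(frame - background) > thresh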

  3. The Electrophysiological Signature of Motivational Salience in Mice and Implications for Schizophrenia

    OpenAIRE

    Moessnang, Carolin; Habel, Ute; Schneider, Frank; Siegel, Steven J.

    2012-01-01

    According to the aberrant-salience hypothesis, attribution of motivational salience is severely disrupted in patients with schizophrenia. To provide a translational approach for investigating underlying mechanisms, neural correlates of salience attribution were examined in normal mice and in a MK-801 model of schizophrenia. Electrophysiological responses to standard and deviant tones were assessed in the medial prefrontal cortex (mPFC) using an auditory oddball paradigm. Motivational salience...

  4. Bottom Up Succession Planning Works Better.

    Science.gov (United States)

    Stevens, Paul

    Most succession planning practices are based on the premise that ambitious people have and want only one career direction--upwardly mobile. However, employees have 10 career direction options at any stage of their working lives. A minority want the career action requiring promotion. Employers with a comprehensive career planning support program…

  5. Mobile Handsets from the Bottom Up

    DEFF Research Database (Denmark)

    Wallis, Cara; Linchuan Qiu, Jack; Ling, Richard

    2013-01-01

    The setting could be a hole-in-the-wall that serves as a shop in a narrow alley in Guangzhou, a cart on a dusty street on the outskirts of Accra, a bustling marketplace in Mexico City, or a tiny storefront near downtown Los Angeles’ garment district. At such locales, men and women hawk an array...

  6. Bottom Up Project Cost and Risk Modeling

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcosm along with its partners HRP Systems, End-to-End Analytics, and ARES Corporation (unfunded in Phase I), propose to develop a new solution for detailed data...

  7. Bottom-up approach to spatial datamining

    OpenAIRE

    Künzi, Christophe; Stoffel, Kilian

    2013-01-01

    One of the goals of computer vision research is to design systems that provide human-like visual capabilities such that a certain environment can be sensed and interpreted to take appropriate actions. Among the different forms available to represent such an environment, the 3D point cloud (an unstructured collection of points in a three-dimensional space) raises a lot of challenging problems. Moreover, the number of 3D data collections has drastically increased in recent years, as improvements in the ...

  8. Promoting education from the bottom up

    Science.gov (United States)

    Page, Martin

    2010-02-01

    I am not an academic, just a foot soldier: I help out with the Children's University and as a Schools Science Ambassador, giving talks and demonstrations in physics. The recent £40m cuts in the budget of the UK's Science and Technology Facilities Council (January p6) have created quite a stir in the academic community, with talk of a "brain drain". Those in the penthouse should come down to the basement, where they would see just how thin and fragile the scientific foundation is.

  9. Teaching Listening Comprehension: Bottom-Up Approach

    Science.gov (United States)

    Khuziakhmetov, Anvar N.; Porchesku, Galina V.

    2016-01-01

    Improving listening comprehension skills is one of the urgent contemporary educational problems in the field of second language acquisition. Understanding how L2 listening comprehension works can have a serious influence on language pedagogy. The aim of the paper is to discuss the practical and methodological value of the notion of the perception…

  10. Bottom-up tailoring of photonic nanofibers

    DEFF Research Database (Denmark)

    Balzer, Frank; Madsen, Morten; Frese, Ralf;

    2008-01-01

    Aligned ensembles of nanoscopic nanofibers from organic molecules such as para-phenylenes for photonic applications can be fabricated by self-assembled molecular growth on a suited dielectric substrate. Epitaxy together with alignment due to electric surface fields determines the growth directions...

  11. Research and Development from the bottom up

    DEFF Research Database (Denmark)

    Brem, Alexander; Wolfram, P.

    2014-01-01

    A framework is introduced consisting of three core dimensions: sophistication, sustainability, and emerging market orientation. On the basis of these dimensions, analogies and distinctions between the terms are identified and general tendencies are explored, such as the increasing importance of sustainability in social and ecological context or the growing interest of developed market firms in approaches from emerging markets. Hence, the presented framework supports further research in new paradigms for research and development (R&D) in developed market firms (DMFs), particularly in relation to emerging markets. This framework enables scholars to compare concepts from developed and emerging markets, to address studies specifically by using consistent terms, and to advance research into the concepts according to their characterization.

  12. Milk bottom-up proteomics: method optimisation.

    Directory of Open Access Journals (Sweden)

    Delphine eVincent

    2016-01-01

    Full Text Available Milk is a complex fluid whose proteome displays a diverse set of proteins of high abundance such as caseins and medium to low abundance whey proteins such as β-lactoglobulin, lactoferrin, immunoglobulins, glycoproteins, peptide hormones and enzymes. A sample preparation method that enables high reproducibility and throughput is key in reliably identifying proteins present or proteins responding to conditions such as a diet, health or genetics. Using skim milk samples from Jersey and Holstein-Friesian cows, we compared three extraction procedures which have not previously been applied to samples of cows' milk. Method A (urea) involved a simple dilution of the milk in a urea-based buffer, method B (TCA/acetone) involved a trichloroacetic acid (TCA)/acetone precipitation and method C (methanol/chloroform) involved a tri-phasic partition method in chloroform/methanol solution. Protein assays, SDS-PAGE profiling, and trypsin digestion followed by nanoHPLC-electrospray ionisation-tandem mass spectrometry (nLC-ESI-MS/MS) analyses were performed to assess their efficiency. Replicates were used at each analytical step (extraction, digestion, injection) to assess reproducibility. Mass spectrometry (MS) data are available via ProteomeXchange with identifier PXD002529. Overall 186 unique accessions, major and minor proteins, were identified with a combination of methods. Method C (methanol/chloroform) yielded the best resolved SDS-patterns and highest protein recovery rates, method A (urea) yielded the greatest number of accessions, and, of the three procedures, method B (TCA/acetone) was the least compatible of all with a wide range of downstream analytical procedures. Our results also highlighted breed differences between the proteins in milk of Jersey and Holstein-Friesian cows.

  13. Bottom-up Experiments and Concrete Utopias

    DEFF Research Database (Denmark)

    Andersson, Lasse

    2010-01-01

    The article examines how user-driven experiments can challenge the standardized, business-oriented version of the Experience City and, through experimentation, stimulate locally anchored and democratic versions of an experience- and knowledge-based city.

  14. Land Cover Change Detection Using Saliency and Wavelet Transformation

    Science.gov (United States)

    Zhang, Haopeng; Jiang, Zhiguo; Cheng, Yan

    2016-06-01

    How to obtain an accurate difference map remains an open challenge in change detection. To tackle this problem, we propose a change detection method based on saliency detection and wavelet transformation. We perform frequency-tuned saliency detection on the initial difference image (IDI), obtained by a logarithm ratio, to get a salient difference image (SDI). Then, we calculate the local entropy of the SDI to obtain an entropic salient difference image (ESDI). The final difference image (FDI) is the wavelet fusion of the IDI and ESDI, and Otsu thresholding is used to extract the difference map from the FDI. Experimental results validate the effectiveness and feasibility of the method.
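    A simplified sketch of this pipeline is given below: a log-ratio initial difference image, a frequency-tuned-style saliency map, fusion, and Otsu thresholding. Plain averaging stands in for the wavelet fusion and the local-entropy step is omitted, so this is illustrative only.

      # Simplified saliency-guided change detection sketch (averaging replaces
      # the wavelet fusion; local entropy omitted).
      import numpy as np
      from scipy.ndimage import gaussian_filter
      from skimage.filters import threshold_otsu

      def change_map(img_t1, img_t2, eps=1e-6):
          idi = np.abs(np.log((img_t1 + eps) / (img_t2 + eps)))   # log-ratio difference image
          blurred = gaussian_filter(idi, sigma=2.0)
          sdi = np.abs(blurred - idi.mean())                      # frequency-tuned-style saliency
          fdi = 0.5 * idi + 0.5 * sdi                             # stand-in for wavelet fusion
          return fdi > threshold_otsu(fdi)                        # binary change map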

  15. Transmitting the sum of all fears: Iranian nuclear threat salience among offspring of Holocaust survivors.

    Science.gov (United States)

    Shrira, Amit

    2015-07-01

    Many Israelis are preoccupied with the prospect of a nuclear-armed Iran, frequently associating it with the danger of annihilation that existed during the Holocaust. The current article examined whether offspring of Holocaust survivors (OHS) are especially preoccupied and sensitive to the Iranian threat, and whether this susceptibility is a part of their increased general image of actual and potential threats, defined as the hostile world scenario (HWS). Study 1 (N = 106) showed that relative to comparisons, OHS reported more preoccupation with the Iranian nuclear threat. Moreover, the positive relationship between the salience of the Iranian threat and symptoms of anxiety was stronger among OHS. Study 2 (N = 450) replicated these findings, while focusing on the Iranian nuclear threat salience and symptoms of psychological distress. It further showed that OHS reported more negative engagement with the HWS (i.e., feeling that surrounding threats decrease one's sense of competence), which in turn mediated their increased preoccupation with the Iranian threat. The results suggest that intergenerational transmission of the Holocaust trauma includes heightened preoccupation with and sensitivity to potential threats of annihilation, and that the specific preoccupation with threats of annihilation reflects a part of a more general preoccupation with surrounding threats. PMID:25793401

  16. Visual salience modulates structure choice in relative clause production.

    Science.gov (United States)

    Montag, Jessica L; MacDonald, Maryellen C

    2014-06-01

    The role of visual salience on utterance form was investigated in a picture description study. Participants heard spoken questions about animate or inanimate entities in a picture and produced a relative clause in response. Visual properties of the scenes affected production choices such that less salient inanimate entities tended to yield longer initiation latencies and to be described with passive relative clauses more than visually salient inanimates. We suggest that the participants' question-answering task can change as a function of visual salience of entities in the picture. Less salient entities require a longer visual search of the scene, which causes the speaker to notice or attend more to the non-target competitors in the picture. As a result, it becomes more important in answering the question for the speaker to contrast the target item with a salient competitor. This effect is different from other effects of visual salience, which tend to find that more salient entities take more prominent grammatical roles in the sentence. We interpret this discrepancy as evidence that visual salience does not have a single effect on sentence production, but rather its effect is modulated by task and linguistic context.

  17. Image based Monument Recognition using Graph based Visual Saliency

    DEFF Research Database (Denmark)

    Kalliatakis, Grigorios; Triantafyllidis, Georgios

    2013-01-01

    This article presents an image-based application aiming at simple image classification of well-known monuments in the area of Heraklion, Crete, Greece. This classification takes place by utilizing Graph Based Visual Saliency (GBVS) and employing Scale Invariant Feature Transform (SIFT) or Speeded...
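    The matching step of such an application might look like the sketch below, which classifies a query image by the reference monument with the most ratio-test-filtered SIFT matches; the GBVS masking step is not reproduced, and the sketch assumes OpenCV (opencv-python) is available.

      # SIFT matching sketch for simple monument classification (saliency
      # masking omitted; requires opencv-python with SIFT support).
      import cv2

      def count_good_matches(desc_query, desc_ref, ratio=0.75):
          matcher = cv2.BFMatcher(cv2.NORM_L2)
          matches = matcher.knnMatch(desc_query, desc_ref, k=2)
          return sum(1 for m, n in matches if m.distance < ratio * n.distance)

      def classify(query_gray, reference_grays):
          """reference_grays: dict mapping monument name -> grayscale uint8 image."""
          sift = cv2.SIFT_create()
          _, desc_q = sift.detectAndCompute(query_gray, None)
          scores = {name: count_good_matches(desc_q, sift.detectAndCompute(ref, None)[1])
                    for name, ref in reference_grays.items()}
          return max(scores, key=scores.get)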

  18. Design of the Bottom-up Innovation project - a participatory, primary preventive, organizational level intervention on work-related stress and well-being for workers in Dutch vocational education

    Science.gov (United States)

    2013-01-01

    Background In the educational sector job demands have intensified, while job resources remained the same. A prolonged disbalance between demands and resources contributes to lowered vitality and heightened need for recovery, eventually resulting in burnout, sickness absence and retention problems. Until now stress management interventions in education focused mostly on strengthening the individual capacity to cope with stress, instead of altering the sources of stress at work at the organizational level. These interventions have been only partly effective in influencing burnout and well-being. Therefore, the “Bottom-up Innovation” project tests a two-phased participatory, primary preventive organizational level intervention (i.e. a participatory action approach) that targets and engages all workers in the primary process of schools. It is hypothesized that participating in the project results in increased occupational self-efficacy and organizational efficacy. The central research question: is an organization focused stress management intervention based on participatory action effective in reducing the need for recovery and enhancing vitality in school employees in comparison to business as usual? Methods/Design The study is designed as a controlled trial with mixed methods and three measurement moments: baseline (quantitative measures), six months and 18 months (quantitative and qualitative measures). At first follow-up short term effects of taking part in the needs assessment (phase 1) will be determined. At second follow-up the long term effects of taking part in the needs assessment will be determined as well as the effects of implemented tailored workplace solutions (phase 2). A process evaluation based on quantitative and qualitative data will shed light on whether, how and why the intervention (does not) work(s). Discussion “Bottom-up Innovation” is a combined effort of the educational sector, intervention providers and researchers. Results will

  19. Optical imaging system-based real-time image saliency extraction method

    Science.gov (United States)

    Zhao, Jufeng; Gao, Xiumin; Chen, Yueting; Feng, Huajun

    2015-04-01

    Saliency extraction has become a popular topic in imaging science. One of the challenges in image saliency extraction is to detect the saliency content efficiently with a full-resolution saliency map. Traditional methods only involve computer calculation and thus result in limitations in computational speed. An optical imaging system-based visual saliency extraction method is developed to solve this problem. The optical system is built by effectively implementing an optical Fourier process with a Fourier lens to form two frequency planes for further operation. The proposed method combines optical components and computer calculations and mainly relies on frequency selection with precise pinholes on the frequency planes to efficiently produce a saliency map. Comparison shows that the method is suitable for extracting salient information and operates in real time to generate a full-resolution saliency map with good boundaries.

  20. Bottom-up electrochemical preparation of solid-state carbon nanodots directly from nitriles/ionic liquids using carbon-free electrodes and the applications in specific ferric ion detection and cell imaging.

    Science.gov (United States)

    Niu, Fushuang; Xu, Yuanhong; Liu, Mengli; Sun, Jing; Guo, Pengran; Liu, Jingquan

    2016-03-14

    Carbon nanodots (C-dots), a new type of potential alternative to conventional semiconductor quantum dots, have attracted considerable attention in various applications including bio-chemical sensing, cell imaging, etc., due to their chemical inertness, low toxicity and flexible functionalization. Various methods including electrochemical (EC) methods have been reported for the synthesis of C-dots. However, complex procedures and/or carbon source-containing electrodes are often required. Herein, solid-state C-dots were simply prepared by bottom-up EC carbonization of nitriles (e.g. acetonitrile) in the presence of an ionic liquid [e.g. 1-butyl-3-methylimidazolium hexafluorophosphate (BMIMPF6)], using carbon-free electrodes. Due to the positive charges of BMIM(+) on the C-dots, the final products presented in a precipitate form on the cathode, and the unreacted nitriles and BMIMPF6 can be easily removed by simple vacuum filtration. The as-prepared solid-state C-dots can be well dispersed in an aqueous medium with excellent photoluminescence properties. The average size of the C-dots was found to be 3.02 ± 0.12 nm as evidenced by transmission electron microscopy. Other techniques such as UV-vis spectroscopy, fluorescence spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy were applied for the characterization of the C-dots and to analyze the possible generation mechanism. These C-dots have been successfully applied in efficient cell imaging and specific ferric ion detection. PMID:26891173

  1. Correct primary structure assessment and extensive glyco-profiling of cetuximab by a combination of intact, middle-up, middle-down and bottom-up ESI and MALDI mass spectrometry techniques.

    Science.gov (United States)

    Ayoub, Daniel; Jabs, Wolfgang; Resemann, Anja; Evers, Waltraud; Evans, Catherine; Main, Laura; Baessmann, Carsten; Wagner-Rousset, Elsa; Suckau, Detlev; Beck, Alain

    2013-01-01

    The European Medicines Agency received recently the first marketing authorization application for a biosimilar monoclonal antibody (mAb) and adopted the final guidelines on biosimilar mAbs and Fc-fusion proteins. The agency requires high similarity between biosimilar and reference products for approval. Specifically, the amino acid sequences must be identical. The glycosylation pattern of the antibody is also often considered to be a very important quality attribute due to its strong effect on quality, safety, immunogenicity, pharmacokinetics and potency. Here, we describe a case study of cetuximab, which has been marketed since 2004. Biosimilar versions of the product are now in the pipelines of numerous therapeutic antibody biosimilar developers. We applied a combination of intact, middle-down, middle-up and bottom-up electrospray ionization and matrix assisted laser desorption ionization mass spectrometry techniques to characterize the amino acid sequence and major post-translational modifications of the marketed cetuximab product, with special emphasis on glycosylation. Our results revealed a sequence error in the reported sequence of the light chain in databases and in publications, thus highlighting the potency of mass spectrometry to establish correct antibody sequences. We were also able to achieve a comprehensive identification of cetuximab's glycoforms and glycosylation profile assessment on both Fab and Fc domains. Taken together, the reported approaches and data form a solid framework for the comparability of antibodies and their biosimilar candidates that could be further applied to routine structural assessments of these and other antibody-based products. PMID:23924801

  2. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands.

    Science.gov (United States)

    Yam, Rita S W; Fan, Yen-Tzu; Wang, Tzu-Ting

    2016-03-01

    Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunctioning of these restored habitats. We conducted non-choice laboratory feeding experiments of P. canaliculata using five common macrophyte species in constructed wetlands including Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates with highest reproductive output, but all individuals fed with Phragmites showed lowest feeding rates and little growth with poorest reproductive output. Plant N and P contents were important for enhancing palatability, supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had critically deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of the poorly-fed snails in constructed wetlands dominated by the less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve selective planting strategy using macrophytes with low nutrient and high toughness, cellulose and phenolic contents. PMID:26927135

  3. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands

    Directory of Open Access Journals (Sweden)

    Rita S. W. Yam

    2016-02-01

    Full Text Available Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunctioning of these restored habitats. We conducted non-choice laboratory feeding experiments of P. canaliculata using five common macrophyte species in constructed wetlands including Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates with highest reproductive output, but all individuals fed with Phragmites showed lowest feeding rates and little growth with poorest reproductive output. Plant N and P contents were important for enhancing palatability, supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had critically deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of the poorly-fed snails in constructed wetlands dominated by the less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve selective planting strategy using macrophytes with low nutrient and high toughness, cellulose and phenolic contents.

  4. A top-down / bottom-up approach for multi-actors and multi-criteria assessment of mining projects for sustainable development. Application on Arlit Uranium mines (Niger)

    International Nuclear Information System (INIS)

    This thesis aims to appraise the relevance of using a hybrid top-down / bottom-up approach to evaluate mining projects in the perspective of sustainable development. With the advent of corporate social responsibility and sustainable development concepts, new social expectations have appeared towards companies that go beyond a sole requirement of profit earning capacity. If companies do not answer these expectations, they risk losing their social legitimacy. Traditionally associated with social, environmental, economic and political impacts and risks, mining activity is particularly concerned by these new issues. Whereas mineral resource needs have never been so high, mining companies are now expected to limit their negative effects and to take into account their different audiences' expectations in order to define, together, the terms of their social license to operate. Considering the diversity of issues, scales, actors and contexts, the challenge is real and necessitates tools to better understand the issues and to structure dialogue. Based on the case study of the Arlit uranium mines (Niger), this work shows that associating participatory approaches with structuring tools and propositions from the literature appears to be an efficient formula to better organize the diversity of issues and to build a structured dialogue between mining companies and their stakeholders. The first part presents the theoretical, institutional and sectorial contexts of the thesis. The second part exposes the work and results of the evaluation carried out in Niger. The third part draws the conclusions that can be derived from this work and presents a proposal for an evaluation framework, potentially applicable to other mining sites. (author)

  5. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands.

    Science.gov (United States)

    Yam, Rita S W; Fan, Yen-Tzu; Wang, Tzu-Ting

    2016-02-24

    Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunctioning of these restored habitats. We conducted non-choice laboratory feeding experiments of P. canaliculata using five common macrophyte species in constructed wetlands including Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates with highest reproductive output, but all individuals fed with Phragmites showed lowest feeding rates and little growth with poorest reproductive output. Plant N and P contents were important for enhancing palatability, supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had critically deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of the poorly-fed snails in constructed wetlands dominated by the less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve selective planting strategy using macrophytes with low nutrient and high toughness, cellulose and phenolic contents.

  6. Of wealth and death: materialism, mortality salience, and consumption behavior.

    Science.gov (United States)

    Kasser, T; Sheldon, K M

    2000-07-01

    Theoretical work suggests that feelings of insecurity produce materialistic behavior, but most empirical evidence is correlational in nature. We therefore experimentally activated feelings of insecurity by having some subjects write short essays about death (mortality-salience condition). In Study 1, subjects in the mortality-salience condition, compared with subjects who wrote about a neutral topic, had higher financial expectations for themselves 15 years in the future, in terms of both their overall worth and the amount they would be spending on pleasurable items such as clothing and entertainment. Study 2 extended these findings by demonstrating that subjects exposed to death became more greedy and consumed more resources in a forest-management game. Results are discussed with regard to humanistic and terror-management theories of materialism. PMID:11273398

  7. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    Science.gov (United States)

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets.
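
    The core operation of BMS, thresholding feature maps and keeping only regions not connected to the image border, can be sketched in a few lines of NumPy/SciPy. This is a simplified illustration of the idea rather than the authors' implementation; the whitening step is omitted, and the choice of channels and number of thresholds are assumptions.

        import numpy as np
        from scipy.ndimage import binary_fill_holes

        def bms_saliency(feature_maps, n_thresholds=8, seed=0):
            # Simplified Boolean Map Saliency: for random thresholds of each feature
            # map, mark pixels lying in regions fully enclosed by the boolean map
            # (i.e. surrounded regions of its complement), and accumulate them.
            rng = np.random.default_rng(seed)
            attention = np.zeros(feature_maps[0].shape, dtype=float)
            for fmap in feature_maps:
                thresholds = rng.uniform(fmap.min(), fmap.max(), size=n_thresholds)
                for t in thresholds:
                    for bmap in (fmap > t, fmap <= t):   # boolean map and its complement
                        surrounded = binary_fill_holes(bmap) & ~bmap
                        attention += surrounded
            return attention / (attention.max() + 1e-12)

        # Usage sketch: treat the channels of an RGB image as feature maps
        # saliency = bms_saliency([img[..., c].astype(float) for c in range(3)])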

  8. Saliency detection and edge feature matching approach for crater extraction

    Institute of Scientific and Technical Information of China (English)

    An Liu; Donghua Zhou; Lixin Chen; Maoyin Chen

    2015-01-01

    Craters are salient terrain features on planetary surfaces, and provide useful information about the relative dating of geological units of planets. In addition, they are ideal landmarks for spacecraft navigation. Due to low contrast and uneven illumination, automatic extraction of craters remains a challenging task. This paper presents a saliency detection method for crater edges and a feature matching algorithm based on edge information. The craters are extracted through saliency edge detection, edge extraction and selection, feature matching of the same crater edges and robust ellipse fitting. In the edge matching algorithm, a crater feature model is proposed by analyzing the relationship between highlight region edges and shadow region ones. Then, crater edges are paired through the effective matching algorithm. Experiments on real planetary images show that the proposed approach is robust to different lights and topographies, and the detection rate is larger than 90%.

  9. Visual Saliency and Attention as Random Walks on Complex Networks

    CERN Document Server

    Costa, L F

    2006-01-01

    The unmatched versatility of vision in mammals is totally dependent on purposive eye movements and selective attention guided by saliencies in the presented images. The current article shows how concepts and tools from the areas of random walks, Markov chains, complex networks and artificial image analysis can be naturally combined in order to provide a unified and biologically plausible model for saliency detection and visual attention, which become indistinguishable in the process. Images are converted into complex networks by considering pixels as nodes while connections are established in terms of fields of influence defined by visual features such as tangent fields induced by luminance contrasts, distance, and size. Random walks are performed on such networks in order to emulate attentional shifts and even eye movements in the case of large shapes, and the frequency of visits to each node is conveniently obtained from the eigenequation defined by the stochastic matrix associated to the respectively drive...
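
    A minimal sketch of this idea in Python: pixels become network nodes, edge weights combine feature dissimilarity with spatial proximity (a simplification in the spirit of graph-based saliency models rather than the paper's fields of influence), and the visit frequency is the stationary distribution of the resulting Markov chain. It is only practical on small or downsampled images, since the transition matrix is dense.

        import numpy as np

        def random_walk_saliency(image, sigma_dist=10.0):
            # image: small 2-D array of luminance values (downsample larger images).
            h, w = image.shape
            ys, xs = np.mgrid[0:h, 0:w]
            pos = np.stack([ys.ravel(), xs.ravel()], axis=1).astype(float)
            feat = image.ravel().astype(float)

            # Edge weights: feature dissimilarity modulated by spatial proximity,
            # so the walk is drawn toward locally contrasting (salient) nodes.
            d_feat = np.abs(feat[:, None] - feat[None, :])
            d_pos = ((pos[:, None, :] - pos[None, :, :]) ** 2).sum(-1)
            W = d_feat * np.exp(-d_pos / (2 * sigma_dist ** 2)) + 1e-12
            np.fill_diagonal(W, 0.0)

            P = W / W.sum(axis=1, keepdims=True)   # row-stochastic transition matrix
            vals, vecs = np.linalg.eig(P.T)        # stationary distribution = left eigenvector
            pi = np.real(vecs[:, np.argmax(np.real(vals))])
            pi = np.abs(pi) / np.abs(pi).sum()
            return pi.reshape(h, w)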

  10. Saliency of product origin information in consumer choices

    OpenAIRE

    Dmitrović, Tanja; Vida, Irena

    2015-01-01

    In light of increasing market complexities resulting from globalization of consumer products/brands in both mature and transitional markets, this study examines the saliency of information related to the national origin of products in consumer choice behavior. Specifically, effects of consumer ethnocentrism and the type of product/service category on perceived importance of product origin information are investigated. Analysis of data collected on a sample of adult consumers suggests that sal...

  11. Bottom-up electrochemical preparation of solid-state carbon nanodots directly from nitriles/ionic liquids using carbon-free electrodes and the applications in specific ferric ion detection and cell imaging

    Science.gov (United States)

    Niu, Fushuang; Xu, Yuanhong; Liu, Mengli; Sun, Jing; Guo, Pengran; Liu, Jingquan

    2016-03-01

    Carbon nanodots (C-dots), a new type of potential alternative to conventional semiconductor quantum dots, have attracted considerable attention for applications including bio-chemical sensing and cell imaging, due to their chemical inertness, low toxicity and flexible functionalization. Various methods, including electrochemical (EC) methods, have been reported for the synthesis of C-dots. However, complex procedures and/or carbon source-containing electrodes are often required. Herein, solid-state C-dots were simply prepared by bottom-up EC carbonization of nitriles (e.g. acetonitrile) in the presence of an ionic liquid [e.g. 1-butyl-3-methylimidazolium hexafluorophosphate (BMIMPF6)], using carbon-free electrodes. Due to the positive charges of BMIM+ on the C-dots, the final products presented in a precipitate form on the cathode, and the unreacted nitriles and BMIMPF6 could be easily removed by simple vacuum filtration. The as-prepared solid-state C-dots can be well dispersed in an aqueous medium with excellent photoluminescence properties. The average size of the C-dots was found to be 3.02 +/- 0.12 nm as evidenced by transmission electron microscopy. Other techniques such as UV-vis spectroscopy, fluorescence spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy were applied to characterize the C-dots and to analyze the possible generation mechanism. These C-dots have been successfully applied in efficient cell imaging and specific ferric ion detection.

  12. Integrating top-down and bottom-up approaches to design a cost-effective and equitable programme of measures for adaptation of a river basin to global change

    Science.gov (United States)

    Girard, Corentin; Rinaudo, Jean-Daniel; Pulido-Velazquez, Manuel

    2016-04-01

    Adaptation to the multiple facets of global change challenges the conventional means of sustainably planning and managing water resources at the river basin scale. Numerous demand or supply management options are available, from which adaptation measures need to be selected in a context of high uncertainty of future conditions. Given the interdependency of water users, agreements need to be found at the local level to implement the most effective adaptation measures. Therefore, this work develops an approach combining economics and water resources engineering to select a cost-effective programme of adaptation measures in the context of climate change uncertainty, and to define an equitable allocation of the cost of the adaptation plan between the stakeholders involved. A framework is developed to integrate inputs from the two main approaches commonly used to plan for adaptation. The first, referred to as "top-down", consists of a modelling chain going from global greenhouse gases emission scenarios to local hydrological models used to assess the impact of climate change on water resources. Conversely, the second approach, called "bottom-up", starts from assessing vulnerability at the local level to then identify adaptation measures used to face an uncertain future. The methodological framework presented in this contribution relies on a combination of these two approaches to support the selection of adaptation measures at the local level. Outcomes from these two approaches are integrated to select a cost-effective combination of adaptation measures through a least-cost optimization model developed at the river basin scale. The performances of a programme of measures are assessed under different climate projections to identify cost-effective and least-regret adaptation measures. The issue of allocating the cost of the adaptation plan is considered through two complementary perspectives. The outcome of a negotiation process between the stakeholders is modelled through
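
    The least-cost selection at the core of such a programme of measures can be illustrated with a toy greedy cost-effectiveness ranking; the measure names and numbers below are hypothetical, and the study itself uses a full optimization model at the basin scale.

        # Illustrative only: hypothetical measures and numbers, not from the study.
        measures = [
            # (name, annualized cost in M EUR/yr, expected water saving in hm3/yr)
            ("drip irrigation upgrade", 4.0, 20.0),
            ("urban leakage reduction", 2.5, 8.0),
            ("wastewater reuse plant",  6.0, 15.0),
            ("crop pattern change",     1.0, 5.0),
        ]
        deficit_target = 30.0  # hm3/yr to be recovered under a climate scenario

        def least_cost_programme(measures, target):
            # Greedy ranking by cost per hm3 saved until the target is met.
            ranked = sorted(measures, key=lambda m: m[1] / m[2])
            selected, saved, cost = [], 0.0, 0.0
            for name, c, s in ranked:
                if saved >= target:
                    break
                selected.append(name)
                saved += s
                cost += c
            return selected, saved, cost

        programme, saved, cost = least_cost_programme(measures, deficit_target)
        print(programme, f"saves {saved} hm3/yr at {cost} M EUR/yr")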

  13. A novel bottom-up process to produce thymopentin nanoparticles and their formulation optimization

    Institute of Scientific and Technical Information of China (English)

    单紫筠; 谭银合; 杨志文; 余思琴; 陈宝; 吴传斌

    2012-01-01

    Objective: To develop a bottom-up process for preparing thymopentin (TP-5) nanoparticles and to optimize the formulation, as a basis for a pressurized metered dose inhaler (pMDI) of TP-5. Methods: A solution of TP-5, lecithin and lactose in a tert-butyl alcohol (TBA)/water co-solvent system was freeze-dried to generate nanoparticles; the lyophilizate was suspended in isopropanol and residual lecithin was removed by centrifugation to obtain pure drug nanoparticles. Formulation parameters, namely the water content of the TBA/water co-solvent, the lecithin content of the organic phase and the TP-5 content of the aqueous phase, were optimized with a central composite design-response surface methodology. Because the retained TP-5 content did not vary significantly with these parameters, only the particle size and size distribution of the TP-5 nanoparticles were taken as response variables. Results: The optimal ratios of water to TBA, lecithin to TBA and TP-5 to water were 0.5 (mL:mL), 213.5 (mg:mL) and 17.0 (mg:mL), respectively, corresponding to 1.5 mL of water, 640.57 mg of lecithin, 25.57 mg of TP-5 and 3.0 mL of TBA. Nanoparticles prepared with this formulation had a particle size of about 150 nm, a polydispersity index below 0.1 and a retained TP-5 content above 98%. Conclusion: The method produces TP-5 nanoparticles of good quality with good reproducibility and simple operation, and shows promise for further application.
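
    The response-surface step in such an optimization amounts to fitting a full quadratic model to a central composite design by least squares; the sketch below uses invented coded levels and particle sizes purely for illustration and is not the study's data.

        import numpy as np
        from itertools import combinations

        def quadratic_design_matrix(X):
            # Full second-order model: intercept, linear, interaction and squared terms.
            n, k = X.shape
            cols = [np.ones(n)]
            cols += [X[:, i] for i in range(k)]
            cols += [X[:, i] * X[:, j] for i, j in combinations(range(k), 2)]
            cols += [X[:, i] ** 2 for i in range(k)]
            return np.column_stack(cols)

        # Hypothetical coded levels for (water, lecithin, TP-5) in a central composite design.
        a = 1.68  # axial distance
        X = np.array([[-1, -1, -1], [1, -1, -1], [-1, 1, -1], [1, 1, -1],
                      [-1, -1, 1], [1, -1, 1], [-1, 1, 1], [1, 1, 1],
                      [a, 0, 0], [-a, 0, 0], [0, a, 0], [0, -a, 0],
                      [0, 0, a], [0, 0, -a], [0, 0, 0], [0, 0, 0]], dtype=float)
        y = np.array([215., 190., 200., 172., 208., 181., 193., 160.,
                      185., 212., 178., 206., 198., 202., 152., 155.])  # particle size (nm), invented

        beta, *_ = np.linalg.lstsq(quadratic_design_matrix(X), y, rcond=None)
        predict = lambda x: (quadratic_design_matrix(np.atleast_2d(x)) @ beta)[0]
        print(predict([0.5, 0.2, -0.3]))  # predicted size at a candidate coded formulation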

  14. Key Object Discovery and Tracking Based on Context-Aware Saliency

    Directory of Open Access Journals (Sweden)

    Geng Zhang

    2013-01-01

    Full Text Available In this paper, we propose an online key object discovery and tracking system based on visual saliency. We formulate the problem as a temporally consistent binary labelling task on a conditional random field and solve it by using a particle filter. We also propose a context‐aware saliency measurement, which can be used to improve the accuracy of any static or dynamic saliency maps. Our refined saliency maps provide clearer indications as to where the key object lies. Based on good saliency cues, we can further segment the key object inside the resulting bounding box, considering the spatial and temporal context. We tested our system extensively on different video clips. The results show that our method has significantly improved the saliency maps and tracks the key object accurately.

  15. Low-Complexity Saliency Detection Algorithm for Fast Perceptual Video Coding

    Directory of Open Access Journals (Sweden)

    Pengyu Liu

    2013-01-01

    Full Text Available A low-complexity saliency detection algorithm for perceptual video coding is proposed; low-level encoding information is adopted as the characteristics of visual perception analysis. Firstly, this algorithm employs motion vector (MV to extract temporal saliency region through fast MV noise filtering and translational MV checking procedure. Secondly, spatial saliency region is detected based on optimal prediction mode distributions in I-frame and P-frame. Then, it combines the spatiotemporal saliency detection results to define the video region of interest (VROI. The simulation results validate that the proposed algorithm can avoid a large amount of computation work in the visual perception characteristics analysis processing compared with other existing algorithms; it also has better performance in saliency detection for videos and can realize fast saliency detection. It can be used as a part of the video standard codec at medium-to-low bit-rates or combined with other algorithms in fast video coding.
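
    As an illustration of the temporal part of such a scheme, a toy residual-motion saliency map can be computed from a block-level motion-vector field by removing the dominant translational motion and median-filtering the MV noise (a simplification, not a reproduction, of the paper's filtering and checking procedure).

        import numpy as np
        from scipy.ndimage import median_filter

        def temporal_saliency_from_mv(mv_x, mv_y):
            # mv_x, mv_y: 2-D arrays of block-level motion-vector components.
            # Dominant translational motion approximated by the median MV.
            gx, gy = np.median(mv_x), np.median(mv_y)
            rx = median_filter(mv_x - gx, size=3)   # simple MV noise suppression
            ry = median_filter(mv_y - gy, size=3)
            sal = np.hypot(rx, ry)                  # residual motion magnitude
            return sal / (sal.max() + 1e-12)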

  16. A top-down / bottom-up approach for multi-actors and multi-criteria assessment of mining projects for sustainable development. Application on Arlit Uranium mines (Niger); Une demarche Top-Down / Bottom-Up pour l'evaluation en termes multicriteres et multi-acteurs des projets miniers dans l'optique du developpement durable. Application sur les mines d'Uranium d'Arlit (Niger)

    Energy Technology Data Exchange (ETDEWEB)

    Chamaret, A

    2007-06-15

    This thesis aims to appraise the relevance of using a hybrid top-down / bottom-up approach to evaluate mining projects in the perspective of sustainable development. With the advent of corporate social responsibility and sustainable development concepts, new social expectations have appeared towards companies that go beyond a sole requirement of profit earning capacity. If companies do not answer these expectations, they risk losing their social legitimacy. Traditionally associated with social, environmental, economic and political impacts and risks, mining activity is particularly concerned by these new issues. Whereas mineral resource needs have never been so high, mining companies are now expected to limit their negative effects and to take into account their different audiences' expectations in order to define, together, the terms of their social license to operate. Considering the diversity of issues, scales, actors and contexts, the challenge is real and necessitates tools to better understand the issues and to structure dialogue. Based on the case study of the Arlit uranium mines (Niger), this work shows that associating participatory approaches with structuring tools and propositions from the literature appears to be an efficient formula to better organize the diversity of issues and to build a structured dialogue between mining companies and their stakeholders. The first part presents the theoretical, institutional and sectorial contexts of the thesis. The second part exposes the work and results of the evaluation carried out in Niger. The third part draws the conclusions that can be derived from this work and presents a proposal for an evaluation framework, potentially applicable to other mining sites. (author)

  17. Image Transformation using Modified Kmeans clustering algorithm for Parallel saliency map

    Directory of Open Access Journals (Sweden)

    Aman Sharma

    2013-08-01

    Full Text Available The aim is to design an image transformation system; depending on the transform chosen, the input and output images may appear entirely different and have different interpretations. The transformation is built from modules such as the input image, an image cluster index, the objects in each cluster and a colour-index transformation of the image. A K-means clustering algorithm is used to cluster the image for better segmentation. In the proposed method, a parallel saliency algorithm combined with K-means clustering is used to avoid local minima and to find the saliency map. The reason for using the parallel saliency algorithm is that it proves to perform better than existing saliency algorithms.
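
    A minimal sketch of the K-means clustering step with scikit-learn, together with a toy cluster-level saliency score based on global colour contrast (the parallel saliency algorithm itself is not reproduced here).

        import numpy as np
        from sklearn.cluster import KMeans

        def cluster_image(image_rgb, k=5, seed=0):
            # Cluster pixel colours with K-means; return a label map and cluster centres.
            h, w, _ = image_rgb.shape
            pixels = image_rgb.reshape(-1, 3).astype(float)
            km = KMeans(n_clusters=k, n_init=10, random_state=seed).fit(pixels)
            return km.labels_.reshape(h, w), km.cluster_centers_

        def cluster_saliency(labels, centers):
            # Toy cluster-level saliency: colour distance of each cluster centre
            # from the mean image colour, assigned uniformly to its pixels.
            scores = np.linalg.norm(centers - centers.mean(axis=0), axis=1)
            scores /= scores.max() + 1e-12
            return scores[labels]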

  18. Data Acquisition Strategy for Mass Spectrometers Applied to Bottom-up-based Protein Identification

    Institute of Scientific and Technical Information of China (English)

    徐长明; 张纪阳; 张伟; 谢红卫

    2013-01-01

    The high complexity of the proteome poses great challenges to mass spectrometry-based protein identification, and these technical requirements continuously drive the development of mass spectrometry. Advances in the hardware and software of instrument platforms provide more choices and support for protein identification. However, to make the best use of instrument performance, a high-quality data acquisition strategy must be designed according to the specific biological problem and the characteristics of the sample. Here, the data acquisition strategies developed for mass spectrometers in high-throughput protein identification are reviewed. Simple repetition, ion exclusion, ion inclusion, online intelligent data acquisition and segmented scanning techniques used in the bottom-up strategy are highlighted, and the impact of these strategies on high-throughput protein identification is discussed. Finally, the advantages and disadvantages of the various strategies are summarized and future directions for developing mass spectrometry data acquisition strategies are outlined.
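
    As one concrete illustration of these strategies, the dynamic exclusion step used in data-dependent acquisition can be sketched as a simple precursor-selection function; the tolerances, window and data structures are invented for illustration and do not describe any particular instrument's implementation.

        def select_precursors(candidates, exclusion, rt, top_n=10,
                              mz_tol=0.01, rt_window=30.0):
            # Toy data-dependent acquisition step: pick the top-N most intense
            # precursors that are not on the dynamic exclusion list.
            # candidates: list of (mz, intensity); exclusion: list of (mz, rt_selected).
            def excluded(mz):
                return any(abs(mz - emz) <= mz_tol and (rt - ert) <= rt_window
                           for emz, ert in exclusion)

            picked = []
            for mz, inten in sorted(candidates, key=lambda c: -c[1]):
                if len(picked) == top_n:
                    break
                if not excluded(mz):
                    picked.append((mz, inten))
                    exclusion.append((mz, rt))   # exclude this m/z for rt_window seconds
            return picked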

  19. Sex ratio influences the motivational salience of facial attractiveness

    OpenAIRE

    Hahn, A. C; Fisher, C. I.; DeBruine, L. M.; Jones, B. C.

    2014-01-01

    The sex ratio of the local population influences mating-related behaviours in many species. Recent experiments show that male-biased sex ratios increase the amount of financial resources men will invest in potential mates, suggesting that sex ratios influence allocation of mating effort in humans. To investigate this issue further, we tested for effects of cues to the sex ratio of the local population on the motivational salience of attractiveness in own-sex and opposite-sex faces. We did thi...

  20. "Always in My Face": An Exploration of Social Class Consciousness, Salience, and Values

    Science.gov (United States)

    Martin, Georgianna L.

    2015-01-01

    This qualitative study explores social class consciousness, salience, and values of White, low-income, first-generation college students. Overall, participants minimized the salience of social class as an aspect of their identity with many of them expressing that they did not want their social class to define them. Although participants largely…

  1. Transformational leadership and employees career salience; an empirical study conducted on banks of Pakistan

    OpenAIRE

    Tabassum Riaz; Muhammad Ramzan; Hafiz Muhammad Ishaq; Muhammad Umair Akram

    2012-01-01

    The following study examines the relationship between transformational leadership and employees' career salience. This research is conducted to answer the question of whether employees' career salience is associated with transformational leadership. This study focuses only on the banking sector. Transformational leadership is measured using its four dimensions, i.e. idealized influence, intellectual stimulation, inspirational motivation and individualized considerations, relationship is determ...

  2. How Important Are Items on a Student Evaluation? A Study of Item Salience

    Science.gov (United States)

    Hills, Stacey Barlow; Naegle, Natali; Bartkus, Kenneth R.

    2009-01-01

    Although student evaluations of teaching (SETs) have been the subject of numerous research studies, the salience of SET items to students has not been examined. In the present study, the authors surveyed 484 students from a large public university. The authors suggest that not all items are viewed equally and that measures of item salience can…

  3. Emotional salience, emotional awareness, peculiar beliefs, and magical thinking.

    Science.gov (United States)

    Berenbaum, Howard; Boden, M Tyler; Baker, John P

    2009-04-01

    Two studies with college student participants (Ns = 271 and 185) tested whether peculiar beliefs and magical thinking were associated with (a) the emotional salience of the stimuli about which individuals may have peculiar beliefs or magical thinking, (b) attention to emotion, and (c) clarity of emotion. Study 1 examined belief that a baseball team was cursed. Study 2 measured magical thinking using a procedure developed by P. Rozin and C. Nemeroff (2002). In both studies, peculiar beliefs and magical thinking were associated with Salience x Attention x Clarity interactions. Among individuals for whom the objects of the belief-magical thinking were highly emotionally salient and who had high levels of attention to emotion, higher levels of emotional clarity were associated with increased peculiar beliefs-magical thinking. In contrast, among individuals for whom the objects of the belief-magical thinking were not emotionally salient and who had high levels of attention to emotion, higher levels of emotional clarity were associated with diminished peculiar beliefs-magical thinking. PMID:19348532

  4. Scalable mobile image retrieval by exploring contextual saliency.

    Science.gov (United States)

    Yang, Xiyu; Qian, Xueming; Xue, Yao

    2015-06-01

    Nowadays, it is very convenient to capture photos with a smart phone. Since the smart phone is a convenient way to share what users experience anytime and anywhere through social networks, it is very likely that we capture multiple photos to make sure the content is well photographed. In this paper, an effective scalable mobile image retrieval approach is proposed by exploring contextual salient information for the input query image. Our goal is to explore the high-level semantic information of an image by finding the contextual saliency from multiple relevant photos rather than solely using the input image. Thus, the proposed mobile image retrieval approach first determines the relevant photos according to visual similarity, then mines salient features by exploring contextual saliency from multiple relevant images, and finally determines the contributions of salient features for scalable retrieval. Compared with existing mobile-based image retrieval approaches, our approach requires less bandwidth and has better retrieval performance. Experimental results show the effectiveness of the proposed approach. PMID:25775488

  5. Perceptual Object Extraction Based on Saliency and Clustering

    Directory of Open Access Journals (Sweden)

    Qiaorong Zhang

    2010-08-01

    Full Text Available Object-based visual attention has received increasing interest in recent years. The perceptual object is the basic attention unit of object-based visual attention, and its definition and extraction is one of the key technologies in object-based visual attention computation models. A novel perceptual object definition and extraction method is proposed in this paper. Based on Gestalt theory and visual feature integration theory, a perceptual object is defined using homogeneity regions, salient regions and edges. An improved saliency map generation algorithm is employed first. Based on the saliency map, salient edges are extracted. Then a graph-based clustering algorithm is introduced to obtain homogeneity regions in the image. Finally, an integration strategy is adopted to combine salient edges and homogeneity regions to extract perceptual objects. The proposed perceptual object extraction method has been tested on a large set of natural images. Experimental results and analysis are also presented in this paper. The results show that the proposed method is reasonable and valid.

  6. Reward positivity: Reward prediction error or salience prediction error?

    Science.gov (United States)

    Heydari, Sepideh; Holroyd, Clay B

    2016-08-01

    The reward positivity is a component of the human ERP elicited by feedback stimuli in trial-and-error learning and guessing tasks. A prominent theory holds that the reward positivity reflects a reward prediction error signal that is sensitive to outcome valence, being larger for unexpected positive events relative to unexpected negative events (Holroyd & Coles, 2002). Although the theory has found substantial empirical support, most of these studies have utilized either monetary or performance feedback to test the hypothesis. However, in apparent contradiction to the theory, a recent study found that unexpected physical punishments also elicit the reward positivity (Talmi, Atkinson, & El-Deredy, 2013). The authors of this report argued that the reward positivity reflects a salience prediction error rather than a reward prediction error. To investigate this finding further, in the present study participants navigated a virtual T maze and received feedback on each trial under two conditions. In a reward condition, the feedback indicated that they would either receive a monetary reward or not and in a punishment condition the feedback indicated that they would receive a small shock or not. We found that the feedback stimuli elicited a typical reward positivity in the reward condition and an apparently delayed reward positivity in the punishment condition. Importantly, this signal was more positive to the stimuli that predicted the omission of a possible punishment relative to stimuli that predicted a forthcoming punishment, which is inconsistent with the salience hypothesis. PMID:27184070
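
    The conceptual distinction being tested, a signed reward prediction error versus an unsigned salience prediction error, can be written out in a few lines (a textbook formulation, not the authors' computational model).

        def prediction_errors(value, outcome, alpha=0.1):
            # Signed reward prediction error vs. unsigned 'salience' prediction
            # error for one trial, plus a Rescorla-Wagner value update.
            rpe = outcome - value            # signed: positive for better-than-expected
            spe = abs(outcome - value)       # unsigned: large for any surprising outcome
            new_value = value + alpha * rpe
            return rpe, spe, new_value

        # Example: expected value 0.5 followed by an unexpected punishment (-1.0)
        print(prediction_errors(0.5, -1.0))  # rpe = -1.5, spe = 1.5, updated value = 0.35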

  7. Inherent Difference in Saliency for Generators with Different PM Materials

    Directory of Open Access Journals (Sweden)

    Sandra Eriksson

    2014-01-01

    Full Text Available The inherent differences between salient and nonsalient electrical machines are evaluated for two permanent magnet generators with different configurations. The neodymium based (NdFeB) permanent magnets (PMs) in a generator are substituted with ferrite magnets and the characteristics of the NdFeB generator and the ferrite generator are compared through FEM simulations. The NdFeB generator is a nonsalient generator, whereas the ferrite machine is a salient-pole generator, with small saliency. The two generators have almost identical properties at rated load operation. However, at overload the behaviour differs between the two generators. The salient-pole, ferrite generator has lower maximum torque than the NdFeB generator and a larger voltage drop at high current. It is concluded that, for applications where overload capability is important, saliency must be considered and the generator design adapted according to the behaviour at overload operation. Furthermore, if the maximum torque is the design criterion, additional PM mass will be required for the salient-pole machine.
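
    The overload behaviour contrasted above follows from the standard dq-frame torque expression for a permanent magnet synchronous machine (a textbook relation, not taken from the paper), in which the reluctance term vanishes for the nonsalient NdFeB design (L_d = L_q) and contributes for the salient-pole ferrite design:

        T_e = \frac{3}{2}\, p \left[ \psi_{\mathrm{PM}}\, i_q + (L_d - L_q)\, i_d\, i_q \right]

    Here p is the number of pole pairs, \psi_{\mathrm{PM}} the PM flux linkage, and i_d, i_q the dq-axis currents.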

  8. Multimodal region-consistent saliency based on foreground and background priors for indoor scene

    Science.gov (United States)

    Zhang, J.; Wang, Q.; Zhao, Y.; Chen, S. Y.

    2016-09-01

    Visual saliency is a very important feature for object detection in a complex scene. However, image-based saliency is influenced by cluttered backgrounds and similar objects in indoor scenes, and pixel-based saliency cannot provide consistent saliency to a whole object. Therefore, in this paper, we propose a novel method that computes visual saliency maps from multimodal data obtained from indoor scenes, whilst keeping region consistency. Multimodal data from a scene are first obtained by an RGB+D camera. This scene is then segmented into over-segments by a self-adapting approach to combine its colour image and depth map. Based on these over-segments, we develop two cues as domain knowledge to improve the final saliency map, including focus regions obtained from colour images, and planar background structures obtained from point cloud data. Thus, our saliency map is generated by compounding the information of the colour data, the depth data and the point cloud data in a scene. In the experiments, we extensively compare the proposed method with state-of-the-art methods, and we also apply the proposed method to a real robot system to detect objects of interest. The experimental results show that the proposed method outperforms other methods in terms of precision and recall rates.

  9. Salience and Attention in Surprisal-Based Accounts of Language Processing

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525
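
    For reference, the surprisal of a word in such information-theoretic accounts is its negative log-probability given the preceding context (a standard definition, not specific to this paper):

        S(w_t) = -\log_2 P(w_t \mid w_1, \ldots, w_{t-1})

    Higher surprisal corresponds to less predictable material, which is typically associated with greater processing cost.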

  10. Learning to predict where human gaze is using quaternion DCT based regional saliency detection

    Science.gov (United States)

    Li, Ting; Xu, Yi; Zhang, Chongyang

    2014-09-01

    Many current visual attention approaches use semantic features to accurately capture human gaze. However, these approaches demand high computational cost and can hardly be applied in daily use. Recently, some quaternion-based saliency detection models, such as PQFT (phase spectrum of Quaternion Fourier Transform) and QDCT (Quaternion Discrete Cosine Transform), have been proposed to meet the real-time requirements of human gaze tracking tasks. However, current saliency detection methods use global PQFT and QDCT to locate jump edges of the input, which can hardly detect object boundaries accurately. To address the problem, we improve the QDCT-based saliency detection model by introducing a superpixel-wise regional saliency detection mechanism. The local smoothness of the saliency value distribution is emphasized to distinguish background noise from salient regions. Our algorithm, called saliency confidence, can distinguish the patches belonging to the salient object from those of the background. It decides whether image patches belong to the same region. When an image patch belongs to a region consisting of other salient patches, this patch should be salient as well. Therefore, we use the saliency confidence map to derive background and foreground weights and optimize the saliency map obtained by QDCT. The optimization is accomplished by a least-squares method. The proposed optimization approach unifies local and global saliency by combining QDCT with the similarity between image superpixels. We evaluate our model on four commonly used datasets (Toronto, MIT, OSIE and ASD) using standard precision-recall curves (PR curves), the mean absolute error (MAE) and area under curve (AUC) measures. In comparison with most state-of-the-art models, our approach achieves higher consistency with human perception without training. It can estimate human gaze accurately even in cluttered backgrounds. Furthermore, it achieves a better compromise between speed and accuracy.
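
    The least-squares refinement can be sketched as the closed-form minimizer of foreground, background and pairwise-smoothness terms over superpixels (a generic formulation consistent with the description above, not the authors' exact code; the weights would be derived from the saliency confidence map and the QDCT map).

        import numpy as np

        def refine_saliency(w_fg, w_bg, W, lam=1.0):
            # Minimize  sum_i w_fg[i]*(s_i - 1)^2 + w_bg[i]*s_i^2
            #         + lam * sum_{i,j} W[i, j]*(s_i - s_j)^2
            # over per-superpixel saliency s; the optimum solves a linear system.
            # w_fg, w_bg: per-superpixel foreground/background weights;
            # W: symmetric superpixel affinity matrix.
            L = np.diag(W.sum(axis=1)) - W                 # graph Laplacian (smoothness)
            A = np.diag(w_fg + w_bg) + 2.0 * lam * L
            return np.linalg.solve(A, w_fg.astype(float))  # foreground term pulls s toward 1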

  11. Salience and Attention in Surprisal-Based Accounts of Language Processing.

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  12. Salience and attention in surprisal-based accounts of language processing

    Directory of Open Access Journals (Sweden)

    Alessandra eZarcone

    2016-06-01

    Full Text Available The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g. visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g. prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalise upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus.

  13. Effects of norm referent salience on young people's dietary orientation.

    Science.gov (United States)

    Tarrant, Mark; Khan, Sammyh S; Qin, Qi

    2015-02-01

    We examined the effects of making salient different norm referents on young people's dietary orientation. Participants were exposed to a referent who was either of similar age to themselves or older before reporting their normative beliefs, attitudes and intentions concerning dietary behavior. As predicted, exposure to the older referent was associated with stronger perceptions that eating five portions of fruit and vegetables each day was normative. Compared to those exposed to the same-age referent, participants exposed to the older referent reported more positive attitudes towards eating "five-a-day" and stronger intentions to do so over the coming week. Referent salience was also associated with a behavioral outcome, with those participants exposed to the older referent more likely to take a piece of fruit upon completion of the study (OR: 4.97, 95% CI: 1.39-17.82). The implications of these findings for norms-based interventions for changing dietary behavior are discussed. PMID:25447012

  14. Place recognition based on saliency for topological localization

    Institute of Scientific and Technical Information of China (English)

    WANG Lu; CAI Zi-xing

    2006-01-01

    Based on salient visual regions for mobile robot navigation in unknown environments, a new place recognition system is presented. The system uses a monocular camera to acquire omni-directional images of the environment where the robot is located. Salient local regions are detected from these images using a center-surround difference method, which computes opponencies of color and texture across multi-scale image spaces. They are then organized using a hidden Markov model (HMM) to form the vertices of a topological map, so localization, that is, place recognition in our system, can be converted into the evaluation of HMMs. Experimental results show that the saliency detection is immune to changes of scale, 2D rotation, viewpoint, etc. The created topological map is smaller and a higher recognition rate is obtained.
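
    Evaluating an HMM for place recognition means computing the likelihood of an observation sequence under each place model, which the forward algorithm provides; below is a minimal, generic NumPy sketch (not the paper's specific parameterization).

        import numpy as np

        def hmm_log_likelihood(pi, A, B, obs):
            # Scaled forward algorithm: log P(obs | model).
            # pi: initial state probabilities (N,); A: state transitions (N, N);
            # B: emission probabilities (N, M); obs: sequence of observation indices.
            alpha = pi * B[:, obs[0]]
            c = alpha.sum()
            log_lik = np.log(c)
            alpha = alpha / c
            for o in obs[1:]:
                alpha = (alpha @ A) * B[:, o]
                c = alpha.sum()
                log_lik += np.log(c)
                alpha = alpha / c
            return log_lik

        # Place recognition: pick the place model with the highest likelihood, e.g.
        # best_place = max(models, key=lambda m: hmm_log_likelihood(m.pi, m.A, m.B, obs))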

  15. Incentive salience attribution under reward uncertainty: A Pavlovian model.

    Science.gov (United States)

    Anselme, Patrick

    2015-02-01

    There is a vast literature on the behavioural effects of partial reinforcement in Pavlovian conditioning. Compared with animals receiving continuous reinforcement, partially rewarded animals typically show (a) a slower development of the conditioned response (CR) early in training and (b) a higher asymptotic level of the CR later in training. This phenomenon is known as the partial reinforcement acquisition effect (PRAE). Learning models of Pavlovian conditioning fail to account for it. In accordance with the incentive salience hypothesis, it is here argued that incentive motivation (or 'wanting') plays a more direct role in controlling behaviour than does learning, and reward uncertainty is shown to have an excitatory effect on incentive motivation. The psychological origin of that effect is discussed and a computational model integrating this new interpretation is developed. Many features of CRs under partial reinforcement emerge from this model.

  16. Mortality salience increases defensive distancing from people with terminal cancer.

    Science.gov (United States)

    Smith, Lauren M; Kasser, Tim

    2014-01-01

    Based on principles of terror management theory, the authors hypothesized that participants would distance more from a target person with terminal cancer than from a target with arthritis, and that this effect would be stronger following mortality salience. In Study 1, adults rated how similar their personalities were to a target person; in Study 2, participants arranged two chairs in preparation for meeting the target person. Both studies found that distancing from the person with terminal cancer increased after participants wrote about their own death (vs. giving a speech). Thus, death anxiety may explain why people avoid close contact with terminally ill people; further analyses suggest that gender and self-esteem may also influence such distancing from the terminally ill. PMID:24521045

  17. Metabolic mapping reveals sex-dependent involvement of default mode and salience network in alexithymia.

    Science.gov (United States)

    Colic, L; Demenescu, L R; Li, M; Kaufmann, J; Krause, A L; Metzger, C; Walter, M

    2016-02-01

    Alexithymia, a personality construct marked by difficulties in processing one's emotions, has been linked to altered activity in the anterior cingulate cortex (ACC). Although longitudinal studies reported sex differences in alexithymia, what mediates them is not known. To investigate sex-specific associations of alexithymia and neuronal markers, we mapped metabolites in four brain regions differentially involved in emotion processing using a point-resolved spectroscopy MRS sequence at 3 Tesla. Both sexes showed negative correlations between alexithymia and N-acetylaspartate (NAA) in the pregenual ACC (pgACC). Women showed a robust negative correlation of the joint measure of glutamate and glutamine (Glx) to NAA in the posterior cingulate cortex (PCC), whereas men showed a weak positive association of Glx to NAA in the dorsal ACC (dACC). Our results suggest that lowered neuronal integrity in the pgACC, a region of the default mode network (DMN), might primarily account for the general difficulties in emotional processing in alexithymia. The association with alexithymia in women extends to another DMN region, the PCC, while in men a region of the salience network (SN) is involved. These observations could reflect sex-specific regulation strategies that include diminished internal evaluation of feelings in women and cognitive emotion suppression in men. PMID:26341904

  18. Mortality salience enhances racial in-group bias in empathic neural responses to others' suffering.

    Science.gov (United States)

    Li, Xiaoyang; Liu, Yi; Luo, Siyang; Wu, Bing; Wu, Xinhuai; Han, Shihui

    2015-09-01

    Behavioral research suggests that mortality salience (MS) leads to increased in-group identification and in-group favoritism in prosocial behavior. What remains unknown is whether and how MS influences brain activity that mediates emotional resonance with in-group and out-group members and is associated with in-group favoritism in helping behavior. The current work investigated MS effects on empathic neural responses to racial in-group and out-group members' suffering. Experiments 1 and 2 respectively recorded event related potentials (ERPs) and blood oxygen level dependent signals to pain/neutral expressions of Asian and Caucasian faces from Chinese adults who had been primed with MS or negative affect (NA). Experiment 1 found that an early frontal/central activity (P2) was more strongly modulated by pain vs. neutral expressions of Asian than Caucasian faces, but this effect was not affected by MS vs. NA priming. However, MS relative to NA priming enhanced racial in-group bias in long-latency neural response to pain expressions over the central/parietal regions (P3). Experiment 2 found that MS vs. NA priming increased racial in-group bias in empathic neural responses to pain expression in the anterior and mid-cingulate cortex. Our findings indicate that reminders of mortality enhance brain activity that differentiates between racial in-group and out-group members' emotional states and suggest a neural basis of in-group favoritism under mortality threat. PMID:26074201

  19. Hopelessly Mortal: The Role of Mortality Salience, Immortality and Trait Self-esteem in Personal Hope

    OpenAIRE

    Wisman, Arnaud; Heflick, Nathan A

    2015-01-01

    Do people lose hope when thinking about death? Based on Terror Management Theory, we predicted that thoughts of death (i.e., mortality salience) would reduce personal hope for people low, but not high, in self-esteem, and that this reduction in hope would be ameliorated by promises of immortality. In Studies 1 and 2, mortality salience reduced personal hope for people low in self-esteem, but not for people high in self-esteem. In Study 3, mortality salience reduced hope for people low in self...

  20. In search of salience: phenomenological analysis of moral distress.

    Science.gov (United States)

    Manara, Duilio F; Villa, Giulia; Moranda, Dina

    2014-07-01

    The nurse's moral competences in the management of situations which present ethical implications are less investigated in literature than other ethical problems related to clinical nursing. Phenomenology affirms that emotional warmth is the first fundamental attitude as well as the premise of any ethical reasoning. Nevertheless, it is not clear how and when this could be confirmed in situations where the effect of emotions on the nurse's decisional process is undiscovered. To explore the processes through which situations of moral distress are determined for the nurses involved in nursing situations, a phenomenological-hermeneutic analysis of a nurse's report of an experience lived by her as a moral distress situation has been conducted. Nursing emerges as a relational doctrine that requires the nurse to have different degrees of personal involvement, the integration between logical-formal thinking and narrative thinking, the perception of the salience of the given situation also through the interpretation and management of one's own emotions, and the capacity to undergo a process of co-construction of shared meanings that the others might consider adequate for the resolution of her problem. Moral action requires the nurse to think constantly about the important things that are happening in a nursing situation. Commitment towards practical situations is directed to training in order to promote the nurse's reflective ability towards finding salience in nursing situations, but it is also directed to the management of nursing assistance and human resources for the initial impact that this reflexive ability has on patients' and their families' lives and on their need to be heard and assisted. The only case analysed does not allow generalizations. Further research is needed to investigate how feelings generated by emotional acceptance influence ethical decision making and moral distress in nursing situations. PMID:24528533

  1. The time course of color- and luminance-based salience effects.

    Directory of Open Access Journals (Sweden)

    Isabel C Dombrowe

    2010-11-01

    Full Text Available Salient objects in the visual field attract our attention. Recent work in the orientation domain has shown that the effects of the relative salience of two singleton elements on covert visual attention disappear over time. The present study aims to investigate how salience derived from color and luminance differences affects covert selection. In two experiments, observers indicated the location of a probe which was presented at different stimulus-onset-asynchronies after the presentation of a singleton display containing a homogeneous array of oriented lines and two distinct color singletons (Experiment 1) or luminance singletons (Experiment 2). The results show that relative singleton salience from luminance and color differences, just as from orientation differences, affects covert visual attention in a brief time span after stimulus onset. The mere presence of an object, however, can affect covert attention for a longer time span regardless of salience.

  2. The Time Course of Color- and Luminance-Based Salience Effects

    OpenAIRE

    Mieke Donk

    2010-01-01

    Salient objects in the visual field attract our attention. Recent work in the orientation domain has shown that the effects of the relative salience of two singleton elements on covert visual attention disappear over time. The present study aims to investigate how salience derived from color and luminance differences affects covert selection. In two experiments, observers indicated the location of a probe which was presented at different stimulus-onset-asynchronies after the presentation of a...

  3. Evidence Inhibition Responds Reactively to the Salience of Distracting Information during Focused Attention

    OpenAIRE

    Natalie Wyatt; Liana Machado

    2013-01-01

    Along with target amplification, distractor inhibition is regarded as a major contributor to selective attention. Some theories suggest that the strength of inhibitory processing is proportional to the salience of the distractor (i.e., inhibition reacts to the distractor intensity). Other theories suggest that the strength of inhibitory processing does not depend on the salience of the distractor (i.e., inhibition does not react to the distractor intensity). The present study aimed to elucida...

  4. Hypergraph-based saliency map generation with potential region-of-interest approximation and validation

    Science.gov (United States)

    Liang, Zhen; Fu, Hong; Chi, Zheru; Feng, Dagan

    2012-01-01

    A novel saliency model is proposed in this paper to automatically process images in the similar way as the human visual system which focuses on conspicuous regions that catch human beings' attention. The model combines a hypergraph representation and a partitioning process with potential region-of-interest (p-ROI) approximation and validation. Experimental results demonstrate that the proposed method shows considerable improvement in the performance of saliency map generation.

  5. Competition of synonyms through time : Conceptual and social salience factors and their interrelations

    OpenAIRE

    Soares da Silva, Augusto

    2015-01-01

    This paper highlights three theoretical and descriptive insights into synonymy and lexical variation and change: (1) the diachronic development of synonymous forms reveals essential aspects about the nature and motivations of synonymy; (2) the emergence and competition of synonymous forms can either result from conceptual salience factors or from social salience factors; (3) synonym competition sheds light upon processes of language variation and change. Focusing on the interplay between conc...

  7. The Motivational Salience of Faces Is Related to Both Their Valence and Dominance.

    Science.gov (United States)

    Wang, Hongyi; Hahn, Amanda C; DeBruine, Lisa M; Jones, Benedict C

    2016-01-01

    Both behavioral and neural measures of the motivational salience of faces are positively correlated with their physical attractiveness. Whether physical characteristics other than attractiveness contribute to the motivational salience of faces is not known, however. Research with male macaques recently showed that more dominant macaques' faces hold greater motivational salience. Here we investigated whether dominance also contributes to the motivational salience of faces in human participants. Principal component analysis of third-party ratings of faces for multiple traits revealed two orthogonal components. The first component ("valence") was highly correlated with rated trustworthiness and attractiveness. The second component ("dominance") was highly correlated with rated dominance and aggressiveness. Importantly, both components were positively and independently related to the motivational salience of faces, as assessed from responses on a standard key-press task. These results show that at least two dissociable components underpin the motivational salience of faces in humans and present new evidence for similarities in how humans and non-human primates respond to facial cues of dominance. PMID:27513859
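
    As a rough illustration of the analysis pipeline described in this record, the sketch below (not the authors' code) reduces a matrix of third-party trait ratings to two principal components and relates them to a key-press measure of motivational salience. The array shapes, variable names, and the use of a plain linear regression are assumptions made for the example.

    ```python
    # Sketch of the described analysis with placeholder data: PCA over trait ratings,
    # then a regression relating the two components to a key-press salience measure.
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(0)
    trait_ratings = rng.normal(size=(100, 6))   # 100 faces x 6 rated traits (made up)
    keypress_score = rng.normal(size=100)       # motivational salience per face (made up)

    components = PCA(n_components=2).fit_transform(trait_ratings)  # "valence", "dominance"
    model = LinearRegression().fit(components, keypress_score)
    print(model.coef_)  # independent contribution of each component to salience
    ```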

  8. DeepSaliency: Multi-Task Deep Neural Network Model for Salient Object Detection.

    Science.gov (United States)

    Li, Xi; Zhao, Liming; Wei, Lina; Yang, Ming-Hsuan; Wu, Fei; Zhuang, Yueting; Ling, Haibin; Wang, Jingdong

    2016-08-01

    A key problem in salient object detection is how to effectively model the semantic properties of salient objects in a data-driven manner. In this paper, we propose a multi-task deep saliency model based on a fully convolutional neural network with global input (whole raw images) and global output (whole saliency maps). In principle, the proposed saliency model takes a data-driven strategy for encoding the underlying saliency prior information, and then sets up a multi-task learning scheme for exploring the intrinsic correlations between saliency detection and semantic image segmentation. Through collaborative feature learning from such two correlated tasks, the shared fully convolutional layers produce effective features for object perception. Moreover, it is capable of capturing the semantic information on salient objects across different levels using the fully convolutional layers, which investigate the feature-sharing properties of salient object detection with a great reduction of feature redundancy. Finally, we present a graph Laplacian regularized nonlinear regression model for saliency refinement. Experimental results demonstrate the effectiveness of our approach in comparison with the state-of-the-art approaches. PMID:27305676
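
    The multi-task idea sketched in this abstract (shared fully convolutional layers feeding separate saliency and segmentation outputs, trained with a joint loss) can be illustrated with a toy network. The sketch below is a minimal PyTorch approximation under assumed layer sizes and loss weighting; it is not the published DeepSaliency architecture, and the graph Laplacian refinement stage is omitted.

    ```python
    # Toy multi-task saliency/segmentation network (a sketch, not the published model):
    # a shared fully convolutional trunk takes the whole image as input, and two 1x1
    # heads emit a whole-image saliency map and per-pixel segmentation logits.
    import torch
    import torch.nn as nn

    class MultiTaskSaliencyNet(nn.Module):
        def __init__(self, num_seg_classes=21):  # 21 classes is an arbitrary placeholder
            super().__init__()
            self.trunk = nn.Sequential(            # shared fully convolutional layers
                nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
            )
            self.saliency_head = nn.Conv2d(64, 1, 1)           # saliency map head
            self.seg_head = nn.Conv2d(64, num_seg_classes, 1)  # segmentation head

        def forward(self, x):
            feats = self.trunk(x)                  # features shared by both tasks
            return torch.sigmoid(self.saliency_head(feats)), self.seg_head(feats)

    def multitask_loss(sal_pred, sal_gt, seg_logits, seg_gt, w=0.5):
        """Joint objective: saliency BCE plus weighted segmentation cross-entropy."""
        sal_loss = nn.functional.binary_cross_entropy(sal_pred, sal_gt)
        seg_loss = nn.functional.cross_entropy(seg_logits, seg_gt)
        return sal_loss + w * seg_loss
    ```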

  9. Visual saliency models for summarization of diagnostic hysteroscopy videos in healthcare systems.

    Science.gov (United States)

    Muhammad, Khan; Ahmad, Jamil; Sajjad, Muhammad; Baik, Sung Wook

    2016-01-01

    In clinical practice, diagnostic hysteroscopy (DH) videos are recorded in full and stored in long-term video libraries for later inspection of previous diagnoses, for research and training, and as evidence for patients' complaints. However, only a limited number of frames are required for actual diagnosis, and these can be extracted using video summarization (VS). Unfortunately, general-purpose VS methods are not very effective for DH videos due to their significant similarity in terms of color and texture, their unedited content, and their lack of shot boundaries. Therefore, in this paper, we investigate visual saliency models for effective abstraction of DH videos by extracting the diagnostically important frames. The objective of this study is to analyze the performance of various visual saliency models with consideration of domain knowledge and to nominate the best saliency model for DH video summarization in healthcare systems. Our experimental results indicate that a hybrid saliency model, comprising motion, contrast, texture, and curvature saliency, is the most suitable model for summarization of DH videos in terms of extracted keyframes and accuracy. PMID:27652068
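
    A minimal sketch of the hybrid-saliency summarization idea described above: per-frame cue maps (motion, contrast, texture, curvature) are fused into a single map, each frame is scored by its mean fused saliency, and the top-scoring frames are returned as keyframes. The cue extraction itself, the equal weighting, and the number of keyframes are placeholder assumptions rather than the evaluated models.

    ```python
    # Sketch: fuse per-frame cue maps into one saliency map, score frames by mean
    # fused saliency, and keep the highest-scoring frames as candidate keyframes.
    import numpy as np

    def combine_cues(cue_maps, weights=None):
        """cue_maps: list of HxW arrays (e.g., motion, contrast, texture, curvature)."""
        if weights is None:
            weights = [1.0 / len(cue_maps)] * len(cue_maps)
        rescaled = [(m - m.min()) / (m.max() - m.min() + 1e-8) for m in cue_maps]
        return sum(w * m for w, m in zip(weights, rescaled))

    def select_keyframes(per_frame_cue_maps, num_keyframes=10):
        """per_frame_cue_maps: one list of cue maps per video frame."""
        scores = np.array([combine_cues(cues).mean() for cues in per_frame_cue_maps])
        top = np.argsort(scores)[::-1][:num_keyframes]
        return sorted(top.tolist())  # keyframe indices in temporal order
    ```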

  10. Giving good directions: order of mention reflects visual salience

    Directory of Open Access Journals (Sweden)

    Alasdair Daniel Francis Clarke

    2015-12-01

    Full Text Available In complex stimuli, there are many different possible ways to refer to a specified target. Previous studies have shown that when people are faced with such a task, the content of their referring expression reflects visual properties such as size, salience and clutter. Here, we extend these findings and present evidence that (i) the influence of visual perception on sentence construction goes beyond content selection and in part determines the order in which different objects are mentioned and (ii) order of mention influences comprehension. Study 1 (a corpus study of reference productions) shows that when a speaker uses a relational description to mention a salient object, that object is treated as being in the common ground and is more likely to be mentioned first. Study 2 (a visual search study) asks participants to listen to referring expressions and find the specified target; in keeping with the above result, we find that search for easy-to-find targets is faster when the target is mentioned first, while search for harder-to-find targets is facilitated by mentioning the target later, after a landmark in a relational description. Our findings show that seemingly low-level and disparate mental modules like perception and sentence planning interact at a high level and in task-dependent ways.

  11. The effects of mortality salience on escalation of commitment.

    Science.gov (United States)

    Yen, Chih-Long; Lin, Chun-Yu

    2012-01-01

    Based on propositions derived from terror management theory (TMT), the current study proposes that people who are reminded of their mortality exhibit a higher degree of self-justification behavior to maintain their self-esteem. For this reason, they could be expected to stick with their previous decisions and invest an increasing amount of resources in those decisions, despite the fact that negative feedback has clearly indicated that they might be on a course toward failure (i.e., "escalation of commitment"). Our experiment showed that people who were reminded of their mortality were more likely to escalate their level of commitment by maintaining their current course of action. Two imaginary scenarios were tested. One of the scenarios involved deciding whether to send additional troops into the battlefield when previous attempts had failed; the other involved deciding whether to continue developing an anti-radar fighter plane when the enemy had already developed a device to detect it. The results supported our hypothesis that mortality salience increases the tendency to escalate one's level of commitment.

  13. [The effect of group size on salience of member desirability].

    Science.gov (United States)

    Sugimori, S

    1993-04-01

    This study tested the hypothesis that undesirable members are salient in a small group, while desirable members become salient in a larger group. One hundred and forty-five students were randomly assigned to twelve conditions, and read sentences desirably, undesirably, or neutrally describing each member of a college student club. The twelve clubs had one of three group sizes: 13, 39, or 52, and the proportion of the desirable or undesirable to the neutral was either 11:2 or 2:11, forming a three-way (3 x 2 x 2) factorial. Twelve subjects each were asked to make proportion judgments and impression ratings. Results indicated that the proportion of the undesirable members was overestimated when the group size was 13, showing negativity bias, whereas the proportion of the desirable was overestimated when the size was 52, displaying positivity bias. The size 39 showed neither positivity nor negativity bias. These results, along with those from impression ratings, suggested that the salience of member desirability interacted with group size. It is argued that illusory correlation and group cognition studies may well take these effects into consideration. PMID:8355426

  14. Relative saliency of pitch versus phonetic cues in infancy

    Science.gov (United States)

    Cardillo, Gina; Kuhl, Patricia; Sundara, Megha

    2005-09-01

    Infants in their first year are highly sensitive to different acoustic components of speech, including phonetic detail and pitch information. The present investigation examined whether relative sensitivity to these two dimensions changes during this period, as the infant acquires language-specific phonetic categories. If pitch and phonetic discrimination are hierarchical, then the relative salience of pitch and phonetic change may become reversed between 8 and 12 months of age. Thirty-two- and 47-week-old infants were tested using an auditory preference paradigm in which they first heard a recording of a person singing a 4-note song (i.e., "go-bi-la-tu") and were then presented with both the familiar and an unfamiliar, modified version of that song. Modifications were either a novel pitch order (keeping syllables constant) or a novel syllable order (keeping melody constant). Compared to the younger group, older infants were predicted to show greater relative sensitivity to syllable order than pitch order, in accordance with an increased tendency to attend to linguistically relevant information (phonetic patterns) as opposed to cues that are initially more salient (pitch patterns). Preliminary data show trends toward the predicted interaction, with preference patterns commensurate with previously reported data. [Work supported by the McDonnell Foundation and NIH.]

  15. Decisive Visual Saliency and Consumers' In-store Decisions

    DEFF Research Database (Denmark)

    Clement, Jesper; Aastrup, Jesper; Forsberg, Signe Charlotte

    2015-01-01

    This paper focuses on consumers' in-store visual tactics and decision-making. It has been argued that many consumers shop by routine or by simple rules and justification techniques when they purchase daily commodities. It has also been argued that they make a majority of decisions in the shop, and that they are affected by the visual stimuli in the store. The objective of this paper is to investigate visual saliency arising from two factors: 1) in-store signage and 2) placement of products. This is done by a triangulation method in which we utilize data from an eye-track study and sales data from grocery stores... and sale for both goods. The use of signage increases visual attention and sale as well, yet only for the product that the label addressed, implying a cannibalization effect. The limitations of the study and the implications for retail managers and for brand owners are discussed.

  16. Issue Salience and the Domestic Legitimacy Demands of European Integration. The Cases of Britain and Germany

    Directory of Open Access Journals (Sweden)

    Henrike Viehrig

    2008-04-01

    Full Text Available The salience of European issues to the general public is a major determinant of the domestic legitimacy demands that governments face when they devise their European policies. The higher the salience of these issues, the more restrictive will be the legitimacy demands that governments have to meet on the domestic level. Whereas the domestic legitimacy of European policy can rest on a permissive consensus among the public in cases of low issue salience, it requires the electorate’s explicit endorsement in cases of high issue salience. Polling data from Britain and Germany show that the salience of European issues is clearly higher in Britain than in Germany. We thus conclude that British governments face tougher domestic legitimacy demands when formulating their European policies than German governments. This may contribute to accounting for both countries’ different approaches to the integration process: Germany as a role model of a pro-integrationist member state and, in contrast, Britain as the eternal 'awkward partner'.

  17. The scent of salience--is there olfactory-trigeminal conditioning in humans?

    Science.gov (United States)

    Moessnang, C; Pauly, K; Kellermann, T; Krämer, J; Finkelmeyer, A; Hummel, T; Siegel, S J; Schneider, F; Habel, U

    2013-08-15

    Pavlovian fear conditioning has been thoroughly studied in the visual, auditory and somatosensory domain, but evidence is scarce with regard to the chemosensory modality. Under the assumption that Pavlovian conditioning relies on the supra-modal mechanism of salience attribution, the present study set out to attest to the existence of chemosensory aversive conditioning in humans as a specific instance of salience attribution. fMRI was performed in 29 healthy subjects during a differential aversive conditioning paradigm. Two odors (rose, vanillin) served as conditioned stimuli (CS), one of which (CS+) was intermittently coupled with intranasally administered CO2. On the neural level, a robust differential response to the CS+ emerged in frontal, temporal, occipito-parietal and subcortical brain regions, including the amygdala. These changes were paralleled by the development of a CS+-specific connectivity profile of the anterior midcingulate cortex (aMCC), which is a key structure for processing salience information in order to guide adaptive response selection. Increased coupling could be found between key nodes of the salience network (anterior insula, neo-cerebellum) and sensorimotor areas, representing putative input and output structures of the aMCC for exerting adaptive motor control. In contrast, behavioral and skin conductance responses did not show significant effects of conditioning, which has been attributed to contingency unawareness. These findings imply substantial similarities of conditioning involving chemosensory and other sensory modalities, and suggest that salience attribution and adaptive control represent a general, modality-independent principle underlying Pavlovian conditioning.

  18. Parietal cortex integrates contextual and saliency signals during the encoding of natural scenes in working memory.

    Science.gov (United States)

    Santangelo, Valerio; Di Francesco, Simona Arianna; Mastroberardino, Serena; Macaluso, Emiliano

    2015-12-01

    The brief presentation of a complex scene entails that only a few objects can be selected, processed in depth, and stored in memory. Both low-level sensory salience and high-level context-related factors (e.g., the conceptual match/mismatch between objects and scene context) contribute to this selection process, but how the interplay between these factors affects memory encoding is largely unexplored. Here, during fMRI we presented participants with pictures of everyday scenes. After a short retention interval, participants judged the position of a target object extracted from the initial scene. The target object could be either congruent or incongruent with the context of the scene, and could be located in a region of the image with maximal or minimal salience. Behaviourally, we found a reduced impact of saliency on visuospatial working memory performance when the target was out-of-context. Encoding-related fMRI results showed that context-congruent targets activated dorsoparietal regions, while context-incongruent targets de-activated the ventroparietal cortex. Saliency modulated activity both in dorsal and ventral regions, with larger context-related effects for salient targets. These findings demonstrate the joint contribution of knowledge-based and saliency-driven attention for memory encoding, highlighting a dissociation between dorsal and ventral parietal regions.

  19. ERP evidence on the interaction between information structure and emotional salience of words.

    Science.gov (United States)

    Wang, Lin; Bastiaansen, Marcel; Yang, Yufang; Hagoort, Peter

    2013-06-01

    Both emotional words and words focused by information structure can capture attention. This study examined the interplay between emotional salience and information structure in modulating attentional resources in the service of integrating emotional words into sentence context. Event-related potentials (ERPs) to affectively negative, neutral, and positive words, which were either focused or nonfocused in question-answer pairs, were evaluated during sentence comprehension. The results revealed an early negative effect (90-200 ms), a P2 effect, as well as an effect in the N400 time window, for both emotional salience and information structure. Moreover, an interaction between emotional salience and information structure occurred within the N400 time window over right posterior electrodes, showing that information structure influences the semantic integration only for neutral words, but not for emotional words. This might reflect the fact that the linguistic salience of emotional words can override the effect of information structure on the integration of words into context. The interaction provides evidence for attention-emotion interactions at a later stage of processing. In addition, the absence of interaction in the early time window suggests that the processing of emotional information is highly automatic and independent of context. The results suggest independent attention capture systems of emotional salience and information structure at the early stage but an interaction between them at a later stage, during the semantic integration of words.

  20. Quantifying individual variation in the propensity to attribute incentive salience to reward cues.

    Science.gov (United States)

    Meyer, Paul J; Lovic, Vedran; Saunders, Benjamin T; Yager, Lindsay M; Flagel, Shelly B; Morrow, Jonathan D; Robinson, Terry E

    2012-01-01

    If reward-associated cues acquire the properties of incentive stimuli they can come to powerfully control behavior, and potentially promote maladaptive behavior. Pavlovian incentive stimuli are defined as stimuli that have three fundamental properties: they are attractive, they are themselves desired, and they can spur instrumental actions. We have found, however, that there is considerable individual variation in the extent to which animals attribute Pavlovian incentive motivational properties ("incentive salience") to reward cues. The purpose of this paper was to develop criteria for identifying and classifying individuals based on their propensity to attribute incentive salience to reward cues. To do this, we conducted a meta-analysis of a large sample of rats (N = 1,878) subjected to a classic Pavlovian conditioning procedure. We then used the propensity of animals to approach a cue predictive of reward (one index of the extent to which the cue was attributed with incentive salience), to characterize two behavioral phenotypes in this population: animals that approached the cue ("sign-trackers") vs. others that approached the location of reward delivery ("goal-trackers"). This variation in Pavlovian approach behavior predicted other behavioral indices of the propensity to attribute incentive salience to reward cues. Thus, the procedures reported here should be useful for making comparisons across studies and for assessing individual variation in incentive salience attribution in small samples of the population, or even for classifying single animals.
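
    The classification idea can be illustrated with a deliberately simplified score: contrast cue-directed (lever) contacts with reward-location (food cup) entries during the cue, and bin animals by the resulting bias. The index and cutoffs below are illustrative assumptions, not the criteria published in the paper.

    ```python
    # Illustrative scoring only (not the paper's published index or cutoffs).
    def approach_bias(lever_contacts, foodcup_entries):
        """Ranges from +1 (all responses at the cue) to -1 (all at the food cup)."""
        total = lever_contacts + foodcup_entries
        return 0.0 if total == 0 else (lever_contacts - foodcup_entries) / total

    def classify(lever_contacts, foodcup_entries, cutoff=0.5):
        bias = approach_bias(lever_contacts, foodcup_entries)
        if bias >= cutoff:
            return "sign-tracker"    # predominantly approaches the reward-predictive cue
        if bias <= -cutoff:
            return "goal-tracker"    # predominantly approaches the reward location
        return "intermediate"

    print(classify(lever_contacts=48, foodcup_entries=5))   # -> sign-tracker
    ```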

  1. Quantifying individual variation in the propensity to attribute incentive salience to reward cues.

    Directory of Open Access Journals (Sweden)

    Paul J Meyer

    Full Text Available If reward-associated cues acquire the properties of incentive stimuli they can come to powerfully control behavior, and potentially promote maladaptive behavior. Pavlovian incentive stimuli are defined as stimuli that have three fundamental properties: they are attractive, they are themselves desired, and they can spur instrumental actions. We have found, however, that there is considerable individual variation in the extent to which animals attribute Pavlovian incentive motivational properties ("incentive salience") to reward cues. The purpose of this paper was to develop criteria for identifying and classifying individuals based on their propensity to attribute incentive salience to reward cues. To do this, we conducted a meta-analysis of a large sample of rats (N = 1,878) subjected to a classic Pavlovian conditioning procedure. We then used the propensity of animals to approach a cue predictive of reward (one index of the extent to which the cue was attributed with incentive salience), to characterize two behavioral phenotypes in this population: animals that approached the cue ("sign-trackers") vs. others that approached the location of reward delivery ("goal-trackers"). This variation in Pavlovian approach behavior predicted other behavioral indices of the propensity to attribute incentive salience to reward cues. Thus, the procedures reported here should be useful for making comparisons across studies and for assessing individual variation in incentive salience attribution in small samples of the population, or even for classifying single animals.

  2. When death is not a problem: Regulating implicit negative affect under mortality salience.

    Science.gov (United States)

    Lüdecke, Christina; Baumann, Nicola

    2015-12-01

    Terror management theory assumes that death arouses existential anxiety in humans, which is suppressed in focal attention. Whereas most studies provide indirect evidence for negative affect under mortality salience by showing cultural worldview defenses and self-esteem strivings, there is little direct evidence for implicit negative affect under mortality salience. In the present study, we assume that this implicit affective reaction towards death depends on people's ability to self-regulate negative affect as assessed by the personality dimension of action versus state orientation. Consistent with our expectations, action-oriented participants judged artificial words to express less negative affect under mortality salience compared to control conditions whereas state-oriented participants showed the reversed pattern. PMID:26335149

  3. Neural Dynamics of Emotional Salience Processing in Response to Voices during the Stages of Sleep.

    Science.gov (United States)

    Chen, Chenyi; Sung, Jia-Ying; Cheng, Yawei

    2016-01-01

    Sleep has been related to emotional functioning. However, the extent to which emotional salience is processed during sleep is unknown. To address this concern, we investigated night sleep in healthy adults regarding brain reactivity to the emotionally (happily, fearfully) spoken meaningless syllables dada, along with correspondingly synthesized nonvocal sounds. Electroencephalogram (EEG) signals were continuously acquired during an entire night of sleep while we applied a passive auditory oddball paradigm. During all stages of sleep, mismatch negativity (MMN) in response to emotional syllables, which is an index for emotional salience processing of voices, was detected. In contrast, MMN to acoustically matching nonvocal sounds was undetected during Sleep Stage 2 and 3 as well as rapid eye movement (REM) sleep. Post-MMN positivity (PMP) was identified with larger amplitudes during Stage 3, and at earlier latencies during REM sleep, relative to wakefulness. These findings clearly demonstrated the neural dynamics of emotional salience processing during the stages of sleep. PMID:27378870

  4. Search for the best matching ultrasound frame based on spatial and temporal saliencies

    Science.gov (United States)

    Feng, Shaolei; Xiang, Xiaoyan; Zhou, S. Kevin; Lazebnik, Roee

    2011-03-01

    In this paper we present a generic system for fast and accurate retrieval of the best matching frame from ultrasound video clips given a reference ultrasound image. It is challenging to build a generic system that handles various lesion types without any prior information about the anatomic structures in the ultrasound data. We propose to solve the problem based on both spatial and temporal saliency maps calculated from the ultrasound images, which implicitly analyze the semantics of the images and emphasize the anatomic regions of interest. The spatial saliency map describes the importance of the pixels of the reference image, while the temporal saliency map further distinguishes the subtle changes of the anatomic structure in a video. A hierarchical comparison scheme based on a novel similarity measure is employed to locate the most similar frames quickly and precisely. Our system ensures robustness, accuracy, and efficiency. Experiments show that our system achieves more accurate results at high speed.
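
    The matching step can be sketched as a saliency-weighted comparison: pixel differences between the reference image and each candidate frame are weighted by the product of the spatial and temporal saliency maps, so that anatomically important and changing regions dominate the score. The functions below are an illustrative simplification (no hierarchical search), with assumed array inputs.

    ```python
    # Sketch of saliency-weighted frame matching (hierarchical search not shown).
    import numpy as np

    def weighted_similarity(reference, frame, spatial_sal, temporal_sal):
        """All inputs are HxW float arrays; larger return value means a better match."""
        w = spatial_sal * temporal_sal
        w = w / (w.sum() + 1e-8)                       # normalize weights to sum to 1
        return -np.sum(w * (reference - frame) ** 2)   # negative weighted squared error

    def best_matching_frame(reference, frames, spatial_sal, temporal_sals):
        scores = [weighted_similarity(reference, f, spatial_sal, t)
                  for f, t in zip(frames, temporal_sals)]
        return int(np.argmax(scores))                  # index of the best matching frame
    ```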

  6. Work demands and resources and the work-family interface : Testing a salience model on German service sector employees

    NARCIS (Netherlands)

    Beham, Barbara; Drobnič, Sonja; Präg, Patrick

    2011-01-01

    The present study tested an extended version of Voydanoff's "differential salience-comparable salience model" in a sample of German service workers. Our findings partly support the model in a different national/cultural context but also yielded some divergent findings with respect to within-domain resources.

  7. The self salience model of other-to-self effects : Integrating principles of self-enhancement, complementarity, and imitation

    NARCIS (Netherlands)

    Stapel, DA; Van der Zee, KI

    2006-01-01

    In a series of studies the Self Salience Model of other-to-self effects is tested. This model posits that self-construal salience is an important determinant of whether other-to-self effects follow the principles of self-enhancement, imitation, or complementarity. Participants imagined interactions

  9. Toward isolating the role of dopamine in the acquisition of incentive salience attribution.

    Science.gov (United States)

    Chow, Jonathan J; Nickell, Justin R; Darna, Mahesh; Beckmann, Joshua S

    2016-10-01

    Stimulus-reward learning has been heavily linked to the reward-prediction error learning hypothesis and dopaminergic function. However, some evidence suggests dopaminergic function may not strictly underlie reward-prediction error learning, but may be specific to incentive salience attribution. Utilizing a Pavlovian conditioned approach procedure consisting of two stimuli that were equally reward-predictive (both undergoing reward-prediction error learning) but functionally distinct in regard to incentive salience (levers that elicited sign-tracking and tones that elicited goal-tracking), we tested the differential role of D1 and D2 dopamine receptors and nucleus accumbens dopamine in the acquisition of sign- and goal-tracking behavior and their associated conditioned reinforcing value within individuals. Overall, the results revealed that both D1 and D2 inhibition disrupted performance of sign- and goal-tracking. However, D1 inhibition specifically prevented the acquisition of sign-tracking to a lever, instead promoting goal-tracking and decreasing its conditioned reinforcing value, while neither D1 nor D2 signaling was required for goal-tracking in response to a tone. Likewise, nucleus accumbens dopaminergic lesions disrupted acquisition of sign-tracking to a lever, while leaving goal-tracking in response to a tone unaffected. Collectively, these results are the first evidence of an intraindividual dissociation of dopaminergic function in incentive salience attribution from reward-prediction error learning, indicating that incentive salience, reward-prediction error, and their associated dopaminergic signaling exist within individuals and are stimulus-specific. Thus, individual differences in incentive salience attribution may be reflective of a differential balance in dopaminergic function that may bias toward the attribution of incentive salience, relative to reward-prediction error learning only. PMID:27371135

  10. Perspectives on the Salience and Magnitude of Dam Impacts for Hydro Development Scenarios in China

    Directory of Open Access Journals (Sweden)

    Desiree Tullos

    2010-06-01

    Survey results indicate differences in the perceived salience and magnitude of impacts across both expert groups and dam scenarios. Furthermore, surveys indicate that stakeholder perceptions changed as the information provided regarding dam impacts became more specific, suggesting that stakeholder evaluation may be influenced by quality of information. Finally, qualitative comments from the survey reflect some of the challenges of interdisciplinary dam assessment, including cross-disciplinary cooperation, data standardisation and weighting, and the distribution and potential mitigation of impacts. Given the complexity of data and perceptions around dam impacts, decision-support tools that integrate the objective magnitude and perceived salience of impacts are required urgently.

  11. Multi-scale mesh saliency with local adaptive patches for viewpoint selection

    OpenAIRE

    Nouri, Anass; Charrier, Christophe; Lézoray, Olivier

    2015-01-01

    Our visual attention is attracted by specific areas of 3D objects (represented by meshes). This visual attention depends on the degree of saliency exposed by these areas. In this paper, we propose a novel multi-scale approach for detecting salient regions. To do so, we define a local surface descriptor based on patches of adaptive size and filled in with a local height field. The single-scale saliency of a vertex is defined as its degree measure in the mesh with ed...

  12. "Looking Downward": The Value, Approach, and Objectives of Research on Folk Customs of the Imperial Examination from a Bottom-up Perspective

    Institute of Scientific and Technical Information of China (English)

    杜春燕

    2015-01-01

    As an important area within imperial examination studies, research on the folk customs of the Imperial Examination takes a bottom-up perspective to explore the examination system, its activities and customs, and its social influence, in order to deepen understanding of the cultural characteristics and value of the Imperial Examination. Its interdisciplinary nature and bottom-up approach entail that it has to draw on the theories and methods of historical anthropology, sociology, folklore studies, education science, and linguistics. Research on the folk customs of the Imperial Examination can broaden the academic vision, uncover folk historical materials, and deepen and enrich the substance of imperial examination studies.

  13. Washback from the Bottom-Up: A Grounded Theory

    OpenAIRE

    Seyyed Ali Ostovar-Namaghi

    2013-01-01

    Theory-driven wash back studies inculcate the view that tests are the only causal factor determining what teachers and students do and as such ignore other local constraints. This data-driven study aims at filling in the gap in the wash back knowledge-base by conceptualizing teachers’ perceptions of the university entrance exam (UEE) in Iran. In line with grounded theory, theoretically relevant concepts were sampled from qualitative interviews with experienced language teachers who were willi...

  14. Washback from the Bottom-Up: A Grounded Theory

    Directory of Open Access Journals (Sweden)

    Seyyed Ali Ostovar-Namaghi

    2013-11-01

    Full Text Available Theory-driven washback studies inculcate the view that tests are the only causal factor determining what teachers and students do, and as such ignore other local constraints. This data-driven study aims at filling the gap in the washback knowledge base by conceptualizing teachers' perceptions of the university entrance exam (UEE) in Iran. In line with grounded theory, theoretically relevant concepts were sampled from qualitative interviews with experienced language teachers who were willing to share their views with the researcher. Iterative data collection and analysis revealed: (1) a set of local conditions that make teachers shift away from language teaching towards preparing students for the UEE; (2) how the UEE deprives the nation of a professional workforce by deprofessionalizing language teachers and producing a host of communicatively incompetent high school graduates; and (3) data-driven suggestions for reform. Keywords: washback, grounded theory, deprofessionalization, local conditions, communicatively incompetent

  15. A Bottom-up View of Toddler Word Learning

    OpenAIRE

    Pereira, Alfredo F.; Smith, Linda B.; Yu, Chen

    2014-01-01

    A head-camera was used to examine the visual correlates of object name learning by toddlers, as they played with novel objects, and as the parent spontaneously named those objects. The toddlers’ learning of the object names was tested after play and the visual properties of the head-camera images during naming events associated with learned and unlearned object names were analyzed. Naming events associated with learning had a clear visual signature, one in which the visual information itself ...

  16. Bottom-Up Energy Analysis System - Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    McNeil, Michael A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Letschert, Virginie E. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stephane, de la Rue du Can [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-06-15

    The main objective of the development of BUENAS is to provide a global model with sufficient detail and accuracy for technical assessment of policy measures such as energy efficiency standards and labeling (EES&L) programs. In most countries where energy efficiency policies exist, the initial emphasis is on household appliances and lighting. Often, equipment used in commercial buildings, particularly heating, air conditioning and ventilation (HVAC) is also covered by EES&L programs. In the industrial sector, standards and labeling generally covers electric motors and distribution transformers, although a few more types of industrial equipment are covered by some programs, and there is a trend toward including more of them. In order to make a comprehensive estimate of the total potential impacts, development of the model prioritized coverage of as many end uses commonly targeted by EES&L programs as possible, for as many countries as possible.

  17. Glycan Node Analysis: A Bottom-up Approach to Glycomics.

    Science.gov (United States)

    Zaare, Sahba; Aguilar, Jesús S; Hu, Yueming; Ferdosi, Shadi; Borges, Chad R

    2016-01-01

    Synthesized in a non-template-driven process by enzymes called glycosyltransferases, glycans are key players in various significant intra- and extracellular events. Many pathological conditions, notably cancer, affect gene expression, which can in turn deregulate the relative abundance and activity levels of glycoside hydrolase and glycosyltransferase enzymes. Unique aberrant whole glycans resulting from deregulated glycosyltransferase(s) are often present in trace quantities within complex biofluids, making their detection difficult and sometimes stochastic. However, with proper sample preparation, one of the oldest forms of mass spectrometry (gas chromatography-mass spectrometry, GC-MS) can routinely detect the collection of branch-point and linkage-specific monosaccharides ("glycan nodes") present in complex biofluids. Complementary to traditional top-down glycomics techniques, the approach discussed herein involves the collection and condensation of each constituent glycan node in a sample into a single independent analytical signal, which provides detailed structural and quantitative information about changes to the glycome as a whole and reveals potentially deregulated glycosyltransferases. Improvements to the permethylation and subsequent liquid/liquid extraction stages provided herein enhance reproducibility and overall yield by facilitating minimal exposure of permethylated glycans to alkaline aqueous conditions. Modifications to the acetylation stage further increase the extent of reaction and overall yield. Despite their reproducibility, the overall yields of N-acetylhexosamine (HexNAc) partially permethylated alditol acetates (PMAAs) are shown to be inherently lower than their expected theoretical value relative to hexose PMAAs. Calculating the ratio of the area under the extracted ion chromatogram (XIC) for each individual hexose PMAA (or HexNAc PMAA) to the sum of such XIC areas for all hexoses (or HexNAcs) provides a new normalization method that facilitates relative quantification of individual glycan nodes in a sample. Although presently constrained in terms of its absolute limits of detection, this method expedites the analysis of clinical biofluids and shows considerable promise as a complementary approach to traditional top-down glycomics.
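
    The normalization described above is a simple within-class relative abundance: each hexose (or HexNAc) PMAA's XIC peak area is divided by the summed XIC areas of all hexose (or HexNAc) PMAAs in the sample. A small worked example with made-up analyte names and areas:

    ```python
    # Worked example with hypothetical glycan nodes and XIC areas (not measured data).
    hexose_xic_areas = {"2-linked mannose": 1.8e6,
                        "3,6-linked mannose": 9.5e5,
                        "terminal galactose": 2.4e6}
    hexnac_xic_areas = {"4-linked GlcNAc": 3.1e6,
                        "3,4-linked GlcNAc": 7.2e5}

    def normalize(xic_areas):
        """Divide each node's XIC area by the summed area of its class (hexose or HexNAc)."""
        total = sum(xic_areas.values())
        return {node: area / total for node, area in xic_areas.items()}

    print(normalize(hexose_xic_areas))   # relative abundance of each hexose node
    print(normalize(hexnac_xic_areas))   # relative abundance of each HexNAc node
    ```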

  18. Hacking Health: Bottom-up Innovation for Healthcare

    Directory of Open Access Journals (Sweden)

    Jeeshan Chowdhury

    2012-07-01

    Full Text Available Healthcare is not sustainable and still functions with outdated technology (e.g., pagers, paper records. Top-down approaches by governments and corporations have failed to deliver digital technologies to modernize healthcare. Disruptive innovation must come from the ground up by bridging the gap between front-line health experts and innovators in the latest web and mobile technology. Hacking Health is a hackathon that is focused on social innovation more than technical innovation. Our approach to improve healthcare is to pair technological innovators with healthcare experts to build realistic, human-centric solutions to front-line healthcare problems.

  19. Emulating biology: Building nanostructures from the bottom up

    OpenAIRE

    Seeman, Nadrian C.; Belcher, Angela M.

    2002-01-01

    The biological approach to nanotechnology has produced self-assembled objects, arrays and devices; likewise, it has achieved the recognition of inorganic systems and the control of their growth. Can these approaches now be integrated to produce useful systems?

  20. Education from the bottom up: UNICEF's education programme in Somalia

    OpenAIRE

    Williams, James; William Cummings

    2013-01-01

    The failure of the Somali state from 1993 to 2012 represents one of the world's most profound and prolonged cases of state collapse. Initially, education and other government services came to a standstill. With the halt of fighting in some areas, local communities with the support of the United Nations Children's Fund (UNICEF) and other agencies began to provide education and other critical services. Since then, slow progress has been made in providing educational services to increasing numbe...