WorldWideScience

Sample records for bottom-up saliency mediates

  1. Neural activities in V1 create the bottom-up saliency map of natural scenes.

    Science.gov (United States)

    Chen, Cheng; Zhang, Xilin; Wang, Yizhou; Zhou, Tiangang; Fang, Fang

    2016-06-01

    A saliency map is the bottom-up contribution to the deployment of exogenous attention. It, as well as its underlying neural mechanism, is hard to identify because of the influence of top-down signals. A recent study showed that neural activities in V1 could create a bottom-up saliency map (Zhang et al. in Neuron 73(1):183-192, 2012). In this paper, we tested whether their conclusion can generalize to complex natural scenes. In order to avoid top-down influences, each image was presented with a low contrast for only 50 ms and was followed by a high contrast mask, which rendered the whole image invisible to participants (confirmed by a forced-choice test). The Posner cueing paradigm was adopted to measure the spatial cueing effect (i.e., saliency) by an orientation discrimination task. A positive cueing effect was found, and the magnitude of the cueing effect was consistent with the saliency prediction of a computational saliency model. In a following fMRI experiment, we used the same masked natural scenes as stimuli and measured BOLD signals responding to the predicted salient region (relative to the background). We found that the BOLD signal in V1, but not in other cortical areas, could well predict the cueing effect. These results suggest that the bottom-up saliency map of natural scenes could be created in V1, providing further evidence for the V1 saliency theory (Li in Trends Cogn Sci 6(1):9-16, 2002). PMID:26879771

  2. Bottom-Up Visual Saliency Estimation With Deep Autoencoder-Based Sparse Reconstruction.

    Science.gov (United States)

    Xia, Chen; Qi, Fei; Shi, Guangming

    2016-06-01

    Research on visual perception indicates that the human visual system is sensitive to center-surround (C-S) contrast in the bottom-up saliency-driven attention process. In contrast to traditional computation of feature-difference contrast, reconstruction-based models have emerged that estimate saliency starting from the original images themselves instead of seeking certain ad hoc features. However, in existing reconstruction-based methods, the reconstruction parameters of each area are calculated independently without taking their global correlation into account. In this paper, inspired by the powerful feature learning and data reconstruction ability of deep autoencoders, we construct a deep C-S inference network and train it with data sampled randomly from the entire image to obtain a unified reconstruction pattern for the current image. In this way, global competition in the sampling and learning processes can be integrated into the nonlocal reconstruction and saliency estimation of each pixel, which achieves better detection results than models that consider local and global rarity separately. Moreover, by learning from the current scene, the proposed model performs feature extraction and interaction simultaneously in an adaptive way, which yields better generalization to more types of stimuli. Experimental results show that, in accordance with different inputs, the network learns distinct basic features for saliency modeling in its code layer. Furthermore, in a comprehensive evaluation on several benchmark data sets, the proposed method outperforms existing state-of-the-art algorithms. PMID:26800552
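
    The reconstruction-error principle behind such models can be illustrated with a minimal sketch, not the authors' deep C-S inference network: a tiny fully connected autoencoder is trained on patches sampled randomly from the whole image (one reconstruction pattern per image), and each pixel's saliency is the reconstruction error of the patch centred on it. The patch size, network width, and training schedule below are illustrative assumptions (Python/PyTorch).

        # Minimal sketch: reconstruction-error saliency with a small autoencoder.
        # Assumes a grayscale image as a 2D numpy array scaled to [0, 1]; all sizes are illustrative.
        import numpy as np
        import torch
        import torch.nn as nn

        def reconstruction_saliency(image, patch=8, hidden=16, n_samples=2000, epochs=200):
            H, W = image.shape
            rng = np.random.default_rng(0)
            # Randomly sample training patches from the entire image (global competition).
            ys = rng.integers(0, H - patch, n_samples)
            xs = rng.integers(0, W - patch, n_samples)
            train = np.stack([image[y:y+patch, x:x+patch].ravel() for y, x in zip(ys, xs)])
            train = torch.tensor(train, dtype=torch.float32)

            d = patch * patch
            model = nn.Sequential(nn.Linear(d, hidden), nn.ReLU(), nn.Linear(hidden, d))
            opt = torch.optim.Adam(model.parameters(), lr=1e-3)
            for _ in range(epochs):              # learn one unified reconstruction pattern
                opt.zero_grad()
                loss = nn.functional.mse_loss(model(train), train)
                loss.backward()
                opt.step()

            # Saliency of a pixel = how poorly the image-wide code reconstructs its patch.
            sal = np.zeros((H, W))
            with torch.no_grad():
                for y in range(H - patch):
                    rows = np.stack([image[y:y+patch, x:x+patch].ravel() for x in range(W - patch)])
                    rows_t = torch.tensor(rows, dtype=torch.float32)
                    err = ((model(rows_t) - rows_t) ** 2).mean(dim=1).numpy()
                    sal[y + patch // 2, patch // 2:patch // 2 + len(err)] = err
            return (sal - sal.min()) / (sal.ptp() + 1e-12)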

  3. The Roles of Feature-Specific Task Set and Bottom-Up Salience in Attentional Capture: An ERP Study

    Science.gov (United States)

    Eimer, Martin; Kiss, Monika; Press, Clare; Sauter, Disa

    2009-01-01

    We investigated the roles of top-down task set and bottom-up stimulus salience for feature-specific attentional capture. Spatially nonpredictive cues preceded search arrays that included a color-defined target. For target-color singleton cues, behavioral spatial cueing effects were accompanied by cue-induced N2pc components, indicative of…

  4. Modeling Visual Exploration in Rhesus Macaques with Bottom-Up Salience and Oculomotor Statistics

    Science.gov (United States)

    König, Seth D.; Buffalo, Elizabeth A.

    2016-01-01

    There is a growing interest in studying biological systems in natural settings, in which experimental stimuli are less artificial and behavior is less controlled. In primate vision research, free viewing of complex images has elucidated novel neural responses, and free viewing in humans has helped discover attentional and behavioral impairments in patients with neurological disorders. In order to fully interpret data collected from free viewing of complex scenes, it is critical to better understand what aspects of the stimuli guide viewing behavior. To this end, we have developed a novel viewing behavior model called a Biased Correlated Random Walk (BCRW) to describe free viewing behavior during the exploration of complex scenes in monkeys. The BCRW can predict fixation locations better than bottom-up salience. Additionally, we show that the BCRW can be used to test hypotheses regarding specific attentional mechanisms. For example, we used the BCRW to examine the source of the central bias in fixation locations. Our analyses suggest that the central bias may be caused by a natural tendency to reorient the eyes toward the center of the stimulus, rather than a photographer's bias to center salient items in a scene. Taken together these data suggest that the BCRW can be used to further our understanding of viewing behavior and attention, and could be useful in optimizing stimulus and task design.
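
    A biased correlated random walk of the kind described above can be sketched generically; the snippet below is not the authors' fitted model, and the mixing weights, step-length distribution, and central-bias rule are assumptions. Each new saccade direction blends the previous heading (correlation), a pull toward the image centre (bias), and noise.

        # Toy biased correlated random walk (BCRW) over an image; parameters are illustrative.
        import numpy as np

        def simulate_bcrw(height, width, n_fix=50, w_corr=0.5, w_bias=0.3, step=40.0, rng=None):
            if rng is None:
                rng = np.random.default_rng(0)
            pos = np.array([height / 2.0, width / 2.0])          # start at the centre
            heading = rng.uniform(0, 2 * np.pi)
            fixations = [pos.copy()]
            for _ in range(n_fix - 1):
                centre_dir = np.arctan2(height / 2.0 - pos[0], width / 2.0 - pos[1])
                noise_dir = rng.uniform(0, 2 * np.pi)
                # Blend previous heading (correlation), central bias, and noise via unit vectors.
                vec = (w_corr * np.array([np.sin(heading), np.cos(heading)])
                       + w_bias * np.array([np.sin(centre_dir), np.cos(centre_dir)])
                       + (1 - w_corr - w_bias) * np.array([np.sin(noise_dir), np.cos(noise_dir)]))
                heading = np.arctan2(vec[0], vec[1])
                length = rng.exponential(step)                   # heavy-tailed saccade amplitudes
                pos = np.clip(pos + length * np.array([np.sin(heading), np.cos(heading)]),
                              [0, 0], [height - 1, width - 1])
                fixations.append(pos.copy())
            return np.array(fixations)

        scanpath = simulate_bcrw(600, 800)                       # (50, 2) array of (row, col) fixations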

  5. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantify the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of our human visual system, play a vital role in eye movement control, whether we know what we are looking at or not. PMID:21316191

  6. Bottom-up and top-down mechanisms indirectly mediate interactions between benthic biotic ecosystem components

    Science.gov (United States)

    Van Colen, Carl; Thrush, Simon F.; Parkes, Samantha; Harris, Rachel; Woodin, Sally A.; Wethey, David S.; Pilditch, Conrad A.; Hewitt, Judi E.; Lohrer, Andrew M.; Vincx, Magda

    2015-04-01

    The loss or decline in population size of key species can instigate a cascade of effects that have implications for interacting species, therewith impacting biodiversity and ecosystem functioning. We examined how top-down and bottom-up interactions may mediate knock-on effects of a coastal deposit-feeding clam, Macomona liliana (hereafter Macomona), on sandflat meiobenthos densities. Therefore we manipulated densities of Macomona in combination with predator exclusion and experimental shading that was expected to alter microphytobenthos biomass. We show that Macomona regulated densities of meiobenthic (38-500 μm) nematodes, copepods, polychaetes, turbellarians, and ostracodes during the three months of incubation via indirect mechanisms. Predator pressure on Macomona by eagle rays (Myliobatis tenuicaudatus) was found to have a negative effect on densities of some meiobenthic taxa. Furthermore, experimental shading resulted in the loss of a positive relation between Macomona and microphytobenthos biomass, while concurrently increasing the density of some meiobenthic taxa. We suggest that this observation can be explained by the release from bioturbation interference effects of the cockle Austrovenus stutchburyi that was found to thrive in the presence of Macomona under non-shaded conditions. Our results highlight the importance of interactions between macrofaunal bioturbation, microphyte biomass, sediment stability, and predation pressure for the structuring of benthic communities. This experiment illustrates that manipulative field experiments may be particularly suitable to study such multiple indirect mechanisms that regulate ecosystem diversity and related functioning because such approaches may best capture the complex feedbacks and processes that determine ecosystem dynamics.

  7. Adaptive genetic variation mediates bottom-up and top-down control in an aquatic ecosystem.

    Science.gov (United States)

    Rudman, Seth M; Rodriguez-Cabal, Mariano A; Stier, Adrian; Sato, Takuya; Heavyside, Julian; El-Sabaawi, Rana W; Crutsinger, Gregory M

    2015-08-01

    Research in eco-evolutionary dynamics and community genetics has demonstrated that variation within a species can have strong impacts on associated communities and ecosystem processes. Yet, these studies have centred around individual focal species and at single trophic levels, ignoring the role of phenotypic variation in multiple taxa within an ecosystem. Given the ubiquitous nature of local adaptation, and thus intraspecific variation, we sought to understand how combinations of intraspecific variation in multiple species within an ecosystem impacts its ecology. Using two species that co-occur and demonstrate adaptation to their natal environments, black cottonwood (Populus trichocarpa) and three-spined stickleback (Gasterosteus aculeatus), we investigated the effects of intraspecific phenotypic variation on both top-down and bottom-up forces using a large-scale aquatic mesocosm experiment. Black cottonwood genotypes exhibit genetic variation in their productivity and consequently their leaf litter subsidies to the aquatic system, which mediates the strength of top-down effects from stickleback on prey abundances. Abundances of four common invertebrate prey species and available phosphorous, the most critically limiting nutrient in freshwater systems, are dictated by the interaction between genetic variation in cottonwood productivity and stickleback morphology. These interactive effects fit with ecological theory on the relationship between productivity and top-down control and are comparable in strength to the effects of predator addition. Our results illustrate that intraspecific variation, which can evolve rapidly, is an under-appreciated driver of community structure and ecosystem function, demonstrating that a multi-trophic perspective is essential to understanding the role of evolution in structuring ecological patterns. PMID:26203004

  8. Community context mediates the top-down vs. bottom-up effects of grazers on rocky shores

    OpenAIRE

    Bracken, MES; Dolecal, RE; Long, JD

    2014-01-01

    Interactions between grazers and autotrophs are complex, including both top-down consumptive and bottom-up facilitative effects of grazers. Thus, in addition to consuming autotrophs, herbivores can also enhance autotroph biomass by recycling limiting nutrients, thereby increasing nutrient availability. Here, we evaluated these consumptive and facilitative interactions between snails (Littorina littorea) and seaweeds (Fucus vesiculosus and Ulva lactuca) on a rocky shore. We partitioned herbivor...

  9. Culture from the Bottom Up

    Science.gov (United States)

    Atkinson, Dwight; Sohn, Jija

    2013-01-01

    The culture concept has been severely criticized for its top-down nature in TESOL, leading arguably to its falling out of favor in the field. But what of the fact that people do "live culturally" (Ingold, 1994)? This article describes a case study of culture from the bottom up--culture as understood and enacted by its individual users.…

  10. Bottom-up effects on attention capture and choice

    DEFF Research Database (Denmark)

    Peschel, Anne; Orquin, Jacob Lund; Mueller Loose, Simone

    Attention processes and decision making are accepted to be closely linked together because only information that is attended to can be incorporated in the decision process. Little is known, however, to which extent bottom-up processes of attention affect stimulus selection and therefore the information available to form a decision. Does changing one visual cue in the stimulus set affect attention towards this cue and what does that mean for the choice outcome? To address this, we conducted a combined eye tracking and choice experiment in a consumer choice setting with visual shelf simulations of different product categories. Surface size and visual saliency of a product label were manipulated to determine bottom-up effects on attention and choice. Results show a strong and significant increase in attention in terms of fixation likelihood towards product labels which are larger and more visually...

  11. Implementation Alternatives for Bottom-Up Evaluation

    OpenAIRE

    Brass, Stefan

    2010-01-01

    Bottom-up evaluation is a central part of query evaluation / program execution in deductive databases. It is used after a source code optimization like magic sets or SLDmagic that ensures that only facts relevant for the query can be derived. Then bottom-up evaluation simply performs the iteration of the standard TP -operator to compute the minimal model. However, there are different ways to implement bottom-up evaluation efficiently. Since this is most critical for the performance of a deduc...
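
    The TP-operator iteration mentioned above can be illustrated with a textbook naive bottom-up Datalog evaluator (a generic sketch, not one of the implementation alternatives compared in the paper); the transitive-closure program and the uppercase-variable convention are assumptions for the example.

        # Naive bottom-up evaluation: iterate the TP operator until a fixpoint (the minimal model).
        # Atoms are tuples like ("edge", "a", "b"); uppercase terms in rules are variables.
        facts = {("edge", "a", "b"), ("edge", "b", "c")}
        rules = [
            (("path", "X", "Y"), [("edge", "X", "Y")]),
            (("path", "X", "Z"), [("edge", "X", "Y"), ("path", "Y", "Z")]),
        ]

        def match(atom, fact, subst):
            if atom[0] != fact[0] or len(atom) != len(fact):
                return None
            subst = dict(subst)
            for a, f in zip(atom[1:], fact[1:]):
                if a.isupper():                          # variable: bind or check consistency
                    if subst.get(a, f) != f:
                        return None
                    subst[a] = f
                elif a != f:                             # constant: must match exactly
                    return None
            return subst

        def tp(interpretation):
            derived = set(interpretation)
            for head, body in rules:
                substs = [{}]
                for atom in body:                        # join body atoms against known facts
                    substs = [s2 for s in substs for f in interpretation
                              if (s2 := match(atom, f, s)) is not None]
                for s in substs:
                    derived.add(tuple([head[0]] + [s.get(t, t) for t in head[1:]]))
            return derived

        model = facts
        while True:                                      # iterate TP to the least fixpoint
            nxt = tp(model)
            if nxt == model:
                break
            model = nxt
        print(sorted(model))                             # edges plus the derived path facts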

  12. Emergence of visual saliency from natural scenes via context-mediated probability distributions coding.

    Directory of Open Access Journals (Sweden)

    Jinhua Xu

    Visual saliency is the perceptual quality that makes some items in visual scenes stand out from their immediate contexts. Visual saliency plays important roles in natural vision in that saliency can direct eye movements, deploy attention, and facilitate tasks like object detection and scene understanding. A central unsolved issue is: What features should be encoded in the early visual cortex for detecting salient features in natural scenes? To explore this important issue, we propose a hypothesis that visual saliency is based on efficient encoding of the probability distributions (PDs) of visual variables in specific contexts in natural scenes, referred to as context-mediated PDs in natural scenes. In this concept, computational units in the model of the early visual system do not act as feature detectors but rather as estimators of the context-mediated PDs of a full range of visual variables in natural scenes, which directly give rise to a measure of visual saliency of any input stimulus. To test this hypothesis, we developed a model of the context-mediated PDs in natural scenes using a modified algorithm for independent component analysis (ICA) and derived a measure of visual saliency based on these PDs estimated from a set of natural scenes. We demonstrated that visual saliency based on the context-mediated PDs in natural scenes effectively predicts human gaze in free-viewing of both static and dynamic natural scenes. This study suggests that the computation based on the context-mediated PDs of visual variables in natural scenes may underlie the neural mechanism in the early visual cortex for detecting salient features in natural scenes.
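
    A much-simplified sketch of the underlying idea, saliency as low probability (high self-information) under distributions learned from natural image statistics, can be written with a standard ICA implementation; the context-mediated PD estimation in the paper is more elaborate, and the patch size, number of components, and the factorial Laplacian likelihood assumed below are illustrative choices.

        # Sketch: saliency as self-information of ICA coefficients, a simplification of
        # context-mediated probability-distribution coding. Assumes grayscale images in [0, 1].
        import numpy as np
        from sklearn.decomposition import FastICA

        def sample_patches(images, patch=11, n=5000, rng=None):
            if rng is None:
                rng = np.random.default_rng(0)
            out = []
            for _ in range(n):
                img = images[rng.integers(len(images))]
                y = rng.integers(0, img.shape[0] - patch)
                x = rng.integers(0, img.shape[1] - patch)
                out.append(img[y:y+patch, x:x+patch].ravel())
            return np.array(out)

        def ica_saliency(natural_images, test_image, patch=11, n_components=64):
            ica = FastICA(n_components=n_components, random_state=0)
            ica.fit(sample_patches(natural_images, patch))       # learn a basis from natural scenes
            H, W = test_image.shape
            sal = np.zeros((H, W))
            for y in range(H - patch):
                rows = np.stack([test_image[y:y+patch, x:x+patch].ravel() for x in range(W - patch)])
                coeff = ica.transform(rows)
                # Self-information under an assumed factorial Laplacian model: -log p grows with sum |s_i|.
                sal[y + patch // 2, patch // 2:patch // 2 + coeff.shape[0]] = np.abs(coeff).sum(axis=1)
            return (sal - sal.min()) / (sal.ptp() + 1e-12)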

  13. Bottom-up approach to silicon nanoelectronics

    OpenAIRE

    Mizumita, Hiroshi; Oda, S

    2005-01-01

    This paper presents a brief review of our recent work investigating a novel bottom-up approach to realize silicon based nanoelectronics. We discuss fabrication technique, electronic properties and device applications of silicon nanodots as a building block for nanoscale silicon devices.

  14. Bottom-up organic integrated circuits

    OpenAIRE

    Smits, Edsger C. P; Mathijssen, Simon G. J.; van Hal, Paul A.; Setayesh, Sepas; Geuns, Thomas C. T.; Mutsaers, Kees A. H. A.; Cantatore, Eugenio; Wondergem, Harry J.; Werzer, Oliver; Resel, Roland; Kemerink, Martijn; Kirchmeyer, Stephan; Muzafarov, Aziz M.; Ponomarenko, Sergei A.; de Boer, Bert

    2008-01-01

    Self-assembly - the autonomous organization of components into patterns and structures(1) - is a promising technology for the mass production of organic electronics. Making integrated circuits using a bottom-up approach involving self-assembling molecules was proposed(2) in the 1970s. The basic building block of such an integrated circuit is the self-assembled-monolayer field-effect transistor (SAMFET), where the semiconductor is a monolayer spontaneously formed on the gate dielectric....

  15. Bottom-up holographic approach to QCD

    Energy Technology Data Exchange (ETDEWEB)

    Afonin, S. S. [V. A. Fock Department of Theoretical Physics, Saint Petersburg State University, 1 ul. Ulyanovskaya, 198504 (Russian Federation)

    2016-01-22

    One of the best-known results of string theory is the idea that some strongly coupled gauge theories may have a dual description in terms of a higher dimensional weakly coupled gravitational theory — the so-called AdS/CFT correspondence or gauge/gravity correspondence. Attempts to apply this idea to real QCD are often referred to as “holographic QCD” or the “AdS/QCD approach”. One direction in this field is to start from real QCD and guess a tentative dual higher dimensional weakly coupled field model following the principles of gauge/gravity correspondence. The ensuing phenomenology can then be developed and compared with experimental data and with various theoretical results. Such a bottom-up holographic approach has turned out to be unexpectedly successful in many cases. In this short review, the technical aspects of the bottom-up holographic approach to QCD are explained, placing the main emphasis on the soft wall model.
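
    For orientation, the soft wall model emphasised in this review is commonly specified (following Karch, Katz, Son and Stephanov) by a five-dimensional action with a quadratic dilaton background; the expression below is the generic textbook form under that assumption, not a result derived in the review itself:

        \begin{equation}
          S \;=\; -\int d^4x\, dz\, \sqrt{-g}\; e^{-\Phi(z)}\,
              \mathrm{Tr}\!\left[\, |D X|^2 + m_5^2 |X|^2
              + \frac{1}{4 g_5^2}\left(F_L^2 + F_R^2\right) \right],
          \qquad \Phi(z) = \lambda z^2 ,
        \end{equation}

    where the fields live in an AdS_5 background; the z^2 dilaton is what produces linear radial Regge trajectories, m_n^2 proportional to n, which is the main phenomenological motivation for the soft wall.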

  16. Bottom-up holographic approach to QCD

    International Nuclear Information System (INIS)

    One of the best-known results of string theory is the idea that some strongly coupled gauge theories may have a dual description in terms of a higher dimensional weakly coupled gravitational theory — the so-called AdS/CFT correspondence or gauge/gravity correspondence. Attempts to apply this idea to real QCD are often referred to as “holographic QCD” or the “AdS/QCD approach”. One direction in this field is to start from real QCD and guess a tentative dual higher dimensional weakly coupled field model following the principles of gauge/gravity correspondence. The ensuing phenomenology can then be developed and compared with experimental data and with various theoretical results. Such a bottom-up holographic approach has turned out to be unexpectedly successful in many cases. In this short review, the technical aspects of the bottom-up holographic approach to QCD are explained, placing the main emphasis on the soft wall model.

  17. On the Temporal Relation of Top-Down and Bottom-Up Mechanisms during Guidance of Attention

    Science.gov (United States)

    Wykowska, Agnieszka; Schubo, Anna

    2010-01-01

    Two mechanisms are said to be responsible for guiding focal attention in visual selection: bottom-up, saliency-driven capture and top-down control. These mechanisms were examined with a paradigm that combined a visual search task with postdisplay probe detection. Two SOAs between the search display and probe onsets were introduced to investigate…

  18. Bottom-up assembly of metallic germanium

    Science.gov (United States)

    Scappucci, Giordano; Klesse, Wolfgang M.; Yeoh, Lareine A.; Carter, Damien J.; Warschkow, Oliver; Marks, Nigel A.; Jaeger, David L.; Capellini, Giovanni; Simmons, Michelle Y.; Hamilton, Alexander R.

    2015-08-01

    Extending chip performance beyond current limits of miniaturisation requires new materials and functionalities that integrate well with the silicon platform. Germanium fits these requirements and has been proposed as a high-mobility channel material, a light emitting medium in silicon-integrated lasers, and a plasmonic conductor for bio-sensing. Common to these diverse applications is the need for homogeneous, high electron densities in three-dimensions (3D). Here we use a bottom-up approach to demonstrate the 3D assembly of atomically sharp doping profiles in germanium by a repeated stacking of two-dimensional (2D) high-density phosphorus layers. This produces high-density (10^19 to 10^20 cm^-3) low-resistivity (10^-4 Ω·cm) metallic germanium of precisely defined thickness, beyond the capabilities of diffusion-based doping technologies. We demonstrate that free electrons from distinct 2D dopant layers coalesce into a homogeneous 3D conductor using anisotropic quantum interference measurements, atom probe tomography, and density functional theory.

  19. Bottom-up Attention Orienting in Young Children with Autism

    Science.gov (United States)

    Amso, Dima; Haas, Sara; Tenenbaum, Elena; Markant, Julie; Sheinkopf, Stephen J.

    2014-01-01

    We examined the impact of simultaneous bottom-up visual influences and meaningful social stimuli on attention orienting in young children with autism spectrum disorders (ASDs). Relative to typically-developing age and sex matched participants, children with ASDs were more influenced by bottom-up visual scene information regardless of whether…

  20. Bottom-up Initiatives for Photovoltaic: Incentives and Barriers

    Directory of Open Access Journals (Sweden)

    Kathrin Reinsberger

    2014-06-01

    When facing the challenge of restructuring the energy system, bottom-up initiatives can aid the diffusion of decentralized and clean energy technologies. We focused here on a bottom-up initiative of citizen-funded and citizen-operated photovoltaic power plants. The project follows a case study-based approach and examines two different community initiatives. The aim is to investigate the potential incentives and barriers relating to participation or non-participation in predefined community PV projects. Qualitative, as well as quantitative empirical research was used to examine the key factors in the further development of bottom-up initiatives as contributors to a general energy transition.

  1. Nanoelectronics: Thermoelectric Phenomena in «Bottom-Up» Approach

    OpenAIRE

    Yu.A. Kruglyak; P.A. Kondratenko; Yu.М. Lopatkin

    2014-01-01

    Thermoelectric phenomena of Seebeck and Peltier, quality indicators and thermoelectric optimization, ballistic and diffusive phonon heat current are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  2. Nanoelectronics: Thermoelectric Phenomena in «Bottom-Up» Approach

    Directory of Open Access Journals (Sweden)

    Yu.A. Kruglyak

    2014-04-01

    Thermoelectric phenomena of Seebeck and Peltier, quality indicators and thermoelectric optimization, ballistic and diffusive phonon heat current are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  3. Selecting category specific visual information: Top-down and bottom-up control of object based attention.

    Science.gov (United States)

    Corradi-Dell'Acqua, Corrado; Fink, Gereon R; Weidner, Ralph

    2015-09-01

    The ability to select, within the complexity of sensory input, the information most relevant for our purposes is influenced by both internal settings (i.e., top-down control) and salient features of external stimuli (i.e., bottom-up control). Here, we used fMRI to investigate the neural underpinnings of the interaction of top-down and bottom-up processes, as well as their effects on extrastriate areas that process visual stimuli in a category-selective fashion. We presented subjects with photos of bodies or buildings embedded in frequency-matched visual noise. Stimulus saliency changed gradually with the degree to which the photos stood out from the surrounding noise (hence generating stronger bottom-up control signals). Top-down settings were manipulated via instruction: participants were asked to attend to one stimulus category (i.e., "is there a body?" or "is there a building?"). Highly salient stimuli that were inconsistent with participants' attentional top-down template activated the inferior frontal junction and dorsal parietal regions bilaterally. Stimuli consistent with participants' current attentional set additionally activated the insular cortex and the parietal operculum. Furthermore, the extrastriate body area (EBA) exhibited increased neural activity when attention was directed to bodies. However, the latter effect was found only when stimuli were presented at intermediate saliency levels, suggesting a top-down modulation of this region only in the presence of weak bottom-up signals. Taken together, our results highlight the role of the inferior frontal junction and posterior parietal regions in integrating bottom-up and top-down attentional control signals. PMID:25735196

  4. Two Dimensional Polymerization of Graphene Oxide: Bottom-up Approach

    OpenAIRE

    Atanasov, Victor; Russev, Stoyan; Lyutov, Lyudmil; Zagranyarski, Yulian; Dimitrova, Iglika; Avdeev, Georgy; Avramova, Ivalina; Vulcheva, Evgenia; Kirilov, Kiril; Tzonev, Atanas; Abrashev, Miroslav; Tsutsumanova, Gichka

    2012-01-01

    We demonstrate a bottom-up synthesis of structures similar to graphene oxide via a two dimensional polymerization. Experimental evidence and discussion are conveyed as well as a general framework for this two dimensional polymerization. The proposed morphologies and lattice structures of these graphene oxides are derived from aldol condensation of alternating three nucleophilic and three electrophilic centers of benzenetriol.

  5. Top-down but not bottom-up visual scanning is affected in hereditary pure cerebellar ataxia.

    Directory of Open Access Journals (Sweden)

    Shunichi Matsuda

    The aim of this study was to clarify the nature of visual processing deficits caused by cerebellar disorders. We studied the performance of two types of visual search (top-down visual scanning and bottom-up visual scanning) in 18 patients with pure cerebellar types of spinocerebellar degeneration (SCA6: 11; SCA31: 7). The gaze fixation position was recorded with an eye-tracking device while the subjects performed two visual search tasks in which they looked for a target Landolt figure among distractors. In the serial search task, the target was similar to the distractors and the subject had to search for the target by processing each item with top-down visual scanning. In the pop-out search task, the target and distractor were clearly discernible and the visual salience of the target allowed the subjects to detect it by bottom-up visual scanning. The saliency maps clearly showed that the serial search task required top-down visual attention and the pop-out search task required bottom-up visual attention. In the serial search task, the search time to detect the target was significantly longer in SCA patients than in normal subjects, whereas the search time in the pop-out search task was comparable between the two groups. These findings suggested that SCA patients cannot efficiently scan a target using a top-down attentional process, whereas scanning with a bottom-up attentional process is not affected. In the serial search task, the amplitude of saccades was significantly smaller in SCA patients than in normal subjects. The variability of saccade amplitude (saccadic dysmetria), number of re-fixations, and unstable fixation (nystagmus) were larger in SCA patients than in normal subjects, accounting for a substantial proportion of scattered fixations around the items. Saccadic dysmetria, re-fixation, and nystagmus may play important roles in the impaired top-down visual scanning in SCA, hampering precise visual processing of individual items.

  6. Bottom-up approaches for defining future climate mitigation commitments

    Energy Technology Data Exchange (ETDEWEB)

    Den Elzen, M.G.J.; Berk, M.M.

    2004-07-01

    This report analyses a number of alternative, bottom-up approaches: technology and performance standards; technology research and development agreements; sectoral targets (national/transnational); a sector-based Clean Development Mechanism (CDM); and sustainable development policies and measures (SD-PAMs). A more bottom-up approach for defining national emission targets, the so-called Triptych approach, is also explored and compared with more top-down types of approaches (Multi-Stage and Contraction and Convergence, C and C) based on a quantitative and qualitative analysis. While bottom-up approaches are concluded to be valuable components of a future climate regime, they do not, in themselves, seem to offer a real alternative to emission reduction and limitation targets, as they provide little certainty about the overall environmental effectiveness of climate policies. In comparison with the Multi-Stage and C and C approaches, the global Triptych approach offers the opportunity of early participation by developing countries without the risk of creating large amounts of surplus emissions as in C and C; in using the approach we also avoid the need for dividing up the non-Annex I countries as in Multi-Stage. However, there will be substantial implementation problems related to the institutional and technical capabilities required. It would thus seem better to exclude the least developed countries and have them first participate in some of the alternative bottom-up approaches.

  7. Magic for Filter Optimization in Dynamic Bottom-up Processing

    CERN Document Server

    Minnen, G

    1996-01-01

    Off-line compilation of logic grammars using Magic allows an incorporation of filtering into the logic underlying the grammar. The explicit definite clause characterization of filtering resulting from Magic compilation allows processor independent and logically clean optimizations of dynamic bottom-up processing with respect to goal-directedness. Two filter optimizations based on the program transformation technique of Unfolding are discussed which are of practical and theoretical interest.

  8. Bottom-up Budgeting FY 2015 Assessment: Camarines Sur

    OpenAIRE

    Maramot, Joyce Anne; Yasay, Donald B.; de Guzman, Reinier

    2015-01-01

    Bottom-up budgeting (BUB) is an adaptation of the participatory budgeting model in identifying and providing solutions to poverty at the municipal/city level. Leaders of civil society organizations engage with LGU officials in formulating a poverty alleviation plan to be considered in preparing the budget of national agencies the following fiscal year. This paper reports on how the guideline was implemented in three municipalities in Camarines Sur. The study then presents suggestions and reco...

  9. Una implementación computacional de un modelo de atención visual Bottom-up aplicado a escenas naturales/A Computational Implementation of a Bottom-up Visual Attention Model Applied to Natural Scenes

    Directory of Open Access Journals (Sweden)

    Juan F. Ramírez Villegas

    2011-12-01

    The bottom-up visual attention model proposed by Itti et al., 2000 [1], has been a popular model since it exhibits certain neurobiological evidence of primate vision. This work complements the computational model of this phenomenon with the realistic dynamics of a neural network. The approach is based on the existence of topographical maps that represent the saliency of objects in the visual field and are combined into a general representation (saliency map); this representation is the input to a dynamic neural network with local and global collaborative and competitive interactions that converge on the main particularities (objects) of the scene.

  10. A Bottom-Up Approach to SUSY Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Horn, Claus; /SLAC

    2011-11-11

    This paper proposes a new way to do event generation and analysis in searches for new physics at the LHC. An abstract notation is used to describe the new particles on a level which better corresponds to the detector resolution of LHC experiments. In this way the SUSY discovery space can be decomposed into a small number of eigenmodes, each with only a few parameters, which allows the SUSY parameter space to be investigated in a model-independent way. By focusing on the experimental observables for each process investigated, the Bottom-Up Approach allows a systematic study of the boundaries of the experimental efficiencies and thus extends the sensitivity to new physics.

  11. Recent progress in backreacted bottom-up holographic QCD

    Energy Technology Data Exchange (ETDEWEB)

    Järvinen, Matti [Laboratoire de Physique Théorique, École Normale Supérieure, 24 rue Lhomond, 75231 Paris Cedex 05 (France)

    2016-01-22

    Recent progress in constructing holographic models for QCD is discussed, concentrating on the bottom-up models which implement holographically the renormalization group flow of QCD. The dynamics of gluons can be modeled by using a string-inspired model termed improved holographic QCD, and flavor can be added by introducing space filling branes in this model. The flavor fully backreacts to the glue in the Veneziano limit, giving rise to a class of models which are called V-QCD. The phase diagrams and spectra of V-QCD are in good agreement with results for QCD obtained by other methods.

  12. Wikipedia: organisation from a bottom-up approach

    OpenAIRE

    Spek, Sander; Postma, Eric; Herik, H. Jaap van den

    2006-01-01

    Wikipedia can be considered as an extreme form of a self-managing team, as a means of labour division. One could expect that this bottom-up approach, with the absence of top-down organisational control, would lead to chaos, but our analysis shows that this is not the case. In the Dutch Wikipedia, an integrated and coherent data structure is created, while at the same time users succeed in distributing roles by self-selection. Some users focus on an area of expertise, while others edit over ...

  13. The Interplay of Top-Down and Bottom-Up

    DEFF Research Database (Denmark)

    Winkler, Till; Brown, Carol V.; Ozturk, Pinar

    2014-01-01

    The exchange of patient health information across different organizations involved in healthcare delivery has potential benefits for a wide range of stakeholders. However, many governments in Europe and in the U.S. have, despite both top-down and bottom-up initiatives, experienced major barriers in achieving sustainable models for implementing health information exchange (HIE) throughout their healthcare systems. In the case of the U.S., three years after stimulus funding allocated as part of the 2009 HITECH Act, the extent to which government funding will be needed to sustain health information...

  14. Distinguishing Top-Down From Bottom-Up Effects

    OpenAIRE

    Shea, Nicholas

    2015-01-01

    The distinction between top-down and bottom-up effects is widely relied on in experimental psychology. However, there is an important problem with the way it is normally defined. Top-down effects are effects of previously-stored information on processing the current input. But on the face of it that includes the information that is implicit in the operation of any psychological process – in its dispositions to transition from some types of representational state to others. This paper suggests...

  15. Bottom-up graphene nanoribbon field-effect transistors

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, Patrick B. [Applied Science and Technology, University of California, Berkeley, California 94720 (United States); Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States); Pedramrazi, Zahra [Department of Physics, University of California, Berkeley, California 94720 (United States); Madani, Ali [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States); Chen, Yen-Chia; Crommie, Michael F. [Department of Physics, University of California, Berkeley, California 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratories, Berkeley, California 94720 (United States); Oteyza, Dimas G. de [Department of Physics, University of California, Berkeley, California 94720 (United States); Centro de Física de Materiales CSIC/UPV-EHU-Materials Physics Center, San Sebastián E-20018 (Spain); Chen, Chen [Department of Chemistry, University of California, Berkeley, California 94720 (United States); Fischer, Felix R. [Department of Chemistry, University of California, Berkeley, California 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratories, Berkeley, California 94720 (United States); Bokor, Jeffrey, E-mail: jbokor@eecs.berkeley.edu [Department of Electrical Engineering and Computer Sciences, University of California, Berkeley, California 94720 (United States); Materials Sciences Division, Lawrence Berkeley National Laboratories, Berkeley, California 94720 (United States)

    2013-12-16

    Recently developed processes have enabled bottom-up chemical synthesis of graphene nanoribbons (GNRs) with precise atomic structure. These GNRs are ideal candidates for electronic devices because of their uniformity, extremely narrow width below 1 nm, atomically perfect edge structure, and desirable electronic properties. Here, we demonstrate nano-scale chemically synthesized GNR field-effect transistors, made possible by development of a reliable layer transfer process. We observe strong environmental sensitivity and unique transport behavior characteristic of sub-1 nm width GNRs.

  16. Saliency detection for videos using 3D FFT local spectra

    Science.gov (United States)

    Long, Zhiling; AlRegib, Ghassan

    2015-03-01

    Bottom-up spatio-temporal saliency detection identifies perceptually important regions of interest in video sequences. The center-surround model proves to be useful for visual saliency detection. In this work, we explore using 3D FFT local spectra as features for saliency detection within the center-surround framework. We develop a spectral location based decomposition scheme to divide a 3D FFT cube into two components, one related to temporal changes and the other related to spatial changes. Temporal saliency and spatial saliency are detected separately using features derived from each spectral component through a simple center-surround comparison method. The two detection results are then combined to yield a saliency map. We apply the same detection algorithm to different color channels (YIQ) and incorporate the results into the final saliency determination. The proposed technique is tested with the public CRCNS database. Both visual and numerical evaluations verify the promising performance of our technique.
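
    A bare-bones version of the local-spectrum idea can be sketched as follows; the block sizes, the crude spectral-location split, and the center-surround comparison are simplifying assumptions rather than the authors' exact decomposition scheme.

        # Sketch: 3D FFT local spectra for video saliency with a center-surround comparison.
        # `video` is a (T, H, W) grayscale array; block sizes and the spectral split are assumptions.
        import numpy as np

        def block_spectrum(video, t, y, x, bt=8, bs=16):
            cube = video[t:t+bt, y:y+bs, x:x+bs]
            return np.abs(np.fft.fftn(cube))                     # 3D FFT magnitude of a local cube

        def split_spectrum(mag):
            # Crude spectral-location split: energy along the temporal-frequency axis
            # (temporal change) versus the purely spatial-frequency plane (spatial change).
            temporal = mag[1:, 0, 0].sum()
            spatial = mag[0, 1:, 1:].sum()
            return temporal, spatial

        def saliency_volume(video, bt=8, bs=16, stride=8):
            T, H, W = video.shape
            sal = np.zeros((T, H, W))
            for t in range(0, T - bt, bt):
                for y in range(stride, H - bs - stride, stride):
                    for x in range(stride, W - bs - stride, stride):
                        ct, cs = split_spectrum(block_spectrum(video, t, y, x, bt, bs))
                        # Surround: mean spectra of the four spatial neighbours of the block.
                        neigh = [split_spectrum(block_spectrum(video, t, y + dy, x + dx, bt, bs))
                                 for dy, dx in ((-stride, 0), (stride, 0), (0, -stride), (0, stride))]
                        st = np.mean([n[0] for n in neigh])
                        ss = np.mean([n[1] for n in neigh])
                        # Center-surround differences give temporal and spatial saliency, then combine.
                        sal[t:t+bt, y:y+bs, x:x+bs] = abs(ct - st) + abs(cs - ss)
            return sal / (sal.max() + 1e-12)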

  17. Fast full resolution saliency detection based on incoherent imaging system

    Science.gov (United States)

    Lin, Guang; Zhao, Jufeng; Feng, Huajun; Xu, Zhihai; Li, Qi; Chen, Yueting

    2016-05-01

    Image saliency detection is widely applied to many tasks in the field of computer vision. In this paper, we combine saliency detection with Fourier optics to accelerate the saliency detection algorithm. An actual optical saliency detection system is constructed within the framework of an incoherent imaging system. Additionally, the application of our system to implement the bottom-up rapid pre-saliency process of primate visual saliency is discussed using a dual-resolution camera. A set of experiments on our system are conducted and discussed. We also demonstrate comparisons between our method and pure computer-based methods. The results show that our system can produce full-resolution saliency maps faster and more effectively.

  18. Inverse Magnetic Catalysis in Bottom-Up Holographic QCD

    CERN Document Server

    Evans, Nick; Scott, Marc

    2016-01-01

    We explore the effect of a magnetic field on chiral condensation in QCD via a simple bottom-up holographic model which inputs QCD dynamics through the running of the anomalous dimension of the quark bilinear. Bottom-up holography is a form of effective field theory, and we use it to explore the dependence on the coefficients of the two lowest-order terms linking the magnetic field and the quark condensate. In the massless theory, we identify a region of parameter space where magnetic catalysis occurs at zero temperature but inverse magnetic catalysis at temperatures of order the thermal phase transition. The model shows similar non-monotonic behaviour of the condensate with B at intermediate T as the lattice data. This behaviour is due to the separation of the meson melting and chiral transitions in the holographic framework. The introduction of quark mass raises the scale of B where inverse catalysis takes over from catalysis until the inverse catalysis lies outside the regime of validity of the effective descr...

  19. Making the results of bottom-up energy savings comparable

    Directory of Open Access Journals (Sweden)

    Moser Simon

    2012-01-01

    The Energy Service Directive (ESD) has pushed forward the issue of energy savings calculations without clarifying the methodological basis. Savings achieved in the Member States are calculated with rather non-transparent and hardly comparable bottom-up (BU) methods. This paper develops the idea of parallel evaluation tracks, separating the Member States' issue of ESD verification from comparable savings calculations. Comparability is ensured by developing a standardised BU calculation kernel for different energy efficiency improvement (EEI) actions which simultaneously depicts the different calculation options in a structured way (e.g. baseline definition, system boundaries, double counting). Due to the heterogeneity of BU calculations, the approach requires a central database where Member States feed in input data on BU actions according to a predefined structure. The paper demonstrates the proposed approach, including a concrete example of application.
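
    As a toy illustration of what a standardised bottom-up calculation kernel computes for a single EEI action, the function below multiplies the number of implemented units by the difference between a baseline and an efficient specific consumption, with a simple double-counting discount; the parameter names and the discount rule are assumptions, not the ESD methodology.

        # Toy bottom-up (BU) savings kernel for one energy efficiency improvement (EEI) action.
        # Parameter names and the double-counting discount are illustrative assumptions.
        def bu_savings_kwh(n_units, baseline_kwh_per_unit, efficient_kwh_per_unit,
                           years=1, overlap_share=0.0):
            unitary_saving = baseline_kwh_per_unit - efficient_kwh_per_unit
            gross = n_units * unitary_saving * years
            return gross * (1.0 - overlap_share)     # discount savings already claimed elsewhere

        # Example: 10,000 refrigerators replaced, 300 vs. 220 kWh/year, 10% overlap with another scheme.
        print(bu_savings_kwh(10_000, 300.0, 220.0, years=1, overlap_share=0.10))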

  20. Bottom-Up Discrete Symmetries for Cabibbo Mixing

    CERN Document Server

    Varzielas, Ivo de Medeiros; Talbert, Jim

    2016-01-01

    We perform a bottom-up search for discrete non-Abelian symmetries capable of quantizing the Cabibbo angle that parameterizes CKM mixing. Given a particular Abelian symmetry structure in the up and down sectors, we construct representations of the associated residual generators which explicitly depend on the degrees of freedom present in our effective mixing matrix. We then discretize those degrees of freedom and utilize the Groups, Algorithms, Programming (GAP) package to close the associated finite groups. This short study is performed in the context of recent results indicating that, without resorting to special model-dependent corrections, no small-order finite group can simultaneously predict all four parameters of the three-generation CKM matrix and that only groups of $\\mathcal{O}(10^{2})$ can predict the analogous parameters of the leptonic PMNS matrix, regardless of whether neutrinos are Dirac or Majorana particles. Therefore a natural model of flavour might instead incorporate small(er) finite groups...
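
    The group-closure step described above (performed in the paper with the GAP package) can be illustrated generically: starting from a set of generator matrices, keep multiplying elements until no new ones appear. The sketch below closes a toy group generated by a 90-degree rotation; the generators and rounding tolerance are illustrative, not the residual CKM generators of the paper.

        # Sketch: close a finite matrix group from its generators by repeated multiplication.
        import numpy as np

        def close_group(generators, tol=6, max_size=10_000):
            dim = generators[0].shape[0]
            def key(m):
                return tuple(np.round(m, tol).ravel())   # hashable, tolerance-rounded representative
            elements = {key(np.eye(dim)): np.eye(dim)}
            frontier = list(elements.values())
            while frontier:
                new_elems = []
                for a in frontier:
                    for g in generators:
                        p = a @ g
                        k = key(p)
                        if k not in elements:
                            elements[k] = p
                            new_elems.append(p)
                if len(elements) > max_size:
                    raise RuntimeError("group does not close within max_size elements")
                frontier = new_elems
            return list(elements.values())

        rot90 = np.array([[0.0, -1.0], [1.0, 0.0]])      # generates the cyclic group Z4
        print(len(close_group([rot90])))                 # -> 4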

  1. Spatiochromatic Context Modeling for Color Saliency Analysis.

    Science.gov (United States)

    Zhang, Jun; Wang, Meng; Zhang, Shengping; Li, Xuelong; Wu, Xindong

    2016-06-01

    Visual saliency is one of the most noteworthy perceptual abilities of human vision. Recent progress in cognitive psychology suggests that: 1) visual saliency analysis is mainly completed by the bottom-up mechanism consisting of feedforward low-level processing in primary visual cortex (area V1) and 2) color interacts with spatial cues and is influenced by the neighborhood context, and thus plays an important role in visual saliency analysis. From a computational perspective, most existing saliency modeling approaches exploit multiple independent visual cues, irrespective of their interactions (or without computing them explicitly), and ignore contextual influences induced by neighboring colors. In addition, the use of color is often underestimated in visual saliency analysis. In this paper, we propose a simple yet effective color saliency model that considers color as the only visual cue and mimics the color processing in V1. Our approach uses region-/boundary-defined color features with spatiochromatic filtering by considering local color-orientation interactions, and therefore captures homogeneous color elements, subtle textures within the object, and the overall salient object from the color image. To account for color contextual influences, we present a divisive normalization method for chromatic stimuli through the pooling of contrary/complementary color units. We further define a color perceptual metric over the entire scene to produce saliency maps for color regions and color boundaries individually. These maps are finally globally integrated into one single saliency map. The final saliency map is produced by Gaussian blurring for robustness. We evaluate the proposed method on both synthetic stimuli and several benchmark saliency data sets, from visual saliency analysis to salient object detection. The experimental results demonstrate that the use of color as a unique visual cue achieves competitive results on par with or better than 12 state...
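
    The divisive normalization referred to above is commonly written in the following generic form; the exponent, the semisaturation constant, and the pooling weights over contrary/complementary color units are model parameters, and this is the standard textbook expression rather than the paper's exact formulation:

        \begin{equation}
          R_i \;=\; \frac{E_i^{\,n}}{\sigma^{n} + \sum_{j \in \mathcal{N}(i)} w_{ij}\, E_j^{\,n}},
        \end{equation}

    where E_i is the linear (spatiochromatic filter) response of unit i, N(i) is the pool of neighboring/complementary color units, w_ij are the pooling weights, and sigma is a semisaturation constant.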

  2. On an elementary definition of visual saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Various approaches to computational modelling of bottom-up visual attention have been proposed in the past two decades. As part of this trend, researchers have studied ways to characterize the saliency map underlying many of these models. In more recent years, several definitions based on probabilistic and information or decision theoretic considerations have been proposed. These provide experimentally successful, appealing, low-level, operational, and elementary definitions of visual saliency (see eg, Bruce, 2005 Neurocomputing 65 125-133). Here, I demonstrate that, in fact, all these...
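
    One widely cited elementary, probabilistic definition of this kind (e.g., Bruce's self-information formulation) writes the saliency of a location as the negative log-probability of its local features; the notation below is a generic restatement of that idea, not necessarily the unifying definition argued for in this record:

        \begin{equation}
          S(x) \;=\; -\log p\big(\mathbf{f}(x) \mid C\big),
        \end{equation}

    where f(x) is the feature vector observed at location x and C is the surrounding context (or the statistics of the current scene), so rarer features are more salient.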

  3. Towards three-dimensional visual saliency

    OpenAIRE

    Sharma, Puneet

    2014-01-01

    A salient image region is defined as an image part that is clearly different from its surround in terms of a number of attributes. In bottom-up processing, these attributes are defined as: contrast, color difference, brightness, and orientation. By measuring these attributes, visual saliency algorithms aim to predict the regions in an image that would attract our attention under free viewing conditions, i.e., when the observer is viewing an image without a specific task such as searching for ...

  4. BUEES: a bottom-up event extraction system

    Institute of Scientific and Technical Information of China (English)

    Xiao DING; Bing QIN; Ting LIU

    2015-01-01

    Traditional event extraction systems focus mainly on event type identification and event participant extraction based on pre-specified event type paradigms and manually annotated corpora. However, different domains have different event type paradigms. When transferring to a new domain, we have to build a new event type paradigm and annotate a new corpus from scratch. This kind of conventional event extraction system requires massive human effort, and hence prevents event extraction from being widely applicable. In this paper, we present BUEES, a bottom-up event extraction system, which extracts events from the web in a completely unsupervised way. The system automatically builds an event type paradigm in the input corpus, and then proceeds to extract a large number of instance patterns of these events. Subsequently, the system extracts event arguments according to these patterns. By conducting a series of experiments, we demonstrate the good performance of BUEES and compare it to a state-of-the-art Chinese event extraction system, i.e., a supervised event extraction system. Experimental results show that BUEES performs comparably (5% higher F-measure in event type identification and 3% higher F-measure in event argument extraction), but without any human effort.

  5. Nonplanar conductive surfaces via "bottom-up" nanostructured gold coating.

    Science.gov (United States)

    Vinod, T P; Jelinek, Raz

    2014-03-12

    Development of technologies for the construction of bent, curved, and flexible conductive surfaces is among the most important albeit challenging goals in the promising field of "flexible electronics". We present a generic solution-based "bottom-up" approach for assembling conductive gold nanostructured layers on nonplanar polymer surfaces. The simple two-step experimental scheme is based upon incubation of an amine-displaying polymer [the abundantly used poly(dimethylsiloxane) (PDMS), selected here as a proof of concept] with Au(SCN)4(-), followed by a brief treatment with a conductive polymer [poly(3,4-ethylenedioxythiophene)/poly(styrenesulfonate)] solution. Importantly, no reducing agent is co-added to the gold complex solution. The resultant surfaces are conductive and exhibit a unique "nanoribbon" gold morphology. The scheme yields conductive layers upon PDMS in varied configurations: planar, "wrinkled", and mechanically bent surfaces. The technology is simple, inexpensive, and easy to implement for varied polymer surfaces (and other substances), opening the way for practical applications in flexible electronics and related fields. PMID:24548243

  6. Bottom-Up Synthesis and Sensor Applications of Biomimetic Nanostructures

    Directory of Open Access Journals (Sweden)

    Li Wang

    2016-01-01

    The combination of nanotechnology, biology, and bioengineering has greatly improved the development of nanomaterials with unique functions and properties. Biomolecules, as nanoscale building blocks, play very important roles in the final formation of functional nanostructures. Many kinds of novel nanostructures have been created by using bioinspired self-assembly and subsequent binding with various nanoparticles. In this review, we summarize studies on the fabrication and sensor applications of biomimetic nanostructures. The strategies for creating different bottom-up nanostructures by using biomolecules like DNA, proteins, peptides, and viruses, as well as microorganisms like bacteria and plant leaves, are introduced. In addition, the potential applications of the synthesized biomimetic nanostructures for colorimetry, fluorescence, surface plasmon resonance, surface-enhanced Raman scattering, electrical resistance, electrochemistry, and quartz crystal microbalance sensors are presented. This review will promote the understanding of relationships between biomolecules/microorganisms and functional nanomaterials on the one hand, and guide the design and synthesis of biomimetic nanomaterials with unique properties on the other.

  7. Bottom-up Visual Integration in the Medial Parietal Lobe.

    Science.gov (United States)

    Pflugshaupt, Tobias; Nösberger, Myriam; Gutbrod, Klemens; Weber, Konrad P; Linnebank, Michael; Brugger, Peter

    2016-03-01

    Largely based on findings from functional neuroimaging studies, the medial parietal lobe is known to contribute to internally directed cognitive processes such as visual imagery or episodic memory. Here, we present 2 patients with behavioral impairments that extend this view. Both had chronic unilateral lesions of nearly the entire medial parietal lobe, but in opposite hemispheres. Routine neuropsychological examination conducted >4 years after the onset of brain damage revealed only a few deficits of minor severity. In contrast, both patients reported persistent unusual visual impairment. A comprehensive series of tachistoscopic experiments with lateralized stimulus presentation and comparison with healthy participants revealed partial visual hemiagnosia for stimuli presented to their contralesional hemifield, applying inferential single-case statistics to evaluate deficits and dissociations. Double dissociations were found in 4 experiments during which participants had to integrate more than one visual element, either through comparison or formation of a global gestalt. Against the background of recent neuroimaging findings, we conclude that of all medial parietal structures, the precuneus is the most likely candidate for a crucial involvement in such bottom-up visual integration. PMID:25331599

  8. Bottom-Up Colloidal Crystal Assembly with a Twist.

    Science.gov (United States)

    Mahynski, Nathan A; Rovigatti, Lorenzo; Likos, Christos N; Panagiotopoulos, Athanassios Z

    2016-05-24

    Globally ordered colloidal crystal lattices have broad utility in a wide range of optical and catalytic devices, for example, as photonic band gap materials. However, the self-assembly of stereospecific structures is often confounded by polymorphism. Small free-energy differences often characterize ensembles of different structures, making it difficult to produce a single morphology at will. Current techniques to handle this problem adopt one of two approaches: that of the "top-down" or "bottom-up" methodology, whereby structures are engineered starting from the largest or smallest relevant length scales, respectively. However, recently, a third approach for directing high fidelity assembly of colloidal crystals has been suggested which relies on the introduction of polymer cosolutes into the crystal phase [Mahynski, N.; Panagiotopoulos, A. Z.; Meng, D.; Kumar, S. K. Nat. Commun. 2014, 5, 4472]. By tuning the polymer's morphology to interact uniquely with the void symmetry of a single desired crystal, the entropy loss associated with polymer confinement has been shown to strongly bias the formation of that phase. However, previously, this approach has only been demonstrated in the limiting case of close-packed crystals. Here, we show how this approach may be generalized and extended to complex open crystals, illustrating the utility of this "structure-directing agent" paradigm in engineering the nanoscale structure of ordered colloidal materials. The high degree of transferability of this paradigm's basic principles between relatively simple crystals and more complex ones suggests that this represents a valuable addition to presently known self-assembly techniques. PMID:27124487

  9. Saliency computation via whitened frequency band selection.

    Science.gov (United States)

    Lv, Qi; Wang, Bin; Zhang, Liming

    2016-06-01

    Many saliency computational models have been proposed to simulate the bottom-up visual attention mechanism of the human visual system. However, most of them only deal with certain kinds of images or aim at specific applications. In fact, human beings have the ability to correctly select attentive focuses of objects with arbitrary sizes within any scenes. This paper proposes a new bottom-up computational model from the perspective of the frequency domain, based on the biological discovery of the non-Classical Receptive Field (nCRF) in the retina. A saliency map can be obtained according to the idea of the Extended Classical Receptive Field. The model is composed of three major steps: first, decompose the input image, using Gabor wavelets, into several feature maps representing different frequency bands that together cover the whole frequency domain; second, whiten the feature maps to highlight the embedded saliency information; and third, select the optimal maps, simulating the response of the receptive field (especially the nCRF), to generate the saliency map. Experimental results show that the proposed algorithm performs stably and well in a variety of situations, as human beings do, and is adaptive to both psychological patterns and natural images. Beyond that, the biological plausibility of the nCRF and the Gabor wavelet transform makes this approach reliable. PMID:27275381
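
    To make the three-step pipeline above concrete, the following Python sketch (not the authors' implementation) follows the same outline: a small Gabor bank stands in for the wavelet decomposition, per-map spectral whitening stands in for the whitening step, and a kurtosis-based rule is a hypothetical stand-in for the nCRF-inspired map selection; all function names and parameter values are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.signal import fftconvolve

def gabor_kernel(size, sigma, theta, wavelength):
    """Real part of a zero-mean Gabor kernel with an isotropic envelope."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    k = np.exp(-(x**2 + y**2) / (2.0 * sigma**2)) * np.cos(2.0 * np.pi * xr / wavelength)
    return k - k.mean()

def whiten(feature_map, eps=1e-8):
    """Spectral whitening: flatten the amplitude spectrum, keep the phase."""
    spectrum = np.fft.fft2(feature_map)
    return np.real(np.fft.ifft2(spectrum / (np.abs(spectrum) + eps)))

def kurtosis(m):
    z = (m - m.mean()) / (m.std() + 1e-8)
    return float((z**4).mean())

def frequency_band_saliency(image, wavelengths=(4, 8, 16, 32), n_orient=4, keep=4):
    """Toy whitened-frequency-band saliency for a 2-D grayscale float image."""
    maps = []
    for lam in wavelengths:
        for i in range(n_orient):
            k = gabor_kernel(int(4 * lam) | 1, 0.5 * lam, i * np.pi / n_orient, lam)
            response = np.abs(fftconvolve(image, k, mode="same"))
            maps.append(whiten(response) ** 2)
    # Hypothetical selection rule: keep the most spatially concentrated maps.
    maps.sort(key=kurtosis, reverse=True)
    smap = gaussian_filter(sum(maps[:keep]), sigma=3)
    return (smap - smap.min()) / (smap.max() - smap.min() + 1e-8)
```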

  10. Acute alcohol effects on attentional bias are mediated by subcortical areas associated with arousal and salience attribution.

    Science.gov (United States)

    Nikolaou, Kyriaki; Field, Matt; Critchley, Hugo; Duka, Theodora

    2013-06-01

    Acute alcohol ingestion increases attentional bias to alcohol-related stimuli; however, the underlying cognitive and brain mechanisms remain unknown. We combined functional magnetic resonance imaging (fMRI) with performance of a dual task that probed attentional distraction by alcohol-related stimuli during 'conflict' processing: the Concurrent Flanker/Alcohol-Attentional bias task (CFAAT). In this task, an Eriksen Flanker task is superimposed on task-unrelated background pictures with alcohol-associated or neutral content. Participants respond to the direction of a central 'target' arrow and ignore adjacent congruent (low cognitive load) or incongruent (high cognitive load) 'flanking' arrows. Using a between-subject design, 40 healthy moderate-to-heavy social drinkers received either no alcohol (placebo), 0.4 g/kg (low dose), or 0.8 g/kg (high dose) of alcohol, and underwent fMRI while performing the CFAAT. The low alcohol dose, relative to placebo, increased response latencies on trials with alcohol-associated backgrounds and, under low cognitive load, increased the activity evoked by these pictures within a medial hypothalamic region. Under high cognitive load, the low alcohol dose, relative to placebo, elicited greater activity within a more lateral hypothalamic region, and reduced activity within frontal motor areas. The high alcohol dose, relative to placebo, did not reliably affect response latencies or neural responses to background images, but reduced overall accuracy under high cognitive load. This effect correlated with changes in reactivity within medial and dorsal prefrontal cortices. These data suggest that alcohol at a low dose primes attentional bias to alcohol-associated stimuli, an effect mediated by activation of subcortical hypothalamic areas implicated in arousal and salience attribution. PMID:23361162

  11. Visual saliency computations: mechanisms, constraints, and the effect of feedback.

    Science.gov (United States)

    Soltani, Alireza; Koch, Christof

    2010-09-22

    The primate visual system continuously selects spatially circumscribed regions, features or objects for further processing. These selection mechanisms--collectively termed selective visual attention--are guided by intrinsic, bottom-up and by task-dependent, top-down signals. While much psychophysical research has shown that overt and covert attention is partially allocated based on saliency-driven exogenous signals, it is unclear how this is accomplished at the neuronal level. Recent electrophysiological experiments in monkeys point to the gradual emergence of saliency signals when ascending the dorsal visual stream and to the influence of top-down attention on these signals. To elucidate the neural mechanisms underlying these observations, we construct a biologically plausible network of spiking neurons to simulate the formation of saliency signals in different cortical areas. We find that saliency signals are rapidly generated through lateral excitation and inhibition in successive layers of neural populations selective to a single feature. These signals can be improved by feedback from a higher cortical area that represents a saliency map. In addition, we show how top-down attention can affect the saliency signals by disrupting this feedback through its action on the saliency map. While we find that saliency computations require dominant slow NMDA currents, the signal rapidly emerges from successive regions of the network. In conclusion, using a detailed spiking network model we find biophysical mechanisms and limitations of saliency computations which can be tested experimentally. PMID:20861387

  12. The Galactic Center Excess from the Bottom Up

    CERN Document Server

    Izaguirre, Eder; Shuve, Brian

    2014-01-01

    It has recently been shown that dark-matter annihilation to bottom quarks provides a good fit to the galactic-center gamma-ray excess identified in the Fermi-LAT data. In the favored dark matter mass range $m\sim 30-40$ GeV, achieving the best-fit annihilation rate $\sigma v \sim 5\times 10^{-26}$ cm$^{3}$ s$^{-1}$ with perturbative couplings requires a sub-TeV mediator particle that interacts with both dark matter and bottom quarks. In this paper, we consider the minimal viable scenarios in which a Standard Model singlet mediates s-channel interactions only between dark matter and bottom quarks, focusing on axial-vector, vector, and pseudoscalar couplings. Using simulations that include on-shell mediator production, we show that existing sbottom searches currently offer the strongest sensitivity over a large region of the favored parameter space explaining the gamma-ray excess, particularly for axial-vector interactions. The 13 TeV LHC will be even more sensitive; however, it may not be sufficient to f...

  13. Saccade generation by the frontal eye fields in rhesus monkeys is separable from visual detection and bottom-up attention shift.

    Directory of Open Access Journals (Sweden)

    Kyoung-Min Lee

    Full Text Available The frontal eye fields (FEF), originally identified as an oculomotor cortex, have also been implicated in perceptual functions, such as constructing a visual saliency map and shifting visual attention. Further dissecting the area's role in the transformation from visual input to oculomotor command has been difficult because of spatial confounding between stimuli and responses and consequently between intermediate cognitive processes, such as attention shift and saccade preparation. Here we developed two tasks in which the visual stimulus and the saccade response were dissociated in space (the extended memory-guided saccade task), and bottom-up attention shift and saccade target selection were independent (the four-alternative delayed saccade task). Reversible inactivation of the FEF in rhesus monkeys disrupted, as expected, contralateral memory-guided saccades, but visual detection in the same field was demonstrated to be intact. Moreover, saccade behavior was impaired when a bottom-up shift of attention was not a prerequisite for saccade target selection, indicating that the inactivation effect was independent of the previously reported dysfunctions in bottom-up attention control. These findings underscore the motor aspect of the area's functions, especially in situations where saccades are generated by internal cognitive processes, including visual short-term memory and long-term associative memory.

  14. Fabricating ordered functional nanostructures onto polycrystalline substrates from the bottom-up

    Energy Technology Data Exchange (ETDEWEB)

    Torres, Maria, E-mail: mtorres@drexel.edu; Pardo, Lorena; Ricote, Jesus [Instituto de Ciencia de Materiales de Madrid (Spain); Fuentes-Cobas, Luis E. [Centro de Investigacion en Materiales Avanzados (Mexico); Rodriguez, Brian J. [University College Dublin, Belfield, School of Physics (Ireland); Calzada, M. Lourdes, E-mail: lcalzada@icmm.csic.es [Instituto de Ciencia de Materiales de Madrid (Spain)

    2012-10-15

    Microemulsion-mediated synthesis has emerged as a powerful bottom-up procedure for the preparation of ferroelectric nanostructures onto substrates. However, periodical order has yet to be achieved onto polycrystalline Pt-coated Si substrates. Here, we report a new methodology that involves microemulsion-mediated synthesis and the controlled modification of the surface of the substrate by coating it with a template-layer of water-micelles. This layer modifies the surface tension of the substrate and yields a periodic arrangement of ferroelectric crystalline nanostructures. The size of the nanostructures is decreased to the sub-50 nm range and they show a hexagonal order up to the third neighbors, which corresponds to a density of 275 Gb in⁻². The structural analysis of the nanostructures by synchrotron X-ray diffraction confirms that the nanostructures have a PbTiO₃ perovskite structure, with lattice parameters of a = b = 3.890(0) Å and c = 4.056(7) Å. Piezoresponse force microscopy confirmed the ferro-piezoelectric character of the nanostructures. This simple methodology is valid for the self-assembly of other functional oxides onto polycrystalline substrates, enabling their reliable integration into micro/nano devices.

  15. Mapping practices of project management – merging top-down and bottom-up perspectives

    DEFF Research Database (Denmark)

    Thuesen, Christian

    2015-01-01

    This paper presents a new methodology for studying different accounts of project management practices based on network mapping and analysis. Drawing upon network mapping and visualization as an analytical strategy, top-down and bottom-up accounts of project management practice are analysed and compared. The analysis initially reveals a substantial difference between the top-down and bottom-up accounts of practice. Furthermore it identifies a soft side of project management that is central in the bottom-up account but absent from the top-down. Finally, the study shows that network mapping is a promising strategy for visualizing and analysing different accounts of project management practices.

  16. Cooperation between Top-Down and Bottom-Up Theorem Provers by Subgoal Clause Transfer

    OpenAIRE

    Fuchs, Dirk

    1999-01-01

    Top-down and bottom-up theorem proving approaches each have specific advantages and disadvantages. Bottom-up provers profit from strong redundancy control and suffer from a lack of goal-orientation, whereas top-down provers are goal-oriented but have weak calculi when their proof lengths are considered. In order to integrate both approaches, our method is to achieve cooperation between a top-down and a bottom-up prover: the top-down prover generates subgoal clauses, then they are processed by a ...

  17. Saliency-Based Fidelity Adaptation Preprocessing for Video Coding

    Institute of Scientific and Technical Information of China (English)

    Shao-Ping Lu; Song-Hai Zhang

    2011-01-01

    In this paper, we present a video coding scheme which applies the technique of visual saliency computation to adjust image fidelity before compression. To extract visually salient features, we construct a spatio-temporal saliency map by analyzing the video using a combined bottom-up and top-down visual saliency model. We then use an extended bilateral filter, in which the local intensity and spatial scales are adjusted according to visual saliency, to adaptively alter the image fidelity. Our implementation is based on the H.264 video encoder JM12.0. Besides evaluating our scheme with the H.264 reference software, we also compare it to a more traditional foreground-background segmentation-based method and a foveation-based approach which employs Gaussian blurring. Our results show that the proposed algorithm can improve the compression ratio significantly while effectively preserving perceptual visual quality.
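
    The core idea of the scheme, filtering less where saliency is high and more where it is low, can be sketched as follows. This is a simplification under stated assumptions, not the paper's JM12.0-integrated implementation: blending two fixed-parameter bilateral filters stands in for a bilateral filter whose intensity and spatial scales vary continuously with saliency, and the function name and filter settings are illustrative.

```python
import cv2
import numpy as np

def saliency_adaptive_prefilter(frame_bgr, saliency_map):
    """Blend a mild and a strong bilateral filter according to a per-pixel
    saliency map in [0, 1]: salient regions keep detail, non-salient regions
    are smoothed more aggressively before being handed to the encoder."""
    mild = cv2.bilateralFilter(frame_bgr, 5, 20, 20)
    strong = cv2.bilateralFilter(frame_bgr, 9, 75, 75)
    w = saliency_map.astype(np.float32)[..., None]   # H x W x 1 blending weight
    out = w * mild.astype(np.float32) + (1.0 - w) * strong.astype(np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)
```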

  18. Social and ethical checkpoints for bottom-up synthetic biology, or protocells.

    Science.gov (United States)

    Bedau, Mark A; Parke, Emily C; Tangen, Uwe; Hantsche-Tangen, Brigitte

    2009-12-01

    An alternative to creating novel organisms through the traditional "top-down" approach to synthetic biology involves creating them from the "bottom up" by assembling them from non-living components; the products of this approach are called "protocells." In this paper we describe how bottom-up and top-down synthetic biology differ, review the current state of protocell research and development, and examine the unique ethical, social, and regulatory issues raised by bottom-up synthetic biology. Protocells have not yet been developed, but many expect this to happen within the next five to ten years. Accordingly, we identify six key checkpoints in protocell development at which particular attention should be given to specific ethical, social and regulatory issues concerning bottom-up synthetic biology, and make ten recommendations for responsible protocell science that are tied to the achievement of these checkpoints. PMID:19816801

  19. The updated bottom up solution applied to atmospheric pressure photoionization and electrospray ionization mass spectrometry

    Science.gov (United States)

    The Updated Bottom Up Solution (UBUS) was recently applied to atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) of triacylglycerols (TAGs). This report demonstrates that the UBUS applies equally well to atmospheric pressure photoionization (APPI) MS and to electrospray ionizatio...

  20. A Bottom up Initiative: Meditation & Mindfulness 'Eastern' Practices in the "Western" Academia

    DEFF Research Database (Denmark)

    Singla, Rashmi

    case of bottom up initiative, where the students themselves have demanded inclusion of non-conventional psychosocial interventions illustrated by meditation and mindfulness as Eastern psychological practices, thus filling the gap related to the existential, spiritual approaches. The western...

  1. the effect of intergroup threat and social identity salience on the belief in conspiracy theories over terrorism in indonesia: collective angst as a mediator

    Directory of Open Access Journals (Sweden)

    Ali Mashuri

    2015-01-01

    Full Text Available The present study tested how intergroup threat (high versus low) and social identity as a Muslim (salient versus non-salient) affected belief in conspiracy theories. Data from Indonesian Muslim students (N = 139) in this study demonstrated that intergroup threat and social identity salience interacted to influence belief in conspiracy theories. High intergroup threat triggered greater belief in conspiracy theories than low intergroup threat, more prominently in the condition in which participants’ Muslim identity was made salient. Collective angst also proved to mediate the effect of intergroup threat on the belief. However, in line with the prediction, evidence of this mediation effect of collective angst was found only in the salient social identity condition. The discussion of these findings addresses both theoretical and practical implications.

  2. The Chicago Fire of 1871: A Bottom Up Approach to Disaster Relief

    OpenAIRE

    Emily C. Skarbek

    2014-01-01

    Can bottom-up relief efforts lead to recovery after disasters? Conventional wisdom and contemporary public policy suggest that major crises require centralized authority to provide disaster relief goods. Using a novel set of comprehensive donation and expenditure data collected from archival records, this paper examines a bottom-up relief effort following one of the most devastating natural disasters of the nineteenth century: the Chicago Fire of 1871. Findings show that while there was no ce...

  3. A Critique of 'Bottom-up' Peacebuilding: Do Peaceful Individuals Make Peaceful Societies?

    OpenAIRE

    Lefranc, Sandrine

    2011-01-01

    This chapter is concerned with dialogue-based post-conflict practices currently being promoted vigorously by certain international organizations. On the one hand, there is "transitional justice" with its cornerstone forums, deemed truth and reconciliation commissions. On the other hand, there are various "bottom-up" peacebuilding techniques (local dialogues, coexistence programs, conflict resolution training, and so on). Generally speaking, these "bottom-up" approaches work to transform indiv...

  4. Mapping practices of project management – merging top-down and bottom-up perspectives

    OpenAIRE

    Thuesen, Christian

    2015-01-01

    This paper presents a new methodology for studying different accounts of project management practices based on network mapping and analysis. Drawing upon network mapping and visualization as an analytical strategy top-down and bottom-up accounts of project management practice are analysed and compared. The analysis initially reveals a substantial difference between the top-down and bottom-up accounts of practice. Furthermore it identifies a soft side of project management that is central in t...

  5. Methane and ethane from global oil and gas production: bottom-up simulations over three decades

    OpenAIRE

    Höglund-Isaksson, L.

    2016-01-01

    Existing bottom-up emission inventories of historical methane and ethane emissions from global oil and gas systems do not adequately explain the year-on-year variations estimated by top-down models from atmospheric measurements. This paper develops a bottom-up methodology which allows for country- and year-specific source attribution of methane and ethane emissions from global oil and natural gas production for the period 1980 to 2012. The analysis rests on country-specific simulations of associated ga...

  6. Bottoms Up

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    China’s high-end liquor is becoming a luxury item and a favorite among collectors. Spring Festival, the most important festival for the Chinese, is a time for celebration—and what would a celebration be without bottles of holi-

  7. Visual Saliency Computations: Mechanisms, Constraints, and the Effect of Feedback

    OpenAIRE

    Soltani, Alireza; Koch, Christof

    2010-01-01

    The primate visual system continuously selects spatially circumscribed regions, features or objects for further processing. These selection mechanisms—collectively termed selective visual attention—are guided by intrinsic, bottom-up and by task-dependent, top-down signals. While much psychophysical research has shown that overt and covert attention is partially allocated based on saliency-driven exogenous signals, it is unclear how this is accomplished at the neuronal level. Recent electrophysiolo...

  8. High thermoelectric figure of merit nanostructured pnictogen chalcogenides by bottom-up synthesis and assembly

    Science.gov (United States)

    Mehta, Rutvik J.

    Thermoelectric materials offer promise for realizing transformative environmentally friendly solid-state refrigeration technologies that could replace current technologies based on ozone-depleting liquid coolants. The fruition of this vision requires factorial enhancements in the figure of merit (ZT) of thermoelectric materials, necessitating high Seebeck coefficient (alpha), high electrical conductivity (sigma) and low thermal conductivity (kappa). This thesis reports a novel bottom-up approach to scalably sculpt large quantities (>10 g/minute) of V2VI3 nanocrystals with controllable shapes and sizes, and assemble them into bulk samples to obtain both high power factors (alpha^2 sigma) as well as unprecedentedly low kappa through tunable doping and nanostructuring. The thesis demonstrates a surfactant-mediated microwave-solvothermal synthesis technique that selectively yields both n- and p-typed pnictogen chalcogenide (Bi2Te3, Sb2Te3, Bi2Se3) nanoplates, as well as nanowires and nanotubes (Sb2Se3), that can be sintered to obtain 25-250 % increases in ZT>1 compared to their non-nanostructured and un-doped counterparts. A key result is that nanostructuring diminishes the lattice thermal conductivity kappa_L to ultra-low values of 0.2-0.5 W m⁻¹ K⁻¹. Sub-atomic-percent sulfur doping and sulfurization of the pnictogen chalcogenides induced through mercaptan-terminated organic surfactants used in the synthesis result in large Seebeck coefficients between -240 nanocomposites by mixing nanoplates of different materials (e.g., S-doped Sb2Te3 and S-doped Bi2Te3) and forming heterostructures of metals and chalcogenides. The thesis finally demonstrates the extendibility of the novel synthesis and assembly approach to tailor the thermoelectric properties of other non-traditional thermoelectric materials systems.

  9. Bottom-up learning of hierarchical models in a class of deterministic POMDP environments

    Directory of Open Access Journals (Sweden)

    Itoh Hideaki

    2015-09-01

    Full Text Available The theory of partially observable Markov decision processes (POMDPs) is a useful tool for developing various intelligent agents, and learning hierarchical POMDP models is one of the key approaches for building such agents when the environments of the agents are unknown and large. To learn hierarchical models, bottom-up learning methods in which learning takes place in a layer-by-layer manner from the lowest to the highest layer are already extensively used in some research fields such as hidden Markov models and neural networks. However, little attention has been paid to bottom-up approaches for learning POMDP models. In this paper, we present a novel bottom-up learning algorithm for hierarchical POMDP models and prove that, by using this algorithm, a perfect model (i.e., a model that can perfectly predict future observations) can be learned at least in a class of deterministic POMDP environments.

  10. Integrated Assessment of Energy Policies: A Decomposition of Top-Down and Bottom-Up

    Energy Technology Data Exchange (ETDEWEB)

    Boehringer, Christoph (Univ. of Oldenburg (Germany)); Rutherford, Thomas F. (ETH Zuerich (Switzerland))

    2008-01-15

    The formulation of market equilibrium problems as mixed complementarity problems (MCP) permits integration of bottom-up programming models of the energy system into top-down general equilibrium models of the overall economy. Yet, in practice the MCP approach loses analytical tractability of income effects when the energy system includes upper and lower bounds on many decision variables. We therefore advocate the use of complementarity methods to solve only the top-down economic equilibrium model and employ quadratic programming to solve the underlying bottom-up energy supply model. A simple iterative procedure reconciles the equilibrium prices and quantities between both models.
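
    The flavour of such an iterative price/quantity reconciliation can be illustrated with a deliberately tiny toy model (all technology and demand numbers below are hypothetical, and a simple closed-form supply function replaces the quadratic program, so this is a sketch of the decomposition idea rather than the authors' algorithm):

```python
import numpy as np

# Toy bottom-up supply side: three technologies with linear marginal costs
# mc_i(q) = c_i + a_i * q (all numbers hypothetical, for illustration only).
c = np.array([20.0, 35.0, 50.0])   # marginal-cost intercepts ($/MWh)
a = np.array([0.05, 0.02, 0.01])   # marginal-cost slopes

def bottom_up_supply(price):
    """Cost-minimizing dispatch: each technology produces up to the point
    where its marginal cost equals the energy price."""
    return np.maximum(0.0, (price - c) / a)

# Toy top-down demand side: constant-elasticity demand curve.
D0, p0, elasticity = 1500.0, 40.0, 0.3

def top_down_demand(price):
    return D0 * (price / p0) ** (-elasticity)

# Iterative reconciliation of prices and quantities between the two models.
price, step = 40.0, 0.01
for _ in range(1000):
    supply = bottom_up_supply(price).sum()
    demand = top_down_demand(price)
    gap = demand - supply
    if abs(gap) < 1e-6:
        break
    price += step * gap            # raise the price when demand exceeds supply

print(f"equilibrium price ~ {price:.2f} $/MWh, quantity ~ {demand:.1f} MWh")
```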

  11. Increased performance in a bottom-up designed robot by experimentally guided redesign

    DEFF Research Database (Denmark)

    Larsen, Jørgen Christian

    2013-01-01

    Purpose – Using a bottom-up, model-free approach when building robots is often seen as a less scientific way, compared to a top-down model-based approach, because the results are not easily generalizable to other systems. The authors, however, hypothesize that this problem may be addressed by using...... solid experimental methods. The purpose of this paper is to show how well-known experimental methods from bio-mechanics are used to measure and locate weaknesses in a bottom-up, model-free implementation of a quadruped walker and come up with a better solution. Design/methodology/approach – To study the...

  12. Fast and bottom-up object detection, segmentation, and evaluation using gestalt principles

    OpenAIRE

    Kootstra G.; Kragic D.

    2011-01-01

    In many scenarios, a domestic robot will regularly encounter unknown objects. In such cases, top-down knowledge about the object for detection, recognition, and classification cannot be used. To learn about the object, or to be able to grasp it, bottom-up object segmentation is an important competence for the robot. Also when there is top-down knowledge, prior segmentation of the object can improve recognition and classification. In this paper, we focus on the problem of bottom-up detection and...

  13. TMS-induced theta phase synchrony reveals a bottom-up network in working memory.

    Science.gov (United States)

    Miyauchi, Eri; Kitajo, Keiichi; Kawasaki, Masahiro

    2016-05-27

    Global theta phase synchronization between the frontal and sensory areas has been suggested to connect the relevant areas for executive processes of working memory (WM). However, little is known regarding the network directionality (i.e. top-down or bottom-up) of this interaction. To address this issue, the present study conducted a combined transcranial magnetic stimulation (TMS) and electroencephalography (EEG) experiment during WM tasks. Results showed that TMS-induced increases in theta phase synchronization were observed only when TMS was delivered to the sensory areas but not the frontal area. These findings suggest that the network directionality represented in WM is bottom-up rather than top-down. PMID:27063284

  14. Social and ethical checkpoints for bottom-up synthetic biology, or protocells

    OpenAIRE

    Bedau M.A.; Parke E.C.; Tangen U.; Hantsche-Tangen B.

    2009-01-01

    An alternative to creating novel organisms through the traditional “top-down” approach to synthetic biology involves creating them from the “bottom up” by assembling them from non-living components; the products of this approach are called “protocells.” In this paper we describe how bottom-up and top-down synthetic biology differ, review the current state of protocell research and development, and examine the unique ethical, social, and regulatory issues raised by bottom-up synthetic biology....

  15. A bottom-up model to describe consumers’ preferences towards late season peaches

    OpenAIRE

    Etiénne Groot; Luis M. Albisu

    2015-01-01

    Peaches have been consumed in Mediterranean countries since ancient times. Nowadays there are few areas in Europe that produce peaches with Protected Designation of Origin (PDO), and the Calanda area is one of them. The aim of this work is to describe consumers’ preferences towards late season PDO Calanda peaches in the city of Zaragoza, Spain, by means of a bottom-up model. The bottom-up model provides a greater amount of information than top-down models. In this approach it is estimated one utility function p...

  16. Bottom-up or top-down in dream neuroscience? A top-down critique of two bottom-up studies.

    Science.gov (United States)

    Foulkes, David; Domhoff, G William

    2014-07-01

    Recent neuroscientific studies of dreaming, specifically those in relation to waking sensory-motor impairments, but also more generally, betray a faulty understanding of the sort of process that dreaming is. They adhere to the belief that dreaming is a bottom-up phenomenon, whose form and content is dictated by sensory-motor brain stem activity, rather than a top-down process initiated and controlled by higher-level cognitive systems. But empirical data strongly support the latter alternative, and refute the conceptualization and interpretation of recent studies of dreaming in sensory-motor impairment in particular and of recent dream neuroscience in general. PMID:24905546

  17. A proto-object-based computational model for visual saliency.

    Science.gov (United States)

    Yanulevskaya, Victoria; Uijlings, Jasper; Geusebroek, Jan-Mark; Sebe, Nicu; Smeulders, Arnold

    2013-01-01

    State-of-the-art bottom-up saliency models often assign high saliency values at or near high-contrast edges, whereas people tend to look within the regions delineated by those edges, namely the objects. To resolve this inconsistency, in this work we estimate saliency at the level of coherent image regions. According to object-based attention theory, the human brain groups similar pixels into coherent regions, which are called proto-objects. The saliency of these proto-objects is estimated and incorporated together. As usual, attention is given to the most salient image regions. In this paper we employ state-of-the-art computer vision techniques to implement a proto-object-based model for visual attention. Particularly, a hierarchical image segmentation algorithm is used to extract proto-objects. The two most powerful ways to estimate saliency, rarity-based and contrast-based saliency, are generalized to assess the saliency at the proto-object level. The rarity-based saliency assesses if the proto-object contains rare or outstanding details. The contrast-based saliency estimates how much the proto-object differs from the surroundings. However, not all image regions with high contrast to the surroundings attract human attention. We take this into account by distinguishing between external and internal contrast-based saliency. Where the external contrast-based saliency estimates the difference between the proto-object and the rest of the image, the internal contrast-based saliency estimates the complexity of the proto-object itself. We evaluate the performance of the proposed method and its components on two challenging eye-fixation datasets (Judd, Ehinger, Durand, & Torralba, 2009; Subramanian, Katti, Sebe, Kankanhalli, & Chua, 2010). The results show the importance of rarity-based and both external and internal contrast-based saliency in fixation prediction. Moreover, the comparison with state-of-the-art computational models for visual saliency demonstrates the
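
    A minimal region-level sketch in this spirit (assuming a precomputed segmentation such as SLIC or a hierarchical segmentation, and using simple colour statistics rather than the paper's features) scores each proto-object by external contrast, internal contrast, and rarity; the function name and scoring details are illustrative, not the authors' model:

```python
import numpy as np

def proto_object_saliency(image, labels):
    """Region-level saliency sketch. `image` is an H x W x 3 float array in
    [0, 1]; `labels` is an H x W integer map with one id per proto-object."""
    h, w, _ = image.shape
    flat = image.reshape(-1, 3)
    lab = labels.ravel()
    ids = np.unique(lab)

    means = np.stack([flat[lab == i].mean(axis=0) for i in ids])
    sizes = np.array([(lab == i).sum() for i in ids], dtype=float)
    weights = sizes / sizes.sum()

    # External contrast: distance of a proto-object's mean colour from the
    # size-weighted mean colours of the other proto-objects.
    external = np.array([np.sum(weights * np.linalg.norm(means - means[k], axis=1))
                         for k in range(len(ids))])

    # Internal contrast: colour variability inside the proto-object itself.
    internal = np.array([flat[lab == i].std(axis=0).mean() for i in ids])

    # Rarity: proto-objects whose mean colour is uncommon over the whole
    # image receive a high score.
    rarity = np.array([np.mean(np.linalg.norm(flat - means[k], axis=1))
                       for k in range(len(ids))])

    def norm(x):
        return (x - x.min()) / (x.max() - x.min() + 1e-8)

    score = norm(external) + norm(internal) + norm(rarity)
    smap = np.zeros(h * w)
    for k, i in enumerate(ids):
        smap[lab == i] = score[k]
    return smap.reshape(h, w)
```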

  18. An integrated top-down and bottom-up strategy for characterization of protein isoforms and modifications

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Si; Tolic, Nikola; Tian, Zhixin; Robinson, Errol W.; Pasa-Tolic, Ljiljana

    2011-04-15

    Bottom-up and top-down strategies are two commonly used methods for mass spectrometry (MS) based protein identification; each method has its own advantages and disadvantages. In this chapter, we describe an integrated top-down and bottom-up approach facilitated by concurrent liquid chromatography-mass spectrometry (LC-MS) analysis and fraction collection for comprehensive high-throughput intact protein profiling. The approach employs a high resolution reversed phase (RP) LC separation coupled with LC eluent fraction collection and concurrent on-line MS with a high field (12 Tesla) Fourier-transform ion cyclotron resonance (FTICR) mass spectrometer. Protein elution profiles and tentative modified protein identifications are made using the detected intact protein mass in conjunction with bottom-up protein identifications from the enzymatic digestion and analysis of corresponding LC fractions. Specific proteins of biological interest are incorporated into a target ion list for subsequent off-line gas-phase fragmentation that uses an aliquot of the original collected LC fraction, an aliquot of which was also used for bottom-up analysis.

  19. Bottom-Up Molecular Tunneling Junctions Formed by Self-Assembly

    NARCIS (Netherlands)

    Zhang, Yanxi; Zhao, Zhiyuan; Fracasso, Davide; Chiechi, Ryan C

    2014-01-01

    This Minireview focuses on bottom-up molecular tunneling junctions - a fundamental component of molecular electronics - that are formed by self-assembly. These junctions are part of devices that, in part, fabricate themselves, and therefore, are particularly dependent on the chemistry of the molecul

  20. Ways toward a European Vocational Education and Training Space: A "Bottom-Up" Approach

    Science.gov (United States)

    Blings, Jessica; Spottl, Georg

    2008-01-01

    Purpose: This paper seeks to concentrate on bottom-up approaches in order to promote a European vocational education and training (VET) concept. The overall aim of this article is to demonstrate that sophisticated approaches still have a chance of becoming common practice in European countries. Design/methodology/approach: The centre of the…

  1. Top-Down or Bottom-Up? Coping with Territorial Fragmentation in the Czech Republic

    Czech Academy of Sciences Publication Activity Database

    Illner, Michal

    Houndmills, Basingstoke: PALGRAVE MACMILLAN, 2010 - (Baldersheim, H.; Lawrence, E.), s. 214-233 ISBN 978-0-230-23333-1 R&D Projects: GA MŠk 2D06006 Institutional research plan: CEZ:AV0Z70280505 Keywords : Czech Republic * municipalities * bottom-up consolidation Subject RIV: AO - Sociology, Demography

  2. Conciliating TOP-DOWN and BOTTOM-UP approaches in Websites quality evaluation

    OpenAIRE

    Biscoglio, Isabella; Trentanni, Gianluca

    2007-01-01

    Websites are the most important media of our times. Consequently, a method that allows us to better evaluate website quality is priceless. In this paper, two opposite approaches to website evaluation, namely "bottom-up" and "top-down", are compared, and a hypothesis of how they might meet in the middle is presented.

  3. Bottom-up control of water hyacinth weevil populations: Do the plants regulate the insects?

    Science.gov (United States)

    A key measure of dietary sufficiency relates to an insect’s reproductive ability so oögenesis, a nutrient-limited process, can be subject to bottom-up regulation. We hypothesized that aquatic nutrient flux seasonally affects ovarian development thereby controlling population growth of two specialis...

  4. Nanoelectronics: the Hall Effect and Measurement of Electrochemical Potentials by «Bottom-Up» Approach

    OpenAIRE

    Yu.A. Kruglyak; P.A. Kondratenko; Yu.М. Lopatkin

    2015-01-01

    Classical and quantum Hall effects, measurement of electrochemical potentials, the Landauer formulas and Buttiker formula, measurement of Hall potential, an account of magnetic field in the NEGF method, quantum Hall effect, Landau method, and edge states in graphene are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  5. Oriented bottom-up growth of armchair graphene nanoribbons on germanium

    Energy Technology Data Exchange (ETDEWEB)

    Arnold, Michael Scott; Jacobberger, Robert Michael

    2016-03-15

    Graphene nanoribbon arrays, methods of growing graphene nanoribbon arrays and electronic and photonic devices incorporating the graphene nanoribbon arrays are provided. The graphene nanoribbons in the arrays are formed using a scalable, bottom-up, chemical vapor deposition (CVD) technique in which the (001) facet of the germanium is used to orient the graphene nanoribbon crystals along the [110] directions of the germanium.

  6. Nanoelectronics: the Hall Effect and Measurement of Electrochemical Potentials by «Bottom-Up» Approach

    Directory of Open Access Journals (Sweden)

    Yu.A. Kruglyak

    2015-06-01

    Full Text Available Classical and quantum Hall effects, measurement of electrochemical potentials, the Landauer formulas and Buttiker formula, measurement of Hall potential, an account of magnetic field in the NEGF method, quantum Hall effect, Landau method, and edge states in graphene are discussed in the frame of the «bottom-up» approach of modern nanoelectronics.

  7. How Adolescents Comprehend Unfamiliar Proverbs: The Role of Top-Down and Bottom-Up Processes.

    Science.gov (United States)

    Nippold, Marilyn A.; Allen, Melissa M.; Kirsch, Dixon I.

    2000-01-01

    Relationships between word knowledge and proverb comprehension were examined in 150 typically achieving adolescents (ages 12, 15, and 18). Word knowledge was associated with proverb comprehension in all groups, particularly in the case of abstract proverbs. Results support a model of proverb comprehension in adolescents that includes bottom-up in…

  8. Co-financing of bottom-up approaches towards Broadband Infrastructure Development

    DEFF Research Database (Denmark)

    Williams, Idongesit

    2016-01-01

    networks – leading to the demise of some of these initiatives. This paper proposes co-financing of these networks as a means of sustaining the bottom-up Broadband network. The argument of this paper is anchored in two developing-country cases, one in India and the other in Ghana. One survived with

  9. Learning affects top down and bottom up modulation of eye movements in decision making

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund; Bagger, Martin; Mueller Loose, Simone

    2013-01-01

    different information presentation formats. We thereby operationalized top down and bottom up control as the effect of individual utility levels and presentation formats on attention capture on a trial-by-trial basis. The experiment revealed an increase in top down control of eye movements over time...

  10. Learning affects top down and bottom up modulation of eye movements in decision making

    Directory of Open Access Journals (Sweden)

    Jacob L. Orquin

    2013-11-01

    Full Text Available Repeated decision making is subject to changes over time such as decreases in decision time and information use and increases in decision accuracy. We show that a traditional strategy selection view of decision making cannot account for these temporal dynamics without relaxing main assumptions about what defines a decision strategy. As an alternative view we suggest that temporal dynamics in decision making are driven by attentional and perceptual processes and that this view has been expressed in the information reduction hypothesis. We test the information reduction hypothesis by integrating it in a broader framework of top down and bottom up processes and derive the predictions that repeated decisions increase top down control of attention capture which in turn leads to a reduction in bottom up attention capture. To test our hypotheses we conducted a repeated discrete choice experiment with three different information presentation formats. We thereby operationalized top down and bottom up control as the effect of individual utility levels and presentation formats on attention capture on a trial-by-trial basis. The experiment revealed an increase in top down control of eye movements over time and that decision makers learn to attend to high utility stimuli and ignore low utility stimuli. We furthermore find that the influence of presentation format on attention capture reduces over time indicating diminishing bottom up control.

  11. A photofunctional bottom-up bis(dipyrrinato)zinc(II) complex nanosheet

    Science.gov (United States)

    Sakamoto, Ryota; Hoshiko, Ken; Liu, Qian; Yagi, Toshiki; Nagayama, Tatsuhiro; Kusaka, Shinpei; Tsuchiya, Mizuho; Kitagawa, Yasutaka; Wong, Wai-Yeung; Nishihara, Hiroshi

    2015-04-01

    Two-dimensional polymeric nanosheets have recently gained much attention, particularly top-down nanosheets such as graphene and metal chalcogenides originating from bulk-layered mother materials. Although molecule-based bottom-up nanosheets manufactured directly from molecular components can exhibit greater structural diversity than top-down nanosheets, the bottom-up nanosheets reported thus far lack useful functionalities. Here we show the design and synthesis of a bottom-up nanosheet featuring a photoactive bis(dipyrrinato)zinc(II) complex motif. A liquid/liquid interfacial synthesis between a three-way dipyrrin ligand and zinc(II) ions results in a multi-layer nanosheet, whereas an air/liquid interfacial reaction produces a single-layer or few-layer nanosheet with domain sizes of >10 μm on one side. The bis(dipyrrinato)zinc(II) metal complex nanosheet is easy to deposit on various substrates using the Langmuir-Schäfer process. The nanosheet deposited on a transparent SnO2 electrode functions as a photoanode in a photoelectric conversion system, and is thus the first photofunctional bottom-up nanosheet.

  12. Oral bioavailability and pharmacodynamic activity of hesperetin nanocrystals generated using a novel bottom-up technology

    DEFF Research Database (Denmark)

    Shete, Ganesh; Pawar, Yogesh B; Thanki, Kaushik; Jain, Sanyog; Bansal, Arvind Kumar

    2015-01-01

    In the present study, nanocrystalline solid dispersion (NSD) was developed to enhance the release rate and oral bioavailability of hesperetin (HRN). NSD of HRN was prepared using a novel bottom-up technology platform. It is a spray drying based technology to generate solid particles, containing...

  13. Integration of top-down and bottom-up information for audio organization and retrieval

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand

    sources based on latent Dirichlet allocation (LDA). The model is used to integrate bottom-up features (reflecting timbre, loudness, tempo and chroma), meta-data aspects (lyrics) and top-down aspects, namely user generated open vocabulary tags. The model and representation is evaluated on the auxiliary...

  14. Spectral saliency via automatic adaptive amplitude spectrum analysis

    Science.gov (United States)

    Wang, Xiaodong; Dai, Jialun; Zhu, Yafei; Zheng, Haiyong; Qiao, Xiaoyan

    2016-03-01

    Suppressing nonsalient patterns by smoothing the amplitude spectrum at an appropriate scale has been shown to effectively detect the visual saliency in the frequency domain. Different filter scales are required for different types of salient objects. We observe that the optimal scale for smoothing the amplitude spectrum shares a specific relation with the size of the salient region. Based on this observation and the bottom-up saliency detection characterized by spectrum scale-space analysis for natural images, we propose to detect visual saliency, especially with salient objects of different sizes and locations, via automatic adaptive amplitude spectrum analysis. We not only provide a new criterion for automatic optimal scale selection but also reserve the saliency maps corresponding to different salient objects with meaningful saliency information by adaptive weighted combination. Performance is evaluated through quantitative and qualitative comparisons using three different kinds of metrics on the four most widely used datasets and one up-to-date large-scale dataset. The experimental results validate that our method outperforms the existing state-of-the-art saliency models for predicting human eye fixations in terms of accuracy and robustness.
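
    The general recipe, smoothing the amplitude spectrum at some scale while keeping the phase and then choosing the scale automatically, can be sketched as below. This illustrates the family of methods rather than the exact model or selection criterion proposed here; the entropy-based rule, function names, and all parameter values are assumptions.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def spectral_saliency(image, scale):
    """Saliency from an amplitude spectrum smoothed at a single scale,
    reconstructed with the original phase. `image` is a 2-D grayscale array."""
    f = np.fft.fft2(image)
    amplitude, phase = np.abs(f), np.angle(f)
    # Smoothing the (centred) amplitude spectrum suppresses the sharp spectral
    # peaks produced by repeated, non-salient patterns.
    smoothed = np.fft.ifftshift(gaussian_filter(np.fft.fftshift(amplitude), sigma=scale))
    recon = np.fft.ifft2(smoothed * np.exp(1j * phase))
    smap = gaussian_filter(np.abs(recon) ** 2, sigma=3)
    return (smap - smap.min()) / (smap.max() - smap.min() + 1e-8)

def select_scale(image, scales=(1, 2, 4, 8, 16)):
    """Illustrative automatic scale selection: keep the scale whose saliency
    map has the lowest histogram entropy (most concentrated response)."""
    def entropy(m):
        hist, _ = np.histogram(m, bins=64, range=(0.0, 1.0), density=True)
        hist = hist[hist > 0]
        return float(-np.sum(hist * np.log(hist)))
    maps = {s: spectral_saliency(image, s) for s in scales}
    best = min(maps, key=lambda s: entropy(maps[s]))
    return best, maps[best]
```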

  15. The life cycle of bottom-up ideas : case studies of the companies where the simulation game method was applied

    OpenAIRE

    Forssén, Minna

    2002-01-01

    The main aim of this thesis is to study the life cycle of incremental "bottom-up" ideas, which concern process and organizational matters. According to earlier studies, bottom-up ideas are not always successfully used and managed, and there is also a need for more research on organizational and process innovations. It is therefore useful to study this phenomenon further and gain more information about how organizations manage the development and implementation of these bottom-up ideas. ...

  16. The generation of myricetin-nicotinamide nanococrystals by top down and bottom up technologies.

    Science.gov (United States)

    Liu, Mingyu; Hong, Chao; Li, Guowen; Ma, Ping; Xie, Yan

    2016-09-30

    Myricetin-nicotinamide (MYR-NIC) nanococrystal preparation methods were developed and optimized using both top down and bottom up approaches. The grinding (top down) method successfully achieved nanococrystals, but there were some micrometer-range particles and aggregation. The key consideration of the grinding technology was to control the milling time to strike a balance between particle size and size distribution. In contrast, a modified bottom up approach based on a solution method in conjunction with sonochemistry resulted in a uniform MYR-NIC nanococrystal that was confirmed by powder x-ray diffraction, scanning electron microscopy, dynamic light scattering, and differential scanning calorimetry, and the particle dissolution rate and amount were significantly greater than those of the MYR-NIC cocrystal. Notably, this was a simple method without the addition of any non-solvent. We anticipate our findings will provide some guidance for future nanococrystal preparation as well as its application in both chemical and pharmaceutical areas. PMID:27535365

  17. Cognitive Functions of the Posterior Parietal Cortex: Top-down and bottom-up attentional control

    Directory of Open Access Journals (Sweden)

    Sarah Shomstein

    2012-07-01

    Full Text Available Although much less is known about human parietal cortex than about the homologous monkey cortex, recent studies, employing neuroimaging and neuropsychological methods, have begun to elucidate increasingly fine-grained functional and structural distinctions. This review focuses on recent neuroimaging and neuropsychological studies elucidating the cognitive roles of dorsal and ventral regions of parietal cortex in top-down and bottom-up attentional orienting, and on the interaction between the two attentional allocation mechanisms. Evidence is reviewed arguing that regions along the dorsal areas of the parietal cortex, including the superior parietal lobule (SPL), are involved in top-down attentional orienting, while ventral regions, including the temporo-parietal junction (TPJ), are involved in bottom-up attentional orienting.

  18. Hybrid Top-Down/Bottom-Up Strategy Using Superwettability for the Fabrication of Patterned Colloidal Assembly.

    Science.gov (United States)

    Wang, Yuezhong; Wei, Cong; Cong, Hailin; Yang, Qiang; Wu, Yuchen; Su, Bin; Zhao, Yongsheng; Wang, Jingxia; Jiang, Lei

    2016-02-24

    Superwettability of substrates has had a profound influence on the production of novel and advanced colloidal assemblies in recent decades owing to its effect on the spreading area, the evaporation rate, and the resultant assembly structure. In this paper, we investigated in detail the influence of the superwettability of a transfer/template substrate on colloidal assembly from a hybrid top-down/bottom-up strategy. By taking advantage of a superhydrophilic flat transfer substrate and a superhydrophobic groove-structured silicon template, patterned colloidal microsphere assemblies were formed, including linear, mesh, cyclic, and multi-stopband arrays of microspheres, and optical waveguiding in a circular colloidal structure was demonstrated. We believe this liquid top-down/bottom-up strategy will open an efficient avenue for assembling/integrating microsphere building blocks into device applications in a low-cost manner. PMID:26824430

  19. Co-financing of bottom-up approaches towards Broadband Infrastructure Development:

    DEFF Research Database (Denmark)

    Williams, Idongesit

    2016-01-01

    Bottom-up Broadband infrastructure development facilitated by civil societies and social enterprises is on the increase. However, the problem plaguing the development of these bottom-up approaches in developing countries is the financial capacity to expand their small networks into larger ... networks, leading to the demise of some of these initiatives. This paper proposes co-financing of these networks as a means of sustaining the bottom-up Broadband network. The argument of this paper is anchored in two developing-country cases, one in India and the other in Ghana. One survived with ... financial injection and the other did not, due to low revenue. This paper, based on these cases, proposes the utilization and reintroduction of Universal Service funds in developing countries to aid these small networks. This is a qualitative study; the Grounded Theory approach was adopted to gather ...

  20. A balance of bottom-up and top-down in linking climate policies

    Science.gov (United States)

    Green, Jessica F.; Sterner, Thomas; Wagner, Gernot

    2014-12-01

    Top-down climate negotiations embodied by the Kyoto Protocol have all but stalled, chiefly because of disagreements over targets and objections to financial transfers. To avoid those problems, many have shifted their focus to linkage of bottom-up climate policies such as regional carbon markets. This approach is appealing, but we identify four obstacles to successful linkage: different levels of ambition; competing domestic policy objectives; objections to financial transfers; and the difficulty of close regulatory coordination. Even with a more decentralized approach, overcoming the 'global warming gridlock' of the intergovernmental negotiations will require close international coordination. We demonstrate how a balance of bottom-up and top-down elements can create a path toward an effective global climate architecture.

  1. Transition UGent: a bottom-up initiative towards a more sustainable university

    OpenAIRE

    Block, Thomas; Van de Velde, Riet

    2016-01-01

    The vibrant think-tank ‘Transition UGent’ engaged over 250 academics, students and people from the university management in suggesting objectives and actions for the Sustainability Policy of Ghent University (Belgium). Founded in 2012, this bottom-up initiative succeeded in placing sustainability high on the policy agenda of our university. Through discussions within 9 working groups and using the transition management method, Transition UGent developed system analyses, sustainability visions a...

  2. Integrating top down policies and bottom up practices in Urban and Periurban Agriculture: an Italian dilemma

    OpenAIRE

    Cinà, Giuseppe; Di Iacovo, Francesco

    2015-01-01

    The paper deals with some relevant and contradictory aspects of urban and peri-urban agriculture in Italy: the traditional exclusion of agricultural areas from the goals of territorial planning; the separation between top-down policies and bottom-up practices; the lack of agricultural policies at local scale. In the first part the paper summarises the weak relation between urban planning and agriculture, showing how in Italy this gap has been only partially overcome by new laws and plans. Mor...

  3. Reforming the taxation of multijurisdictional enterprises in Europe: coopetition in a bottom-up federation

    OpenAIRE

    Gérard, Marcel

    2006-01-01

    This paper investigates replacing separate taxation by consolidation and formulary apportionment in a Bottom-up Federation, when a multijurisdictional firm is mobile in various respects. The reform is decided cooperatively by all the jurisdictions or by some of them, while tax rates remain within the competence of each jurisdiction. The paper sets forth the conditions for the reform to be social welfare enhancing, while not increasing tax competition. Among them, the formula should emphasize ...

  4. Bottom-up effects of soil quality on a coffee arthropod interaction web

    OpenAIRE

    Gonthier, DJ; Dominguez, GM; Witter, JD; Spongberg, AL; Philpott, SM

    2013-01-01

    Nutrient availability and soil quality influence herbivores through changes in plant traits and can have cascading effects on herbivore interactions. In complex systems, with many positive and negative interactions, the consequences of these bottom-up effects are still not well established. We carried out a set of studies to determine the impact of soil quality (organic compost amendments) on a hemipteran herbivore (Coccus viridis), two ant mutualists, predators, pathogens, parasitoids of C. ...

  5. Self-assembled nanostructured resistive switching memory devices fabricated by templated bottom-up growth

    OpenAIRE

    Ji-Min Song; Jang-Sik Lee

    2016-01-01

    Metal-oxide-based resistive switching memory devices have been studied intensively due to their potential to satisfy the requirements of next-generation memory devices. Active research has been done on the materials and device structures of resistive switching memory devices that meet the requirements of high density, fast switching speed, and reliable data storage. In this study, resistive switching memory devices were fabricated with nano-template-assisted bottom-up growth. The electrochemical ...

  6. Public participation GIS to support a bottom-up approach in forest landscape planning

    OpenAIRE

    Paletto A; Lora C; Frattegiani M; De Meo I; Ferretti F

    2013-01-01

    Forest landscape planning analyses all forest aspects (economic, ecological and social) and defines long-term forest management guidelines. Various actors are involved in landscape planning; therefore the analysis needs to take into account goals and targets of the different stakeholders. The participatory process can strongly support the development of a bottom-up forest plan definition when stakeholders are involved throughout the decision-making process. In this way, management guidelines ...

  7. Environmental Sustainability and Regulation: Top-Down Versus Bottom-Up Regulation

    OpenAIRE

    Mariam, Yohannes

    2001-01-01

    Environmental regulation can be broadly divided into approaches that follow the top-down and bottom-up models. The two approaches have a similar objective with respect to environmental protection and sustainability. However, the success with which each approach achieves the goals of environmental protection and sustainability may vary. Moreover, the costs and benefits of each approach differ. The present study will explore the implications of environmental regulation for sustainability, the costs associat...

  8. New bottom-up algorithm for assembly plan generation : opportunities for micro-factory design.

    OpenAIRE

    Perrard, Christophe; Lutz, Philippe; Salgueiro, Paulo

    2007-01-01

    This paper discusses a new approach dedicated to assembly plan generation, called the "bottom-up algorithm". It is compared to the traditional "top-down approach" usually used to perform this stage of the design process of assembly systems for "macro-products". We explore why this new algorithm is better adapted to designing a microassembly system. The case of watch assembly plan generation is described through both approaches, and the obtained results are compared.

  9. Bottom-Up Dependent Gating of Frontal Signals in Early Visual Cortex

    OpenAIRE

    Ekstrom, L. B.; P. R. Roelfsema; Arsenault, J.T.; Bonmassar, G.; Vanduffel, W.

    2008-01-01

    The frontal eye field (FEF) is one of several cortical regions thought to modulate sensory inputs. Moreover, several hypotheses suggest that the FEF can only modulate early visual areas in the presence of a visual stimulus. To test for bottom-up gating of frontal signals, we microstimulated subregions in the FEF of two monkeys and measured the effects throughout the brain with functional magnetic resonance imaging. The activity of higher-order visual areas was strongly modulated by FEF stimul...

  10. Combining shape and color: a bottom-up approach to evaluate object similarities

    OpenAIRE

    PASCUCCI, ALESSIO

    2011-01-01

    The objective of the present work is to develop a bottom-up approach to estimate the similarity between two unknown objects. Given a set of digital images, we want to identify the main objects and to determine whether they are similar or not. In the last decades many object recognition and classification strategies, driven by higher-level activities, have been successfully developed. The peculiarity of this work, instead, is the attempt to work without any training phase nor a priori knowledg...

  11. The Application of Bottom-up and Top-down Processing in L2 Listening Comprehension

    Institute of Scientific and Technical Information of China (English)

    温颖茜

    2008-01-01

    Listening comprehension is one of the four basic skills for language learning and is also one of the most difficult tasks L2 learners ever experienced. L2 listening comprehension is a cognitive process, in which listeners use both bottom-up and top-down processing to comprehend the aural text. The paper focuses on the application of the two approaches in L2 listening comprehension.

  12. A VHDL-AMS Modeling Methodology for Top-Down/Bottom-Up Design of RF Systems

    OpenAIRE

    Maehne, Torsten; Vachoux, Alain; Giroud, Frédéric; Contaldo, Matteo

    2009-01-01

    This paper presents a modelling methodology for the top-down/bottom-up design of RF systems based on systematic use of VHDL-AMS models. The model interfaces are parameterizable and pin-accurate. The designer can choose to parameterize the models using performance specifications or device parameters back-annotated from the transistor-level implementation. The abstraction level used for the description of the respective analog/digital component behavior has been chosen to achieve a good t...

  13. A computational study of liposome logic: towards cellular computing from the bottom up

    OpenAIRE

    Smaldon, James; Romero-Campero, Francisco J.; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron; Krasnogor, Natalio

    2010-01-01

    In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This “liposome logic” approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc. as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in “top-down” synthetic biology, particularly in the specification, design and impl...

  14. Can bottom-up ocean CO2 fluxes be reconciled with atmospheric 13C observations?

    OpenAIRE

    Alden, Caroline B.; Miller, John B.; White, James W.C.

    2011-01-01

    The rare stable carbon isotope, 13C, has been used previously to partition CO2 fluxes into land and ocean components. Net ocean and land fluxes impose distinctive and predictable fractionation patterns upon the stable isotope ratio, making it an excellent tool for distinguishing between them. Historically, isotope constrained inverse methods for calculating CO2 surface fluxes—the ‘double deconvolution’—have disagreed with bottom-up ocean flux estimates. In this study, we use the double deconv...

  15. Bottom-Up Cost Analysis of a High Concentration PV Module

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, Kelsey A. W.; Woodhouse, Michael; Lee, Hohyun; Smestad, Greg P.

    2016-03-31

    We present a bottom-up model of III-V multi-junction cells, as well as a high concentration PV (HCPV) module. We calculate $0.59/W(DC) manufacturing costs for our model HCPV module design with today's capabilities, and find that reducing cell costs and increasing module efficiency offer the most promising paths for future cost reductions. Cell costs could be significantly reduced via substrate reuse and improved manufacturing yields.
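
    A rough roll-up of how a bottom-up module cost model of this kind turns component costs into $/W is sketched below; the component list loosely mirrors the cost drivers named in these records, but every number and the module efficiency are hypothetical placeholders, not figures from the NREL analysis, and the 1000 W/m^2 figure is simply the usual direct-normal rating irradiance for CPV modules.

```python
# Illustrative bottom-up cost-per-watt roll-up for a concentrator PV module.
# All component costs and the module efficiency are hypothetical placeholders,
# not figures from the NREL study.

CSTC_DNI_W_PER_M2 = 1000.0   # rating irradiance for CPV modules (W/m^2, direct normal)

component_costs_per_m2 = {   # USD per m^2 of module aperture (hypothetical)
    "III-V multi-junction cells (incl. yield loss)": 80.0,
    "Fresnel lens primary optic": 25.0,
    "receiver board and secondary optic": 30.0,
    "module housing": 35.0,
    "thermal management": 20.0,
    "assembly and test": 40.0,
}

module_efficiency = 0.30     # DC efficiency at rating conditions (hypothetical)

total_cost_per_m2 = sum(component_costs_per_m2.values())
watts_per_m2 = module_efficiency * CSTC_DNI_W_PER_M2
cost_per_watt = total_cost_per_m2 / watts_per_m2

print(f"total module cost: {total_cost_per_m2:.0f} $/m^2")
print(f"rated output:      {watts_per_m2:.0f} W/m^2")
print(f"cost per watt:     {cost_per_watt:.2f} $/W(DC)")
# Cheaper cells shrink the numerator and higher module efficiency grows the
# denominator, which is why the abstract singles out those two levers.
```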

  16. Nanomaterial processing using self-assembly-bottom-up chemical and biological approaches

    International Nuclear Information System (INIS)

    Nanotechnology is touted as the next logical sequence in technological evolution. This has led to a substantial surge in research activities pertaining to the development and fundamental understanding of processes and assembly at the nanoscale. Both top-down and bottom-up fabrication approaches may be used to realize a range of well-defined nanostructured materials with desirable physical and chemical attributes. Among these, the bottom-up self-assembly process offers the most realistic solution toward the fabrication of next-generation functional materials and devices. Here, we present a comprehensive review on the physical basis behind self-assembly and the processes reported in recent years to direct the assembly of nanoscale functional blocks into hierarchically ordered structures. This paper emphasizes assembly in the synthetic domain as well as in the biological domain, underscoring the importance of biomimetic approaches toward novel materials. In particular, two important classes of directed self-assembly, namely, (i) self-assembly among nanoparticle–polymer systems and (ii) external field-guided assembly are highlighted. The spontaneous self-assembling behavior observed in nature that leads to complex, multifunctional, hierarchical structures within biological systems is also discussed in this review. Recent research undertaken to synthesize hierarchically assembled functional materials has underscored the need as well as the benefits harvested in synergistically combining top-down fabrication methods with bottom-up self-assembly. (review article)

  17. Plasma-surface interactions for top-down and bottom-up nanofabrication

    Science.gov (United States)

    Ono, Kouichi

    2015-09-01

    Plasma processing is now widely employed for the fabrication of nanostructures in diverse fields of micro/nanoelectronic, optoelectronic, energy conversion, and sensing devices. The top-down plasma processes are indispensable in today's microelectronics industry, relying on the use of primarily anisotropic plasma etching following the lithography to define mask patterns; in some cases, self-assembled masks are used for the subsequent etching. The bottom-up ones are often employed to synthesize nanostructures such as nanotubes and nanowires, relying on the use of plasma enhanced chemical vapor deposition and plasma sputtering on self-assembled as well as lithographically formed patterns of metal catalysts. Moreover, the mask-less top-down approaches have recently been demonstrated to form nanopillars and periodic nanoripples, and the catalyst-free bottom-up approaches have been demonstrated to form nanowires. This talk is concerned with the current understanding and future prospects for plasma-surface interactions responsible for these top-down and bottom-up plasma nanofabrication processes, with attention placed on the fabrication of nanoscale fins and gates and also nanowires of silicon. On the nanometer scale, ions and neutrals incident on surfaces are few in number during processing; thus, the nanoscale plasma-surface interactions concerned are stochastic, owing to the temporal as well as spatial non-uniformity of the flux and angle of incidence on surfaces being processed at the nanoscale.

  18. Top-down and bottom-up definitions of human failure events in human reliability analysis

    International Nuclear Information System (INIS)

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down - defined as a subset of the PRA - whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up - derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  19. A Bottom-up Trend in Research of Management of Technology

    Directory of Open Access Journals (Sweden)

    Yoko Ishino

    2014-12-01

    Full Text Available Management of Technology (MOT) is defined as an academic discipline of management that enables organizations to manage their technological fundamentals to create competitive advantage. MOT covers a wide range of contents including administrative strategy, R&D management, manufacturing management, technology transfer, production control, marketing, accounting, finance, business ethics, and others. For each topic, researchers have conducted their MOT research at various levels. However, the practical and pragmatic side of MOT surely affects its research trends. Finding changes in MOT research trends, or the chronological transitions of its principal subjects, can help in understanding the key concepts of current MOT. This paper studied a bottom-up trend in MOT research fields by applying a text-mining method to the conference proceedings of IAMOT (International Association for Management of Technology). First, an analysis focusing only on nouns identified several keywords that emerge more frequently over time in the IAMOT proceedings. Then, expanding the scope to other parts of speech allowed these keywords to be viewed in their natural context. Finally, it was found that the use of an important keyword has extended both qualitatively and quantitatively over time. In conclusion, a bottom-up trend in MOT research was detected and the effects of the social situation on the trend were discussed. Keywords: Management of Technology; Text Mining; Research Trend; Bottom-up Trend; Patent
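
    The sketch below illustrates, under stated assumptions, the kind of noun-frequency trend analysis the abstract describes: it assumes the IAMOT proceedings have already been reduced to a year-keyed dictionary of plain-text abstracts (a hypothetical pre-processing step) and simply tracks how often a keyword appears per conference year.

```python
# Minimal keyword-trend sketch; the proceedings dictionary below is a toy
# stand-in for a hypothetical pre-processed corpus, not the IAMOT data.
import re

proceedings = {
    2005: ["open innovation in R&D management ...", "patent portfolio strategy ..."],
    2014: ["patent analytics and open innovation ...", "technology roadmapping with patents ..."],
}

def yearly_keyword_share(corpus, keyword):
    """Fraction of papers per year whose text mentions the keyword."""
    shares = {}
    for year, papers in sorted(corpus.items()):
        hits = sum(1 for text in papers
                   if re.search(rf"\b{re.escape(keyword)}\b", text, re.IGNORECASE))
        shares[year] = hits / len(papers)
    return shares

print(yearly_keyword_share(proceedings, "patent"))
# A share that rises over successive conference years is the kind of
# bottom-up trend signal the paper reports for selected keywords.
```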

  20. Top-down and bottom-up modulation in processing bimodal face/voice stimuli

    Directory of Open Access Journals (Sweden)

    VanRullen Rufin

    2010-03-01

    Full Text Available Abstract Background Processing of multimodal information is a critical capacity of the human brain, with classic studies showing bimodal stimulation either facilitating or interfering in perceptual processing. Comparing activity to congruent and incongruent bimodal stimuli can reveal sensory dominance in particular cognitive tasks. Results We investigated audiovisual interactions driven by stimulus properties (bottom-up influences) or by task (top-down influences) on congruent and incongruent simultaneously presented faces and voices while ERPs were recorded. Subjects performed gender categorisation, directing attention either to faces or to voices and also judged whether the face/voice stimuli were congruent in terms of gender. Behaviourally, the unattended modality affected processing in the attended modality: the disruption was greater for attended voices. ERPs revealed top-down modulations of early brain processing (30-100 ms) over unisensory cortices. No effects were found on N170 or VPP, but from 180-230 ms larger right frontal activity was seen for incongruent than congruent stimuli. Conclusions Our data demonstrate that in a gender categorisation task the processing of faces dominates over the processing of voices. Brain activity showed different modulation by top-down and bottom-up information. Top-down influences modulated early brain activity whereas bottom-up interactions occurred relatively late.

  1. Top-down or bottom-up modelling. An application to CO2 abatement

    International Nuclear Information System (INIS)

    In four articles a comparison is made of bottom-up, or engineers' models, and top-down models, which comprise macro-econometric models, computable general equilibrium models and also models in the system dynamics tradition. In the first article the history of economic modelling is outlined. In the second article the multi-sector macro-economic Computable General Equilibrium model for the Netherlands is described. It can be used to study the long-term effects of fiscal policy measures on economic and environmental indicators, in particular the effects on the level of CO2-emissions. The aim of article 3 is to describe the structure of the electricity supply industry in the UK and how it can be represented in a bottom-up sub-model within a more general E3 sectoral model of the UK economy. The objective of the last paper (4) is mainly a methodological discussion about integrating top-down and bottom-up models which can be used to assess the impacts of CO2 abatement policies on economic activity.

  2. Perceptual salience affects the contents of working memory during free-recollection of objects from natural scenes.

    Science.gov (United States)

    Pedale, Tiziana; Santangelo, Valerio

    2015-01-01

    One of the most important issues in the study of cognition is to understand which are the factors determining internal representation of the external world. Previous literature has started to highlight the impact of low-level sensory features (indexed by saliency-maps) in driving attention selection, hence increasing the probability for objects presented in complex and natural scenes to be successfully encoded into working memory (WM) and then correctly remembered. Here we asked whether the probability of retrieving high-saliency objects modulates the overall contents of WM, by decreasing the probability of retrieving other, lower-saliency objects. We presented pictures of natural scenes for 4 s. After a retention period of 8 s, we asked participants to verbally report as many objects/details as possible of the previous scenes. We then computed how many times the objects located at either the peak of maximal or minimal saliency in the scene (as indexed by a saliency-map; Itti et al., 1998) were recollected by participants. Results showed that maximal-saliency objects were recollected more often and earlier in the stream of successfully reported items than minimal-saliency objects. This indicates that bottom-up sensory salience increases the recollection probability and facilitates the access to memory representation at retrieval, respectively. Moreover, recollection of the maximal- (but not the minimal-) saliency objects predicted the overall amount of successfully recollected objects: The higher the probability of having successfully reported the most-salient object in the scene, the lower the amount of recollected objects. These findings highlight that bottom-up sensory saliency modulates the current contents of WM during recollection of objects from natural scenes, most likely by reducing available resources to encode and then retrieve other (lower saliency) objects. PMID:25741266
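
    As a toy illustration of the analysis pipeline described above, the sketch below builds a crude centre-surround saliency map for a synthetic image and reads off the locations of maximal and minimal saliency; it is a difference-of-Gaussians stand-in, not the Itti, Koch and Niebur (1998) model actually used in the study.

```python
# Toy centre-surround saliency map and peak extraction; the random image is a
# placeholder for a natural scene, and the filter scales are arbitrary.
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)
image = rng.random((240, 320))               # placeholder "natural scene"

center = gaussian_filter(image, sigma=2)     # fine-scale response
surround = gaussian_filter(image, sigma=16)  # coarse-scale response
saliency = np.abs(center - surround)         # centre-surround contrast
saliency = (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-12)

peak_max = np.unravel_index(np.argmax(saliency), saliency.shape)
peak_min = np.unravel_index(np.argmin(saliency), saliency.shape)
print("maximal-saliency location (row, col):", peak_max)
print("minimal-saliency location (row, col):", peak_min)
# In the experiment, the objects sitting at these two locations are the ones
# whose recollection rates are compared.
```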

  3. Perceptual salience affects the contents of working memory during free-recollection of objects from natural scenes

    Directory of Open Access Journals (Sweden)

    Tiziana Pedale

    2015-02-01

    Full Text Available One of the most important issues in the study of cognition is to understand which are the factors determining internal representation of the external world. Previous literature has started to highlight the impact of low-level sensory features (indexed by saliency-maps) in driving attention selection, hence increasing the probability for objects presented in complex and natural scenes to be successfully encoded into working memory (WM) and then correctly remembered. Here we asked whether the probability of retrieving high-saliency objects modulates the overall contents of WM, by decreasing the probability of retrieving other, lower-saliency objects. We presented pictures of natural scenes for 4 secs. After a retention period of 8 secs, we asked participants to verbally report as many objects/details as possible of the previous scenes. We then computed how many times the objects located at either the peak of maximal or minimal saliency in the scene (as indexed by a saliency-map; Itti et al., 1998) were recollected by participants. Results showed that maximal-saliency objects were recollected more often and earlier in the stream of successfully reported items than minimal-saliency objects. This indicates that bottom-up sensory salience increases the recollection probability and facilitates the access to memory representation at retrieval, respectively. Moreover, recollection of the maximal- (but not the minimal-) saliency objects predicted the overall amount of successfully recollected objects: The higher the probability of having successfully reported the most-salient object in the scene, the lower the amount of recollected objects. These findings highlight that bottom-up sensory saliency modulates the current contents of WM during recollection of objects from natural scenes, most likely by reducing available resources to encode and then retrieve other (lower saliency) objects.

  4. Visual anticipation biases conscious perception but not bottom-up visual processing

    Directory of Open Access Journals (Sweden)

    Paul F.M.J. Verschure

    2015-01-01

    Full Text Available Theories of consciousness can be grouped with respect to their stance on embodiment, sensori-motor contingencies, prediction and integration. In this list prediction plays a key role and it is not clear which aspects of prediction are most prominent in the conscious scene. An evolving view on the brain is that it can be seen as a prediction machine that optimizes its ability to predict states of the world and the self through the top-down propagation of predictions and the bottom-up presentation of prediction errors. There are competing views though on whether prediction or prediction errors dominate the conscious scene. Yet, due to the lack of efficient indirect measures, the dynamic effects of prediction on perception, decision making and consciousness have been difficult to assess and to model. We propose a novel mathematical framework and psychophysical paradigm that allows us to assess both the hierarchical structuring of perceptual consciousness, its content and the impact of predictions and / or errors on the conscious scene. Using a displacement detection task combined with reverse correlation we reveal signatures of the usage of prediction at three different levels of perception: bottom-up early saccades, top-down driven late saccades and conscious decisions. Our results suggest that the brain employs multiple parallel mechanisms at different levels of information processing to restrict the sensory field using predictions. We observe that cognitive load has a quantifiable effect on this dissociation of the bottom-up sensory and top-down predictive processes. We propose a probabilistic data association model from dynamical systems theory to model this predictive bias in different information processing levels.

  5. A computational study of liposome logic: towards cellular computing from the bottom up.

    Science.gov (United States)

    Smaldon, James; Romero-Campero, Francisco J; Fernández Trillo, Francisco; Gheorghe, Marian; Alexander, Cameron; Krasnogor, Natalio

    2010-09-01

    In this paper we propose a new bottom-up approach to cellular computing, in which computational chemical processes are encapsulated within liposomes. This "liposome logic" approach (also called vesicle computing) makes use of supra-molecular chemistry constructs, e.g. protocells, chells, etc. as minimal cellular platforms to which logical functionality can be added. Modeling and simulations feature prominently in "top-down" synthetic biology, particularly in the specification, design and implementation of logic circuits through bacterial genome reengineering. The second contribution in this paper is the demonstration of a novel set of tools for the specification, modelling and analysis of "bottom-up" liposome logic. In particular, simulation and modelling techniques are used to analyse some example liposome logic designs, ranging from relatively simple NOT gates and NAND gates to SR-Latches, D Flip-Flops all the way to 3 bit ripple counters. The approach we propose consists of specifying, by means of P systems, gene regulatory network-like systems operating inside proto-membranes. This P systems specification can be automatically translated and executed through a multiscaled pipeline composed of dissipative particle dynamics (DPD) simulator and Gillespie's stochastic simulation algorithm (SSA). Finally, model selection and analysis can be performed through a model checking phase. This is the first paper we are aware of that brings to bear formal specifications, DPD, SSA and model checking to the problem of modeling target computational functionality in protocells. Potential chemical routes for the laboratory implementation of these simulations are also discussed thus for the first time suggesting a potentially realistic physiochemical implementation for membrane computing from the bottom-up. PMID:21886681

  6. Coupled multi-physics simulation frameworks for reactor simulation: A bottom-up approach

    International Nuclear Information System (INIS)

    A 'bottom-up' approach to multi-physics frameworks is described, where first common interfaces to simulation data are developed, then existing physics modules are adapted to communicate through those interfaces. Physics modules read and write data through those common interfaces, which also provide access to common simulation services like parallel IO, mesh partitioning, etc. Multi-physics codes are assembled as a combination of physics modules, services, interface implementations, and driver code which coordinates calling these various pieces. Examples of various physics modules and services connected to this framework are given. (author)
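
    A minimal sketch of the pattern the abstract describes is given below: physics modules communicate only through a shared data interface and a small driver coordinates them. All class and field names (MeshData, HeatSourceModule, and so on) are illustrative assumptions, not the framework's real API.

```python
# Bottom-up coupling sketch: modules only touch a common data interface.
from abc import ABC, abstractmethod

class MeshData:
    """Common interface to simulation data shared by all physics modules."""
    def __init__(self):
        self.fields = {}                 # field name -> list of cell values
    def write_field(self, name, values):
        self.fields[name] = list(values)
    def read_field(self, name):
        return self.fields[name]

class PhysicsModule(ABC):
    @abstractmethod
    def solve(self, mesh: MeshData) -> None: ...

class HeatSourceModule(PhysicsModule):
    def solve(self, mesh):
        # toy power-density field (W/cm^3), purely illustrative numbers
        mesh.write_field("power_density", [200.0, 350.0, 150.0])

class ThermalModule(PhysicsModule):
    def solve(self, mesh):
        q = mesh.read_field("power_density")
        # toy linear temperature response to the heat source
        mesh.write_field("temperature", [300.0 + 0.5 * qi for qi in q])

def driver(modules, mesh, n_iterations=2):
    """Driver code coordinating the physics modules through the shared data."""
    for _ in range(n_iterations):
        for module in modules:
            module.solve(mesh)

mesh = MeshData()
driver([HeatSourceModule(), ThermalModule()], mesh)
print(mesh.read_field("temperature"))
```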

  7. Unsupervised tattoo segmentation combining bottom-up and top-down cues

    Science.gov (United States)

    Allen, Josef D.; Zhao, Nan; Yuan, Jiangbo; Liu, Xiuwen

    2011-06-01

    Tattoo segmentation is challenging due to the complexity and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin and then distinguish tattoo from the other skin via top-down prior in the image itself. Tattoo segmentation with unknown number of clusters is transferred to a figure-ground segmentation. We have applied our segmentation algorithm on a tattoo dataset and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purpose.
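
    The sketch below illustrates the split-merge idea in miniature: a bottom-up over-segmentation by colour clustering, followed by a merge of clusters that pass a crude skin test. The k-means step and the hand-written skin rule are stand-ins for the paper's clustering and learned merging, not a reproduction of them.

```python
# Split-merge sketch: over-segment by colour, then merge "skin-like" clusters.
import numpy as np
from sklearn.cluster import KMeans

def split_merge_skin(image_rgb, n_clusters=8):
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(float)

    # Split: bottom-up over-segmentation into colour clusters.
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(pixels)

    # Merge: group clusters whose mean colour passes a crude RGB skin rule
    # (an illustrative heuristic, not the authors' learned merging step).
    skin_mask = np.zeros(len(pixels), dtype=bool)
    for k in range(n_clusters):
        r, g, b = pixels[labels == k].mean(axis=0)
        if r > 95 and g > 40 and b > 20 and r > g and r > b and (r - min(g, b)) > 15:
            skin_mask[labels == k] = True

    # Tattoo candidates would then be sought inside the merged skin region
    # (the figure-ground step is omitted in this sketch).
    return skin_mask.reshape(h, w)

demo = (np.random.default_rng(1).random((60, 80, 3)) * 255).astype(np.uint8)
print(split_merge_skin(demo).mean(), "fraction of pixels flagged as skin")
```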

  8. Public engagement as a field of tension between bottom-up and top-down strategies

    DEFF Research Database (Denmark)

    Horsbøl, Anders; Lassen, Inger

    2012-01-01

    In the ongoing debate about climate change, public engagement is given increasing prominence as a possible solution to a general lack of citizen participation in climate change mitigation efforts. Recent years have seen a surge in public engagement initiatives in many countries in the Western world ... These initiatives often have to deal with dilemmas between participatory aspects and other considerations such as planning efficiency, dilemmas that potentially bring about tension between bottom-up and top-down strategies. Literature on climate change issues has addressed the failure of public response ... more knowledge and information about climate change has not significantly changed people's behaviour towards higher involvement ...

  9. Supporting Frequent Updates in R-Trees: A Bottom-Up Approach

    DEFF Research Database (Denmark)

    Lee, Mong Li; Hsu, Wynne; Jensen, Christian Søndergaard;

    2003-01-01

    Advances in hardware-related technologies promise to enable new data management applications that monitor continuous processes. In these applications, enormous amounts of state samples are obtained via sensors and are streamed to a database. Further, updates are very frequent and may exhibit ... locality. While the R-tree is the index of choice for multi-dimensional data with low dimensionality, and is thus relevant to these applications, R-tree updates are also relatively inefficient. We present a generalized bottom-up update strategy for R-trees that generalizes existing update techniques and ...

  10. NEMO. Netherlands Energy demand MOdel. A top-down model based on bottom-up information

    International Nuclear Information System (INIS)

    The title model links energy use to other production factors, (physical) production, energy prices, technological trends and government policies. It uses a 'putty-semiputty' vintage production structure, in which new investments, adaptations to existing capital goods (retrofit) and 'good-housekeeping' are discerned. Price elasticities are relatively large in the long term and small in the short term. Most predictions of energy use are based on either econometric models or on 'bottom-up information', i.e. disaggregated lists of technical possibilities for and costs of saving energy. Typically, one predicts more energy-efficiency improvements using bottom-up information than using econometric ('top-down') models. We bridged this so-called 'energy-efficiency gap' by designing our macro/meso model NEMO in such a way that we can use bottom-up (micro) information to estimate most model parameters. In our view, reflected in NEMO, the energy-efficiency gap arises for two reasons. The first is that firms and households use a fairly high discount rate of 15% when evaluating the profitability of energy-efficiency improvements. The second is that our bottom-up information ('ICARUS') for most economic sectors does not (as NEMO does) take account of the fact that implementation of new, energy-efficient technology in capital stock takes place only gradually. Parameter estimates for 19 sectors point at a long-term technological energy efficiency improvement trend in Netherlands final energy use of 0.8% per year. The long-term price elasticity is estimated to be 0.29. These values are comparable to other studies based on time series data. Simulations of the effects of the oil price shocks in the seventies and the subsequent fall of oil prices show that the NEMO's price elasticities are consistent with historical data. However, the present pace at which new technologies become available (reflected in NEMO) appears to be lower than in the seventies and eighties. This suggests that it
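
    A worked example of the energy-efficiency-gap mechanism the abstract invokes: the same retrofit can have a negative net present value at the 15% private discount rate assumed in NEMO and a positive one at a lower social rate. The cash-flow figures below are invented for illustration.

```python
# NPV comparison at two discount rates; investment and savings are toy values.
def npv(rate, cashflows):
    """Net present value of cashflows[0..T], with cashflows[0] at time zero."""
    return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cashflows))

investment = -1000.0          # up-front retrofit cost (hypothetical)
annual_saving = 180.0         # avoided energy purchases per year (hypothetical)
lifetime = 12                 # years

cashflows = [investment] + [annual_saving] * lifetime
print("NPV at 15% (firm/household):", round(npv(0.15, cashflows), 1))
print("NPV at  5% (social rate):   ", round(npv(0.05, cashflows), 1))
# A negative NPV at 15% and a positive NPV at 5% reproduces, in miniature, why
# bottom-up studies find more profitable savings than agents actually adopt.
```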

  11. Bottom-up approach for the fabrication of spin torque nano-oscillators

    Energy Technology Data Exchange (ETDEWEB)

    Darques, M; De la Torre Medina, J; Abreu Araujo, F; Piraux, L [Institute of Condensed Matter and Nanosciences, Universite catholique de Louvain, Croix du Sud 1, Louvain-la-Neuve (Belgium); Dussaux, A; Khvalkovskiy, A V; Guillemet, R; Bouzehouane, K; Fusil, S; Grollier, J; Cros, V [Unite Mixte de Physique CNRS/Thales and Universite Paris Sud 11, Palaiseau (France); Avanesyan, G G; Zvezdin, K A, E-mail: michael.darques@uclouvain.be [A.M. Prokhorov General Physics Institute of RAS, Vavilova str. 38, Moscow (Russian Federation)

    2011-03-16

    We report on a bottom-up approach for the fabrication of spin-transfer nano-oscillators (STNOs). Porous alumina is used as a template for the growth by electrodeposition of metallic spin valves in series. Under specific magnetic field and injected current conditions, emission of microwave current is detected with frequency in the 1.5 GHz range and linewidth as low as 8 MHz. We find strong indications that the microwave emission is due to spin-transfer-driven vortex oscillations. This technique is promising for the fabrication of dense arrays of STNOs in view of device synchronization.

  12. Bottom-up assembly of hydrophobic nanocrystals and graphene nanosheets into mesoporous nanocomposites.

    Science.gov (United States)

    Huang, Jijiang; Liu, Wenxian; Wang, Li; Sun, Xiaoming; Huo, Fengwei; Liu, Junfeng

    2014-04-22

    A general strategy for constructing graphene-based nanocomposites is achieved by emulsion-based bottom-up self-assembly of hydrophobic nanocrystals (NCs) to positively charged colloidal spheres, followed by the electrostatic assembly of NC colloidal spheres with negatively charged graphene oxide in an acidulous aqueous solution. With a simple heat treatment, 3D mesoporous NC spheres/graphene composites are obtained. TiO2/graphene composites typically exhibit a better rate capability and cycle performance than do the corresponding isolated TiO2 spheres. PMID:24684553

  13. Assessment of the Bottom-up Budgeting Process for FY 2015

    OpenAIRE

    Manasan, Rosario G.

    2015-01-01

    The bottom-up budgeting (BUB) process is one of the major reform initiatives of the Aquino administration and has been tagged as such from several perspectives. First, it is seen as a component of its budget reform thrusts that are aimed at making the national government budgeting process more responsive to local needs. Second, the BUB is viewed as part of the democracy/empowerment reform as it opens another avenue for people's participation in local planning and budgeting and for generating ...

  14. Representing energy technologies in top-down economic models using bottom-up information

    International Nuclear Information System (INIS)

    The rate and magnitude of technological change is a critical component in estimating future anthropogenic carbon emissions. We present a methodology for modeling low-carbon emitting technologies within the MIT Emissions Prediction and Policy Analysis (EPPA) model, a computable general equilibrium (CGE) model of the world economy. The methodology translates bottom-up engineering information for two carbon capture and sequestration (CCS) technologies in the electric power sector into the EPPA model and discusses issues that arise in assuring an accurate representation and realistic market penetration. We find that coal-based technologies with sequestration penetrate, despite their higher cost today, because of projected rising natural gas prices. (author)

  15. WORD-OF-MOUTH MARKETING AND ENTERPRISE STRATEGIES: A BOTTOM-UP DIFFUSION MODEL

    OpenAIRE

    Remondino, Marco

    2011-01-01

    A comprehensive simulation model is presented, aimed at showing the dynamics of social diffusion based on word of mouth (e.g., viral marketing) over a social network of interconnected individuals. The model is built following a bottom-up approach and the agent-based paradigm; this means that the dynamics of the diffusion is simulated in real time and generated at the micro level, not calculated by using mathematical formulas. This allows both to follow – step by step – the emergent process a...
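
    The sketch below gives a minimal agent-based rendering of such a word-of-mouth diffusion: adoption spreads probabilistically over a random social network, step by step, rather than being computed from a closed-form formula. The network, seeding and adoption probability are arbitrary illustrative choices, not the model's calibration.

```python
# Toy word-of-mouth diffusion on a random graph.
import random
import networkx as nx

random.seed(42)
G = nx.erdos_renyi_graph(n=200, p=0.03, seed=42)   # stand-in social network
adopted = set(random.sample(list(G.nodes), 5))     # initial seed adopters
p_adopt = 0.12                                     # per-contact adoption probability (toy)

for step in range(15):
    newly = set()
    for node in G.nodes:
        if node in adopted:
            continue
        exposures = sum(1 for nb in G.neighbors(node) if nb in adopted)
        # each adopting neighbour gives an independent chance to convert
        if any(random.random() < p_adopt for _ in range(exposures)):
            newly.add(node)
    adopted |= newly
    print(f"step {step:2d}: {len(adopted)} adopters")
# The S-shaped growth of the adopter count is the emergent, micro-generated
# diffusion curve the abstract contrasts with closed-form models.
```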

  16. Supporting Frequent Updates in R-Trees: A Bottom-Up Approach

    DEFF Research Database (Denmark)

    Lee, Mong Li; Hsu, Wynne; Jensen, Christian Søndergaard; Cui, Bin; Teo, Keng Lik

    2004-01-01

    Advances in hardware-related technologies promise to enable new data management applications that monitor continuous processes. In these applications, enormous amounts of state samples are obtained via sensors and are streamed to a database. Further, updates are very frequent and may exhibit ... improve update performance. It has different levels of reorganization - ranging from global to local - during updates, avoiding expensive top-down updates. A compact main-memory summary structure that allows direct access to the R-tree index nodes is used together with efficient bottom-up algorithms...
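
    The data-structure idea behind such bottom-up updates can be sketched as follows: a main-memory table points each moving object directly at its leaf, so an update whose new position stays inside the leaf's bounding rectangle is applied locally without a top-down descent. The classes below are simplified stand-ins, not the paper's index.

```python
# Sketch of a bottom-up update path via a main-memory object-to-leaf table.
class Leaf:
    def __init__(self, mbr):
        self.mbr = mbr                    # (xmin, ymin, xmax, ymax)
        self.entries = {}                 # object id -> (x, y)
    def contains(self, x, y):
        xmin, ymin, xmax, ymax = self.mbr
        return xmin <= x <= xmax and ymin <= y <= ymax

class BottomUpIndex:
    def __init__(self):
        self.leaves = []
        self.object_to_leaf = {}          # the main-memory summary structure

    def insert(self, oid, x, y, leaf):
        leaf.entries[oid] = (x, y)
        self.object_to_leaf[oid] = leaf

    def update(self, oid, x, y):
        leaf = self.object_to_leaf[oid]
        if leaf.contains(x, y):           # cheap local (bottom-up) update
            leaf.entries[oid] = (x, y)
            return "local"
        # otherwise fall back to delete + re-insert via the usual
        # top-down algorithm (omitted in this sketch)
        return "top-down reinsert needed"

idx = BottomUpIndex()
leaf = Leaf((0, 0, 10, 10))
idx.leaves.append(leaf)
idx.insert("car-17", 2.0, 3.0, leaf)
print(idx.update("car-17", 2.5, 3.1))     # stays in the same leaf -> "local"
print(idx.update("car-17", 42.0, 3.1))    # would escape the leaf's MBR
```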

  17. Bottom-up reconstruction scenarios for (un)constrained MSSM parameters at the CERN LHC

    International Nuclear Information System (INIS)

    We consider some specific inverse problem or 'bottom-up' reconstruction strategies at the CERN LHC for both general and constrained minimal supersymmetric standard model (MSSM) parameters, starting from a plausibly limited set of sparticle identification and mass measurements, using mainly gluino/squark cascade decays, plus eventually the lightest Higgs boson mass. For the three naturally separated sectors of gaugino/Higgsino, squark/slepton, and Higgs parameters, we examine different step-by-step algorithms based on rather simple, entirely analytical, inverted relations between masses and basic MSSM parameters. This includes also reasonably good approximations of some of the relevant radiative correction calculations. We distinguish the constraints obtained for a general MSSM from those obtained with universality assumptions in the three different sectors. Our results are compared at different stages with the determination from more standard 'top-down' fit of models to data, and finally combined into a global determination of all the relevant parameters. Our approach gives complementary information to more conventional analysis, and is not restricted to the specific LHC measurement specificities. In addition, the bottom-up renormalization group evolution of general MSSM parameters, being an important ingredient in this framework, is illustrated as a new publicly available option of the MSSM spectrum calculation code SuSpect.

  18. Bottom-up approach for decentralised energy planning: Case study of Tumkur district in India

    International Nuclear Information System (INIS)

    Decentralized Energy Planning (DEP) is one of the options to meet the rural and small-scale energy needs in a reliable, affordable and environmentally sustainable way. The main aspect of energy planning at the decentralized level would be to prepare an area-based DEP to meet energy needs and develop alternate energy sources at least cost to the economy and environment. The present work uses a goal-programming method to analyze DEP through a bottom-up approach. This approach includes planning from the lowest scale of Tumkur district in India. The scale of analysis included village level-Ungra, panchayat level (local council)-Yedavani, block level-Kunigal and district level-Tumkur. The approach adopted was bottom-up (village to district) to allow a detailed description of energy services and the resulting demand for energy forms and supply technologies. Different scenarios are considered at four decentralized scales for the year 2005 and are developed and analyzed for the year 2020. Decentralized bioenergy systems for producing biogas and electricity, using local biomass resources, are shown to promote development compared to other renewables. This is because, apart from meeting energy needs, multiple goals could be achieved, such as self-reliance, local employment, and land reclamation, in addition to CO2 emissions reduction.
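
    As an illustration of the goal-programming formulation mentioned above, the sketch below balances a demand goal against a cost goal for two hypothetical supply options, penalising unmet demand and budget overrun through deviation variables; all names, capacities, costs and weights are invented, not data from the Tumkur case study.

```python
# Tiny goal-programming sketch with two supply options and two goals.
from scipy.optimize import linprog

demand_goal = 120.0     # MWh/day to be supplied (toy value)
cost_goal = 5000.0      # budget in arbitrary currency units (toy value)
cost_bio, cost_solar = 35.0, 60.0      # unit supply costs (toy)
cap_bio, cap_solar = 80.0, 100.0       # capacity limits (toy)

# decision vector: [x_bio, x_solar, d1_minus, d1_plus, d2_minus, d2_plus]
# d1_minus = unmet demand, d2_plus = budget overrun (the penalised deviations)
w_unmet, w_overrun = 1000.0, 1.0
c = [0, 0, w_unmet, 0, 0, w_overrun]

A_eq = [
    [1, 1, 1, -1, 0, 0],                 # supply + d1- - d1+ = demand goal
    [cost_bio, cost_solar, 0, 0, 1, -1], # cost   + d2- - d2+ = cost goal
]
b_eq = [demand_goal, cost_goal]
bounds = [(0, cap_bio), (0, cap_solar)] + [(0, None)] * 4

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=bounds, method="highs")
x_bio, x_solar, d1m, d1p, d2m, d2p = res.x
print(f"bioenergy {x_bio:.1f} MWh/day, solar {x_solar:.1f} MWh/day")
print(f"unmet demand {d1m:.1f} MWh/day, budget overrun {d2p:.1f}")
```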

  19. Dynamic formulation of a top-down and bottom-up merging energy policy model

    International Nuclear Information System (INIS)

    The impact of energy policy measures is not restricted to the energy system and should therefore be analysed within an economy-wide framework, while keeping the essential details of the energy sector. The aim of this paper is to present new developments in the field of the consistent evaluation of indicators for the sustainability assessment of energy policy measures. Starting from the static concept of Boehringer (Energy Econ. 20 (1998) 233), this paper shows how the complementarity format can be used in computable general equilibrium (CGE) modelling for a dynamic formulation of bottom-up and top-down approach merging models. While a hybrid approach increases the credibility of CGE models in energy policy analysis by replacing the energy sector generic functional forms with a bottom-up activity analysis based on specific technologies, the endogenous formulation of investment decisions makes an explicit description of evolving specific capital stocks and technology mixes possible. Both features are essential when assessing effects of policy measures that may be affected by structural change--which is typically the case in the long-term assessment of energy policy measures

  20. Dynamic formulation of a top-down and bottom-up merging energy policy model

    Energy Technology Data Exchange (ETDEWEB)

    Frei, C.W. [Swiss Federal Inst. of Technology, Lausanne (Switzerland). Centre for Energy Policy and Economics; Haldi, P.A.; Sarlos, G. [Swiss Federal Inst. of Technology, Lausanne (Switzerland). Lab. of Energy Systems

    2003-08-01

    The impact of energy policy measures is not restricted to the energy system and should therefore be analysed within an economy-wide framework, while keeping the essential details of the energy sector. The aim of this paper is to present new developments in the field of the consistent evaluation of indicators for the sustainability assessment of energy policy measures. Starting from the static concept of Boehringer (Energy Econ. 20 (1998) 233), this paper shows how the complementarity format can be used in computable general equilibrium (CGE) modelling for a dynamic formulation of bottom-up and top-down approach merging models. While a hybrid approach increases the credibility of CGE models in energy policy analysis by replacing the energy sector generic functional forms with a bottom-up activity analysis based on specific technologies, the endogenous formulation of investment decisions makes an explicit description of evolving specific capital stocks and technology mixes possible. Both features are essential when assessing effects of policy measures that may be affected by structural change - which is typically the case in the long-term assessment of energy policy measures.(author)

  1. From bottom-up approaches to levels of organization and extended critical transitions

    Directory of Open Access Journals (Sweden)

    Giuseppe eLongo

    2012-07-01

    Full Text Available Biological thinking is structured by the notion of level of organization. We will show that this notion acquires a precise meaning in critical phenomena: they disrupt, by the appearance of infinite quantities, the mathematical (possibly equational) determination at a given level, when moving to a "higher" one. As a result, their analysis cannot be called genuinely bottom-up, even though it remains upward in a restricted sense. At the same time, criticality and related phenomena are very common in biology. Because of this, we claim that bottom-up approaches are not sufficient, in principle, to capture biological phenomena. In the second part of this paper, following the work of Francis Bailly, we discuss a strong criterion of level transition. The core idea of the criterion is to start from the breaking of the symmetries and determination at a "first" level in order to "move" to the others. If biological phenomena have multiple, sustained levels of organization in this sense, then they should be interpreted as extended critical transitions.

  2. A bottom-up institutional approach to cooperative governance of risky commons

    Science.gov (United States)

    Vasconcelos, Vítor V.; Santos, Francisco C.; Pacheco, Jorge M.

    2013-09-01

    Avoiding the effects of climate change may be framed as a public goods dilemma, in which the risk of future losses is non-negligible, while realizing that the public good may be far in the future. The limited success of existing attempts to reach global cooperation has been also associated with a lack of sanctioning institutions and mechanisms to deal with those who do not contribute to the welfare of the planet or fail to abide by agreements. Here we investigate the emergence and impact of different types of sanctioning to deter non-cooperative behaviour in climate agreements. We show that a bottom-up approach, in which parties create local institutions that punish free-riders, promotes the emergence of widespread cooperation, mostly when risk perception is low, as it is at present. On the contrary, global institutions provide, at best, marginal improvements regarding overall cooperation. Our results clearly suggest that a polycentric approach involving multiple institutions is more effective than that associated with a single, global one, indicating that such a bottom-up, self-organization approach, set up at a local scale, provides a better ground on which to attempt a solution for such a complex and global dilemma.
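
    A stylised payoff calculation for this kind of collective-risk dilemma with local sanctioning is sketched below; endowment, contribution cost, threshold, risk and fine are arbitrary illustrative values rather than the paper's parameters.

```python
# Toy payoff table for a collective-risk dilemma with a local sanctioning
# institution that fines free-riders (all numbers are illustrative).
def payoff(is_cooperator, n_other_cooperators, group_size,
           endowment=1.0, cost=0.1, threshold=0.5, risk=0.3,
           fine=0.2, local_institution=True):
    n_coop = n_other_cooperators + (1 if is_cooperator else 0)
    kept = endowment - cost if is_cooperator else endowment
    # if too few group members contribute, everyone risks losing what they kept
    if n_coop < threshold * group_size:
        kept *= (1.0 - risk)
    # a local institution fines free-riders within the group
    if local_institution and not is_cooperator:
        kept -= fine
    return kept

group_size = 6
for n_others in range(group_size):
    pc = payoff(True, n_others, group_size)
    pd = payoff(False, n_others, group_size)
    print(f"{n_others} other cooperators: cooperate={pc:.3f}  defect={pd:.3f}")
# With these values the fine makes cooperation the better choice in every row;
# without it (local_institution=False), defection pays except when the focal
# player's contribution is pivotal for reaching the threshold.
```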

  3. Top-down and bottom-up definitions of human failure events in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down—defined as a subset of the PRA—whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up—derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  4. Formation of three-dimensional hepatic tissue by the bottom-up method using spheroids.

    Science.gov (United States)

    Okudaira, Tatsuya; Amimoto, Naoki; Mizumoto, Hiroshi; Kajiwara, Toshihisa

    2016-08-01

    Liver regenerative medicine has attracted attention as a possible alternative to organ transplantation. To address the challenge of liver regenerative medicine, the development of a method has been proposed for constructing liver tissue in vitro with a high cell density and high functionality for transplantation into patients with severe liver failure. In this study, we fabricated highly functional three-dimensional hepatic tissue by a bottom-up method using spheroids. The hepatic tissue was formed by stacking hepatocyte spheroids covered with human umbilical vein endothelial cells (HUVECs). Hepatic tissue constructs were evaluated for cell survival and liver-specific functions, and examined histologically. As a result, we identified improvements in liver-specific functions (ammonia removal and albumin secretion) and cell survival. In addition, HUVECs were regularly distributed at every 100 μm within the tissue, and live cells were present within the whole tissue construct throughout the culture period. In summary, we successfully fabricated highly functional hepatic tissue by the bottom-up method using HUVEC-covered hepatocyte spheroids. PMID:26803704

  5. A bottom-up approach for the synthesis of highly ordered fullerene-intercalated graphene hybrids

    Directory of Open Access Journals (Sweden)

    Dimitrios eGournis

    2015-02-01

    Full Text Available Much of the research effort on graphene focuses on its use as a building block for the development of new hybrid nanostructures with well-defined dimensions and properties suitable for applications such as gas storage, heterogeneous catalysis, gas/liquid separations, nanosensing and biomedicine. Towards this aim, here we describe a new bottom-up approach, which combines self-assembly with the Langmuir-Schaefer deposition technique to synthesize graphene-based layered hybrid materials hosting fullerene molecules within the interlayer space. Our film preparation consists of a bottom-up layer-by-layer process that proceeds via the formation of a hybrid organo-graphene oxide Langmuir film. The structure and composition of these hybrid fullerene-containing thin multilayers deposited on hydrophobic substrates were characterized by a combination of X-ray diffraction, Raman and X-ray photoelectron spectroscopies, atomic force microscopy and conductivity measurements. The latter revealed that the presence of C60 within the interlayer spacing leads to an increase in electrical conductivity of the hybrid material as compared to the organo-graphene matrix alone.

  6. Top-down (Prior Knowledge) and Bottom-up (Perceptual Modality) Influences on Spontaneous Interpersonal Synchronization.

    Science.gov (United States)

    Gipson, Christina L; Gorman, Jamie C; Hessler, Eric E

    2016-04-01

    Coordination with others is such a fundamental part of human activity that it can happen unintentionally. This unintentional coordination can manifest as synchronization and is observed in physical and human systems alike. We investigated the role of top-down influences (prior knowledge of the perceptual modality their partner is using) and bottom-up factors (perceptual modality combination) on spontaneous interpersonal synchronization. We examined this phenomenon with respect to two different theoretical perspectives that differently emphasize top-down and bottom-up factors in interpersonal synchronization: joint-action/shared cognition theories and ecological-interactive theories. In an empirical study, twelve dyads performed a finger oscillation task while attending to each other's movements through either visual, auditory, or visual and auditory perceptual modalities. Half of the participants were given prior knowledge of their partner's perceptual capabilities for coordinating across these different perceptual modality combinations. We found that the effect of top-down influence depends on the perceptual modality combination between two individuals. When people used the same perceptual modalities, top-down influence resulted in less synchronization, and when people used different perceptual modalities, top-down influence resulted in more synchronization. Furthermore, persistence in the change in behavior as a result of having perceptual information about each other ('social memory') was stronger when this top-down influence was present. PMID:27033133
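
    One common way to quantify such spontaneous synchronization, shown below as a hedged sketch, is to extract each partner's instantaneous phase with the Hilbert transform and compute the mean resultant length of the relative phase (1 means perfectly phase-locked, 0 means no consistent phase relation). The signals are synthetic stand-ins for the dyads' finger-oscillation recordings.

```python
# Phase-locking index between two synthetic oscillation signals.
import numpy as np
from scipy.signal import hilbert

fs = 100.0                                   # sampling rate (Hz)
t = np.arange(0, 30, 1 / fs)
rng = np.random.default_rng(3)
partner_a = np.sin(2 * np.pi * 1.0 * t) + 0.2 * rng.standard_normal(t.size)
partner_b = np.sin(2 * np.pi * 1.0 * t + 0.6) + 0.2 * rng.standard_normal(t.size)

phase_a = np.angle(hilbert(partner_a - partner_a.mean()))
phase_b = np.angle(hilbert(partner_b - partner_b.mean()))
relative_phase = phase_a - phase_b

sync_index = np.abs(np.mean(np.exp(1j * relative_phase)))   # mean resultant length
print(f"synchronization index: {sync_index:.2f}")
# Comparing this index across modality combinations and knowledge conditions is
# the kind of contrast the study draws.
```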

  7. The Challenge of Bottom-Up Paradigm and Popular Participation in Sustainable Rural Development of Nigeria: The Way Forward

    OpenAIRE

    J. O. Adefila

    2012-01-01

    The paper is entitled ‘The challenge of bottom-up paradigm and popular participation in rural economic development of Nigeria’. There is the clamour for a shift from centre-down to bottom-up paradigm particularly among the rural developers considering the back-wash effects of the latter which tends to undermine the economic growth and development of the rural areas. The paper aims at reinforcing the adoption of bottom-up and popular participation approaches to rural socio-economic transformat...

  8. Integrating the bottom-up and top-down approach to energy economy modelling. The case of Denmark

    DEFF Research Database (Denmark)

    Klinge Jacobsen, Henrik

    1998-01-01

    This paper presents results from an integration project covering Danish models based on bottom-up and top-down approaches to energy-economy modelling. The purpose of the project was to identify theoretical and methodological problems for integrating existing models for Denmark and to implement an ... integration of the models. The integration was established through a number of links between energy bottom-up modules and a macroeconomic model. In this integrated model it is possible to analyse both top-down instruments, such as taxes, and bottom-up instruments, such as regulation of technology...

  9. An Improved Model of Producing Saliency Map for Visual Attention System

    Science.gov (United States)

    Huang, Jingang; Kong, Bin; Cheng, Erkang; Zheng, Fei

    The iLab Neuromorphic Vision Toolkit (iNVT), steadily kept up to date by the group around Laurent Itti, is one of the currently best known attention systems. Their model of bottom-up, or saliency-based, visual attention, as well as their implementation, serves as a basis for many research groups. How the feature maps are finally combined into the saliency map is a key point for this kind of visual attention system. We modified the original model of Laurent Itti to make it correspond more closely with human perception.
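
    The step this record discusses, normalising individual feature maps and combining them into one saliency map, can be sketched as follows; the normalisation used here is a simplified global-promotion rule only loosely inspired by Itti's N(.) operator, and the feature maps are random placeholders rather than outputs of the toolkit.

```python
# Simplified feature-map normalisation and combination into a saliency map.
import numpy as np

def normalize_map(feature_map):
    """Rescale to [0, 1] and promote maps that have one dominant peak."""
    fm = feature_map - feature_map.min()
    if fm.max() > 0:
        fm = fm / fm.max()
    global_max = fm.max()
    # crude stand-in for "mean of the other local maxima" in Itti's operator
    others = fm[fm < global_max]
    mean_of_others = others.mean() if others.size else 0.0
    return fm * (global_max - mean_of_others) ** 2

rng = np.random.default_rng(7)
intensity_map = rng.random((60, 80))      # placeholder conspicuity maps
colour_map = rng.random((60, 80))
orientation_map = rng.random((60, 80))

saliency = sum(normalize_map(m) for m in (intensity_map, colour_map, orientation_map)) / 3.0
print("most salient location:", np.unravel_index(np.argmax(saliency), saliency.shape))
```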

  10. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence that verifies why this should be. Additionally, while the literature has examined the mechanics of the different approaches, there has been less attention paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but to also gain insight into what psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies

  11. A bottom up approach for engineering catchments through sustainable runoff management

    Science.gov (United States)

    Wilkinson, M.; Quinn, P. F.; Jonczyk, J.; Burke, S.

    2010-12-01

    There is no doubt that our catchments are under great stress. There have been many accounts around the world of severe flood events and water quality issues within channels. As a result of these, ecological habitats in rivers are also under pressure. Within the United Kingdom, all these issues have been identified as key target areas for policy. Traditionally this has been managed by a policy-driven, top-down approach which is usually ineffective. A 'one size fits all' attitude often does not work. This paper presents a case study in northern England whereby a bottom-up approach is applied to the multipurpose management of catchments at the source (in the order of 1-10km2). This includes simultaneous tackling of water quality, flooding and ecological issues by creating sustainable runoff management solutions such as storage ponds, wetlands, beaver dams and willow riparian features. In order to identify the prevailing issues in a specific catchment, full and transparent stakeholder engagement is essential, with everybody who has a vested interest in the catchment being involved from the beginning. These problems can then be dealt with through the use of a novel catchment management toolkit, which is transferable to similar-scale catchments. However, evidence collected on the ground also allows for upscaling of the toolkit. The process gathers the scientific evidence about the effectiveness of existing or new measures, which can really change the catchment functions. Still, we need to get better at communicating the science to policy makers, and policy therefore must facilitate a bottom-up approach to land and water management. We show a test site for this approach in the Belford burn catchment (6km2), northern England. This catchment has problems with flooding and water quality. Increased sediment loads are affecting the nearby estuary which is an important ecological zone and numerous floods have affected the local village. A catchment engineering toolkit has been

  12. Succumbing to Bottom-Up Biases on Task Choice Predicts Increased Switch Costs in the Voluntary Task Switching Paradigm

    OpenAIRE

    JosephMichaelOrr

    2011-01-01

    Bottom-up biases are widely thought to influence task choice in the voluntary task switching paradigm. Definitive support for this hypothesis is lacking, however, because task choice and task performance are usually confounded. We therefore revisited this hypothesis using a paradigm in which task choice and task performance are temporally separated. As predicted, participants tended to choose the task that was primed by bottom-up biases. Moreover, such choices were linked to increased swit...

  13. Unsupervised Tattoo Segmentation Combining Bottom-Up and Top-Down Cues

    Energy Technology Data Exchange (ETDEWEB)

    Allen, Josef D [ORNL

    2011-01-01

    Tattoo segmentation is challenging due to the complexity and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin and then distinguish tattoo from the other skin via top-down prior in the image itself. Tattoo segmentation with unknown number of clusters is transferred to a figure-ground segmentation. We have applied our segmentation algorithm on a tattoo dataset and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purpose.

  14. Bottom-up and top-down controls on picoplankton in the East China Sea

    Directory of Open Access Journals (Sweden)

    C. Guo

    2013-05-01

    Full Text Available Dynamics of picoplankton population distribution in the East China Sea (ECS), a marginal sea in the western North Pacific Ocean, were studied during two "CHOICE-C" cruises in August 2009 (summer) and January 2010 (winter). Dilution experiments were conducted during the two cruises to investigate the growth and grazing among picophytoplankton populations. Picoplankton accounted for an average of ~29% (2% to 88%) of community carbon biomass in the ECS, with lower percentages in the plume region than in the shelf and Kuroshio regions. Averaged growth rates (μ) for Prochlorococcus (Pro), Synechococcus (Syn) and picoeukaryotes (peuk) were 0.36, 0.89, 0.90 d−1, respectively, in summer, and 0.46, 0.58, 0.56 d−1, respectively, in winter. Seawater salinity and nutrient availability exerted significant controls on picoplankton growth rate. Averaged grazing mortalities (m) were 0.46, 0.63, 0.68 d−1 in summer, and 0.22, 0.32, 0.22 d−1 in winter for Pro, Syn and peuk respectively. The three populations demonstrated very different distribution patterns regionally and seasonally, affected by both bottom-up and top-down controls. In summer, Pro, Syn and peuk were dominant in the Kuroshio, transitional and plume regions respectively. Protist grazing consumed 84%, 78%, 73% and 45%, 47%, 57% of production for Pro, Syn and peuk in summer and winter respectively, suggesting more significant top-down controls in summer. In winter, all three populations tended to distribute in offshore regions, although the area of coverage was different (peuk > Syn > Pro). Bottom-up factors can explain as much as 91.5%, 82% and 81.2% of Pro, Syn and peuk abundance variance in winter, while only 59.1% and 43.7% for Pro and peuk in summer. Regionally, Yangtze River discharge plays a significant role in affecting the intensity of top-down control, indicated by significant and negative association between salinity and grazing mortality of all three populations and higher grazing mortality to
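
    For readers unfamiliar with dilution experiments, the sketch below shows how growth (μ) and grazing mortality (m) are typically derived from them: the apparent growth rate k at dilution level D follows k = μ − m·D, so μ is the intercept and m the negative slope of a linear fit. The incubation numbers are invented for illustration and do not come from the cruises.

```python
# Landry-Hassett-style regression on toy dilution-experiment data.
import numpy as np

dilution = np.array([0.2, 0.4, 0.6, 0.8, 1.0])          # fraction of unfiltered seawater
p0 = np.array([1.0e4, 1.0e4, 1.0e4, 1.0e4, 1.0e4])      # initial cells mL^-1 (toy)
p24 = np.array([2.2e4, 1.95e4, 1.75e4, 1.55e4, 1.4e4])  # cells mL^-1 after 24 h (toy)
t_days = 1.0

k = np.log(p24 / p0) / t_days                           # apparent growth rate (d^-1)
slope, intercept = np.polyfit(dilution, k, 1)

mu = intercept                                          # intrinsic growth rate (d^-1)
m = -slope                                              # grazing mortality (d^-1)
print(f"mu = {mu:.2f} d^-1, m = {m:.2f} d^-1")
```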

  15. Enhancing Bottom-up and Top-down Proteomic Measurements with Ion Mobility Separations

    Energy Technology Data Exchange (ETDEWEB)

    Baker, Erin Shammel [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Burnum-Johnson, Kristin E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Ibrahim, Yehia M. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Orton, Daniel J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Monroe, Matthew E. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Kelly, Ryan T. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Moore, Ronald J. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Zhang, Xing [Pacific Northwest National Lab. (PNNL), Richland, WA (United States); Theberge, Roger [Boston Univ. of Medicine, MA (United States); Costello, Catherine E. [Boston Univ. of Medicine, MA (United States); Smith, Richard D. [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)

    2015-07-03

    Proteomic measurements with greater throughput, sensitivity and additional structural information enhance the in-depth characterization of complex mixtures and targeted studies with additional information and higher confidence. While liquid chromatography separation coupled with mass spectrometry (LC-MS) measurements have provided information on thousands of proteins in different sample types, the addition of another rapid separation stage providing structural information has many benefits for analyses. Technical advances in ion funnels and multiplexing have enabled ion mobility separations to be easily and effectively coupled with LC-MS proteomics to enhance the information content of measurements. Herein, we report on applications illustrating increased sensitivity, throughput, and structural information by utilizing IMS-MS and LC-IMS-MS measurements for both bottom-up and top-down proteomics measurements.

  16. A bottom-up perspective on leadership of collaborative innovation in the public sector

    DEFF Research Database (Denmark)

    Hansen, Jesper Rohr

    The thesis investigates how new forms of public leadership can contribute to solving complex problems in today's welfare societies through innovation. A bottom-up type of leadership for collaborative innovation addressing wicked problems is theorised, displaying a social constructive process ... approach to leadership; a theoretical model emphasises that leadership emerges through social processes of recognition. Leadership is recognised by utilising the uncertainty of a wicked problem and innovation to influence collaborators' sensemaking processes. The empirical setting is the City of Copenhagen ... organizations. A crucial condition for success is iterative leadership adaptation. In conclusion, the thesis finds that specialized professionals are indeed able to develop politically viable, innovative and collaborative solutions to wicked problems; and that such professionals are able to transform themselves ...

  17. Bottom-up design of 2D organic photocatalysts for visible-light driven hydrogen evolution

    Science.gov (United States)

    Wang, Peng; Jiang, Xue; Zhao, Jijun

    2016-01-01

    To design two-dimensional (2D) organocatalysts, three series of covalent organic frameworks (COFs) are constructed using bottom-up strategies, i.e. molecular selection, tunable linkage, and functionalization. First-principles calculations are performed to confirm their photocatalytic activity under visible light. Two of our constructed 2D COF models (B1 and C3) are identified as sufficiently efficient organocatalysts for visible-light water splitting. The controllable construction of such COFs from suitable organic subunits, linkages, and functional groups paves the way for correlating band edge alignments and geometry parameters of 2D organic materials. Our theoretical prediction not only provides essential insights into designing 2D-COF photocatalysts for water splitting, but also sparks other technological applications for 2D organic materials.

  18. Bottom-Up Cost Analysis of a High Concentration PV Module; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, K.; Woodhouse, M.; Lee, H.; Smestad, G.

    2015-04-13

    We present a bottom-up model of III-V multi-junction cells, as well as a high concentration PV (HCPV) module. We calculate $0.65/Wp(DC) manufacturing costs for our model HCPV module design with today's capabilities, and find that reducing cell costs and increasing module efficiency offer the most promising pathways for future cost reductions. Cell costs could be significantly reduced via an increase in manufacturing scale, substrate reuse, and improved manufacturing yields. We also identify several other significant drivers of HCPV module costs, including the Fresnel lens primary optic, module housing, thermal management, and the receiver board. These costs could potentially be lowered by employing innovative module designs.

  19. Coupled multi-physics simulation frameworks for reactor simulation: A bottom-up approach

    International Nuclear Information System (INIS)

    In this paper, we present a 'bottom-up' approach to multi-physics frameworks, where we first develop common interfaces to simulation data, then adapt existing physics modules to communicate through those interfaces. Interfaces are provided for geometry, mesh, and field data, and are independent of one another; a fourth interface is available for relating data between these interfaces. Physics modules read and write data through those common interfaces, which also provide access to common simulation services like parallel IO, mesh partitioning, etc. Multi-physics codes are assembled as a combination of physics modules, services, interface implementations, and driver code which coordinates calling these various pieces. The framework being constructed as part of this effort, referred to as SHARP, is shown
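
    The composition pattern described here, common data interfaces first, physics modules adapted to read and write through them, and a thin driver that only coordinates calls, can be sketched with ordinary Python protocols. The names below (FieldInterface, the two toy modules, the driver) are assumptions made for illustration; this is not the SHARP API.

        from typing import Dict, Protocol


        class FieldInterface(Protocol):
            """Common field-data interface; mesh and geometry interfaces would be analogous."""
            def read(self, name: str) -> Dict[int, float]: ...
            def write(self, name: str, values: Dict[int, float]) -> None: ...


        class InMemoryField:
            """Trivial field store keyed by cell id (stands in for common I/O services)."""
            def __init__(self):
                self._data: Dict[str, Dict[int, float]] = {}
            def read(self, name):
                return self._data.get(name, {})
            def write(self, name, values):
                self._data[name] = dict(values)


        class NeutronicsModule:
            """Writes a power field; knows nothing about the thermal module."""
            def step(self, cells, fields: FieldInterface):
                fields.write("power", {c: 1.0 for c in cells})


        class ThermalModule:
            """Reads the power field through the same interface and writes a temperature field."""
            def step(self, cells, fields: FieldInterface):
                power = fields.read("power")
                fields.write("temperature", {c: 300.0 + 10.0 * p for c, p in power.items()})


        def driver(cells, fields, modules, n_steps=3):
            # The driver only coordinates the calls; all data exchange goes through the interface.
            for _ in range(n_steps):
                for module in modules:
                    module.step(cells, fields)


        cells = [0, 1, 2, 3]  # stand-in for cells exposed by a mesh interface
        fields = InMemoryField()
        driver(cells, fields, [NeutronicsModule(), ThermalModule()])
        print(fields.read("temperature"))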

  20. Bottom-up formation of endohedral mono-metallofullerenes is directed by charge transfer

    Science.gov (United States)

    Dunk, Paul W.; Mulet-Gas, Marc; Nakanishi, Yusuke; Kaiser, Nathan K.; Rodríguez-Fortea, Antonio; Shinohara, Hisanori; Poblet, Josep M.; Marshall, Alan G.; Kroto, Harold W.

    2014-12-01

    An understanding of chemical formation mechanisms is essential to achieve effective yields and targeted products. One of the most challenging endeavors is the synthesis of molecular nanocarbon. Endohedral metallofullerenes are of particular interest because of their unique properties that offer promise in a variety of applications. Nevertheless, their mechanism of formation from metal-doped graphite has largely eluded experimental study, because harsh synthetic methods are required to obtain them. Here we report bottom-up formation of mono-metallofullerenes under core synthesis conditions. Charge transfer is a principal factor that guides formation, as revealed by studying metallofullerene formation with virtually all available elements of the periodic table. These results could enable production strategies that overcome long-standing problems that hinder current and future applications of metallofullerenes.

  1. Strain Response of Hot-Mix Asphalt Overlays for Bottom-Up Reflective Cracking

    CERN Document Server

    Ghauch, Ziad G

    2011-01-01

    This paper examines the strain response of typical HMA overlays above jointed PCC slabs prone to bottom-up reflective cracking. The occurrence of reflective cracking under the combined effect of traffic and environmental loading significantly reduces the design life of the HMA overlay and can lead to its premature failure. In this context, viscoelastic material properties combined with cyclic vehicle loadings and pavement temperature distribution were implemented in a series of FE models in order to study the evolution of horizontal tensile and shear strains at the bottom of the HMA overlay. The effect of several design parameters, such as subbase and subgrade moduli, vehicle speed, overlay thickness, and temperature condition, on the horizontal and shear strain response was investigated. Results obtained show that the rate of horizontal and shear strain increase at the bottom of the HMA overlay drops with higher vehicle speed, higher subgrade modulus, and higher subbase modulus. Moreover, the rate of horizon...

  2. Bottom-up design of 2D organic photocatalysts for visible-light driven hydrogen evolution

    International Nuclear Information System (INIS)

    To design two-dimensional (2D) organocatalysts, three series of covalent organic frameworks (COFs) are constructed using bottom-up strategies, i.e. molecular selection, tunable linkage, and functionalization. First-principles calculations are performed to confirm their photocatalytic activity under visible light. Two of our constructed 2D COF models (B1 and C3) are identified as sufficiently efficient organocatalysts for visible-light water splitting. The controllable construction of such COFs from suitable organic subunits, linkages, and functional groups paves the way for correlating band edge alignments and geometry parameters of 2D organic materials. Our theoretical prediction not only provides essential insights into designing 2D-COF photocatalysts for water splitting, but also sparks other technological applications for 2D organic materials. (paper)

  3. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    Full Text Available This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers, etc.) engage people in challenging undertakings (e.g., innovation) that require everyone's commitment to such a degree that they would have to go beyond what could be reasonably expected in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, affinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a complementing theory to other strategic management and leading theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  4. Bottom-Up Nanofabrication of Supported Noble Metal Alloy Nanoparticle Arrays for Plasmonics

    DEFF Research Database (Denmark)

    Nugroho, Ferry A. A.; Iandolo, Beniamino; Wagner, Jakob Birkedal; Langhammer, Christoph

    2016-01-01

    Mixing different elements at the nanoscale to obtain alloy nanostructures with fine-tuned physical and chemical properties offers appealing opportunities for nanotechnology and nanoscience. However, despite widespread successful application of alloy nanoparticles made by colloidal synthesis in ... optimization of the surface density. These cannot be fulfilled even using state-of-the-art self-assembly strategies of colloids. As a solution, we present here a generic bottom-up nanolithography-compatible fabrication approach for large-area arrays of alloy nanoparticles on surfaces. To illustrate the ... concept, we focus on Au-based binary and ternary alloy systems with Ag, Cu, and Pd, due to their high relevance for nanoplasmonics and complete miscibility, and characterize their optical properties. Moreover, as an example for the relevance of the obtained materials for integration in devices, we ...

  5. Manufacturing at Nanoscale: Top-Down, Bottom-up and System Engineering

    International Nuclear Information System (INIS)

    The current nano-technology revolution is facing several major challenges: to manufacture nanodevices below 20 nm, to fabricate three-dimensional complex nano-structures, and to heterogeneously integrate multiple functionalities. To tackle these grand challenges, the Center for Scalable and Integrated NAno-Manufacturing (SINAM), an NSF Nanoscale Science and Engineering Center, set its goal to establish a new manufacturing paradigm that integrates an array of new nano-manufacturing technologies, including plasmonic imaging lithography and ultramolding imprint lithography aiming toward critical resolutions of 1-10 nm, and hybrid top-down and bottom-up technologies to achieve massively parallel integration of heterogeneous nanoscale components into higher-order structures and devices. Furthermore, SINAM will develop system engineering strategies to scale up these nano-manufacturing technologies. SINAM's integrated research and education platform will shed light on a broad range of potential applications in computing, telecommunication, photonics, biotechnology, health care, and national security

  6. Using dichotic listening to study bottom-up and top-down processing in children and adults.

    Science.gov (United States)

    Andersson, Martin; Llera, John Eric; Rimol, Lars M; Hugdahl, Kenneth

    2008-09-01

    The study examined top-down attention modulation of bottom-up processing in children and adults under conditions of varying bottom-up stimulus demands. Voiced and unvoiced consonant-vowel syllables were used in a dichotic-listening situation to manipulate the strength of the bottom-up, stimulus-driven right ear advantage when subjects were instructed to focus attention on, and report, either the left or right ear stimulus. We predicted that children would differ from adults in their ability to use attention to modulate a lateralized ear advantage, particularly when there was a conflict between the direction of the bottom-up ear advantage and the direction of the top-down attention instruction. Thirty children and 30 adults were presented with dichotic presentations of consonant-vowel syllables. The results showed that the voicing manipulation affected the strength of the ear advantage, and that the children performed significantly below the adults when the voicing parameter caused a strong conflict between bottom-up and top-down processing. Thus, children seem to lack the cognitive flexibility necessary to modulate a stimulus-driven bottom-up ear advantage, particularly in situations where the right ear advantage (REA) is enhanced by the acoustic properties of the stimuli and attentional demands require a left ear shift. It is suggested that varying the stimulus demands in a dichotic-listening situation may be a novel way to study cognitive development. PMID:18608228

  7. Achieving social-ecological fit through bottom-up collaborative governance: an empirical investigation

    Directory of Open Access Journals (Sweden)

    Angela M. Guerrero

    2015-12-01

    Full Text Available Significant benefits can arise from collaborative forms of governance that foster self-organization and flexibility. Likewise, governance systems that fit with the extent and complexity of the system under management are considered essential to our ability to solve environmental problems. However, from an empirical perspective the fundamental question of whether self-organized (bottom-up) collaborative forms of governance are able to accomplish adequate fit is unresolved. We used new theory and methodological approaches underpinned by interdisciplinary network analysis to address this gap by investigating three governance challenges that relate to the problem of fit: shared management of ecological resources, management of interconnected ecological resources, and cross-scale management. We first identified a set of social-ecological network configurations that represent the hypothesized ways in which collaborative arrangements can contribute to addressing these challenges. Using social and ecological data from a large-scale biodiversity conservation initiative in Australia, we empirically determined how well the observed patterns of stakeholder interactions reflect these network configurations. We found that stakeholders collaborate to manage individual parcels of native vegetation, but not for the management of interconnected parcels. In addition, our data show that the collaborative arrangements enable management across different scales (local, regional, supraregional). Our study provides empirical support for the ability of collaborative forms of governance to address the problem of fit, but also suggests that in some cases the establishment of bottom-up collaborative arrangements would likely benefit from specific guidance to facilitate the establishment of collaborations that better align with the ways ecological resources are interconnected across the landscape. In our case study region, this would improve the capacity of stakeholders to

  8. A bottom-up approach of stochastic demand allocation in water quality modelling

    Directory of Open Access Journals (Sweden)

    E. J. M. Blokker

    2010-04-01

    Full Text Available An "all pipes" hydraulic model of a drinking water distribution system was constructed with two types of demand allocations. One is constructed with the conventional top-down approach, i.e. a demand multiplier pattern from the booster station is allocated to all demand nodes with a correction factor to account for the average water demand on that node. The other is constructed with a bottom-up approach of demand allocation, i.e., each individual home is represented by one demand node with its own stochastic water demand pattern. This was done for a drinking water distribution system of approximately 10 km of mains and serving ca. 1000 homes. The system was tested in a real life situation.

    The stochastic water demand patterns were constructed with the end-use model SIMDEUM on a per second basis and per individual home. Before applying the demand patterns in a network model, some temporal aggregation was done. The flow entering the test area was measured and a tracer test with sodium chloride was performed to determine travel times. The two models were validated on the total sum of demands and on travel times.

    The study showed that the bottom-up approach leads to realistic water demand patterns and travel times, without the need for any flow measurements or calibration. In the periphery of the drinking water distribution system it is not possible to calibrate models on pressure, because head losses are too low. The study shows that in the periphery it is also difficult to calibrate on water quality (e.g. with tracer measurements), as a consequence of the high variability between days. The stochastic approach of hydraulic modelling gives insight into the variability of travel times as an added feature beyond the conventional way of modelling.
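
    The difference between the two allocations compared in this record is where the temporal pattern comes from: the top-down model scales one measured booster-station pattern by each node's average demand, while the bottom-up model sums independent stochastic per-home patterns. The sketch below only illustrates that structural difference; the demand values and the noise model are invented (the study itself used SIMDEUM end-use patterns on a per-second basis).

        import numpy as np

        rng = np.random.default_rng(0)
        hours = np.arange(24)

        # One measured multiplier pattern at the booster station, normalised to mean 1.
        station_pattern = 1.0 + 0.6 * np.sin((hours - 7.0) / 24.0 * 2.0 * np.pi)
        station_pattern /= station_pattern.mean()

        avg_demand_per_home = 0.5        # m3/day per home, illustrative value
        n_homes_on_node = 5              # a peripheral node serving five homes

        # Top-down: the node gets the station pattern scaled by its average demand.
        top_down_node = n_homes_on_node * avg_demand_per_home / 24.0 * station_pattern

        # Bottom-up: each home gets its own stochastic pattern (a crude gamma-noise
        # stand-in for SIMDEUM output); the node demand is the sum over its homes.
        mean_hourly = station_pattern * avg_demand_per_home / 24.0
        home_patterns = rng.gamma(shape=2.0, scale=mean_hourly / 2.0, size=(n_homes_on_node, 24))
        bottom_up_node = home_patterns.sum(axis=0)

        print("daily volume  top-down :", round(top_down_node.sum(), 3), "m3")
        print("daily volume  bottom-up:", round(bottom_up_node.sum(), 3), "m3")
        print("peak hour     top-down :", round(top_down_node.max(), 4), "m3/h")
        print("peak hour     bottom-up:", round(bottom_up_node.max(), 4), "m3/h")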

  9. Top-down and bottom-up approaches for cost estimating new reactor designs

    International Nuclear Information System (INIS)

    For several years, Generation-4 designs will be 'pre-conceptual' for the less mature concepts and 'preliminary' for the more mature concepts. In this situation, appropriate data for some of the plant systems may be lacking to develop a bottom-up cost estimate. Therefore, a more global approach, the Top-Down Approach (TDA), is needed to help the designers and decision makers in comparing design options. It utilizes more or less simple models for cost estimating the different parts of a design. The TDA cost estimating effort applies to a whole functional element whose cost is approached by similar estimations coming from existing data, ratios and models, for a given range of variation of parameters. Modeling is used when direct analogy is not possible. There are two types of models, global and specific ones. Global models are applied to cost modules related to the Code Of Account. Exponential formulae such as C_i = A_i + (B_i x P_i^n) are used when there are cost data for comparable modules in nuclear or other industries. Specific cost models are developed for major specific components of the plant: - process equipment such as the reactor vessel, steam generators or large heat exchangers. - buildings, with formulae estimating the construction cost from a base cost per m3 of building volume. - systems, where unit costs, cost ratios and models are used, depending on the level of detail of the design. The Bottom-Up Approach (BUA), which is based on unit prices coming from similar equipment or from manufacturer consulting, is very valuable and gives better cost estimations than TDA when it can be applied, that is, at a rather late stage of the design. Both approaches are complementary when some parts of the design are detailed enough to be estimated by BUA, and when BUA results are used to check TDA results and to improve TDA models. This methodology is applied to the HTR (High Temperature Reactor) concept and to an advanced PWR design
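
    The global TDA models mentioned above reduce to a power-law scaling of cost with a characteristic sizing parameter, C_i = A_i + B_i x P_i^n, where A_i is a fixed cost component, B_i a proportionality coefficient, P_i the sizing parameter of module i and n a scaling exponent. The snippet below simply evaluates such a model; the coefficient values are illustrative placeholders, not figures from this work.

        def tda_module_cost(A, B, P, n):
            """Exponential (power-law) top-down cost model: C = A + B * P**n."""
            return A + B * P ** n

        # Hypothetical example: scaling a module from a 100-unit reference size upward,
        # with an exponent n < 1 representing economies of scale.
        for P in (100, 300, 600):
            cost = tda_module_cost(A=5.0, B=2.0, P=P, n=0.6)
            print(f"P = {P:4d}  ->  estimated cost = {cost:7.1f} (arbitrary currency units)")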

  10. Scaling up self-assembly: bottom-up approaches to macroscopic particle organization.

    Science.gov (United States)

    Lash, M H; Fedorchak, M V; McCarthy, J J; Little, S R

    2015-07-28

    This review presents an overview of recent work in the field of non-Brownian particle self-assembly. Compared to nanoparticles that naturally self-assemble due to Brownian motion, larger, non-Brownian particles (d > 6 μm) are less prone to autonomously organize into crystalline arrays. The tendency for particle systems to experience immobilization and kinetic arrest grows with particle radius. In order to overcome this kinetic limitation, some type of external driver must be applied to act as an artificial "thermalizing force" upon non-Brownian particles, inducing particle motion and subsequent crystallization. Many groups have explored the use of various agitation methods to overcome the natural barriers preventing self-assembly to which non-Brownian particles are susceptible. The ability to create materials from a bottom-up approach with these characteristics would allow for precise control over their pore structure (size and distribution) and surface properties (topography, functionalization and area), resulting in improved regulation of key characteristics such as mechanical strength, diffusive properties, and possibly even photonic properties. This review will highlight these approaches, as well as discuss the potential impact of bottom-up macroscale particle assembly. The applications of such technology range from customizable and autonomously self-assembled niche microenvironments for drug delivery and tissue engineering to new acoustic dampening, battery, and filtration materials, among others. Additionally, crystals made from non-Brownian particles resemble naturally derived materials such as opals, zeolites, and biological tissue (i.e. bone, cartilage and lung), due to their high surface area, pore distribution, and tunable (multilevel) hierarchy. PMID:25947543

  11. Top-down instead of bottom-up estimates of uncertainty in INAA results?

    International Nuclear Information System (INIS)

    The initial publication of the ISO Guide to the Expression of Uncertainty in Measurement (GUM) and many related documents has resulted in a worldwide awareness of the importance of a realistic estimate of the value reported after the +/- sign. The evaluation of uncertainty in measurement, as introduced by the GUM, is derived from the principles applied in physical measurements. Many testing laboratories have already experienced large problems in applying these principles in e.g. (bio)chemical measurements, resulting in time-consuming evaluations and costly additional experiments. Other, more pragmatic and less costly approaches have been proposed to obtain a realistic estimate of the range in which the true value of the measurement may be found with a certain degree of probability. One of these approaches, the 'top-down method', is based on the standard deviation of results from intercomparison data. This approach is much easier for tests for which it is difficult to establish a full measurement equation, or for which, e.g., matrix-matching reference materials are absent. It has been demonstrated that the GUM 'bottom-up' approach to evaluating uncertainty in measurement can easily be applied in instrumental neutron activation analysis (INAA), as all significant sources of uncertainty can be evaluated. INAA is therefore a valuable technique for testing the validity of the top-down approach. In this contribution, examples of the top-down evaluation of uncertainty in INAA derived from participation in intercomparison rounds and proficiency testing schemes will be presented. The results will be compared with the bottom-up evaluation of uncertainty, and the ease of applicability, validity and usefulness of both approaches will be discussed.
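
    The contrast drawn in this record is between combining individual uncertainty components through the measurement equation (the GUM bottom-up route, summed in quadrature) and taking the dispersion of a laboratory's intercomparison results as the uncertainty estimate (the top-down route). The sketch below makes that arithmetic concrete with invented component values and invented intercomparison ratios; none of the numbers come from the paper.

        import math
        import statistics

        # Bottom-up (GUM): combine relative standard uncertainty components in quadrature.
        components = {                     # hypothetical relative uncertainties for an INAA result
            "counting statistics": 0.015,
            "flux monitoring":     0.010,
            "sample geometry":     0.008,
            "nuclear data":        0.012,
        }
        u_bottom_up = math.sqrt(sum(u ** 2 for u in components.values()))

        # Top-down: relative standard deviation of the laboratory's results in
        # intercomparison rounds, expressed as result / assigned value (invented data).
        ratios = [0.98, 1.01, 1.03, 0.99, 1.02, 0.97, 1.00]
        u_top_down = statistics.stdev(ratios) / statistics.mean(ratios)

        print(f"bottom-up combined relative uncertainty : {u_bottom_up:.3f}")
        print(f"top-down relative uncertainty estimate  : {u_top_down:.3f}")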

  12. Prefrontal /accumbal catecholamine system processes high motivational salience.

    Directory of Open Access Journals (Sweden)

    Stefano Puglisi-Allegra

    2012-06-01

    Neural mechanisms mediating motivational salience attribution are, therefore, very important for individual and species survival and for well-being. However, these neural mechanisms could be implicated in attribution of abnormal motivational salience to different stimuli leading to maladaptive compulsive seeking or avoidance. We have offered the first evidence that prefrontal cortical norepinephrine transmission is a necessary condition for motivational salience attribution to highly salient stimuli, through modulation of dopamine in the nucleus accumbens, a brain area involved in all motivated behaviors. Moreover, we have shown that prefrontal-accumbal catecholamine system determines approach or avoidance responses to both reward- and aversion-related stimuli only when the salience of the unconditioned stimulus is high enough to induce sustained catecholamine activation, thus affirming that this system processes motivational salience attribution selectively to highly salient events.

  13. A Novel GBM Saliency Detection Model Using Multi-Channel MRI.

    Directory of Open Access Journals (Sweden)

    Subhashis Banerjee

    Full Text Available The automatic computerized detection of regions of interest (ROI) is an important step in the process of medical image processing and analysis. The reasons are many, and include an increasing amount of available medical imaging data, the existence of inter-observer and inter-scanner variability, and the need to improve the accuracy of automatic detection in order to assist doctors in diagnosing faster and on time. A novel algorithm, based on visual saliency, is developed here for the identification of tumor regions from MR images of the brain. The GBM saliency detection model is designed by taking a cue from the concept of visual saliency in natural scenes. A visually salient region is typically rare in an image, and contains highly discriminating information, with attention getting immediately focused upon it. Although color is typically considered the most important feature in a bottom-up saliency detection model, we circumvent this issue in the inherently gray-scale MR framework. We develop a novel pseudo-coloring scheme, based on the three MRI sequences, viz. FLAIR, T2 and T1C (contrast enhanced with Gadolinium). A bottom-up strategy, based on a new pseudo-color distance and spatial distance between image patches, is defined for highlighting the salient regions in the image. This multi-channel representation of the image and saliency detection model help in automatically and quickly isolating the tumor region, for subsequent delineation, as is necessary in medical diagnosis. The effectiveness of the proposed model is evaluated on MRI of 80 subjects from the BRATS database in terms of the saliency map values. Using ground truth of the tumor regions for both high- and low-grade gliomas, the results are compared with four highly cited saliency detection models from the literature. In all cases the AUC scores from the ROC analysis are found to be more than 0.999 ± 0.001 over different tumor grades, sizes and positions.
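
    The bottom-up part of the model described above assigns each image patch a rarity score that combines a distance in the pseudo-color space built from FLAIR, T2 and T1C with a spatial distance between patches. The snippet below sketches one patch-rarity score of that kind on a three-channel slice; the patch size, the distance weighting and the exact combination rule are assumptions, not the authors' published formulation.

        import numpy as np

        def patch_saliency(pseudo_color, patch=8, sigma_s=0.25):
            """Bottom-up saliency from pseudo-color and spatial patch distances.
            pseudo_color: (H, W, 3) float array built from FLAIR, T2 and T1C slices."""
            H, W, _ = pseudo_color.shape
            gh, gw = H // patch, W // patch
            feats, centers = [], []
            for i in range(gh):
                for j in range(gw):
                    block = pseudo_color[i*patch:(i+1)*patch, j*patch:(j+1)*patch]
                    feats.append(block.reshape(-1, 3).mean(axis=0))
                    centers.append(((i + 0.5) / gh, (j + 0.5) / gw))
            feats = np.asarray(feats)
            centers = np.asarray(centers)

            # A patch is salient if its pseudo-color differs strongly from the other
            # patches, with far-away patches down-weighted by their spatial distance.
            dc = np.linalg.norm(feats[:, None, :] - feats[None, :, :], axis=-1)
            ds = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
            sal = (dc / (1.0 + ds / sigma_s)).mean(axis=1)

            sal = (sal - sal.min()) / (sal.max() - sal.min() + 1e-9)
            return sal.reshape(gh, gw)

        # Example with a random stand-in for a pseudo-colored MR slice.
        coarse_map = patch_saliency(np.random.rand(64, 64, 3))
        print(coarse_map.shape)   # (8, 8) patch-level saliency map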

  14. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Directory of Open Access Journals (Sweden)

    Merger Eduard

    2012-08-01

    Full Text Available Abstract Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+, we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs), ranging between US$ 4.5 - 12.2 tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs, in a range of US$ 0.06 – 0.11 tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to

  15. Pressurized Pepsin Digestion in Proteomics: An Automatable Alternative to Trypsin for Integrated Top-down Bottom-up Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Ferrer, Daniel; Petritis, Konstantinos; Robinson, Errol W.; Hixson, Kim K.; Tian, Zhixin; Lee, Jung Hwa; Lee, Sang-Won; Tolic, Nikola; Weitz, Karl K.; Belov, Mikhail E.; Smith, Richard D.; Pasa-Tolic, Ljiljana

    2011-02-01

    Integrated top-down bottom-up proteomics combined with online digestion has great potential to improve the characterization of protein isoforms in biological systems and is amenable to high-throughput proteomics experiments. Bottom-up proteomics ultimately provides the peptide sequences derived from the tandem MS analyses of peptides after the proteome has been digested. Top-down proteomics conversely entails the MS analyses of intact proteins for more effective characterization of genetic variations and/or post-translational modifications (PTMs). Herein, we describe recent efforts towards efficient integration of bottom-up and top-down LC-MS-based proteomic strategies. Since most proteomic platforms (i.e. LC systems) operate in acidic environments, we exploited the compatibility of pepsin (i.e. the enzyme's natural acidic activity) for the integration of bottom-up and top-down proteomics. Pressure-enhanced pepsin digestions were successfully performed and characterized with several standard proteins in either an offline mode using a Barocycler or an online mode using a modified high-pressure LC system referred to as a fast online digestion system (FOLDS). FOLDS was tested using pepsin and a whole microbial proteome, and the results compared against traditional trypsin digestions on the same platform. Additionally, FOLDS was integrated with a RePlay configuration to demonstrate an ultra-rapid integrated bottom-up top-down proteomic strategy employing a standard mixture of proteins and a monkeypox virus proteome.

  16. Bottom-up and top-down influences at untrained conditions determine perceptual learning specificity and transfer

    Science.gov (United States)

    Xiong, Ying-Zi; Zhang, Jun-Yun; Yu, Cong

    2016-01-01

    Perceptual learning is often orientation and location specific, which may indicate neuronal plasticity in early visual areas. However, learning specificity diminishes with additional exposure of the transfer orientation or location via irrelevant tasks, suggesting that the specificity is related to untrained conditions, likely because neurons representing untrained conditions are neither bottom-up stimulated nor top-down attended during training. To demonstrate these top-down and bottom-up contributions, we applied a “continuous flash suppression” technique to suppress the exposure stimulus into sub-consciousness, and with additional manipulations to achieve pure bottom-up stimulation or top-down attention with the transfer condition. We found that either bottom-up or top-down influences enabled significant transfer of orientation and Vernier discrimination learning. These results suggest that learning specificity may result from under-activations of untrained visual neurons due to insufficient bottom-up stimulation and/or top-down attention during training. High-level perceptual learning thus may not functionally connect to these neurons for learning transfer. DOI: http://dx.doi.org/10.7554/eLife.14614.001 PMID:27377357

  17. The Early Anthropogenic Hypothesis: Top-Down and Bottom-up Evidence

    Science.gov (United States)

    Ruddiman, W. F.

    2014-12-01

    Two complementary lines of evidence support the early anthropogenic hypothesis. Top-down evidence comes from comparing Holocene greenhouse-gas trends with those during equivalent intervals of previous interglaciations. The increases in CO2 and CH4 during the late Holocene are anomalous compared to the decreasing trends in a stacked average of previous interglaciations, thereby supporting an anthropogenic origin. During interglacial stage 19, the closest Holocene insolation analog, CO2 fell to 245 ppm by the time equivalent to the present, in contrast to the observed pre-industrial rise to 280-285 ppm. The 245-ppm level measured in stage 19 falls at the top of the natural range predicted by the original anthropogenic hypothesis of Ruddiman (2003). Bottom-up evidence comes from a growing list of archeological and other compilations showing major early anthropogenic transformations of Earth's surface. Key examples include: efforts by Dorian Fuller and colleagues mapping the spread of irrigated rice agriculture across southern Asia and its effects on CH4 emissions prior to the industrial era; an additional effort by Fuller showing the spread of methane-emitting domesticated livestock across Asia and Africa (coincident with the spread of fertile crescent livestock across Europe); historical compilations by Jed Kaplan and colleagues documenting very high early per-capita forest clearance in Europe, thus underpinning simulations of extensive pre-industrial clearance and large CO2 emissions; and wide-ranging studies by Erle Ellis and colleagues of early anthropogenic land transformations in China and elsewhere.

  18. Top-down silicon microcantilever with coupled bottom-up silicon nanowire for enhanced mass resolution

    International Nuclear Information System (INIS)

    A stepped cantilever composed of a bottom-up silicon nanowire coupled to a top-down silicon microcantilever, electrostatically actuated and with capacitive or optical readout, is fabricated and analyzed, both theoretically and experimentally, for mass sensing applications. The mass sensitivity at the nanowire free end and the frequency resolution considering thermomechanical noise are computed for different nanowire dimensions. The results obtained show that the coupled structure presents a very good mass sensitivity thanks to the nanowire, where the mass depositions take place, while also presenting a very good frequency resolution due to the microcantilever, where the transduction is carried out. A two-fold improvement in mass sensitivity with respect to that of the standalone microcantilever is experimentally demonstrated, and at least an order-of-magnitude improvement is theoretically predicted simply by changing the nanowire length. Very similar frequency resolutions are experimentally measured and theoretically predicted for a standalone microcantilever and for a microcantilever-nanowire coupled system. Thus, an improvement in the mass sensing resolution of the microcantilever-nanowire stepped cantilever is demonstrated with respect to that of the standalone microcantilever. (paper)
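
    The trade-off described here, sensitivity set by the light nanowire end where mass lands and resolution set by the microcantilever transduction, can be made concrete with the textbook point-mass relation for resonant mass sensors: a small added mass Δm shifts the resonance by Δf ≈ −(f0 / 2 m_eff) Δm, so the smallest resolvable mass is δm ≈ 2 m_eff δf / f0. The snippet below evaluates that relation for two hypothetical devices; the numbers are illustrative and not taken from the paper.

        def min_detectable_mass(f0_hz, m_eff_kg, delta_f_hz):
            """Point-mass approximation: delta_m = 2 * m_eff * delta_f / f0."""
            return 2.0 * m_eff_kg * delta_f_hz / f0_hz

        # Hypothetical comparison: bare microcantilever vs. a much lighter effective
        # mass at the tip of a coupled nanowire, for the same frequency resolution.
        devices = {
            "microcantilever alone      ": dict(f0_hz=1.0e5, m_eff_kg=1.0e-12, delta_f_hz=0.1),
            "with coupled nanowire tip  ": dict(f0_hz=1.0e5, m_eff_kg=1.0e-13, delta_f_hz=0.1),
        }
        for name, p in devices.items():
            print(f"{name}: delta_m ~ {min_detectable_mass(**p):.1e} kg")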

  19. Vasculogenic potential evaluation of bottom-up, PCL scaffolds guiding early angiogenesis in tissue regeneration.

    Science.gov (United States)

    Rossi, L; Attanasio, C; Vilardi, E; De Gregorio, M; Netti, P A

    2016-06-01

    Vascularization is a key factor in the successful integration of tissue engineered (TE) grafts inside the host body. Biological functions of the newly formed tissue depend, in fact, on a reliable and fast spread of the vascular network inside the scaffold. In this study, we propose a technique for evaluating vascularization in TE constructs assembled by a bottom-up approach. The rational, ordered assembly of building blocks (BBs) into a 3D scaffold can improve vessel penetration, and, unlike most current technologies, is compatible with the insertion of different elements that can be designed independently (e.g. structural units, growth factor depots, etc.). Poly(ε-caprolactone) scaffolds composed of orderly and randomly assembled sintered microspheres were used to assess the degree of vascularization in a pilot in vivo study. Scaffolds were implanted in a rat subcutaneous pocket model and retrieved after 7 days. We introduce three quantitative factors as a measure of vascularization: the total percentage of vascularization, the vessel diameter distribution and the vascular penetration depth. These parameters were derived by image analysis of microcomputed tomographic scans of biological specimens perfused with a radiopaque polymer. The outcome of this study suggests that the rational assembly of BBs helps the onset and organization of a fully functional vascular network. PMID:27117793

  20. Optical and electronic properties study of bottom-up graphene nanoribbons for photovoltaic applications

    Science.gov (United States)

    Villegas, Cesar E. P.; Rocha, Alexandre

    2015-03-01

    Graphene nanoribbons (GNRs) turn out to be serious contenders for several optoelectronic applications due to their physical properties. Recently, bottom-up methods using the assembly of appropriate precursor molecules were shown to be an exciting pathway towards making precise nanoribbons. In particular, it has been demonstrated that so-called cove-shaped GNRs absorb light in the visible part of the spectrum, suggesting they could be used for photovoltaic applications. In solar cells, the key ingredient is the presence of excitons and their subsequent diffusion along a donor material. This is influenced by the character of the different excitations taking place, as well as the exciton binding energy. Thus, in this work we use many-body corrected density functional theory to simulate the optical properties of these nanoribbons. We elucidate the most important transitions occurring in these systems, and identify types of excitations that have not been previously observed in conventional nanoribbons. We also find that the exciton binding energies for all the structures we considered are in the eV range, which enhances the diffusion lengths of the particle-hole pairs. Finally, we estimate the potential of these systems as solar cells by calculating the short-circuit current. The authors thank FAPESP for financial support.

  1. Bottom-Up, Wet Chemical Technique for the Continuous Synthesis of Inorganic Nanoparticles

    Directory of Open Access Journals (Sweden)

    Annika Betke

    2014-01-01

    Full Text Available Continuous wet chemical approaches for the production of inorganic nanoparticles are important for large scale production of nanoparticles. Here we describe a bottom-up, wet chemical method applying a microjet reactor. This technique allows the separation between nucleation and growth in a continuous reactor environment. Zinc oxide (ZnO), magnetite (Fe3O4), as well as brushite (CaHPO4·2H2O) particles with a small particle size distribution can be obtained continuously by using the rapid mixing of two precursor solutions and the fast removal of the nuclei from the reaction environment. The final particles were characterized by FT-IR, TGA, DLS, XRD and SEM techniques. Systematic studies on the influence of the different process parameters, such as flow rate and process temperature, show that the particle size can be influenced. Zinc oxide was obtained with particle sizes between 44 nm and 102 nm. The obtained magnetite particles have particle sizes in the range of 46 nm to 132 nm. Brushite behaves differently; the obtained particles were shaped like small plates with edge lengths between 100 nm and 500 nm.

  2. Rational design of modular circuits for gene transcription: A test of the bottom-up approach

    Directory of Open Access Journals (Sweden)

    Giordano Emanuele

    2010-11-01

    Full Text Available Abstract Background Most synthetic circuits developed so far have been designed by an ad hoc approach, using a small number of components (i.e. LacI, TetR) and a trial-and-error strategy. We are at the point where an increasing number of modular, interchangeable and well-characterized components is needed to expand the construction of synthetic devices and to allow a rational approach to the design. Results We used interchangeable modular biological parts to create a set of novel synthetic devices for controlling gene transcription, and we developed a mathematical model of the modular circuits. Model parameters were identified by experimental measurements from a subset of modular combinations. The model revealed an unexpected feature of the lactose repressor system, i.e. a residual binding affinity for the operator site by induced lactose repressor molecules. Once this residual affinity was taken into account, the model properly reproduced the experimental data from the training set. The parameters identified in the training set allowed the prediction of the behavior of networks not included in the identification procedure. Conclusions This study provides new quantitative evidence that the use of independent and well-characterized biological parts and mathematical modeling, what is called a bottom-up approach to the construction of gene networks, can allow the design of new and different devices re-using the same modular parts.
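
    The residual binding affinity revealed by the model can be captured in a standard Hill-type description of a repressed promoter by letting the induced repressor keep a weak affinity for the operator instead of none at all. The toy model below integrates such a description with scipy; every parameter value, and the specific way the residual affinity enters, is an assumption for illustration rather than a parameter identified in the paper.

        from scipy.integrate import solve_ivp

        # Toy model of a LacI-repressed promoter with IPTG induction. Free repressor
        # binds the operator with dissociation constant K; induced repressor retains a
        # residual affinity (much larger dissociation constant K_res), i.e. leaky repression.
        def dydt(t, y, iptg, k_tx=1.0, k_deg=0.1, R_tot=50.0, K=1.0, K_res=40.0, K_i=10.0):
            mrna = y[0]
            frac_induced = iptg / (K_i + iptg)        # fraction of repressor bound by inducer
            R_free = R_tot * (1.0 - frac_induced)
            R_ind = R_tot * frac_induced
            occ = (R_free / K + R_ind / K_res) / (1.0 + R_free / K + R_ind / K_res)
            return [k_tx * (1.0 - occ) - k_deg * mrna]

        for iptg in (0.0, 100.0, 1000.0):
            sol = solve_ivp(dydt, (0.0, 200.0), [0.0], args=(iptg,), rtol=1e-8)
            print(f"IPTG = {iptg:7.1f}  steady-state mRNA ~ {sol.y[0, -1]:.2f}")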

  3. DIGESTIF: a universal quality standard for the control of bottom-up proteomics experiments.

    Science.gov (United States)

    Lebert, Dorothée; Louwagie, Mathilde; Goetze, Sandra; Picard, Guillaume; Ossola, Reto; Duquesne, Caroline; Basler, Konrad; Ferro, Myriam; Rinner, Oliver; Aebersold, Ruedi; Garin, Jérôme; Mouz, Nicolas; Brunner, Erich; Brun, Virginie

    2015-02-01

    In bottom-up mass spectrometry-based proteomics analyses, variability at any step of the process, particularly during sample proteolysis, directly affects the sensitivity, accuracy, and precision of peptide detection and quantification. Currently, no generic internal standards are available to control the quality of sample processing steps. This makes it difficult to assess the comparability of MS proteomic data obtained under different experimental conditions. Here, we describe the design, synthesis, and validation of a universal protein standard, called DIGESTIF, that can be added to any biological sample. The DIGESTIF standard consists of a soluble recombinant protein scaffold to which a set of 11 artificial peptides (iRT peptides) with good ionization properties has been incorporated. In the protein scaffold, the amino acids flanking iRT peptide cleavage sites were selected either to favor or hinder protease cleavage. After sample processing, the retention time and relative intensity pattern of the released iRT peptides can be used to assess the quality of sample workup, the extent of digestion, and the performance of the LC-MS system. Thus, DIGESTIF can be used to standardize a broad spectrum of applications, ranging from simple replicate measurements to large-scale biomarker screening in biomedical applications. PMID:25495225

  4. Top-Down Network Analysis to Drive Bottom-Up Modeling of Physiological Processes

    Science.gov (United States)

    Poirel, Christopher L.; Rodrigues, Richard R.; Chen, Katherine C.; Tyson, John J.

    2013-01-01

    Abstract Top-down analyses in systems biology can automatically find correlations among genes and proteins in large-scale datasets. However, it is often difficult to design experiments from these results. In contrast, bottom-up approaches painstakingly craft detailed models that can be simulated computationally to suggest wet lab experiments. However, developing the models is a manual process that can take many years. These approaches have largely been developed independently. We present Linker, an efficient and automated data-driven method that can analyze molecular interactomes to propose extensions to models that can be simulated. Linker combines teleporting random walks and k-shortest path computations to discover connections from a source protein to a set of proteins collectively involved in a particular cellular process. We evaluate the efficacy of Linker by applying it to a well-known dynamic model of the cell division cycle in Saccharomyces cerevisiae. Compared to other state-of-the-art methods, subnetworks computed by Linker are heavily enriched in Gene Ontology (GO) terms relevant to the cell cycle. Finally, we highlight how networks computed by Linker elucidate the role of a protein kinase (Cdc5) in the mitotic exit network of a dynamic model of the cell cycle. PMID:23641868
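
    The two ingredients combined by Linker, teleporting random walks to rank proteins around a source and k-shortest-path enumeration to extract concrete connections to a process, both have direct counterparts in networkx (personalized PageRank and shortest_simple_paths). The sketch below shows that general recipe on a made-up toy interactome; it is not the Linker implementation, and the graph, source and process set are invented.

        import itertools
        import networkx as nx

        # Toy interactome; nodes stand in for proteins.
        G = nx.Graph()
        G.add_edges_from([("SRC", "A"), ("A", "B"), ("B", "P1"), ("A", "C"),
                          ("C", "P2"), ("SRC", "D"), ("D", "P2"), ("P1", "P2")])

        source = "SRC"
        process = {"P1", "P2"}   # proteins collectively involved in the process of interest

        # Teleporting random walk from the source (personalized PageRank).
        walk_scores = nx.pagerank(G, alpha=0.85, personalization={source: 1.0})

        # k shortest simple paths from the source to each process protein.
        k = 2
        path_edges = set()
        for target in process:
            for path in itertools.islice(nx.shortest_simple_paths(G, source, target), k):
                path_edges.update(zip(path, path[1:]))

        # Rank the proteins on those paths by walk score to propose model extensions.
        subnetwork = {n for edge in path_edges for n in edge}
        for node in sorted(subnetwork, key=walk_scores.get, reverse=True):
            print(f"{node}: {walk_scores[node]:.3f}")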

  5. Achieving integrated urban water management: planning top-down or bottom-up?

    Science.gov (United States)

    Gabe, J; Trowsdale, S; Vale, R

    2009-01-01

    Integrated Urban Water Management (IUWM) acknowledges a broad range of environmental and socio-economic outcomes, but the link between design intentions and operational performance is not always clear. This may be due in part to a lack of shared principles that remove bias and inconsistency in assessing the operational performance of IUWM. This paper investigates the possibility of developing shared principles through examination of shared objectives and shared indicators within two logical and integrated frameworks for urban residential developments that aspire to IUWM and sustainable development. The framework method was applied using very different approaches: one a top-down urban planning process, the other a bottom-up community consultation process. Both frameworks highlight the extent to which IUWM is part of a broad social and environmental system. Core environmental performance objectives and indicators were very similar, highlighting the potential to develop shared principles in reporting and benchmarking the environmental performance of neighbourhood developments. Socio-economic indicators were highly variable due to process and likely contextual differences, thus it is unclear if the influence of IUWM on these variables can transcend the social context unless the practice of urban water management can expand its core responsibility beyond "hard" physical infrastructure. PMID:19474495

  6. Evidence for differential top-down and bottom-up suppression in posterior parietal cortex.

    Science.gov (United States)

    Mirpour, Koorosh; Bisley, James W

    2013-10-19

    When searching for an object, we usually avoid items that are visually different from the target and objects or places that have been searched already. Previous studies have shown that neural activity in the lateral intraparietal area (LIP) can be used to guide this behaviour; responses to task irrelevant stimuli or to stimuli that have been fixated previously in the trial are reduced compared with responses to potential targets. Here, we test the hypothesis that these reduced responses have a different genesis. Two animals were trained on a visual foraging task, in which they had to find a target among a number of physically identical potential targets (T) and task irrelevant distractors. We recorded neural activity and local field potentials (LFPs) in LIP while the animals performed the task. We found that LFP power was similar for potential targets and distractors but was greater in the alpha and low beta bands when a previously fixated T was in the response field. We interpret these data to suggest that the reduced single-unit response to distractors is a bottom-up feed-forward result of processing in earlier areas and the reduced response to previously fixated Ts is a result of active top-down suppression. PMID:24018730
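
    The comparison reported above rests on estimating LFP spectral power in the alpha (roughly 8-12 Hz) and low-beta (roughly 13-20 Hz) bands per stimulus condition. A generic way to compute such band power from a raw trace is sketched below with scipy's Welch estimator; the sampling rate, band edges and synthetic signal are assumptions for illustration, not the study's recording parameters.

        import numpy as np
        from scipy.signal import welch

        def band_power(trace, fs, band):
            """Integrate the Welch power spectral density over a frequency band."""
            f, pxx = welch(trace, fs=fs, nperseg=int(fs))    # ~1 s windows
            mask = (f >= band[0]) & (f <= band[1])
            return float(np.sum(pxx[mask]) * (f[1] - f[0]))  # rectangle-rule integration

        # Synthetic 1 kHz LFP-like trace: white noise plus a weak 10 Hz (alpha) component.
        fs = 1000.0
        t = np.arange(0.0, 5.0, 1.0 / fs)
        rng = np.random.default_rng(1)
        trace = rng.standard_normal(t.size) + 0.5 * np.sin(2.0 * np.pi * 10.0 * t)

        print("alpha (8-12 Hz) power :", band_power(trace, fs, (8.0, 12.0)))
        print("low beta (13-20 Hz)   :", band_power(trace, fs, (13.0, 20.0)))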

  7. Top-down and bottom-up regulation of macroalgal community structure on a Kenyan reef

    Science.gov (United States)

    Mörk, Erik; Sjöö, Gustaf Lilliesköld; Kautsky, Nils; McClanahan, Tim R.

    2009-09-01

    Top-down and bottom-up regulation in the form of grazing by herbivores and nutrient availability are important factors governing macroalgal communities in the coral reef ecosystem. Today, anthropogenic activities, such as over-harvesting of herbivorous fish and sea urchins and increased nutrient loading, are altering the interaction of these two structuring forces. The present study was conducted in Kenya and investigates the relative importance of herbivory and nutrient loading on macroalgal community dynamics, by looking at alterations in macroalgal functional groups, species diversity ( H') and biomass within experimental quadrats. The experiment was conducted in situ for 42 days during the dry season. Cages excluding large herbivorous fish and sea urchins were used in the study and nutrient addition was conducted using coated, slow-release fertilizer (nitrogen and phosphorous) at a site where herbivory is generally low and nutrient levels are relatively high for the region. Nutrient addition increased tissue nutrient content in the algae, and fertilized quadrats had 24% higher species diversity. Herbivore exclusion resulted in a 77% increase in algal biomass, mainly attributable to a >1000% increase in corticated forms. These results are in accordance with similar studies in other regions, but are unique in that they indicate that, even when prevailing nutrient levels are relatively high and herbivore pressure is relatively low, continued anthropogenic disturbance results in further ecological responses and increased reef degradation.

  8. Template-Free Bottom-Up Method for Fabricating Diblock Copolymer Patchy Particles.

    Science.gov (United States)

    Ye, Xianggui; Li, Zhan-Wei; Sun, Zhao-Yan; Khomami, Bamin

    2016-05-24

    Patchy particles are one of the most important building blocks for hierarchical structures because of the discrete patches on their surface. We have demonstrated a convenient, simple, and scalable bottom-up method for fabricating diblock copolymer patchy particles through both experiments and dissipative particle dynamics (DPD) simulations. The experimental method simply involves reducing the solvent quality of the diblock copolymer solution by the slow addition of a nonsolvent. Specifically, the fabrication of diblock copolymer patchy particles begins with a crew-cut soft-core micelle, where the micelle core is significantly swollen by the solvent. With water addition at an extremely slow rate, the crew-cut soft-core micelles first form a larger crew-cut micelle. With further water addition, the corona-forming blocks of the crew-cut micelles begin to aggregate and eventually form well-defined patches. Both experiments and DPD simulations indicate that the number of patches has a very strong dependence on the diblock copolymer composition: the particle has more patches on its surface at a lower volume fraction of patch-forming blocks. Furthermore, particles with more patches have a greater ability to assemble, and particles with fewer patches have a greater ability to merge once assembled. PMID:27109249

  9. Bottom-up study of flaw tolerance properties of protein networks

    Science.gov (United States)

    Qin, Zhao; Buehler, Markus

    2012-02-01

    We study the material properties of an intermediate filament protein network by computational modeling using a bottom-up approach. We start with an atomic model of each filament and obtain its mechanical behavior. We then use these parameters in setting up a mesoscale model of the network material at scales of micrometers. Using this multi-scale method, we report a detailed analysis of the associated deformation and failure mechanisms of this hierarchical material. Our modeling reveals that a structural transition that occurs at the proteins' secondary structure level is crucial for the network's flaw tolerance property, which implies that the material retains its mechanical function despite the existence of large defects. We also examine the effect of crosslink strength on the failure properties. We discover that relatively weaker crosslinks lead to a more flaw tolerant network that is 23% stronger. This unexpected behavior arises because the crosslink strength functions as a switch to alter the failure mechanism. Weak crosslinks are able to efficiently diffuse the stress around the crack tip, making the crack more difficult to propagate. We compare our results to those of elastic and softening materials and find that the effect of crosslink strength is much smaller in those systems. These findings imply that the mechanical properties of both the filaments and the interfaces among filaments are critical for bioinspired material design, challenging the conventional paradigm in engineering design.

  10. A Bottom-Up Engineered Broadband Optical Nanoabsorber for Radiometry and Energy Harnessing Applications

    Science.gov (United States)

    Kaul, Anupama B.; Coles, James B.; Megerian, Krikor G.; Eastwood, Michael; Green, Robert O.; Bandaru, Prabhakar R.

    2013-01-01

    Optical absorbers based on vertically aligned multi-walled carbon nanotubes (MWCNTs), synthesized using electric-field assisted growth, are described here that show an ultra-low reflectance, 100X lower compared to Au-black, from wavelength λ ≈ 350 nm to 2.5 μm. A bi-metallic Co/Ti layer was shown to catalyze a high site density of MWCNTs on metallic substrates, and the optical properties of the absorbers were engineered by controlling the bottom-up synthesis conditions using dc plasma-enhanced chemical vapor deposition (PECVD). Reflectance measurements on the MWCNT absorbers after heating them in air to 400 °C showed negligible changes in reflectance, which was still low, approximately 0.022% at λ ≈ 2 μm. In contrast, the percolated structure of the reference Au-black samples collapsed completely after heating, causing the optical response to degrade at temperatures as low as 200 °C. The high optical absorption efficiency of the MWCNT absorbers, synthesized on metallic substrates, over a broad spectral range, coupled with their thermal ruggedness, suggests they have promise in solar energy harnessing applications, as well as in thermal detectors for radiometry.

  11. Bottom-Up Abstract Modelling of Optical Networks-on-Chip: From Physical to Architectural Layer

    Directory of Open Access Journals (Sweden)

    Alberto Parini

    2012-01-01

    Full Text Available This work presents a bottom-up abstraction procedure based on the design-flow FDTD + SystemC suitable for the modelling of optical Networks-on-Chip. In this procedure, a complex network is decomposed into elementary switching elements whose input-output behavior is described by means of scattering parameters models. The parameters of each elementary block are then determined through 2D-FDTD simulation, and the resulting analytical models are exported within functional blocks in SystemC environment. The inherent modularity and scalability of the S-matrix formalism are preserved inside SystemC, thus allowing the incremental composition and successive characterization of complex topologies typically out of reach for full-vectorial electromagnetic simulators. The consistency of the outlined approach is verified, in the first instance, by performing a SystemC analysis of a four-input, four-output ports switch and making a comparison with the results of 2D-FDTD simulations of the same device. Finally, a further complex network encompassing 160 microrings is investigated, the losses over each routing path are calculated, and the minimum amount of power needed to guarantee an assigned BER is determined. This work is a basic step in the direction of an automatic technology-aware network-level simulation framework capable of assembling complex optical switching fabrics, while at the same time assessing the practical feasibility and effectiveness at the physical/technological level.
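
    Once each elementary switching element has been reduced to a loss (or scattering) model, estimating the loss along a routing path and the minimum launch power for a target BER is bookkeeping in decibels: the per-element insertion losses add, and the launch power must exceed the receiver sensitivity at that BER by the accumulated loss plus a margin. The sketch below does exactly this with made-up per-element losses and a made-up receiver sensitivity; it is not the FDTD + SystemC flow of the paper.

        # Per-element insertion losses in dB (hypothetical values for illustration only).
        element_loss_db = {
            "waveguide_cm": 1.5,    # propagation loss per centimetre
            "crossing":     0.15,
            "ring_through": 0.05,
            "ring_drop":    0.5,
        }

        def path_loss_db(route):
            """Sum the insertion loss of every element traversed along a routing path."""
            return sum(element_loss_db[kind] * count for kind, count in route)

        def min_launch_power_dbm(route, receiver_sensitivity_dbm=-20.0, margin_db=1.0):
            """Minimum input power so the receiver still meets the target BER."""
            return receiver_sensitivity_dbm + path_loss_db(route) + margin_db

        # Example path: 0.8 cm of waveguide, 12 crossings, 18 rings passed in the
        # through state and one final drop into the destination port.
        route = [("waveguide_cm", 0.8), ("crossing", 12), ("ring_through", 18), ("ring_drop", 1)]
        print(f"path loss        : {path_loss_db(route):.2f} dB")
        print(f"min launch power : {min_launch_power_dbm(route):.2f} dBm")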

  12. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    Science.gov (United States)

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-01

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also and most importantly scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed in the Python scripting language with a few lines of code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem. PMID:26709623
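
    To make the "few lines of code" claim concrete, the fragment below sketches what a scripted bottom-up workflow of this kind typically looks like: instantiate a controller, run a database search, then statistically post-process the result. The profile string, parameter keys, and engine identifiers are assumptions following Ursgal's documented naming style and may differ between versions; consult the Ursgal documentation for the exact names rather than treating this as the library's verified API.

```python
import ursgal

# Central controller object; profile and params keys are illustrative and
# may differ between Ursgal versions.
uc = ursgal.UController(
    profile="LTQ XL low res",
    params={"database": "target_decoy_database.fasta"},
)

# Peptide identification with one search engine (engine name assumed;
# Ursgal engine identifiers usually carry a version suffix).
search_result = uc.search(
    input_file="sample_run.mzML",
    engine="xtandem_vengeance",
)

# Statistical post-processing of the search output (engine name assumed).
validated_result = uc.validate(
    input_file=search_result,
    engine="percolator_2_08",
)

print("Validated results written to:", validated_result)
```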

  13. Visionmaker NYC: A bottom-up approach to finding shared socioeconomic pathways in New York City

    Science.gov (United States)

    Sanderson, E. W.; Fisher, K.; Giampieri, M.; Barr, J.; Meixler, M.; Allred, S. B.; Bunting-Howarth, K. E.; DuBois, B.; Parris, A. S.

    2015-12-01

    Visionmaker NYC is a free, public participatory, bottom-up web application to develop and share climate mitigation and adaptation strategies for New York City neighborhoods. The goal is to develop shared socioeconomic pathways by allowing a broad swath of community members - from schoolchildren to architects and developers to the general public - to input their concepts for a desired future. Visions are composed of climate scenarios, lifestyle choices, and ecosystem arrangements, where ecosystems are broadly defined to include built ecosystems (e.g., apartment buildings, single family homes, etc.), transportation infrastructure (e.g., highways, connector roads, sidewalks), and natural land cover types (e.g., wetlands, forests, estuary). Metrics of water flows, carbon cycling, biodiversity patterns, and population are estimated for the user's vision, for the same neighborhood today, and for that neighborhood as it existed in the pre-development state, based on the Welikia Project (welikia.org). Users can keep visions private, share them with self-defined groups of other users, or distribute them publicly. Users can also propose "challenges" - specific desired states of metrics for specific parts of the city - and others can post visions in response. Visionmaker contributes by combining scenario planning, scientific modelling, and social media to create new, wide-open possibilities for discussion, collaboration, and imagination regarding future, shared socioeconomic pathways.

  14. A Computational Strategy to Analyze Label-Free Temporal Bottom-up Proteomics Data

    Energy Technology Data Exchange (ETDEWEB)

    Du, Xiuxia; Callister, Stephen J.; Manes, Nathan P.; Adkins, Joshua N.; Alexandridis, Roxana A.; Zeng, Xiaohua; Roh, Jung Hyeob; Smith, William E.; Donohue, Timothy J.; Kaplan, Samuel; Smith, Richard D.; Lipton, Mary S.

    2008-07-01

    Motivation: Biological systems are in a continual state of flux, which necessitates an understanding of the dynamic nature of protein abundances. The study of protein abundance dynamics has become feasible with recent improvements in mass spectrometry-based quantitative proteomics. However, a number of challenges still remain related to how best to extract biological information from dynamic proteomics data; for example, challenges related to extraneous variability, missing abundance values, and the identification of significant temporal patterns. Results: This article describes a strategy that addresses the aforementioned issues for the analysis of temporal bottom-up proteomics data. The core strategy for the data analysis algorithms and subsequent data interpretation was formulated to take advantage of the temporal properties of the data. The analysis procedure presented herein was applied to data from a Rhodobacter sphaeroides 2.4.1 time-course study. The results were in close agreement with existing knowledge about R. sphaeroides, therefore demonstrating the utility of this analytical strategy.
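
    The strategy itself is not spelled out in the record, but the general shape of such an analysis can be illustrated with a short script: log-transform peptide abundances, handle missing values, standardize each temporal profile, and group profiles into temporal patterns. This is a generic sketch under those assumptions, not the authors' algorithm; the synthetic data, column names, and the choice of k-means clustering are illustrative.

```python
import numpy as np
import pandas as pd
from sklearn.cluster import KMeans

# Rows = peptides/proteins, columns = time points (abundances; NaN = missing).
abundances = pd.DataFrame(
    np.random.lognormal(mean=10, sigma=1, size=(200, 6)),
    columns=[f"t{i}" for i in range(6)],
)
abundances.iloc[::7, 2] = np.nan          # sprinkle in some missing values

log_ab = np.log2(abundances)

# Simple missing-value handling: drop rows with too many gaps, then impute
# the remaining gaps with each profile's own median (an assumption, not the
# paper's imputation scheme).
log_ab = log_ab[log_ab.isna().sum(axis=1) <= 1]
log_ab = log_ab.apply(lambda row: row.fillna(row.median()), axis=1)

# Standardize each temporal profile so clustering reflects shape, not level.
profiles = log_ab.sub(log_ab.mean(axis=1), axis=0).div(log_ab.std(axis=1), axis=0)
profiles = profiles.dropna()              # drop degenerate (constant) profiles

# Group profiles into candidate temporal patterns.
patterns = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(profiles)
print(pd.Series(patterns).value_counts().sort_index())
```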

  15. Elicited Salience and Salience-Based Level-k

    OpenAIRE

    Wolff, Irenaeus

    2016-01-01

    A level-k model based on a specific salience-pattern is the only model in the literature that accounts for behaviour in hide-and-seek games. This paper presents nine different experiments designed to measure salience. The elicited salience patterns tend to be similar, but none of them is similar to the pattern needed to allow the level-k model to explain the hide-and-seek data. When based on any of the empirical salience measures, the salience-based level-k model does not fit the data well. p...

  16. The Role of Top-Down Focused Spatial Attention in Preattentive Salience Coding and Salience-based Attentional Capture.

    Science.gov (United States)

    Bertleff, Sabine; Fink, Gereon R; Weidner, Ralph

    2016-08-01

    Selective visual attention requires an efficient coordination between top-down and bottom-up attention control mechanisms. This study investigated the behavioral and neural effects of top-down focused spatial attention on the coding of highly salient distractors and their tendency to capture attention. Combining spatial cueing with an irrelevant distractor paradigm revealed bottom-up based attentional capture only when attention was distributed across the whole search display, including the distractor location. Top-down focusing spatial attention on the target location abolished attentional capture of a salient distractor outside the current attentional focus. Functional data indicated that the missing capture effect was not based on diminished bottom-up salience signals at unattended distractor locations. Irrespective of whether salient distractors occurred at attended or unattended locations, their presence enhanced BOLD signals at their respective spatial representation in early visual areas as well as in inferior frontal, superior parietal, and medial parietal cortex. Importantly, activity in these regions reflected the presence of a salient distractor rather than attentional capture per se. Moreover, successfully inhibiting attentional capture of a salient distractor at an unattended location further increased neural responses in medial parietal regions known to be involved in controlling spatial attentional shifts. Consequently, the data provide evidence that top-down focused spatial attention prevents automatic attentional capture by supporting attentional control processes counteracting a spatial bias toward a salient distractor. PMID:27054402

  17. Bottom-up synthesis of ordered metal/oxide/metal nanodots on substrates for nanoscale resistive switching memory

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-05-01

    The bottom-up approach using self-assembled materials/processes is thought to be a promising solution for next-generation device fabrication, but it is often found to be not feasible for use in real device fabrication. Here, we report a feasible and versatile way to fabricate high-density, nanoscale memory devices by direct bottom-up filling of memory elements. An ordered array of metal/oxide/metal (copper/copper oxide/copper) nanodots was synthesized with a uniform size and thickness defined by a self-organized nanotemplate mask, using sequential electrochemical deposition (ECD) of each layer. The fabricated memory devices showed bipolar resistive switching behaviors, confirmed by conductive atomic force microscopy. This study demonstrates that ECD with bottom-up growth has great potential to fabricate high-density nanoelectronic devices beyond the scaling limit of top-down device fabrication processes.

  18. Bottom-up synthesis of ordered metal/oxide/metal nanodots on substrates for nanoscale resistive switching memory

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-01-01

    The bottom-up approach using self-assembled materials/processes is thought to be a promising solution for next-generation device fabrication, but it is often found to be not feasible for use in real device fabrication. Here, we report a feasible and versatile way to fabricate high-density, nanoscale memory devices by direct bottom-up filling of memory elements. An ordered array of metal/oxide/metal (copper/copper oxide/copper) nanodots was synthesized with a uniform size and thickness defined by a self-organized nanotemplate mask, using sequential electrochemical deposition (ECD) of each layer. The fabricated memory devices showed bipolar resistive switching behaviors, confirmed by conductive atomic force microscopy. This study demonstrates that ECD with bottom-up growth has great potential to fabricate high-density nanoelectronic devices beyond the scaling limit of top-down device fabrication processes. PMID:27157385

  19. The Comparative Effect of Top-down Processing and Bottom-up Processing through TBLT on Extrovert and Introvert EFL

    Directory of Open Access Journals (Sweden)

    Pezhman Nourzad Haradasht

    2013-09-01

    Full Text Available This research seeks to examine the effect of two models of reading comprehension, namely top-down and bottom-up processing, on the reading comprehension of extrovert and introvert EFL learners. To do this, 120 learners out of a total number of 170 intermediate learners being educated at Iran Mehr English Language School were selected, all taking a PET (Preliminary English Test) first for homogenization prior to the study. They also answered the Eysenck Personality Inventory (EPI), which in turn categorized them into two subgroups within each reading model, consisting of introverts and extroverts. All in all, there were four subgroups: 30 introverts and 30 extroverts undergoing the top-down processing treatment, and 30 introverts and 30 extroverts experiencing the bottom-up processing treatment. The aforementioned PET was administered as the post-test of the study after each group was exposed to the treatment for 18 sessions in six weeks. After the instruction finished, the mean scores of all four groups on this post-test were computed and a two-way ANOVA was run to test all four hypotheses raised in this study. The results showed that while learners generally benefitted more from the bottom-up processing setting compared to the top-down processing one, the extrovert group was better off receiving top-down instruction. Furthermore, introverts outperformed extroverts in the bottom-up group; yet between the two personality subgroups in the top-down setting no difference was seen. A predictable pattern of benefitting from teaching procedures could not be drawn for introverts, as in both top-down and bottom-up settings they benefitted more than extroverts. Keywords: Reading comprehension, top-down processing, bottom-up processing, extrovert, introvert

  20. Quantifying the uncertainties of a bottom-up emission inventory of anthropogenic atmospheric pollutants in China

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2010-11-01

    Full Text Available The uncertainties of a national, bottom-up inventory of Chinese emissions of anthropogenic SO2, NOx, and particulate matter (PM) of different size classes and carbonaceous species are comprehensively quantified, for the first time, using Monte Carlo simulation. The inventory is structured by seven dominant sectors: coal-fired electric power, cement, iron and steel, other industry (boiler combustion), other industry (non-combustion processes), transportation, and residential. For each parameter related to emission factors or activity-level calculations, the uncertainties, represented as probability distributions, are either statistically fitted using results of domestic field tests or, when these are lacking, estimated based on foreign or other domestic data. The uncertainties (i.e., 95% confidence intervals around the central estimates) of Chinese emissions of SO2, NOx, total PM, PM10, PM2.5, black carbon (BC), and organic carbon (OC) in 2005 are estimated to be −14%~12%, −10%~36%, −10%~36%, −12%~42%, −16%~52%, −23%~130%, and −37%~117%, respectively. Variations in activity levels (e.g., energy consumption or industrial production) are not the main source of emission uncertainties. Due to narrow classification of source types, large sample sizes, and relatively high data quality, the coal-fired power sector is estimated to have the smallest emission uncertainties for all species except BC and OC. Due to poorer source classifications and a wider range of estimated emission factors, considerable uncertainties of NOx and PM emissions from cement production and boiler combustion in other industries are found. The probability distributions of emission factors for biomass burning, the largest source of BC and OC, are fitted based on very limited domestic field measurements, and special caution should thus be taken in interpreting these emission uncertainties. Although Monte
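
    A Monte Carlo treatment of a bottom-up inventory multiplies activity levels by emission factors for each source category and propagates the parameter distributions through the sum. The toy sketch below shows how a 95% confidence interval around the central estimate is obtained; the sector names, distribution choices, and parameter values are illustrative assumptions, not those of the Chinese inventory described above.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000                        # Monte Carlo draws

# Illustrative sectors: (central activity level, relative sd of activity,
# central emission factor, geometric sd of emission factor). All values
# are made up for the sketch.
sectors = {
    "power":     (1.0e9, 0.05, 4.0, 1.2),
    "cement":    (5.0e8, 0.10, 2.5, 1.5),
    "transport": (3.0e8, 0.08, 1.8, 1.4),
}

total = np.zeros(N)
for activity, act_rsd, ef, ef_gsd in sectors.values():
    act_draws = rng.normal(activity, act_rsd * activity, N)      # activity: normal
    ef_draws = rng.lognormal(np.log(ef), np.log(ef_gsd), N)      # EF: lognormal
    total += act_draws * ef_draws

central = sum(a * ef for a, _, ef, _ in sectors.values())
lo, hi = np.percentile(total, [2.5, 97.5])
print(f"Uncertainty range around central estimate: "
      f"{lo / central - 1:+.0%} to {hi / central - 1:+.0%}")
```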

  1. OBJCUT: efficient segmentation using top-down and bottom-up cues.

    Science.gov (United States)

    Kumar, M Pawan; Torr, P H S; Zisserman, A

    2010-03-01

    We present a probabilistic method for segmenting instances of a particular object category within an image. Our approach overcomes the deficiencies of previous segmentation techniques based on traditional grid conditional random fields (CRF), namely that 1) they require the user to provide seed pixels for the foreground and the background and 2) they provide a poor prior for specific shapes due to the small neighborhood size of grid CRF. Specifically, we automatically obtain the pose of the object in a given image instead of relying on manual interaction. Furthermore, we employ a probabilistic model which includes shape potentials for the object to incorporate top-down information that is global across the image, in addition to the grid clique potentials which provide the bottom-up information used in previous approaches. The shape potentials are provided by the pose of the object obtained using an object category model. We represent articulated object categories using a novel layered pictorial structures model. Nonarticulated object categories are modeled using a set of exemplars. These object category models have the advantage that they can handle large intraclass shape, appearance, and spatial variation. We develop an efficient method, OBJCUT, to obtain segmentations using our probabilistic framework. Novel aspects of this method include: 1) efficient algorithms for sampling the object category models of our choice and 2) the observation that a sampling-based approximation of the expected log-likelihood of the model can be increased by a single graph cut. Results are presented on several articulated (e.g., animals) and nonarticulated (e.g., fruits) object categories. We provide a favorable comparison of our method with the state of the art in object category specific image segmentation, specifically the methods of Leibe and Schiele and Schoenemann and Cremers. PMID:20075476

  2. Bottom-up model of self-organized criticality on networks.

    Science.gov (United States)

    Noël, Pierre-André; Brummitt, Charles D; D'Souza, Raissa M

    2014-01-01

    The Bak-Tang-Wiesenfeld (BTW) sandpile process is an archetypal, stylized model of complex systems with a critical point as an attractor of their dynamics. This phenomenon, called self-organized criticality, appears to occur ubiquitously in both nature and technology. Initially introduced on the two-dimensional lattice, the BTW process has been studied on network structures with great analytical successes in the estimation of macroscopic quantities, such as the exponents of asymptotically power-law distributions. In this article, we take a microscopic perspective and study the inner workings of the process through both numerical and rigorous analysis. Our simulations reveal fundamental flaws in the assumptions of past phenomenological models, the same models that allowed accurate macroscopic predictions; we mathematically justify why universality may explain these past successes. Next, starting from scratch, we obtain microscopic understanding that enables mechanistic models; such models can, for example, distinguish a cascade's area from its size. In the special case of a 3-regular network, we use self-consistency arguments to obtain a zero-parameter mechanistic (bottom-up) approximation that reproduces nontrivial correlations observed in simulations and that allows the study of the BTW process on networks in regimes otherwise prohibitively costly to investigate. We then generalize some of these results to configuration model networks and explain how one could continue the generalization. The numerous tools and methods presented herein are known to enable studying the effects of controlling the BTW process and other self-organizing systems. More broadly, our use of multitype branching processes to capture information bouncing back and forth in a network could inspire analogous models of systems in which consequences spread in a bidirectional fashion. PMID:24580281
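
    For readers unfamiliar with the BTW process on a network: each node holds grains, topples when its load reaches its degree (sending one grain to each neighbour), and a small dissipation probability plays the role of the open boundary of the original lattice. The sketch below simulates this on a random 3-regular graph and records cascade sizes; it is a minimal illustration of the standard process, not the mechanistic approximation developed in the article, and the graph size and dissipation rate are arbitrary.

```python
import random
import networkx as nx

random.seed(1)
G = nx.random_regular_graph(3, 1000, seed=1)
load = {v: 0 for v in G}
DISSIPATION = 0.01                 # probability a transferred grain is lost

def drop_grain(v):
    """Add one grain at node v and relax the network; return the cascade size."""
    load[v] += 1
    unstable = [v] if load[v] >= G.degree(v) else []
    topplings = 0
    while unstable:
        u = unstable.pop()
        if load[u] < G.degree(u):              # may have been relaxed already
            continue
        load[u] -= G.degree(u)                 # topple: shed one grain per edge
        topplings += 1
        if load[u] >= G.degree(u):             # still over threshold
            unstable.append(u)
        for w in G.neighbors(u):
            if random.random() >= DISSIPATION: # grain survives the transfer
                load[w] += 1
                if load[w] >= G.degree(w):
                    unstable.append(w)
    return topplings

cascade_sizes = [drop_grain(random.choice(list(G))) for _ in range(50_000)]
nonzero = [s for s in cascade_sizes if s > 0]
print("fraction of grain drops that trigger a cascade:",
      len(nonzero) / len(cascade_sizes))
print("largest observed cascade:", max(cascade_sizes))
```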

  3. Bottom-up impact on the cecidomyiid leaf galler and its parasitism in a tropical rainforest.

    Science.gov (United States)

    Malinga, Geoffrey M; Valtonen, Anu; Nyeko, Philip; Vesterinen, Eero J; Roininen, Heikki

    2014-10-01

    The relative importance of host-plant resources, natural enemies or their interactions in controlling the population of galling insects and their parasitism is poorly known for tropical gallers. In this study, we assessed the impacts of plant quality and density of host trees in regulating the densities of a galler species, the cecidomyiid leaf galler (Cecidomyiini sp. 1EJV) and its parasitoids and inquilines on Neoboutonia macrocalyx trees in Uganda. We manipulated the nutritional quality (or vigour) and the resource concentration with four levels each of fertilization and the group size of host tree. We then recorded the effects of these treatments on the growth rate and total leaf area of host plants, the density of gallers and their mortality by parasitoids and inquilines. Higher levels of fertilization and host density resulted in significantly higher total leaf area than did ambient nutrient levels, and lowest tree densities, respectively. Fertilization also caused significant change in the growth rate of leaf area. Both higher fertilization and host density caused higher density of gallers. Total leaf area was positively associated with galler density, but within galled replicates, the galled leaves were larger than the ungalled leaves. Although highest levels of fertilization and density of host trees caused significant change in the densities of parasitoids, the rate of parasitism did not change. However, tree-density manipulations increased the rate of inquilinism, but on a very low level. Our results demonstrate a trophic cascade in the tropical galler and its parasitoids as a response to bottom-up effects. PMID:25124946

  4. Top-down and bottom-up processes in grassland and forested streams.

    Science.gov (United States)

    Nyström, Per; McIntosh, Angus R; Winterbourn, Michael J

    2003-08-01

    The influence of predatory fish on the structure of stream food webs may be altered by the presence of forest canopy cover, and consequent differences in allochthonous inputs and primary production. Eight sites containing introduced brown trout ( Salmo trutta) and eight sites that did not were sampled in the Cass region, South Island, New Zealand. For each predator category, half the sites were located in southern beech (Nothofagus) forest patches (range of canopy cover, 65-90%) and the other half were in tussock grassland. Food resources used by two dominant herbivores-detritivores were assessed using stable isotopes. (13)C/(12)C ratios were obtained for coarse particulate organic matter (CPOM), fine particulate organic matter (FPOM), algal dominated biofilm from rocks, and larvae of Deleatidium (Ephemeroptera) and Olinga (Trichoptera). Total abundance and biomass of macroinvertebrates did not differ between streams with and without trout, but were significantly higher at grassland sites than forested sites. However, taxon richness and species composition differed substantially between trout and no-trout sites, irrespective of whether streams were located in forest or not. Trout streams typically contained more taxa, had low biomass of predatory invertebrates and large shredders, but a high proportion of consumers with cases or shells. The standing stock of CPOM was higher at forested sites, but there was less FPOM and more algae at sites with trout, regardless of the presence or absence of forest cover. The stable carbon isotope range for biofilm on rocks was broad and encompassed the narrow CPOM and FPOM ranges. At trout sites, carbon isotope ratios of Deleatidium, the most abundant invertebrate primary consumer, were closely related to biofilm values, but no relationship was found at no-trout sites where algal biomass was much lower. These results support a role for both bottom-up and top-down processes in controlling the structure of the stream communities

  5. Top-down or bottom-up: Contrasting perspectives on psychiatric diagnoses

    Directory of Open Access Journals (Sweden)

    Willem MA Verhoeven

    2008-09-01

    Full Text Available Willem MA Verhoeven (1,2), Siegfried Tuinier (1), Ineke van der Burgt (3). 1 Vincent van Gogh Institute for Psychiatry, Venray, The Netherlands; 2 Department of Psychiatry, Erasmus University Medical Centre, Rotterdam, The Netherlands; 3 Department of Human Genetics, Radboud University Medical Centre, Nijmegen, The Netherlands. Abstract: Clinical psychiatry is confronted with the expanding knowledge of medical genetics. Most of the research into the genetic underpinnings of major mental disorders as described in the categorical taxonomies, however, did reveal linkage with a variety of chromosomes. This heterogeneity of results is most probably due to the assumption that the nosological categories as used in these studies are disease entities with clear boundaries. If the reverse way of looking, the so-called bottom-up approach, is applied, it becomes clear that genetic abnormalities are in most cases not associated with a single psychiatric disorder but with a certain probability to develop a variety of aspecific psychiatric symptoms. The adequacy of the categorical taxonomy, the so-called top-down approach, seems to be inversely related to the amount of empirical etiological data. This is illustrated by four rather prevalent genetic syndromes, fragile X syndrome, Prader-Willi syndrome, 22q11 deletion syndrome, and Noonan syndrome, as well as by some cases with rare chromosomal abnormalities. From these examples, it becomes clear that psychotic symptoms as well as mood, anxiety, and autistic features can be found in a great variety of different genetic syndromes. A psychiatric phenotype exists, but comprises, apart from the chance to present several psychiatric symptoms, all elements from developmental, neurocognitive, and physical characteristics. Keywords: genetic disorders, psychiatric symptoms, phenotype, mental disorders

  6. Grain size engineering of bcc refractory metals: Top-down and bottom-up-Application to tungsten

    International Nuclear Information System (INIS)

    We have used two general methodologies for the production of ultrafine grained (UFG) and nanocrystalline (NC) tungsten (W) metal samples: top-down and bottom-up. In the first, equal channel angular extrusion (ECAE), coupled with warm rolling, has been used to fabricate UFG W, and high pressure torsion (HPT) was used to fabricate NC W. We demonstrate an abrupt shift in the deformation mechanism, particularly under dynamic compressive loading, in UFG and NC W. This novel deformation mechanism, a dramatic transition from a uniform deformation mode to that of localized shearing, is shared by other UFG and NC body-centered cubic (BCC) metals. We have also conducted a series of bottom-up experiments to consolidate powdered UFG W precursors into solid bodies. The bottom-up approach relies on rapid, high-temperature consolidation, specifically designed for UFG and NC W powders. The mechanical property results from the top-down UFG and NC W were used as minimum property benchmarks to guide and design the experimental protocols and parameters for use in the bottom-up procedures. Preliminary results, showing rapid grain growth during the consolidation cycle, did not achieve full density in the W samples. Further development of high-purity W nanopowders and appropriate grain-growth inhibitors (e.g., Zener pinning) will be required to successfully produce bulk-sized UFG and NC W samples.

  7. Leadership for Quality University Teaching: How Bottom-Up Academic Insights Can Inform Top-Down Leadership

    Science.gov (United States)

    Scott, Donald E.; Scott, Shelleyann

    2016-01-01

    This paper presents the leadership implications from a study that explored how to increase the quality of teaching in a university thereby presenting data from the bottom up--the academic perspective--to inform leadership, policies, and academic development which generally flows from the top down. We report academics' perceptions of and…

  8. Addressing a "Black Box" of Bottom-Up Synthesis: Revealing the Structures of Growing Colloidal-Nanocrystal Nuclei.

    Science.gov (United States)

    Sobol, Oded; Gadot, Eyal; Wang, Yifeng; Weinstock, Ira A; Meshi, Louisa

    2015-11-16

    In bottom-up synthesis, products from reactions of structural building units rapidly pass from soluble molecular complexes to nanoscale intermediates, whose solution-state structures defy elucidation by any routine method. To address this, electron diffraction is used to reveal the structures of cryogenically "trapped" colloidal nanocrystals. PMID:26536393

  9. Assessing the Gap Between Top-down and Bottom-up Measured Methane Emissions in Indianapolis, IN.

    Science.gov (United States)

    Prasad, K.; Lamb, B. K.; Cambaliza, M. O. L.; Shepson, P. B.; Stirm, B. H.; Salmon, O. E.; Lavoie, T. N.; Lauvaux, T.; Ferrara, T.; Howard, T.; Edburg, S. L.; Whetstone, J. R.

    2014-12-01

    Releases of methane (CH4) from the natural gas supply chain in the United States account for approximately 30% of the total US CH4 emissions. However, large questions remain regarding the accuracy of current emission inventories for methane emissions from natural gas usage. In this paper, we describe results from top-down and bottom-up measurements of methane emissions from the large isolated city of Indianapolis. The top-down results are based on aircraft mass balance and tower based inverse modeling methods, while the bottom-up results are based on direct component sampling at metering and regulating stations, surface enclosure measurements of surveyed pipeline leaks, and tracer/modeling methods for other urban sources. Mobile mapping of methane urban concentrations was also used to identify significant sources and to show an urban-wide low level enhancement of methane levels. The residual difference between top-down and bottom-up measured emissions is large and cannot be fully explained in terms of the uncertainties in top-down and bottom-up emission measurements and estimates. Thus, the residual appears to be, at least partly, attributable to a significant wide-spread diffusive source. Analyses are included to estimate the size and nature of this diffusive source.

  10. Reconciling Top-Down and Bottom-Up Estimates of Oil and Gas Methane Emissions in the Barnett Shale

    Science.gov (United States)

    Hamburg, S.

    2015-12-01

    Top-down approaches that use aircraft, tower, or satellite-based measurements of well-mixed air to quantify regional methane emissions have typically estimated higher emissions from the natural gas supply chain when compared to bottom-up inventories. A coordinated research campaign in October 2013 used simultaneous top-down and bottom-up approaches to quantify total and fossil methane emissions in the Barnett Shale region of Texas. Research teams have published individual results including aircraft mass-balance estimates of regional emissions and a bottom-up, 25-county region spatially-resolved inventory. This work synthesizes data from the campaign to directly compare top-down and bottom-up estimates. A new analytical approach uses statistical estimators to integrate facility emission rate distributions from unbiased and targeted high emission site datasets, which more rigorously incorporates the fat-tail of skewed distributions to estimate regional emissions of well pads, compressor stations, and processing plants. The updated spatially-resolved inventory was used to estimate total and fossil methane emissions from spatial domains that match seven individual aircraft mass balance flights. Source apportionment of top-down emissions between fossil and biogenic methane was corroborated with two independent analyses of methane and ethane ratios. Reconciling top-down and bottom-up estimates of fossil methane emissions leads to more accurate assessment of natural gas supply chain emission rates and the relative contribution of high emission sites. These results increase our confidence in our understanding of the climate impacts of natural gas relative to more carbon-intensive fossil fuels and the potential effectiveness of mitigation strategies.

  11. A bottom-up approach to urban metabolism: the perspective of BRIDGE

    Science.gov (United States)

    Chrysoulakis, N.; Borrego, C.; San Josè, R.; Grimmond, S. B.; Jones, M. B.; Magliulo, V.; Klostermann, J.; Santamouris, M.

    2011-12-01

    Urban metabolism considers a city as a system and usually distinguishes between energy and material flows as its components. "Metabolic" studies are usually top-down approaches that assess the inputs and outputs of food, water, energy, and pollutants from a city, or that compare the changing metabolic process of several cities. In contrast, bottom-up approaches are based on quantitative estimates of urban metabolism components at local to regional scales. Such approaches consider the urban metabolism as the 3D exchange and transformation of energy and matter between a city and its environment. The city is considered as a system and the physical flows between this system and its environment are quantitatively estimated. The transformation of landscapes from primarily agricultural and forest uses to urbanized landscapes can greatly modify energy and material exchanges and it is, therefore, an important aspect of an urban area. Here we focus on the exchanges and transformation of energy, water, carbon and pollutants. Recent advances in bio-physical sciences have led to new methods and models to estimate local scale energy, water, carbon and pollutant fluxes. However, there is often poor communication of new knowledge and its implications to end-users, such as planners, architects and engineers. The FP7 Project BRIDGE (SustainaBle uRban plannIng Decision support accountinG for urban mEtabolism) aims at bridging this gap and at illustrating the advantages of considering environmental issues in urban planning. BRIDGE does not perform a complete life cycle analysis or calculate whole system urban metabolism, but rather focuses on specific metabolism components (energy, water, carbon and pollutants). Its main goal is the development of a Decision Support System (DSS) with the potential to select planning actions which better fit the goal of changing the metabolism of urban systems towards sustainability. BRIDGE evaluates how planning alternatives can modify the physical

  12. Salience and Asset Prices

    OpenAIRE

    Bordalo, Pedro; Gennaioli, Nicola; Shleifer, Andrei

    2013-01-01

    We present a simple model of asset pricing in which payoff salience drives investors' demand for risky assets. The key implication is that extreme payoffs receive disproportionate weight in the market valuation of assets. The model accounts for several puzzles in finance in an intuitive way, including preference for assets with a chance of very high payoffs, an aggregate equity premium, and countercyclical variation in stock market returns.

  13. Saliency changes appearance.

    Directory of Open Access Journals (Sweden)

    Dirk Kerzel

    Full Text Available Numerous studies have suggested that the deployment of attention is linked to saliency. In contrast, very little is known about how salient objects are perceived. To probe the perception of salient elements, observers compared two horizontally aligned stimuli in an array of eight elements. One of them was salient because of its orientation or direction of motion. We observed that the perceived luminance contrast or color saturation of the salient element increased: the salient stimulus looked even more salient. We explored the possibility that changes in appearance were caused by attention. We chose an event-related potential indexing attentional selection, the N2pc, to answer this question. The absence of an N2pc to the salient object provides preliminary evidence against involuntary attentional capture by the salient element. We suggest that signals from a master saliency map flow back into individual feature maps. These signals boost the perceived feature contrast of salient objects, even on perceptual dimensions different from the one that initially defined saliency.

  14. Fusion of multi-sensory saliency maps for automated perception and control

    Science.gov (United States)

    Huber, David J.; Khosla, Deepak; Dow, Paul A.

    2009-05-01

    In many real-world situations and applications that involve humans or machines (e.g., situation awareness, scene understanding, driver distraction, workload reduction, assembly, robotics, etc.) multiple sensory modalities (e.g., vision, auditory, touch, etc.) are used. The incoming sensory information can overwhelm processing capabilities of both humans and machines. An approach for estimating what is most important in our sensory environment (bottom-up or goal-driven) and using that as a basis for workload reduction or taking an action could be of great benefit in applications involving humans, machines or human-machine interactions. In this paper, we describe a novel approach for determining high saliency stimuli in multi-modal sensory environments, e.g., vision, sound, touch, etc. In such environments, the high saliency stimuli could be a visual object, a sound source, a touch event, etc. The high saliency stimuli are important and should be attended to from perception, cognition or/and action perspective. The system accomplishes this by the fusion of saliency maps from multiple sensory modalities (e.g., visual and auditory) into a single, fused multimodal saliency map that is represented in a common, higher-level coordinate system. This paper describes the computational model and method for generating multi-modal or fused saliency map. The fused saliency map can be used to determine primary and secondary foci of attention as well as for active control of a hardware/device. Such a computational model of fused saliency map would be immensely useful for a machine-based or robot-based application in a multi-sensory environment. We describe the approach, system and present preliminary results on a real-robotic platform.
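
    The core of such a system is the projection of each modality's saliency map into a shared coordinate frame followed by a fusion rule. A minimal NumPy sketch of that idea is given below; the random maps, the common grid size, the per-modality weights, and the max-based fusion rule are all assumptions for illustration, not the model described in the paper.

```python
import numpy as np

def normalize(m):
    """Scale a saliency map to [0, 1] (flat maps become all zeros)."""
    m = m.astype(float)
    span = m.max() - m.min()
    return (m - m.min()) / span if span > 0 else np.zeros_like(m)

def to_common_grid(m, shape):
    """Nearest-neighbour resampling onto the shared coordinate frame."""
    rows = np.linspace(0, m.shape[0] - 1, shape[0]).round().astype(int)
    cols = np.linspace(0, m.shape[1] - 1, shape[1]).round().astype(int)
    return m[np.ix_(rows, cols)]

# Illustrative single-modality maps (e.g., visual and auditory saliency).
visual = np.random.rand(120, 160)
auditory = np.random.rand(30, 40)

common_shape = (60, 80)
weights = {"visual": 0.6, "auditory": 0.4}          # assumed weighting

# Fuse by taking the modality-wise maximum of the weighted, normalized maps.
fused = np.maximum(
    weights["visual"] * to_common_grid(normalize(visual), common_shape),
    weights["auditory"] * to_common_grid(normalize(auditory), common_shape),
)

primary_focus = np.unravel_index(fused.argmax(), fused.shape)
print("Primary focus of attention (row, col):", primary_focus)
```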

  15. Top-down and bottom-up control of large herbivore populations: a review of natural and human-induced influences

    OpenAIRE

    Gandiwa, E.

    2013-01-01

    The question whether animal populations are top-down and/or bottom-up controlled has motivated a thriving body of research over the past five decades. In this review I address two questions: 1) how do top-down and bottom-up controls influence large herbivore populations? 2) How do human activities and control systems influence the top-down and bottom-up processes that affect large herbivore population dynamics? Previous studies suggest that the relative influence of top-down vs. bottom-up con...

  16. Price elasticities, policy measures and actual developments in household energy consumption - A bottom up analysis for the Netherlands

    International Nuclear Information System (INIS)

    In the Netherlands it seems likely that the large number of new policy measures in the past decade has influenced the response of households to changing prices. To investigate this issue, the energy trends in the period 1990-2000 have been simulated with a bottom-up model, applied earlier for scenario studies, and extensive data from surveys. For a number of alternative price cases, the elasticity values found are explained using the bottom-up changes in energy trends. One finding is that the specific set of saving options largely determines the price response. The price effect has also been analysed in combination with the policy measures: standards, subsidies and energy taxes. The simulation results indicate that the elasticity value could be 30-40% higher without these measures. (author)

  17. Robust aircraft segmentation from very high-resolution images based on bottom-up and top-down cue integration

    Science.gov (United States)

    Gao, Feng; Xu, Qizhi; Li, Bo

    2016-01-01

    Existing segmentation methods require manual interventions to optimally extract objects from cluttered background, so that they can hardly work well in automated surveillance systems. In order to automatically extract aircrafts from very high-resolution images, we proposed a segmentation method that combines bottom-up and top-down cues. Three essential principles from local contrast, global contrast, and center bias are involved to compute bottom-up cue. In addition, top-down cue is computed by incorporating aircraft shape priors, and it is achieved by training a classifier from a rich set of visual features. Iterative operations and adaptive fitting are designed to get refined results. Experimental results demonstrated that the proposed method can provide significant improvements on the segmentation accuracy.
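
    The three bottom-up principles named in the abstract can be illustrated with a few lines of NumPy/SciPy: local contrast as the deviation from a blurred neighbourhood, global contrast as the deviation from the image-wide mean, and a Gaussian center-bias prior, combined with weights. The weights, scales, and the simple grayscale setting are assumptions made for the sketch; the paper's actual cue computation and its classifier-based top-down cue are not reproduced here.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bottom_up_cue(image, local_sigma=8.0, center_sigma_frac=0.3,
                  weights=(0.4, 0.4, 0.2)):
    """Combine local contrast, global contrast, and center bias into one map."""
    img = image.astype(float)

    # Local contrast: difference from a blurred version of the neighbourhood.
    local = np.abs(img - gaussian_filter(img, local_sigma))

    # Global contrast: difference from the image-wide mean intensity.
    global_ = np.abs(img - img.mean())

    # Center bias: isotropic Gaussian centred on the image.
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    sigma_y, sigma_x = center_sigma_frac * h, center_sigma_frac * w
    center = np.exp(-(((yy - h / 2) ** 2) / (2 * sigma_y ** 2)
                      + ((xx - w / 2) ** 2) / (2 * sigma_x ** 2)))

    def norm(m):
        span = m.max() - m.min()
        return (m - m.min()) / span if span > 0 else np.zeros_like(m)

    w1, w2, w3 = weights
    return norm(w1 * norm(local) + w2 * norm(global_) + w3 * center)

# Toy grayscale tile with a bright aircraft-like blob.
tile = np.zeros((256, 256))
tile[110:130, 90:170] = 1.0
cue = bottom_up_cue(tile)
print("Peak of the bottom-up cue:", np.unravel_index(cue.argmax(), cue.shape))
```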

  18. Tisza, Transmission and Innovation: An Innovative Bottom-up Model for Transmission and Promotion of Tisza Cultural Heritage

    OpenAIRE

    Barberis Rami, Matías Ezequiel; Berić, Dejan; Mátai, Anikó; Opriş, Lavinia-Ioana; Ricci, Giulia; Rustja, Dritan

    2015-01-01

    The project aims to promote and preserve both tangible and intangible cultural heritage in a particular region of the Danube river basin, the Tisza Region (TR). The TR cultural heritage is less well known in the rest of Europe and is at risk of being lost or forgotten if not preserved and supported. This project presents an innovative and strategic bottom-up model which allows local people to manage how their heritage is disseminated through transmission and promotion of their ...

  19. The Comparative Effect of Top-down Processing and Bottom-up Processing through TBLT on Extrovert and Introvert EFL

    OpenAIRE

    Pezhman Nourzad Haradasht; Abdollah Baradaran

    2013-01-01

    This research seeks to examine the effect of two models of reading comprehension, namely top-down and bottom-up processing, on the reading comprehension of extrovert and introvert EFL learners. To do this, 120 learners out of a total number of 170 intermediate learners being educated at Iran Mehr English Language School were selected, all taking a PET (Preliminary English Test) first for homogenization prior to the study. They also answered the Eysenck Personality Invent...

  20. Radiographic Evaluation of Children with Febrile Urinary Tract Infection: Bottom-Up, Top-Down, or None of the Above?

    OpenAIRE

    Prasad, Michaella M.; Cheng, Earl Y.

    2011-01-01

    The proper algorithm for the radiographic evaluation of children with febrile urinary tract infection (FUTI) is hotly debated. Three studies are commonly administered: renal-bladder ultrasound (RUS), voiding cystourethrogram (VCUG), and dimercapto-succinic acid (DMSA) scan. However, the order in which these tests are obtained depends on the methodology followed: bottom-up or top-down. Each strategy carries advantages and disadvantages, and some groups now advocate even less of a workup (no...

  1. Effects of pollutants on bottom-up and top-down processes in insect-plant interactions

    International Nuclear Information System (INIS)

    Bottom-up (host plant quality) and top-down (natural enemies) forces both influence the fitness and population dynamics of herbivores. However, the impact of pollutants acting on these forces has not been examined, which prompted us to review the literature to test hypotheses regarding this area of research. A comprehensive literature search found 126 references which examined fitness components and population dynamics of 203 insect herbivores. One hundred and fifty-three of the 203 herbivores (75.4%) had fitness impacted due to bottom-up factors in polluted environments. In contrast, only 20 of the 203 (9.9%) had fitness significantly impacted due to top-down factors in polluted environments. The paucity of results for top-down factors impacting fitness does not necessarily mean that top-down factors are less important, but rather that fewer studies include natural enemies. We provide a synthesis of available data by pollution type and herbivore guild, and suggest future research to address this issue. - Pollutants can affect insect herbivores through bottom-up and, possibly, top-down processes

  2. An integrated top-down and bottom-up proteomic approach to characterize the antigen binding fragment of antibodies

    Energy Technology Data Exchange (ETDEWEB)

    Dekker, Leendert J.; Wu, Si; vanDuijn, Martijn M.; Tolic, Nikola; Stingl, Christoph; Zhao, Rui; Luider, Theo N.; Pasa-Tolic, Ljiljana

    2014-05-31

    We have previously shown that different individuals exposed to the same antigen produce antibodies with identical mutations in their complementarity determining regions (CDR), suggesting that CDR tryptic peptides can serve as biomarkers for disease diagnosis and prognosis. Complete Fabs derived from disease specific antibodies have even higher potential; they could potentially be used for disease treatment and are required to identify the antigens towards which the antibodies are directed. However, complete Fab sequence characterization via LC-MS analysis of tryptic peptides (i.e. bottom-up) has proven to be impractical for mixtures of antibodies. To tackle this challenge, we have developed an integrated bottom-up and top-down MS approach, employing 2D chromatography coupled with Fourier transform mass spectrometry (FTMS), and applied this approach for full characterization of the variable parts of two pharmaceutical monoclonal antibodies with sensitivity comparable to the bottom-up standard. These efforts represent an essential step towards the identification of disease specific antibodies in patient samples with potentially significant clinical impact.

  3. A comprehensive estimate of recent carbon sinks in China using both top-down and bottom-up approaches

    Science.gov (United States)

    Jiang, Fei; Chen, Jing; Zhou, Linxi; Ju, Weimin; Zhang, Huifang; Machida, Toshinobu; Ciais, Philippe; Peters, Wouter; Wang, Hengmao; Chen, Baozhang; Liu, Linxin; Zhang, Chunhua; Matsueda, Hidekazu; Sawa, Yousuke

    2016-04-01

    Atmospheric inversions use measurements of atmospheric CO2 gradients to constrain regional surface fluxes. Current inversions indicate a net terrestrial CO2 sink in China between 0.16 and 0.35 PgC/yr. The uncertainty of these estimates is as large as the mean because the atmospheric network historically contained only one high altitude station in China. Here, we revisit the calculation of the terrestrial CO2 flux in China, excluding emissions from fossil fuel burning and cement production, by using two inversions with three new CO2 monitoring stations in China as well as aircraft observations over Asia. We estimate a net terrestrial CO2 uptake of 0.39-0.51 PgC/yr with a mean of 0.45 PgC/yr in 2006-2009. After considering the lateral transport of carbon in air and water and international trade, the annual mean carbon sink is adjusted to 0.35 PgC/yr. To evaluate this top-down estimate, we constructed an independent bottom-up estimate based on ecosystem data, giving a net land sink of 0.33 PgC/yr. This demonstrates closure between the top-down and bottom-up estimates. Both top-down and bottom-up estimates give a higher carbon sink than previous estimates made for the 1980s and 1990s, suggesting a trend towards increased uptake by land ecosystems in China.

  4. A comprehensive estimate of recent carbon sinks in China using both top-down and bottom-up approaches

    Science.gov (United States)

    Jiang, Fei; Chen, Jing M.; Zhou, Lingxi; Ju, Weimin; Zhang, Huifang; Machida, Toshinobu; Ciais, Philippe; Peters, Wouter; Wang, Hengmao; Chen, Baozhang; Liu, Lixin; Zhang, Chunhua; Matsueda, Hidekazu; Sawa, Yousuke

    2016-02-01

    Atmospheric inversions use measurements of atmospheric CO2 gradients to constrain regional surface fluxes. Current inversions indicate a net terrestrial CO2 sink in China between 0.16 and 0.35 PgC/yr. The uncertainty of these estimates is as large as the mean because the atmospheric network historically contained only one high altitude station in China. Here, we revisit the calculation of the terrestrial CO2 flux in China, excluding emissions from fossil fuel burning and cement production, by using two inversions with three new CO2 monitoring stations in China as well as aircraft observations over Asia. We estimate a net terrestrial CO2 uptake of 0.39-0.51 PgC/yr with a mean of 0.45 PgC/yr in 2006-2009. After considering the lateral transport of carbon in air and water and international trade, the annual mean carbon sink is adjusted to 0.35 PgC/yr. To evaluate this top-down estimate, we constructed an independent bottom-up estimate based on ecosystem data, giving a net land sink of 0.33 PgC/yr. This demonstrates closure between the top-down and bottom-up estimates. Both top-down and bottom-up estimates give a higher carbon sink than previous estimates made for the 1980s and 1990s, suggesting a trend towards increased uptake by land ecosystems in China.

  5. Sponge communities on Caribbean coral reefs are structured by factors that are top-down, not bottom-up.

    Directory of Open Access Journals (Sweden)

    Joseph R Pawlik

    Full Text Available Caribbean coral reefs have been transformed in the past few decades with the demise of reef-building corals, and sponges are now the dominant habitat-forming organisms on most reefs. Competing hypotheses propose that sponge communities are controlled primarily by predatory fishes (top-down) or by the availability of picoplankton to suspension-feeding sponges (bottom-up). We tested these hypotheses on Conch Reef, off Key Largo, Florida, by placing sponges inside and outside predator-excluding cages at sites with less and more planktonic food availability (15 m vs. 30 m depth). There was no evidence of a bottom-up effect on the growth of any of 5 sponge species, and 2 of 5 species grew more when caged at the shallow site with lower food abundance. There was, however, a strong effect of predation by fishes on sponge species that lacked chemical defenses. Sponges with chemical defenses grew slower than undefended species, demonstrating a resource trade-off between growth and the production of secondary metabolites. Surveys of the benthic community on Conch Reef similarly did not support a bottom-up effect, with higher sponge cover at the shallower depth. We conclude that the structure of sponge communities on Caribbean coral reefs is primarily top-down, and predict that removal of sponge predators by overfishing will shift communities toward faster-growing, undefended species that better compete for space with threatened reef-building corals.

  6. A convenient two-step bottom-up approach for developing Au/Fe₃O₄ nanocomposites with useful optical and magnetic properties

    Energy Technology Data Exchange (ETDEWEB)

    Amala Jayanthi, S. [Department of Physics, Government Arts College (Autonomous), Nandanam, Chennai 600035 (India); Manovah David, T. [Department of Chemistry, Madras Christian College (Autonomous), Chennai 600059 (India); Jayashainy, J.; Muthu Gnana Theresa Nathan, D. [Department of Physics, Loyola College (Autonomous), Chennai 600034 (India); Sagayaraj, P., E-mail: psagayaraj@hotmail.com [Department of Physics, Loyola College (Autonomous), Chennai 600034 (India)

    2014-09-01

    Graphical abstract: Au/Fe₃O₄ nanocomposites were successfully synthesized using a two-step bottom-up approach, co-precipitation followed by solvothermal synthesis, without using capping agents or additives. TEM results indicate that nanocomposites with less agglomeration and high monodispersion can be obtained even in the absence of additives or capping ligands. - Highlights: • Au/Fe₃O₄ nanocomposites without using additives, mediator or capping ligands. • Surface morphology study reveals the uniform AuNPs coating in the nanocomposites. • Soft ferromagnetic behavior with larger Mₛ values is observed at room temperature. - Abstract: A convenient two-step bottom-up approach is reported for the preparation of Au/Fe₃O₄ nanocomposites. The synthesis of Fe₃O₄ was achieved by the co-precipitation method, and a rapid synthesis procedure was adopted for forming Au nanoparticles. The solutions containing the Fe₃O₄ and Au nanoparticles were mixed in two different ratios and then solvothermally treated to obtain the Au/Fe₃O₄ nanocomposites. The structural and optical properties of the nanocomposites were investigated by powder X-ray diffraction and optical absorption spectroscopic techniques. The field emission scanning electron microscopy pictures illustrate the surface morphology of the as-prepared nanocomposites. The energy dispersive X-ray analysis spectrum was taken to estimate the exact percentage of elemental composition of the nanopowder. The transmission electron microscopy analysis of the nanocomposites confirmed the presence and morphology of the Au and Fe₃O₄ nanoparticles. The Au/Fe₃O₄ nanocomposites were found to exhibit soft ferromagnetic behavior.

  7. Visually guided pointing movements are driven by the salience map.

    Science.gov (United States)

    Zehetleitner, Michael; Hegenloh, Michael; Müller, Hermann J

    2011-01-01

    Visual salience maps are assumed to mediate target selection decisions in a motor-unspecific manner; accordingly, modulations of salience influence yes/no target detection or left/right localization responses in manual key-press search tasks, as well as ocular or skeletal movements to the target. Although widely accepted, this core assumption is based on little psychophysical evidence. At least four modulations of salience are known to influence the speed of visual search for feature singletons: (i) feature contrast, (ii) cross-trial dimension sequence, (iii) semantic pre-cueing of the target dimension, and (iv) dimensional target redundancy. If salience also guides manual pointing movements, their initiation latencies (and durations) should be affected by the same four manipulations of salience. Four experiments, each examining one of these manipulations, revealed this to be the case. Thus, these effects are seen independently of the motor response required to signal the perceptual decision (e.g., directed manual pointing as well as simple yes/no detection responses). This supports the notion of a motor-unspecific salience map, which guides covert attention as well as overt eye and hand movements. PMID:21282341

  8. Exploring the rate-limiting steps in visual phototransduction recovery by bottom-up kinetic modeling

    OpenAIRE

    Brandon M. Invergo; Montanucci, Ludovica; Koch, Karl-Wilhelm; Bertranpetit, Jaume; Dell’Orco, Daniele

    2013-01-01

    Background Phototransduction in vertebrate photoreceptor cells represents a paradigm of signaling pathways mediated by G-protein-coupled receptors (GPCRs), which share common modules linking the initiation of the cascade to the final response of the cell. In this work, we focused on the recovery phase of the visual photoresponse, which is comprised of several interacting mechanisms. Results We employed current biochemical knowledge to investigate the response mechanisms of a comprehensive mod...

  9. Preference for Well-Balanced Saliency in Details Cropped from Photographs

    Directory of Open Access Journals (Sweden)

    Jonas eAbeln

    2016-01-01

    Full Text Available Photographic cropping is the act of selecting part of a photograph to enhance its aesthetic appearance or visual impact. It is common practice with both professional (expert) and amateur (non-expert) photographers. In a psychometric study, McManus et al. (2011b) showed that participants cropped photographs confidently and reliably. Experts tended to select details from a wider range of positions than non-experts, but other croppers did not generally prefer details that were selected by experts. It remained unclear, however, on what grounds participants selected particular details from a photograph while avoiding other details. One of the factors contributing to cropping decisions may be visual saliency. Indeed, various saliency-based computer algorithms are available for the automatic cropping of photographs. However, careful experimental studies on the relation between saliency and cropping are lacking to date. In the present study, we re-analyzed the data from the studies by McManus et al. (2011a,b), focusing on statistical image properties. We calculated saliency-based measures for details selected and details avoided during cropping. As expected, we found that selected details contain regions of higher saliency than avoided details on average. Moreover, the saliency center-of-mass was closer to the geometrical center in selected details than in avoided details. Results were confirmed in an eye-tracking study with the same dataset of images. Interestingly, the observed regularities in cropping behavior were less pronounced for experts than for non-experts. In summary, our results suggest that, during cropping, participants tend to select salient regions and place them in an image composition that is well-balanced with respect to the distribution of saliency. Our study contributes to the knowledge of perceptual bottom-up features that are germane to aesthetic decisions in photography and their variability in non-experts and experts.
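
    The two saliency-based measures used in the re-analysis have simple definitions that are easy to make explicit: the mean saliency inside a crop window, and the distance between the window's saliency center-of-mass and its geometric center. The sketch below computes both for a given crop; the saliency map and the crop coordinates are placeholders, and the saliency model of the original study is not reimplemented here.

```python
import numpy as np

def crop_saliency_measures(saliency_map, top, left, height, width):
    """Mean saliency and normalized offset of the saliency center-of-mass
    from the geometric center of a crop window."""
    crop = saliency_map[top:top + height, left:left + width].astype(float)
    mean_saliency = crop.mean()

    total = crop.sum()
    yy, xx = np.mgrid[0:height, 0:width]
    if total > 0:
        com = np.array([(yy * crop).sum(), (xx * crop).sum()]) / total
    else:
        com = np.array([(height - 1) / 2, (width - 1) / 2])

    center = np.array([(height - 1) / 2, (width - 1) / 2])
    # Offset as a fraction of the crop diagonal, so crops of different
    # sizes are comparable.
    offset = np.linalg.norm(com - center) / np.hypot(height, width)
    return mean_saliency, offset

# Placeholder saliency map and two hypothetical crop windows.
smap = np.random.rand(480, 640)
print("selected crop:", crop_saliency_measures(smap, 100, 200, 240, 320))
print("avoided crop: ", crop_saliency_measures(smap, 0, 0, 240, 320))
```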

  10. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    International Nuclear Information System (INIS)

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed “pressure-matching” variational principle to determine a volume-dependent contribution to the potential, UV(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing UV, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that UV accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the “simplicity” of the model
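
    The "pressure-matching" idea can be made concrete with a toy calculation: measure the average pressure deficit of the CG model relative to the AA reference at several volumes, fit its negative as dU_V/dV, and integrate to obtain the volume-dependent potential. The sketch below is a conceptual illustration under those assumptions, with made-up numbers and a simple polynomial basis; it is not the variational principle or the self-consistent procedure of the paper.

```python
import numpy as np

# Assumed reference data: volumes sampled and the average pressure deficit
# (P_AA - P_CG) measured at each volume in preliminary CG runs. Values and
# units are purely illustrative.
volumes = np.linspace(0.9, 1.1, 9) * 1.0e5                    # volume samples
pressure_deficit = 200.0 * (volumes / 1.0e5 - 1.0) + 50.0     # P_AA - P_CG

# A volume-dependent potential U_V(V) contributes -dU_V/dV to the CG pressure,
# so matching the AA pressure requires dU_V/dV = -(P_AA - P_CG). Fit that
# derivative with a low-order polynomial and integrate it analytically.
coeffs = np.polyfit(volumes, -pressure_deficit, deg=1)
dUv_dV = np.poly1d(coeffs)
Uv = dUv_dV.integ()            # U_V(V) up to an irrelevant additive constant

print("Fitted dU_V/dV at the mean volume:", dUv_dV(volumes.mean()))
print("U_V correction across the sampled volume range:",
      Uv(volumes.max()) - Uv(volumes.min()))
```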

  11. Benchmarking energy scenarios for China: perspectives from top-down, economic and bottom-up, technical modelling

    DEFF Research Database (Denmark)

    This study uses a soft-linking methodology to harmonise two complex global top-down and bottom-up models with a regional China focus. The baseline follows the GDP and demographic trends of the Shared Socio-economic Pathways (SSP2) scenario, down-scaled for China, while the carbon tax scenario fol......-specific modelling results further. These new sub-regional China features can now be used for a more detailed analysis of China's regional developments in a global context....

  12. Top-down and Bottom-up. Testing a mixed approach to the generation of priorities for sustainable urban mobility

    OpenAIRE

    E. Pieralice; F. Mameli; G. Marletto

    2015-01-01

    This paper contributes to the debate on how to make operational the concept of sustainable urban mobility and advocates the use of a mixed – top-down and bottom-up – approach to the generation of priorities for sustainable urban mobility. In particular, we tested whether a common list of priorities remain valid after a participated scrutiny performed in seven urban areas of southern Italy. The test was based on a 3-steps procedure. In step 1, we used a common conceptual framework (based on Ma...

  13. Bottom-up modelling of continuous renovation and energy balance of existing building stock: case study Kočevje

    OpenAIRE

    Šijanec Zavrl, Marjana; Stegnar, Gašper; Rakušček, Andraž; Gjerkeš, Henrik

    2016-01-01

    A dynamic bottom-up model of the building stock is developed and implemented in a case study of the Kočevje urban region. In the model, the national register of real estate is cross-linked to data from other registers, e.g. the energy performance certificates (EPC) and the subsidized energy renovation measures. Regular updates of the data in registers enable continual improvement of the model. The renovation potential is determined with respect to the age of building components after the last renovati...

  14. Bottom-Up Nano-heteroepitaxy of Wafer-Scale Semipolar GaN on (001) Si

    KAUST Repository

    Hus, Jui Wei

    2015-07-15

    Semipolar {101¯1} InGaN quantum wells are grown on (001) Si substrates with an Al-free buffer and wafer-scale uniformity. The novel structure is achieved by a bottom-up nano-heteroepitaxy employing self-organized ZnO nanorods as the strain-relieving layer. This ZnO nanostructure unlocks the problems encountered by the conventional AlN-based buffer, which grows slowly and contaminates the growth chamber. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Land claim settlements and their impacts : regional dynamics and bottom-up economic development in Nunavik and Nunatsiavut (Canada)

    OpenAIRE

    Fugmann, Gerlis

    2011-01-01

    Since the 1970s, land claim and self-government agreements have provided the Inuit in the Arctic with a variety of tools, granting them considerable rights and benefits within their traditional settlement areas. One of the central purposes of these regional agreements was to create an economic base in order to enhance development from within the various regions and thus promote economic self-reliance for the Inuit beneficiaries. This bottom-up approach is a clear shift from the top-down devel...

  16. A statistical mixture method to reveal bottom-up and top-down factors guiding the eye-movements

    OpenAIRE

    Couronné, Thomas; Guérin-Dugué, Anne; Dubois, Michel; Faye, Pauline; Marendaz, Christian

    2010-01-01

    When people gaze at real scenes, their visual attention is driven both by a set of bottom-up processes coming from the signal properties of the scene and also from top-down effects such as the task, the affective state, prior knowledge, or the semantic context. The context of this study is an assessment of manufactured objects (here car cab interior). From this dedicated context, this work describes a set of methods to analyze the eye-movements during the visual scene evaluation. But these me...

  17. A bottom-up approach for optimization of friction stir processing parameters; a study on aluminium 2024-T3 alloy

    International Nuclear Information System (INIS)

    Highlights: • An experimental bottom-up approach has been developed for optimizing the process parameters for friction stir processing. • Optimum parameter processed samples were tested and characterized in detail. • Ultimate tensile strength of 1.3 times the base metal strength was obtained. • Residual stresses on the processed surface were only 10% of the yield strength of base metal. • Microstructure observations revealed fine equi-axed grains with precipitate particles at the grain boundaries. - Abstract: Friction stir processing (FSP) is emerging as one of the most competent severe plastic deformation (SPD) method for producing bulk ultra-fine grained materials with improved properties. Optimizing the process parameters for a defect free process is one of the challenging aspects of FSP to mark its commercial use. For the commercial aluminium alloy 2024-T3 plate of 6 mm thickness, a bottom-up approach has been attempted to optimize major independent parameters of the process such as plunge depth, tool rotation speed and traverse speed. Tensile properties of the optimum friction stir processed sample were correlated with the microstructural characterization done using Scanning Electron Microscope (SEM) and Electron Back-Scattered Diffraction (EBSD). Optimum parameters from the bottom-up approach have led to a defect free FSP having a maximum strength of 93% the base material strength. Micro tensile testing of the samples taken from the center of processed zone has shown an increased strength of 1.3 times the base material. Measured maximum longitudinal residual stress on the processed surface was only 30 MPa which was attributed to the solid state nature of FSP. Microstructural observation reveals significant grain refinement with less variation in the grain size across the thickness and a large amount of grain boundary precipitation compared to the base metal. The proposed experimental bottom-up approach can be applied as an effective method for

  18. Benchmarking energy scenarios for China: perspectives from top-down, economic and bottom-up, technical modelling

    OpenAIRE

    Mischke, Peggy

    2014-01-01

    This study uses a soft-linking methodology to harmonise two complex global top-down and bottom-up models with a regional China focus. The baseline follows the GDP and demographic trends of the Shared Socio-economic Pathways (SSP2) scenario, down-scaled for China, while the carbon tax scenario follows the pathway of the Asia Modelling Exercise.We find that soft-linking allows "bridging the gap" and reducing uncertainty between these models. Without soft-linking, baseline result ranges for Chin...

  19. The drastic outcomes from voting alliances in three-party bottom-up democratic voting (1990 → 2013)

    OpenAIRE

    Galam, Serge

    2013-01-01

    The drastic effect of local alliances in three-party competition is investigated in democratic hierarchical bottom-up voting. The results are obtained analytically using a model which extends a sociophysics frame introduced in 1986 \\cite{psy} and 1990 \\cite{lebo} to study two-party systems and the spontaneous formation of democratic dictatorship. It is worth stressing that the 1990 paper was published in the Journal of Statistical Physics, the first paper of its kind in the journal. It was sh...

  20. Bottom-up and top-down herbivore regulation mediated by glucosinolates in Brassica oleracea var. acephala

    OpenAIRE

    Santolamazza Carbone, Serena; Velasco Pazos, Pablo; Soengas Fernández, María del Pilar; Cartea González, María Elena

    2014-01-01

    Quantitative differences in plant defence metabolites, such as glucosinolates, may directly affect herbivore preference and performance, and indirectly affect natural enemy pressure. By assessing insect abundance and leaf damage rate, we studied the responses of insect herbivores to six genotypes of Brassica oleracea var. acephala, selected from the same cultivar for having high or low foliar content of sinigrin, glucoiberin and glucobrassicin. We also investigated whether the natural parasit...

  1. A Bottom-up Route to a Chemically End-to-End Assembly of Nanocellulose Fibers.

    Science.gov (United States)

    Yang, Han; van de Ven, Theo G M

    2016-06-13

    In this work, we take advantage of the rod-like structure of electrosterically stabilized nanocrystalline cellulose (ENCC, with a width of about 7 nm and a length of about 130 nm), which has dicarboxylated cellulose (DCC) chains protruding from both ends, providing electrosterical stability for ENCC particles, to chemically end-to-end assemble these particles into nanocellulose fibers. ENCC with shorter DCC chains can be obtained by a mild hydrolysis of ENCC with HCl, and subsequently the hydrolyzed ENCC (HENCC, with a width of about 6 nm and a length of about 120 nm) is suitable to be assembled into high aspect ratio nanofibers by chemically cross-linking HENCC from one end to another. Two sets of HENCC were prepared by carbodiimide-mediated formation of an alkyne and an azide derivative, respectively. Cross-linking these two sets of HENCC was performed by a click reaction. HENCCs were also end-to-end cross-linked by a bioconjugation reaction, with a diamine. From atomic force microscopy (AFM) images, about ten HENCC nanoparticles were cross-linked and formed high aspect ratio nanofibers with a width of about 6 nm and a length of more than 1 μm. PMID:27211496

  2. Source attribution of methane emissions from global oil and gas production: results of bottom-up simulations over three decades

    Science.gov (United States)

    Höglund-Isaksson, Lena

    2016-04-01

    Existing bottom-up emission inventories of historical methane and ethane emissions from global oil and gas systems do not explain well the year-on-year variations estimated by top-down models from atmospheric measurements. This paper develops a bottom-up methodology which allows for country- and year-specific source attribution of methane and ethane emissions from global oil and natural gas production for the period 1980 to 2012. The analysis rests on country-specific simulations of associated gas flows which are converted into methane and ethane emissions. The associated gas flows are constructed from country-specific information on oil and gas production and associated gas generation and recovery, and coupled with generic assumptions to bridge regional information gaps on the fractions of unrecovered associated gas that is vented instead of flared. Summing up emissions from associated gas flows with global estimates of emissions from unintended leakage and natural gas transmission and distribution, the resulting global emissions of methane and ethane from oil and gas systems are reasonably consistent with corresponding estimates from top-down models. Also revealed is that the fall of the Soviet Union in 1990 had a significant impact on methane and ethane emissions from global oil and gas systems.
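    A minimal sketch of the bookkeeping described above, splitting unrecovered associated gas into vented and flared shares and converting the result into methane and ethane, is shown below. The function name, gas-composition fractions, and flare efficiency are illustrative assumptions, not values from the paper.

```python
# Illustrative factors only; a real inventory uses country- and year-specific values.
CH4_FRACTION_OF_GAS = 0.85        # volume fraction of methane in associated gas (assumption)
C2H6_FRACTION_OF_GAS = 0.07       # volume fraction of ethane (assumption)
FLARE_COMBUSTION_EFFICIENCY = 0.98

def associated_gas_emissions(gas_generated, gas_recovered, vented_share):
    """Toy per-country, per-year estimate of CH4 and C2H6 (same units as the gas flows)."""
    unrecovered = max(gas_generated - gas_recovered, 0.0)
    vented = unrecovered * vented_share
    flared = unrecovered - vented
    # Vented gas is released unburned; flaring destroys most of it.
    uncombusted = vented + flared * (1 - FLARE_COMBUSTION_EFFICIENCY)
    return uncombusted * CH4_FRACTION_OF_GAS, uncombusted * C2H6_FRACTION_OF_GAS

print(associated_gas_emissions(gas_generated=10.0, gas_recovered=6.0, vented_share=0.3))
```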

  3. Thousand and one ways to quantify and compare protein abundances in label-free bottom-up proteomics.

    Science.gov (United States)

    Blein-Nicolas, Mélisande; Zivy, Michel

    2016-08-01

    How to process and analyze MS data to quantify and statistically compare protein abundances in bottom-up proteomics has been an open debate for nearly fifteen years. Two main approaches are generally used: the first is based on spectral data generated during the process of identification (e.g. peptide counting, spectral counting), while the second makes use of extracted ion currents to quantify chromatographic peaks and infer protein abundances based on peptide quantification. These two approaches actually refer to multiple methods which have been developed during the last decade, but were submitted to deep evaluations only recently. In this paper, we compiled these different methods as exhaustively as possible. We also summarized the way they address the different problems raised by bottom-up protein quantification such as normalization, the presence of shared peptides, unequal peptide measurability and missing data. This article is part of a Special Issue entitled: Plant Proteomics- a bridge between fundamental processes and crop production, edited by Dr. Hans-Peter Mock. PMID:26947242
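    As one concrete example of the spectral-counting family compiled in this review, the normalized spectral abundance factor (NSAF) divides each protein's spectral count by its length and renormalizes across the sample. The sketch below is illustrative only; the review itself weighs many variants of both spectral-count and extracted-ion-current methods.

```python
def nsaf(spectral_counts, protein_lengths):
    """Normalized spectral abundance factor for one sample.

    spectral_counts : dict protein -> number of assigned spectra
    protein_lengths : dict protein -> length in residues
    """
    saf = {p: spectral_counts[p] / protein_lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}

# Hypothetical two-protein example.
print(nsaf({"P1": 120, "P2": 30}, {"P1": 400, "P2": 250}))
```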

  4. Bottom-up versus top-down effects on ciliate community composition in four eutrophic lakes (China).

    Science.gov (United States)

    Li, Jing; Chen, Feizhou; Liu, Zhengwen; Zhao, Xiuxia; Yang, Kun; Lu, Wenxuan; Cui, Kai

    2016-04-01

    Previous studies have shown that ciliate plankton is generally controlled by food resources (e.g., algae) and predators (e.g., metazooplankton). Among lakes with similar trophic levels but different distributions of phyto- and metazooplankton, the main forces acting on ciliate assemblages may be different. We investigated the relationship between ciliate communities and bottom-up versus top-down variables based on a survey of four subtropical eutrophic lakes (China). Two of the lakes (Chaohu, Taihu) are located on the Mid-lower Yangtze Plain near sea level, and the other two (Dianchi, Xingyunhu) on the Yunnan-Kweichow Plateau at 1700m above sea level. Blooms of cyanobacteria developed during summer in Lakes Chaohu and Taihu and throughout the year in Lakes Dianchi and Xingyunhu. Ciliate functional feeding groups differed significantly between lakes. The results of canonical correspondence analysis (CCA) and variation partitioning showed that cyanobacteria significantly influence ciliate species, whereas 'edible' algae (cryptophytes, diatoms) and cladocerans were the important variables in explaining the ciliate community structure of Lakes Dianchi and Xingyunhu compared with Lakes Taihu and Chaohu. Our results highlight the importance of consistent cyanobacterial blooms in shaping the ciliate community in subtropical eutrophic shallow lakes by interacting with top-down and bottom-up factors. PMID:26773905

  5. Bottom-up and top-down solid-state NMR approaches for bacterial biofilm matrix composition

    Science.gov (United States)

    Cegelski, Lynette

    2015-04-01

    The genomics and proteomics revolutions have been enormously successful in providing crucial "parts lists" for biological systems. Yet, formidable challenges exist in generating complete descriptions of how the parts function and assemble into macromolecular complexes and whole-cell assemblies. Bacterial biofilms are complex multicellular bacterial communities protected by a slime-like extracellular matrix that confers protection to environmental stress and enhances resistance to antibiotics and host defenses. As a non-crystalline, insoluble, heterogeneous assembly, the biofilm extracellular matrix poses a challenge to compositional analysis by conventional methods. In this perspective, bottom-up and top-down solid-state NMR approaches are described for defining chemical composition in complex macrosystems. The "sum-of-the-parts" bottom-up approach was introduced to examine the amyloid-integrated biofilms formed by Escherichia coli and permitted the first determination of the composition of the intact extracellular matrix from a bacterial biofilm. An alternative top-down approach was developed to define composition in Vibrio cholerae biofilms and relied on an extensive panel of NMR measurements to tease out specific carbon pools from a single sample of the intact extracellular matrix. These two approaches are widely applicable to other heterogeneous assemblies. For bacterial biofilms, quantitative parameters of matrix composition are needed to understand how biofilms are assembled, to improve the development of biofilm inhibitors, and to dissect inhibitor modes of action. Solid-state NMR approaches will also be invaluable in obtaining parameters of matrix architecture.

  6. Galvanostatic bottom-up filling of TSV-like trenches: Choline-based leveler containing two quaternary ammoniums

    International Nuclear Information System (INIS)

    Highlights: • The choline-based leveler having two quaternary ammoniums was synthesized. • The adsorption of this leveler with suppressor and accelerator was examined. • Galvanostatic Cu bottom-up filling was achieved with a three-additive system. • The mechanism of gap-filling was elucidated based on the additive adsorption. - Abstract: Through Silicon Via (TSV) technology is essential to accomplish 3-dimensional packaging of electronics. Hence, more reliable and faster TSV filling by Cu electrodeposition is required. Our approach to improve Cu gap-filling in TSV is based on the development of new organic additives for feature filling. Here, we introduce our achievements from the synthesis of a choline-based leveler to the feature filling using the synthesized leveler. The choline-based leveler, which includes two quaternary ammoniums at both ends of the molecule, is synthesized from glutaric acid. The characteristics of the choline-based additive are examined by electrochemical analyses, and it is confirmed that the choline-based leveler shows a convection-dependent adsorption behavior, which is essential for leveling. The interactions between the polymeric suppressor, accelerator, and the choline-based leveler are also investigated by changing the convection condition. Using the combination of suppressor, accelerator, and the choline-based leveler, extreme bottom-up filling of Cu in trenches with dimensions similar to TSVs is achieved. The mechanism of Cu gap-filling is demonstrated based on the results of electrochemical analyses and feature filling

  7. Optimal Environmental Conditions and Anomalous Ecosystem Responses: Constraining Bottom-up Controls of Phytoplankton Biomass in the California Current System

    Science.gov (United States)

    Jacox, Michael G.; Hazen, Elliott L.; Bograd, Steven J.

    2016-06-01

    In Eastern Boundary Current systems, wind-driven upwelling drives nutrient-rich water to the ocean surface, making these regions among the most productive on Earth. Regulation of productivity by changing wind and/or nutrient conditions can dramatically impact ecosystem functioning, though the mechanisms are not well understood beyond broad-scale relationships. Here, we explore bottom-up controls during the California Current System (CCS) upwelling season by quantifying the dependence of phytoplankton biomass (as indicated by satellite chlorophyll estimates) on two key environmental parameters: subsurface nitrate concentration and surface wind stress. In general, moderate winds and high nitrate concentrations yield maximal biomass near shore, while offshore biomass is positively correlated with subsurface nitrate concentration. However, due to nonlinear interactions between the influences of wind and nitrate, bottom-up control of phytoplankton cannot be described by either one alone, nor by a combined metric such as nitrate flux. We quantify optimal environmental conditions for phytoplankton, defined as the wind/nitrate space that maximizes chlorophyll concentration, and present a framework for evaluating ecosystem change relative to environmental drivers. The utility of this framework is demonstrated by (i) elucidating anomalous CCS responses in 1998–1999, 2002, and 2005, and (ii) providing a basis for assessing potential biological impacts of projected climate change.
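    The framework of locating the wind/nitrate region that maximizes chlorophyll can be illustrated with a simple binning sketch. This is not the authors' method in detail; the binning scheme, variable names, and synthetic data below are assumptions used only to show the idea of mapping mean chlorophyll over the two environmental axes.

```python
import numpy as np

def optimal_conditions(wind, nitrate, chl, n_bins=15):
    """Bin chlorophyll in wind/nitrate space and locate the bin where it peaks."""
    w_edges = np.linspace(wind.min(), wind.max(), n_bins + 1)
    n_edges = np.linspace(nitrate.min(), nitrate.max(), n_bins + 1)
    mean_chl = np.full((n_bins, n_bins), np.nan)
    for i in range(n_bins):
        for j in range(n_bins):
            in_bin = ((wind >= w_edges[i]) & (wind < w_edges[i + 1]) &
                      (nitrate >= n_edges[j]) & (nitrate < n_edges[j + 1]))
            if in_bin.any():
                mean_chl[i, j] = chl[in_bin].mean()
    i_opt, j_opt = np.unravel_index(np.nanargmax(mean_chl), mean_chl.shape)
    return w_edges[i_opt], n_edges[j_opt], mean_chl

# Synthetic example: chlorophyll peaks at moderate wind stress and high nitrate.
rng = np.random.default_rng(0)
wind, nitrate = rng.uniform(0, 0.2, 2000), rng.uniform(0, 30, 2000)
chl = np.exp(-((wind - 0.08) ** 2) / 0.002) * nitrate + rng.normal(0, 1, 2000)
print(optimal_conditions(wind, nitrate, chl)[:2])
```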

  8. Orchestrated structure evolution: accelerating direct-write nanomanufacturing by combining top-down patterning with bottom-up growth

    International Nuclear Information System (INIS)

    Direct-write nanomanufacturing with scanning beams and probes is flexible and can produce high quality products, but it is normally slow and expensive to raster point-by-point over a pattern. We demonstrate the use of an accelerated direct-write nanomanufacturing method called 'orchestrated structure evolution' (OSE), where a direct-write tool patterns a small number of growth 'seeds' that subsequently grow into the final thin film pattern. Through control of seed size and spacing, it is possible to vary the ratio of 'top-down' to 'bottom-up' character of the patterning processes, ranging from conventional top-down raster patterning to nearly pure bottom-up space-filling via seed growth. Electron beam lithography (EBL) and copper electrodeposition were used to demonstrate trade-offs between process time and product quality over nano- to microlength scales. OSE can reduce process times for high-cost EBL patterning by orders of magnitude, at the expense of longer (but inexpensive) copper electrodeposition processing times. We quantify the degradation of pattern quality that accompanies fast OSE patterning by measuring deviations from the desired patterned area and perimeter. We also show that the density of OSE-induced grain boundaries depends upon the seed separation and size. As the seed size is reduced, the uniformity of an OSE film becomes more dependent on details of seed nucleation processes than normally seen for conventionally patterned films.

  9. The drastic outcomes from voting alliances in three-party bottom-up democratic voting (1990 → 2013)

    CERN Document Server

    Galam, Serge

    2013-01-01

    The drastic effect of local alliances in three-party competition is investigated in democratic hierarchical bottom-up voting. The results are obtained analytically using a model which extends a sociophysics frame introduced in 1986 \\cite{psy} and 1990 \\cite{lebo} to study two-party systems and the spontaneous formation of democratic dictatorship. It is worth stressing that the 1990 paper was published in the Journal of Statistical Physics, the first paper of its kind in the journal. It was shown how a minority in power can preserve its leadership using bottom-up democratic elections. However, such a bias holds only down to some critical value of minimum support. The results were used later to explain the sudden collapse of European communist parties in the nineties. The extension to three-party competition reveals the mechanisms by which a very small minority party can get a substantial representation at higher levels of the hierarchy when the other two competing parties are big. Additional surprising results...

  10. Carbon balance: the top-down and bottom-up emissions accounting methodologies; Balanco de carbono: a contabilidade das emissoes nas metodologias 'Top-Down' estendida ('Top-Bottom') e 'Bottom-Up'

    Energy Technology Data Exchange (ETDEWEB)

    Alvim, Carlos Feu; Eidelman, Frida; Ferreira, Omar Campos

    2005-08-15

    The Economy and Energy Organization has carried out together with the Ministry of Science and Technology a study on the carbon balance of energy use and transformation. The publication of its results has been made through the e and e periodical in its 48 and 50 issues. In the present issue we are publishing the results corresponding to the extended Top-Down accounting process and those corresponding to the use of the coefficients calculated for the Brazilian inventory from 1990 to 1994, using the Bottom-Up process, to estimate the emissions from 1970 to 2002. By comparing the two results it is possible to evaluate their deficiencies and the possible incoherence in the use of the two methodologies. (author)

  11. Dual Low-Rank Pursuit: Learning Salient Features for Saliency Detection.

    Science.gov (United States)

    Lang, Congyan; Feng, Jiashi; Feng, Songhe; Wang, Jingdong; Yan, Shuicheng

    2016-06-01

    Saliency detection is an important procedure for machines to understand the visual world as humans do. In this paper, we consider a specific saliency detection problem of predicting human eye fixations when they freely view natural images, and propose a novel dual low-rank pursuit (DLRP) method. DLRP learns saliency-aware feature transformations by utilizing available supervision information and constructs discriminative bases for effectively detecting human fixation points under the popular low-rank and sparsity-pursuit framework. Benefiting from the embedded high-level information in the supervised learning process, DLRP is able to predict fixations accurately without performing the expensive object segmentation as in previous works. Comprehensive experiments clearly show the superiority of the proposed DLRP method over the established state-of-the-art methods. We also empirically demonstrate that DLRP provides stronger generalization performance across different data sets and inherits the advantages of both the bottom-up- and top-down-based saliency detection methods. PMID:27046853

  12. Atomic layer deposition-Sequential self-limiting surface reactions for advanced catalyst "bottom-up" synthesis

    Science.gov (United States)

    Lu, Junling; Elam, Jeffrey W.; Stair, Peter C.

    2016-06-01

    Catalyst synthesis with precise control over the structure of catalytic active sites at the atomic level is of essential importance for the scientific understanding of reaction mechanisms and for rational design of advanced catalysts with high performance. Such precise control is achievable using atomic layer deposition (ALD). ALD is similar to chemical vapor deposition (CVD), except that the deposition is split into a sequence of two self-limiting surface reactions between gaseous precursor molecules and a substrate. The unique self-limiting feature of ALD allows conformal deposition of catalytic materials on a high surface area catalyst support at the atomic level. The deposited catalytic materials can be precisely constructed on the support by varying the number and type of ALD cycles. As an alternative to the wet-chemistry based conventional methods, ALD provides a cycle-by-cycle "bottom-up" approach for nanostructuring supported catalysts with near atomic precision. In this review, we summarize recent attempts to synthesize supported catalysts with ALD. Nucleation and growth of metals by ALD on oxides and carbon materials for precise synthesis of supported monometallic catalyst are reviewed. The capability of achieving precise control over the particle size of monometallic nanoparticles by ALD is emphasized. The resulting metal catalysts with high dispersions and uniformity often show comparable or remarkably higher activity than those prepared by conventional methods. For supported bimetallic catalyst synthesis, we summarize the strategies for controlling the deposition of the secondary metal selectively on the primary metal nanoparticle but not on the support to exclude monometallic formation. As a review of the surface chemistry and growth behavior of metal ALD on metal surfaces, we demonstrate the ways to precisely tune size, composition and structure of bimetallic metal nanoparticles. The cycle-by-cycle "bottom up" construction of bimetallic (or multiple

  13. Direct and indirect bottom-up and top-down forces shape the abundance of the orb-web spider Argiope bruennichi

    OpenAIRE

    Bruggisser, Odile T; Sandau, Nadine; Blandenier, Gilles; Fabian, Yvonne; Kehrli, Patrik; Aebi, Alex; Naisbit, Russell E; Bersier, Louis-Félix

    2014-01-01

    Species abundance in local communities is determined by bottom-up and top-down processes, which can act directly and indirectly on the focal species. Studies examining these effects simultaneously are rare. Here we explore the direct top-down and direct and indirect bottom-up forces regulating the abundance and predation success of an intermediate predator, the web-building spider Argiope bruennichi (Araneae: Araneidae). We manipulated plant diversity (2, 6, 12 or 20 sown species) in 9 wildfl...

  14. The influence of top-down, bottom-up and abiotic factors on the moose (Alces alces) population of Isle Royale.

    OpenAIRE

    Vucetich, John A.; Peterson, Rolf O.

    2004-01-01

    Long-term, concurrent measurement of population dynamics and associated top-down and bottom-up processes are rare for unmanipulated, terrestrial systems. Here, we analyse populations of moose, their predators (wolves, Canis lupus), their primary winter forage (balsam fir, Abies balsamea) and several climatic variables that were monitored for 40 consecutive years in Isle Royale National Park (544 km2), Lake Superior, USA. We judged the relative importance of top-down, bottom-up and abiotic fac...

  15. Balancing Top-Down, Bottom-Up, and Peer-to-Peer Approaches to Sustaining Distance Training

    Directory of Open Access Journals (Sweden)

    Zane BERGE

    2006-07-01

    Full Text Available Many distance training case studies identify distance training leadership as bottom-up, whereas much of the literature suggests a need for strategic, top-down approaches. With change management as an overarching framework, approaches to sustaining distance training that originate at different levels of the organization are explored. Special attention is paid to the content of the change messages involved, guided by Rogers’ five attributes of innovations. Research of change management and distance training literature suggests a combination of approaches that should fit the organizational culture as well as correctly address genuine concerns at the various organizational levels. A properly balanced approach could lead to new levels of communication and understanding in a learning organization and to distance training being sustained as a business process

  16. Mass Spectrometry Applied to Bottom-Up Proteomics: Entering the High-Throughput Era for Hypothesis Testing

    Science.gov (United States)

    Gillet, Ludovic C.; Leitner, Alexander; Aebersold, Ruedi

    2016-06-01

    Proteins constitute a key class of molecular components that perform essential biochemical reactions in living cells. Whether the aim is to extensively characterize a given protein or to perform high-throughput qualitative and quantitative analysis of the proteome content of a sample, liquid chromatography coupled to tandem mass spectrometry has become the technology of choice. In this review, we summarize the current state of mass spectrometry applied to bottom-up proteomics, the approach that focuses on analyzing peptides obtained from proteolytic digestion of proteins. With the recent advances in instrumentation and methodology, we show that the field is moving away from providing qualitative identification of long lists of proteins to delivering highly consistent and accurate quantification values for large numbers of proteins across large numbers of samples. We believe that this shift will have a profound impact for the field of proteomics and life science research in general.

  17. The faith of a physicist: reflections of a bottom-up thinker: the Gifford lectures for 1993-4

    CERN Document Server

    Polkinghorne, John C

    1994-01-01

    Is it possible to think like a scientist and yet have the faith of a Christian? Although many Westerners might say no, there are also many critically minded individuals who entertain what John Polkinghorne calls a "wistful wariness" toward religion--they feel unable to accept religion on rational grounds yet cannot dismiss it completely. Polkinghorne, both a particle physicist and Anglican priest, here explores just what rational grounds there could be for Christian beliefs, maintaining that the quest for motivated understanding is a concern shared by scientists and religious thinkers alike. Anyone who assumes that religion is based on unquestioning certainties, or that it need not take into account empirical knowledge, will be challenged by Polkinghorne's bottom-up examination of Christian beliefs about events ranging from creation to the resurrection. The author organizes his inquiry around the Nicene Creed, an early statement that continues to summarize Christian beliefs. He applies to each of its tenets ...

  18. Identifying robust clusters and multi-community nodes by combining top-down and bottom-up approaches to clustering

    CERN Document Server

    Gaiteri, Chris; Szymanski, Boleslaw; Kuzmin, Konstantin; Xie, Jierui; Lee, Changkyu; Blanche, Timothy; Neto, Elias Chaibub; Huang, Su-Chun; Grabowski, Thomas; Madhyastha, Tara; Komashko, Vitalina

    2015-01-01

    Biological functions are often realized by groups of interacting molecules or cells. Membership in these groups may overlap when molecules or cells are reused in multiple functions. Traditional clustering methods assign each component to one group. Noisy measurements are common in high-throughput biological datasets. These two limitations reduce our ability to accurately define clusters in biological datasets and to interpret their biological functions. To address these limitations, we designed an algorithm called SpeakEasy, which detects overlapping or non-overlapping communities in biological networks. Input to SpeakEasy can be physical networks, such as molecular interactions, or inferred networks, such as gene coexpression networks. The networks can be directed or undirected, and may contain negative links. SpeakEasy combines traditional bottom-up and top-down approaches to clustering, by creating competition between clusters. Nodes that oscillate between multiple clusters in this competition are classifi...

  19. Aid effectiveness from Rome to Busan: some progress but lacking bottom-up approaches or behaviour changes.

    Science.gov (United States)

    Martini, Jessica; Mongo, Roch; Kalambay, Hyppolite; Fromont, Anne; Ribesse, Nathalie; Dujardin, Bruno

    2012-07-01

    The Busan partnership adopted at the 4th High Level Forum on Aid Effectiveness at the end of last year is a significant step forward towards the improvement of aid quality and the promotion of development. In particular, the inclusiveness achieved in Busan and the shift in discourse from 'aid effectiveness' to 'development effectiveness' are emblematic. However, key challenges still remain. Firstly, decision-making should be more bottom-up, finding ways to take into account the populations' needs and experiences and to enhance self-learning dynamics during the policy process. Today, it is particularly necessary to define what 'development' means at country level, according to the aspirations of particular categories of people and meeting operational and local expectations. Secondly, changes in language should be followed by a real change in mindset. Development stakeholders should further adapt their procedures to the reality of complex systems in which development interventions are being dealt with. PMID:22583911

  20. Radiographic Evaluation of Children with Febrile Urinary Tract Infection: Bottom-Up, Top-Down, or None of the Above?

    Directory of Open Access Journals (Sweden)

    Michaella M. Prasad

    2012-01-01

    Full Text Available The proper algorithm for the radiographic evaluation of children with febrile urinary tract infection (FUTI) is hotly debated. Three studies are commonly administered: renal-bladder ultrasound (RUS), voiding cystourethrogram (VCUG), and dimercapto-succinic acid (DMSA) scan. However, the order in which these tests are obtained depends on the methodology followed: bottom-up or top-down. Each strategy carries advantages and disadvantages, and some groups now advocate even less of a workup (none of the above) due to the current controversies about treatment when abnormalities are diagnosed. New technology is available and still under investigation, but it may help to clarify the interplay between vesicoureteral reflux, renal scarring, and dysfunctional elimination in the future.

  1. Construction of membrane-bound artificial cells using microfluidics: a new frontier in bottom-up synthetic biology

    Science.gov (United States)

    Elani, Yuval

    2016-01-01

    The quest to construct artificial cells from the bottom-up using simple building blocks has received much attention over recent decades and is one of the grand challenges in synthetic biology. Cell mimics that are encapsulated by lipid membranes are a particularly powerful class of artificial cells due to their biocompatibility and the ability to reconstitute biological machinery within them. One of the key obstacles in the field centres on the following: how can membrane-based artificial cells be generated in a controlled way and in high-throughput? In particular, how can they be constructed to have precisely defined parameters including size, biomolecular composition and spatial organization? Microfluidic generation strategies have proved instrumental in addressing these questions. This article will outline some of the major principles underpinning membrane-based artificial cells and their construction using microfluidics, and will detail some recent landmarks that have been achieved. PMID:27284034

  2. Bottom-Up Fabrication of Nanopatterned Polymers on DNA Origami by In Situ Atom-Transfer Radical Polymerization.

    Science.gov (United States)

    Tokura, Yu; Jiang, Yanyan; Welle, Alexander; Stenzel, Martina H; Krzemien, Katarzyna M; Michaelis, Jens; Berger, Rüdiger; Barner-Kowollik, Christopher; Wu, Yuzhou; Weil, Tanja

    2016-05-01

    Bottom-up strategies to fabricate patterned polymers at the nanoscale represent an emerging field in the development of advanced nanodevices, such as biosensors, nanofluidics, and nanophotonics. DNA origami techniques provide access to distinct architectures of various sizes and shapes and present manifold opportunities for functionalization at the nanoscale with the highest precision. Herein, we conduct in situ atom-transfer radical polymerization (ATRP) on DNA origami, yielding differently nanopatterned polymers of various heights. After cross-linking, the grafted polymeric nanostructures can even stably exist in solution without the DNA origami template. This straightforward approach allows for the fabrication of patterned polymers with low nanometer resolution, which provides access to unique DNA-based functional hybrid materials. PMID:27058968

  3. Reconciling Long-Term Trends in Air Quality with Bottom-up Emission Inventories for Los Angeles

    Science.gov (United States)

    Mcdonald, B. C.; Kim, S. W.; Frost, G. J.; Harley, R.; Trainer, M.

    2014-12-01

    Significant long-term changes in air quality have been observed in the United States over several decades. However, reconciling ambient observations with bottom-up emission inventories has proved challenging. In this study, we perform WRF-Chem modeling in the Los Angeles basin for carbon monoxide (CO), nitrogen oxides (NOx), volatile organic compounds (VOCs), and ozone (O3) over a long time period (1987-2010). To improve reconciliation of emission inventories with atmospheric observations, we incorporate new high-resolution emissions maps of a major to dominant source of urban air pollution, motor vehicles. A fuel-based approach is used to estimate motor vehicle emissions utilizing annual fuel sales reports, traffic count data that capture spatial and temporal patterns of vehicle activity, and pollutant emission factors measured from roadway studies performed over the last twenty years. We also update emissions from stationary sources using Continuous Emissions Monitoring Systems (CEMS) data when available, and use emission inventories developed by the South Coast Air Quality Management District (SCAQMD) and California Air Resources Board (ARB) for other important emission source categories. WRF-Chem modeling is performed in three years where field-intensive measurements were made: 1987 (SCAQS: Southern California Air Quality Study), 2002 (ITCT: Intercontinental Transport and Chemical Transformation Study), and 2010 (CALNEX). We assess the ability of the improved bottom-up emissions inventory to predict long-term changes in ambient levels of CO, NOx, and O3, which are known to have occurred over this time period. We also assess changing spatial and temporal patterns of primary (CO and NOx) and secondary (O3) pollutant concentrations across the Los Angeles basin, which has important implications on human health.
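    A fuel-based inventory of the kind described above can be reduced to a simple calculation: total fuel sold times a per-fuel emission factor, allocated to grid cells in proportion to observed traffic activity. The sketch below uses hypothetical numbers and names; real inventories distinguish vehicle classes, fuels, and time-of-day profiles.

```python
def fuel_based_emissions(fuel_sales_kg, emission_factor_g_per_kg, traffic_share):
    """Toy fuel-based inventory: a regional total scaled to grid cells by traffic counts.

    traffic_share : dict grid cell -> fraction of vehicle activity (sums to 1)
    Returns grams of pollutant per grid cell for the inventory period.
    """
    total_g = fuel_sales_kg * emission_factor_g_per_kg
    return {cell: total_g * share for cell, share in traffic_share.items()}

# Illustrative values only (not from the study).
grid = fuel_based_emissions(
    fuel_sales_kg=5.0e9,              # annual gasoline sold in the basin
    emission_factor_g_per_kg=20.0,    # CO emission factor from roadway measurements
    traffic_share={"cell_A": 0.6, "cell_B": 0.4},
)
print(grid)
```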

  4. China's energy and emissions outlook to 2050: Perspectives from bottom-up energy end-use model

    International Nuclear Information System (INIS)

    Although China became the world's largest CO2 emitter in 2007, the country has also taken serious actions to reduce its energy and carbon intensity. This study uses the bottom-up LBNL China End-Use Energy Model to assess the role of energy efficiency policies in transitioning China to a lower emission trajectory and meeting its 2020 intensity reduction goals. Two scenarios – Continued Improvement and Accelerated Improvement – were developed to assess the impact of actions already taken by the Chinese government as well as planned and potential actions, and to evaluate the potential for China to reduce energy demand and emissions. This scenario analysis presents an important modeling approach based in the diffusion of end-use technologies and physical drivers of energy demand and thereby help illuminate China's complex and dynamic drivers of energy consumption and implications of energy efficiency policies. The findings suggest that China's CO2 emissions will not likely continue growing throughout this century because of saturation effects in appliances, residential and commercial floor area, roadways, fertilizer use; and population peak around 2030 with slowing urban population growth. The scenarios also underscore the significant role that policy-driven efficiency improvements will play in meeting 2020 carbon mitigation goals along with a decarbonized power supply. - Highlights: ► Bottom-up model of China's energy and CO2 reductions through sectoral policies. ► 2 scenarios evaluate impact of actions already taken/planned and future potential. ► China's CO2 will not likely continue growing through 2050 due to saturation effects. ► Results emphasize both policy-driven efficiency and decarbonized power supply.
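    The core accounting step of a bottom-up end-use model of this type multiplies a physical activity driver by technology shares and unit energy intensities and sums over technologies. The sketch below is a toy version with hypothetical inputs, not the LBNL China End-Use Energy Model itself.

```python
def end_use_energy(activity, tech_shares, intensities):
    """Bottom-up end-use demand: activity x technology share x unit intensity.

    activity    : physical driver (e.g. residential floor area, m2)
    tech_shares : dict tech -> share of activity served (sums to 1)
    intensities : dict tech -> energy use per unit of activity (kWh per m2)
    """
    return sum(activity * share * intensities[tech]
               for tech, share in tech_shares.items())

# Illustrative: space-heating demand for 1 billion m2 of floor area.
demand_kwh = end_use_energy(
    activity=1.0e9,
    tech_shares={"coal_boiler": 0.5, "heat_pump": 0.5},
    intensities={"coal_boiler": 80.0, "heat_pump": 30.0},
)
print(demand_kwh)
```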

  5. Zooplankton and forage fish species off Peru: Large-scale bottom-up forcing and local-scale depletion

    Science.gov (United States)

    Ayón, Patricia; Swartzman, Gordon; Bertrand, Arnaud; Gutiérrez, Mariano; Bertrand, Sophie

    2008-10-01

    The Humboldt Current System, like all upwelling systems, has dramatic quantities of plankton-feeding fish, which suggests that their population dynamics may ‘drive’ or ‘control’ ecosystem dynamics. With this in mind we analysed the relationship between forage fish populations and their main prey, zooplankton populations. Our study combined a zooplankton sampling program (1961-2005) with simultaneous acoustic observations on fish from 40 pelagic surveys (1983-2005) conducted by the Peruvian Marine Research Institute (IMARPE) and landing statistics for anchoveta (Engraulis ringens) and sardine (Sardinops sagax) along the Peruvian coast from 1961 to 2005. The multi-year trend of anchoveta population abundance varied consistently with the zooplankton biovolume trend, suggesting bottom-up control on anchovy at the population scale (since oceanographic conditions and phytoplankton production support the changes in zooplankton abundance). For a finer-scale analysis (km) we statistically modelled zooplankton biovolume as a function of geographical (latitude and distance from the 200-m isobath), environmental (sea surface temperature), temporal (year, month and time-of-day) and biological (acoustic anchovy and sardine biomass within 5 km of each zooplankton sample) covariates over all surveys using both classification and regression trees (CART) and generalized additive models (GAM). CART showed local anchoveta density to have the strongest effect on zooplankton biovolume, with significantly reduced levels of biovolume for higher neighbourhood anchoveta biomass. Additionally, zooplankton biovolume was higher offshore than on the shelf. GAM results corroborated the CART findings, also showing a clear diel effect on zooplankton biovolume, probably due to diel migration or daytime net avoidance. Apparently, the observed multi-year population-scale bottom-up control is not inconsistent with local depletion of zooplankton when anchoveta are locally abundant, since the
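    The CART part of the analysis described above can be sketched with a standard regression tree over the survey covariates. The column names and synthetic data below are placeholders; the point is only that feature importances from such a tree indicate which covariate (e.g., local anchoveta biomass) most strongly partitions zooplankton biovolume.

```python
import numpy as np
import pandas as pd
from sklearn.tree import DecisionTreeRegressor

# Hypothetical covariates standing in for the survey data.
rng = np.random.default_rng(1)
df = pd.DataFrame({
    "latitude": rng.uniform(-18, -4, 500),
    "dist_200m_isobath_km": rng.uniform(-50, 150, 500),
    "sst": rng.uniform(14, 26, 500),
    "month": rng.integers(1, 13, 500),
    "anchovy_biomass_5km": rng.lognormal(0, 1, 500),
    "zoo_biovolume": rng.lognormal(0, 1, 500),
})

X = df.drop(columns="zoo_biovolume")
y = np.log(df["zoo_biovolume"])          # biovolume is typically log-transformed

cart = DecisionTreeRegressor(max_depth=4, min_samples_leaf=30).fit(X, y)
# Which covariate most strongly partitions zooplankton biovolume?
print(dict(zip(X.columns, cart.feature_importances_.round(2))))
```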

  6. A bottom-up analysis of China’s iron and steel industrial energy consumption and CO2 emissions

    International Nuclear Information System (INIS)

    Highlights: • This paper analyses future steel demand and steel scrap consumption through bottom-up analysis. • Steel scrap consumption is analyzed individually by different sources. • Steel consumption and energy consumption will peak at around 2020 and 2015 respectively. • Energy intensity and CO2 intensity of steel production will decrease obviously in the future. • Energy efficiency improvement and structural change will play different roles in near- and long-term CO2 mitigations. - Abstract: China’s steel industry has grown significantly since the mid-1990s, and has been the backbone of Chinese heavy industry. It is also the most energy intensive industrial sector in China, accounting for 16.1% of total energy consumption in 2010. To assess energy consumption and CO2 emissions from China’s steel industry, a system dynamics model and a bottom-up energy system model-TIMES (The Integrated MARKAL-EFOM System) were used to analyze steel demand, energy consumption and CO2 emissions from China’s iron and steel industry from 2010 to 2050. The model results suggest that steel production in China will rise from 627 Mt in 2010, to a peak of 772 Mt in 2020, and then gradually decrease to 527 Mt in 2050. The share of Electric Arc Furnace (EAF) steel production will also increase significantly from 9.8% in 2010, to 45.6% in 2050. With the deployment of energy conservation technologies, such as Coke Dry Quenching, exhaust gas and heat recovery equipment, energy intensity and CO2 intensity of steel production will keep decreasing during the modeling period. In the near future, reductions in energy intensity and CO2 intensity will rely more on energy efficiency improvements; however, from a long-term perspective, structural change-the increasing share of EAF steel production, will be of great significance

  7. Competitive Brand Salience

    OpenAIRE

    Ralf van der Lans; Rik Pieters; Michel Wedel

    2008-01-01

    Brand salience—the extent to which a brand visually stands out from its competitors—is vital in competing on the shelf, yet is not easy to achieve in practice. This study proposes a methodology to determine the competitive salience of brands, based on a model of visual search and eye-movement recordings collected during a brand search experiment. We estimate brand salience at the point of purchase, based on perceptual features (color, luminance, edges) and how these are influenced by consumer...

  8. Hybrid bottom-up/top-down energy and economy outlooks: a survey of the IMACLIM-S experiments

    Directory of Open Access Journals (Sweden)

    Frédéric eGhersi

    2015-11-01

    Full Text Available In this paper we survey the research undertaken at the Centre International de Recherche sur l’Environnement et le Développement (CIRED) on the combination of the IMACLIM-S macroeconomic model with ‘bottom-up’ energy modeling, with a view to associate the strengths and circumvent the limitations of both approaches to energy-economy-environment (E3) prospective modeling. We start by presenting the two methodological avenues of coupling IMACLIM-S with detailed energy systems models pursued at CIRED since the late 1990s: (1) the calibration of the behavioral functions of IMACLIM-S that represent the producers’ and consumers’ trade-offs between inputs or consumptions, on a large set of bottom-up modeling results; (2) the coupling of IMACLIM-S to some bottom-up model through the iterative exchange of some of each model’s outputs as the other model’s inputs until convergence of the exchanged data, comprising the main macroeconomic drivers and energy systems variables. In the following section, we turn to numerical application and address the prerequisite of harmonizing national accounts, energy balance and energy price data to produce consistent hybrid input-output matrices as a basis of scenario exploration. We highlight how this data treatment step reveals the discrepancies and biases induced by sticking to the conventional modeling usage of uniform pricing of homogeneous goods. IMACLIM-S rather calibrates agent-specific margins, which we introduce and comment upon. In a further section we sum up the results of 4 IMACLIM-S experiments, insisting upon the value-added of hybrid modeling. These varied experiments regard international climate policy burden sharing; the more general numerical consequences of shifting from a biased standard CGE model perspective to the hybrid IMACLIM approach; the macroeconomic consequences of a strong development of electric mobility in the European Union; and the resilience of public debts to energy shocks
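    The second coupling avenue described above, iteratively exchanging each model's outputs as the other's inputs until the exchanged data converge, can be sketched as a generic fixed-point loop. The function and variable names below are placeholders and the stopping rule is an assumption; the real IMACLIM-S coupling exchanges a much richer set of macroeconomic drivers and energy-system variables.

```python
def soft_link(macro_model, energy_model, macro_guess, tol=1e-3, max_iter=50):
    """Iteratively exchange outputs between a top-down and a bottom-up model.

    macro_model(energy_vars)   -> dict of macro drivers (e.g. GDP, energy prices)
    energy_model(macro_drivers) -> dict of energy-system variables (e.g. demand)
    Iteration stops when the exchanged energy variables stop changing.
    """
    energy_vars = energy_model(macro_guess)       # initial bottom-up run
    for _ in range(max_iter):
        macro = macro_model(energy_vars)          # top-down run on energy results
        new_vars = energy_model(macro)            # bottom-up run on macro drivers
        gap = max(abs(new_vars[k] - energy_vars[k]) / (abs(energy_vars[k]) + 1e-12)
                  for k in new_vars)
        energy_vars = new_vars
        if gap < tol:                             # convergence of exchanged data
            break
    return macro, energy_vars
```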

  9. Intentional action processing results from automatic bottom-up attention: An EEG-investigation into the Social Relevance Hypothesis using hypnosis.

    Science.gov (United States)

    Neufeld, Eleonore; Brown, Elliot C; Lee-Grimm, Sie-In; Newen, Albert; Brüne, Martin

    2016-05-01

    Social stimuli grab our attention. However, it has rarely been investigated how variations in attention affect the processing of social stimuli, although the answer could help us uncover details of social cognition processes such as action understanding. In the present study, we examined how changes to bottom-up attention affects neural EEG-responses associated with intentional action processing. We induced an increase in bottom-up attention by using hypnosis. We recorded the electroencephalographic μ-wave suppression of hypnotized participants when presented with intentional actions in first and third person perspective in a video-clip paradigm. Previous studies have shown that the μ-rhythm is selectively suppressed both when executing and observing goal-directed motor actions; hence it can be used as a neural signal for intentional action processing. Our results show that neutral hypnotic trance increases μ-suppression in highly suggestible participants when they observe intentional actions. This suggests that social action processing is enhanced when bottom-up attentional processes are predominant. Our findings support the Social Relevance Hypothesis, according to which social action processing is a bottom-up driven attentional process, and can thus be altered as a function of bottom-up processing devoted to a social stimulus. PMID:26998562

  10. Analysis of the Economic Impact of Large-Scale Deployment of Biomass Resources for Energy and Materials in the Netherlands. Appendix 1. Bottom-up Scenarios

    International Nuclear Information System (INIS)

    The Bio-based Raw Materials Platform (PGG), part of the Energy Transition in The Netherlands, commissioned the Agricultural Economics Research Institute (LEI) and the Copernicus Institute of Utrecht University to conduct research on the macro-economic impact of large scale deployment of biomass for energy and materials in the Netherlands. Two model approaches were applied based on a consistent set of scenario assumptions: a bottom-up study including technoeconomic projections of fossil and bio-based conversion technologies and a topdown study including macro-economic modelling of (global) trade of biomass and fossil resources. The results of the top-down and bottom-up modelling work are reported separately. The results of the synthesis of the modelling work are presented in the main report. This report (part 1) presents scenarios for future biomass use for energy and materials, and analyses the consequences on energy supply, chemical productions, costs and greenhouse gas (GHG) emissions with a bottom-up approach. The bottom-up projections, as presented in this report, form the basis for modelling work using the top-down macro-economic model (LEITAP) to assess the economic impact of substituting fossil-based energy carriers with biomass in the Netherlands. The results of the macro-economic modelling work, and the linkage between the results of the bottom-up and top-down work, will be presented in the top-down economic part and synthesis report of this study

  11. Integrating Source Apportionment Tracers into a Bottom-up Inventory of Methane Emissions in the Barnett Shale Hydraulic Fracturing Region.

    Science.gov (United States)

    Townsend-Small, Amy; Marrero, Josette E; Lyon, David R; Simpson, Isobel J; Meinardi, Simone; Blake, Donald R

    2015-07-01

    A growing dependence on natural gas for energy may exacerbate emissions of the greenhouse gas methane (CH4). Identifying fingerprints of these emissions is critical to our understanding of potential impacts. Here, we compare stable isotopic and alkane ratio tracers of natural gas, agricultural, and urban CH4 sources in the Barnett Shale hydraulic fracturing region near Fort Worth, Texas. Thermogenic and biogenic sources were compositionally distinct, and emissions from oil wells were enriched in alkanes and isotopically depleted relative to natural gas wells. Emissions from natural gas production varied in δ(13)C and alkane ratio composition, with δD-CH4 representing the most consistent tracer of natural gas sources. We integrated our data into a bottom-up inventory of CH4 for the region, resulting in an inventory of ethane (C2H6) sources for comparison to top-down estimates of CH4 and C2H6 emissions. Methane emissions in the Barnett are a complex mixture of urban, agricultural, and fossil fuel sources, which makes source apportionment challenging. For example, spatial heterogeneity in gas composition and high C2H6/CH4 ratios in emissions from conventional oil production add uncertainty to top-down models of source apportionment. Future top-down studies may benefit from the addition of δD-CH4 to distinguish thermogenic and biogenic sources. PMID:26148556
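    A simple way to see how an isotopic tracer such as δD-CH4 separates thermogenic from biogenic methane is a two-endmember mixing calculation, sketched below. The endmember values are illustrative, not the ones measured in the Barnett study.

```python
def thermogenic_fraction(delta_obs, delta_biogenic=-290.0, delta_thermogenic=-160.0):
    """Two-endmember mixing on dD-CH4 (per mil); endmember values are illustrative."""
    f = (delta_obs - delta_biogenic) / (delta_thermogenic - delta_biogenic)
    return min(max(f, 0.0), 1.0)   # clamp to [0, 1]

# An ambient sample with dD-CH4 of -200 per mil:
print(thermogenic_fraction(-200.0))   # ~0.69 of the CH4 attributed to thermogenic sources
```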

  12. A process to develop operational bottom-up evaluation methods - from reference guidebooks to a practical culture of evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Broc, Jean-Sebastien; Bourges, Bernard [Ecole des Mines de Nantes (DSEE) (France); Adnot, Jerome [Ecole des Mines de Paris, Centre of Energy and Processes (France)

    2007-07-01

    The need to evaluate energy efficiency (EE) activities is increasing, both for accounting for results and for understanding their successes and failures. Indeed, evaluation results should be used both for reporting past activities and for improving future operations. A lack of easy-to-use methods is pointed out by local stakeholders as a major barrier to evaluation. Another issue is the frequent negative perception of evaluation, experienced as a control and/or a waste of time. This paper presents a systematic process to develop bottom-up evaluation methods designed to fit stakeholders' needs: directly operational, easy to take up, and providing useful conclusions to improve operations and to communicate their results. Our approach relies on the principle of experience capitalisation and on an organisation with two levels, central and on-field. It aims to create conditions for continuous improvement. Moreover, it should ensure that the stakeholders involved actually take part in and take advantage of the evaluation process. This methodology handles both impact and process evaluation. For impacts, the focus is on calculation transparency, data quality and the reliability of results. Regarding the operation process, the main issues are analysing the causality between actions and results, and detecting success and failure factors. This work was first developed for the evaluation of local operations in France. The resulting methodology was tested on two case studies from the Eco Energy Plan, a local EE programme implemented in the South-East of France.

  13. Conservative and dissipative force field for simulation of coarse-grained alkane molecules: a bottom-up approach.

    Science.gov (United States)

    Trément, Sébastien; Schnell, Benoît; Petitjean, Laurent; Couty, Marc; Rousseau, Bernard

    2014-04-01

    We apply operational procedures available in the literature to the construction of coarse-grained conservative and friction forces for use in dissipative particle dynamics (DPD) simulations. The full procedure relies on a bottom-up approach: large molecular dynamics trajectories of n-pentane and n-decane modeled with an anisotropic united atom model serve as input for the force field generation. As a consequence, the coarse-grained model is expected to reproduce at least semi-quantitatively structural and dynamical properties of the underlying atomistic model. Two different coarse-graining levels are studied, corresponding to five and ten carbon atoms per DPD bead. The influence of the coarse-graining level on the generated force field contributions, namely, the conservative and the friction part, is discussed. It is shown that the coarse-grained model of n-pentane correctly reproduces self-diffusion and viscosity coefficients of real n-pentane, while the fully coarse-grained model for n-decane at ambient temperature over-predicts diffusion by a factor of 2. However, when the n-pentane coarse-grained model is used as a building block for larger molecules (e.g., n-decane as a two-blob model), a much better agreement with experimental data is obtained, suggesting that the force field constructed is transferable to large macro-molecular systems. PMID:24712786
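    For context, the sketch below shows the generic pairwise DPD force (conservative, dissipative, and random terms linked by fluctuation-dissipation) into which bottom-up tabulated conservative and friction terms would be substituted. The parameter values are textbook defaults, not the ones derived from the n-pentane and n-decane trajectories.

```python
import numpy as np

def dpd_pair_force(r_ij, v_ij, a=25.0, gamma=4.5, kT=1.0, dt=0.01, rc=1.0):
    """Standard DPD pairwise force on bead i from bead j.

    r_ij, v_ij : relative position and velocity vectors of beads i and j
    a, gamma   : repulsion and friction parameters; in a bottom-up model these
                 would come from the atomistic trajectories rather than defaults
    """
    r = np.linalg.norm(r_ij)
    if r >= rc or r == 0.0:
        return np.zeros(3)
    e = r_ij / r
    w = 1.0 - r / rc                                        # weight function w(r)
    f_c = a * w * e                                         # conservative (soft repulsion)
    f_d = -gamma * w**2 * np.dot(e, v_ij) * e               # dissipative (friction)
    sigma = np.sqrt(2.0 * gamma * kT)                       # fluctuation-dissipation
    f_r = sigma * w * np.random.randn() / np.sqrt(dt) * e   # random
    return f_c + f_d + f_r

print(dpd_pair_force(np.array([0.5, 0.0, 0.0]), np.array([0.0, 0.1, 0.0])))
```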

  14. Bottom-Up Fabrication of Activated Carbon Fiber for All-Solid-State Supercapacitor with Excellent Electrochemical Performance.

    Science.gov (United States)

    Ma, Wujun; Chen, Shaohua; Yang, Shengyuan; Chen, Wenping; Weng, Wei; Zhu, Meifang

    2016-06-15

    Activated carbon (AC) is the most extensively used electrode material for commercial electric double layer capacitors (EDLCs), given its high specific surface area (SSA) and moderate cost. However, AC is used primarily in powder form, and processing AC powders into continuous fibers remains a major challenge. If AC powders can be processed into fibers, they may be scaled up for practical supercapacitor (SC) applications and meet the demands of rapidly developing flexible electronics. Herein, we report a bottom-up method to fabricate AC fiber employing graphene oxide (GO) as both dispersant and binder. After chemical reduction, the fiber has high electrical conductivity (185 S m(-1)), high specific surface area (1476.5 m(2) g(-1)), and good mechanical flexibility. An all-solid-state flexible SC was constructed using the prepared fiber as the electrode; the device is free of binder, conducting additive, and additional current collector. The fiber-shaped SC shows high capacitance (27.6 F cm(-3) or 43.8 F g(-1), normalized to the two-electrode volume), superior cyclability (90.4% retention after 10 000 cycles), and good bendability (96.8% retention after bending 1000 times). PMID:27239680
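
    Volumetric capacitance and retention figures of the kind quoted above are commonly derived from galvanostatic charge-discharge data as C = I·Δt/(ΔV·V). The snippet below is a minimal sketch of that arithmetic with hypothetical discharge values, not the measurements reported for this fiber SC.

```python
def volumetric_capacitance(current_A, discharge_time_s, voltage_window_V, volume_cm3):
    """Device capacitance from a galvanostatic discharge, normalised to
    the two-electrode volume (F cm^-3)."""
    return current_A * discharge_time_s / (voltage_window_V * volume_cm3)

def retention(capacitances):
    """Capacitance retention (%) of the last cycle relative to the first."""
    return 100.0 * capacitances[-1] / capacitances[0]

# Hypothetical discharge: 1 mA for 22 s over a 0.8 V window, 1e-3 cm^3 device.
print(volumetric_capacitance(1e-3, 22.0, 0.8, 1e-3))   # ~27.5 F cm^-3
print(retention([27.6, 26.1, 25.0]))                    # ~90.6 %
```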

  15. Evaluating the impact of odors from the 1955 landfills in China using a bottom-up approach.

    Science.gov (United States)

    Cai, Bofeng; Wang, Jinnan; Long, Ying; Li, Wanxin; Liu, Jianguo; Ni, Zhe; Bo, Xin; Li, Dong; Wang, Jianghao; Chen, Xuejing; Gao, Qingxian; Zhang, Lixiao

    2015-12-01

    Landfill odors have become a major concern for the Chinese public. Based on the combination of a first order decay (FOD) model and a ground-level point source Gaussian dispersion model, the impacts of odors emitted from the 1955 landfills in China are evaluated in this paper. Our bottom-up approach uses basic data for each landfill to achieve a more accurate and comprehensive understanding of the impact of landfill odors. Results reveal that the average radius of impact of landfill odors in China is 796 m, and that most landfills (46.85%) fall within the range of 400-1000 m, in line with the results of previous studies. The total land area impacted by odors reaches 837,476 ha, accounting for 0.09% of China's land territory. Guangdong and Sichuan provinces have the largest land areas impacted by odors, while the Tibet Autonomous Region and Tianjin Municipality have the smallest. According to the CALPUFF (California Puff) model and an analysis of social big data, the overall uncertainty of our calculation of the range of odor impacts is roughly -32.88% to 32.67%. This type of study is essential for gaining an accurate and detailed estimate of the affected human population and will prove valuable for addressing the current Not In My Back Yard (NIMBY) challenge in China. PMID:26398549
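
    As a rough illustration of the ground-level point source Gaussian dispersion idea used above, the sketch below computes a downwind centreline concentration and a corresponding radius of impact. The power-law dispersion coefficients, source strength, wind speed and odour threshold are all assumptions for illustration; they are not the paper's FOD-derived emission rates or CALPUFF settings.

```python
import numpy as np

def centerline_conc(Q_g_s, u_m_s, x_m, a=0.08, b=0.06):
    """Ground-level centreline concentration (g m^-3) downwind of a
    ground-level point source, with simple power-law dispersion
    coefficients sigma_y = a*x and sigma_z = b*x (placeholder values)."""
    sigma_y = a * x_m
    sigma_z = b * x_m
    return Q_g_s / (np.pi * u_m_s * sigma_y * sigma_z)

def radius_of_impact(Q_g_s, u_m_s, threshold_g_m3,
                     x_grid=np.arange(10.0, 5000.0, 10.0)):
    """Largest downwind distance at which the concentration still exceeds
    an odour threshold."""
    conc = centerline_conc(Q_g_s, u_m_s, x_grid)
    above = x_grid[conc >= threshold_g_m3]
    return above.max() if above.size else 0.0

# Hypothetical landfill: 0.5 g/s odorous emission, 2 m/s wind, 1e-6 g/m^3 threshold.
print(radius_of_impact(0.5, 2.0, 1e-6))
```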

  16. Conservative and dissipative force field for simulation of coarse-grained alkane molecules: A bottom-up approach

    Energy Technology Data Exchange (ETDEWEB)

    Trément, Sébastien; Rousseau, Bernard, E-mail: bernard.rousseau@u-psud.fr [Laboratoire de Chimie-Physique, UMR 8000 CNRS, Université Paris-Sud, Orsay (France); Schnell, Benoît; Petitjean, Laurent; Couty, Marc [Manufacture Française des Pneumatiques MICHELIN, Centre de Ladoux, 23 place des Carmes, 63000 Clermont-Ferrand (France)

    2014-04-07

    We apply operational procedures available in the literature to the construction of coarse-grained conservative and friction forces for use in dissipative particle dynamics (DPD) simulations. The full procedure relies on a bottom-up approach: large molecular dynamics trajectories of n-pentane and n-decane modeled with an anisotropic united atom model serve as input for the force field generation. As a consequence, the coarse-grained model is expected to reproduce at least semi-quantitatively the structural and dynamical properties of the underlying atomistic model. Two different coarse-graining levels are studied, corresponding to five and ten carbon atoms per DPD bead. The influence of the coarse-graining level on the generated force field contributions, namely, the conservative and the friction parts, is discussed. It is shown that the coarse-grained model of n-pentane correctly reproduces the self-diffusion and viscosity coefficients of real n-pentane, while the fully coarse-grained model for n-decane at ambient temperature over-predicts diffusion by a factor of 2. However, when the n-pentane coarse-grained model is used as a building block for a larger molecule (e.g., n-decane as a two-blob model), much better agreement with experimental data is obtained, suggesting that the force field constructed is transferable to large macromolecular systems.

  17. Beyond Defining the Smart City. Meeting Top-Down and Bottom-Up Approaches in the Middle

    Directory of Open Access Journals (Sweden)

    Jonas Breuer

    2014-05-01

    This paper aims to better frame the discussion and the various, divergent operationalisations and interpretations of the Smart City concept. We start by explicating top-down approaches to the Smart City, followed by what purely bottom-up initiatives can look like. We provide a clear overview of stakeholders' different viewpoints on the city of tomorrow. The consequences and potential impacts of these differing interpretations and approaches should be of particular interest to researchers, policy makers, city administrations, private actors and anyone involved in and concerned with life in cities. Therefore, the goal of this article is not so much to answer the question of what the Smart City is, but rather what the concept can mean for different stakeholders, as well as the consequences of their interpretations. We do this by assembling an eclectic overview, bringing together definitions, examples and operationalisations from academia, policy and industry, and by identifying major trends and approaches to realizing the Smart City. We add to the debate by proposing a different approach that starts from the collective, collaboration and context when researching Smart City initiatives.

  18. Top-Down and Bottom-Up Approaches in 3D Printing Technologies for Drug Delivery Challenges.

    Science.gov (United States)

    Katakam, Prakash; Dey, Baishakhi; Assaleh, Fathi H; Hwisa, Nagiat Tayeb; Adiki, Shanta Kumari; Chandu, Babu Rao; Mitra, Analava

    2015-01-01

    3-Dimensional printing (3DP) constitutes a raft of technologies, based on different physical mechanisms, that generate a 3-dimensional physical object from a digital model. Because of its rapid fabrication and precise geometry, 3DP has become a prominent focus of biomedical and nanobiomaterials research. Despite advancements in targeted, controlled, and pulsatile drug delivery, achieving site-specific and disease-responsive drug release and stringent control over in vivo biodistribution remain important, challenging areas for pharmaceutical research and development and for existing drug delivery techniques. Microelectronic industries are capable of generating nano-/microdrug delivery devices at high throughput with highly precise control over design. Successful miniaturization of micropumps with multireservoir architectures for pharmaceutical delivery, developed using microelectromechanical systems technology, has proved more acceptable than implantable devices. Inkjet printing technologies, which dispense precise amounts of polymer ink solutions, find applications in controlled drug delivery. Bioelectronic products have revolutionized drug delivery technologies. Nanoparticles designed by nanoimprint lithography have shown controlled drug release patterns, biodistribution, and in vivo transport. This review highlights the "top-down" and "bottom-up" approaches of the most promising 3DP technologies and their broader applications in biomedical and therapeutic drug delivery, with a critical assessment of their merits, demerits, and intellectual property rights challenges. PMID:25746205

  19. Biochemistry-directed hollow porous microspheres: bottom-up self-assembled polyanion-based cathodes for sodium ion batteries

    Science.gov (United States)

    Lin, Bo; Li, Qiufeng; Liu, Baodong; Zhang, Sen; Deng, Chao

    2016-04-01

    Biochemistry-directed synthesis of functional nanomaterials has attracted great interest in energy storage, catalysis and other applications. The unique ability of biological systems to guide molecular self-assembly facilitates the construction of distinctive architectures with desirable physicochemical characteristics. Herein, we report a biochemistry-directed "bottom-up" approach to construct hollow porous microspheres of polyanion materials for sodium ion batteries. Two kinds of polyanions, i.e. Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2, are employed as case studies. The microalgae cell enables the formation of a spherical "bottom" bio-precursor. Upon calcination, its tiny core is destroyed and its tough shell carbonizes, yielding the hollow porous microspheres of the "top" product. The nanoscale crystals of the polyanion materials are tightly enwrapped by the highly conductive framework in the hollow microsphere, resulting in a hierarchical nano-microstructure. The whole formation process is disclosed as a "bottom-up" mechanism. Moreover, the biochemistry-directed self-assembly process is confirmed to play a crucial role in the construction of the final architecture. Taking advantage of the well-defined hollow-microsphere architecture, the abundant interior voids and the highly conductive framework, the polyanion materials show favourable sodium-intercalation kinetics. Both materials are capable of high-rate, long-term cycling. After five hundred cycles at 20 C and 10 C, Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2 retain 96.2% and 93.1% of their initial capacity, respectively. Therefore, the biochemistry-directed technique provides a low-cost, highly efficient and widely applicable strategy to produce high-performance polyanion-based cathodes for sodium ion batteries.

  20. Estimation of Emissions from Sugarcane Field Burning in Thailand Using Bottom-Up Country-Specific Activity Data

    Directory of Open Access Journals (Sweden)

    Wilaiwan Sornpoon

    2014-09-01

    Open burning in sugarcane fields is recognized as a major source of air pollution. However, assessments of its emission intensity in many regions of the world still lack information, especially country-specific activity data such as biomass fuel load and combustion factor. A site survey was conducted covering 13 sugarcane plantations subject to different farm management practices and climatic conditions. The results showed that pre-harvest and post-harvest burning are the two main practices followed in Thailand. In 2012, the total production of sugarcane biomass fuel, i.e., dead, dry and fresh leaves, amounted to 10.15 million tonnes, equivalent to a fuel density of 0.79 kg∙m−2. The average combustion factors for the pre-harvest and post-harvest burning systems were determined to be 0.64 and 0.83, respectively. Emissions from sugarcane field burning were estimated using the bottom-up country-specific values from the site survey of this study, and the results were compared with those obtained using default values from the 2006 IPCC Guidelines. The comparison showed that the use of default values leads to underestimating the overall emissions by up to 30%, as emissions from post-harvest burning are not accounted for even though it is the second most common practice in Thailand.
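
    The bottom-up estimate described above follows the generic IPCC form: emission = burned area × fuel load × combustion factor × emission factor. The sketch below applies that identity with the fuel density and pre-harvest combustion factor quoted in the abstract but with a hypothetical burned area and emission factor, so the output is illustrative only.

```python
# Bottom-up biomass-burning emission estimate in the generic IPCC form:
# emission = area * fuel_load * combustion_factor * emission_factor.
# The area and emission factor below are illustrative placeholders.

def burning_emission_t(area_m2, fuel_load_kg_m2, combustion_factor, ef_g_kg):
    """Emission (tonnes of a given species) from open field burning."""
    dry_matter_burnt_kg = area_m2 * fuel_load_kg_m2 * combustion_factor
    return dry_matter_burnt_kg * ef_g_kg / 1e6   # g -> tonnes

# Hypothetical: 1,000 ha of pre-harvest burning, PM2.5 emission factor 5 g/kg,
# with the fuel density (0.79 kg/m2) and combustion factor (0.64) quoted above.
area_m2 = 1_000 * 10_000
print(burning_emission_t(area_m2, 0.79, 0.64, 5.0))   # ~25 t
```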

  1. Temporal shifts in top-down vs. bottom-up control of epiphytic algae in a seagrass ecosystem

    Science.gov (United States)

    Whalen, Matthew A.; Duffy, J. Emmett; Grace, James B.

    2013-01-01

    In coastal marine food webs, small invertebrate herbivores (mesograzers) have long been hypothesized to occupy an important position facilitating dominance of habitat-forming macrophytes by grazing competitively superior epiphytic algae. Because of the difficulty of manipulating mesograzers in the field, however, their impacts on community organization have rarely been rigorously documented. Understanding mesograzer impacts has taken on increased urgency in seagrass systems due to declines in seagrasses globally, caused in part by widespread eutrophication favoring seagrass overgrowth by faster-growing algae. Using cage-free field experiments in two seasons (fall and summer), we present experimental confirmation that mesograzer reduction and nutrients can promote blooms of epiphytic algae growing on eelgrass (Zostera marina). In this study, nutrient additions increased epiphytes only in the fall following natural decline of mesograzers. In the summer, experimental mesograzer reduction stimulated a 447% increase in epiphytes, appearing to exacerbate seasonal dieback of eelgrass. Using structural equation modeling, we illuminate the temporal dynamics of complex interactions between macrophytes, mesograzers, and epiphytes in the summer experiment. An unexpected result emerged from investigating the interaction network: drift macroalgae indirectly reduced epiphytes by providing structure for mesograzers, suggesting that the net effect of macroalgae on seagrass depends on macroalgal density. Our results show that mesograzers can control proliferation of epiphytic algae, that top-down and bottom-up forcing are temporally variable, and that the presence of macroalgae can strengthen top-down control of epiphytic algae, potentially contributing to eelgrass persistence.

  2. Top-down versus bottom-up estimates of methane fluxes over the East Siberian Arctic Shelf

    Science.gov (United States)

    Shakhova, N. E.; Semiletov, I. P.; Repina, I.; Salyuk, A.; Kosmach, D.; Chernykh, D.; Aniferov, A.

    2014-12-01

    Global methane (CH4) emissions are currently quantified from statistical data without testing the results against either the distribution of the actual atmospheric CH4 concentrations observed in different parts of the globe or the regional dynamics of these concentrations. Measurement methods, despite having improved remarkably in the past few years, especially with the advent of new optical and satellite-derived methods, remain limited in their applicability in the Arctic. Modeling methodologies are still under development and cannot yet refine the very coarse global-scale understanding of CH4 sources to the resolution of regional-scale emissions. As a result, the contribution of Arctic sources to the global CH4 budget has yet to be quantified adequately. We used a decadal observational data set collected from the water column and from the atmospheric boundary layer (ABL) over the East Siberian Arctic Shelf (ESAS), the largest continental shelf, to determine the minimum source strength required to explain the observed seasonal increase in CH4 concentration in the ABL. The results of top-down modeling performed with a simple box model show good agreement with bottom-up estimates based on the interpretation of in situ calibrated sonar data.
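
    A "simple box model" of the kind mentioned above can be sketched as a single well-mixed boundary-layer box whose CH4 burden is raised by a surface flux and diluted by ventilation with background air. The snippet below backs out a minimum flux from an assumed concentration enhancement; the ABL depth, enhancement, duration and ventilation timescale are placeholders, not the ESAS observations.

```python
def min_source_strength(delta_c_ppb, days, abl_height_m, vent_time_days):
    """Minimum surface CH4 flux (mg m^-2 d^-1) needed to raise the mixing
    ratio of a well-mixed ABL box by delta_c_ppb over the given period,
    while the box is ventilated with background air on a timescale
    vent_time_days. Simple one-box budget; all inputs are placeholders."""
    M_CH4 = 16.04e-3          # kg of CH4 per mol
    R, T, P = 8.314, 263.0, 101325.0
    air_mol_m3 = P / (R * T)  # mol of air per m^3 (ideal gas)
    # 1 ppb CH4 corresponds to 1e-9 mol CH4 per mol of air
    kg_per_m3_per_ppb = 1e-9 * air_mol_m3 * M_CH4
    excess_column_kg_m2 = delta_c_ppb * kg_per_m3_per_ppb * abl_height_m
    # accumulation term plus ventilation loss of the excess
    flux_kg_m2_d = excess_column_kg_m2 / days + excess_column_kg_m2 / vent_time_days
    return flux_kg_m2_d * 1e6  # kg -> mg

# Hypothetical: a 50 ppb seasonal enhancement built over 90 days in a 400 m deep
# ABL that is ventilated on a 5-day timescale.
print(min_source_strength(50.0, 90.0, 400.0, 5.0))
```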

  3. Encouraging the pursuit of advanced degrees in science and engineering: Top-down and bottom-up methodologies

    Science.gov (United States)

    Maddox, Anthony B.; Smith-Maddox, Renee P.; Penick, Benson E.

    1989-01-01

    The MassPEP/NASA Graduate Research Development Program (GRDP), whose objective is to encourage Black Americans, Mexican Americans, American Indians, Puerto Ricans, and Pacific Islanders to pursue graduate degrees in science and engineering, is described. The GRDP employs a top-down, or goal-driven, methodology through five modules which focus on research, graduate school climate, technical writing, standardized examinations, and electronic networking. These modules are designed to develop and reinforce some of the skills necessary to seriously consider the goal of completing a graduate education. The GRDP is a community-based program which seeks to recruit twenty participants from a pool of Boston-area undergraduates enrolled in engineering and science curricula and recent graduates with engineering and science degrees. The program emphasizes that, with sufficient information, its participants can overcome most of the barriers perceived as preventing them from obtaining graduate science and engineering degrees. Experience has shown that the top-down modules may be complemented by a more bottom-up, or event-driven, methodology. This approach considers events in the academic and professional experiences of participants in order to develop the personal and leadership skills necessary for graduate school and similar endeavors.

  4. Pitch and spectral resolution: A systematic comparison of bottom-up cues for top-down repair of degraded speech.

    Science.gov (United States)

    Clarke, Jeanne; Başkent, Deniz; Gaudrain, Etienne

    2016-01-01

    The brain is capable of restoring missing parts of speech, a top-down repair mechanism that enhances speech understanding in noisy environments. This enhancement can be quantified using the phonemic restoration paradigm, i.e., the improvement in intelligibility when silent interruptions of interrupted speech are filled with noise. Benefit from top-down repair of speech differs between cochlear implant (CI) users and normal-hearing (NH) listeners. This difference could be due to poorer spectral resolution and/or weaker pitch cues inherent to CI-transmitted speech. In CIs, these two degradations cannot be teased apart because spectral degradation leads to weaker pitch representation. A vocoding method was developed to evaluate independently the roles of pitch and spectral resolution for restoration in NH individuals. Sentences were resynthesized with different spectral resolutions and with the original pitch cues either retained or entirely discarded. The addition of pitch significantly improved restoration only at six-band spectral resolution. However, the overall intelligibility of interrupted speech improved both with the addition of pitch and with the increase in spectral resolution. This improvement may be due to better discrimination of speech segments from the filler noise, better grouping of speech segments together, and/or better bottom-up cues available in the speech segments. PMID:26827034

  5. Bioenergy decision-making of farms in Northern Finland: Combining the bottom-up and top-down perspectives

    International Nuclear Information System (INIS)

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues to favor large-scale energy visions, additional investment support for agriculture will stay modest. To fully utilize the energy potential of farms, we analyze the farmers' decision-making environment. First, we present an overview of Finnish energy policy and the energy economy and their effects on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided into three groups - investors, entrepreneurs and hobbyists - with different levels of commitment to their energy businesses. This further stresses the importance of obtaining quality business services from numerous service providers.

  6. Bioenergy decision-making of farms in Northern Finland: Combining the bottom-up and top-down perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Snaekin, Juha-Pekka, E-mail: juhapekkasnakin@luukku.co [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland); Muilu, Toivo; Pesola, Tuomo [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland)

    2010-10-15

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues to favor large-scale energy visions, additional investment support for agriculture will stay modest. To fully utilize the energy potential of farms, we analyze the farmers' decision-making environment. First, we present an overview of Finnish energy policy and the energy economy and their effects on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided into three groups - investors, entrepreneurs and hobbyists - with different levels of commitment to their energy businesses. This further stresses the importance of obtaining quality business services from numerous service providers.

  7. Bioenergy decision-making of farms in Northern Finland. Combining the bottom-up and top-down perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Snaekin, Juha-Pekka; Muilu, Toivo; Pesola, Tuomo [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland)

    2010-10-15

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues to favor large-scale energy visions, additional investment support for agriculture will stay modest. To fully utilize the energy potential of farms, we analyze the farmers' decision-making environment. First, we present an overview of Finnish energy policy and the energy economy and their effects on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided into three groups - investors, entrepreneurs and hobbyists - with different levels of commitment to their energy businesses. This further stresses the importance of obtaining quality business services from numerous service providers. (author)

  8. Calculating systems-scale energy efficiency and net energy returns: A bottom-up matrix-based approach

    International Nuclear Information System (INIS)

    In this paper we expand the work of Brandt and Dale (2011) on ERRs (energy return ratios) such as EROI (energy return on investment). The paper describes a "bottom-up" mathematical formulation which uses matrix-based computations adapted from the LCA (life cycle assessment) literature. The framework allows multiple energy pathways and flexible inclusion of non-energy sectors. It is then used to define a variety of ERRs that measure the amount of energy supplied by an energy extraction and processing pathway compared to the amount of energy consumed in producing that energy. ERRs previously defined in the literature are cast in our framework for calculation and comparison. For illustration, the framework is applied to oil production and processing and to the generation of electricity from PV (photovoltaic) systems. Results show that ERR values decline as system boundaries expand to include more processes. NERs (net energy return ratios) tend to be lower than GERs (gross energy return ratios). External energy return ratios (such as net external energy return, or NEER (net external energy ratio)) tend to be higher than their equivalent total energy return ratios. - Highlights: • An improved bottom-up mathematical method for computing net energy return metrics is developed. • Our methodology allows arbitrary numbers of interacting processes acting as an energy system. • Our methodology allows a much more specific and rigorous definition of energy return ratios such as EROI or NER
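
    A minimal sketch of the matrix-based, LCA-style bookkeeping described above is given below: a direct-requirements matrix A is inverted to obtain total requirements, and gross and net energy ratios are formed from delivered versus internally consumed energy. The two-process system and all coefficients are invented for illustration and the ratio definitions are simplified; they are not the paper's actual formulation or data.

```python
import numpy as np

# Two hypothetical processes: crude extraction and refining. A[i, j] is the
# energy input from process i consumed per unit of energy output of process j
# (direct requirements, LCA-style).
A = np.array([[0.00, 0.02],    # crude used per unit crude / per unit fuel
              [0.05, 0.03]])   # refined fuel used per unit crude / per unit fuel

# Total (direct + indirect) requirements per unit of final demand.
L = np.linalg.inv(np.eye(2) - A)

# Final demand: deliver 1 unit of refined fuel to society.
d = np.array([0.0, 1.0])
gross_output = L @ d             # gross output of each process
internal_use = A @ gross_output  # energy consumed inside the energy system

gross_delivered = d.sum()
energy_invested = internal_use.sum()
GER = gross_delivered / energy_invested                      # gross energy ratio
NER = (gross_delivered - energy_invested) / energy_invested  # net energy ratio
print(gross_output, GER, NER)
```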

  9. Integration of bottom-up and top-down models for the energy system. A practical case for Denmark

    International Nuclear Information System (INIS)

    The main objective of the project was to integrate the Danish macroeconomic model ADAM with elements from the energy simulation model BRUS, developed at Risoe. The project was carried out by Risoe National Laboratory with assistance from the Ministry of Finance. A theoretical part focuses on the differences between top-down and bottom-up modelling of the energy-economy interaction; a combined hybrid model seems a relevant alternative to the two traditional approaches. The hybrid model developed is called Hybris and includes models for the supply of electricity and heat, household demand for electricity, and household demand for heat. These three models interact in an iterative procedure with the macroeconomic model ADAM through a number of links. A reference case as well as a number of scenarios illustrating the capabilities of the model have been set up. Hybris is a simulation model capable of analyzing combined CO2 reduction initiatives, such as regulation of the energy supply system and a CO2 tax, in an integrated and consistent way. (au) 32 tabs., 98 ills., 55 refs

  10. Conservative and dissipative force field for simulation of coarse-grained alkane molecules: A bottom-up approach

    International Nuclear Information System (INIS)

    We apply operational procedures available in the literature to the construction of coarse-grained conservative and friction forces for use in dissipative particle dynamics (DPD) simulations. The full procedure relies on a bottom-up approach: large molecular dynamics trajectories of n-pentane and n-decane modeled with an anisotropic united atom model serve as input for the force field generation. As a consequence, the coarse-grained model is expected to reproduce at least semi-quantitatively the structural and dynamical properties of the underlying atomistic model. Two different coarse-graining levels are studied, corresponding to five and ten carbon atoms per DPD bead. The influence of the coarse-graining level on the generated force field contributions, namely, the conservative and the friction parts, is discussed. It is shown that the coarse-grained model of n-pentane correctly reproduces the self-diffusion and viscosity coefficients of real n-pentane, while the fully coarse-grained model for n-decane at ambient temperature over-predicts diffusion by a factor of 2. However, when the n-pentane coarse-grained model is used as a building block for a larger molecule (e.g., n-decane as a two-blob model), much better agreement with experimental data is obtained, suggesting that the force field constructed is transferable to large macromolecular systems.

  11. Hierarchical Image Saliency Detection on Extended CSSD.

    Science.gov (United States)

    Shi, Jianping; Yan, Qiong; Xu, Li; Jia, Jiaya

    2016-04-01

    Complex structures commonly exist in natural images. When an image contains small-scale high-contrast patterns either in the background or the foreground, saliency detection can be adversely affected, resulting in erroneous and non-uniform saliency assignment. This issue poses a fundamental challenge for prior methods. We tackle it from a scale point of view and propose a multi-layer approach to analyze saliency cues. Rather than varying patch sizes or downsizing images, we measure region-based scales. The final saliency values are inferred by optimally combining all the saliency cues at different scales using hierarchical inference. Through our inference model, single-scale information is selected to obtain a saliency map. Our method improves detection quality on many images that traditional methods cannot handle well. We also construct an extended Complex Scene Saliency Dataset (ECSSD) to include complex but general natural images. PMID:26959676
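
    The multi-layer idea above can be loosely illustrated with a toy centre-surround contrast map computed at several scales and averaged. This stand-in ignores the region-based scale measure and hierarchical inference that the paper actually proposes; the scale set is an arbitrary assumption.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def multiscale_saliency(gray, scales=(3, 9, 27)):
    """Toy multi-scale saliency: at each scale, saliency is the absolute
    difference between a pixel and its local mean (centre-surround contrast);
    the maps are averaged across scales and normalised to [0, 1]."""
    gray = gray.astype(float)
    maps = []
    for s in scales:
        surround = uniform_filter(gray, size=s)   # local mean at this scale
        maps.append(np.abs(gray - surround))
    sal = np.mean(maps, axis=0)
    return (sal - sal.min()) / (np.ptp(sal) + 1e-12)

# Example on a synthetic image: a small bright square on a dark background.
img = np.zeros((64, 64))
img[28:36, 28:36] = 1.0
print(multiscale_saliency(img).max())
```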

  12. Assessment of Historic Trend in Mobility and Energy Use in India Transportation Sector Using Bottom-up Approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan; McNeil, Michael A.

    2009-05-01

    Transportation mobility in India has increased significantly in the past decades. From 1970 to 2000, motorized mobility (passenger-km) rose by 888%, compared with 88% population growth (Singh, 2006). This has contributed to many energy and environmental issues, and an energy strategy incorporating efficiency improvements and other measures needs to be designed. Unfortunately, existing energy data do not provide information on the driving forces behind energy use and sometimes show large inconsistencies. Many previous studies address only a single transportation mode, such as passenger road travel, do not include comprehensive data collection or analysis, or lack detail on energy demand by mode and fuel mix. The current study fills a considerable gap in current efforts by developing a database covering all transport modes, including passenger air and water travel and freight, in order to facilitate the development of energy scenarios and assess the significance of technology potential in a global climate change model. An extensive literature review and data collection were carried out to establish the database, with a breakdown of mobility, intensity, distance, and fuel mix for all transportation modes. Energy consumption was estimated and compared with the aggregated transport consumption reported in IEA India transportation energy data. Different scenarios were estimated based on different assumptions about road freight mobility. Based on the bottom-up analysis, we estimated that energy consumption from 1990 to 2000 increased at an annual growth rate of 7% for the mid-range road freight growth case and 12% for the high road freight growth case, corresponding to the scenarios in mobility, while the IEA data show only a 1.7% growth rate over those years.
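
    Bottom-up transport energy accounting of the kind described above multiplies activity (passenger-km or tonne-km) by modal energy intensity and sums over modes. The sketch below shows that bookkeeping with entirely illustrative activity and intensity numbers, not the Indian data assembled in the study.

```python
# Bottom-up transport energy demand: sum over modes of
#   activity (passenger-km or tonne-km) * energy intensity (MJ per pkm/tkm).
# All activity and intensity numbers below are illustrative placeholders.

modes = {
    #  mode             (activity in billion pkm/tkm, intensity in MJ per pkm/tkm)
    "road_passenger": (1000.0, 1.1),
    "rail_passenger": (400.0, 0.2),
    "air_passenger":  (30.0, 2.5),
    "road_freight":   (500.0, 2.0),
    "rail_freight":   (300.0, 0.2),
}

# Convert billion pkm/tkm to pkm/tkm, multiply by MJ per unit, sum, convert MJ -> PJ.
total_pj = sum(act * 1e9 * intensity for act, intensity in modes.values()) / 1e9
print(f"Total transport energy demand: {total_pj:.1f} PJ")
```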

  13. The benefits of China's efforts on gaseous pollutant control indicated by the bottom-up emissions and satellite observation

    Science.gov (United States)

    Xia, Y.; Zhao, Y.

    2015-12-01

    To evaluate the effectiveness of national air pollution control policies, the emissions of SO2, NOX, CO and CO2 in China are estimated with a bottom-up method from 2000 to 2014, and vertical column densities (VCDs) from satellite observation are used to evaluate the inter-annual trends and spatial distribution of emissions and the temporal and spatial patterns of ambient levels of gaseous pollutants across the country. In particular, an additional emission case, the STD case, which incorporates the most recently issued emission standards for specific industrial sources, is developed for 2012-2014. The inter-annual trends in emissions and VCDs match well except for SO2, and the revised emissions in the STD case improve the comparison, implying the benefits of emission control in the most recent years. Satellite retrieval error, underestimation of emission reductions and enhanced atmospheric oxidation account for the differences between the SO2 emission and VCD trends. Coal-fired power plants play a key role in SO2 and NOX emission reduction. As suggested by the VCDs and the emission inventory, the control of CO in the 11th Five-Year Plan (FYP) period was more effective than in the 12th FYP period, while the opposite was true for SO2. With NOX added as a new control target in the 12th FYP, NOX emissions clearly decreased by 4.3 Mt from 2011 to 2014, in contrast to the fast growth before 2011. The inter-annual trends in NO2 VCDs have the poorest correlation with vehicle ownership (R=0.796), due to the staged emission standards for vehicles. In developed regions, transportation has become the main source of pollutant emissions, which we demonstrate by comparing the VCDs of NO2 with those of SO2. Moreover, air quality in megacities has been evaluated based on satellite observations and emissions, and the results indicate that Beijing suffers heavily from emissions originating in Hebei and Tianjin, while local emissions tend to dominate in Shanghai.

  14. Bottom-up engineering of biological systems through standard bricks: a modularity study on basic parts and devices.

    Directory of Open Access Journals (Sweden)

    Lorenzo Pasotti

    BACKGROUND: Modularity is a crucial issue in the engineering world, as it enables engineers to achieve predictable outcomes when different components are interconnected. Synthetic Biology aims to apply key concepts of engineering to design and construct new biological systems that exhibit predictable behaviour. Even though physical and measurement standards have recently been proposed to facilitate the assembly and characterization of biological components, real modularity is still a major research issue. The success of the bottom-up approach strictly depends on a clear definition of the limits within which biological functions are predictable. RESULTS: The modularity of transcription-based biological components was investigated under several conditions. First, the activity of a set of promoters was quantified in Escherichia coli via different measurement systems (i.e., different plasmids, reporter genes, ribosome binding sites) relative to an in vivo reference promoter. Second, promoter activity variation was measured when two independent gene expression cassettes were assembled in the same system. Third, the interchangeability of input modules (a set of constitutive promoters and two regulated promoters) connected to a fixed output device (a logic inverter expressing GFP) was evaluated. The three input modules provide tunable transcriptional signals that drive the output device. If modularity persists, identical transcriptional signals trigger identical GFP outputs. To verify this, all the input devices were individually characterized and then the input-output characteristic of the logic inverter was derived in the different configurations. CONCLUSIONS: Promoter activities (relative to a standard promoter) can vary when they are measured via different reporter devices (up to 22%), when they are used within a two-expression-cassette system (up to 35%), and when they drive another device in a functionally interconnected circuit (up to 44%).

  15. Regime shift from phytoplankton to macrophyte dominance in a large river: Top-down versus bottom-up effects

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Carles, E-mail: carles.ibanez@irta.cat [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alcaraz, Carles; Caiola, Nuno; Rovira, Albert; Trobajo, Rosa [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alonso, Miguel [United Research Services S.L., Urgell 143, 08036 Barcelona, Catalonia (Spain); Duran, Concha [Confederacion Hidrografica del Ebro, Sagasta 24-26, 50071 Zaragoza, Aragon (Spain); Jimenez, Pere J. [Grup Natura Freixe, Major 56, 43750 Flix, Catalonia (Spain); Munne, Antoni [Agencia Catalana de l' Aigua, Provenca 204-208, 08036 Barcelona, Catalonia (Spain); Prat, Narcis [Departament d' Ecologia, Universitat de Barcelona, Diagonal 645, 08028 Barcelona Catalonia (Spain)

    2012-02-01

    The lower Ebro River (Catalonia, Spain) has recently undergone a regime shift from a phytoplankton-dominated to a macrophyte-dominated system. This shift is well known in shallow lakes, but apparently it has never been documented in rivers. Two initial hypotheses to explain the collapse of the phytoplankton were considered: (a) the diminution of nutrients (bottom-up); (b) the filtering effect of the colonizing zebra mussel (top-down). Data on water quality, hydrology and biological communities (phytoplankton, macrophytes and zebra mussel) were obtained both from existing data sets and from new surveys. Results clearly indicate that the decrease in phosphorus is the main cause of a dramatic decrease in chlorophyll and a large increase in water transparency, triggering the subsequent colonization of the river bed by macrophytes. A Generalized Linear Model analysis showed that the decrease in dissolved phosphorus had a relative importance 14 times higher than the increase in zebra mussel density in explaining the variation in total chlorophyll. We suggest that the described changes in the lower Ebro River can be considered a novel ecosystem shift. This shift is triggering remarkable changes in the biological communities beyond the decrease of phytoplankton and the proliferation of macrophytes, such as massive colonization by Simuliidae (black fly) and other changes in the benthic invertebrate communities that are currently being investigated. - Highlights: • We show a regime shift in a large river from phytoplankton to macrophyte dominance. • Two main hypotheses are considered: nutrient decrease and zebra mussel grazing. • Phosphorus depletion is found to be the main cause of the phytoplankton decline. • We conclude that oligotrophication triggered the colonization of macrophytes. • This new regime shift in a river is similar to that described in shallow lakes.

  16. Motivation and drives in bottom-up developments in natural hazards management: multiple-use of adaptation strategies in Austria

    Science.gov (United States)

    Thaler, Thomas; Fuchs, Sven

    2015-04-01

    Losses from extreme hydrological events, such as those recently experienced in Europe, have focused the attention of policymakers as well as researchers on vulnerability to natural hazards. In parallel, the context of changing flood risks under climate and societal change is driving a transformation in the role of the state in responsibility sharing and in individual responsibilities for risk management and precaution. The new policy agenda enhances the responsibilities of local authorities and private citizens in hazard management and reduces the role of central governments; the objective is to place added responsibility on local organisations and citizens to determine locally based strategies for risk reduction. A major challenge in modelling adaptation is to represent the complexity of coupled human-environmental systems, and particularly the feedback loops between environmental dynamics and human decision-making processes at different scales. This paper focuses on bottom-up initiatives in flood risk management, which are, by definition, different from the mainstream. These initiatives are clearly influenced (positively or negatively) by a number of factors, and the combination of these interdependences can create specific conditions that alter the opportunity for effective governance arrangements in a local scheme approach. In total, this study identified six general drivers which encourage the implementation of flood storage, such as a direct relation to recent major flood frequency and history, the initiative of individual stakeholders (promoters), political pressure from outside (e.g. business companies, private households) and a strong attitude of solidarity among the municipalities and stakeholders involved. Although a partnership approach may be seen as an 'optimal' solution for flood risk management, in practice there are many limitations and barriers to establishing these collaborations and making them effective, especially in the long term.

  17. Independent effects of bottom-up temporal expectancy and top-down spatial attention. An audiovisual study using rhythmic cueing.

    Directory of Open Access Journals (Sweden)

    Alexander eJones

    2015-01-01

    Selective attention to a spatial location has been shown to enhance perception and facilitate behaviour for events at attended locations. However, selection relies not only on where but also on when an event occurs. Recently, interest has turned to how intrinsic neural oscillations in the brain entrain to rhythms in our environment, and stimuli appearing in or out of sync with a rhythm have been shown to modulate perception and performance. Temporal expectations created by rhythms and spatial attention are two processes which have independently been shown to affect stimulus processing, but it remains largely unknown whether, and how, they interact. In four separate tasks, this study investigated the effects of voluntary spatial attention and bottom-up temporal expectations created by rhythms in both unimodal and crossmodal conditions. In each task the participant used an informative cue, either colour or pitch, to direct their covert spatial attention to the left or right, and responded as quickly as possible to a target. The lateralized target (visual or auditory) was then presented on the attended or unattended side. Importantly, although not task relevant, the cue was a rhythm of either flashes or beeps. The target was presented in or out of sync (early or late) with the rhythmic cue. The results showed that participants were faster responding to spatially attended compared to unattended targets in all tasks. Moreover, there was an effect of rhythmic cueing upon response times in both unimodal and crossmodal conditions. Responses were faster to targets presented in sync with the rhythm compared to when they appeared too early in both crossmodal tasks. That is, rhythmic stimuli in one modality influenced temporal expectancy in the other modality, suggesting that temporal expectancies created by rhythms are crossmodal. Interestingly, there was no interaction between top-down spatial attention and rhythmic cueing in any task, suggesting that these two processes influenced performance largely independently.

  18. Assisted editing of SensorML with EDI. A bottom-up scenario towards the definition of sensor profiles.

    Science.gov (United States)

    Oggioni, Alessandro; Tagliolato, Paolo; Fugazza, Cristiano; Bastianini, Mauro; Pavesi, Fabio; Pepe, Monica; Menegon, Stefano; Basoni, Anna; Carrara, Paola

    2015-04-01

    A by-product of this ongoing work is a growing archive of predefined sensor descriptions. This information is being collected in order to further ease metadata creation in the next phase of the project. Users will be able to choose among a number of sensor and sensor platform prototypes; these will be specific instances on which it will be possible to define, in a bottom-up approach, "sensor profiles". We report on the outcome of this activity.

  19. Research on ethics in two large Human Biomonitoring projects ECNIS and NewGeneris: a bottom up approach.

    Science.gov (United States)

    Dumez, Birgit; Van Damme, Karel; Casteleyn, Ludwine

    2008-01-01

    Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research involving human subjects. Ethical reference values are often extrapolated from clinical settings, where the emphasis lies on decisional autonomy and the protection of individual privacy. The question arises whether this set of values used in clinical research can be considered a relevant reference for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectations on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against the public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in the ethical assessment of similar research activities between different countries and even within one country. All this causes delay and puts researchers in situations in which it is unclear how to act in accordance with the necessary legal requirements. Therefore, an analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed with a bottom-up approach, based on a methodology for comparative analysis of the determinants of ethical reasoning, allowing different social, cultural, political and historical traditions to be taken into account, with a view to safeguarding common EU values. Based on information collected in its real-life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly explore possible obstacles and to identify options for improving regulation. The material collected will allow the development of an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach.

  20. Research on ethics in two large Human Biomonitoring projects ECNIS and NewGeneris: a bottom up approach

    Directory of Open Access Journals (Sweden)

    Casteleyn Ludwine

    2008-01-01

    Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research involving human subjects. Ethical reference values are often extrapolated from clinical settings, where the emphasis lies on decisional autonomy and the protection of individual privacy. The question arises whether this set of values used in clinical research can be considered a relevant reference for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectations on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against the public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in the ethical assessment of similar research activities between different countries and even within one country. All this causes delay and puts researchers in situations in which it is unclear how to act in accordance with the necessary legal requirements. Therefore, an analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed with a bottom-up approach, based on a methodology for comparative analysis of the determinants of ethical reasoning, allowing different social, cultural, political and historical traditions to be taken into account, with a view to safeguarding common EU values. Based on information collected in its real-life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly explore possible obstacles and to identify options for improving regulation. The material collected will allow the development of an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach.

  1. Tax Complexity, Tax Salience and Tax Politics

    OpenAIRE

    Mumford, Ann

    2015-01-01

    This article considers the implications of the tax salience literature for the United Kingdom. First, the different categories and definitions of tax salience that have developed in the literature are reviewed, and some of the prescriptive implications of these terms are introduced. Tax salience refers, essentially, to the capacity of taxpayers to understand legislation. Thus, the potential reasons behind tax complexity, and the potential beneficiaries of it, are addressed. The article consid...

  2. Tax Salience, Voting, and Deliberation

    DEFF Research Database (Denmark)

    Sausgruber, Rupert; Tyran, Jean-Robert

    Tax incentives can be more or less salient, i.e. noticeable or cognitively easy to process. Our hypothesis is that taxes on consumers are more salient to consumers than equivalent taxes on sellers because consumers underestimate the extent of tax shifting in the market. We show that tax salience biases consumers' voting on tax regimes, and that experience is an effective de-biasing mechanism in the experimental laboratory. Pre-vote deliberation makes initially held opinions more extreme rather than correct and does not eliminate the bias in the typical committee. Yet, if voters can discuss their experience with the tax regimes, they are less likely to be biased.

  3. Stakeholder salience in humanitarian supply chain management

    OpenAIRE

    Schiffling, Sarah

    2013-01-01

    Mitchell et al. (1997) developed a framework for assessing the salience of stakeholder groups based on their power, urgency and the legitimacy of their claim. This framework is applied here to illustrate the complexities of stakeholder interactions in humanitarian supply chains and to provide insights for their management and for further research. Keywords: Supply chain management, Humanitarian logistics, Stakeholder salience

  4. Energy-environment policy modeling of endogenous technological change with personal vehicles. Combining top-down and bottom-up methods

    International Nuclear Information System (INIS)

    The transportation sector offers substantial potential for greenhouse gas (GHG) emission abatement, but widely divergent cost estimates complicate policy making; energy-economy policy modelers apply top-down and bottom-up cost definitions and different assumptions about future technologies and the preferences of firms and households. Our hybrid energy-economy policy model is technology-rich, like a bottom-up model, but has empirically estimated behavioral parameters for risk and technology preferences, like a top-down model. Unlike typical top-down models, however, it simulates technological change endogenously with functions that relate the financial costs of technologies to cumulative production and adjust technology preferences as market shares change. We apply it to the choice of personal vehicles to indicate, first, the effect on cost estimates of divergent cost definitions and, second, the possible response to policies that require a minimum market share for low emission vehicles
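
    Hybrid models of this kind typically allocate market shares among competing vehicle technologies with a logit-like function of perceived life-cycle cost, so that cheaper options win larger but not exclusive shares. The sketch below shows that general behavioural form; the cost figures and heterogeneity parameter are placeholders, not the empirically estimated parameters of the paper's model.

```python
import numpy as np

def market_shares(life_cycle_costs, v=8.0):
    """Allocate new-vehicle market shares among competing technologies with a
    logit-like function: a lower perceived life-cycle cost wins a larger, but
    not exclusive, share. The exponent v controls market heterogeneity.
    A sketch of the general behavioural form, not any estimated model."""
    lcc = np.asarray(life_cycle_costs, dtype=float)
    weights = lcc ** (-v)
    return weights / weights.sum()

# Hypothetical perceived annualised costs ($/yr) for conventional, hybrid and
# electric vehicles, including illustrative intangible cost premiums.
print(market_shares([4000.0, 4400.0, 5200.0]))
```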

  5. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    OpenAIRE

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Background: Sustaining integrated care is difficult, in large part because of problems encountered in securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose: To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results: In the first step, a diagnostic study was conducted with face-to-face interviews to g...

  6. Diagnostic, design and implementation of an integrated model of care in France: a bottom-up process with a continuous leadership

    OpenAIRE

    de Stampa, Matthieu; Vedel, Isabelle; Mauriat, Claire; Bagaragaza, Emmanuel; Routelous, Christelle; Bergman, Howard; Lapointe, Liette; Cassou, Bernard; Ankri, Joel; Henrard, Jean-Claude

    2010-01-01

    Purpose To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Context Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Case description In the first step, a diagnostic study was conducted with face-to-face interviews ...

  7. Ecology of phasmids (phasmatodea) in a moist neotropical forest: a study on life history, host-range and bottom-up versus top-down regulation

    OpenAIRE

    Berger, Jürgen

    2004-01-01

    Herbivory is discussed as a key agent in maintaining dynamics and stability of tropical forested ecosystems. Accordingly increasing attention has been paid to the factors that structure tropical herbivore communities. The aim of this study was (1) to describe diversity, density, distribution and host range of the phasmid community (Phasmatodea) of a moist neotropical forest in Panamá, and (2) to experimentally assess bottom-up and top-down factors that may regulate populations of the phasmid ...

  8. Top-down and bottom-up lipidomic analysis of rabbit lipoproteins under different metabolic conditions using flow field-flow fractionation, nanoflow liquid chromatography and mass spectrometry.

    Science.gov (United States)

    Byeon, Seul Kee; Kim, Jin Yong; Lee, Ju Yong; Chung, Bong Chul; Seo, Hong Seog; Moon, Myeong Hee

    2015-07-31

    This study demonstrated the performance of top-down and bottom-up approaches in the lipidomic analysis of lipoproteins from rabbits raised under different metabolic conditions: healthy controls, carrageenan-induced inflammation, dehydration, high cholesterol (HC) diet, and highest cholesterol diet with inflammation (HCI). In the bottom-up approach, the high density lipoproteins (HDL) and low density lipoproteins (LDL) were size-sorted and collected on a semi-preparative scale using multiplexed hollow fiber flow field-flow fractionation (MxHF5), followed by nanoflow liquid chromatography-ESI-MS/MS (nLC-ESI-MS/MS) analysis of the lipids extracted from each lipoprotein fraction. In the top-down method, size-fractionated lipoproteins were directly infused into the MS for quantitative analysis of targeted lipids using chip-type asymmetrical flow field-flow fractionation-electrospray ionization-tandem mass spectrometry (cAF4-ESI-MS/MS) in selected reaction monitoring (SRM) mode. The comprehensive bottom-up analysis yielded 122 and 104 lipids from HDL and LDL, respectively. Rabbits in the HC and HCI groups had lipid patterns that contrasted most strongly with those of the controls, suggesting that an HC diet significantly alters the lipid composition of lipoproteins. Among the identified lipids, 20 lipid species that exhibited large differences (>10-fold) were selected as targets for the top-down quantitative analysis in order to compare the results with those from the bottom-up method. Statistical comparison of the two methods revealed that the results were not significantly different for most of the selected species, except for those with only small differences in concentration between groups. The current study demonstrates that top-down lipid analysis using cAF4-ESI-MS/MS is a powerful, high-speed analytical platform for targeted lipidomic analysis that does not require the extraction of lipids from blood samples. PMID:26087967

  9. "Disorganized in time": Impact of bottom-up and top-down negative emotion generation on memory formation among healthy and traumatized adolescents.

    OpenAIRE

    Guillery-Girard, Bérengère; Clochon, Patrice; Giffard, Bénédicte; Viard, Armelle; Egler, Pierre-Jean; Baleyte, Jean-Marc; Eustache, Francis; Dayan, Jacques

    2013-01-01

    "Travelling in time," a central feature of episodic memory, is severely affected among individuals with Post Traumatic Stress Disorder (PTSD), with two opposite effects: vivid traumatic memories are unorganized in temporality (bottom-up processes), while non-traumatic personal memories tend to lack spatio-temporal details and false recognitions occur more frequently than in the general population (top-down processes). To test the effect of these two types of processes (i.e. ...

  10. Synthesis of a Cementitious Material Nanocement Using Bottom-Up Nanotechnology Concept: An Alternative Approach to Avoid CO2 Emission during Production of Cement

    OpenAIRE

    Byung Wan Jo; Sumit Chakraborty; Kwang Won Yoon

    2014-01-01

    The world increasingly needs smart and sustainable construction materials that generate minimal climate-changing gases during their production. Bottom-up nanotechnology has established itself as a promising alternative technique for the production of cementitious materials. The present investigation deals with the chemical synthesis of a cementitious material using nanosilica, sodium aluminate, sodium hydroxide, and calcium nitrate as reacting phases. The characteristic ...

  11. Regional principal color based saliency detection.

    Science.gov (United States)

    Lou, Jing; Ren, Mingwu; Wang, Huan

    2014-01-01

    Saliency detection is widely used in many visual applications such as image segmentation, object recognition and classification. In this paper, we introduce a new method to detect salient objects in natural images. The approach is based on a regional principal color contrast model, which incorporates low-level and medium-level visual cues. The method combines a simple computation of color features with two categories of spatial relationships to produce a saliency map, achieving higher F-measure rates. At the same time, we present an interpolation approach to evaluate the resulting curves, and analyze parameter selection. Our method enables the effective computation of saliency for images of arbitrary resolution. Experimental results on a saliency database show that our approach produces high-quality saliency maps and performs favorably against ten saliency detection algorithms. PMID:25379960
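
    As an illustration of the general regional color-contrast idea (not the paper's exact model), the sketch below compares each region's principal (here, mean) color with that of every other region, weighting nearby regions more heavily; the Gaussian spatial weighting, parameter values and data layout are assumptions.

        import numpy as np

        def regional_color_saliency(region_colors, region_centers, region_sizes, sigma=0.4):
            """Toy regional color-contrast saliency.

            region_colors : (N, 3) principal (or mean) Lab color per region
            region_centers: (N, 2) normalized (x, y) centroid per region
            region_sizes  : (N,) pixel count per region
            Returns one saliency value per region, scaled to [0, 1].
            """
            n = len(region_colors)
            sal = np.zeros(n)
            for i in range(n):
                color_dist = np.linalg.norm(region_colors - region_colors[i], axis=1)
                spatial_dist = np.linalg.norm(region_centers - region_centers[i], axis=1)
                # nearby, large regions with different colors contribute most to contrast
                weights = region_sizes * np.exp(-spatial_dist**2 / (2 * sigma**2))
                sal[i] = np.sum(weights * color_dist)
            return (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)

        rng = np.random.default_rng(0)
        print(regional_color_saliency(rng.random((50, 3)), rng.random((50, 2)),
                                      rng.integers(50, 500, 50))[:5])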

  12. Influence of the adding bottom-up flow rate to the characteristic of the cooling system on TRIGA 2000 Bandung reactor core

    International Nuclear Information System (INIS)

    Heat generated from the fission reaction heats up the cladding of the fuel elements. For this reason, the fluid used as the primary coolant in the reactor tank must have good conductivity. This research compares the performance of the natural-convection cooling system with that of a forced-convection cooling system in which a bottom-up flow is injected into the cylindrical reactor core. The results show that forced convection with the added spray pipe performs better than natural convection. This is indicated by a decrease in the maximum temperature at the top of the reactor core from 88.55 °C to 47.35 °C after the bottom-up flow was added. It can be concluded that adding the bottom-up spray flow improves the performance of the cooling system and reduces bubble formation. (author)

  13. Modeling technical change in energy system analysis: analyzing the introduction of learning-by-doing in bottom-up energy models

    International Nuclear Information System (INIS)

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aimed at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research and embeds important policy implications, not least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options, which is absent in many top-down models, they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they often fail to capture strategic technology diffusion behavior in the energy sector as well as the energy sector's endogenous responses to policy, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). Some suggestions on how innovation and diffusion modeling in bottom-up analysis can be improved are put forward.
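
    The learning curves referred to above are usually one-factor experience curves: each doubling of cumulative capacity multiplies the unit cost by a fixed progress ratio. A minimal sketch with hypothetical numbers (the functional form is standard; the values are not from the paper):

        import math

        def learning_curve_cost(c0, cum_capacity, cum_capacity0, progress_ratio):
            """One-factor learning curve: unit cost after cumulative capacity grows
            from cum_capacity0 to cum_capacity, given an initial cost c0 and a
            progress ratio (e.g. 0.85 means a 15% cost drop per doubling)."""
            b = -math.log(progress_ratio, 2)   # learning index
            return c0 * (cum_capacity / cum_capacity0) ** (-b)

        # a technology at 1000 USD/kW with a 0.85 progress ratio, after its
        # cumulative capacity grows 8-fold (three doublings): 1000 * 0.85**3
        print(learning_curve_cost(1000.0, 8.0, 1.0, 0.85))   # ~614 USD/kW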

  14. Will ESD reporting using bottom-up energy savings calculations be a nightmare or the next step in a better understanding of national energy savings?

    Energy Technology Data Exchange (ETDEWEB)

    Vreuls, Harry; Both, Dick (SenterNovem, The Hague (Netherlands)); Thomas, Stefan (Wuppertal Institute for Climate Environment Energy, Wuppertal (Germany)); Broc, Jean-Sebastien (Ecole des Mines de Nantes (France))

    2009-07-01

    The ESD (Energy Services Directive) requires that EU Member States increase their use of bottom-up energy savings calculations to report on the results of their energy efficiency policies. To make the results more comparable across Member States, harmonised methods should be developed and improved. The first experiences with this harmonisation process from the EMEEES project are presented in this paper. It starts with an introduction of the areas that the harmonisation could address: the policies and measures, the individual appliances and installations, and the aggregation level of a building, a company or an organisation. Each of these has its own characteristics and complexity. Some case applications (Voluntary Agreements, Energy Audits, Boilers, and the building envelope of existing buildings) of bottom-up energy savings calculations are presented to illustrate this. But if harmonisation were to be realised for all these levels and economic sectors (industry, agriculture, transport, commercial and non-commercial services and households), it would result in thousands of pages of instructions. This would be a nightmare, but is there another way to reach improved harmonisation? The paper argues which key elements the harmonisation should concentrate on: a general structure for documenting bottom-up energy savings, the selection of baseline and baseline parameters, and a dynamic approach to ensure improvement over time.
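
    For orientation, a generic unitary bottom-up savings calculation (not the harmonised EMEEES method itself) multiplies the number of participating units by a unitary saving and applies gross-to-net corrections; the correction factors and figures below are illustrative assumptions.

        def bottom_up_savings(n_units, baseline_use_kwh, efficient_use_kwh,
                              free_rider_share=0.0, double_counting_factor=1.0):
            """Generic bottom-up energy savings for one measure: participating
            units x unitary saving, corrected for free riders and double counting."""
            gross = n_units * (baseline_use_kwh - efficient_use_kwh)
            return gross * (1.0 - free_rider_share) * double_counting_factor

        # e.g. 50,000 boilers saving 1,800 kWh/yr each, with 10% free riders
        print(bottom_up_savings(50_000, 20_000, 18_200, free_rider_share=0.10))  # 81,000,000 kWh/yr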

  15. Modeling Technical Change in Energy System Analysis: Analyzing the Introduction of Learning-by-Doing in Bottom-up Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Berglund, Christer; Soederholm, Patrik [Luleaa Univ. of Technology (Sweden). Div. of Economics

    2005-02-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aiming at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research and embeds important policy implications, not least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options, which is absent in many top-down models, they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they fail to capture strategic technology diffusion behavior in the energy sector, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). For these reasons, bottom-up and top-down models with induced technical change should not be viewed as substitutes but rather as complements.

  16. Application of the 'bottom up' approach for the predictive modeling of sorption isotherms on Hungarian Boda clay

    International Nuclear Information System (INIS)

    Document available in extended abstract form only. Argillaceous rocks are being viewed with continuing interest in many waste management programmes as suitable host formations for the deep geological disposal of radioactive waste: Opalinus Clay, Switzerland; Boom and Ypresian clays, Belgium; Callovo-Oxfordian clay, France. One of the options for the disposal of radioactive waste in Hungary is storage in the Boda Claystone Formation (BCF). Clay minerals such as illite, smectite, illite/smectite mixed layers and kaolinite are important components of such rock types and can often make up 50 or more wt.% of the total mass. Predicting the fate and transport of radionuclides (RNs) in the repository near- and far-fields is a key research issue in many radioactive waste management programmes, and is one of the main pillars upon which the safety cases for deep geological radioactive waste repositories are built. A broad variety of sorption models, i.e. empirical and mechanistic, have been developed over the past few decades to describe the interaction of RNs at the clay-water interface over a wide range of conditions. Sorption edges and isotherms were measured for a large number of radionuclides with valences from II to VI on illite and montmorillonite and could be very well described by a relatively simple model, the 2-site protolysis non-electrostatic surface complexation and cation exchange (2SPNE SC/CE) sorption model. Cs sorption on illite was modelled in the past and was further developed into a generalised Cs sorption model. Sorption in most natural argillaceous rocks is inherently too complex and multi-faceted to be directly understood in terms of mechanisms and their associated parameters. The so-called 'bottom up' approach is based on the hypothesis that the uptake of RNs in complex mineral/groundwater systems can be quantitatively predicted from the understanding of the sorption processes on single minerals, and the models developed to describe
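
    A minimal sketch of the component-additivity reasoning behind such a 'bottom up' approach, assuming hypothetical mineral fractions and single-mineral distribution ratios; this is only the additivity step, not the 2SPNE SC/CE model, which additionally resolves surface complexation and cation exchange reactions.

        def bottom_up_rd(mineral_fractions, single_mineral_rd):
            """Component-additivity estimate of a whole-rock distribution ratio
            Rd (L/kg): single-mineral Rd values weighted by mineral mass fractions,
            assuming the minerals sorb independently in the same porewater."""
            return sum(f * single_mineral_rd[m] for m, f in mineral_fractions.items())

        # hypothetical claystone composition and single-mineral Rd values for Cs
        fractions = {"illite": 0.30, "smectite": 0.20, "kaolinite": 0.10, "quartz": 0.40}
        rd_single = {"illite": 5000.0, "smectite": 1200.0, "kaolinite": 80.0, "quartz": 1.0}
        print(bottom_up_rd(fractions, rd_single))   # ~1748 L/kg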

  17. Cannabinoid modulation of functional connectivity within regions processing attentional salience.

    Science.gov (United States)

    Bhattacharyya, Sagnik; Falkenberg, Irina; Martin-Santos, Rocio; Atakan, Zerrin; Crippa, Jose A; Giampietro, Vincent; Brammer, Mick; McGuire, Philip

    2015-05-01

    There is now considerable evidence to support the hypothesis that psychotic symptoms are the result of abnormal salience attribution, and that the attribution of salience is largely mediated through the prefrontal cortex, the striatum, and the hippocampus. Although these areas show differential activation under the influence of delta-9-tetrahydrocannabinol (delta-9-THC) and cannabidiol (CBD), the two major derivatives of cannabis sativa, little is known about the effects of these cannabinoids on the functional connectivity between these regions. We investigated this in healthy occasional cannabis users by employing event-related functional magnetic resonance imaging (fMRI) following oral administration of delta-9-THC, CBD, or a placebo capsule. Employing a seed cluster-based functional connectivity analysis that involved using the average time series from each seed cluster for a whole-brain correlational analysis, we investigated the effect of drug condition on functional connectivity between the seed clusters and the rest of the brain during an oddball salience processing task. Relative to the placebo condition, delta-9-THC and CBD had opposite effects on the functional connectivity between the dorsal striatum, the prefrontal cortex, and the hippocampus. Delta-9-THC reduced fronto-striatal connectivity, which was related to its effect on task performance, whereas this connection was enhanced by CBD. Conversely, mediotemporal-prefrontal connectivity was enhanced by delta-9-THC and reduced by CBD. Our results suggest that the functional integration of brain regions involved in salience processing is differentially modulated by single doses of delta-9-THC and CBD and that this relates to the processing of salient stimuli. PMID:25249057
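
    A minimal sketch of a generic seed-cluster functional connectivity analysis of the kind described (average the seed time series, then correlate it with every voxel); the array shapes and names are assumptions, and the study's preprocessing and group statistics are not shown.

        import numpy as np

        def seed_connectivity(bold, seed_mask):
            """Seed-based functional connectivity.

            bold      : (T, V) array, T time points x V voxels
            seed_mask : (V,) boolean mask of the seed cluster
            Returns a (V,) vector of Pearson correlations between the average
            seed time series and every voxel's time series.
            """
            seed_ts = bold[:, seed_mask].mean(axis=1)
            seed_z = (seed_ts - seed_ts.mean()) / seed_ts.std()
            bold_z = (bold - bold.mean(axis=0)) / bold.std(axis=0)
            return bold_z.T @ seed_z / len(seed_ts)

        # toy data: 200 time points, 500 voxels, first 20 voxels form the seed
        rng = np.random.default_rng(0)
        conn = seed_connectivity(rng.standard_normal((200, 500)), np.arange(500) < 20)
        print(conn.shape)   # (500,)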

  18. Assessing the construct validity of aberrant salience

    Directory of Open Access Journals (Sweden)

    Kristin Schmidt

    2009-12-01

    We sought to validate the psychometric properties of a recently developed paradigm that aims to measure the salience attribution processes proposed to contribute to positive psychotic symptoms, the Salience Attribution Test (SAT). The "aberrant salience" measure from the SAT showed good face validity in previous results, with elevated scores both in high-schizotypy individuals and in patients with schizophrenia suffering from delusions. Exploring the construct validity of salience attribution variables derived from the SAT is important, since other factors, including latent inhibition/learned irrelevance, attention, probabilistic reward learning, sensitivity to probability, general cognitive ability and working memory, could influence these measures. Fifty healthy participants completed schizotypy scales, the SAT, a learned irrelevance task, and a number of other cognitive tasks tapping into potentially confounding processes. Behavioural measures of interest from each task were entered into a principal components analysis, which yielded a five-factor structure accounting for ~75% of the variance in behaviour. Implicit aberrant salience was found to load onto its own factor, which was associated with elevated "Introvertive Anhedonia" schizotypy, replicating our previous finding. Learned irrelevance loaded onto a separate factor, which also included implicit adaptive salience, but was not associated with schizotypy. Explicit adaptive and aberrant salience, along with a measure of probabilistic learning, loaded onto a further factor, though this also did not correlate with schizotypy. These results suggest that the measures of learned irrelevance and implicit adaptive salience might be based on similar underlying processes, which are dissociable both from implicit aberrant salience and from explicit measures of salience.
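
    A minimal sketch of the principal components step described above, using simulated rather than real behavioural measures; the study's exact variable set and any factor rotation are not reproduced here.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.preprocessing import StandardScaler

        # rows = participants, columns = behavioural measures (e.g. implicit and
        # explicit aberrant/adaptive salience, learned irrelevance, working
        # memory, ...); the values here are simulated stand-ins.
        rng = np.random.default_rng(1)
        measures = rng.standard_normal((50, 10))

        z = StandardScaler().fit_transform(measures)
        pca = PCA(n_components=5)
        scores = pca.fit_transform(z)

        print(pca.explained_variance_ratio_.sum())  # proportion of variance retained
        print(pca.components_.shape)                # (5, 10) loadings of measures on factors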

  19. Isolating the Incentive Salience of Reward-Associated Stimuli: Value, Choice, and Persistence

    Science.gov (United States)

    Beckmann, Joshua S.; Chow, Jonathan J.

    2015-01-01

    Sign- and goal-tracking are differentially associated with drug abuse-related behavior. Recently, it has been hypothesized that sign- and goal-tracking behavior are mediated by different neurobehavioral valuation systems, including differential incentive salience attribution. Herein, we used different conditioned stimuli to preferentially elicit…

  20. Exploring the Life Expectancy Increase in Poland in the Context of CVD Mortality Fall: The Risk Assessment Bottom-Up Approach, From Health Outcome to Policies.

    Science.gov (United States)

    Kobza, Joanna; Geremek, Mariusz

    2015-01-01

    Life expectancy at birth is considered the best mortality-based summary indicator of the health status of a population and is useful for measuring long-term health changes. The objective of this article was to present the concept of the bottom-up policy risk assessment approach, developed to identify challenges involved in analyzing risk factor reduction policies and in assessing how the related health indicators have changed over time. This article focuses on the reasons for the significant increase in life expectancy in Poland over the past 2 decades, and thus includes the policy context. The methodology details a bottom-up risk assessment approach, a chain of relations between the health outcome, risk factors, and health policy, based on the Risk Assessment From Policy to Impact Dimension project guidance. A decline in cardiovascular disease mortality was a key factor underlying the increase in life expectancy. Among the basic factors, tobacco and alcohol consumption, diet, physical activity, and new treatment technologies were identified. Poor health outcomes of the Polish population at the beginning of the 1990s highlighted the need for the implementation of various health promotion programs, legal acts, and more effective public health policies. Evidence-based public health policy requires translating scientific research into policy and practice. The bottom-up case study template can be one of the focal tools in this process. Accountability for the health impact of policies and programs and legitimization of the decisions of policy makers have become key questions in European countries' decision-making processes and in EU public health strategy. PMID:26546595

  1. Influence of top-down and bottom-up manipulations on the R-BT065 subcluster of Betaproteobacteria, an abundant group in bacterioplankton of a freshwater reservoir

    Czech Academy of Sciences Publication Activity Database

    Šimek, Karel; Horňák, Karel; Jezbera, Jan; Mašín, Michal; Nedoma, Jiří; Gasol, J. M.; Schauer, M.

    2005-01-01

    Roč. 71, č. 5 (2005), s. 2381-2390. ISSN 0099-2240 R&D Projects: GA ČR(CZ) GA206/05/0007; GA ČR(CZ) GA206/02/0003 Grant ostatní: CSIC(ES) DGICYT REN2001-2120/MAR; EU(XE) EVK3-CT-2002-00078; Austrian Science Foundation(AT) P15655 Institutional research plan: CEZ:AV0Z60170517 Keywords: reservoir * top-down and bottom-up control * microbial food webs * bacterivory * bacterial community composition Subject RIV: EE - Microbiology, Virology Impact factor: 3.818, year: 2005

  2. Language-experience plasticity in neural representation of changes in pitch salience.

    Science.gov (United States)

    Krishnan, Ananthanarayan; Gandour, Jackson T; Suresh, Chandan H

    2016-04-15

    Neural representation of pitch-relevant information at the brainstem and cortical levels of processing is influenced by language experience. A well-known attribute of pitch is its salience. Brainstem frequency-following responses and cortical pitch-specific responses, recorded concurrently, were elicited by a pitch salience continuum spanning weak to strong pitch of a dynamic, iterated rippled noise pitch contour (a homolog of a Mandarin tone). Our aims were to assess how language experience (Chinese, English) affects (i) enhancement of neural activity associated with pitch salience at brainstem and cortical levels, (ii) the presence of asymmetry in cortical pitch representation, and (iii) patterns of relative changes in magnitude along the pitch salience continuum. Peak latency (Fz: Na, Pb, and Nb) was shorter in the Chinese than the English group across the continuum. Peak-to-peak amplitude (Fz: Na-Pb, Pb-Nb) of the Chinese group grew larger with increasing pitch salience, but an experience-dependent advantage was limited to the Na-Pb component. At temporal sites (T7/T8), the larger amplitude of the Chinese group across the continuum was limited both to the Na-Pb component and to the right temporal site. At the brainstem level, F0 magnitude increases with pitch salience, and it too reveals an advantage for the Chinese group. A direct comparison of cortical and brainstem responses for the Chinese group reveals different patterns of relative changes in magnitude along the pitch salience continuum. Such differences may point to a transformation in pitch processing at the cortical level, presumably mediated by local sensory and/or extrasensory influence overlaid on the brainstem output. PMID:26903418

  3. Quantifying object salience by equating distractor effects

    OpenAIRE

    Huang, L Q; Pashler, Harold

    2005-01-01

    It is commonly believed that objects viewed in certain contexts may be more or less salient. Measurements of salience have usually relied on asking observers "How much does this object stand out against the background?". In this study, we measured the salience of objects by assessing the distraction they produce for subjects searching for a different, pre-specified target. Distraction was measured through response times, but changes in response times were not assumed to be a linear measure of...

  4. Formal Modelling of Salience and Cognitive Load

    OpenAIRE

    Rukšenas, R.; Curzon, P.; Back, J.; Blandford, A.

    2008-01-01

    Well-designed interfaces use procedural and sensory cues to increase the salience of appropriate actions and intentions. However, empirical studies suggest that cognitive load can influence the strength of procedural and sensory cues. We formalise the relationship between salience and cognitive load revealed by empirical data. We add these rules to our abstract cognitive architecture developed for the verification of usability properties. The interface of a fire engine dispatch task used in t...

  5. Discrimination learning with variable stimulus 'salience'

    OpenAIRE

    2011-01-01

    Background: In nature, sensory stimuli are organized in heterogeneous combinations. Salient items from these combinations 'stand out' from their surroundings and determine what and how we learn. Yet, the relationship between varying stimulus salience and discrimination learning remains unclear. Presentation of the hypothesis: A rigorous formulation of the problem of discrimination learning should account for varying salience effects. We hypothesize that structural variations in the environment ...

  6. MPEG-4 AVC saliency map computation

    Science.gov (United States)

    Ammar, M.; Mitrea, M.; Hasnaoui, M.

    2014-02-01

    A saliency map provides information about the regions inside some visual content (image, video, ...) at which a human observer will spontaneously look. For saliency map computation, current research studies consider the uncompressed (pixel) representation of the visual content and extract various types of information (intensity, color, orientation, motion energy) which are then fused. This paper goes one step further and computes the saliency map directly from the MPEG-4 AVC stream syntax elements with minimal decoding operations. In this respect, an a priori in-depth study of the MPEG-4 AVC syntax elements is first carried out so as to identify the entities that attract visual attention. Secondly, the MPEG-4 AVC reference software is completed with software tools allowing the parsing of these elements and their subsequent usage in objective benchmarking experiments. This way, it is demonstrated that an MPEG-4 saliency map can be given by a combination of static saliency and motion maps. This saliency map is experimentally validated under a robust watermarking framework. When included in an m-QIM (multiple-symbol Quantization Index Modulation) insertion method, average PSNR gains of 2.43 dB, 2.15 dB, and 2.37 dB are obtained for data payloads of 10, 20 and 30 watermarked blocks per I frame, i.e. about 30, 60, and 90 bits/second, respectively. These quantitative results are obtained from processing 2 hours of heterogeneous video content.
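
    A minimal sketch of the final fusion step (combining a static saliency map with a motion map); the per-map normalization and equal weights are illustrative assumptions, not the combination used in the paper.

        import numpy as np

        def fuse_saliency(static_map, motion_map, w_static=0.5, w_motion=0.5):
            """Combine a static saliency map and a motion saliency map (same shape,
            arbitrary scale) into a single normalized map."""
            def norm(m):
                m = m.astype(float)
                return (m - m.min()) / (m.max() - m.min() + 1e-12)
            return norm(w_static * norm(static_map) + w_motion * norm(motion_map))

        # toy 36x64 maps, e.g. one value per macroblock of a 576x1024 frame
        rng = np.random.default_rng(2)
        print(fuse_saliency(rng.random((36, 64)), rng.random((36, 64))).max())  # 1.0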

  7. Increased water salinity applied to tomato plants accelerates the development of the leaf miner Tuta absoluta through bottom-up effects.

    Science.gov (United States)

    Han, Peng; Wang, Zhi-Jian; Lavoir, Anne-Violette; Michel, Thomas; Seassau, Aurélie; Zheng, Wen-Yan; Niu, Chang-Ying; Desneux, Nicolas

    2016-01-01

    Variation in resource inputs to plants may trigger bottom-up effects on herbivorous insects. We examined the effects of water input: optimal water vs. limited water; water salinity: with vs. without addition of 100 mM NaCl; and their interactions on tomato plants (Solanum lycopersicum), and consequently, the bottom-up effects on the tomato leaf miner, Tuta absoluta (Meytick) (Lepidoptera: Gelechiidae). Plant growth was significantly impeded by limited water input and NaCl addition. In terms of leaf chemical defense, the production of tomatidine significantly increased with limited water and NaCl addition, and a similar but non-significant trend was observed for the other glycoalkaloids. Tuta absoluta survival did not vary with the water and salinity treatments, but the treatment "optimal water-high salinity" increased the development rate without lowering pupal mass. Our results suggest that caution should be used in the IPM program against T. absoluta when irrigating tomato crops with saline water. PMID:27619473

  8. Bottom-Up Application Layer Multicast Tree Reconstruction Algorithm

    Institute of Scientific and Technical Information of China (English)

    邓正伟; 李锋

    2011-01-01

    Based on an analysis of traditional application layer multicast tree reconstruction algorithms, and combined with a proactive reconstruction technique, a bottom-up application layer multicast tree reconstruction algorithm is proposed. The algorithm employs a bottom-up strategy that combines local and global selection strategies for choosing a backup parent node. Simulation results show that the algorithm improves the recovery delay of the multicast tree, the quality of the reconstructed tree, and the control overhead of tree reconstruction.
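
    A simplified sketch of a bottom-up repair of this general kind, with hypothetical node and function names: an orphaned node first tries a proactively maintained backup parent (local selection) and only then searches the remaining tree for a parent with spare capacity (global selection). This illustrates the idea, not the paper's algorithm.

        class Node:
            def __init__(self, name, capacity=2):
                self.name = name
                self.parent = None
                self.children = []
                self.backup_parent = None   # chosen proactively, before any failure
                self.capacity = capacity    # maximum out-degree

            def can_accept(self):
                return len(self.children) < self.capacity

        def reattach(orphan, root):
            """Bottom-up repair when orphan's parent has failed: prefer the locally
            selected backup parent, otherwise fall back to a global breadth-first
            search from the root for the shallowest node with spare capacity."""
            candidate = orphan.backup_parent
            if candidate is None or not candidate.can_accept():
                queue, candidate = [root], None
                while queue:
                    node = queue.pop(0)
                    if node is not orphan and node.can_accept():
                        candidate = node
                        break
                    queue.extend(node.children)
            if candidate is not None:
                candidate.children.append(orphan)
                orphan.parent = candidate
            return candidate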

  9. A Local Texture-Based Superpixel Feature Coding for Saliency Detection Combined with Global Saliency

    Directory of Open Access Journals (Sweden)

    Bingfei Nan

    2015-12-01

    Because saliency can be used as prior knowledge of image content, saliency detection has been an active research area in image segmentation, object detection, image semantic understanding and other relevant image-based applications. In the case of saliency detection in cluttered scenes, the salient object/region detected needs not only to be clearly distinguished from the background but preferably also to be informative in terms of complete contour and local texture details, so as to facilitate subsequent processing. In this paper, a Local Texture-based Region Sparse Histogram (LTRSH) model is proposed for saliency detection in cluttered scenes. This model uses a combination of local texture patterns, color distribution and contour information to encode superpixels, characterizing the local features of the image for region contrast computation. Combining this region contrast with the global saliency probability, a full-resolution saliency map, in which the detected salient object/region adheres more closely to its inherent features, is obtained on the basis of the corresponding high-level saliency spatial distribution as well as pixel-level saliency enhancement. Quantitative comparisons with five state-of-the-art saliency detection methods on benchmark datasets are carried out, and the comparative results show that the proposed method improves detection performance in terms of the corresponding measurements.
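
    A minimal sketch of two of the ingredients named above, region-level histogram contrast and its combination with a global saliency probability; the chi-square distance, Gaussian spatial weighting and blending weight are illustrative assumptions rather than the LTRSH formulation.

        import numpy as np

        def region_contrast(hists, centers, sigma=0.3):
            """Contrast of each superpixel's feature histogram (e.g. a joint
            texture/color histogram) against all others, weighted by spatial
            proximity. hists: (N, B) L1-normalized histograms; centers: (N, 2)."""
            n = len(hists)
            contrast = np.zeros(n)
            for i in range(n):
                chi2 = 0.5 * np.sum((hists - hists[i])**2 / (hists + hists[i] + 1e-12), axis=1)
                w = np.exp(-np.linalg.norm(centers - centers[i], axis=1)**2 / (2 * sigma**2))
                contrast[i] = np.sum(w * chi2)
            return contrast / (contrast.max() + 1e-12)

        def combine_with_global(contrast, global_prob, alpha=0.5):
            """Blend the local region-contrast values with a per-region global
            saliency probability to obtain the final region saliency."""
            return alpha * contrast + (1.0 - alpha) * global_prob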

  10. Aposematism: balancing salience and camouflage.

    Science.gov (United States)

    Barnett, James B; Scott-Samuel, Nicholas E; Cuthill, Innes C

    2016-08-01

    Aposematic signals are often characterized by high conspicuousness. Larger and brighter signals reinforce avoidance learning, distinguish defended from palatable prey and are more easily memorized by predators. Conspicuous signalling, however, has costs: encounter rates with naive, specialized or nutritionally stressed predators are likely to increase. It has been suggested that intermediate levels of aposematic conspicuousness can evolve to balance deterrence and detectability, especially for moderately defended species. The effectiveness of such signals, however, has not yet been experimentally tested under field conditions. We used dough caterpillar-like baits to test whether reduced levels of aposematic conspicuousness can have survival benefits when predated by wild birds in natural conditions. Our results suggest that, when controlling for the number and intensity of internal contrast boundaries (stripes), a reduced-conspicuousness aposematic pattern can have a survival advantage over more conspicuous signals, as well as cryptic colours. Furthermore, we find a survival benefit from the addition of internal contrast for both high and low levels of conspicuousness. This adds ecological validity to evolutionary models of aposematic saliency and the evolution of honest signalling. PMID:27484645

  11. Toward consistency between bottom-up CO2 emissions trends and top-down atmospheric measurements in the Los Angeles megacity

    Science.gov (United States)

    Newman, S.; Xu, X.; Gurney, K. R.; Hsu, Y.-K.; Li, K.-F.; Jiang, X.; Keeling, R.; Feng, S.; O'Keefe, D.; Patarasuk, R.; Wong, K. W.; Rao, P.; Fischer, M. L.; Yung, Y. L.

    2015-10-01

    Large urban emissions of greenhouse gases result in large atmospheric enhancements relative to background that are easily measured. Using CO2 mole fractions and Δ14C and δ13C values of CO2 in the Los Angeles megacity observed in inland Pasadena (2006-2013) and coastal Palos Verdes peninsula (autumn 2009-2013), we have determined time series for CO2 contributions from fossil fuel combustion for both sites and broken those down into contributions from petroleum/gasoline and natural gas burning for Pasadena. We find a 10 % reduction in Pasadena CO2 emissions from fossil fuel combustion during the Great Recession of 2008-2010, which is consistent with the bottom-up inventory determined by the California Air Resources Board. The isotopic variations and total atmospheric CO2 from our observations are used to infer seasonality of natural gas and petroleum combustion. For natural gas, inferred emissions are out of phase with the seasonal cycle of total natural gas combustion seasonal patterns in bottom-up inventories but are consistent with the seasonality of natural gas usage by the area's electricity generating power plants. For petroleum, the inferred seasonality of CO2 emissions from burning petroleum is delayed by several months relative to usage indicated by statewide gasoline taxes. Using the high-resolution Hestia-LA data product to compare emissions from parts of the basin sampled by winds at different times of year, we find that variations in observed fossil fuel CO2 reflect seasonal variations in wind direction. The seasonality of the local CO2 excess from fossil fuel combustion along the coast, on Palos Verdes peninsula, is higher in fall and winter than spring and summer, almost completely out of phase with that from Pasadena, also because of the annual variations of winds in the region. Variations in fossil fuel CO2 signals are consistent with sampling the bottom-up Hestia-LA fossil CO2 emissions product for sub-city source regions in the LA megacity domain
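
    The fossil fuel component is conventionally derived from a radiocarbon mass balance, since fossil carbon contains no 14C (Delta14C of about -1000 permil). A minimal sketch with hypothetical values; the smaller correction terms used in such studies are omitted.

        def fossil_fuel_co2(co2_obs, d14c_obs, d14c_bg, d14c_ff=-1000.0):
            """Fossil-fuel CO2 enhancement (ppm) implied by an observed CO2 mole
            fraction and its Delta14C (permil), relative to a clean-air background:
            Cff = Cobs * (Dbg - Dobs) / (Dbg - Dff)."""
            return co2_obs * (d14c_bg - d14c_obs) / (d14c_bg - d14c_ff)

        # hypothetical values: 420 ppm observed at +20 permil vs. a +45 permil background
        print(fossil_fuel_co2(420.0, 20.0, 45.0))   # ~10 ppm of fossil-fuel CO2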

  12. Toward consistency between trends in bottom-up CO2 emissions and top-down atmospheric measurements in the Los Angeles megacity

    Science.gov (United States)

    Newman, Sally; Xu, Xiaomei; Gurney, Kevin R.; Kuang Hsu, Ying; Li, King Fai; Jiang, Xun; Keeling, Ralph; Feng, Sha; O'Keefe, Darragh; Patarasuk, Risa; Weng Wong, Kam; Rao, Preeti; Fischer, Marc L.; Yung, Yuk L.

    2016-03-01

    Large urban emissions of greenhouse gases result in large atmospheric enhancements relative to background that are easily measured. Using CO2 mole fractions and Δ14C and δ13C values of CO2 in the Los Angeles megacity observed in inland Pasadena (2006-2013) and coastal Palos Verdes peninsula (autumn 2009-2013), we have determined time series for CO2 contributions from fossil fuel combustion (Cff) for both sites and broken those down into contributions from petroleum and/or gasoline and natural gas burning for Pasadena. We find a 10 % reduction in Pasadena Cff during the Great Recession of 2008-2010, which is consistent with the bottom-up inventory determined by the California Air Resources Board. The isotopic variations and total atmospheric CO2 from our observations are used to infer seasonality of natural gas and petroleum combustion. The trend of CO2 contributions to the atmosphere from natural gas combustion is out of phase with the seasonal cycle of total natural gas combustion seasonal patterns in bottom-up inventories but is consistent with the seasonality of natural gas usage by the area's electricity generating power plants. For petroleum, the inferred seasonality of CO2 contributions from burning petroleum is delayed by several months relative to usage indicated by statewide gasoline taxes. Using the high-resolution Hestia-LA data product to compare Cff from parts of the basin sampled by winds at different times of year, we find that variations in observed fossil fuel CO2 reflect seasonal variations in wind direction. The seasonality of the local CO2 excess from fossil fuel combustion along the coast, on Palos Verdes peninsula, is higher in autumn and winter than spring and summer, almost completely out of phase with that from Pasadena, also because of the annual variations of winds in the region. Variations in fossil fuel CO2 signals are consistent with sampling the bottom-up Hestia-LA fossil CO2 emissions product for sub-city source regions in the LA

  13. Relative Influence of Top-Down and Bottom-Up Controls on Mixed Severity Burn Patterns in Yosemite National Park, California, USA

    Science.gov (United States)

    Kane, V. R.; Povak, N.; Brooks, M.; Collins, B.; Smith, D.; Churchill, D.

    2015-12-01

    In western North America, recent and projected increases in the frequency and severity of large wildfires have elevated the need to understand the key drivers of fire regimes across landscapes so that managers can predict where fires will have the greatest ecological impact, and anticipate changes under future climate change. Yosemite National Park offers a unique opportunity to study potential biophysical controls on fire severity patterns - fire management in this area has allowed many fires to burn since the 1970s, re-establishing a mixed severity fire regime. Previous studies within the park showed a high level of control from a variety of bottom-up (e.g., fire history, topography) and top-down (e.g., climate) variables on fire severity within a portion of the current study area, and found some evidence that controls may break down for the largest fires. In the current study, we sought to identify (1) controls on fire severity across all fires that burned within Yosemite (1984-2013), (2) differences in controls across fire sizes, (3) the contributions of topographic, climatic, and fire history variables to total variance explained, and (4) the influence of spatial autocorrelation on model results. Our study includes 147 fires that burned over 78,500 ha within Yosemite. Modeling results suggested that fire size and shape, topography, and localized climate variables explained fire severity patterns. Fires responded to inter-annual climate variability (top-down) plus local variation in water balance, past fire history, and local topographic variability (bottom-up). Climate-only models led to the highest level of pure variance explained, followed by fire history and topography models. Climate variables had distinctly non-linear relationships with fire severity, and key drivers were related to winter conditions. Fire severity was positively correlated with fire size, and severity increased towards fire interiors. Steeper and more complex topographies were associated

  14. HCFC-142b emissions in China: An inventory for 2000 to 2050 based on bottom-up and top-down methods

    Science.gov (United States)

    Han, Jiarui; Li, Li; Su, Shenshen; Hu, Jianxin; Wu, Jing; Wu, Yusheng; Fang, Xuekun

    2014-05-01

    1-Chloro-1,1-difluoroethane (HCFC-142b) is both an ozone-depleting substance included in the Montreal Protocol on Substances that Deplete the Ozone Layer (Montreal Protocol) and a potent greenhouse gas with a high global warming potential. As one of the major HCFC-142b consumption and production countries in the world, China's control actions will contribute both to mitigating climate change and to protecting the ozone layer. Estimating China's HCFC-142b emissions is a crucial step for understanding its emission status, drawing up a phase-out plan and evaluating mitigation effects. Both bottom-up and top-down methods were adopted in this research to estimate HCFC-142b emissions from China. Results based on the different methods were compared to test their effectiveness and to validate the reliability of the inventory. First, a national bottom-up emission inventory of HCFC-142b for China during 2000-2012 was established based on the 2006 IPCC Guidelines for National Greenhouse Gas Inventories and the Montreal Protocol, showing that, in contrast to the downward trend revealed by existing results, HCFC-142b emissions kept increasing from 0.1 kt/yr in 2000 to a peak of 14.4 kt/yr in 2012. Meanwhile, a top-down emission estimate was also developed using the interspecies correlation method. By correlating atmospheric mixing ratio data of HCFC-142b and the reference substance HCFC-22 sampled from four representative cities (Beijing, Hangzhou, Lanzhou and Guangzhou, for northern, eastern, western and southern China, respectively), China's HCFC-142b emission in 2012 was calculated. It was 16.24 (13.90-18.58) kt, equivalent to 1.06 kt ODP and 37 Tg CO2-eq, accounting for 9.78% (ODP) of total HCFC emissions in China or 30.5% of global HCFC-142b emissions. This result was 12.7% higher than that of the bottom-up inventory. Possible explanations are discussed. The consistency of the two results lends credibility to the effectiveness of the methods and the reliability of the results. Finally, future HCFC-142b emissions were projected to 2050
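
    A minimal sketch of the interspecies (tracer-ratio) correlation idea: the target compound's emission is the reference compound's emission scaled by the observed enhancement slope and the molar mass ratio. The numbers below are hypothetical, not the study's inputs.

        def interspecies_emission(ref_emission_kt, slope_mol_per_mol,
                                  molar_mass_target, molar_mass_ref):
            """Tracer-ratio emission estimate:
            E_target = E_ref * slope * (M_target / M_ref), where slope is the
            regression slope of target vs. reference mixing-ratio enhancements."""
            return ref_emission_kt * slope_mol_per_mol * (molar_mass_target / molar_mass_ref)

        # hypothetical: 100 kt/yr of HCFC-22 and a 0.12 mol/mol HCFC-142b/HCFC-22 slope
        print(interspecies_emission(100.0, 0.12, 100.5, 86.5))   # ~13.9 kt/yr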

  15. Chitosan microspheres with an extracellular matrix-mimicking nanofibrous structure as cell-carrier building blocks for bottom-up cartilage tissue engineering

    Science.gov (United States)

    Zhou, Yong; Gao, Huai-Ling; Shen, Li-Li; Pan, Zhao; Mao, Li-Bo; Wu, Tao; He, Jia-Cai; Zou, Duo-Hong; Zhang, Zhi-Yuan; Yu, Shu-Hong

    2015-12-01

    Scaffolds for tissue engineering (TE) which closely mimic the physicochemical properties of the natural extracellular matrix (ECM) have been proven to advantageously favor cell attachment, proliferation, migration and new tissue formation. Recently, as a valuable alternative, a bottom-up TE approach utilizing cell-loaded micrometer-scale modular components as building blocks to reconstruct a new tissue in vitro or in vivo has been proved to demonstrate a number of desirable advantages compared with the traditional bulk scaffold based top-down TE approach. Nevertheless, micro-components with an ECM-mimicking nanofibrous structure are still very scarce and highly desirable. Chitosan (CS), an accessible natural polymer, has demonstrated appealing intrinsic properties and promising application potential for TE, especially the cartilage tissue regeneration. According to this background, we report here the fabrication of chitosan microspheres with an ECM-mimicking nanofibrous structure for the first time based on a physical gelation process. By combining this physical fabrication procedure with microfluidic technology, uniform CS microspheres (CMS) with controlled nanofibrous microstructure and tunable sizes can be facilely obtained. Especially, no potentially toxic or denaturizing chemical crosslinking agent was introduced into the products. Notably, in vitro chondrocyte culture tests revealed that enhanced cell attachment and proliferation were realized, and a macroscopic 3D geometrically shaped cartilage-like composite can be easily constructed with the nanofibrous CMS (NCMS) and chondrocytes, which demonstrate significant application potential of NCMS as the bottom-up cell-carrier components for cartilage tissue engineering.

  16. Parallel- and serial-contact electrochemical metallization of monolayer nanopatterns: A versatile synthetic tool en route to bottom-up assembly of electric nanocircuits

    Directory of Open Access Journals (Sweden)

    Jonathan Berson

    2012-02-01

    Contact electrochemical transfer of silver from a metal-film stamp (parallel process) or a metal-coated scanning probe (serial process) is demonstrated to allow site-selective metallization of monolayer template patterns of any desired shape and size created by constructive nanolithography. The precise nanoscale control of metal delivery to predefined surface sites, achieved as a result of the selective affinity of the monolayer template for electrochemically generated metal ions, provides a versatile synthetic tool en route to the bottom-up assembly of electric nanocircuits. These findings offer direct experimental support to the view that, in electrochemical metal deposition, charge is carried across the electrode–solution interface by ion migration to the electrode rather than by electron transfer to hydrated ions in solution.

  17. Bottom-up processing of thermoelectric nanocomposites from colloidal nanocrystal building blocks: the case of Ag2Te-PbTe

    Energy Technology Data Exchange (ETDEWEB)

    Cadavid, Doris [Catalonia Institute for Energy Research, IREC (Spain); Ibanez, Maria [Universitat de Barcelona, Departament d' Electronica (Spain); Gorsse, Stephane [Universite de Bordeaux, ICMCB, CNRS (France); Lopez, Antonio M. [Universitat Politecnica de Catalunya, Departament d' Enginyeria Electronica (Spain); Cirera, Albert [Universitat de Barcelona, Departament d' Electronica (Spain); Morante, Joan Ramon; Cabot, Andreu, E-mail: acabot@irec.cat [Catalonia Institute for Energy Research, IREC (Spain)

    2012-12-15

    Nanocomposites are highly promising materials to enhance the efficiency of current thermoelectric devices. A straightforward and at the same time highly versatile and controllable approach to produce nanocomposites is the assembly of solution-processed nanocrystal building blocks. The convenience of this bottom-up approach to produce nanocomposites with homogeneous phase distributions and adjustable composition is demonstrated here by blending Ag2Te and PbTe colloidal nanocrystals to form Ag2Te-PbTe bulk nanocomposites. The thermoelectric properties of these nanocomposites are analyzed in the temperature range from 300 to 700 K. The evolution of their electrical conductivity and Seebeck coefficient is discussed in terms of the blend composition and the characteristics of the constituent materials.

  18. The synthesis of bottom-up and top-down approaches to climate policy modeling: Electric power technology detail in a social accounting framework

    International Nuclear Information System (INIS)

    "Hybrid" climate policy simulations have sought to bridge the gap between "bottom-up" engineering and "top-down" macroeconomic models by integrating the former's energy technology detail into the latter's macroeconomic framework. Construction of hybrid models is complicated by the need to numerically calibrate them to multiple, incommensurate sources of economic and engineering data. I develop a solution to this problem following Howitt's [Howitt, R.E., 1995. Positive Mathematical Programming, American Journal of Agricultural Economics 77: 329-342] positive mathematical programming approach. Using data for the U.S., I illustrate how the inputs to the electricity sector in a social accounting matrix may be allocated among discrete types of generation so as to be consistent with both technologies' input shares from engineering cost estimates, and the zero-profit and market-clearance conditions of the sector's macroeconomic production structure. (author)

  19. Impaired Bottom-Up Effective Connectivity Between Amygdala and Subgenual Anterior Cingulate Cortex in Unmedicated Adolescents with Major Depression: Results from a Dynamic Causal Modeling Analysis.

    Science.gov (United States)

    Musgrove, Donald R; Eberly, Lynn E; Klimes-Dougan, Bonnie; Basgoze, Zeynep; Thomas, Kathleen M; Mueller, Bryon A; Houri, Alaa; Lim, Kelvin O; Cullen, Kathryn R

    2015-12-01

    Major depressive disorder (MDD) is a significant contributor to lifetime disability and frequently emerges in adolescence, yet little is known about the neural mechanisms of MDD in adolescents. Dynamic causal modeling (DCM) analysis is an innovative tool that can shed light on neural network abnormalities. A DCM analysis was conducted to test several frontolimbic effective connectivity models in 27 adolescents with MDD and 21 healthy adolescents. The best neural model for each person was identified using Bayesian model selection. The findings revealed that the two adolescent groups fit similar optimal neural models. The best across-groups model was then used to infer upon both within-group and between-group tests of intrinsic and modulation parameters of the network connections. First, for model validation, within-group tests revealed robust evidence for bottom-up connectivity, but less evidence for strong top-down connectivity in both groups. Second, we tested for differences between groups on the validated parameters of the best model. This revealed that adolescents with MDD had significantly weaker bottom-up connectivity in one pathway, from amygdala to sgACC (p=0.008), than healthy controls. This study provides the first examination of effective connectivity using DCM within neural circuitry implicated in emotion processing in adolescents with MDD. These findings aid in advancing understanding the neurobiology of early-onset MDD during adolescence and have implications for future research investigating how effective connectivity changes across contexts, with development, over the course of the disease, and after intervention. PMID:26050933

  20. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Directory of Open Access Journals (Sweden)

    W. J. F. Acton

    2015-10-01

    This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton transfer reaction-mass spectrometer (PTR-MS) and a proton transfer reaction-time of flight-mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (PTR-MS) and eddy covariance (PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean day-time flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28-day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement, while comparison of the monoterpene fluxes suggested a slight overestimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the MEGAN isoprene emissions algorithms (Guenther et al., 2006). A detailed tree species distribution map for the site enabled the leaf-level emissions of isoprene and monoterpenes recorded using GC-MS to be scaled up to produce a "bottom-up" canopy-scale flux. This was compared with the "top-down" canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated, and this correlation improved when the plant species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.
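
    A minimal sketch of the Guenther-type light and temperature activity factors that underlie such basal-emission-rate calculations, using the commonly quoted standard coefficients from Guenther et al. (1993); the full MEGAN algorithm adds further factors (leaf age, canopy environment, soil moisture) not shown here.

        import math

        def isoprene_emission(basal_mg_m2_h, ppfd_umol_m2_s, leaf_temp_k,
                              alpha=0.0027, c_l1=1.066, c_t1=95000.0, c_t2=230000.0,
                              t_s=303.0, t_m=314.0, r=8.314):
            """Scale a basal isoprene emission rate (defined at PPFD = 1000
            umol m-2 s-1 and T = 303 K) by light (C_L) and temperature (C_T)
            activity factors of the Guenther et al. (1993) form."""
            q = ppfd_umol_m2_s
            c_l = alpha * c_l1 * q / math.sqrt(1.0 + alpha**2 * q**2)
            t = leaf_temp_k
            c_t = (math.exp(c_t1 * (t - t_s) / (r * t_s * t))
                   / (1.0 + math.exp(c_t2 * (t - t_m) / (r * t_s * t))))
            return basal_mg_m2_h * c_l * c_t

        # basal rate of 1.7 mg m-2 h-1 in full sun (1500 umol m-2 s-1) at 25 C leaves
        print(isoprene_emission(1.7, 1500.0, 298.15))   # ~0.95 mg m-2 h-1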

  1. Heat recovery with heat pumps in non-energy intensive industry: A detailed bottom-up model analysis in the French food and drink industry

    International Nuclear Information System (INIS)

    Highlights: • First bottom-up energy model for NEI at the 4-digit level of NACE for energy analysis. • Energy end-use modelling due to the unsuitability of the end-product/process approach. • Analysis of heat recovery with HPs on industrial processes up to 2020 in French F and D. • Energy consumption and emissions drop by 10% compared to 2001 and by 9% compared to 1990, respectively. • Results only achieved at heat temperatures below 100 °C, concentrated in 1/3 of F and D subsectors. - Abstract: Rising energy prices and environmental impacts inevitably encourage industrial firms to get involved in promoting energy efficiency and emissions reductions. To achieve this goal, we have developed the first detailed bottom-up energy model for Non-Energy Intensive industry (NEI) to study its global energy efficiency and the potential for CO2 emissions reduction at a 4-digit level of the NACE classification. The latter, which is generally neglected in energy analyses, is expected to play an important role in reducing industry energy intensity in the long term due to its economic and energy significance and relatively high growth rate. In this paper, the modelling of NEI is done by energy end-use, owing to the unsuitability of the end-product/process approach used in Energy Intensive industry modelling. As an example, we analysed the impact of heat recovery with heat pumps (HPs) on industrial processes up to 2020 on energy savings and CO2 emissions reductions in the French food and drink industry (F and D), the biggest NEI sector. The results showed that HPs could be an excellent and very promising energy recovery technology. For further detailed analysis, the depiction of HP investment cost payments is given per temperature range for each F and D subsector. This model constitutes a useful decision-making tool for assessing potential energy savings from investing in efficient technologies at the highest level of disaggregation, as well as for better subsectoral screening
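
    A first-order sketch of the kind of heat pump heat recovery calculation involved, assuming a Carnot COP scaled by a second-law efficiency; the parameter values are illustrative and this is not the paper's model.

        def heat_pump_upgrade(q_waste_kwh, t_source_c, t_sink_c, second_law_eff=0.45):
            """Electricity required to deliver process heat at t_sink_c from waste
            heat at t_source_c, with COP = eta_II * T_sink / (T_sink - T_source).
            Returns (delivered heat, electricity input, COP); the energy balance
            Q_sink = Q_source + W is assumed."""
            t_source, t_sink = t_source_c + 273.15, t_sink_c + 273.15
            cop = second_law_eff * t_sink / (t_sink - t_source)
            electricity = q_waste_kwh / (cop - 1.0)
            return q_waste_kwh + electricity, electricity, cop

        # upgrading 1000 kWh of 40 C waste heat to 80 C process heat
        print(heat_pump_upgrade(1000.0, 40.0, 80.0))   # ~1336 kWh delivered, ~336 kWh electricity, COP ~ 4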

  2. A bottom-up model to estimate the energy efficiency improvement and CO2 emission reduction potentials in the Chinese iron and steel industry

    International Nuclear Information System (INIS)

    China's annual crude steel production in 2010 was 638.7 Mt, accounting for nearly half of the world's annual crude steel production in the same year. Around 461 TWh of electricity and 14,872 PJ of fuel were consumed to produce this quantity of steel. We identified and analyzed 23 energy efficiency technologies and measures applicable to the processes in China's iron and steel industry. Using a bottom-up electricity CSC (Conservation Supply Curve) model, the cumulative cost-effective electricity savings potential for the Chinese iron and steel industry for 2010–2030 is estimated to be 251 TWh, and the total technical electricity saving potential is 416 TWh. The CO2 emissions reduction associated with the cost-effective electricity savings is 139 Mt CO2 and the CO2 emissions reduction associated with the technical electricity saving potential is 237 Mt CO2. The FCSC (Fuel CSC) model for the Chinese iron and steel industry shows a cumulative cost-effective fuel savings potential of 11,999 PJ, and the total technical fuel saving potential is 12,139 PJ. The CO2 emissions reductions associated with the cost-effective and technical fuel savings are 1191 Mt CO2 and 1205 Mt CO2, respectively. In addition, a sensitivity analysis with respect to the discount rate used is conducted. - Highlights: ► Estimation of energy saving potential in the entire Chinese steel industry. ► Development of the bottom-up technology-rich Conservation Supply Curve models. ► Discussion of different approaches for developing Conservation Supply Curves. ► Primary energy saving over 20 years equal to 72% of the primary energy of Latin America
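
    The conservation supply curves above rank measures by their cost of conserved energy (CCE); a measure is counted as cost-effective when its CCE falls below the relevant energy price. A minimal sketch with hypothetical numbers:

        def cost_of_conserved_energy(capital_cost, annual_om_change, annual_savings_gj,
                                     discount_rate, lifetime_years):
            """Cost of Conserved Energy (currency per GJ): annualized capital cost
            (via the capital recovery factor) plus the change in annual O&M cost,
            divided by the annual energy savings."""
            crf = discount_rate / (1.0 - (1.0 + discount_rate) ** -lifetime_years)
            return (capital_cost * crf + annual_om_change) / annual_savings_gj

        # hypothetical measure: 1.2 M$ investment, no O&M change, saves 25,000 GJ/yr,
        # 15-year lifetime, 10% discount rate; compare the result with the fuel price
        print(cost_of_conserved_energy(1.2e6, 0.0, 25_000.0, 0.10, 15))   # ~6.3 $/GJ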

  3. Bottom-up effects of nutrient availability on flower production, pollinator visitation, and seed output in a high-Andean shrub.

    Science.gov (United States)

    Muñoz, Alejandro A; Celedon-Neghme, Constanza; Cavieres, Lohengrin A; Arroyo, Mary T K

    2005-03-01

    Soil nutrient availability directly enhances vegetative growth, flowering, and fruiting in alpine ecosystems. However, the impacts of nutrient addition on pollinator visitation, which could affect seed output indirectly, are unknown. In a nutrient addition experiment, we tested the hypothesis that seed output in the insect-pollinated, self-incompatible shrub, Chuquiraga oppositifolia (Asteraceae) of the Andes of central Chile, is enhanced by soil nitrogen (N) availability. We aimed to monitor total shrub floral display, size of flower heads (capitula), pollinator visitation patterns, and seed output during three growing seasons on control and N addition shrubs. N addition did not augment floral display, size of capitula, pollinator visitation, or seed output during the first growing season. Seed mass and viability were 25-40% lower in fertilised shrubs. During the second growing season only 33% of the N addition shrubs flowered compared to 71% of controls, and a significant (50%) enhancement in vegetative growth occurred in fertilised shrubs. During the third growing season, floral display in N addition shrubs was more than double that of controls, received more than twice the number of insect pollinator visits, and seed output was three- to four-fold higher compared to controls. A significant (50%) enhancement in vegetative growth again occurred in N addition shrubs. Results of this study strongly suggest that soil N availability produces strong positive bottom-up effects on the reproductive output of the alpine shrub C. oppositifolia. Despite taking considerably longer to be manifest in comparison to the previously reported top-down indirect negative effects of lizard predators in the same study system, our results suggest that both bottom-up and top-down forces are important in controlling the reproductive output of an alpine shrub. PMID:15583940

  4. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Science.gov (United States)

    Acton, W. J. F.; Schallhart, S.; Langford, B.; Valach, A.; Rantala, P.; Fares, S.; Carriero, G.; Tillmann, R.; Tomlinson, S. J.; Dragosits, U.; Gianelle, D.; Hewitt, C. N.; Nemitz, E.

    2015-10-01

    This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton transfer reaction-mass spectrometer (PTR-MS) and a proton transfer reaction-time of flight-mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (PTR-MS) and eddy covariance (PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean day-time flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28 day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight over estimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the MEGAN isoprene emissions algorithms (Guenther et al., 2006). A detailed tree species distribution map for the site enabled the leaf-level emissions of isoprene and monoterpenes recorded using GC-MS to be scaled up to produce a "bottom-up" canopy-scale flux. This was compared with the "top-down" canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.

  5. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories.

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G; Quinn, Paul C; Hu, Chao S; Qian, Miao; Fu, Genyue; Lee, Kang

    2015-02-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese, Caucasian, and racially ambiguous faces. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. PMID:25497461

  6. Top-down and bottom-up identification of proteins by liquid extraction surface analysis mass spectrometry of healthy and diseased human liver tissue.

    Science.gov (United States)

    Sarsby, Joscelyn; Martin, Nicholas J; Lalor, Patricia F; Bunch, Josephine; Cooper, Helen J

    2014-11-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies. PMID:25183224

  7. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Directory of Open Access Journals (Sweden)

    Sebastian McBride

    Full Text Available Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: (1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, (2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: (1) transformation of retinotopic to egocentric mappings, (2) spatial memory for the purposes of medium-term inhibition of return, (3) synchronization of 'where' and 'what' information from the two visual streams, (4) convergence of top-down and bottom-up information to a centralized point of information processing, (5) a threshold function to elicit saccade action, (6) a function to represent task relevance as a ratio of excitation and inhibition, and (7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.
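
    As a rough illustration of requirements (2), (4), (5) and (6) above, the sketch below combines a bottom-up saliency map with a top-down relevance term expressed as an excitation/inhibition ratio, subtracts an inhibition-of-return map, and thresholds the resulting priority map to decide whether a saccade is triggered. The grid, weighting scheme and threshold are illustrative assumptions, not the model actually implemented on the robotic hardware.

```python
import numpy as np

def select_saccade(bottom_up, relevance_exc, relevance_inh,
                   inhibition_of_return, w_td=1.0, threshold=0.6):
    """Combine maps into a priority map and pick a saccade target.
    All inputs are 2-D arrays on the same egocentric grid, roughly in [0, 1].
    Returns the (row, col) of the winning location, or None if sub-threshold."""
    top_down = relevance_exc / (relevance_inh + 1e-6)   # requirement (6)
    priority = bottom_up * (1.0 + w_td * top_down)      # requirement (4)
    priority = priority - inhibition_of_return          # requirement (2)
    target = np.unravel_index(np.argmax(priority), priority.shape)
    return target if priority[target] > threshold else None  # requirement (5)

# toy 32x32 field with one salient, task-relevant location
rng = np.random.default_rng(1)
bu = rng.random((32, 32)) * 0.3
bu[10, 20] = 0.9                                   # conspicuous stimulus
exc = np.zeros((32, 32)); exc[10, 20] = 0.8        # matches the task set
inh = np.full((32, 32), 0.5)
ior = np.zeros((32, 32))                           # nothing recently fixated
print(select_saccade(bu, exc, inh, ior))           # -> (10, 20)
```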

  8. "Disorganized in time": impact of bottom-up and top-down negative emotion generation on memory formation among healthy and traumatized adolescents.

    Science.gov (United States)

    Guillery-Girard, Bérengère; Clochon, Patrice; Giffard, Bénédicte; Viard, Armelle; Egler, Pierre-Jean; Baleyte, Jean-Marc; Eustache, Francis; Dayan, Jacques

    2013-09-01

    "Travelling in time," a central feature of episodic memory is severely affected among individuals with Post Traumatic Stress Disorder (PTSD) with two opposite effects: vivid traumatic memories are unorganized in temporality (bottom-up processes), non-traumatic personal memories tend to lack spatio-temporal details and false recognitions occur more frequently that in the general population (top-down processes). To test the effect of these two types of processes (i.e. bottom-up and top-down) on emotional memory, we conducted two studies in healthy and traumatized adolescents, a period of life in which vulnerability to emotion is particularly high. Using negative and neutral images selected from the international affective picture system (IAPS), stimuli were divided into perceptual images (emotion generated by perceptual details) and conceptual images (emotion generated by the general meaning of the material). Both categories of stimuli were then used, along with neutral pictures, in a memory task with two phases (encoding and recognition). In both populations, we reported a differential effect of the emotional material on encoding and recognition. Negative perceptual scenes induced an attentional capture effect during encoding and enhanced the recollective distinctiveness. Conversely, the encoding of conceptual scenes was similar to neutral ones, but the conceptual relatedness induced false memories at retrieval. However, among individuals with PTSD, two subgroups of patients were identified. The first subgroup processed the scenes faster than controls, except for the perceptual scenes, and obtained similar performances to controls in the recognition task. The second subgroup group desmonstrated an attentional deficit in the encoding task with no benefit from the distinctiveness associated with negative perceptual scenes on memory performances. These findings provide a new perspective on how negative emotional information may have opposite influences on memory in

  9. Assuring Quality "Bottom-up"

    Czech Academy of Sciences Publication Activity Database

    Hašková, Hana

    Berlin: AGF, 2015. s. 20-21. [Good Quality In Early Childcare - Ideas, Goals And Strategies In Europe. 20150625, Berlin] R&D Projects: GA ČR GA15-13766S Institutional support: RVO:68378025 Subject RIV: AO - Sociology, Demography

  10. Bottom-up disaster resilience

    Science.gov (United States)

    Chan, Emily Y. Y.

    2013-05-01

    The 2008 Wenchuan earthquake highlights some of the successes of government-led schemes to mitigate the impact of natural disasters. A stronger focus on individuals and local communities could reduce losses even further in the future.

  11. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    Science.gov (United States)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat

  12. Referent Salience Affects Second Language Article Use

    Science.gov (United States)

    Trenkic, Danijela; Pongpairoj, Nattama

    2013-01-01

    The effect of referent salience on second language (L2) article production in real time was explored. Thai (-articles) and French (+articles) learners of English described dynamic events involving two referents, one visually cued to be more salient at the point of utterance formulation. Definiteness marking was made communicatively redundant with…

  13. Visualization of neural networks using saliency maps

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.; Kjems, Ulrik; Hansen, Lars Kai;

    1995-01-01

    The saliency map is proposed as a new method for understanding and visualizing the nonlinearities embedded in feedforward neural networks, with emphasis on the ill-posed case, where the dimensionality of the input-field by far exceeds the number of examples. Several levels of approximations are...
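
    The record above is truncated, but the general idea of such a saliency map, computed here generically rather than in the authors' exact formulation, is the sensitivity of the network output to each input component. The tiny tanh network below is a placeholder; for a single hidden layer the input gradient can be written in closed form.

```python
import numpy as np

rng = np.random.default_rng(0)

# a small feedforward net: many inputs, few hidden units (ill-posed flavour)
n_in, n_hidden = 100, 8
W1 = rng.normal(0.0, 0.1, (n_hidden, n_in))
b1 = np.zeros(n_hidden)
w2 = rng.normal(0.0, 0.1, n_hidden)

def forward(x):
    h = np.tanh(W1 @ x + b1)
    return float(w2 @ h), h

def saliency_map(x):
    """Sensitivity of the scalar output to each input component:
    dy/dx = W1^T [(1 - h^2) * w2] for a tanh hidden layer."""
    _, h = forward(x)
    return W1.T @ ((1.0 - h ** 2) * w2)

x = rng.normal(0.0, 1.0, n_in)
s = saliency_map(x)
print("most influential input component:", int(np.argmax(np.abs(s))))
```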

  14. Comparing bottom-up and top-down approaches at the landscape scale, including agricultural activities and water systems, at the Roskilde Fjord, Denmark

    Science.gov (United States)

    Lequy, Emeline; Ibrom, Andreas; Ambus, Per; Massad, Raia-Silvia; Markager, Stiig; Asmala, Eero; Garnier, Josette; Gabrielle, Benoit; Loubet, Benjamin

    2015-04-01

    The greenhouse gas nitrous oxide (N2O) mainly originates in direct emissions from agricultural soils due to microbial reactions stimulated by the use of nitrogen fertilisers. Indirect N2O emissions from water systems due to nitrogen leaching and deposition from crop fields range between 26 and 37% of direct agricultural emissions, indicating their potential importance and uncertainty (Reay et al. 2012). The study presented here couples a top-down approach with eddy covariance (EC) and a bottom-up approach using different models and measurements. A QCL sensor at 96-m height on a tall tower measures the emissions of N2O from 1100 ha of crop fields and from the south part of the Roskilde fjord, in a 5-km radius area around the tall tower at Roskilde, Denmark. The bottom-up approach includes ecosystem modelling with CERES-EGC for the crops and PaSIM for the grasslands, and the N2O fluxes from the Roskilde fjord are derived from N2O sea water concentration measurements. EC measurements are now available from July to December 2014, and indicate a magnitude of the emissions from the crop fields around 0.2 mg N2O-N m-2 day-1 (range -9 to 5) which is consistent with the CERES-EGC simulations and calculations using IPCC emission factors. N2O fluxes from the Roskilde fjord in May and July indicated quite constant N2O concentrations around 0.1 µg N L-1 despite variations of nitrate and ammonium in the fjord. The calculated fluxes from these concentrations and the tall tower measurements consistently ranged between -7 and 6 mg N2O-N m-2 day-1. The study site also contains a waste water treatment plant, whose direct emissions will be measured in early 2015 using a dynamic plume tracer dispersion method (Mønster et al. 2014). A refined source attribution methodology together with more measurements and simulations of the N2O fluxes from the different land uses in this study site will provide a clearer view of the dynamics and budgets of N2O at the regional scale. The

  15. Top-down model estimates, bottom-up inventories, and future projections of global natural and anthropogenic emissions of nitrous oxide

    Science.gov (United States)

    Davidson, E. A.; Kanter, D.

    2013-12-01

    Nitrous oxide (N2O) is the third most abundantly emitted greenhouse gas and the largest remaining emitted ozone depleting substance. It is a product of nitrifying and denitrifying bacteria in soils, sediments and water bodies. Humans began to disrupt the N cycle in the preindustrial era as they expanded agricultural land, used fire for land clearing and management, and cultivated leguminous crops that carry out biological N fixation. This disruption accelerated after the industrial revolution, especially as the use of synthetic N fertilizers became common after 1950. Here we present findings from a new United Nations Environment Programme report, in which we constrain estimates of the anthropogenic and natural emissions of N2O and consider scenarios for future emissions. Inventory-based estimates of natural emissions from terrestrial, marine and atmospheric sources range from 10 to 12 Tg N2O-N/yr. Similar values can be derived for global N2O emissions that were predominantly natural before the industrial revolution. While there was inter-decadal variability, there was little or no consistent trend in atmospheric N2O concentrations between 1730 and 1850, allowing us to assume near steady state. Assuming an atmospheric lifetime of 120 years, the 'top-down' estimate of pre-industrial emissions of 11 Tg N2O-N/yr is consistent with the bottom-up inventories for natural emissions, although the former includes some modest pre-industrial anthropogenic effects. The top-down methodology yields an estimate of 5.3 Tg N2O-N/yr (range 5.2-5.5) of net anthropogenic emissions for the period 2000-2007. Based on a review of bottom-up inventories, we estimate total net anthropogenic N2O emissions of 6.0 Tg N2O-N/yr (5.4-8.4 Tg N2O-N/yr). Estimates (and ranges) by sector (in Tg N2O-N/yr) are: agriculture 4.1 Tg (3.8-6.8); biomass burning 0.7 (0.5-1.7); energy and transport 0.7 (0.5-1.2); industry 0.7 (0.3-1.1); and other 0.5 (0.2-0.8). Tropical deforestation has reduced
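
    The steady-state reasoning in the abstract can be checked with one line of arithmetic: at steady state, emissions roughly equal the atmospheric burden divided by the lifetime. The pre-industrial mixing ratio (about 270 ppb) and the conversion factor (about 4.8 Tg N per ppb of N2O) used below are standard literature values assumed for illustration, not numbers given in the record.

```python
# Steady state: source ~ burden / lifetime
mixing_ratio_ppb = 270.0   # assumed pre-industrial N2O mixing ratio (ppb)
tg_n_per_ppb = 4.8         # assumed conversion factor, Tg N2O-N per ppb N2O
lifetime_yr = 120.0        # atmospheric lifetime quoted in the abstract

burden_tg_n = mixing_ratio_ppb * tg_n_per_ppb      # ~1300 Tg N2O-N
source_tg_n_yr = burden_tg_n / lifetime_yr         # ~11 Tg N2O-N/yr
print(f"implied pre-industrial source: {source_tg_n_yr:.1f} Tg N2O-N/yr")
```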

  16. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Science.gov (United States)

    Acton, W. Joe F.; Schallhart, Simon; Langford, Ben; Valach, Amy; Rantala, Pekka; Fares, Silvano; Carriero, Giulia; Tillmann, Ralf; Tomlinson, Sam J.; Dragosits, Ulrike; Gianelle, Damiano; Hewitt, C. Nicholas; Nemitz, Eiko

    2016-06-01

    This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton-transfer-reaction mass spectrometer (PTR-MS) and a proton-transfer-reaction time-of-flight mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (using PTR-MS) and eddy covariance (using PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean daytime flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28-day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight overestimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the Model of Emissions of Gases and Aerosols from Nature (MEGAN) isoprene emission algorithms (Guenther et al., 2006). A detailed tree-species distribution map for the site enabled the leaf-level emission of isoprene and monoterpenes recorded using gas-chromatography mass spectrometry (GC-MS) to be scaled up to produce a bottom-up canopy-scale flux. This was compared with the top-down canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant-species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.
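
    Normalizing a measured canopy flux to a basal (standard-condition) emission rate is typically done by dividing the observed flux by light- and temperature-activity factors of the kind used in MEGAN. The sketch below uses the widely published Guenther-type functional forms and default coefficients (standard conditions of 1000 umol m-2 s-1 PAR and 303 K) and omits the canopy-environment, leaf-age and soil-moisture terms; it is a generic illustration, not the exact algorithm configuration applied in this study.

```python
import math

R = 8.314                               # J mol-1 K-1
ALPHA, C_L1 = 0.0027, 1.066             # light-response coefficients
C_T1, C_T2 = 95000.0, 230000.0          # J mol-1
T_S, T_M = 303.0, 314.0                 # standard and optimum temperature, K

def gamma_light(par):
    """Light activity factor; PAR in umol m-2 s-1 (~1 at 1000)."""
    return ALPHA * C_L1 * par / math.sqrt(1.0 + ALPHA ** 2 * par ** 2)

def gamma_temp(t_leaf):
    """Temperature activity factor; leaf temperature in K."""
    num = math.exp(C_T1 * (t_leaf - T_S) / (R * T_S * t_leaf))
    den = 1.0 + math.exp(C_T2 * (t_leaf - T_M) / (R * T_S * t_leaf))
    return num / den

def basal_emission(observed_flux, par, t_leaf):
    """Normalize an observed flux (e.g. mg m-2 h-1) to standard conditions."""
    return observed_flux / (gamma_light(par) * gamma_temp(t_leaf))

# arbitrary illustrative conditions, not the campaign-mean values
print(round(basal_emission(1.9, 1200.0, 301.0), 2), "mg m-2 h-1")
```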

  17. A two-step combination of top-down and bottom-up fire emission estimates at regional and global scales: strengths and main uncertainties

    Science.gov (United States)

    Sofiev, Mikhail; Soares, Joana; Kouznetsov, Rostislav; Vira, Julius; Prank, Marje

    2016-04-01

    Top-down emission estimation via inverse dispersion modelling is used for various problems, where bottom-up approaches are difficult or highly uncertain. One such area is the estimation of emissions from wild-land fires. In combination with dispersion modelling, satellite and/or in-situ observations can, in principle, be used to efficiently constrain the emission values. This is the main strength of the approach: the a-priori values of the emission factors (based on laboratory studies) are refined for real-life situations using the inverse-modelling technique. However, the approach also has major uncertainties, which are illustrated here with a few examples of the Integrated System for wild-land Fires (IS4FIRES). IS4FIRES generates the smoke emission and injection profile from MODIS and SEVIRI active-fire radiative energy observations. The emission calculation includes two steps: (i) initial top-down calibration of emission factors via inverse dispersion problem solution that is made once using a training dataset from the past, (ii) application of the obtained emission coefficients to individual-fire radiative energy observations, thus leading to bottom-up emission compilation. For such a procedure, the major classes of uncertainties include: (i) imperfect information on fires, (ii) simplifications in the fire description, (iii) inaccuracies in the smoke observations and modelling, (iv) inaccuracies of the inverse problem solution. Using examples of the fire seasons 2010 in Russia, 2012 in Eurasia, 2007 in Australia, etc., it is pointed out that the top-down system calibration performed for a limited number of comparatively moderate cases (often the best-observed ones) may lead to errors in application to extreme events. For instance, the total emission of the 2010 Russian fires is likely to be over-estimated by up to 50% if the calibration is based on the season 2006 and the fire description is simplified. Longer calibration period and more sophisticated parameterization
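
    The two-step logic described above can be caricatured as follows: step one finds a single scaling of the a-priori emission factor that makes modelled smoke best match observations over a training set of (mostly moderate) fires; step two applies the calibrated factor bottom-up to each fire's radiative power. The linear forward operator and all numbers are placeholders, not the IS4FIRES implementation; the example also hints at the extrapolation risk when the calibrated factor is applied to an extreme event.

```python
import numpy as np

def calibrate_emission_factor(frp_train, obs_smoke, sensitivity, ef_prior):
    """Step 1 (top-down): least-squares scaling of the a-priori emission
    factor so that modelled smoke best matches the training observations."""
    modelled = sensitivity * ef_prior * frp_train
    scale = float(modelled @ obs_smoke) / float(modelled @ modelled)
    return scale * ef_prior

def bottom_up_emission(frp, ef_calibrated):
    """Step 2 (bottom-up): apply the calibrated factor to each observed fire."""
    return ef_calibrated * frp

rng = np.random.default_rng(2)
frp_train = rng.uniform(10.0, 500.0, 50)                   # moderate fires, MW
obs_smoke = 0.012 * frp_train * (1.0 + rng.normal(0.0, 0.2, 50))
ef = calibrate_emission_factor(frp_train, obs_smoke, sensitivity=1.0, ef_prior=0.02)
print("calibrated emission factor:", round(ef, 4))
print("estimate for an extreme 5000 MW fire:", round(bottom_up_emission(5000.0, ef), 1))
```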

  18. Linking top-down and bottom-up approaches for assessing the vulnerability of a 100 % renewable energy system in Northern-Italy

    Science.gov (United States)

    Borga, Marco; Francois, Baptiste; Hingray, Benoit; Zoccatelli, Davide; Creutin, Jean-Dominique; Brown, Casey

    2016-04-01

    Due to their variable and uncontrollable features, integration of Variable Renewable Energies (e.g. solar power, wind power and hydropower, denoted as VRE) into the electricity network implies higher production variability and increased risk of not meeting demand. Two approaches are commonly used for assessing this risk and especially its evolution in a global change context (i.e. climate and societal changes): top-down and bottom-up approaches. The general idea of a top-down approach is to drive analysis of global change or of some key aspects of global change on their systems (e.g., the effects of the COP 21, of the deployment of Smart Grids, or of climate change) with chains of loosely linked simulation models within a predictive framework. The bottom-up approach aims to improve understanding of the dependencies between the vulnerability of regional systems and large-scale phenomena from knowledge gained through detailed exploration of the response to change of the system of interest, which may reveal vulnerability thresholds, tipping points as well as potential opportunities. Brown et al. (2012) defined an analytical framework to merge these two approaches. The objective is to build a set of Climate Response Functions (CRFs) putting in perspective: (i) indicators of desired states ("success") and undesired states ("failure") of a system, as defined in collaboration with stakeholders; (ii) exhaustive exploration of the effects of uncertain forcings and imperfect system understanding on the response of the system itself to a plausible set of possible changes, implemented with a multi-dimensionally consistent "stress test" algorithm; and (iii) a set of "ex post" hydroclimatic and socioeconomic scenarios that provide insight into the differential effectiveness of alternative policies and serve as entry points for the provision of climate information to inform policy evaluation and choice. We adapted this approach for analyzing a 100% renewable energy system within a region

  19. Bottom-up coarse-grained models with predictive accuracy and transferability for both structural and thermodynamic properties of heptane-toluene mixtures

    Science.gov (United States)

    Dunn, Nicholas J. H.; Noid, W. G.

    2016-05-01

    This work investigates the promise of a "bottom-up" extended ensemble framework for developing coarse-grained (CG) models that provide predictive accuracy and transferability for describing both structural and thermodynamic properties. We employ a force-matching variational principle to determine system-independent, i.e., transferable, interaction potentials that optimally model the interactions in five distinct heptane-toluene mixtures. Similarly, we employ a self-consistent pressure-matching approach to determine a system-specific pressure correction for each mixture. The resulting CG potentials accurately reproduce the site-site radial distribution functions (RDFs), the volume fluctuations, and the pressure equations of state that are determined by all-atom (AA) models for the five mixtures. Furthermore, we demonstrate that these CG potentials provide similar accuracy for additional heptane-toluene mixtures that were not included in their parameterization. Surprisingly, the extended ensemble approach improves not only the transferability but also the accuracy of the calculated potentials. Additionally, we observe that the required pressure corrections strongly correlate with the intermolecular cohesion of the system-specific CG potentials. Moreover, this cohesion correlates with the relative "structure" within the corresponding mapped AA ensemble. Finally, the appendix demonstrates that the self-consistent pressure-matching approach corresponds to minimizing an appropriate relative entropy.
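
    The force-matching variational principle mentioned above amounts to a least-squares problem: expand the CG pair force in a basis and fit its coefficients so that the total CG forces reproduce the forces mapped from the all-atom ensemble. The one-dimensional toy below (piecewise-constant force basis, synthetic reference forces) only illustrates that linear-in-parameters fit; it is not the multiscale coarse-graining or extended-ensemble machinery used in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def reference_pair_force(r):
    """Stand-in for the 'atomistic' pair force used to synthesize mapped forces."""
    return 24.0 * (2.0 / r ** 13 - 1.0 / r ** 7)

n_beads, n_frames, n_bins = 20, 200, 30
edges = np.linspace(0.95, 3.0, n_bins + 1)          # distance bins for f(r)

rows, targets = [], []
for _ in range(n_frames):
    x = np.cumsum(rng.uniform(1.0, 2.0, n_beads))   # 1-D toy configuration
    dx = x[:, None] - x[None, :]                    # signed separations
    r = np.abs(dx)
    np.fill_diagonal(r, np.inf)                     # no self-interaction
    f_map = np.sum(np.sign(dx) * reference_pair_force(r), axis=1)  # mapped forces
    D = np.zeros((n_beads, n_bins))                 # design matrix
    for k in range(n_bins):
        in_bin = (r >= edges[k]) & (r < edges[k + 1])
        D[:, k] = np.sum(np.sign(dx) * in_bin, axis=1)
    rows.append(D)
    targets.append(f_map)

A, b = np.vstack(rows), np.concatenate(targets)
coeffs, *_ = np.linalg.lstsq(A, b, rcond=None)      # force-matching fit
print("fitted pair force in the first bins:", np.round(coeffs[:5], 2))
```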

  20. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems, Using a Bottom-Up Approach and Installer Survey - Second Edition

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, B.; Ardani, K.; Feldman, D.; Citron, R.; Margolis, R.; Zuboy, J.

    2013-10-01

    This report presents results from the second U.S. Department of Energy (DOE)-sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs -- often referred to as 'business process' or 'soft' costs -- for U.S. residential and commercial photovoltaic (PV) systems. In service to DOE's SunShot Initiative, annual expenditure and labor-hour-productivity data are analyzed to benchmark 2012 soft costs related to (1) customer acquisition and system design and (2) permitting, inspection, and interconnection (PII). We also include an in-depth analysis of costs related to financing, overhead, and profit. Soft costs are both a major challenge and a major opportunity for reducing PV system prices and stimulating SunShot-level PV deployment in the United States. The data and analysis in this series of benchmarking reports are a step toward the more detailed understanding of PV soft costs required to track and accelerate these price reductions.

  1. Toward improved prediction of the bedrock depth underneath hillslopes: Bayesian inference of the bottom-up control hypothesis using high-resolution topographic data

    Science.gov (United States)

    Gomes, Guilherme J. C.; Vrugt, Jasper A.; Vargas, Eurípedes A.

    2016-04-01

    The depth to bedrock controls a myriad of processes by influencing subsurface flow paths, erosion rates, soil moisture, and water uptake by plant roots. As hillslope interiors are very difficult and costly to illuminate and access, the topography of the bedrock surface is largely unknown. This essay is concerned with the prediction of spatial patterns in the depth to bedrock (DTB) using high-resolution topographic data, numerical modeling, and Bayesian analysis. Our DTB model builds on the bottom-up control on fresh-bedrock topography hypothesis of Rempe and Dietrich (2014) and includes a mass movement and bedrock-valley morphology term to extend the usefulness and general applicability of the model. We reconcile the DTB model with field observations using Bayesian analysis with the DREAM algorithm. We investigate explicitly the benefits of using spatially distributed parameter values to account implicitly, and in a relatively simple way, for rock mass heterogeneities that are very difficult, if not impossible, to characterize adequately in the field. We illustrate our method using an artificial data set of bedrock depth observations and then evaluate our DTB model with real-world data collected at the Papagaio river basin in Rio de Janeiro, Brazil. Our results demonstrate that the DTB model accurately predicts the observed bedrock depth data. The posterior mean DTB simulation is shown to be in good agreement with the measured data. The posterior prediction uncertainty of the DTB model can be propagated forward through hydromechanical models to derive probabilistic estimates of factors of safety.
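
    The reconciliation step described above, fitting model parameters to bedrock-depth observations with Bayesian inference, can be illustrated with a deliberately simplified stand-in: a linear depth model, Gaussian likelihood, weak priors, and a plain random-walk Metropolis sampler instead of DREAM. Model structure, priors and the synthetic data are assumptions made only for this sketch.

```python
import numpy as np

rng = np.random.default_rng(4)

# synthetic observations: depth to bedrock vs. a topographic attribute
attr = rng.uniform(0.0, 1.0, 40)
obs = 2.0 + 3.0 * attr + rng.normal(0.0, 0.4, 40)      # metres

def log_posterior(theta):
    a, b, log_sigma = theta
    sigma = np.exp(log_sigma)
    resid = obs - (a + b * attr)
    log_lik = -0.5 * np.sum((resid / sigma) ** 2) - obs.size * np.log(sigma)
    log_prior = -0.5 * np.sum((theta / 10.0) ** 2)      # weak Gaussian priors
    return log_lik + log_prior

# random-walk Metropolis as a simple stand-in for the DREAM sampler
theta = np.array([obs.mean(), 0.0, np.log(obs.std())])
lp = log_posterior(theta)
samples = []
for i in range(20000):
    proposal = theta + rng.normal(0.0, 0.05, 3)
    lp_prop = log_posterior(proposal)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = proposal, lp_prop
    if i >= 5000:                                       # discard burn-in
        samples.append(theta.copy())

post = np.array(samples)
print("posterior means (intercept, slope, sigma):",
      np.round([post[:, 0].mean(), post[:, 1].mean(), np.exp(post[:, 2]).mean()], 2))
```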

  2. Synthesis of a Cementitious Material Nanocement Using Bottom-Up Nanotechnology Concept: An Alternative Approach to Avoid CO2 Emission during Production of Cement

    Directory of Open Access Journals (Sweden)

    Byung Wan Jo

    2014-01-01

    Full Text Available There is an increasing need worldwide to develop smart and sustainable construction materials that generate minimal climate-changing gas during their production. Bottom-up nanotechnology has established itself as a promising alternative technique for the production of cementitious material. The present investigation deals with the chemical synthesis of a cementitious material using nanosilica, sodium aluminate, sodium hydroxide, and calcium nitrate as reacting phases. The characteristic properties of the chemically synthesized nanocement were verified by chemical composition analysis, setting-time measurement, particle-size distribution, fineness analysis, and SEM and XRD analyses. Finally, the performance of the nanocement was assessed through the fabrication and characterization of a nanocement-based mortar. Comparing the results with a commercially available cement product, it is demonstrated that the chemically synthesized nanocement not only shows better physical and mechanical performance but also brings several encouraging benefits to society, including the reduction of CO2 emissions and the development of sustainable construction materials. A plausible reaction scheme is proposed to explain the synthesis and the overall performance of the nanocement.

  3. Fabrication of electrodes for transport control and alignment at the micro- and nanoscale using bottom-up and top-down techniques

    Directory of Open Access Journals (Sweden)

    Darwin Rodríguez

    2014-12-01

    Full Text Available The continuing advance of applications in self-assembly, positioning, sensor and actuator devices, and in the controlled manipulation of micro- and nanostructures, has generated broad interest in developing methodologies to optimize the fabrication of devices for control and manipulation at the micro- and nanoscale. This project explores electrode fabrication techniques with the aim of finding an optimal and reproducible technique. The performance of each technique is compared, and cleaning and safety protocols are described. Three geometries are designed and implemented to mobilize and position iron micro- and nanoparticles in a natural-oil solution. Finally, electric fields are generated by electrophoresis in order to obtain the curve that describes the displacement of the particles as a function of the applied potential. These results have a strong impact on current bottom-up fabrication efforts (controlling location and mobility in electronic devices by means of fields). Fabricating planar geometries with electrodes opens up the possibility of integrating particle movement into the integrated circuits manufactured today.

  4. Bottom-up derivation of conservative and dissipative interactions for coarse-grained molecular liquids with the conditional reversible work method

    Energy Technology Data Exchange (ETDEWEB)

    Deichmann, Gregor; Marcon, Valentina; Vegt, Nico F. A. van der, E-mail: vandervegt@csi.tu-darmstadt.de [Center of Smart Interfaces, Technische Universität Darmstadt, Alarich-Weiss-Straße 10, 64287 Darmstadt (Germany)

    2014-12-14

    Molecular simulations of soft matter systems have been performed in recent years using a variety of systematically coarse-grained models. With these models, structural or thermodynamic properties can be quite accurately represented while the prediction of dynamic properties remains difficult, especially for multi-component systems. In this work, we use constraint molecular dynamics simulations for calculating dissipative pair forces which are used together with conditional reversible work (CRW) conservative forces in dissipative particle dynamics (DPD) simulations. The combined CRW-DPD approach aims to extend the representability of CRW models to dynamic properties and uses a bottom-up approach. Dissipative pair forces are derived from fluctuations of the direct atomistic forces between mapped groups. The conservative CRW potential is obtained from a similar series of constraint dynamics simulations and represents the reversible work performed to couple the direct atomistic interactions between the mapped atom groups. Neopentane, tetrachloromethane, cyclohexane, and n-hexane have been considered as model systems. These molecular liquids are simulated with atomistic molecular dynamics, coarse-grained molecular dynamics, and DPD. We find that the CRW-DPD models reproduce the liquid structure and diffusive dynamics of the liquid systems in reasonable agreement with the atomistic models when using single-site mapping schemes with beads containing five or six heavy atoms. For a two-site representation of n-hexane (3 carbons per bead), time scale separation can no longer be assumed and the DPD approach consequently fails to reproduce the atomistic dynamics.
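
    For orientation, the standard DPD pair force used in such simulations decomposes into conservative, dissipative and random parts, with the random amplitude tied to the friction by the fluctuation-dissipation relation sigma^2 = 2*gamma*kB*T and the dissipative weight equal to the square of the random one. The sketch below shows only that textbook decomposition with generic parameter values; in the CRW-DPD scheme described above the conservative part would come from the CRW potential and gamma from the constrained-dynamics force fluctuations, neither of which is reproduced here.

```python
import numpy as np

def dpd_pair_force(r_vec, v_vec, a=25.0, gamma=4.5, kT=1.0, rc=1.0, dt=0.01, rng=None):
    """Standard DPD pair force F = F_C + F_D + F_R for a single pair.
    r_vec, v_vec: relative position and velocity (3-vectors, reduced units)."""
    rng = np.random.default_rng() if rng is None else rng
    r = float(np.linalg.norm(r_vec))
    if r >= rc:
        return np.zeros(3)
    e = r_vec / r
    w_r = 1.0 - r / rc                       # random-force weight w_R(r)
    w_d = w_r ** 2                           # dissipative weight w_D = w_R^2
    sigma = np.sqrt(2.0 * gamma * kT)        # fluctuation-dissipation relation
    f_c = a * w_r * e                        # soft conservative repulsion
    f_d = -gamma * w_d * float(e @ v_vec) * e
    f_r = sigma * w_r * rng.normal() * e / np.sqrt(dt)
    return f_c + f_d + f_r

print(dpd_pair_force(np.array([0.5, 0.0, 0.0]), np.array([0.1, 0.0, 0.0]),
                     rng=np.random.default_rng(5)))
```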

  5. When top-down becomes bottom up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Directory of Open Access Journals (Sweden)

    Gabriela Orihuela

    Full Text Available Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto) where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana) offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviors are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment.

  6. When top-down becomes bottom up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Science.gov (United States)

    Orihuela, Gabriela; Terborgh, John; Ceballos, Natalia; Glander, Kenneth

    2014-01-01

    Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto) where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana) offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviors are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment. PMID:24743575

  7. Middle-Out Approaches to Reform of University Teaching and Learning: Champions striding between the top-down and bottom-up approaches

    Directory of Open Access Journals (Sweden)

    Rick Cummings

    2005-03-01

    Full Text Available In recent years, Australian universities have been driven by a diversity of external forces, including funding cuts, massification of higher education, and changing student demographics, to reform their relationship with students and improve teaching and learning, particularly for those studying off-campus or part-time. Many universities have responded to these forces either through formal strategic plans developed top-down by executive staff or through organic developments arising from staff in a bottom-up approach. By contrast, much of Murdoch University’s response has been led by a small number of staff who have middle management responsibilities and who have championed the reform of key university functions, largely in spite of current policy or accepted practice. This paper argues that the ‘middle-out’ strategy has both a basis in change management theory and practice, and a number of strengths, including low risk, low cost, and high sustainability. Three linked examples of middle-out change management in teaching and learning at Murdoch University are described and the outcomes analyzed to demonstrate the benefits and pitfalls of this approach.

  8. Bottom-up derivation of conservative and dissipative interactions for coarse-grained molecular liquids with the conditional reversible work method

    International Nuclear Information System (INIS)

    Molecular simulations of soft matter systems have been performed in recent years using a variety of systematically coarse-grained models. With these models, structural or thermodynamic properties can be quite accurately represented while the prediction of dynamic properties remains difficult, especially for multi-component systems. In this work, we use constraint molecular dynamics simulations for calculating dissipative pair forces which are used together with conditional reversible work (CRW) conservative forces in dissipative particle dynamics (DPD) simulations. The combined CRW-DPD approach aims to extend the representability of CRW models to dynamic properties and uses a bottom-up approach. Dissipative pair forces are derived from fluctuations of the direct atomistic forces between mapped groups. The conservative CRW potential is obtained from a similar series of constraint dynamics simulations and represents the reversible work performed to couple the direct atomistic interactions between the mapped atom groups. Neopentane, tetrachloromethane, cyclohexane, and n-hexane have been considered as model systems. These molecular liquids are simulated with atomistic molecular dynamics, coarse-grained molecular dynamics, and DPD. We find that the CRW-DPD models reproduce the liquid structure and diffusive dynamics of the liquid systems in reasonable agreement with the atomistic models when using single-site mapping schemes with beads containing five or six heavy atoms. For a two-site representation of n-hexane (3 carbons per bead), time scale separation can no longer be assumed and the DPD approach consequently fails to reproduce the atomistic dynamics

  9. Evolutionary Steps in the Emergence of Life Deduced from the Bottom-Up Approach and GADV Hypothesis (Top-Down Approach)

    Directory of Open Access Journals (Sweden)

    Kenji Ikehara

    2016-01-01

    Full Text Available It is no doubt quite difficult to solve the riddle of the origin of life. So, firstly, I would like to point out the kinds of obstacles there are in solving this riddle and how we should tackle these difficult problems, reviewing the studies that have been conducted so far. After that, I will propose that the consecutive evolutionary steps in a timeline can be rationally deduced by using a common event as a juncture, which is obtained by two counter-directional approaches: one is the bottom-up approach through which many researchers have studied the origin of life, and the other is the top-down approach, through which I established the [GADV]-protein world hypothesis or GADV hypothesis on the origin of life starting from a study on the formation of entirely new genes in extant microorganisms. Last, I will describe the probable evolutionary process from the formation of Earth to the emergence of life, which was deduced by using a common event—the establishment of the first genetic code encoding [GADV]-amino acids—as a juncture for the results obtained from the two approaches.

  10. Employment impacts of EU biofuels policy. Combining bottom-up technology information and sectoral market simulations in an input-output framework

    International Nuclear Information System (INIS)

    This paper analyses the employment consequences of policies aimed to support biofuels in the European Union. The promotion of biofuel use has been advocated as a means to promote the sustainable use of natural resources and to reduce greenhouse gas emissions originating from transport activities on the one hand, and to reduce dependence on imported oil and thereby increase security of the European energy supply on the other hand. The employment impacts of increasing biofuels shares are calculated by taking into account a set of elements comprising the demand for capital goods required to produce biofuels, the additional demand for agricultural feedstock, higher fuel prices or reduced household budget in the case of price subsidisation, price effects ensuing from a hypothetical world oil price reduction linked to substitution in the EU market, and price impacts on agro-food commodities. The calculations refer to scenarios for the year 2020 targets as set out by the recent Renewable Energy Roadmap. Employment effects are assessed in an input-output framework taking into account bottom-up technology information to specify biofuels activities and linked to partial equilibrium models for the agricultural and energy sectors. The simulations suggest that biofuels targets on the order of 10-15% could be achieved without adverse net employment effects. (author)
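
    The employment mechanics behind such an input-output assessment can be illustrated with the standard Leontief quantity model: a vector of final-demand changes (extra demand for feedstock and biofuel-plant capital goods, partly offset by lower spending elsewhere) is propagated through the inter-industry matrix, and sectoral employment coefficients translate the induced output changes into jobs. The three-sector matrix and all coefficients below are invented for illustration, not EU data.

```python
import numpy as np

# toy 3-sector economy: agriculture, fuels, rest of the economy
A = np.array([[0.10, 0.20, 0.02],       # technical coefficients
              [0.05, 0.10, 0.05],       # (inputs per unit of output)
              [0.20, 0.30, 0.25]])
jobs_per_output = np.array([12.0, 3.0, 8.0])    # jobs per million EUR of output

# hypothetical biofuel-target demand shock (million EUR)
delta_final_demand = np.array([+400.0, +250.0, -300.0])

leontief_inverse = np.linalg.inv(np.eye(3) - A)
delta_output = leontief_inverse @ delta_final_demand
delta_jobs = jobs_per_output * delta_output

print("output change by sector:", np.round(delta_output, 1))
print("net employment effect:", int(round(delta_jobs.sum())), "jobs")
```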

  11. Structural and optical nanoscale analysis of GaN core-shell microrod arrays fabricated by combined top-down and bottom-up process on Si(111)

    Science.gov (United States)

    Müller, Marcus; Schmidt, Gordon; Metzner, Sebastian; Veit, Peter; Bertram, Frank; Krylyuk, Sergiy; Debnath, Ratan; Ha, Jong-Yoon; Wen, Baomei; Blanchard, Paul; Motayed, Abhishek; King, Matthew R.; Davydov, Albert V.; Christen, Jürgen

    2016-05-01

    Large arrays of GaN core-shell microrods were fabricated on Si(111) substrates applying a combined bottom-up and top-down approach which includes inductively coupled plasma (ICP) etching of patterned GaN films grown by metal-organic vapor phase epitaxy (MOVPE) and selective overgrowth of the obtained GaN/Si pillars using hydride vapor phase epitaxy (HVPE). The structural and optical properties of individual core-shell microrods have been studied with nanometer-scale spatial resolution using low-temperature cathodoluminescence spectroscopy (CL) directly performed in a scanning electron microscope (SEM) and in a scanning transmission electron microscope (STEM). SEM, TEM, and CL measurements reveal the formation of distinct growth domains during the HVPE overgrowth. A high free-carrier concentration observed in the non-polar {1-100} HVPE shells is assigned to in-diffusion of silicon atoms from the substrate. In contrast, the HVPE shells directly grown on top of the c-plane of the GaN pillars reveal a lower free-carrier concentration.

  12. Fused methods for visual saliency estimation

    Science.gov (United States)

    Danko, Amanda S.; Lyu, Siwei

    2015-02-01

    In this work, we present a new model of visual saliency by combining results from existing methods, improving upon their performance and accuracy. By fusing pre-attentive and context-aware methods, we highlight the abilities of state-of-the-art models while compensating for their deficiencies. We put this theory to the test in a series of experiments, comparatively evaluating the visual saliency maps and employing them for content-based image retrieval and thumbnail generation. We find that on average our model yields definitive improvements in recall and f-measure metrics with comparable precisions. In addition, we find that all image searches using our fused method return more correct images and additionally rank them higher than the searches using the original methods alone.
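
    A minimal sketch of the fusion idea: normalize each input saliency map to a common range and combine them pixel-wise. Simple weighted averaging is assumed here for illustration; it is not necessarily the combination rule used by the authors.

```python
import numpy as np

def normalize(saliency_map):
    """Scale a saliency map to [0, 1]; constant maps become all zeros."""
    lo, hi = float(saliency_map.min()), float(saliency_map.max())
    if hi == lo:
        return np.zeros_like(saliency_map, dtype=float)
    return (saliency_map - lo) / (hi - lo)

def fuse(pre_attentive, context_aware, w=0.5):
    """Pixel-wise weighted combination of two normalized saliency maps."""
    return normalize(w * normalize(pre_attentive) + (1.0 - w) * normalize(context_aware))

rng = np.random.default_rng(6)
m1 = rng.random((240, 320))      # stand-in for a pre-attentive map
m2 = rng.random((240, 320))      # stand-in for a context-aware map
fused = fuse(m1, m2)
print(fused.shape, round(float(fused.min()), 2), round(float(fused.max()), 2))
```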

  13. Do patients with schizophrenia exhibit aberrant salience?

    OpenAIRE

    Roiser, J. P.; Stephan, K E; den Ouden, H. E. M.; Barnes, T. R. E.; Friston, K.J.; Joyce, E. M.

    2009-01-01

    BACKGROUND: It has been suggested that some psychotic symptoms reflect ‘aberrant salience’, related to dysfunctional reward learning. To test this hypothesis we investigated whether patients with schizophrenia showed impaired learning of task-relevant stimulus-reinforcement associations in the presence of distracting task-irrelevant cues. METHODS: We tested 20 medicated patients with schizophrenia and 17 controls on a reaction time game, the Salience Attribution Test. In this game, ...

  14. Olfaction spontaneously highlights visual saliency map

    OpenAIRE

    Chen, Kepu; Zhou, Bin; Chen, Shan; He, Sheng; Zhou, Wen

    2013-01-01

    Attention is intrinsic to our perceptual representations of sensory inputs. Best characterized in the visual domain, it is typically depicted as a spotlight moving over a saliency map that topographically encodes strengths of visual features and feedback modulations over the visual scene. By introducing smells to two well-established attentional paradigms, the dot-probe and the visual-search paradigms, we find that a smell reflexively directs attention to the congruent visual image and facili...

  15. Top-down/bottom-up description of electricity sector for Switzerland using the GEM-E3 computable general equilibrium model

    International Nuclear Information System (INIS)

    Participation of the Paul Scherrer Institute (PSI) in the advancement and extension of the multi-region, Computable General Equilibrium (CGE) model GEM-E3 (CES/KUL, 2002) focused primarily on two top-level facets: a) extension of the model database and model calibration, particularly as related to the second component of this study, which is b) advancement of the dynamics of innovation and investment, primarily through the incorporation of Exogenous Technical Learning (ETL) into the Bottom-Up (BU, technology-based) part of the dynamic upgrade; this latter activity also included the completion of the dynamic coupling of the BU description of the electricity sector with the 'Top-Down' (TD, econometric) description of the economy inherent to the GEM-E3 CGE model. The results of this two-component study are described in two parts that have been combined in this single summary report: Part I describes the methodology and gives illustrative results from the BUTD integration, as well as describing the approach to and giving preliminary results from incorporating an ETL description into the BU component of the overall model; Part II reports on the calibration component of the task in terms of: a) formulating a BU technology database for Switzerland based on previous work; b) incorporating that database into the GEM-E3 model; and c) calibrating the BU database with the TD database embodied in the (Swiss) Social Accounting Matrix (SAM). The BUTD coupling along with the ETL incorporation described in Part I represent the major effort embodied in this investigation, but this effort could not be completed without the calibration preamble reported herein as Part II. A brief summary of the scope of each of these key study components is given. (author)

  16. A comparison of top-down and bottom-up carbon dioxide fluxes in the UK using a multi-platform measurement network.

    Science.gov (United States)

    White, Emily; Rigby, Matt; O'Doherty, Simon; Stavert, Ann; Lunt, Mark; Nemitz, Eiko; Helfter, Carole; Allen, Grant; Pitt, Joe; Bauguitte, Stéphane; Levy, Pete; van Oijen, Marcel; Williams, Mat; Smallman, Luke; Palmer, Paul

    2016-04-01

    Having a comprehensive understanding, on a countrywide scale, of both biogenic and anthropogenic CO2 emissions is essential for knowing how best to reduce anthropogenic emissions and for understanding how the terrestrial biosphere is responding to global fossil fuel emissions. Whilst anthropogenic CO2 flux estimates are fairly well constrained, fluxes from biogenic sources are not. This work will help to verify existing anthropogenic emissions inventories and give a better understanding of biosphere-atmosphere CO2 exchange. Using an innovative top-down inversion scheme (a hierarchical Bayesian Markov Chain Monte Carlo approach with reversible-jump "trans-dimensional" basis function selection), we aim to find emissions estimates for biogenic and anthropogenic sources simultaneously. Our approach allows flux uncertainties to be derived more comprehensively than previous methods, and allows the resolved spatial scales in the solution to be determined using the data. We use atmospheric CO2 mole fraction data from the UK Deriving Emissions related to Climate Change (DECC) and Greenhouse gAs UK and Global Emissions (GAUGE) projects. The network comprises 6 tall tower sites, flight campaigns and a ferry transect along the east coast, and enables us to derive high-resolution monthly flux estimates across the UK and Ireland for the period 2013-2015. We have derived UK total fluxes of 675 ± 78 Tg/yr during January 2014 (seasonal maximum) and 23 ± 96 Tg/yr during May 2014 (seasonal minimum). Our disaggregated anthropogenic and biogenic flux estimates are compared to a new high-resolution, time-resolved anthropogenic inventory that will underpin future UNFCCC reports by the UK, and to the DALEC carbon cycle model. This allows us to identify where significant differences exist between these "bottom-up" and "top-down" flux estimates and suggest reasons for discrepancies. We will highlight the strengths and limitations of the UK's CO2 emissions verification infrastructure at
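
    The core of any such top-down estimate is a Bayesian update of prior fluxes using atmospheric observations through a transport (sensitivity) matrix. The analytical Gaussian form below is a simplified stand-in for the hierarchical, trans-dimensional MCMC scheme described above; the toy transport matrix, covariances and "true" fluxes are placeholders.

```python
import numpy as np

def gaussian_inversion(x_prior, B, H, y, R):
    """Analytical Bayesian update of prior fluxes x_prior (covariance B)
    given observations y = H x + noise (noise covariance R)."""
    S = H @ B @ H.T + R
    K = B @ H.T @ np.linalg.inv(S)                # gain matrix
    x_post = x_prior + K @ (y - H @ x_prior)
    P_post = B - K @ H @ B
    return x_post, P_post

rng = np.random.default_rng(7)
n_flux, n_obs = 4, 30                             # e.g. four flux regions
x_true = np.array([10.0, -5.0, 2.0, 0.5])         # "true" fluxes (toy units)
H = rng.uniform(0.0, 1.0, (n_obs, n_flux))        # toy transport sensitivities
y = H @ x_true + rng.normal(0.0, 0.5, n_obs)      # synthetic mole-fraction data

x_post, P_post = gaussian_inversion(np.zeros(n_flux), np.eye(n_flux) * 25.0,
                                    H, y, np.eye(n_obs) * 0.25)
print("posterior fluxes:", np.round(x_post, 1))
print("posterior sd:   ", np.round(np.sqrt(np.diag(P_post)), 2))
```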

  17. Top-down/bottom-up description of electricity sector for Switzerland using the GEM-E3 computable general equilibrium model

    Energy Technology Data Exchange (ETDEWEB)

    Krakowski, R. A

    2006-06-15

    Participation of the Paul Scherrer Institute (PSI) in the advancement and extension of the multi-region, Computable General Equilibrium (CGE) model GEM-E3 (CES/KUL, 2002) focused primarily on two top-level facets: a) extension of the model database and model calibration, particularly as related to the second component of this study, which is b) advancement of the dynamics of innovation and investment, primarily through the incorporation of Exogenous Technical Learning (ETL) into the Bottom-Up (BU, technology-based) part of the dynamic upgrade; this latter activity also included the completion of the dynamic coupling of the BU description of the electricity sector with the 'Top-Down' (TD, econometric) description of the economy inherent to the GEM-E3 CGE model. The results of this two-component study are described in two parts that have been combined in this single summary report: Part I describes the methodology and gives illustrative results from the BUTD integration, as well as describing the approach to and giving preliminary results from incorporating an ETL description into the BU component of the overall model; Part II reports on the calibration component of the task in terms of: a) formulating a BU technology database for Switzerland based on previous work; b) incorporating that database into the GEM-E3 model; and c) calibrating the BU database with the TD database embodied in the (Swiss) Social Accounting Matrix (SAM). The BUTD coupling along with the ETL incorporation described in Part I represent the major effort embodied in this investigation, but this effort could not be completed without the calibration preamble reported herein as Part II. A brief summary of the scope of each of these key study components is given. (author)

  18. Effects of bottom-up and top-down intervention principles in emergent literacy in children at risk of developmental dyslexia: a longitudinal study.

    Science.gov (United States)

    Helland, Turid; Tjus, Tomas; Hovden, Marit; Ofte, Sonja; Heimann, Mikael

    2011-01-01

    This longitudinal study focused on the effects of two different principles of intervention in children at risk of developing dyslexia from 5 to 8 years old. The children were selected on the basis of a background questionnaire given to parents and preschool teachers, with cognitive and functional magnetic resonance imaging results substantiating group differences in neuropsychological processes associated with phonology, orthography, and phoneme-grapheme correspondence (i.e., alphabetic principle). The two principles of intervention were bottom-up (BU), "from sound to meaning", and top-down (TD), "from meaning to sound." Thus, four subgroups were established: risk/BU, risk/TD, control/BU, and control/TD. Computer-based training took place for 2 months every spring, and cognitive assessments were performed each fall of the project period. Measures of preliteracy skills for reading and spelling were phonological awareness, working memory, verbal learning, and letter knowledge. Literacy skills were assessed by word reading and spelling. At project end the control group scored significantly above age norm, whereas the risk group scored within the norm. In the at-risk group, training based on the BU principle had the strongest effects on phonological awareness and working memory scores, whereas training based on the TD principle had the strongest effects on verbal learning, letter knowledge, and literacy scores. It was concluded that appropriate, specific, data-based intervention starting in preschool can mitigate literacy impairment and that interventions should contain BU training for preliteracy skills and TD training for literacy training. PMID:21383104

  19. Reducing energy consumption and CO2 emissions by energy efficiency measures and international trading: A bottom-up modeling for the U.S. iron and steel sector

    International Nuclear Information System (INIS)

    Highlights: • Use ISEEM to evaluate energy and emission reduction in U.S. Iron and Steel sector. • ISEEM is a new bottom-up optimization model for industry sector energy planning. • Energy and emission reduction includes efficiency measures and international trading. • International trading includes commodity and carbon among U.S., China and India. • Project annual energy use, CO2 emissions, production, and costs from 2010 to 2050. - Abstract: Using the ISEEM modeling framework, we analyzed the roles of energy efficiency measures, steel commodity and international carbon trading in achieving specific CO2 emission reduction targets in the U.S. iron and steel sector from 2010 to 2050. We modeled how steel demand is balanced under three alternative emission reduction scenarios designed to include national energy efficiency measures, commodity trading, and international carbon trading as key instruments to meet a particular emission restriction target in the U.S. iron and steel sector, and how production, process structure, energy supply, and system costs change with those scenarios. The results advance our understanding of the long-term impacts of different energy policy options designed to reduce energy consumption and CO2 emissions for the U.S. iron and steel sector, and generate insight into the policy implications for the sector’s environmentally and economically sustainable development. The alternative scenarios associated with a 20% emission-reduction target are projected to result in approximately 11–19% annual energy reduction in the medium term (i.e., 2030) and 9–20% annual energy reduction in the long term (i.e., 2050) compared to the Base scenario
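
    A toy version of the kind of choice a bottom-up optimization model makes: select production levels of competing steel-making routes to meet demand at minimum cost subject to an emission cap and a resource limit. The routes, costs and emission factors below are invented for illustration and the problem is solved with scipy's linprog; the actual ISEEM formulation is far richer (process structure, trading, multi-period dynamics).

```python
import numpy as np
from scipy.optimize import linprog

# toy routes: blast furnace (BF-BOF), scrap-EAF, DRI-EAF (numbers invented)
cost = np.array([400.0, 350.0, 450.0])       # $ per tonne of steel
co2 = np.array([1.8, 0.4, 1.1])              # t CO2 per tonne of steel
demand = 100.0                                # Mt of steel required
co2_cap = 120.0                               # Mt CO2 allowed
scrap_max = 40.0                              # Mt, limited scrap availability

res = linprog(
    c=cost,
    A_ub=np.vstack([co2, [0.0, 1.0, 0.0]]),   # emission cap and scrap limit
    b_ub=[co2_cap, scrap_max],
    A_eq=[np.ones(3)],
    b_eq=[demand],                            # meet total demand exactly
    bounds=[(0, None)] * 3,
    method="highs",
)
print("production mix (Mt):", np.round(res.x, 1))
print("total cost, emissions:", round(res.fun), round(float(co2 @ res.x), 1))
```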

  20. The control of automatic imitation based on bottom-up and top-down cues to animacy: insights from brain and behavior.

    Science.gov (United States)

    Klapper, André; Ramsey, Richard; Wigboldus, Daniël; Cross, Emily S

    2014-11-01

    Humans automatically imitate other people's actions during social interactions, building rapport and social closeness in the process. Although the behavioral consequences and neural correlates of imitation have been studied extensively, little is known about the neural mechanisms that control imitative tendencies. For example, the degree to which an agent is perceived as human-like influences automatic imitation, but it is not known how perception of animacy influences brain circuits that control imitation. In the current fMRI study, we examined how the perception and belief of animacy influence the control of automatic imitation. Using an imitation-inhibition paradigm that involves suppressing the tendency to imitate an observed action, we manipulated both bottom-up (visual input) and top-down (belief) cues to animacy. Results show divergent patterns of behavioral and neural responses. Behavioral analyses show that automatic imitation is equivalent when one or both cues to animacy are present but reduces when both are absent. By contrast, right TPJ showed sensitivity to the presence of both animacy cues. Thus, we demonstrate that right TPJ is biologically tuned to control imitative tendencies when the observed agent both looks like and is believed to be human. The results suggest that right TPJ may be involved in a specialized capacity to control automatic imitation of human agents, rather than a universal process of conflict management, which would be more consistent with generalist theories of imitative control. Evidence for specialized neural circuitry that "controls" imitation offers new insight into developmental disorders that involve atypical processing of social information, such as autism spectrum disorders. PMID:24742157

  1. The impact of napping on memory for future-relevant stimuli: Prioritization among multiple salience cues.

    Science.gov (United States)

    Bennion, Kelly A; Payne, Jessica D; Kensinger, Elizabeth A

    2016-06-01

    Prior research has demonstrated that sleep enhances memory for future-relevant information, including memory for information that is salient due to emotion, reward, or knowledge of a later memory test. Although sleep has been shown to prioritize information with any of these characteristics, the present study investigates the novel question of how sleep prioritizes information when multiple salience cues exist. Participants encoded scenes that were future-relevant based on emotion (emotional vs. neutral), reward (rewarded vs. unrewarded), and instructed learning (intentionally vs. incidentally encoded), preceding a delay consisting of a nap, an equivalent time period spent awake, or a nap followed by wakefulness (to control for effects of interference). Recognition testing revealed that when multiple dimensions of future relevance co-occur, sleep prioritizes top-down, goal-directed cues (instructed learning, and to a lesser degree, reward) over bottom-up, stimulus-driven characteristics (emotion). Further, results showed that these factors interact; the effect of a nap on intentionally encoded information was especially strong for neutral (relative to emotional) information, suggesting that once one cue for future relevance is present, there are diminishing returns with additional cues. Sleep may binarize information based on whether it is future-relevant or not, preferentially consolidating memory for the former category. Potential neural mechanisms underlying these selective effects and the implications of this research for educational and vocational domains are discussed. PMID:27214500

  2. Adaptive Metric Learning for Saliency Detection.

    Science.gov (United States)

    Li, Shuang; Lu, Huchuan; Lin, Zhe; Shen, Xiaohui; Price, Brian

    2015-11-01

    In this paper, we propose a novel adaptive metric learning algorithm (AML) for visual saliency detection. A key observation is that the saliency of a superpixel can be estimated by the distance from the most certain foreground and background seeds. Instead of measuring distance in Euclidean space, we present a learning method based on two complementary Mahalanobis distance metrics: 1) generic metric learning (GML) and 2) specific metric learning (SML). GML aims at the global distribution of the whole training set, while SML considers the specific structure of a single image. Considering that multiple similarity measures from different views may enhance the relevant information and alleviate the irrelevant one, we fuse GML and SML and experimentally find that the combined result works well. Different from most existing methods, which are directly based on low-level features, we devise a superpixelwise Fisher vector coding approach to better distinguish salient objects from the background. We also propose an accurate seed selection mechanism and exploit contextual and multiscale information when constructing the final saliency map. Experimental results on various image sets show that the proposed AML performs favorably against state-of-the-art methods. PMID:26054067
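
    A minimal sketch of the underlying idea, scoring each superpixel by Mahalanobis distances to foreground and background seeds, is given below. The paper's learned GML/SML metrics, Fisher vector coding, and seed-selection mechanism are not reproduced; a single covariance-based metric, synthetic features, and hand-picked seed indices stand in for them.

```python
# Minimal sketch: Mahalanobis-distance saliency from foreground/background seeds.
# The learned metric of the paper (GML/SML fusion) is approximated here by the
# inverse covariance of the features; data and seed indices are synthetic.
import numpy as np

def mahalanobis(x, mean, inv_cov):
    d = x - mean
    return np.sqrt(d @ inv_cov @ d)

def saliency_scores(features, fg_idx, bg_idx):
    """features: (n_superpixels, d) array; fg_idx/bg_idx: seed indices."""
    inv_cov = np.linalg.pinv(np.cov(features.T))
    fg_mean = features[fg_idx].mean(axis=0)
    bg_mean = features[bg_idx].mean(axis=0)
    # Salient superpixels are far from background seeds and close to foreground ones.
    scores = np.array([mahalanobis(f, bg_mean, inv_cov) -
                       mahalanobis(f, fg_mean, inv_cov) for f in features])
    # Normalize to [0, 1] for use as a saliency map.
    return (scores - scores.min()) / (np.ptp(scores) + 1e-12)

rng = np.random.default_rng(0)
feats = np.vstack([rng.normal(0, 1, (40, 5)),      # background-like superpixels
                   rng.normal(3, 1, (10, 5))])     # object-like superpixels
sal = saliency_scores(feats, fg_idx=np.arange(45, 50), bg_idx=np.arange(0, 5))
print(sal.round(2))
```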

  3. Energetic Bottom-up in the Low Countries. Energy transition from the bottom-up. On Happy energetic citizens, Solar and wind cooperatives, New utility companies; Energieke BottomUp in Lage Landen. De Energietransitie van Onderaf. Over Vrolijke energieke burgers, Zon- en windcooperaties, Nieuwe nuts

    Energy Technology Data Exchange (ETDEWEB)

    Schwencke, A.M.

    2012-08-15

    This essay is an outline of the 'energy transition from the bottom-up'. Leading questions are: (1) what are the actual initiatives; (2) who is involved; (3) how does one work (organization, business models); (4) why are people active in this field; (5) what good is it; (6) what is the aim? The essay is based on public information sources (websites, blogs, publications) and interviews with people in the field.

  4. DISC: Deep Image Saliency Computing via Progressive Representation Learning.

    Science.gov (United States)

    Chen, Tianshui; Lin, Liang; Liu, Lingbo; Luo, Xiaonan; Li, Xuelong

    2016-06-01

    Salient object detection increasingly receives attention as an important component or step in several pattern recognition and image processing tasks. Although a variety of powerful saliency models have been proposed, they usually involve heavy feature (or model) engineering based on priors (or assumptions) about the properties of objects and backgrounds. Inspired by the effectiveness of recently developed feature learning, we provide a novel deep image saliency computing (DISC) framework for fine-grained image saliency computing. In particular, we model the image saliency from both coarse- and fine-level observations, and utilize the deep convolutional neural network (CNN) to learn the saliency representation in a progressive manner. Specifically, our saliency model is built upon two stacked CNNs. The first CNN generates a coarse-level saliency map by taking the overall image as the input, roughly identifying saliency regions in the global context. Furthermore, we integrate superpixel-based local context information in the first CNN to refine the coarse-level saliency map. Guided by the coarse saliency map, the second CNN focuses on the local context to produce a fine-grained and accurate saliency map while preserving object details. For a testing image, the two CNNs collaboratively conduct the saliency computing in one shot. Our DISC framework is capable of uniformly highlighting the objects of interest from a complex background while preserving object details well. Extensive experiments on several standard benchmarks suggest that DISC outperforms other state-of-the-art methods and it also generalizes well across data sets without additional training. The executable version of DISC is available online: http://vision.sysu.edu.cn/projects/DISC. PMID:26742147

  5. A visual saliency based method for vehicle logo detection

    Science.gov (United States)

    Zhang, Fan; Shen, Yiping; Chang, Hongxing

    2013-07-01

    This paper presents a novel method based on visual saliency and template matching for detecting vehicle logos in images captured by cross-road cameras. To detect the logo, the method first generates a saliency map based on a modified version of Itti's saliency model, then obtains regions of interest (ROI) by thresholding the saliency map, and finally performs edge-based template matching to locate the logo. Experiments on more than 2400 images validate both the high accuracy and efficiency of the proposed method and demonstrate that it is suitable for real-time application.
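
    A rough sketch of the described pipeline (saliency map, thresholding into regions of interest, then template matching) is shown below. It is not the authors' implementation: the saliency map, image, template, and threshold are placeholders, and plain normalized cross-correlation stands in for the edge-based matching.

```python
# Sketch of: saliency map -> threshold -> ROIs -> template matching.
# All inputs (saliency map, image, template, threshold) are placeholders.
import numpy as np

def ncc(patch, template):
    """Normalized cross-correlation between two equally sized arrays."""
    p = patch - patch.mean()
    t = template - template.mean()
    denom = np.linalg.norm(p) * np.linalg.norm(t) + 1e-12
    return float((p * t).sum() / denom)

def detect_logo(image, saliency, template, thresh=0.6):
    th, tw = template.shape
    ys, xs = np.nonzero(saliency > thresh)          # candidate ROI pixels
    best_score, best_pos = -1.0, None
    for y, x in zip(ys, xs):
        if y + th > image.shape[0] or x + tw > image.shape[1]:
            continue
        score = ncc(image[y:y + th, x:x + tw], template)
        if score > best_score:
            best_score, best_pos = score, (y, x)
    return best_pos, best_score

rng = np.random.default_rng(1)
img = rng.random((120, 160))
tmpl = rng.random((16, 24))
img[40:56, 60:84] = tmpl                            # embed the "logo"
sal = np.zeros_like(img); sal[30:70, 50:100] = 1.0  # fake saliency around it
print(detect_logo(img, sal, tmpl))
```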

  6. Salience Effects in the North-West of England

    OpenAIRE

    Sandra Jansen

    2014-01-01

    The question of how we can define salience, what properties it includes and how we can quantify it have been discussed widely over the past thirty years but we still have more questions than answers about this phenomenon, e. g. not only how salience arises, but also how we can define it. However, despite the lack of a clear definition, salience is often taken into account as an explanatory factor in language change. The scientific discourse on salience has in most cases revolved around phonet...

  7. Greenhouse Gas Emission Accounting. Preliminary study as input to a joint Int. IPCC Expert Meeting / CKO-CCB Workshop on Comparison of Top-down versus Bottom-up Emission Estimates.

    NARCIS (Netherlands)

    Amstel, van A.; Kroeze, C.; Janssen, L.J.H.M.; Olivier, J.G.J.; Wal, van der J.T.

    1997-01-01

    Bottom-up data for carbon dioxide, methane and nitrous oxide from the official national inventories (National Communications) were compared with data from EDGAR (Emission Database for Global Atmospheric Research) and top-down emission estimates, based on the results of dispersion and climate models

  8. Greenhouse Gas Emission Accounting: preliminary study as input to a joint International IPCC Expert Meeting/CKO-CCB Workshop on Comparison of Top-down versus Bottom-up Emission Estimates

    NARCIS (Netherlands)

    Amstel AR van; Kroeze C; Janssen LHJM; Olivier JGJ; Wal JT van der; LLO; LUW/WIMEK

    1997-01-01

    Bottom-up data for carbon dioxide, methane and nitrous oxide from the official national inventories (National Communications) were compared with data from EDGAR (Emission Database for Global Atmospheric Research) and top-down emission estimates, based on the results of dispersion and climate models

  9. Predicting Subjective Affective Salience from Cortical Responses to Invisible Object Stimuli.

    Science.gov (United States)

    Schmack, Katharina; Burk, Julia; Haynes, John-Dylan; Sterzer, Philipp

    2016-08-01

    The affective value of a stimulus substantially influences its potency to gain access to awareness. Here, we sought to elucidate the neural mechanisms underlying such affective salience in a combined behavioral and fMRI experiment. Healthy individuals with varying degrees of spider phobia were presented with pictures of spiders and flowers suppressed from view by continuous flash suppression. Applying multivoxel pattern analysis, we found that the average time that spider stimuli took relative to flowers to gain access to awareness in each participant could be decoded from fMRI signals evoked by suppressed spider versus flower stimuli in occipitotemporal and orbitofrontal cortex. Our results indicate that neural signals during unconscious processing of complex visual information in orbitofrontal and ventral visual areas predict access to awareness of this information, suggesting a crucial role for these higher-level cortical regions in mediating affective salience. PMID:26232987

  10. A Statistical Method for Estimating Missing GHG Emissions in Bottom-Up Inventories: The Case of Fossil Fuel Combustion in Industry in the Bogota Region, Colombia

    Science.gov (United States)

    Jimenez-Pizarro, R.; Rojas, A. M.; Pulido-Guio, A. D.

    2012-12-01

    The development of environmentally, socially and financially suitable greenhouse gas (GHG) mitigation portfolios requires detailed disaggregation of emissions by activity sector, preferably at the regional level. Bottom-up (BU) emission inventories are intrinsically disaggregated but, although detailed, frequently incomplete. Missing and erroneous activity data are rather common in emission inventories of GHG, criteria and toxic pollutants, even in developed countries. The fraction of missing and erroneous data can be rather large in developing country inventories. In addition, the cost and time for obtaining or correcting this information can be prohibitive or can delay the inventory development. This is particularly true for regional BU inventories in the developing world. Moreover, a rather common practice is to disregard or to arbitrarily impute low default activity or emission values to missing data, which typically leads to significant underestimation of the total emissions. Our investigation focuses on GHG emissions by fossil fuel combustion in industry in the Bogota Region, composed of Bogota and its adjacent, semi-rural area of influence, the Province of Cundinamarca. We found that the BU inventories for this sub-category substantially underestimate emissions when compared to top-down (TD) estimations based on sub-sector specific national fuel consumption data and regional energy intensities. Although both BU inventories have a substantial number of missing and evidently erroneous entries, i.e. information on fuel consumption per combustion unit per company, the validated energy use and emission data display clear and smooth frequency distributions, which can be adequately fitted to bimodal log-normal distributions. This is not unexpected as industrial plant sizes are typically log-normally distributed. Moreover, our statistical tests suggest that industrial sub-sectors, as classified by the International Standard Industrial Classification (ISIC
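
    The statistical idea of fitting a bimodal log-normal distribution to the validated activity data and using it to impute missing entries can be sketched with a two-component Gaussian mixture in log space, as below. The data are synthetic and the procedure is a simplification of the paper's method.

```python
# Sketch: fit a bimodal log-normal mixture to validated fuel-use data and
# impute missing entries by sampling from the fitted distribution.
# Synthetic data; a simplification of the paper's statistical approach.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(42)
# Validated (non-missing) fuel consumption per plant, in TJ/yr: two size classes.
validated = np.concatenate([rng.lognormal(mean=1.0, sigma=0.5, size=300),
                            rng.lognormal(mean=4.0, sigma=0.7, size=100)])

# Fit a 2-component Gaussian mixture in log space (i.e., a bimodal log-normal).
gmm = GaussianMixture(n_components=2, random_state=0)
gmm.fit(np.log(validated).reshape(-1, 1))

n_missing = 50                                   # plants with no reported data
log_draws, _ = gmm.sample(n_missing)
imputed = np.exp(log_draws.ravel())

total = validated.sum() + imputed.sum()
print(f"validated total: {validated.sum():.0f} TJ, "
      f"imputed total: {imputed.sum():.0f} TJ, grand total: {total:.0f} TJ")
```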

  11. A bottom-up, vulnerability-based framework for identifying the adaptive capacity of water resources systems in a changing climate

    Science.gov (United States)

    Culley, Sam; Noble, Stephanie; Timbs, Michael; Yates, Adam; Giuliani, Matteo; Castelletti, Andrea; Maier, Holger; Westra, Seth

    2015-04-01

    Water resource system infrastructure and operating policies are commonly designed on the assumption that the statistics of future rainfall, temperature and other hydrometeorological variables are equal to those of the historical record. There is now substantial evidence demonstrating that this assumption is no longer valid, and that climate change will significantly impact water resources systems worldwide. Under different climatic inputs, the performance of these systems may degrade to a point where they become unable to meet the primary objectives for which they were built. In such a changing context, using existing infrastructure more efficiently - rather than planning additional infrastructure - becomes key to restoring the system's performance to acceptable levels and minimizing financial investments and associated risk. The traditional top-down approach for assessing climate change impacts relies on the use of a cascade of models from the global to the local scale. However, it is often difficult to utilize this top-down approach in a decision-making procedure, as there is disparity amongst various climate projections, arising from incomplete scientific understanding of the complicated processes and feedbacks within the climate system, and model limitations in reproducing those relationships. In contrast with this top-down approach, this study contributes a framework to identify the adaptive capacity of water resource systems under changing climatic conditions by adopting a bottom-up, vulnerability-based approach. The performance of the current system management is first assessed for a comprehensive range of climatic conditions, which are independent of climate model forecasts. The adaptive capacity of the system is then estimated by re-evaluating the performance of a set of adaptive operating policies, which are optimized for each climatic condition under which the system is simulated. The proposed framework reverses the perspective by identifying water system

  12. Benefits of China's efforts in gaseous pollutant control indicated by the bottom-up emissions and satellite observations 2000-2014

    Science.gov (United States)

    Xia, Yinmin; Zhao, Yu; Nielsen, Chris P.

    2016-07-01

    To evaluate the effectiveness of national air pollution control policies, the emissions of SO2, NOX, CO and CO2 in China are estimated using bottom-up methods for the most recent 15-year period (2000-2014). Vertical column densities (VCDs) from satellite observations are used to test the temporal and spatial patterns of emissions and to explore the ambient levels of gaseous pollutants across the country. The inter-annual trends in emissions and VCDs match well except for SO2. This comparison improves under an optimistic assumption in the emission estimation that the emission standards for given industrial sources issued after 2010 have been fully enforced. Underestimation of emission abatement and enhanced atmospheric oxidization likely contribute to the discrepancy between SO2 emissions and VCDs. As suggested by VCDs and emissions estimated under the assumption of full implementation of emission standards, the control of SO2 in the 12th Five-Year Plan period (12th FYP, 2011-2015) is estimated to be more effective than that in the 11th FYP period (2006-2010), attributed to improved use of flue gas desulfurization in the power sector and implementation of new emission standards in key industrial sources. The opposite was true for CO, as energy efficiency improved more significantly from 2005 to 2010 due to closures of small industrial plants. Iron & steel production is estimated to have had a particularly strong influence on temporal and spatial patterns of CO. In contrast to fast growth before 2011 driven by increased coal consumption and limited controls, NOX emissions decreased from 2011 to 2014 due to the penetration of selective catalytic/non-catalytic reduction systems in the power sector. This led to reduced NO2 VCDs, particularly in relatively highly polluted areas such as the eastern China and Pearl River Delta regions. In developed areas, transportation is playing an increasingly important role in air pollution, as suggested by the increased ratio of NO2 to SO2.

  13. What is the role of dopamine in reward: hedonic impact, reward learning, or incentive salience?

    Science.gov (United States)

    Berridge, K C; Robinson, T E

    1998-12-01

    What roles do mesolimbic and neostriatal dopamine systems play in reward? Do they mediate the hedonic impact of rewarding stimuli? Do they mediate hedonic reward learning and associative prediction? Our review of the literature, together with results of a new study of residual reward capacity after dopamine depletion, indicates the answer to both questions is 'no'. Rather, dopamine systems may mediate the incentive salience of rewards, modulating their motivational value in a manner separable from hedonia and reward learning. In a study of the consequences of dopamine loss, rats were depleted of dopamine in the nucleus accumbens and neostriatum by up to 99% using 6-hydroxydopamine. In a series of experiments, we applied the 'taste reactivity' measure of affective reactions (gapes, etc.) to assess the capacity of dopamine-depleted rats for: 1) normal affect (hedonic and aversive reactions), 2) modulation of hedonic affect by associative learning (taste aversion conditioning), and 3) hedonic enhancement of affect by non-dopaminergic pharmacological manipulation of palatability (benzodiazepine administration). We found normal hedonic reaction patterns to sucrose vs. quinine, normal learning of new hedonic stimulus values (a change in palatability based on predictive relations), and normal pharmacological hedonic enhancement of palatability. We discuss these results in the context of hypotheses and data concerning the role of dopamine in reward. We review neurochemical, electrophysiological, and other behavioral evidence. We conclude that dopamine systems are not needed either to mediate the hedonic pleasure of reinforcers or to mediate predictive associations involved in hedonic reward learning. We conclude instead that dopamine may be more important to incentive salience attributions to the neural representations of reward-related stimuli. Incentive salience, we suggest, is a distinct component of motivation and reward. In other words, dopamine systems are necessary

  14. Color edge saliency boosting using natural image statistics

    NARCIS (Netherlands)

    D. Rojas Vigo; J. van de Weijer; T. Gevers

    2010-01-01

    State of the art methods for image matching, content-based retrieval and recognition use local features. Most of these still exploit only the luminance information for detection. The color saliency boosting algorithm has provided an efficient method to exploit the saliency of color edges based on in

  15. Improving Saliency Models by Predicting Human Fixation Patches

    KAUST Repository

    Dubey, Rachit

    2015-04-16

    There is growing interest in studying the Human Visual System (HVS) to supplement and improve the performance of computer vision tasks. A major challenge for current visual saliency models is predicting saliency in cluttered scenes (i.e. high false positive rate). In this paper, we propose a fixation patch detector that predicts image patches that contain human fixations with high probability. Our proposed model detects sparse fixation patches with an accuracy of 84 % and eliminates non-fixation patches with an accuracy of 84 % demonstrating that low-level image features can indeed be used to short-list and identify human fixation patches. We then show how these detected fixation patches can be used as saliency priors for popular saliency models, thus, reducing false positives while maintaining true positives. Extensive experimental results show that our proposed approach allows state-of-the-art saliency methods to achieve better prediction performance on benchmark datasets.
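
    Using detected fixation patches as a saliency prior can be sketched as a simple re-weighting of an existing model's saliency map, as below; the patch detector itself is not reproduced, so the patch mask and the saliency map are placeholders.

```python
# Sketch: use predicted fixation patches as a prior to re-weight a saliency map,
# suppressing responses outside likely-fixation patches (reduces false positives).
# The patch mask is a placeholder standing in for the paper's patch detector.
import numpy as np

def apply_patch_prior(saliency, patch_mask, suppress=0.2):
    """Keep saliency inside predicted fixation patches, damp it elsewhere."""
    prior = np.where(patch_mask > 0, 1.0, suppress)
    out = saliency * prior
    return out / (out.max() + 1e-12)

rng = np.random.default_rng(3)
sal = rng.random((60, 80))              # stand-in for any saliency model's output
mask = np.zeros_like(sal)
mask[20:40, 30:60] = 1.0                # stand-in for detected fixation patches
refined = apply_patch_prior(sal, mask)
print(refined.max(), refined[0, 0])     # outside-patch values are damped
```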

  16. The Social Salience Hypothesis of Oxytocin.

    Science.gov (United States)

    Shamay-Tsoory, Simone G; Abu-Akel, Ahmad

    2016-02-01

    Oxytocin is a nonapeptide that also serves as a neuromodulator in the human central nervous system. Over the last decade, a sizeable body of literature has examined its effects on social behavior in humans. These studies show that oxytocin modulates various aspects of social behaviors such as empathy, trust, in-group preference, and memory of socially relevant cues. Several theoretical formulations have attempted to explain the effects of oxytocin. The prosocial account argues that oxytocin mainly enhances affiliative prosocial behaviors; the fear/stress theory suggests that oxytocin affects social performance by attenuating stress; and the in-/out-group approach proposes that oxytocin regulates cooperation and conflict among humans in the context of intergroup relations. Nonetheless, accumulating evidence reveals that the effects of oxytocin are dependent on a variety of contextual aspects and the individual's characteristics, and can induce antisocial effects including aggression and envy. In an attempt to reconcile these accounts, we suggest a theoretical framework that focuses on the overarching role of oxytocin in regulating the salience of social cues through its interaction with the dopaminergic system. Crucially, the salience effect modulates attention orienting responses to external contextual social cues (e.g., competitive vs. cooperative environment) but is dependent on baseline individual differences such as gender, personality traits, and degree of psychopathology. This view could have important implications for the therapeutic applications of oxytocin in conditions characterized by aberrant social behavior. PMID:26321019

  17. Greenhouse Gas Emission Accounting. Preliminary study as input to a joint Int. IPCC Expert Meeting / CKO-CCB Workshop on Comparison of Top-down versus Bottom-up Emission Estimates.

    OpenAIRE

    Amstel, van, R.; Kroeze, C.; Janssen, L.J.H.M.; Olivier, J. G. J.; Wal, van der, M.F.

    1997-01-01

    Bottom-up data for carbon dioxide, methane and nitrous oxide from the official national inventories (National Communications) were compared with data from EDGAR (Emission Database for Global Atmospheric Research) and top-down emission estimates, based on the results of dispersion and climate models using measured concentrations of greenhouse gases in the atmosphere. The aims of this preliminary study were to investigate the possibilities of comparing different types of emission inventories, t...

  18. Salience Effects in the North-West of England

    Directory of Open Access Journals (Sweden)

    Sandra Jansen

    2014-06-01

    Full Text Available The question of how we can define salience, what properties it includes and how we can quantify it have been discussed widely over the past thirty years but we still have more questions than answers about this phenomenon, e. g. not only how salience arises, but also how we can define it. However, despite the lack of a clear definition, salience is often taken into account as an explanatory factor in language change. The scientific discourse on salience has in most cases revolved around phonetic features, while hardly any variables on other linguistic levels have been investigated in terms of their salience. Hence, one goal of this paper is to argue for an expanded view of salience in the sociolinguistic context. This article investigates the variation and change of two groups of variables in Carlisle, an urban speech community in the north west of England. I analyse the variable (th) and in particular the replacement of /θ/ with [f] which is widely known as th-fronting. The use of three discourse markers is also examined. Both groups of features will then be discussed in the light of sociolinguistic salience.

  19. On the Salience-Based Level-k Model

    OpenAIRE

    Wolff, Irenaeus

    2015-01-01

    In the current literature, there is a lively debate about whether a level-k model can be based on salience to explain behaviour in games with distinctive action labels such as hide-and-seek or discoordination games. This study presents six different experiments designed to measure salience. When based on any of these empirical salience measures, the standard level-k model does not explain hide-and-seek behaviour. Modifying the model such that players follow salience when payoffs are equal, the...

  20. Invariant Spectral Hashing of Image Saliency Graph

    CERN Document Server

    Taquet, Maxime; De Vleeschouwer, Christophe; Macq, Benoit

    2010-01-01

    Image hashing is the process of associating a short vector of bits to an image. The resulting summaries are useful in many applications including image indexing, image authentication and pattern recognition. These hashes need to be invariant under transformations of the image that result in similar visual content, but should drastically differ for conceptually distinct contents. This paper proposes an image hashing method that is invariant under rotation, scaling and translation of the image. The gist of our approach relies on the geometric characterization of salient point distribution in the image. This is achieved by the definition of a "saliency graph" connecting these points jointly with an image intensity function on the graph nodes. An invariant hash is then obtained by considering the spectrum of this function in the eigenvector basis of the graph Laplacian, that is, its graph Fourier transform. Interestingly, this spectrum is invariant under any relabeling of the graph nodes. The graph reveals geomet...
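
    The hashing idea can be sketched as follows: build a graph over salient points, attach image intensities to the nodes, project that signal onto the eigenvectors of the graph Laplacian (the graph Fourier transform), and binarize the magnitude spectrum, which does not change when nodes are relabeled. The construction below is a simplified, assumed version of that pipeline with invented weights and thresholds.

```python
# Simplified sketch of spectral hashing of a "saliency graph":
# nodes = salient points, edges weighted by proximity, signal = intensity at nodes.
# The hash comes from the magnitude spectrum of the signal in the eigenvector
# basis of the graph Laplacian (graph Fourier transform), which is invariant to
# node relabeling. Weights and thresholds are illustrative choices.
import numpy as np

def saliency_graph_hash(points, intensities, n_bits=16, sigma=0.2):
    # Gaussian-weighted adjacency from pairwise distances between salient points.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    W = np.exp(-(d ** 2) / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    L = np.diag(W.sum(axis=1)) - W                 # combinatorial graph Laplacian
    _, eigvecs = np.linalg.eigh(L)                 # graph Fourier basis
    spectrum = np.abs(eigvecs.T @ intensities)     # relabeling-invariant magnitudes
    coeffs = spectrum[:n_bits]
    return (coeffs > np.median(coeffs)).astype(int)  # binarize into a short hash

rng = np.random.default_rng(7)
pts = rng.random((30, 2))                          # salient point coordinates
vals = rng.random(30)                              # image intensity at each point
perm = rng.permutation(30)
print(saliency_graph_hash(pts, vals))
print(saliency_graph_hash(pts[perm], vals[perm]))  # same hash under relabeling
```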

  1. Simultaneous modeling of visual saliency and value computation improves predictions of economic choice

    OpenAIRE

    Towal, R. Blythe; Mormann, Milica; Koch, Christof

    2013-01-01

    Many decisions we make require visually identifying and evaluating numerous alternatives quickly. These usually vary in reward, or value, and in low-level visual properties, such as saliency. Both saliency and value influence the final decision. In particular, saliency affects fixation locations and durations, which are predictive of choices. However, it is unknown how saliency propagates to the final decision. Moreover, the relative influence of saliency and value is unclear. Here we address...

  2. A ventral salience network in the macaque brain.

    Science.gov (United States)

    Touroutoglou, Alexandra; Bliss-Moreau, Eliza; Zhang, Jiahe; Mantini, Dante; Vanduffel, Wim; Dickerson, Bradford C; Barrett, Lisa Feldman

    2016-05-15

    Successful navigation of the environment requires attending and responding efficiently to objects and conspecifics with the potential to benefit or harm (i.e., that have value). In humans, this function is subserved by a distributed large-scale neural network called the "salience network". We have recently demonstrated that there are two anatomically and functionally dissociable salience networks anchored in the dorsal and ventral portions of the human anterior insula (Touroutoglou et al., 2012). In this paper, we test the hypothesis that these two subnetworks exist in rhesus macaques (Macaca mulatta). We provide evidence that a homologous ventral salience network exists in macaques, but that the connectivity of the dorsal anterior insula in macaques is not sufficiently developed as a dorsal salience network. The evolutionary implications of these findings are considered. PMID:26899785

  3. Diversification of visual media retrieval results using saliency detection

    Science.gov (United States)

    Muratov, Oleg; Boato, Giulia; De Natale, Franesco G. B.

    2013-03-01

    Diversification of retrieval results allows for better and faster search. Recently, different methods have been proposed for the diversification of image retrieval results, mainly utilizing text information and techniques imported from the natural language processing domain. However, images contain visual information that is impossible to describe in text, so the use of visual features is inevitable. Visual saliency is information about the main object of an image implicitly included by humans while creating visual content. For this reason it is natural to exploit this information for the task of diversification of the content. In this work we study whether visual saliency can be used for the task of diversification and propose a method for re-ranking image retrieval results using saliency. The evaluation has shown that the use of saliency information results in higher diversity of retrieval results.

  4. Mortality salience increases personal relevance of the norm of reciprocity.

    Science.gov (United States)

    Schindler, Simon; Reinhard, Marc-André; Stahlberg, Dagmar

    2012-10-01

    Research on terror management theory found evidence that people under mortality salience strive to live up to salient cultural norms and values, like egalitarianism, pacifism, or helpfulness. A basic, strongly internalized norm in most human societies is the norm of reciprocity: people should support those who supported them (i.e., positive reciprocity), and people should injure those who injured them (i.e., negative reciprocity), respectively. In an experiment (N = 98; 47 women, 51 men), mortality salience overall significantly increased personal relevance of the norm of reciprocity (M = 4.45, SD = 0.65) compared to a control condition (M = 4.19, SD = 0.59). Specifically, under mortality salience there was higher motivation to punish those who treated them unfavourably (negative norm of reciprocity). Unexpectedly, relevance of the norm of positive reciprocity remained unaffected by mortality salience. Implications and limitations are discussed. PMID:23234099

  5. Multi-scale saliency search in image analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Slepoy, Alexander; Campisi, Anthony; Backer, Alejandro

    2005-10-01

    Saliency detection in images is an important outstanding problem both in machine vision design and the understanding of human vision mechanisms. Recently, seminal work by Itti and Koch resulted in an effective saliency-detection algorithm. We reproduce the original algorithm in a software application Vision and explore its limitations. We propose extensions to the algorithm that promise to improve performance in the case of difficult-to-detect objects.

  6. Modelling Saliency Awareness for Objective Video Quality Assessment

    OpenAIRE

    Engelke, Ulrich; Barkowsky, Marcus; Callet, Patrick Le; Zepernick, Hans-Jürgen

    2010-01-01

    Existing video quality metrics do not usually take into consideration that spatial regions in video frames are of varying saliency and thus attract the viewer's attention differently. This paper proposes a model of saliency awareness to complement existing video quality metrics, with the aim of improving the agreement of objectively predicted quality with subjectively rated quality. For this purpose, we conducted a subjective experiment in which human observers rated the annoyance of vide...

  7. Properties of V1 neurons tuned to conjunctions of visual features: application of the V1 saliency hypothesis to visual search behavior.

    Directory of Open Access Journals (Sweden)

    Li Zhaoping

    Full Text Available From a computational theory of V1, we formulate an optimization problem to investigate neural properties in the primary visual cortex (V1) from human reaction times (RTs) in visual search. The theory is the V1 saliency hypothesis that the bottom-up saliency of any visual location is represented by the highest V1 response to it relative to the background responses. The neural properties probed are those associated with the less known V1 neurons tuned simultaneously or conjunctively in two feature dimensions. The visual search is to find a target bar unique in color (C), orientation (O), motion direction (M), or redundantly in combinations of these features (e.g., CO, MO, or CM) among uniform background bars. A feature singleton target is salient because its evoked V1 response largely escapes the iso-feature suppression on responses to the background bars. The responses of the conjunctively tuned cells are manifested in the shortening of the RT for a redundant feature target (e.g., a CO target) from that predicted by a race between the RTs for the two corresponding single feature targets (e.g., C and O targets). Our investigation enables the following testable predictions. Contextual suppression on the response of a CO-tuned or MO-tuned conjunctive cell is weaker when the contextual inputs differ from the direct inputs in both feature dimensions, rather than just one. Additionally, CO-tuned cells and MO-tuned cells are often more active than the single feature tuned cells in response to the redundant feature targets, and this occurs more frequently for the MO-tuned cells such that the MO-tuned cells are no less likely than either the M-tuned or O-tuned neurons to be the most responsive neuron to dictate saliency for an MO target.

  8. Properties of V1 neurons tuned to conjunctions of visual features: application of the V1 saliency hypothesis to visual search behavior.

    Science.gov (United States)

    Zhaoping, Li; Zhe, Li

    2012-01-01

    From a computational theory of V1, we formulate an optimization problem to investigate neural properties in the primary visual cortex (V1) from human reaction times (RTs) in visual search. The theory is the V1 saliency hypothesis that the bottom-up saliency of any visual location is represented by the highest V1 response to it relative to the background responses. The neural properties probed are those associated with the less known V1 neurons tuned simultaneously or conjunctively in two feature dimensions. The visual search is to find a target bar unique in color (C), orientation (O), motion direction (M), or redundantly in combinations of these features (e.g., CO, MO, or CM) among uniform background bars. A feature singleton target is salient because its evoked V1 response largely escapes the iso-feature suppression on responses to the background bars. The responses of the conjunctively tuned cells are manifested in the shortening of the RT for a redundant feature target (e.g., a CO target) from that predicted by a race between the RTs for the two corresponding single feature targets (e.g., C and O targets). Our investigation enables the following testable predictions. Contextual suppression on the response of a CO-tuned or MO-tuned conjunctive cell is weaker when the contextual inputs differ from the direct inputs in both feature dimensions, rather than just one. Additionally, CO-tuned cells and MO-tuned cells are often more active than the single feature tuned cells in response to the redundant feature targets, and this occurs more frequently for the MO-tuned cells such that the MO-tuned cells are no less likely than either the M-tuned or O-tuned neurons to be the most responsive neuron to dictate saliency for an MO target. PMID:22719829
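
    The race-model logic used here, comparing the observed RT distribution for a redundant (e.g., CO) target against the prediction of a race between the single-feature (C and O) RTs, can be illustrated with a small Monte Carlo simulation. The RT distributions below are synthetic and none of the paper's statistical machinery is reproduced.

```python
# Sketch of the race-model comparison: the predicted RT for a redundant-feature
# target is the minimum of independent samples from the two single-feature RT
# distributions; an observed redundant-target RT faster than this prediction is
# the signature attributed to conjunctively tuned (e.g., CO) V1 cells.
# All RT distributions below are synthetic.
import numpy as np

rng = np.random.default_rng(11)
n = 20000
rt_c = rng.gamma(shape=8.0, scale=50.0, size=n) + 200   # ms, color-singleton RTs
rt_o = rng.gamma(shape=8.0, scale=55.0, size=n) + 200   # ms, orientation RTs

race_prediction = np.minimum(rt_c, rt_o)                # race between C and O
observed_co = rng.gamma(shape=8.0, scale=42.0, size=n) + 200  # hypothetical CO RTs

print(f"mean RT  C: {rt_c.mean():.0f} ms   O: {rt_o.mean():.0f} ms")
print(f"race-model prediction for CO: {race_prediction.mean():.0f} ms")
print(f"'observed' CO target:         {observed_co.mean():.0f} ms")
print("RT shortening beyond the race prediction:",
      f"{race_prediction.mean() - observed_co.mean():.0f} ms")
```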

  9. Disgust sensitivity predicts defensive responding to mortality salience.

    Science.gov (United States)

    Kelley, Nicholas J; Crowell, Adrienne L; Tang, David; Harmon-Jones, Eddie; Schmeichel, Brandon J

    2015-10-01

    Disgust protects the physical self. The present authors suggest that disgust also contributes to the protection of the psychological self by fostering stronger defensive reactions to existential concerns. To test this idea, 3 studies examined the link between disgust sensitivity and defensive responses to mortality salience or "terror management" processes (Greenberg, Solomon, & Pyszczynski, 1997). Each study included an individual difference measure of disgust sensitivity, a manipulation of mortality salience, and a dependent measure of defensive responding. In Study 1, disgust sensitivity predicted increases in worldview defense in the mortality salience condition but not in the control condition. In Study 2, disgust sensitivity predicted increases in optimistic perceptions of the future in the mortality salience condition but not in the control condition. In Study 3, disgust sensitivity predicted reductions in delay discounting for those in the mortality salience condition such that those higher in disgust sensitivity discounted the future less. This pattern did not occur in the control condition. These findings highlight disgust sensitivity as a key to understanding reactions to mortality salience, and they support the view that disgust-related responses protect against both physical (e.g., noxious substances) and psychological threats. PMID:25775230

  10. Development Of A Web Service And Android 'APP' For The Distribution Of Rainfall Data. A Bottom-Up Remote Sensing Data Mining And Redistribution Project In The Age Of The 'Web 2.0'

    Science.gov (United States)

    Mantas, Vasco M.; Pereira, A. J. S. C.; Liu, Zhong

    2013-12-01

    A project was devised to develop a set of freely available applications and web services that can (1) simplify access from Mobile Devices to TOVAS data and (2) support the development of new datasets through data repackaging and mash-up. The bottom-up approach enables the multiplication of new services, often of limited direct interest to the organizations that produce the original, global datasets, but significant to small, local users. Through this multiplication of services, the development cost is transferred to the intermediate or end users and the entire process is made more efficient, even allowing new players to use the data in innovative ways.

  11. What drives farmers to make top-down or bottom-up adaptation to climate change and fluctuations? A comparative study on 3 cases of apple farming in Japan and South Africa.

    Directory of Open Access Journals (Sweden)

    Mariko Fujisawa

    Full Text Available Agriculture is one of the most vulnerable sectors to climate change. Farmers have been exposed to multiple stressors including climate change, and they have managed to adapt to those risks. The adaptation actions undertaken by farmers and their decision making are, however, only poorly understood. By studying adaptation practices undertaken by apple farmers in three regions: Nagano and Kazuno in Japan and Elgin in South Africa, we categorize the adaptation actions into two types: farmer-initiated bottom-up adaptation and institution-led top-down adaptation. We found that the driver which differentiates the type of adaptation likely adopted was strongly related to the farmers' characteristics, particularly their dependence on the institutions, e.g. the farmers' cooperative, in selling their products. The farmers who rely on the farmers' cooperative for their sales are likely to adopt the institution-led adaptation, whereas the farmers who have established their own sales channels tend to start innovative actions from the bottom up. We further argue that even though the two types have contrasting features, combinations of both types of adaptation could lead to more successful adaptation, particularly in agriculture. This study also emphasizes that more farm-level studies for various crops and regions are warranted to provide substantial feedback to adaptation policy.

  12. Learning a Combined Model of Visual Saliency for Fixation Prediction.

    Science.gov (United States)

    Wang, Jingwei; Borji, Ali; Jay Kuo, C-C; Itti, Laurent

    2016-04-01

    A large number of saliency models, each based on a different hypothesis, have been proposed over the past 20 years. In practice, while subscribing to one hypothesis or computational principle makes a model that performs well on some types of images, it hinders the general performance of a model on arbitrary images and large-scale data sets. One natural approach to improve overall saliency detection accuracy would then be fusing different types of models. In this paper, inspired by the success of late-fusion strategies in semantic analysis and multi-modal biometrics, we propose to fuse the state-of-the-art saliency models at the score level in a para-boosting learning fashion. First, saliency maps generated by several models are used as confidence scores. Then, these scores are fed into our para-boosting learner (i.e., support vector machine, adaptive boosting, or probability density estimator) to generate the final saliency map. In order to explore the strength of para-boosting learners, traditional transformation-based fusion strategies, such as Sum, Min, and Max, are also explored and compared in this paper. To further reduce the computation cost of fusing too many models, only a few of them are considered in the next step. Experimental results show that score-level fusion outperforms each individual model and can further reduce the performance gap between the current models and the human inter-observer model. PMID:26829792
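
    A minimal sketch of score-level fusion of saliency maps is given below. It covers the transformation-based rules mentioned (Sum, Min, Max) plus a fixed weighted combination standing in for a learned para-boosting fuser; the maps and weights are synthetic placeholders.

```python
# Sketch of score-level fusion of saliency maps: Sum / Min / Max rules plus a
# weighted combination standing in for a learned (para-boosting) fuser.
# Maps and weights are synthetic; each map is treated as a confidence score.
import numpy as np

def normalize(m):
    return (m - m.min()) / (np.ptp(m) + 1e-12)

def fuse(maps, rule="sum", weights=None):
    stack = np.stack([normalize(m) for m in maps])
    if rule == "sum":
        fused = stack.sum(axis=0)
    elif rule == "min":
        fused = stack.min(axis=0)
    elif rule == "max":
        fused = stack.max(axis=0)
    elif rule == "weighted":        # placeholder for a learned score-level fuser
        w = np.asarray(weights).reshape(-1, 1, 1)
        fused = (w * stack).sum(axis=0)
    else:
        raise ValueError(rule)
    return normalize(fused)

rng = np.random.default_rng(5)
maps = [rng.random((48, 64)) for _ in range(3)]   # outputs of three saliency models
for rule in ("sum", "min", "max"):
    print(rule, round(float(fuse(maps, rule).mean()), 3))
print("weighted", round(float(fuse(maps, "weighted", weights=[0.5, 0.3, 0.2]).mean()), 3))
```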

  13. Visual saliency in MPEG-4 AVC video stream

    Science.gov (United States)

    Ammar, M.; Mitrea, M.; Hasnaoui, M.; Le Callet, P.

    2015-03-01

    Visual saliency maps have already proved their efficiency in a large variety of image/video communication application fields, ranging from selective compression and channel coding to watermarking. Such saliency maps are generally based on different visual characteristics (like color, intensity, orientation, motion, …) computed from the pixel representation of the visual content. This paper summarizes and extends our previous work devoted to the definition of a saliency map extracted solely from the MPEG-4 AVC stream syntax elements. The MPEG-4 AVC saliency map thus defined is a fusion of static and dynamic maps. The static saliency map is in turn a combination of intensity, color and orientation feature maps. Despite the particular way in which all these elementary maps are computed, the fusion techniques allowing their combination play a critical role in the final result and are the object of the proposed study. A total of 48 fusion formulas (6 for combining static features and, for each of them, 8 to combine static with dynamic features) are investigated. The performance of the obtained maps is evaluated on a public database organized at IRCCyN by computing two objective metrics: the Kullback-Leibler divergence and the area under the curve.
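
    One of the two evaluation metrics mentioned, the Kullback-Leibler divergence between a predicted saliency map and a fixation-density map, can be computed roughly as below; the maps are placeholders and the IRCCyN evaluation protocol is not reproduced.

```python
# Sketch: Kullback-Leibler divergence between a fixation-density map (reference)
# and a predicted saliency map, both normalized to probability distributions.
# Inputs are placeholders; the IRCCyN evaluation protocol is not reproduced here.
import numpy as np

def kl_divergence(fixation_density, saliency_map, eps=1e-12):
    p = fixation_density / (fixation_density.sum() + eps)    # reference distribution
    q = saliency_map / (saliency_map.sum() + eps)            # predicted distribution
    return float(np.sum(p * np.log((p + eps) / (q + eps))))

rng = np.random.default_rng(9)
fix = rng.random((36, 64)) ** 4              # peaked "fixation density" placeholder
good = fix + 0.05 * rng.random(fix.shape)    # prediction close to the reference
bad = rng.random(fix.shape)                  # unrelated prediction
print("KL(good):", round(kl_divergence(fix, good), 4))
print("KL(bad): ", round(kl_divergence(fix, bad), 4))   # larger divergence
```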

  14. Fast and Conspicuous? Quantifying Salience With the Theory of Visual Attention

    Science.gov (United States)

    Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid

    2016-01-01

    Particular differences between an object and its surrounding cause salience, guide attention, and improve performance in various tasks. While much research has been dedicated to identifying which feature dimensions contribute to salience, much less regard has been paid to the quantitative strength of the salience caused by feature differences. Only a few studies systematically related salience effects to a common salience measure, and they are partly outdated in the light of new findings on the time course of salience effects. We propose Bundesen’s Theory of Visual Attention (TVA) as a theoretical basis for measuring salience and introduce an empirical and modeling approach to link this theory to data retrieved from temporal-order judgments. With this procedure, TVA becomes applicable to a broad range of salience-related stimulus material. Three experiments with orientation pop-out displays demonstrate the feasibility of the method. A 4th experiment substantiates its applicability to the luminance dimension. PMID:27168868

  15. Moving Foreground Detection Based On Spatio-temporal Saliency

    Directory of Open Access Journals (Sweden)

    Yang Xia

    2013-01-01

    Full Text Available Detection of moving foreground in video is very important for many applications, such as visual surveillance, object-based video coding, etc. When objects move at different speeds and under illumination changes, the robustness of the moving object detection methods proposed so far is still not satisfactory. In this paper, we adaptively adjust the pixel-wise learning rate for more robust detection using semantic information, which is obtained from a spatial saliency map based on a Gaussian mixture model (GMM) in luma space and a temporal saliency map obtained by background subtraction. In addition, we design a two-pass background estimation framework, in which the initial estimation is used for temporal saliency estimation and the second pass is used to detect the foreground and update the model parameters. The experimental results show that our method can achieve better moving object extraction performance than the existing background subtraction method based on GMM.
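
    The core mechanism, scaling the per-pixel learning rate of the background model by a spatio-temporal saliency map so that salient (likely foreground) pixels are absorbed into the background more slowly, can be sketched with a simple running-average background standing in for the full GMM, as below.

```python
# Sketch: adapt the per-pixel learning rate of a background model using a
# spatio-temporal saliency map (high saliency -> slower background absorption).
# A running-average background stands in for the paper's full GMM model.
import numpy as np

def update_background(background, frame, saliency, base_rate=0.05):
    """Per-pixel learning rate shrinks where saliency is high."""
    alpha = base_rate * (1.0 - np.clip(saliency, 0.0, 1.0))
    return (1.0 - alpha) * background + alpha * frame

def foreground_mask(background, frame, thresh=0.15):
    return (np.abs(frame - background) > thresh).astype(np.uint8)

rng = np.random.default_rng(2)
bg = rng.random((40, 60)) * 0.1
frame = bg.copy()
frame[10:25, 20:40] += 0.8                    # a moving object enters
sal = np.zeros_like(frame); sal[10:25, 20:40] = 1.0   # spatio-temporal saliency
bg = update_background(bg, frame, sal)
print("foreground pixels detected:", int(foreground_mask(bg, frame).sum()))
```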

  16. The Electrophysiological Signature of Motivational Salience in Mice and Implications for Schizophrenia

    OpenAIRE

    Moessnang, Carolin; Habel, Ute; Schneider, Frank; Siegel, Steven J.

    2012-01-01

    According to the aberrant-salience hypothesis, attribution of motivational salience is severely disrupted in patients with schizophrenia. To provide a translational approach for investigating underlying mechanisms, neural correlates of salience attribution were examined in normal mice and in a MK-801 model of schizophrenia. Electrophysiological responses to standard and deviant tones were assessed in the medial prefrontal cortex (mPFC) using an auditory oddball paradigm. Motivational salience...

  17. Land Cover Change Detection Using Saliency and Wavelet Transformation

    Science.gov (United States)

    Zhang, Haopeng; Jiang, Zhiguo; Cheng, Yan

    2016-06-01

    How to obtain an accurate difference map remains an open challenge in change detection. To tackle this problem, we propose a change detection method based on saliency detection and wavelet transformation. We perform frequency-tuned saliency detection on the initial difference image (IDI), obtained by a logarithm ratio, to get a salient difference image (SDI). Then, we calculate the local entropy of the SDI to obtain an entropic salient difference image (ESDI). The final difference image (FDI) is the wavelet fusion of the IDI and ESDI, and Otsu thresholding is used to extract the difference map from the FDI. Experimental results validate the effectiveness and feasibility of the proposed method.
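
    A rough sketch of the pipeline (log-ratio initial difference image, a saliency/entropy-style weighting, fusion, and Otsu thresholding) is given below. The frequency-tuned saliency, local entropy, and wavelet fusion of the paper are replaced by simplified stand-ins (a variance-based weighting and plain averaging), so this only approximates the method; the inputs are synthetic.

```python
# Rough sketch of the change-detection pipeline: log-ratio difference image,
# a saliency-like weighting (stand-in for frequency-tuned saliency + entropy),
# simple averaging in place of wavelet fusion, and Otsu thresholding to extract
# the change map. Inputs are synthetic.
import numpy as np

def otsu_threshold(img, bins=128):
    hist, edges = np.histogram(img.ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_t, best_var = centers[0], -1.0
    for i in range(1, bins):
        w0, w1 = p[:i].sum(), p[i:].sum()
        if w0 == 0 or w1 == 0:
            continue
        m0 = (p[:i] * centers[:i]).sum() / w0
        m1 = (p[i:] * centers[i:]).sum() / w1
        var_between = w0 * w1 * (m0 - m1) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, centers[i]
    return best_t

rng = np.random.default_rng(6)
t1 = rng.random((64, 64)) * 0.2 + 0.4
t2 = t1.copy(); t2[20:40, 20:40] += 0.5          # changed region
idi = np.abs(np.log((t2 + 1e-6) / (t1 + 1e-6)))  # log-ratio initial difference
weight = (idi - idi.mean()) ** 2                 # crude saliency-like weighting
fdi = 0.5 * idi + 0.5 * weight / (weight.max() + 1e-12)   # simplified fusion
change_map = fdi > otsu_threshold(fdi)
print("changed pixels:", int(change_map.sum()), "of", change_map.size)
```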

  18. Distribution of attention modulates salience signals in early visual cortex

    OpenAIRE

    Mulckhuyse, M.; Belopolsky, A.V.; Heslenfeld, D.J.; Talsma, D.; Theeuwes, J

    2011-01-01

    Previous research has shown that the extent to which people spread attention across the visual field plays a crucial role in visual selection and the occurrence of bottom-up driven attentional capture. Consistent with previous findings, we show that when attention was diffusely distributed across the visual field while searching for a shape singleton, an irrelevant salient color singleton captured attention. However, while using the very same displays and task, no capture was observed when ob...

  19. Transmitting the sum of all fears: Iranian nuclear threat salience among offspring of Holocaust survivors.

    Science.gov (United States)

    Shrira, Amit

    2015-07-01

    Many Israelis are preoccupied with the prospect of a nuclear-armed Iran, frequently associating it with the danger of annihilation that existed during the Holocaust. The current article examined whether offspring of Holocaust survivors (OHS) are especially preoccupied and sensitive to the Iranian threat, and whether this susceptibility is a part of their increased general image of actual and potential threats, defined as the hostile world scenario (HWS). Study 1 (N = 106) showed that relative to comparisons, OHS reported more preoccupation with the Iranian nuclear threat. Moreover, the positive relationship between the salience of the Iranian threat and symptoms of anxiety was stronger among OHS. Study 2 (N = 450) replicated these findings, while focusing on the Iranian nuclear threat salience and symptoms of psychological distress. It further showed that OHS reported more negative engagement with the HWS (i.e., feeling that surrounding threats decrease one's sense of competence), which in turn mediated their increased preoccupation with the Iranian threat. The results suggest that intergenerational transmission of the Holocaust trauma includes heightened preoccupation with and sensitivity to potential threats of annihilation, and that the specific preoccupation with threats of annihilation reflects a part of a more general preoccupation with surrounding threats. PMID:25793401

  20. A Brief Talk about Bottom-up Shell Script Programming and Efficiency Optimization

    Institute of Scientific and Technical Information of China (English)

    江松波; 倪子伟

    2011-01-01

    Shell scripts run in interpreter mode, which is inherently slow, and inefficient design further degrades a script's runtime performance. This paper analyzes the characteristics of the Shell language and its typical applications and, from the perspective of hierarchical design, proposes a bottom-up approach to Shell script programming. It also puts forward a comprehensive method for mastering Shell utility programs, working from the external system environment down to the internal execution model. A case study demonstrates that the bottom-up idea and methods for Shell scripting can effectively improve the execution efficiency of scripts.

  1. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    Science.gov (United States)

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset. PMID:26762649
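
    The preposterior logic can be illustrated with a small closed-form sketch under standard normal-normal assumptions: give the incremental net benefit a normal prior, treat process A ("micro-costing") as precise but expensive per observation and process B ("gross costing") as noisy but cheap, and compare the expected net gain of sampling for a few mixes of observations. All numbers are invented and this is not the authors' model.

```python
# Closed-form sketch of preposterior (value-of-information) analysis for choosing
# between two costing processes: A ("micro-costing", precise, expensive) and
# B ("gross costing", noisy, cheap). Expected net gain of sampling (ENGS) =
# expected value of sample information (EVSI) - data-collection cost.
# All parameter values are invented for illustration.
import numpy as np
from scipy.stats import norm

m0, v0 = 500.0, 300.0 ** 2       # prior mean / variance of incremental net benefit (£)
s_a, s_b = 800.0, 2000.0         # per-patient sampling SDs for processes A and B
cost_a, cost_b = 40.0, 5.0       # £ per observation collected
pop = 10_000                     # population affected by the adoption decision

def engs(n_a, n_b):
    v_post = 1.0 / (1.0 / v0 + n_a / s_a ** 2 + n_b / s_b ** 2)
    tau = np.sqrt(v0 - v_post)                  # SD of the preposterior mean
    # EVSI for an adopt/reject decision at a threshold of 0 (normal preposterior).
    evsi = pop * (m0 * norm.cdf(m0 / tau) + tau * norm.pdf(m0 / tau) - max(m0, 0.0))
    return evsi - (n_a * cost_a + n_b * cost_b)

# Compare a few mixes of observations on the two processes.
for n_a, n_b in [(200, 0), (100, 400), (0, 800)]:
    print(f"n_A={n_a:4d}, n_B={n_b:4d}  ->  ENGS = £{engs(n_a, n_b):,.0f}")
```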

  2. Bottom Up Project Cost and Risk Modeling

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcosm along with its partners HRP Systems, End-to-End Analytics, and ARES Corporation (unfunded in Phase I), propose to develop a new solution for detailed data...

  3. Bottom-up Experiments and Concrete Utopias

    DEFF Research Database (Denmark)

    Andersson, Lasse

    2010-01-01

    The article examines how user-driven experiments can challenge the standardized, business-oriented version of the Experience City and, through experimentation, stimulate locally anchored and democratic versions of an experience- and knowledge-based city.

  4. Milk bottom-up proteomics: method optimisation.

    Directory of Open Access Journals (Sweden)

    Delphine eVincent

    2016-01-01

    Full Text Available Milk is a complex fluid whose proteome displays a diverse set of proteins of high abundance such as caseins and medium to low abundance whey proteins such as ß-lactoglobulin, lactoferrin, immunoglobulins, glycoproteins, peptide hormones and enzymes. A sample preparation method that enables high reproducibility and throughput is key in reliably identifying proteins present or proteins responding to conditions such as a diet, health or genetics. Using skim milk samples from Jersey and Holstein-Friesian cows, we compared three extraction procedures which have not previously been applied to samples of cows' milk. Method A (urea) involved a simple dilution of the milk in a urea-based buffer, method B (TCA/acetone) involved a trichloroacetic acid (TCA)/acetone precipitation and method C (methanol/chloroform) involved a tri-phasic partition method in chloroform/methanol solution. Protein assays, SDS-PAGE profiling, and trypsin digestion followed by nanoHPLC-electrospray ionisation-tandem mass spectrometry (nLC-ESI-MS/MS) analyses were performed to assess their efficiency. Replicates were used at each analytical step (extraction, digestion, injection) to assess reproducibility. Mass spectrometry (MS) data are available via ProteomeXchange with identifier PXD002529. Overall 186 unique accessions, major and minor proteins, were identified with a combination of methods. Method C (methanol/chloroform) yielded the best resolved SDS-patterns and highest protein recovery rates, method A (urea) yielded the greatest number of accessions, and, of the three procedures, method B (TCA/acetone) was the least compatible of all with a wide range of downstream analytical procedures. Our results also highlighted breed differences between the proteins in milk of Jersey and Holstein-Friesian cows.

  5. Bottom-up tailoring of photonic nanofibers

    DEFF Research Database (Denmark)

    Balzer, Frank; Madsen, Morten; Frese, Ralf; Schiek, Manuela; Tamulevicius, Thomas; Tamulevicius, Sigitas; Rubahn, Horst-Günter

    2008-01-01

    Aligned ensembles of nanoscopic nanofibers from organic molecules such as para-phenylenes for photonic applications can be fabricated by self-assembled molecular growth on a suited dielectric substrate. Epitaxy, together with alignment due to electric surface fields, determines the growth directions. In this paper we demonstrate how aligned growth along arbitrary directions can be realized by depositing the molecules on a micro-structured and gold covered Silicon surface, consisting of channels and ridges. For the correct combination of ridge width and deposition temperature fibers grow

  6. Big data from the bottom up

    OpenAIRE

    Couldry, Nick; Powell, Alison

    2014-01-01

    This short article argues that an adequate response to the implications for governance raised by 'Big Data' requires much more attention to agency and reflexivity than theories of 'algorithmic power' have so far allowed. It develops this through two contrasting examples: the sociological study of social actors' use of analytics to meet their own social ends (for example, by community organisations) and the study of actors' attempts to build an economy of information more open to civic interve...

  7. Promoting education from the bottom up

    Science.gov (United States)

    Page, Martin

    2010-02-01

    I am not an academic, just a foot soldier: I help out with the Children's University and as a Schools Science Ambassador, giving talks and demonstrations in physics. The recent £40m cuts in the budget of the UK's Science and Technology Facilities Council (January p6) have created quite a stir in the academic community, with talk of a "brain drain". Those in the penthouse should come down to the basement, where they would see just how thin and fragile the scientific foundation is.

  8. Bottom-up approach to spatial datamining

    OpenAIRE

    Künzi, Christophe; Stoffel, Kilian

    2013-01-01

    One of the goals of computer vision research is to design systems that provide human-like visual capabilities, such that a certain environment can be sensed and interpreted to take appropriate actions. Among the different forms available to represent such an environment, the 3D point cloud (an unstructured collection of points in a three-dimensional space) raises many challenging problems. Moreover, the number of 3D data collections has increased drastically in recent years, as improvements in the ...

  9. Research and Development from the bottom up

    DEFF Research Database (Denmark)

    Brem, Alexander; Wolfram, P.

    2014-01-01

    A framework is introduced consisting of three core dimensions: sophistication, sustainability, and emerging market orientation. On the basis of these dimensions, analogies and distinctions between the terms are identified and general tendencies are explored, such as the increasing importance of sustainability in the social and ecological context or the growing interest of developed market firms in approaches from emerging markets. Hence, the presented framework supports further research in new paradigms for research and development (R&D) in developed market firms (DMFs), particularly in relation to emerging markets. This framework enables scholars to compare concepts from developed and emerging markets, to address studies specifically by using consistent terms, and to advance research into the concepts according to their characterization.

  10. Mobile Handsets from the Bottom Up

    DEFF Research Database (Denmark)

    Wallis, Cara; Linchuan Qiu, Jack; Ling, Richard

    2013-01-01

    The setting could be a hole-in-the-wall that serves as a shop in a narrow alley in Guangzhou, a cart on a dusty street on the outskirts of Accra, a bustling marketplace in Mexico City, or a tiny storefront near downtown Los Angeles’ garment district. At such locales, men and women hawk an array o...... low-income, largely immigrant communities in cities in the developed world....

  11. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands.

    Science.gov (United States)

    Yam, Rita S W; Fan, Yen-Tzu; Wang, Tzu-Ting

    2016-03-01

    Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunctioning of these restored habitats. We conducted non-choice laboratory feeding experiments of P. canaliculata using five common macrophyte species in constructed wetlands including Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates with highest reproductive output, but all individuals fed with Phragmites showed lowest feeding rates and little growth with poorest reproductive output. Plant N and P contents were important for enhancing palatability, supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had critically deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of the poorly-fed snails in constructed wetlands dominated by the less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve selective planting strategy using macrophytes with low nutrient and high toughness, cellulose and phenolic contents. PMID:26927135

  12. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands

    Directory of Open Access Journals (Sweden)

    Rita S. W. Yam

    2016-02-01

    Full Text Available Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunctioning of these restored habitats. We conducted non-choice laboratory feeding experiments of P. canaliculata using five common macrophyte species in constructed wetlands including Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates with highest reproductive output, but all individuals fed with Phragmites showed lowest feeding rates and little growth with poorest reproductive output. Plant N and P contents were important for enhancing palatability, supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had critically deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of the poorly-fed snails in constructed wetlands dominated by the less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve selective planting strategy using macrophytes with low nutrient and high toughness, cellulose and phenolic contents.

  13. Correct primary structure assessment and extensive glyco-profiling of cetuximab by a combination of intact, middle-up, middle-down and bottom-up ESI and MALDI mass spectrometry techniques.

    Science.gov (United States)

    Ayoub, Daniel; Jabs, Wolfgang; Resemann, Anja; Evers, Waltraud; Evans, Catherine; Main, Laura; Baessmann, Carsten; Wagner-Rousset, Elsa; Suckau, Detlev; Beck, Alain

    2013-01-01

    The European Medicines Agency received recently the first marketing authorization application for a biosimilar monoclonal antibody (mAb) and adopted the final guidelines on biosimilar mAbs and Fc-fusion proteins. The agency requires high similarity between biosimilar and reference products for approval. Specifically, the amino acid sequences must be identical. The glycosylation pattern of the antibody is also often considered to be a very important quality attribute due to its strong effect on quality, safety, immunogenicity, pharmacokinetics and potency. Here, we describe a case study of cetuximab, which has been marketed since 2004. Biosimilar versions of the product are now in the pipelines of numerous therapeutic antibody biosimilar developers. We applied a combination of intact, middle-down, middle-up and bottom-up electrospray ionization and matrix assisted laser desorption ionization mass spectrometry techniques to characterize the amino acid sequence and major post-translational modifications of the marketed cetuximab product, with special emphasis on glycosylation. Our results revealed a sequence error in the reported sequence of the light chain in databases and in publications, thus highlighting the potency of mass spectrometry to establish correct antibody sequences. We were also able to achieve a comprehensive identification of cetuximab's glycoforms and glycosylation profile assessment on both Fab and Fc domains. Taken together, the reported approaches and data form a solid framework for the comparability of antibodies and their biosimilar candidates that could be further applied to routine structural assessments of these and other antibody-based products. PMID:23924801

  14. A top-down / bottom-up approach for multi-actors and multi-criteria assessment of mining projects for sustainable development. Application on Arlit Uranium mines (Niger)

    International Nuclear Information System (INIS)

    This thesis aims to appraise the relevance of using a hybrid top-down / bottom-up approach to evaluate mining projects in the perspective of sustainable development. With the advent of corporate social responsibility and sustainable development concepts, new social expectations have appeared towards companies that go beyond a sole requirement of profit-earning capacity. If companies do not answer these expectations, they risk losing their social legitimacy. Traditionally associated with social, environmental, economic and political impacts and risks, mining activity is particularly concerned by these new issues. Whereas mineral resource needs have never been so high, mining companies are now expected to limit their negative effects and to take into account their different audiences' expectations in order to define, together, the terms of their social license to operate. Considering the diversity of issues, scales, actors and contexts, the challenge is real and necessitates tools to better understand issues and to structure dialogues. Based on the Uranium mines of Arlit (Niger) case study, this work shows that associating participatory approaches with structuring tools and literature propositions appears to be an efficient formula to better organize the diversity of issues and to build a structured dialogue between mining companies and their stakeholders. The First Part presents the theoretical, institutional and sectorial contexts of the thesis. The Second Part exposes the work and results of the evaluation carried out in Niger. The Third Part draws the conclusions that can be derived from this work and presents a proposal for an evaluation framework, potentially applicable to other mining sites. (author)

  15. Bottom-up electrochemical preparation of solid-state carbon nanodots directly from nitriles/ionic liquids using carbon-free electrodes and the applications in specific ferric ion detection and cell imaging.

    Science.gov (United States)

    Niu, Fushuang; Xu, Yuanhong; Liu, Mengli; Sun, Jing; Guo, Pengran; Liu, Jingquan

    2016-03-14

    Carbon nanodots (C-dots), a new type of potential alternative to conventional semiconductor quantum dots, have attracted much attention in various applications including bio-chemical sensing, cell imaging, etc., due to their chemical inertness, low toxicity and flexible functionalization. Various methods including electrochemical (EC) methods have been reported for the synthesis of C-dots. However, complex procedures and/or carbon source-containing electrodes are often required. Herein, solid-state C-dots were simply prepared by bottom-up EC carbonization of nitriles (e.g. acetonitrile) in the presence of an ionic liquid [e.g. 1-butyl-3-methylimidazolium hexafluorophosphate (BMIMPF6)], using carbon-free electrodes. Due to the positive charges of BMIM(+) on the C-dots, the final products presented in a precipitate form on the cathode, and the unreacted nitriles and BMIMPF6 can be easily removed by simple vacuum filtration. The as-prepared solid-state C-dots can be well dispersed in an aqueous medium with excellent photoluminescence properties. The average size of the C-dots was found to be 3.02 ± 0.12 nm as evidenced by transmission electron microscopy. Other techniques such as UV-vis spectroscopy, fluorescence spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy were applied for the characterization of the C-dots and to analyze the possible generation mechanism. These C-dots have been successfully applied in efficient cell imaging and specific ferric ion detection. PMID:26891173

  16. Of wealth and death: materialism, mortality salience, and consumption behavior.

    Science.gov (United States)

    Kasser, T; Sheldon, K M

    2000-07-01

    Theoretical work suggests that feelings of insecurity produce materialistic behavior, but most empirical evidence is correlational in nature. We therefore experimentally activated feelings of insecurity by having some subjects write short essays about death (mortality-salience condition). In Study 1, subjects in the mortality-salience condition, compared with subjects who wrote about a neutral topic, had higher financial expectations for themselves 15 years in the future, in terms of both their overall worth and the amount they would be spending on pleasurable items such as clothing and entertainment. Study 2 extended these findings by demonstrating that subjects exposed to death became more greedy and consumed more resources in a forest-management game. Results are discussed with regard to humanistic and terror-management theories of materialism. PMID:11273398

  17. Exploiting Surroundedness for Saliency Detection: A Boolean Map Approach.

    Science.gov (United States)

    Zhang, Jianming; Sclaroff, Stan

    2016-05-01

    We demonstrate the usefulness of surroundedness for eye fixation prediction by proposing a Boolean Map based Saliency model (BMS). In our formulation, an image is characterized by a set of binary images, which are generated by randomly thresholding the image's feature maps in a whitened feature space. Based on a Gestalt principle of figure-ground segregation, BMS computes a saliency map by discovering surrounded regions via topological analysis of Boolean maps. Furthermore, we draw a connection between BMS and the Minimum Barrier Distance to provide insight into why and how BMS can properly capture the surroundedness cue via Boolean maps. The strength of BMS is verified by its simplicity, efficiency and superior performance compared with 10 state-of-the-art methods on seven eye tracking benchmark datasets. PMID:26336114
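
    The surroundedness computation at the heart of BMS can be illustrated with a short sketch. The snippet below is a minimal interpretation only, assuming evenly spaced thresholds and plain color channels as feature maps; the whitening, morphological opening and exact thresholding scheme of the published model are omitted, and the function name is ours.

```python
# Minimal sketch of the surroundedness idea behind BMS (not the authors' code):
# threshold each feature map at several levels, keep connected regions that do
# not touch the image border, and average the resulting binary attention maps.
import numpy as np
from scipy import ndimage

def boolean_map_saliency(feature_maps, n_thresholds=8):
    """feature_maps: list of 2D float arrays (e.g. the image's color channels)."""
    h, w = feature_maps[0].shape
    accum = np.zeros((h, w), dtype=float)
    count = 0
    for fm in feature_maps:
        for t in np.linspace(fm.min(), fm.max(), n_thresholds + 2)[1:-1]:
            for bmap in (fm > t, fm <= t):          # both polarities of the Boolean map
                labels, _ = ndimage.label(bmap)
                border_labels = np.unique(np.concatenate(
                    [labels[0, :], labels[-1, :], labels[:, 0], labels[:, -1]]))
                surrounded = bmap & ~np.isin(labels, border_labels)
                accum += surrounded                  # region not touching the border
                count += 1
    sal = accum / max(count, 1)
    return (sal - sal.min()) / (sal.max() - sal.min() + 1e-12)
```

    On an RGB input one would pass the (optionally whitened) channels as feature_maps and post-process the averaged map as desired.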

  18. Image based Monument Recognition using Graph based Visual Saliency

    DEFF Research Database (Denmark)

    Kalliatakis, Grigorios; Triantafyllidis, Georgios

    2013-01-01

    This article presents an image-based application aiming at simple image classification of well-known monuments in the area of Heraklion, Crete, Greece. This classification takes place by utilizing Graph Based Visual Saliency (GBVS) and employing Scale Invariant Feature Transform (SIFT) or Speeded Up Robust Features (SURF). For this purpose, images taken at various places of interest are compared to an existing database containing images of these places at different angles and zoom levels. The time required for the matching process in such an application is an important element. To this end, the images have been previously processed according to the Graph Based Visual Saliency model in order to keep either SIFT or SURF features corresponding to the actual monuments while the background “noise” is minimized. The application is then able to classify these images, helping the user to better...
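
    The retrieval stage described above can be sketched roughly as follows, under two assumptions: OpenCV's SIFT stands in for the SIFT/SURF step, and a saliency map computed elsewhere (e.g. by any GBVS implementation, which OpenCV does not provide) is used to discard background keypoints before matching. The function names and the 0.75 ratio-test threshold are illustrative.

```python
# Sketch of saliency-masked SIFT matching (assumptions: OpenCV SIFT, and a
# saliency map computed elsewhere standing in for the GBVS step).
import cv2
import numpy as np

def salient_sift_descriptors(gray, saliency, thresh=0.5):
    """Keep only SIFT features whose keypoints fall inside salient regions."""
    sift = cv2.SIFT_create()
    kps, desc = sift.detectAndCompute(gray, None)
    if desc is None:
        return [], None
    sal = (saliency - saliency.min()) / (saliency.max() - saliency.min() + 1e-12)
    keep = [i for i, kp in enumerate(kps)
            if sal[int(kp.pt[1]), int(kp.pt[0])] >= thresh]
    return [kps[i] for i in keep], desc[keep]

def best_database_match(query_desc, db_descs):
    """Return the index of the database image with the most ratio-test matches."""
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    scores = []
    for d in db_descs:
        pairs = matcher.knnMatch(query_desc, d, k=2)
        good = [p[0] for p in pairs
                if len(p) == 2 and p[0].distance < 0.75 * p[1].distance]
        scores.append(len(good))
    return int(np.argmax(scores))
```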

  19. Visual Saliency and Attention as Random Walks on Complex Networks

    CERN Document Server

    Costa, L F

    2006-01-01

    The unmatched versatility of vision in mammals is totally dependent on purposive eye movements and selective attention guided by saliencies in the presented images. The current article shows how concepts and tools from the areas of random walks, Markov chains, complex networks and artificial image analysis can be naturally combined in order to provide a unified and biologically plausible model for saliency detection and visual attention, which become indistinguishable in the process. Images are converted into complex networks by considering pixels as nodes while connections are established in terms of fields of influence defined by visual features such as tangent fields induced by luminance contrasts, distance, and size. Random walks are performed on such networks in order to emulate attentional shifts and even eye movements in the case of large shapes, and the frequency of visits to each node is conveniently obtained from the eigenequation defined by the stochastic matrix associated to the respectively drive...
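
    The core idea, saliency as the visit frequency of a random walk over an image graph, can be sketched as below. The edge weights combining feature contrast with spatial proximity are a simplification of the paper's "fields of influence", and patch-level nodes are assumed instead of individual pixels.

```python
# Sketch: saliency as the stationary visit frequency of a random walk on a
# patch graph (simplified edge weights; not the paper's exact model).
import numpy as np

def random_walk_saliency(features, positions, sigma_d=0.15, iters=500):
    """features: (N, d) patch features in [0, 1]; positions: (N, 2) in [0, 1]."""
    contrast = np.linalg.norm(features[:, None] - features[None, :], axis=-1)
    dist = np.linalg.norm(positions[:, None] - positions[None, :], axis=-1)
    # walkers move preferentially toward contrasting patches that are nearby
    W = contrast * np.exp(-dist ** 2 / (2 * sigma_d ** 2))
    np.fill_diagonal(W, 0.0)
    P = W / (W.sum(axis=1, keepdims=True) + 1e-12)   # row-stochastic transitions
    pi = np.full(len(W), 1.0 / len(W))
    for _ in range(iters):                           # power iteration -> visit frequency
        pi = pi @ P
    return pi / (pi.max() + 1e-12)
```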

  20. Saliency and eye movements in the perception of natural scenes

    OpenAIRE

    Foulsham, Thomas

    2008-01-01

    Humans inspect the environment around them by selecting a sequence of locations to fixate which will provide information about the scene. How are these locations chosen? The saliency map model suggests that points in the scene are represented topographically and that the likelihood of them being fixated depends on low-level feature contrast. This model makes specific predictions about the way people will move their eyes when looking at natural scenes, although there are few experimental te...

  1. Saliency of product origin information in consumer choices

    OpenAIRE

    Dmitrović, Tanja; Vida, Irena

    2015-01-01

    In light of increasing market complexities resulting from globalization of consumer products/brands in both mature and transitional markets, this study examines the saliency of information related to the national origin of products in consumer choice behavior. Specifically, effects of consumer ethnocentrism and the type of product/service category on perceived importance of product origin information are investigated. Analysis of data collected on a sample of adult consumers suggests that sal...

  2. Saliency extraction with a distributed spiking neural network

    OpenAIRE

    Chevallier, Sylvain; Tarroux, Philippe; Paugam-Moisy, Hélène

    2006-01-01

    We present a distributed spiking neuron network (SNN) for handling low-level visual perception in order to extract salient locations in robot camera images. We describe a new method which reduces the computational load of the whole system, stemming from our architectural choices. We also describe a model of the post-synaptic potential which allows the contribution of a sum of incoming spikes to a neuron's membrane potential to be computed quickly. The interests of this saliency extraction method, ...

  3. Multiview saliency detection based on improved multimanifold ranking

    Science.gov (United States)

    Shi, Yanjiao; Yi, Yugen; Zhang, Ke; Kong, Jun; Zhang, Ming; Wang, Jianzhong

    2014-11-01

    As an important problem in computer vision, saliency detection is essential for image segmentation, super-resolution, object recognition, and so on. We propose a saliency detection method for images. Instead of using contrast between salient regions and their surrounding areas, both cues from salient and nonsalient regions are considered in our study. Based on these cues, an improved multimanifold ranking algorithm is proposed. In our algorithm, features from multiple views are utilized and the different contributions of these multiview features are taken into account. Moreover, an iterative updating optimization scheme is explored to solve the objective function, during which the feature fusion is performed. After two-stage ranking by the improved multimanifold ranking algorithm, each image patch can be assigned a ranking score, which determines the final saliency. The proposed method is evaluated on four public datasets and is compared with the state-of-the-art methods. Experimental results indicate that the proposed method outperforms existing schemes both in qualitative and quantitative comparisons.
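
    For reference, the single-view manifold ranking that such methods build on has a simple closed form, f* = (I - alpha*S)^{-1} y with a symmetrically normalized affinity S. The sketch below shows only that backbone; the multiview feature weighting and iterative fusion described in the abstract are not reproduced.

```python
# Minimal single-view manifold ranking (the closed-form backbone such methods
# build on); the multiview weighting and iterative fusion are not included.
import numpy as np

def manifold_ranking(features, query_mask, sigma=0.1, alpha=0.99):
    """features: (N, d) superpixel features; query_mask: (N,) 1 for seed nodes."""
    d2 = ((features[:, None] - features[None, :]) ** 2).sum(-1)
    W = np.exp(-d2 / (2 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    Dm12 = np.diag(1.0 / np.sqrt(W.sum(1) + 1e-12))
    S = Dm12 @ W @ Dm12                      # symmetrically normalized affinity
    f = np.linalg.solve(np.eye(len(W)) - alpha * S, query_mask.astype(float))
    return f / (f.max() + 1e-12)             # ranking scores scaled to [0, 1]
```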

  4. Key Object Discovery and Tracking Based on Context-Aware Saliency

    Directory of Open Access Journals (Sweden)

    Geng Zhang

    2013-01-01

    Full Text Available In this paper, we propose an online key object discovery and tracking system based on visual saliency. We formulate the problem as a temporally consistent binary labelling task on a conditional random field and solve it by using a particle filter. We also propose a context‐aware saliency measurement, which can be used to improve the accuracy of any static or dynamic saliency maps. Our refined saliency maps provide clearer indications as to where the key object lies. Based on good saliency cues, we can further segment the key object inside the resulting bounding box, considering the spatial and temporal context. We tested our system extensively on different video clips. The results show that our method has significantly improved the saliency maps and tracks the key object accurately.

  5. Low-Complexity Saliency Detection Algorithm for Fast Perceptual Video Coding

    Directory of Open Access Journals (Sweden)

    Pengyu Liu

    2013-01-01

    Full Text Available A low-complexity saliency detection algorithm for perceptual video coding is proposed; low-level encoding information is adopted as the characteristics of visual perception analysis. Firstly, this algorithm employs motion vector (MV to extract temporal saliency region through fast MV noise filtering and translational MV checking procedure. Secondly, spatial saliency region is detected based on optimal prediction mode distributions in I-frame and P-frame. Then, it combines the spatiotemporal saliency detection results to define the video region of interest (VROI. The simulation results validate that the proposed algorithm can avoid a large amount of computation work in the visual perception characteristics analysis processing compared with other existing algorithms; it also has better performance in saliency detection for videos and can realize fast saliency detection. It can be used as a part of the video standard codec at medium-to-low bit-rates or combined with other algorithms in fast video coding.
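
    The temporal branch of such an algorithm can be sketched directly from encoder-side motion vectors. The median filter size, the global-motion handling and the threshold below are illustrative assumptions rather than the paper's exact procedure.

```python
# Sketch of motion-vector-based temporal saliency (an interpretation of the
# steps described above, not the paper's exact filtering or thresholds).
import numpy as np
from scipy.ndimage import median_filter

def temporal_saliency_from_mv(mvx, mvy, thresh=1.5):
    """mvx, mvy: 2D arrays of per-macroblock motion vectors from the encoder."""
    # 1) fast MV noise filtering
    mvx_f = median_filter(mvx, size=3)
    mvy_f = median_filter(mvy, size=3)
    # 2) translational (global/camera) motion check: subtract the median MV
    gx, gy = np.median(mvx_f), np.median(mvy_f)
    mag = np.hypot(mvx_f - gx, mvy_f - gy)
    # macroblocks whose residual motion exceeds the threshold form the
    # temporal part of the video region of interest (VROI)
    return mag > thresh
```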

  6. Abnormal salience signaling in schizophrenia: the role of integrative beta oscillations

    OpenAIRE

    Liddle, Elizabeth B.; Price, Darren; Palaniyappan, Lena; Brookes, Matthew J.; Robson, Siân E.; Hall, Emma L.; Morris, Peter G; Liddle, Peter F.

    2016-01-01

    Aberrant salience attribution and cerebral dysconnectivity both have strong evidential support as core dysfunctions in schizophrenia. Aberrant salience arising from an excess of dopamine activity has been implicated in delusions and hallucinations, exaggerating the significance of everyday occurrences and thus leading to perceptual distortions and delusional causal inferences. Meanwhile, abnormalities in key nodes of a salience brain network have been implicated in other characteristic sympto...

  7. The Application of Visual Saliency Models in Objective Image Quality Assessment: A Statistical Evaluation.

    Science.gov (United States)

    Zhang, Wei; Borji, Ali; Wang, Zhou; Le Callet, Patrick; Liu, Hantao

    2016-06-01

    Advances in image quality assessment have shown the potential added value of including visual attention aspects in its objective assessment. Numerous models of visual saliency are implemented and integrated in different image quality metrics (IQMs), but the gain in reliability of the resulting IQMs varies to a large extent. The causes and the trends of this variation would be highly beneficial for further improvement of IQMs, but are not fully understood. In this paper, an exhaustive statistical evaluation is conducted to justify the added value of computational saliency in objective image quality assessment, using 20 state-of-the-art saliency models and 12 best-known IQMs. Quantitative results show that the difference in predicting human fixations between saliency models is sufficient to yield a significant difference in performance gain when adding these saliency models to IQMs. However, surprisingly, the extent to which an IQM can profit from adding a saliency model does not appear to have direct relevance to how well this saliency model can predict human fixations. Our statistical analysis provides useful guidance for applying saliency models in IQMs, in terms of the effect of saliency model dependence, IQM dependence, and image distortion dependence. The testbed and software are made publicly available to the research community. PMID:26277009
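
    The usual way a saliency model is "added to" an IQM is to weight the metric's local quality map by the saliency map before pooling. The sketch below illustrates this for SSIM using scikit-image; it is a generic example, not one of the twelve metrics evaluated in the study.

```python
# Saliency-weighted pooling of an IQM's local quality map (generic sketch).
import numpy as np
from skimage.metrics import structural_similarity

def saliency_weighted_ssim(ref, dist, saliency):
    """ref, dist: 2D grayscale float images; saliency: 2D non-negative map."""
    _, ssim_map = structural_similarity(ref, dist, full=True,
                                        data_range=ref.max() - ref.min())
    w = saliency / (saliency.sum() + 1e-12)
    return float((w * ssim_map).sum())       # weighted mean instead of plain mean
```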

  8. Bottom-up electrochemical preparation of solid-state carbon nanodots directly from nitriles/ionic liquids using carbon-free electrodes and the applications in specific ferric ion detection and cell imaging

    Science.gov (United States)

    Niu, Fushuang; Xu, Yuanhong; Liu, Mengli; Sun, Jing; Guo, Pengran; Liu, Jingquan

    2016-03-01

    Carbon nanodots (C-dots), a new type of potential alternative to conventional semiconductor quantum dots, have attracted much attention in various applications including bio-chemical sensing, cell imaging, etc., due to their chemical inertness, low toxicity and flexible functionalization. Various methods including electrochemical (EC) methods have been reported for the synthesis of C-dots. However, complex procedures and/or carbon source-containing electrodes are often required. Herein, solid-state C-dots were simply prepared by bottom-up EC carbonization of nitriles (e.g. acetonitrile) in the presence of an ionic liquid [e.g. 1-butyl-3-methylimidazolium hexafluorophosphate (BMIMPF6)], using carbon-free electrodes. Due to the positive charges of BMIM+ on the C-dots, the final products presented in a precipitate form on the cathode, and the unreacted nitriles and BMIMPF6 can be easily removed by simple vacuum filtration. The as-prepared solid-state C-dots can be well dispersed in an aqueous medium with excellent photoluminescence properties. The average size of the C-dots was found to be 3.02 +/- 0.12 nm as evidenced by transmission electron microscopy. Other techniques such as UV-vis spectroscopy, fluorescence spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy were applied for the characterization of the C-dots and to analyze the possible generation mechanism. These C-dots have been successfully applied in efficient cell imaging and specific ferric ion detection.

  9. Integrating top-down and bottom-up approaches to design a cost-effective and equitable programme of measures for adaptation of a river basin to global change

    Science.gov (United States)

    Girard, Corentin; Rinaudo, Jean-Daniel; Pulido-Velazquez, Manuel

    2016-04-01

    Adaptation to the multiple facets of global change challenges the conventional means of sustainably planning and managing water resources at the river basin scale. Numerous demand or supply management options are available, from which adaptation measures need to be selected in a context of high uncertainty of future conditions. Given the interdependency of water users, agreements need to be found at the local level to implement the most effective adaptation measures. Therefore, this work develops an approach combining economics and water resources engineering to select a cost-effective programme of adaptation measures in the context of climate change uncertainty, and to define an equitable allocation of the cost of the adaptation plan between the stakeholders involved. A framework is developed to integrate inputs from the two main approaches commonly used to plan for adaptation. The first, referred to as "top-down", consists of a modelling chain going from global greenhouse gases emission scenarios to local hydrological models used to assess the impact of climate change on water resources. Conversely, the second approach, called "bottom-up", starts from assessing vulnerability at the local level to then identify adaptation measures used to face an uncertain future. The methodological framework presented in this contribution relies on a combination of these two approaches to support the selection of adaptation measures at the local level. Outcomes from these two approaches are integrated to select a cost-effective combination of adaptation measures through a least-cost optimization model developed at the river basin scale. The performances of a programme of measures are assessed under different climate projections to identify cost-effective and least-regret adaptation measures. The issue of allocating the cost of the adaptation plan is considered through two complementary perspectives. The outcome of a negotiation process between the stakeholders is modelled through

  10. A top-down / bottom-up approach for multi-actors and multi-criteria assessment of mining projects for sustainable development. Application on Arlit Uranium mines (Niger); Une demarche Top-Down / Bottom-Up pour l'evaluation en termes multicriteres et multi-acteurs des projets miniers dans l'optique du developpement durable. Application sur les mines d'Uranium d'Arlit (Niger)

    Energy Technology Data Exchange (ETDEWEB)

    Chamaret, A

    2007-06-15

    This thesis aims to appraise the relevance of using a hybrid top-down / bottom-up approach to evaluate mining projects in the perspective of sustainable development. With the advent of corporate social responsibility and sustainable development concepts, new social expectations have appeared towards companies that go beyond a sole requirement of profit-earning capacity. If companies do not answer these expectations, they risk losing their social legitimacy. Traditionally associated with social, environmental, economic and political impacts and risks, mining activity is particularly concerned by these new issues. Whereas mineral resource needs have never been so high, mining companies are now expected to limit their negative effects and to take into account their different audiences' expectations in order to define, together, the terms of their social license to operate. Considering the diversity of issues, scales, actors and contexts, the challenge is real and necessitates tools to better understand issues and to structure dialogues. Based on the Uranium mines of Arlit (Niger) case study, this work shows that associating participatory approaches with structuring tools and literature propositions appears to be an efficient formula to better organize the diversity of issues and to build a structured dialogue between mining companies and their stakeholders. The First Part presents the theoretical, institutional and sectorial contexts of the thesis. The Second Part exposes the work and results of the evaluation carried out in Niger. The Third Part draws the conclusions that can be derived from this work and presents a proposal for an evaluation framework, potentially applicable to other mining sites. (author)

  11. Sex ratio influences the motivational salience of facial attractiveness

    OpenAIRE

    Hahn, A. C; Fisher, C. I.; DeBruine, L. M.; Jones, B. C.

    2014-01-01

    The sex ratio of the local population influences mating-related behaviours in many species. Recent experiments show that male-biased sex ratios increase the amount of financial resources men will invest in potential mates, suggesting that sex ratios influence allocation of mating effort in humans. To investigate this issue further, we tested for effects of cues to the sex ratio of the local population on the motivational salience of attractiveness in own-sex and opposite-sex faces. We did thi...

  12. Preserving visual saliency in image to sound substitution systems

    Science.gov (United States)

    Ancuti, Codruta O.; Ancuti, Cosmin; Bekaert, Philippe

    2009-02-01

    Color plays a significant role in scene interpretation in terms of visual perception. Numerous visual substitution systems deal with grayscale images, disregarding this information from the original image. Visually perceived color-based details often fade due to the grayscale conversion, and this can mislead the overall comprehension of the considered scene. We present a decolorization method that considers color contrast and preserves color saliency after the transformation. We exploit this model to enhance visually disabled persons' perception of the images interpreted by the substitution system. The results demonstrate that our enhanced system is able to improve the overall scene interpretation in comparison with a similar substitution system.

  13. Do stakeholder management strategy and salience influence corporate social responsibility in Indian companies?

    OpenAIRE

    Supriti Mishra; Damodar Suar

    2010-01-01

    Purpose – This study aims to examine whether strategy towards primary stakeholders and their salience influence corporate social responsibility towards the corresponding stakeholders. Design/methodology/approach – Data were collected through a questionnaire from 150 senior level managers including CEOs. The stakeholder management strategy, salience, and corporate social responsibility were assessed in the context of employees, customers, investors, community, natural environment, and supplier...

  14. Transformational leadership and employees career salience; an empirical study conducted on banks of Pakistan

    OpenAIRE

    Tabassum Riaz; Muhammad Ramzan; Hafiz Muhammad Ishaq; Muhammad Umair Akram

    2012-01-01

    The following study examines the relationship between transformational leadership and employees' career salience. This research is conducted to answer the question of whether employees' career salience is associated with transformational leadership. The study focuses only on the banking sector. Transformational leadership is measured using its four dimensions, i.e. idealized influence, intellectual stimulation, inspirational motivation and individualized consideration; the relationship is determ...

  15. How Important Are Items on a Student Evaluation? A Study of Item Salience

    Science.gov (United States)

    Hills, Stacey Barlow; Naegle, Natali; Bartkus, Kenneth R.

    2009-01-01

    Although student evaluations of teaching (SETs) have been the subject of numerous research studies, the salience of SET items to students has not been examined. In the present study, the authors surveyed 484 students from a large public university. The authors suggest that not all items are viewed equally and that measures of item salience can…

  16. "Always in My Face": An Exploration of Social Class Consciousness, Salience, and Values

    Science.gov (United States)

    Martin, Georgianna L.

    2015-01-01

    This qualitative study explores social class consciousness, salience, and values of White, low-income, first-generation college students. Overall, participants minimized the salience of social class as an aspect of their identity with many of them expressing that they did not want their social class to define them. Although participants largely…

  17. Data Acquisition Strategy for Mass Spectrometers Applied to Bottom-up-based Protein Identification

    Institute of Scientific and Technical Information of China (English)

    徐长明; 张纪阳; 张伟; 谢红卫

    2013-01-01

    The high complexity of the proteome has brought great challenges to mass spectrometry-based protein identification, and these technical requirements continuously drive the development of mass spectrometry. Advances in the hardware and software of instrument platforms provide more choices and better support for protein identification. However, to make the best use of an instrument's performance, it is necessary to design a high-quality data acquisition strategy, which depends heavily on the specific biological problem and the sample. Here, the data acquisition strategies that have been developed for mass spectrometers in high-throughput protein identification are reviewed. The simple repetition, ion exclusion, ion inclusion, online intelligent data acquisition and segmented scanning techniques used in the Bottom-up strategy are highlighted, and the impact of these strategies on protein identification is considered. Finally, the advantages and disadvantages of the various strategies are summarized, and future directions for the development of data acquisition strategies for mass spectrometers are discussed.
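
    One of the strategies named above, ion (dynamic) exclusion, can be illustrated with a toy scheduler: once a precursor has been fragmented, its m/z is kept on an exclusion list for a fixed time window so that lower-abundance ions are sampled. The function and all parameters below are illustrative, not taken from any instrument's software.

```python
# Toy illustration of dynamic (ion) exclusion in data-dependent acquisition.
def select_precursors(survey_scan, now, exclusion, top_n=10,
                      mz_tol=0.01, exclusion_time=30.0):
    """survey_scan: list of (mz, intensity); exclusion: dict mz -> expiry time."""
    # drop expired exclusion entries
    for mz in [m for m, t in exclusion.items() if t <= now]:
        del exclusion[mz]
    picked = []
    for mz, inten in sorted(survey_scan, key=lambda p: -p[1]):
        if any(abs(mz - ex) <= mz_tol for ex in exclusion):
            continue                        # still on the exclusion list
        picked.append(mz)
        exclusion[mz] = now + exclusion_time
        if len(picked) == top_n:
            break
    return picked
```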

  18. Target detection method based on supervised saliency map and efficient subwindow search

    Science.gov (United States)

    Liu, Songtao; Jiang, Ning; Liu, Zhenxing

    2015-10-01

    In order to realize fast target detection in complex image scenes, a novel method is proposed based on a supervised saliency map and efficient subwindow search. Supervised saliency map generation mainly includes: (1) the original image is segmented with different parameters to obtain multiple segmentation results; (2) regional features are mapped to saliency values by a random forest regressor; (3) the saliency map is obtained by fusing the multi-level segmentation results. The efficient subwindow search is implemented by casting salient target detection as maximum saliency density estimation and using a branch-and-bound algorithm to localize the maximum saliency density at the global optimum. The experimental results show that the new method can not only detect the salient region but also, to some extent, recognize it.
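
    As a simplified stand-in for the localization step, the sketch below finds the window maximizing the sum of (saliency - bias) using an integral image and a coarse exhaustive search. Subtracting a bias makes compact high-density windows preferable to large diluted ones, which approximates the maximum-saliency-density objective without the paper's branch-and-bound subwindow search.

```python
# Simplified window localization on a saliency map (not the paper's method).
import numpy as np

def max_saliency_window(sal, bias=None, step=8, min_size=32):
    """Return (x0, y0, x1, y1) maximizing the biased saliency sum."""
    if bias is None:
        bias = sal.mean()                 # penalize large low-saliency windows
    score = sal - bias
    ii = np.pad(score.cumsum(0).cumsum(1), ((1, 0), (1, 0)))   # integral image
    h, w = sal.shape
    best, best_box = -np.inf, None
    for y0 in range(0, h - min_size, step):
        for x0 in range(0, w - min_size, step):
            for y1 in range(y0 + min_size, h + 1, step):
                for x1 in range(x0 + min_size, w + 1, step):
                    s = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
                    if s > best:
                        best, best_box = s, (x0, y0, x1, y1)
    return best_box
```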

  19. To Reveal or To Cloak? Effects of Identity Salience on Stereotype Threat Responses in Avatar-Represented Group Contexts

    Directory of Open Access Journals (Sweden)

    Jong-Eun Roselyn Lee

    2009-01-01

    Full Text Available With rapid advances in digital technologies, the popularity of avatars — digital representations of people in computer-mediated environments — is growing. Avatars allow people to visually represent their offline social identity, or selectively render certain layer(s) of their social identity less identifiable or unidentifiable in online environments. The present research investigated how African Americans, whose racial identity often suffers negative stereotyping, responded to stereotype threat when they performed a stereotype-relevant task with 2 ostensible coactors in an avatar-represented group setting. A 2 x 2 between-participants design manipulated salience of racial identity (salient vs. nonsalient) and performance context (competition vs. cooperation), and assessed the extent to which participants persisted on an extremely challenging stereotype-relevant task (unsolvable anagram). The results showed that in the context of competition, participants in the race-nonsalient avatar group persisted significantly longer on the unsolvable anagram than did those in the race-salient avatar group; however, in the context of cooperation, no significant difference was found between the 2 avatar groups. The findings indicate that the effects of identity salience as varied by different types of avatars (identity-revealing vs. identity-cloaking) on identity-associated threat may be moderated by the contexts of performance in which the target individuals are situated.

  20. Inherent Difference in Saliency for Generators with Different PM Materials

    Directory of Open Access Journals (Sweden)

    Sandra Eriksson

    2014-01-01

    Full Text Available The inherent differences between salient and nonsalient electrical machines are evaluated for two permanent magnet generators with different configurations. The neodymium-based (NdFeB) permanent magnets (PMs) in a generator are substituted with ferrite magnets and the characteristics of the NdFeB generator and the ferrite generator are compared through FEM simulations. The NdFeB generator is a nonsalient generator, whereas the ferrite machine is a salient-pole generator with small saliency. The two generators have almost identical properties at rated load operation. However, at overload the behaviour differs between the two generators. The salient-pole ferrite generator has lower maximum torque than the NdFeB generator and a larger voltage drop at high current. It is concluded that, for applications where overload capability is important, saliency must be considered and the generator design adapted according to the behaviour at overload operation. Furthermore, if the maximum torque is the design criterion, additional PM mass will be required for the salient-pole machine.
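
    For context, the saliency referred to here is the inequality of the d- and q-axis inductances. The textbook dq-frame torque expression below (a standard relation, not taken from the paper) shows why a salient-pole machine gains a reluctance-torque term that a nonsalient NdFeB design lacks; p is the number of pole pairs, psi_PM the magnet flux linkage and i_d, i_q the dq-axis currents.

```latex
% Standard dq-frame torque of a PM synchronous machine; the second (reluctance)
% term exists only when the machine is salient, i.e. L_d \neq L_q.
T_e = \frac{3p}{2}\left[\psi_{\mathrm{PM}}\, i_q + \left(L_d - L_q\right) i_d\, i_q\right]
```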

  1. Emotional salience, emotional awareness, peculiar beliefs, and magical thinking.

    Science.gov (United States)

    Berenbaum, Howard; Boden, M Tyler; Baker, John P

    2009-04-01

    Two studies with college student participants (Ns = 271 and 185) tested whether peculiar beliefs and magical thinking were associated with (a) the emotional salience of the stimuli about which individuals may have peculiar beliefs or magical thinking, (b) attention to emotion, and (c) clarity of emotion. Study 1 examined belief that a baseball team was cursed. Study 2 measured magical thinking using a procedure developed by P. Rozin and C. Nemeroff (2002). In both studies, peculiar beliefs and magical thinking were associated with Salience x Attention x Clarity interactions. Among individuals for whom the objects of the belief-magical thinking were highly emotionally salient and who had high levels of attention to emotion, higher levels of emotional clarity were associated with increased peculiar beliefs-magical thinking. In contrast, among individuals for whom the objects of the belief-magical thinking were not emotionally salient and who had high levels of attention to emotion, higher levels of emotional clarity were associated with diminished peculiar beliefs-magical thinking. PMID:19348532

  2. Perceptual Object Extraction Based on Saliency and Clustering

    Directory of Open Access Journals (Sweden)

    Qiaorong Zhang

    2010-08-01

    Full Text Available Object-based visual attention has received increasing interest in recent years. The perceptual object is the basic attention unit of object-based visual attention, and the definition and extraction of perceptual objects is one of the key technologies in object-based visual attention computation models. A novel perceptual object definition and extraction method is proposed in this paper. Based on Gestalt theory and visual feature integration theory, a perceptual object is defined using homogeneity regions, salient regions and edges. An improved saliency map generating algorithm is employed first. Based on the saliency map, salient edges are extracted. Then a graph-based clustering algorithm is introduced to obtain homogeneity regions in the image. Finally, an integration strategy is adopted to combine salient edges and homogeneity regions to extract perceptual objects. The proposed perceptual object extraction method has been tested on a large set of natural images; the experimental results and analysis presented in this paper show that the method is reasonable and valid.

  3. Scalable mobile image retrieval by exploring contextual saliency.

    Science.gov (United States)

    Yang, Xiyu; Qian, Xueming; Xue, Yao

    2015-06-01

    Nowadays, it is very convenient to capture photos with a smart phone. As the smart phone is a convenient way to share what users experience anytime and anywhere through social networks, it is very likely that we capture multiple photos to make sure the content is well photographed. In this paper, an effective scalable mobile image retrieval approach is proposed by exploring contextual salient information for the input query image. Our goal is to explore the high-level semantic information of an image by finding the contextual saliency from multiple relevant photos rather than solely using the input image. Thus, the proposed mobile image retrieval approach first determines the relevant photos according to visual similarity, then mines salient features by exploring contextual saliency from multiple relevant images, and finally determines the contributions of salient features for scalable retrieval. Compared with existing mobile-based image retrieval approaches, our approach requires less bandwidth and has better retrieval performance. Experimental results show the effectiveness of the proposed approach. PMID:25775488

  4. Correspondences among pupillary dilation response, subjective salience of sounds, and loudness.

    Science.gov (United States)

    Liao, Hsin-I; Kidani, Shunsuke; Yoneya, Makoto; Kashino, Makio; Furukawa, Shigeto

    2016-04-01

    A pupillary dilation response is known to be evoked by salient deviant or contrast auditory stimuli, but so far a direct link between it and subjective salience has been lacking. In two experiments, participants listened to various environmental sounds while their pupillary responses were recorded. In separate sessions, participants performed subjective pairwise-comparison tasks on the sounds with respect to their salience, loudness, vigorousness, preference, beauty, annoyance, and hardness. The pairwise-comparison data were converted to ratings on the Thurstone scale. The results showed a close link between subjective judgments of salience and loudness. The pupil dilated in response to the sound presentations, regardless of sound type. Most importantly, this pupillary dilation response to an auditory stimulus positively correlated with the subjective salience, as well as the loudness, of the sounds (Exp. 1). When the loudnesses of the sounds were identical, the pupil responses to each sound were similar and were not correlated with the subjective judgments of salience or loudness (Exp. 2). This finding was further confirmed by analyses based on individual stimulus pairs and participants. In Experiment 3, when salience and loudness were manipulated by systematically changing the sound pressure level and acoustic characteristics, the pupillary dilation response reflected the changes in both manipulated factors. A regression analysis showed a nearly perfect linear correlation between the pupillary dilation response and loudness. The overall results suggest that the pupillary dilation response reflects the subjective salience of sounds, which is defined, or is heavily influenced, by loudness. PMID:26163191
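
    The conversion from pairwise comparisons to a Thurstone scale can be sketched as Case V scaling: win proportions are mapped to standard normal deviates and averaged per item. This is a generic sketch; the study's exact handling of ties and extreme proportions may differ.

```python
# Thurstone Case V scaling from a matrix of pairwise "wins" (generic sketch).
import numpy as np
from scipy.stats import norm

def thurstone_case_v(win_counts):
    """win_counts[i, j]: number of times item i was preferred over item j."""
    n = win_counts + win_counts.T
    p = np.where(n > 0, win_counts / np.maximum(n, 1), 0.5)
    p = np.clip(p, 0.01, 0.99)            # avoid infinite z-scores
    np.fill_diagonal(p, 0.5)
    z = norm.ppf(p)                        # proportion -> standard normal deviate
    scale = z.mean(axis=1)                 # item scores on an interval scale
    return scale - scale.min()
```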

  5. The electrophysiological signature of motivational salience in mice and implications for schizophrenia.

    Science.gov (United States)

    Moessnang, Carolin; Habel, Ute; Schneider, Frank; Siegel, Steven J

    2012-12-01

    According to the aberrant-salience hypothesis, attribution of motivational salience is severely disrupted in patients with schizophrenia. To provide a translational approach for investigating underlying mechanisms, neural correlates of salience attribution were examined in normal mice and in a MK-801 model of schizophrenia. Electrophysiological responses to standard and deviant tones were assessed in the medial prefrontal cortex (mPFC) using an auditory oddball paradigm. Motivational salience was induced by aversive conditioning to the deviant tone. Analysis of the auditory evoked potential (AEP) showed selective modulation of the late frontal negativity (LFN) by motivational salience, which persisted throughout a 4-week delay. MK-801, an N-methyl-D-aspartic acid receptor antagonist, abolished this differential response to motivational salience in conditioned mice. In contrast, a pronounced LFN response was observed towards the deviant, ie, perceptually salient tone, in nonconditioned mice. The finding of a selective modulation of a late frontal slow wave suggests increased top-down processing and emotional evaluation of motivationally salient stimuli. In particular, the LFN is discussed as the mouse analog to the human stimulus preceding negativity, which reflects preparatory processes in anticipation of reward or punishment. MK-801 led to a disruption of the normal response in conditioned and nonconditioned mice, including an aberrantly increased LFN in nonconditioned mice. This pattern of 'false-negative' and 'false-positive' responses suggests a degradation of salience attribution, which points to mPFC responses to be relevant for translational research on cognitive alterations in schizophrenia. PMID:22910459

  6. Salience and Attention in Surprisal-Based Accounts of Language Processing

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  7. Salience and attention in surprisal-based accounts of language processing

    Directory of Open Access Journals (Sweden)

    Alessandra eZarcone

    2016-06-01

    Full Text Available The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g. visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g. prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalise upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus.

  8. Salience and Attention in Surprisal-Based Accounts of Language Processing.

    Science.gov (United States)

    Zarcone, Alessandra; van Schijndel, Marten; Vogels, Jorrig; Demberg, Vera

    2016-01-01

    The notion of salience has been singled out as the explanatory factor for a diverse range of linguistic phenomena. In particular, perceptual salience (e.g., visual salience of objects in the world, acoustic prominence of linguistic sounds) and semantic-pragmatic salience (e.g., prominence of recently mentioned or topical referents) have been shown to influence language comprehension and production. A different line of research has sought to account for behavioral correlates of cognitive load during comprehension as well as for certain patterns in language usage using information-theoretic notions, such as surprisal. Surprisal and salience both affect language processing at different levels, but the relationship between the two has not been adequately elucidated, and the question of whether salience can be reduced to surprisal / predictability is still open. Our review identifies two main challenges in addressing this question: terminological inconsistency and lack of integration between high and low levels of representations in salience-based accounts and surprisal-based accounts. We capitalize upon work in visual cognition in order to orient ourselves in surveying the different facets of the notion of salience in linguistics and their relation with models of surprisal. We find that work on salience highlights aspects of linguistic communication that models of surprisal tend to overlook, namely the role of attention and relevance to current goals, and we argue that the Predictive Coding framework provides a unified view which can account for the role played by attention and predictability at different levels of processing and which can clarify the interplay between low and high levels of processes and between predictability-driven expectation and attention-driven focus. PMID:27375525

  9. Learning to predict where human gaze is using quaternion DCT based regional saliency detection

    Science.gov (United States)

    Li, Ting; Xu, Yi; Zhang, Chongyang

    2014-09-01

    Many current visual attention approaches use semantic features to accurately capture human gaze. However, these approaches demand a high computational cost and can hardly be applied to daily use. Recently, some quaternion-based saliency detection models, such as PQFT (phase spectrum of Quaternion Fourier Transform) and QDCT (Quaternion Discrete Cosine Transform), have been proposed to meet the real-time requirements of human gaze tracking tasks. However, current saliency detection methods use global PQFT and QDCT to locate jump edges of the input, which can hardly detect object boundaries accurately. To address the problem, we improve the QDCT-based saliency detection model by introducing a superpixel-wise regional saliency detection mechanism. The local smoothness of the saliency value distribution is emphasized to distinguish background noise from salient regions. Our saliency confidence measure can distinguish the patches belonging to the salient object from those of the background; it decides whether image patches belong to the same region. When an image patch belongs to a region consisting of other salient patches, this patch should be salient as well. Therefore, we use the saliency confidence map to obtain background and foreground weights and to optimize the saliency map obtained by QDCT. The optimization is accomplished by the least-squares method. The proposed optimization approach unifies local and global saliency by combining QDCT with a measure of the similarity between image superpixels. We evaluate our model on four commonly used datasets (Toronto, MIT, OSIE and ASD) using standard precision-recall curves (PR curves), the mean absolute error (MAE) and area under curve (AUC) measures. In comparison with most state-of-the-art models, our approach achieves higher consistency with human perception without training. It can estimate human gaze accurately even in cluttered backgrounds. Furthermore, it achieves a better compromise between speed and accuracy.
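
    The least-squares optimization described above, with background weights pulling saliency toward 0, foreground weights pulling it toward 1 and a smoothness term between neighbouring superpixels, has a standard quadratic form with a closed-form solution. The sketch below uses that common formulation; how the confidence-based weights themselves are built is specific to the paper and left out.

```python
# Sketch of the quadratic saliency optimization: minimize
#   sum_i w_bg[i]*s_i^2 + sum_i w_fg[i]*(s_i - 1)^2 + lam * s^T L s,
# whose stationary point solves (diag(w_bg + w_fg) + lam*L) s = w_fg.
import numpy as np

def optimize_saliency(w_bg, w_fg, W_smooth, lam=1.0):
    """w_bg, w_fg: (N,) non-negative weights; W_smooth: (N, N) symmetric affinities."""
    L = np.diag(W_smooth.sum(axis=1)) - W_smooth      # graph Laplacian
    A = np.diag(w_bg + w_fg) + lam * L
    s = np.linalg.solve(A, w_fg)
    return np.clip(s, 0.0, 1.0)
```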

  10. Metabolic mapping reveals sex-dependent involvement of default mode and salience network in alexithymia.

    Science.gov (United States)

    Colic, L; Demenescu, L R; Li, M; Kaufmann, J; Krause, A L; Metzger, C; Walter, M

    2016-02-01

    Alexithymia, a personality construct marked by difficulties in processing one's emotions, has been linked to altered activity in the anterior cingulate cortex (ACC). Although longitudinal studies have reported sex differences in alexithymia, what mediates them is not known. To investigate sex-specific associations between alexithymia and neuronal markers, we mapped metabolites in four brain regions differentially involved in emotion processing using a point-resolved spectroscopy MRS sequence at 3 Tesla. Both sexes showed negative correlations between alexithymia and N-acetylaspartate (NAA) in the pregenual ACC (pgACC). Women showed a robust negative correlation of the joint measure of glutamate and glutamine (Glx) to NAA in the posterior cingulate cortex (PCC), whereas men showed a weak positive association of Glx to NAA in the dorsal ACC (dACC). Our results suggest that lowered neuronal integrity in the pgACC, a region of the default mode network (DMN), might primarily account for the general difficulties in emotional processing in alexithymia. In women, the association with alexithymia extends to another DMN region, the PCC, while in men a region of the salience network (SN) was involved. These observations could reflect sex-specific regulation strategies, including diminished internal evaluation of feelings in women and cognitive emotion suppression in men. PMID:26341904

  11. Mortality salience enhances racial in-group bias in empathic neural responses to others' suffering.

    Science.gov (United States)

    Li, Xiaoyang; Liu, Yi; Luo, Siyang; Wu, Bing; Wu, Xinhuai; Han, Shihui

    2015-09-01

    Behavioral research suggests that mortality salience (MS) leads to increased in-group identification and in-group favoritism in prosocial behavior. What remains unknown is whether and how MS influences brain activity that mediates emotional resonance with in-group and out-group members and is associated with in-group favoritism in helping behavior. The current work investigated MS effects on empathic neural responses to racial in-group and out-group members' suffering. Experiments 1 and 2 respectively recorded event-related potentials (ERPs) and blood oxygen level-dependent (BOLD) signals in response to pain/neutral expressions of Asian and Caucasian faces in Chinese adults who had been primed with MS or negative affect (NA). Experiment 1 found that an early frontal/central activity (P2) was more strongly modulated by pain vs. neutral expressions of Asian than of Caucasian faces, but this effect was not affected by MS vs. NA priming. However, MS relative to NA priming enhanced the racial in-group bias in long-latency neural responses to pain expressions over the central/parietal regions (P3). Experiment 2 found that MS vs. NA priming increased the racial in-group bias in empathic neural responses to pain expressions in the anterior and mid-cingulate cortex. Our findings indicate that reminders of mortality enhance brain activity that differentiates between racial in-group and out-group members' emotional states, and they suggest a neural basis of in-group favoritism under mortality threat. PMID:26074201

  12. Hopelessly Mortal: The Role of Mortality Salience, Immortality and Trait Self-esteem in Personal Hope

    OpenAIRE

    Wisman, Arnaud; Heflick, Nathan A

    2015-01-01

    Do people lose hope when thinking about death? Based on Terror Management Theory, we predicted that thoughts of death (i.e., mortality salience) would reduce personal hope for people low, but not high, in self-esteem, and that this reduction in hope would be ameliorated by promises of immortality. In Studies 1 and 2, mortality salience reduced personal hope for people low in self-esteem, but not for people high in self-esteem. In Study 3, mortality salience reduced hope for people low in self...

  13. Incentive salience attribution under reward uncertainty: A Pavlovian model.

    Science.gov (United States)

    Anselme, Patrick

    2015-02-01

    There is a vast literature on the behavioural effects of partial reinforcement in Pavlovian conditioning. Compared with animals receiving continuous reinforcement, partially rewarded animals typically show (a) a slower development of the conditioned response (CR) early in training and (b) a higher asymptotic level of the CR later in training. This phenomenon is known as the partial reinforcement acquisition effect (PRAE). Learning models of Pavlovian conditioning fail to account for it. In accordance with the incentive salience hypothesis, it is here argued that incentive motivation (or 'wanting') plays a more direct role in controlling behaviour than does learning, and reward uncertainty is shown to have an excitatory effect on incentive motivation. The psychological origin of that effect is discussed and a computational model integrating this new interpretation is developed. Many features of CRs under partial reinforcement emerge from this model. PMID:25444780

  14. Mortality salience increases defensive distancing from people with terminal cancer.

    Science.gov (United States)

    Smith, Lauren M; Kasser, Tim

    2014-01-01

    Based on principles of terror management theory, the authors hypothesized that participants would distance more from a target person with terminal cancer than from a target with arthritis, and that this effect would be stronger following mortality salience. In Study 1, adults rated how similar their personalities were to a target person; in Study 2, participants arranged two chairs in preparation for meeting the target person. Both studies found that distancing from the person with terminal cancer increased after participants wrote about their own death (vs. giving a speech). Thus, death anxiety may explain why people avoid close contact with terminally ill people; further analyses suggest that gender and self-esteem may also influence such distancing from the terminally ill. PMID:24521045

  15. Decisive Visual Saliency and Consumers' In-store Decisions

    DEFF Research Database (Denmark)

    Clement, Jesper; Aastrup, Jesper; Forsberg, Signe Charlotte

    2015-01-01

    that they are affected by the visual stimuli in the store. The objective for this paper is to investigate the visual saliency from two factors: 1) in-store signage and 2) placement of products. This is done by a triangulation method where we utilize data from an eye-track study and sales data from...... grocery stores. The first study takes place in laboratory settings with a simulated purchase situation, and the second research design builds on manipulated in-store settings and data from real purchases. We found optimal placement of two comparable goods (branded good and private label) to increase...... visual attention and sale for both goods. The use of signage increases visual attention and sale as well, yet only for the product that the label addressed, implying a cannibalization effect. The limitation of the study and implications for retail managers and for brand owners are discussed....

  16. The time course of color- and luminance-based salience effects.

    Directory of Open Access Journals (Sweden)

    Isabel C Dombrowe

    2010-11-01

    Salient objects in the visual field attract our attention. Recent work in the orientation domain has shown that the effects of the relative salience of two singleton elements on covert visual attention disappear over time. The present study aims to investigate how salience derived from color and luminance differences affects covert selection. In two experiments, observers indicated the location of a probe which was presented at different stimulus-onset-asynchronies after the presentation of a singleton display containing a homogeneous array of oriented lines and two distinct color singletons (Experiment 1) or luminance singletons (Experiment 2). The results show that relative singleton salience from luminance and color differences, just as from orientation differences, affects covert visual attention in a brief time span after stimulus onset. The mere presence of an object, however, can affect covert attention for a longer time span regardless of salience.

  17. Development of salience-driven and visually-guided eye movement responses.

    Science.gov (United States)

    Kooiker, Marlou J G; van der Steen, Johannes; Pel, Johan J M

    2016-03-01

    Development of visuospatial attention can be quantified from infancy onward using visually-guided eye movement responses. We investigated the interaction between eye movement response times and salience in target areas of visual stimuli over age in a cohort of typically developing children. A preferential looking (PL) paradigm consisting of stimuli with six different visual modalities (cartoons, contrast, form, local motion, color, global motion) was combined with the automated measurement of reflexive eye movements. Effective salience was defined as visual salience of each target area relative to its background. Three classes of PL stimuli were used: with high- (cartoon, contrast), intermediate- (local motion, form), and low-effective salience (global motion, color). Eye movement response times to the target areas of the six PL stimuli were nonverbally assessed in 220 children aged 1-12 years. The development of response times with age was influenced by effective salience: Response times to targets with high salience reached stable values earlier in development (around 4 years of age) than to targets with low salience (around 9 years of age). Intra-individual response time variability was highest for low-salient stimuli, and stabilized later (around 4 years) than for highly salient stimuli (2 years). The improvement of eye movement response times to visual modalities in PL stimuli occurred earlier in development for highly salient than for low-salient targets. The present age-dependent and salience-related results provide a quantitative and theoretical framework to assess the development of visuospatial attention, and of related visual processing capacities, in children from 1 year of age. PMID:26998802

  18. Hypergraph-based saliency map generation with potential region-of-interest approximation and validation

    Science.gov (United States)

    Liang, Zhen; Fu, Hong; Chi, Zheru; Feng, Dagan

    2012-01-01

    A novel saliency model is proposed in this paper to automatically process images in a similar way to the human visual system, which focuses on conspicuous regions that catch human attention. The model combines a hypergraph representation and a partitioning process with potential region-of-interest (p-ROI) approximation and validation. Experimental results demonstrate that the proposed method shows considerable improvement in the performance of saliency map generation.

  19. Evidence Inhibition Responds Reactively to the Salience of Distracting Information during Focused Attention

    OpenAIRE

    Natalie Wyatt; Liana Machado

    2013-01-01

    Along with target amplification, distractor inhibition is regarded as a major contributor to selective attention. Some theories suggest that the strength of inhibitory processing is proportional to the salience of the distractor (i.e., inhibition reacts to the distractor intensity). Other theories suggest that the strength of inhibitory processing does not depend on the salience of the distractor (i.e., inhibition does not react to the distractor intensity). The present study aimed to elucida...

  20. The Time Course of Color- and Luminance-Based Salience Effects

    OpenAIRE

    Mieke Donk

    2010-01-01

    Salient objects in the visual field attract our attention. Recent work in the orientation domain has shown that the effects of the relative salience of two singleton elements on covert visual attention disappear over time. The present study aims to investigate how salience derived from color and luminance differences affects covert selection. In two experiments, observers indicated the location of a probe which was presented at different stimulus-onset-asynchronies after the presentation of a...

  1. Objective prediction of visual saliency maps in egocentric videos for content-action interpretation

    OpenAIRE

    Boujut, Hugo; Buso, Vincent; Benois-Pineau, Jenny

    2013-01-01

    Extraction of visual saliency from video is in the focus of intensive research nowadays due to the variety and importance of application areas. In this paper we study the relation between subjective saliency maps, recorded on the basis of gaze-tracker data, in a new upcoming video content: the egocentric video recorded with wearable cameras. On the basis of physiological research and comparing the subjective maps of an Actor performing activities of everyday life and a Viewer who interp...

  2. Competition of synonyms through time : Conceptual and social salience factors and their interrelations

    OpenAIRE

    Soares da Silva, Augusto

    2015-01-01

    This paper highlights three theoretical and descriptive insights into synonymy and lexical variation and change: (1) the diachronic development of synonymous forms reveals essential aspects about the nature and motivations of synonymy; (2) the emergence and competition of synonymous forms can either result from conceptual salience factors or from social salience factors; (3) synonym competition sheds light upon processes of language variation and change. Focusing on the interplay between conc...

  3. DeepSaliency: Multi-Task Deep Neural Network Model for Salient Object Detection.

    Science.gov (United States)

    Li, Xi; Zhao, Liming; Wei, Lina; Yang, Ming-Hsuan; Wu, Fei; Zhuang, Yueting; Ling, Haibin; Wang, Jingdong

    2016-08-01

    A key problem in salient object detection is how to effectively model the semantic properties of salient objects in a data-driven manner. In this paper, we propose a multi-task deep saliency model based on a fully convolutional neural network with global input (whole raw images) and global output (whole saliency maps). In principle, the proposed saliency model takes a data-driven strategy for encoding the underlying saliency prior information, and then sets up a multi-task learning scheme for exploring the intrinsic correlations between saliency detection and semantic image segmentation. Through collaborative feature learning from such two correlated tasks, the shared fully convolutional layers produce effective features for object perception. Moreover, it is capable of capturing the semantic information on salient objects across different levels using the fully convolutional layers, which investigate the feature-sharing properties of salient object detection with a great reduction of feature redundancy. Finally, we present a graph Laplacian regularized nonlinear regression model for saliency refinement. Experimental results demonstrate the effectiveness of our approach in comparison with the state-of-the-art approaches. PMID:27305676
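
    The multi-task idea of sharing fully convolutional features between saliency detection and semantic segmentation can be sketched as follows. This is a toy PyTorch stand-in under assumed layer sizes and class count; it is not the DeepSaliency architecture and omits its graph Laplacian regularized refinement, showing only a shared trunk, two pixel-wise heads, and a joint loss:

        import torch
        import torch.nn as nn
        import torch.nn.functional as F

        class TwoTaskFCN(nn.Module):
            """Shared fully convolutional trunk with a 1-channel saliency head
            and a num_classes-channel segmentation head (toy stand-in)."""

            def __init__(self, num_classes: int = 21):
                super().__init__()
                self.trunk = nn.Sequential(
                    nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(inplace=True),
                    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(inplace=True),
                )
                self.saliency_head = nn.Conv2d(64, 1, 1)
                self.segmentation_head = nn.Conv2d(64, num_classes, 1)

            def forward(self, x):
                feats = self.trunk(x)                  # features shared by both tasks
                return self.saliency_head(feats), self.segmentation_head(feats)

        # Joint training step on random stand-in data (global input, global output).
        model = TwoTaskFCN()
        img = torch.randn(2, 3, 64, 64)
        sal_gt = torch.rand(2, 1, 64, 64)              # ground-truth saliency maps
        seg_gt = torch.randint(0, 21, (2, 64, 64))     # ground-truth class labels
        sal_pred, seg_pred = model(img)
        loss = (F.binary_cross_entropy_with_logits(sal_pred, sal_gt)
                + F.cross_entropy(seg_pred, seg_gt))
        loss.backward()

    Sharing the trunk is what lets gradients from both losses shape a common feature representation, which is the collaborative feature learning the abstract describes.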

  4. The Motivational Salience of Faces Is Related to Both Their Valence and Dominance.

    Science.gov (United States)

    Wang, Hongyi; Hahn, Amanda C; DeBruine, Lisa M; Jones, Benedict C

    2016-01-01

    Both behavioral and neural measures of the motivational salience of faces are positively correlated with their physical attractiveness. Whether physical characteristics other than attractiveness contribute to the motivational salience of faces is not known, however. Research with male macaques recently showed that more dominant macaques' faces hold greater motivational salience. Here we investigated whether dominance also contributes to the motivational salience of faces in human participants. Principal component analysis of third-party ratings of faces for multiple traits revealed two orthogonal components. The first component ("valence") was highly correlated with rated trustworthiness and attractiveness. The second component ("dominance") was highly correlated with rated dominance and aggressiveness. Importantly, both components were positively and independently related to the motivational salience of faces, as assessed from responses on a standard key-press task. These results show that at least two dissociable components underpin the motivational salience of faces in humans and present new evidence for similarities in how humans and non-human primates respond to facial cues of dominance. PMID:27513859

  5. Issue Salience and the Domestic Legitimacy Demands of European Integration. The Cases of Britain and Germany

    Directory of Open Access Journals (Sweden)

    Henrike Viehrig

    2008-04-01

    The salience of European issues to the general public is a major determinant of the domestic legitimacy demands that governments face when they devise their European policies. The higher the salience of these issues, the more restrictive will be the legitimacy demands that governments have to meet on the domestic level. Whereas the domestic legitimacy of European policy can rest on a permissive consensus among the public in cases of low issue salience, it requires the electorate’s explicit endorsement in cases of high issue salience. Polling data from Britain and Germany show that the salience of European issues is clearly higher in Britain than in Germany. We thus conclude that British governments face tougher domestic legitimacy demands when formulating their European policies than German governments. This may contribute to accounting for both countries’ different approaches to the integration process: Germany as a role model of a pro-integrationist member state and, in contrast, Britain as the eternal 'awkward partner'.

  6. Abnormal salience signaling in schizophrenia: The role of integrative beta oscillations.

    Science.gov (United States)

    Liddle, Elizabeth B; Price, Darren; Palaniyappan, Lena; Brookes, Matthew J; Robson, Siân E; Hall, Emma L; Morris, Peter G; Liddle, Peter F

    2016-04-01

    Aberrant salience attribution and cerebral dysconnectivity both have strong evidential support as core dysfunctions in schizophrenia. Aberrant salience arising from an excess of dopamine activity has been implicated in delusions and hallucinations, exaggerating the significance of everyday occurrences and thus leading to perceptual distortions and delusional causal inferences. Meanwhile, abnormalities in key nodes of a salience brain network have been implicated in other characteristic symptoms, including the disorganization and impoverishment of mental activity. A substantial body of literature reports disruption to brain network connectivity in schizophrenia. Electrical oscillations likely play a key role in the coordination of brain activity at spatially remote sites, and evidence implicates beta band oscillations in long-range integrative processes. We used magnetoencephalography and a task designed to disambiguate responses to relevant from irrelevant stimuli to investigate beta oscillations in nodes of a network implicated in salience detection and previously shown to be structurally and functionally abnormal in schizophrenia. Healthy participants, as expected, produced an enhanced beta synchronization to behaviorally relevant, as compared to irrelevant, stimuli, while patients with schizophrenia showed the reverse pattern: a greater beta synchronization in response to irrelevant than to relevant stimuli. These findings not only support both the aberrant salience and disconnectivity hypotheses, but indicate a common mechanism that allows us to integrate them into a single framework for understanding schizophrenia in terms of disrupted recruitment of contextually appropriate brain networks. Hum Brain Mapp 37:1361-1374, 2016. PMID:26853904

  7. Mortality salience, martyrdom, and military might: the great satan versus the axis of evil.

    Science.gov (United States)

    Pyszczynski, Tom; Abdollahi, Abdolhossein; Solomon, Sheldon; Greenberg, Jeff; Cohen, Florette; Weise, David

    2006-04-01

    Study 1 investigated the effect of mortality salience on support for martyrdom attacks among Iranian college students. Participants were randomly assigned to answer questions about either their own death or an aversive topic unrelated to death and then evaluated materials from fellow students who either supported or opposed martyrdom attacks against the United States. Whereas control participants preferred the student who opposed martyrdom, participants reminded of death preferred the student who supported martyrdom and indicated they were more likely to consider such activities themselves. Study 2 investigated the effect of mortality salience on American college students' support for extreme military interventions by American forces that could kill thousands of civilians. Mortality salience increased support for such measures among politically conservative but not politically liberal students. The roles of existential fear, cultural worldviews, and construing one's nation as pursuing a heroic battle against evil in advocacy of violence were discussed. PMID:16513804

  8. Aircraft Detection in High-Resolution SAR Images Based on a Gradient Textural Saliency Map

    Directory of Open Access Journals (Sweden)

    Yihua Tan

    2015-09-01

    This paper proposes a new automatic and adaptive aircraft target detection algorithm for high-resolution synthetic aperture radar (SAR) images of airports. The proposed method is based on a gradient textural saliency map computed under the contextual cues of the apron area. Firstly, candidate regions in which targets may be present are detected within the apron area. Secondly, a directional local gradient distribution detector is used to obtain a gradient textural saliency map over the candidate regions. Finally, targets are detected by segmenting the saliency map with a CFAR-type algorithm. Real high-resolution airborne SAR image data are used to verify the proposed algorithm. The results demonstrate that the algorithm can detect aircraft targets quickly and accurately while decreasing the false alarm rate.
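
    The final CFAR-type segmentation step can be illustrated with a cell-averaging CFAR applied to a 2-D saliency map. The sketch below (Python/NumPy; the window sizes and threshold factor are illustrative assumptions, not values from the paper) flags cells whose saliency exceeds a multiple of the locally estimated background level:

        import numpy as np
        from scipy.ndimage import uniform_filter

        def ca_cfar(saliency, guard=2, train=6, scale=3.0):
            """Cell-averaging CFAR detection on a 2-D map.

            guard : half-width of the guard window around the cell under test
            train : half-width of the outer (training) window
            scale : multiplicative threshold applied to the background estimate
            Returns a boolean detection mask.
            """
            outer = 2 * train + 1
            inner = 2 * guard + 1
            # Local sums via box filters (uniform_filter returns local means).
            sum_outer = uniform_filter(saliency, size=outer) * outer ** 2
            sum_inner = uniform_filter(saliency, size=inner) * inner ** 2
            background = (sum_outer - sum_inner) / (outer ** 2 - inner ** 2)
            return saliency > scale * background

        # Toy usage: a random map standing in for a gradient textural saliency map.
        rng = np.random.default_rng(1)
        sal = rng.random((128, 128))
        sal[60:65, 60:65] += 5.0                       # a bright synthetic "target"
        print(ca_cfar(sal).sum(), "cells flagged")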

  9. Neural Dynamics of Emotional Salience Processing in Response to Voices during the Stages of Sleep

    Science.gov (United States)

    Chen, Chenyi; Sung, Jia-Ying; Cheng, Yawei

    2016-01-01

    Sleep has been related to emotional functioning. However, the extent to which emotional salience is processed during sleep is unknown. To address this concern, we investigated night sleep in healthy adults regarding brain reactivity to the emotionally (happily, fearfully) spoken meaningless syllables dada, along with correspondingly synthesized nonvocal sounds. Electroencephalogram (EEG) signals were continuously acquired during an entire night of sleep while we applied a passive auditory oddball paradigm. During all stages of sleep, mismatch negativity (MMN) in response to emotional syllables, which is an index for emotional salience processing of voices, was detected. In contrast, MMN to acoustically matching nonvocal sounds was undetected during Sleep Stage 2 and 3 as well as rapid eye movement (REM) sleep. Post-MMN positivity (PMP) was identified with larger amplitudes during Stage 3, and at earlier latencies during REM sleep, relative to wakefulness. These findings clearly demonstrated the neural dynamics of emotional salience processing during the stages of sleep. PMID:27378870

  10. The self salience model of other-to-self effects: Integrating principles of self-enhancement, complementarity, and imitation

    NARCIS (Netherlands)

    Stapel, DA; Van der Zee, KI

    2006-01-01

    In a series of studies, the Self Salience Model of other-to-self effects is tested. This model posits that self-construal salience is an important determinant of whether other-to-self effects follow the principles of self-enhancement, imitation, or complementarity. Participants imagined interactions

  11. Toward isolating the role of dopamine in the acquisition of incentive salience attribution.

    Science.gov (United States)

    Chow, Jonathan J; Nickell, Justin R; Darna, Mahesh; Beckmann, Joshua S

    2016-10-01

    Stimulus-reward learning has been heavily linked to the reward-prediction error learning hypothesis and dopaminergic function. However, some evidence suggests dopaminergic function may not strictly underlie reward-prediction error learning, but may be specific to incentive salience attribution. Utilizing a Pavlovian conditioned approach procedure consisting of two stimuli that were equally reward-predictive (both undergoing reward-prediction error learning) but functionally distinct in regard to incentive salience (levers that elicited sign-tracking and tones that elicited goal-tracking), we tested the differential role of D1 and D2 dopamine receptors and nucleus accumbens dopamine in the acquisition of sign- and goal-tracking behavior and their associated conditioned reinforcing value within individuals. Overall, the results revealed that both D1 and D2 inhibition disrupted performance of sign- and goal-tracking. However, D1 inhibition specifically prevented the acquisition of sign-tracking to a lever, instead promoting goal-tracking and decreasing its conditioned reinforcing value, while neither D1 nor D2 signaling was required for goal-tracking in response to a tone. Likewise, nucleus accumbens dopaminergic lesions disrupted acquisition of sign-tracking to a lever, while leaving goal-tracking in response to a tone unaffected. Collectively, these results are the first evidence of an intraindividual dissociation of dopaminergic function in incentive salience attribution from reward-prediction error learning, indicating that incentive salience, reward-prediction error, and their associated dopaminergic signaling exist within individuals and are stimulus-specific. Thus, individual differences in incentive salience attribution may be reflective of a differential balance in dopaminergic function that may bias toward the attribution of incentive salience, relative to reward-prediction error learning only. PMID:27371135

  12. Multi-scale mesh saliency with local adaptive patches for viewpoint selection

    OpenAIRE

    Nouri, Anass; Charrier, Christophe; Lézoray, Olivier

    2015-01-01

    Our visual attention is attracted by specific areas of 3D objects (represented by meshes). This visual attention depends on the degree of saliency exposed by these areas. In this paper, we propose a novel multi-scale approach for detecting salient regions. To do so, we define a local surface descriptor based on patches of adaptive size and filled in with a local height field. The single-scale saliency of a vertex is defined as its degree measure in the mesh with ed...

  13. Perspectives on the Salience and Magnitude of Dam Impacts for Hydro Development Scenarios in China

    Directory of Open Access Journals (Sweden)

    Desiree Tullos

    2010-06-01

    Survey results indicate differences in the perceived salience and magnitude of impacts across both expert groups and dam scenarios. Furthermore, surveys indicate that stakeholder perceptions changed as the information provided regarding dam impacts became more specific, suggesting that stakeholder evaluation may be influenced by quality of information. Finally, qualitative comments from the survey reflect some of the challenges of interdisciplinary dam assessment, including cross-disciplinary cooperation, data standardisation and weighting, and the distribution and potential mitigation of impacts. Given the complexity of data and perceptions around dam impacts, decision-support tools that integrate the objective magnitude and perceived salience of impacts are required urgently.

  14. The Intonational Marking of Topical Salience in Spontaneous Speech: Evidence from Spoken French

    OpenAIRE

    Lacheret-Dujour, Anne

    2002-01-01

    Our analysis of prosodic patterns in spontaneous French speech is based on the hypothesis that intonation, far from being a reflection of syntactic and rhythmic organisation only, also depends on information structure. This hypothesis is illustrated here by the role of prosodic markedness in the expression of topic. We propose a bottom-up model of prosody, from signal processing (segmentation and labelling of prosodic units according to perceptual constraints) to phonological ...

  15. Individual Variation in the Propensity to Attribute Incentive Salience to an Appetitive Cue Predicts the Propensity to Attribute Motivational Salience to an Aversive Cue

    OpenAIRE

    Morrow, Jonathan D.; Maren, Stephen; ROBINSON, TERRY E.

    2011-01-01

    It has been proposed that animals that attribute high levels of incentive salience to reward-related cues may be especially vulnerable to addiction. Individual variation has also been observed in the motivational value attributed to aversive cues, which may confer vulnerability to anxiety disorders such as post-traumatic stress disorder (PTSD). There may be a core behavioral trait that contributes to individual variation in the motivational value assigned to predictive cues regardless of emot...

  16. Basal forebrain motivational salience signal enhances cortical processing and decision speed

    Directory of Open Access Journals (Sweden)

    Sylvina M Raver

    2015-10-01

    The basal forebrain (BF) contains major projections to the cerebral cortex, and plays a well-documented role in arousal, attention, decision-making, and in modulating cortical activity. BF neuronal degeneration is an early event in Alzheimer’s disease and dementias, and occurs in normal cognitive aging. While the BF is best known for its population of cortically projecting cholinergic neurons, the region is anatomically and neurochemically diverse, and also contains prominent populations of non-cholinergic projection neurons. In recent years, increasing attention has been dedicated to these non-cholinergic BF neurons in order to better understand how non-cholinergic BF circuits control cortical processing and behavioral performance. In this review, we focus on a unique population of putative non-cholinergic BF neurons that encodes the motivational salience of stimuli with a robust ensemble bursting response. We review recent studies that describe the specific physiological and functional characteristics of these BF salience-encoding neurons in behaving animals. These studies support the unifying hypothesis whereby BF salience-encoding neurons act as a gain modulation mechanism of the decision-making process to enhance cortical processing of behaviorally relevant stimuli, and thereby facilitate faster and more precise behavioral responses. This function of BF salience-encoding neurons represents a critical component in determining which incoming stimuli warrant an animal’s attention, and is therefore a fundamental and early requirement of behavioral flexibility.

  17. Two-scale image fusion of visible and infrared images using saliency detection

    Science.gov (United States)

    Bavirisetti, Durga Prasad; Dhuli, Ravindra

    2016-05-01

    Military, navigation and concealed weapon detection applications need different imaging modalities, such as visible and infrared, to monitor a targeted scene. These modalities provide complementary information. For better situational awareness, the complementary information of these images has to be integrated into a single image. Image fusion is the process of integrating complementary source information into a composite image. In this paper, we propose a new image fusion method based on saliency detection and two-scale image decomposition. This method is beneficial because the visual saliency extraction process introduced in this paper highlights the salient information of the source images very well. A new weight map construction process based on visual saliency is proposed. This process is able to integrate the visually significant information of the source images into the fused image. In contrast to most multi-scale image fusion techniques, the proposed technique uses only a two-scale image decomposition, so it is fast and efficient. Our method is tested on several image pairs and is evaluated qualitatively by visual inspection and quantitatively using objective fusion metrics. Outcomes of the proposed method are compared with state-of-the-art multi-scale fusion techniques. The results reveal that the proposed method's performance is comparable or superior to that of existing methods.
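
    A minimal sketch of such a two-scale pipeline is given below. The box-filter base/detail split and the smoothed Laplacian-magnitude saliency used for the weight maps are simplified stand-ins for the paper's decomposition and visual-saliency extraction, intended only to show how the two scales and the per-pixel weight maps combine:

        import numpy as np
        from scipy.ndimage import uniform_filter, laplace

        def fuse_two_scale(img_vis, img_ir, base_size=31):
            """Fuse two co-registered grayscale images (float arrays in [0, 1])."""

            def decompose(img):
                base = uniform_filter(img, size=base_size)   # coarse base layer
                return base, img - base                      # detail layer

            def saliency(img):
                # Simple stand-in saliency: smoothed absolute Laplacian response.
                return uniform_filter(np.abs(laplace(img)), size=5)

            b_vis, d_vis = decompose(img_vis)
            b_ir, d_ir = decompose(img_ir)
            s_vis, s_ir = saliency(img_vis), saliency(img_ir)
            w_vis = s_vis / (s_vis + s_ir + 1e-12)           # per-pixel weight maps
            w_ir = 1.0 - w_vis
            fused_base = 0.5 * (b_vis + b_ir)                # average the base layers
            fused_detail = w_vis * d_vis + w_ir * d_ir       # saliency-weighted details
            return np.clip(fused_base + fused_detail, 0.0, 1.0)

    Because only one smoothing pass per source image is needed for the decomposition, the scheme stays cheap, which reflects the speed advantage of a two-scale split over deeper multi-scale pyramids.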

  18. Subjective and Objective Parameters Determining "Salience" in Long-Term Dialect Accommodation.

    Science.gov (United States)

    Auer, Peter; Barden, Birgit; Grosskopf, Beate

    1998-01-01

    Presents results of a longitudinal study on long-term dialect accommodation in a German dialect setting. An important model of explaining which linguistic structures undergo such convergence and which do not makes use of the notion of "salience." (Author/VWL)

  19. Movement or Goal: Goal Salience and Verbal Cues Affect Preschoolers' Imitation of Action Components

    Science.gov (United States)

    Elsner, Birgit; Pfeifer, Caroline

    2012-01-01

    The impact of goal salience and verbal cues given by the model on 3- to 5-year-olds' reproduction of action components (movement or goal) was investigated in an imitation choice task. Preschoolers watched an experimenter moving a puppet up or down a ramp, terminating at one of two target objects. The target objects were either differently colored…

  20. The Development of Visual Search in Infancy: Attention to Faces versus Salience

    Science.gov (United States)

    Kwon, Mee-Kyoung; Setoodehnia, Mielle; Baek, Jongsoo; Luck, Steven J.; Oakes, Lisa M.

    2016-01-01

    Four experiments examined how faces compete with physically salient stimuli for the control of attention in 4-, 6-, and 8-month-old infants (N = 117 total). Three computational models were used to quantify physical salience. We presented infants with visual search arrays containing a face and familiar object(s), such as shoes and flowers. Six- and…