WorldWideScience

Sample records for bottom-up saliency mediates

  1. Automatic Polyp Detection via A Novel Unified Bottom-up and Top-down Saliency Approach.

    Science.gov (United States)

    Yuan, Yixuan; Li, Dengwang; Meng, Max Q-H

    2017-07-31

    In this paper, we propose a novel automatic computer-aided method to detect polyps in colonoscopy videos. To find the perceptually and semantically meaningful salient polyp regions, we first segment images into multilevel superpixels, where each level corresponds to a different superpixel size. Rather than adopting hand-designed features to describe these superpixels, we employ a sparse autoencoder (SAE) to learn discriminative features in an unsupervised way. Then a novel unified bottom-up and top-down saliency method is proposed to detect polyps. In the first stage, we propose a weak bottom-up (WBU) saliency map by fusing contrast-based saliency and object-center-based saliency. The contrast-based saliency map highlights image parts that show different appearances compared with surrounding areas, while the object-center-based saliency map emphasizes the center of the salient object. In the second stage, a strong classifier with Multiple Kernel Boosting (MKB) is learned to calculate the strong top-down (STD) saliency map based on samples drawn directly from the obtained multilevel WBU saliency maps. We finally integrate these two-stage saliency maps from all levels to highlight polyps. Experimental results achieve 0.818 recall for saliency calculation, validating the effectiveness of our method. Extensive experiments on public polyp datasets demonstrate that the proposed saliency algorithm performs favorably against state-of-the-art saliency methods in detecting polyps.
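    The two-stage fusion described in this abstract can be sketched roughly as follows; the multiplicative fusion rule, the per-level normalisation, and the plain mean over levels are illustrative assumptions, not the authors' published formulas:

```python
import numpy as np

def weak_bottom_up_saliency(contrast_maps, center_maps):
    """Fuse a contrast-based map with an object-center map at each superpixel
    level, then integrate across levels. Multiplicative fusion and a simple
    mean over levels are assumptions made for illustration only."""
    fused = []
    for contrast, center in zip(contrast_maps, center_maps):
        wbu = contrast * center                                # both cues must agree
        wbu = (wbu - wbu.min()) / (wbu.max() - wbu.min() + 1e-12)  # to [0, 1]
        fused.append(wbu)
    return np.mean(fused, axis=0)                              # integrate all levels

# toy example: two superpixel levels, each a 4x4 saliency map
rng = np.random.default_rng(0)
contrast_maps = [rng.random((4, 4)) for _ in range(2)]
center_maps = [rng.random((4, 4)) for _ in range(2)]
saliency = weak_bottom_up_saliency(contrast_maps, center_maps)
```

    In the paper, the resulting multilevel WBU maps then serve as training samples for the MKB classifier that produces the strong top-down map, a supervised step this sketch does not cover.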

  2. The Roles of Feature-Specific Task Set and Bottom-Up Salience in Attentional Capture: An ERP Study

    Science.gov (United States)

    Eimer, Martin; Kiss, Monika; Press, Clare; Sauter, Disa

    2009-01-01

    We investigated the roles of top-down task set and bottom-up stimulus salience for feature-specific attentional capture. Spatially nonpredictive cues preceded search arrays that included a color-defined target. For target-color singleton cues, behavioral spatial cueing effects were accompanied by cue-induced N2pc components, indicative of…

  3. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantify the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of our human visual system, play a vital role in eye movement control, whether we know what we are looking at or not. Copyright © 2011 Elsevier Ltd. All rights reserved.
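    The saliency-fixation relationship this abstract quantifies can be illustrated with a simple permutation baseline: compare saliency at fixated locations against randomly sampled locations from the same image. This is a generic construction of that kind of analysis, not the authors' exact pipeline:

```python
import numpy as np

def saliency_at_fixations(saliency_map, fixations, n_shuffles=1000, seed=0):
    """Return mean saliency at fixated pixels and the fraction of random
    baselines it beats (~0.5 would mean fixations are no more salient than
    chance). A hypothetical analysis sketch, not the published method."""
    rng = np.random.default_rng(seed)
    rows, cols = zip(*fixations)
    observed = saliency_map[list(rows), list(cols)].mean()
    h, w = saliency_map.shape
    baseline = np.array([
        saliency_map[rng.integers(0, h, len(fixations)),
                     rng.integers(0, w, len(fixations))].mean()
        for _ in range(n_shuffles)
    ])
    return observed, (observed > baseline).mean()

# toy image with a salient centre, and fixations landing on that centre
smap = np.zeros((50, 50))
smap[20:30, 20:30] = 1.0
obs, pctl = saliency_at_fixations(smap, [(25, 25), (22, 27), (28, 21)])
```

    A task-dependent saliency-fixation relationship, as reported above, would show up here as this percentile varying across viewing conditions.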

  4. Bottom up

    International Nuclear Information System (INIS)

    Ockenden, James

    1999-01-01

    This article presents an overview of the electric supply industries in Eastern Europe. The development of more competitive and efficient plant in Poland and work on emissions control ahead of EU membership; the Czech Republic's complicated tariff system; Hungary's promised 8% return on investment in its electricity supply industry and its tariff problems; Bulgaria's and Ukraine's desperate need for investment to build alternatives to their aging nuclear plants; and demand outstripping supply in Romania are among the topics considered. The vicious circle of poor service and low utility income is considered, and the top-down approach to breaking the cycle by improving plant efficiency, and the bottom-up approach of improving plant income as practiced by Moldavia, are explained. (UK)

  5. Relative importance of plant-mediated bottom-up and top-down forces on herbivore abundance on Brassica oleracea

    NARCIS (Netherlands)

    Kos, M.; Broekgaarden, C.; Kabouw, P.; Oude Lenferink, K.; Poelman, E.H.; Vet, L.E.M.; Dicke, M.; Loon, van J.J.A.

    2011-01-01

    1. Arthropod communities are structured by complex interactions between bottom-up (resource-based) and top-down (natural enemy-based) forces. Their relative importance in shaping arthropod communities, however, continues to be under debate. Bottom-up and top-down forces can be affected by

  6. Self-Efficacy as a Mediator in Bottom-Up Dissemination of a Research-Supported Intervention for Young, Traumatized Children and Their Families.

    Science.gov (United States)

    David, Paula; Schiff, Miriam

    2017-01-01

    Implementation literature has under-reported bottom-up dissemination attempts of research-supported interventions (RSI). This study examined factors associated with individual clinicians' implementation of Child-Parent Psychotherapy (CPP), including CPP social network (SN), supervision, and self-efficacy. Seventy-seven (90%) CPP graduates completed a cross-sectional survey, including measures regarding social network, receiving supervision, and CPP self-efficacy. Self-efficacy was significantly associated with CPP implementation; CPP SN and supervision were not. Mediation models showed that self-efficacy significantly mediated the relationships between CPP SN and supervision, on the one hand, and the implementation variables on the other. Findings illuminate the importance of supporting clinicians using a new RSI, particularly in bottom-up dissemination, in order to foster RSI self-efficacy.

  7. Formation of Monocrystalline 1D and 2D Architectures via Epitaxial Attachment: Bottom-Up Routes through Surfactant-Mediated Arrays of Oriented Nanocrystals.

    Science.gov (United States)

    Nakagawa, Yoshitaka; Kageyama, Hiroyuki; Oaki, Yuya; Imai, Hiroaki

    2015-06-09

    Monocrystalline architectures with well-defined shapes were achieved by bottom-up routes through the epitaxial attachment of Mn3O4 nanocrystals. Crystallographically continuous 1D chains elongated along the a axis and 2D panels having large a or c faces were obtained by removal of the organic mediator from surfactant-mediated 1D and 2D arrays of Mn3O4 nanocrystals, respectively. Our approach indicates that epitaxial attachment through surfactant-mediated arrays can be utilized for the fabrication of a wide variety of micrometric architectures from nanometric crystalline units.

  8. Bottom-up nutrient and top-down fish impacts on insect-mediated mercury flux from aquatic ecosystems.

    Science.gov (United States)

    Jones, Taylor A; Chumchal, Matthew M; Drenner, Ray W; Timmins, Gabrielle N; Nowlin, Weston H

    2013-03-01

    Methyl mercury (MeHg) is one of the most hazardous contaminants in the environment, adversely affecting the health of wildlife and humans. Recent studies have demonstrated that aquatic insects biotransport MeHg and other contaminants to terrestrial consumers, but the factors that regulate the flux of MeHg out of aquatic ecosystems via emergent insects have not been studied. The authors used experimental mesocosms to test the hypothesis that insect emergence and the associated flux of MeHg from aquatic to terrestrial ecosystems is affected by both bottom-up nutrient effects and top-down fish consumer effects. In the present study, nutrient addition led to an increase in MeHg flux primarily by enhancing the biomass of emerging insects whose tissues were contaminated with MeHg, whereas fish decreased MeHg flux primarily by reducing the biomass of emerging insects. Furthermore, the authors found that these factors are interdependent such that the effects of nutrients are more pronounced when fish are absent, and the effects of fish are more pronounced when nutrient concentrations are high. The present study is the first to demonstrate that the flux of MeHg from aquatic to terrestrial ecosystems is strongly enhanced by bottom-up nutrient effects and diminished by top-down consumer effects. Copyright © 2012 SETAC.

  9. Culture from the Bottom Up

    Science.gov (United States)

    Atkinson, Dwight; Sohn, Jija

    2013-01-01

    The culture concept has been severely criticized for its top-down nature in TESOL, leading arguably to its falling out of favor in the field. But what of the fact that people do "live culturally" (Ingold, 1994)? This article describes a case study of culture from the bottom up--culture as understood and enacted by its individual users.…

  10. Density- and trait-mediated top-down effects modify bottom-up control of a highly endemic tropical aquatic food web

    Science.gov (United States)

    C. M. Dalton; A. Mokiao-Lee; T. S. Sakihara; M. G. Weber; C. A. Roco; Z. Han; B. Dudley; R. A. MacKenzie; N. G. Hairston Jr.

    2013-01-01

    Benthic invertebrates mediate bottom-up and top-down influences in aquatic food webs, and changes in the abundance or traits of invertebrates can alter the strength of top-down effects. Studies assessing the role of invertebrate abundance and behavior as controls on food web structure are rare at the whole-ecosystem scale. Here we use a comparative approach to...

  11. Top-Down Beta Enhances Bottom-Up Gamma.

    Science.gov (United States)

    Richter, Craig G; Thompson, William H; Bosman, Conrado A; Fries, Pascal

    2017-07-12

    Several recent studies have demonstrated that the bottom-up signaling of a visual stimulus is subserved by interareal gamma-band synchronization, whereas top-down influences are mediated by alpha-beta band synchronization. These processes may implement top-down control of stimulus processing if top-down and bottom-up mediating rhythms are coupled via cross-frequency interaction. To test this possibility, we investigated Granger-causal influences among awake macaque primary visual area V1, higher visual area V4, and parietal control area 7a during attentional task performance. Top-down 7a-to-V1 beta-band influences enhanced visually driven V1-to-V4 gamma-band influences. This enhancement was spatially specific and largest when beta-band activity preceded gamma-band activity by ∼0.1 s, suggesting a causal effect of top-down processes on bottom-up processes. We propose that this cross-frequency interaction mechanistically subserves the attentional control of stimulus selection. SIGNIFICANCE STATEMENT Contemporary research indicates that the alpha-beta frequency band underlies top-down control, whereas the gamma band mediates bottom-up stimulus processing. This arrangement inspires an attractive hypothesis, which posits that top-down beta-band influences directly modulate bottom-up gamma-band influences via cross-frequency interaction. We evaluate this hypothesis, determining that beta-band top-down influences from parietal area 7a to visual area V1 are correlated with bottom-up gamma-band influences from V1 to area V4 in a spatially specific manner, and that this correlation is maximal when top-down activity precedes bottom-up activity. These results show that for top-down processes such as spatial attention, elevated top-down beta-band influences directly enhance feedforward stimulus-induced gamma-band processing, leading to enhancement of the selected stimulus. Copyright © 2017 Richter, Thompson et al.
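    The timing claim (beta leading gamma by ∼0.1 s) can be illustrated with a lagged-correlation toy on synthetic band envelopes. The study itself used Granger-causal influence estimates, which this sketch does not implement; the signals and the 0.1 s shift below are fabricated for illustration:

```python
import numpy as np

def peak_lag(beta_env, gamma_env, fs, max_lag_s=0.3):
    """Cross-correlate a top-down beta-band envelope with a bottom-up
    gamma-band envelope over a range of lags; return the lag (s) of peak
    correlation. Positive lag means beta leads gamma."""
    max_lag = int(max_lag_s * fs)
    lags = list(range(-max_lag, max_lag + 1))
    corrs = []
    for lag in lags:
        if lag >= 0:   # beta leads: compare beta[t] with gamma[t + lag]
            b, g = beta_env[:len(beta_env) - lag], gamma_env[lag:]
        else:          # gamma leads
            b, g = beta_env[-lag:], gamma_env[:len(gamma_env) + lag]
        corrs.append(np.corrcoef(b, g)[0, 1])
    return lags[int(np.argmax(corrs))] / fs

# synthetic data: the gamma envelope follows the beta envelope by 0.1 s
fs = 1000
t = np.arange(0, 5, 1 / fs)
rng = np.random.default_rng(1)
beta = np.abs(np.sin(2 * np.pi * 0.8 * t)) + 0.1 * rng.standard_normal(len(t))
gamma = np.roll(beta, int(0.1 * fs)) + 0.1 * rng.standard_normal(len(t))
lag = peak_lag(beta, gamma, fs)   # recovers a lag near +0.1 s
```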

  12. Where to start? Bottom-up attention improves working memory by determining encoding order.

    Science.gov (United States)

    Ravizza, Susan M; Uitvlugt, Mitchell G; Hazeltine, Eliot

    2016-12-01

    The present study aimed to characterize the mechanism by which working memory is enhanced for items that capture attention because of their novelty or saliency, that is, via bottom-up attention. The first experiment replicated previous research by corroborating that bottom-up attention directed to an item is sufficient for enhancing working memory and, moreover, generalized the effect to the domain of verbal working memory. The subsequent 3 experiments sought to determine how bottom-up attention affects working memory. We considered 2 hypotheses: (1) bottom-up attention enhances the encoded representation of the stimulus, similar to how voluntary attention functions, or (2) it affects the order of encoding by shifting priority onto the attended stimulus. By manipulating how stimuli were presented (simultaneous/sequential display) and whether the cue predicted the tested items, we found evidence that bottom-up attention improves working memory performance via the order of encoding hypothesis. This finding was observed across change detection and free recall paradigms. In contrast, voluntary attention improved working memory regardless of encoding order and showed greater effects on working memory. We conclude that when multiple information sources compete, bottom-up attention prioritizes the location at which encoding should begin. When encoding order is set, bottom-up attention has little or no benefit to working memory. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  13. Neoliberalism Viewed From the Bottom Up

    DEFF Research Database (Denmark)

    Danneris, Sophie

    2017-01-01

    Drawing on the assumption that it is pivotal to include a bottom up perspective to understand the way in which the welfare system functions, this chapter sets out to explore the lived experience of neoliberalism. The purpose is to gain insight into the consequences of neoliberalism from...... the viewpoint of the vulnerable benefit claimants who encounter it on a daily basis. The analysis is based on a qualitative longitudinal study conducted from 2013 to 2015, which shows how, in varying ways, clients routinely cope with being part of a neoliberal welfare state: by resignation, by taking action...

  14. Information theoretic preattentive saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2011-01-01

    Employing an information theoretic operational definition of bottom-up attention from the field of computational visual perception, a very general expression for saliency is provided. As opposed to many of the current approaches to determining a saliency map, there is no need for an explicit data...... of which features, image information is described. We illustrate our result by determining a few specific saliency maps based on particular choices of features. One of them makes the link with the mapping underlying well-known Harris interest points, which is a result recently obtained in isolation...
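    The general idea behind information-theoretic saliency, that a location is salient to the extent its local feature is rare, i.e. carries high self-information under the image's own feature distribution, can be sketched in a few lines. The intensity-histogram feature below is an illustrative choice, not Loog's specific operational definition:

```python
import numpy as np

def self_information_saliency(image, bins=16):
    """Saliency of a pixel as the self-information -log2 p(feature) of its
    intensity under the image's empirical histogram: rare features score high.
    Assumes intensities in [0, 1]."""
    hist, edges = np.histogram(image, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()                      # empirical feature probabilities
    idx = np.clip(np.digitize(image, edges[1:-1]), 0, bins - 1)
    with np.errstate(divide="ignore"):
        info = -np.log2(p)                     # inf only for unused bins
    return info[idx]

# a single bright pixel on a dark background is maximally salient:
# it occupies a histogram bin with probability 1/64, i.e. 6 bits
img = np.zeros((8, 8))
img[3, 3] = 0.9
smap = self_information_saliency(img)
```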

  15. Bottom-up guidance in visual search for conjunctions.

    Science.gov (United States)

    Proulx, Michael J

    2007-02-01

    Understanding the relative role of top-down and bottom-up guidance is crucial for models of visual search. Previous studies have addressed the role of top-down and bottom-up processes in search for a conjunction of features but with inconsistent results. Here, the author used an attentional capture method to address the role of top-down and bottom-up processes in conjunction search. The role of bottom-up processing was assayed by inclusion of an irrelevant-size singleton in a search for a conjunction of color and orientation. One object was uniquely larger on each trial, with chance probability of coinciding with the target; thus, the irrelevant feature of size was not predictive of the target's location. Participants searched more efficiently for the target when it was also the size singleton, and they searched less efficiently for the target when a nontarget was the size singleton. Although a conjunction target cannot be detected on the basis of bottom-up processing alone, participants used search strategies that relied significantly on bottom-up guidance in finding the target, resulting in interference from the irrelevant-size singleton.

  16. A plea for Global Health Action bottom-up

    Directory of Open Access Journals (Sweden)

    Ulrich Laaser

    2016-10-01

    This opinion piece focuses on global health action through hands-on, bottom-up practice. Initiation of an organizational framework and securing financial efficiency are, however, essential, both clearly a domain of well-trained public health professionals. Examples of action are cited in the four main areas of global threats: planetary climate change, global divides and inequity, global insecurity and violent conflicts, and global instability and financial crises. In conclusion, a stable health systems policy framework would greatly enhance success. However, such an organisational framework dries out if not linked to public debates channelling fresh thoughts and controversial proposals: the structural stabilisation is essential but has to serve, not dominate, bottom-up activities. In other words, a horizontal management is required, a balanced equilibrium between bottom-up initiative and top-down support. Last but not least, rewarding voluntary and charity work through public acknowledgement is essential.

  17. Hydrodynamic cavitation: a bottom-up approach to liquid aeration

    NARCIS (Netherlands)

    Raut, J.S.; Stoyanov, S.D.; Duggal, C.; Pelan, E.G.; Arnaudov, L.N.; Naik, V.M.

    2012-01-01

    We report the use of hydrodynamic cavitation as a novel, bottom-up method for the continuous creation of foams comprising air microbubbles in aqueous systems containing surface-active ingredients, such as proteins or particles. The hydrodynamic cavitation was created using a converging-diverging nozzle.

  18. Combining bottom-up and top-down

    International Nuclear Information System (INIS)

    Boehringer, Christoph; Rutherford, Thomas F.

    2008-01-01

    We motivate the formulation of market equilibrium as a mixed complementarity problem which explicitly represents weak inequalities and complementarity between decision variables and equilibrium conditions. The complementarity format permits an energy-economy model to combine technological detail of a bottom-up energy system with a second-best characterization of the over-all economy. Our primary objective is pedagogic. We first lay out the complementarity features of economic equilibrium and demonstrate how we can integrate bottom-up activity analysis into a top-down representation of the broader economy. We then provide a stylized numerical example of an integrated model - within both static and dynamic settings. Finally, we present illustrative applications to three themes figuring prominently on the energy policy agenda of many industrialized countries: nuclear phase-out, green quotas, and environmental tax reforms

  19. Combining bottom-up and top-down

    Energy Technology Data Exchange (ETDEWEB)

    Boehringer, Christoph [Department of Economics, University of Oldenburg, Oldenburg (Germany); Centre for European Economic Research (ZEW), Mannheim (Germany); Rutherford, Thomas F. [Ann Arbor, Michigan (United States)

    2008-03-15

    We motivate the formulation of market equilibrium as a mixed complementarity problem which explicitly represents weak inequalities and complementarity between decision variables and equilibrium conditions. The complementarity format permits an energy-economy model to combine technological detail of a bottom-up energy system with a second-best characterization of the over-all economy. Our primary objective is pedagogic. We first lay out the complementarity features of economic equilibrium and demonstrate how we can integrate bottom-up activity analysis into a top-down representation of the broader economy. We then provide a stylized numerical example of an integrated model - within both static and dynamic settings. Finally, we present illustrative applications to three themes figuring prominently on the energy policy agenda of many industrialized countries: nuclear phase-out, green quotas, and environmental tax reforms. (author)
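    The mixed complementarity format described in these two records can be made concrete with a toy problem: find x ≥ 0 with F(x) ≥ 0 and x·F(x) = 0, so that each variable is either at its bound or has its equilibrium condition binding. Production energy-economy models are solved with dedicated solvers such as PATH; the naive projected iteration below is only a sketch that handles easy, well-conditioned cases:

```python
import numpy as np

def solve_mcp(F, x0, step=0.1, tol=1e-10, max_iter=10_000):
    """Toy solver for the complementarity problem
        x >= 0,  F(x) >= 0,  x * F(x) = 0
    via the projected fixed point x <- max(0, x - step * F(x)).
    Illustrative only; not a substitute for a real MCP solver."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        x_new = np.maximum(0.0, x - step * F(x))
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

# hypothetical two-technology market: F(x) = A @ x - b stands in for
# marginal cost minus willingness to pay; technology 2 is unprofitable,
# so complementarity drives its activity level to zero
A = np.array([[2.0, 0.5],
              [0.5, 1.0]])
b = np.array([1.0, -0.5])
x = solve_mcp(lambda x: A @ x - b, x0=[1.0, 1.0])
# solution: x = [0.5, 0.0], with F(x) = [0.0, 0.75] (slack on variable 2)
```

    The weak inequalities and the "either the activity is zero or its condition holds with equality" logic are exactly what lets such models mix bottom-up technology detail with top-down equilibrium conditions in one system.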

  20. The Interplay of Top-Down and Bottom-Up

    DEFF Research Database (Denmark)

    Winkler, Till; Brown, Carol V.; Ozturk, Pinar

    2014-01-01

    The exchange of patient health information across different organizations involved in healthcare delivery has potential benefits for a wide range of stakeholders. However, many governments in Europe and in the U.S. have, despite both top-down and bottom-up initiatives, experienced major barriers...... in achieving sustainable models for implementing health information exchange (HIE) throughout their healthcare systems. In the case of the U.S., three years after the stimulus funding allocated as part of the 2009 HITECH Act, the extent to which government funding will be needed to sustain health information...... organizations (HIOs) that facilitate HIE across regional stakeholders remains an unanswered question. This research investigates the impacts of top-down and bottom-up initiatives on the evolutionary paths of HIOs in two contiguous states in the U.S. (New Jersey and New York) which had different starting...

  21. Bottom-up effects on attention capture and choice

    DEFF Research Database (Denmark)

    Peschel, Anne; Orquin, Jacob Lund; Mueller Loose, Simone

    Attention processes and decision making are accepted to be closely linked, because only information that is attended to can be incorporated in the decision process. Little is known, however, about the extent to which bottom-up processes of attention affect stimulus selection and therefore...... the information available to form a decision. Does changing one visual cue in the stimulus set affect attention towards this cue, and what does that mean for the choice outcome? To address this, we conducted a combined eye-tracking and choice experiment in a consumer choice setting with visual shelf simulations...... salient. The observed effect on attention also carries over into increased choice likelihood. From these results, we conclude that even small changes in the choice set capture attention based on bottom-up processes. Also for eye-tracking studies in other domains (e.g. search tasks), this means that stimulus...

  22. Age-related decline in bottom-up processing and selective attention in the very old.

    Science.gov (United States)

    Zhuravleva, Tatyana Y; Alperin, Brittany R; Haring, Anna E; Rentz, Dorene M; Holcomb, Philip J; Daffner, Kirk R

    2014-06-01

    Previous research demonstrating age-related deficits in selective attention has not included old-old adults, an increasingly important group to study. The current investigation compared event-related potentials in 15 young-old (65-79 years old) and 23 old-old (80-99 years old) subjects during a color-selective attention task. Subjects responded to target letters in a specified color (Attend) while ignoring letters in a different color (Ignore) under both low and high loads. There were no group differences in visual acuity, accuracy, reaction time, or latency of early event-related potential components. The old-old group showed a disruption in bottom-up processing, indexed by a substantially diminished posterior N1 (smaller amplitude). They also demonstrated markedly decreased modulation of bottom-up processing based on selected visual features, indexed by the posterior selection negativity (SN), with similar attenuation under both loads. In contrast, there were no group differences in frontally mediated attentional selection, measured by the anterior selection positivity (SP). There was a robust inverse relationship between the size of the SN and SP (the smaller the SN, the larger the SP), which may represent an anteriorly supported compensatory mechanism. In the absence of a decline in top-down modulation indexed by the SP, the diminished SN may reflect age-related degradation of early bottom-up visual processing in old-old adults.

  23. Bottom-up approach for carbon nanotube interconnects

    International Nuclear Information System (INIS)

    Li Jun; Ye Qi; Cassell, Alan; Ng, Hou Tee; Stevens, Ramsey; Han Jie; Meyyappan, M.

    2003-01-01

    We report a bottom-up approach to integrate multiwalled carbon nanotubes (MWNTs) into multilevel interconnects in silicon integrated-circuit manufacturing. MWNTs are grown vertically from patterned catalyst spots using plasma-enhanced chemical vapor deposition. We demonstrate the capability to grow aligned structures ranging from a single tube to forest-like arrays at desired locations. SiO2 is deposited to encapsulate each nanotube and the substrate, followed by a mechanical polishing process for planarization. MWNTs retain their integrity and demonstrate electrical properties consistent with their original structure.

  24. A Bottom-Up Approach to SUSY Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Horn, Claus; /SLAC

    2011-11-11

    This paper proposes a new way to do event generation and analysis in searches for new physics at the LHC. An abstract notation is used to describe the new particles on a level that better corresponds to the detector resolution of the LHC experiments. In this way the SUSY discovery space can be decomposed into a small number of eigenmodes, each with only a few parameters, which makes it possible to investigate the SUSY parameter space in a model-independent way. By focusing on the experimental observables for each process investigated, the Bottom-Up Approach allows one to systematically study the borders of the experimental efficiencies and thus to extend the sensitivity to new physics.

  25. ICT-ENABLED BOTTOM-UP ARCHITECTURAL DESIGN

    Directory of Open Access Journals (Sweden)

    Burak Pak

    2016-04-01

    This paper aims at discussing the potentials of bottom-up design practices in relation to the latest developments in Information and Communication Technologies (ICT) by making an in-depth review of inaugural cases. The first part of the study involves a literature study and the elaboration of basic strategies from the case study. The second part reframes the existing ICT tools and strategies and elaborates on their potentials to support the modes of participation performed in these cases. As a result, by distilling the created knowledge, the study reveals the potentials of novel modes of ICT-enabled design participation which exploit a set of collective action tools to support sustainable ways of self-organization and bottom-up design. The final part explains the relevance of these with solid examples and presents a hypothetical case for future implementation. The paper concludes with a brief reflection on the implications of the findings for the future of architectural design education.

  26. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming Xiu

    2015-04-01

    This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data were acquired during incidental learning of positive (‘happy’), neutral, and negative (‘angry’ or ‘fearful’) faces. Dynamic Causal Modeling (DCM) was applied to the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and areas related to successful memory formation (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feedforward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex, the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  27. Making the results of bottom-up energy savings comparable

    Directory of Open Access Journals (Sweden)

    Moser Simon

    2012-01-01

    The Energy Service Directive (ESD) has pushed forward the issue of energy savings calculations without clarifying the methodological basis. Savings achieved in the Member States are calculated with rather non-transparent and hardly comparable bottom-up (BU) methods. This paper develops the idea of parallel evaluation tracks, separating the Member States' issue of ESD verification from comparable savings calculations. Comparability is ensured by developing a standardised BU calculation kernel for different energy efficiency improvement (EEI) actions which simultaneously depicts the different calculation options in a structured way (e.g. baseline definition, system boundaries, double counting). Due to the heterogeneity of BU calculations, the approach requires a central database where Member States feed in input data on BU actions according to a predefined structure. The paper demonstrates the proposed approach, including a concrete example of application.

  28. Bottom up design of nanoparticles for anti-cancer diapeutics

    DEFF Research Database (Denmark)

    Needham, David; Arslanagic, Amina; Glud, Kasper

    2016-01-01

    ...the feasibility of an idea: could we design, make, develop, and test the concept for treating metastatic cancer by "Putting the Drug in the Cancer's Food"? "Limit size" is the size of the cancer's food, the common Low Density Lipoprotein (LDL), ~20 nm diameter. In this contribution to Pieter's LTAA we focus...... on the "bottom" (nucleation) and the "up" (growth) of "bottom-up design" as it applies to homogeneous nucleation of, especially, hydrophobic drugs and the 8 physico-chemical stages and associated parameters that determine the initial size, and any subsequent coarsening, of a nanoparticle suspension. We show that...... for EPR uptake and tumor detection. We show that, while free drug cannot be optimally administered in vivo, a nanoparticle formulation of orlistat could in principle represent a stable parenteral delivery system. The article ends with a brief discussion of what we see as the way forward in Individualized...

  29. On an elementary definition of visual saliency

    DEFF Research Database (Denmark)

    Loog, Marco

    2008-01-01

    Various approaches to computational modelling of bottom-up visual attention have been proposed in the past two decades. As part of this trend, researchers have studied ways to characterize the saliency map underlying many of these models. In more recent years, several definitions based on probabilistic and information or decision theoretic considerations have been proposed. These provide experimentally successful, appealing, low-level, operational, and elementary definitions of visual saliency (see e.g., Bruce, 2005 Neurocomputing 65 125-133). Here, I demonstrate that, in fact, all...

  10. A bottom-up approach to the strong CP problem

    Science.gov (United States)

    Diaz-Cruz, J. L.; Hollik, W. G.; Saldana-Salazar, U. J.

    2018-05-01

    The strong CP problem is one of many puzzles in the theoretical description of elementary particle physics that still lacks an explanation. While top-down solutions to that problem usually comprise new symmetries or fields or both, we want to present a rather bottom-up perspective. The main problem seems to be how to achieve small CP violation in the strong interactions despite the large CP violation in weak interactions. In this paper, we show that with minimal assumptions on the structure of mass (Yukawa) matrices, they do not contribute to the strong CP problem and thus we can provide a pathway to a solution of the strong CP problem within the structures of the Standard Model and no extension at the electroweak scale is needed. However, to address the flavor puzzle, models based on minimal SU(3) flavor groups leading to the proposed flavor matrices are favored. Though we refrain from an explicit UV completion of the Standard Model, we provide a simple requirement for such models not to show a strong CP problem by construction.

  11. Quantum simulation from the bottom up: the case of rebits

    Science.gov (United States)

    Enshan Koh, Dax; Yuezhen Niu, Murphy; Yoder, Theodore J.

    2018-05-01

Typically, quantum mechanics is thought of as a linear theory with unitary evolution governed by the Schrödinger equation. While this is technically true and useful for a physicist, with regards to computation it is an unfortunately narrow point of view. Just as a classical computer can simulate highly nonlinear functions of classical states, so too can the more general quantum computer simulate nonlinear evolutions of quantum states. We detail one particular simulation of nonlinearity on a quantum computer, showing how the entire class of R-unitary evolutions (on n qubits) can be simulated using a unitary, real-amplitude quantum computer (consisting of n + 1 qubits in total). These operators can be represented as the sum of a linear and antilinear operator, and add an intriguing new set of nonlinear quantum gates to the toolbox of the quantum algorithm designer. Furthermore, a subgroup of these nonlinear evolutions, called the R-Cliffords, can be efficiently classically simulated, by making use of the fact that Clifford operators can simulate non-Clifford (in fact, non-linear) operators. This perspective of using the physical operators that we have to simulate non-physical ones that we do not is what we call bottom-up simulation, and we give some examples of its broader implications.
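The core of the rebit idea can be illustrated in a few lines (a toy sketch of the standard real-amplitude encoding, not the authors' full gate-level construction): store the real and imaginary parts of an n-qubit state in the real amplitudes of n + 1 qubits, and the antilinear operation of complex conjugation becomes an ordinary linear unitary, a Z gate on the extra qubit.

```python
import numpy as np

def encode_rebit(psi):
    """Map (x_k + i*y_k)|k> to the real vector x_k|k>|0> + y_k|k>|1>
    (ancilla qubit last, so real/imaginary parts interleave)."""
    return np.column_stack([psi.real, psi.imag]).ravel()

def decode_rebit(phi):
    return phi[0::2] + 1j * phi[1::2]

n = 2
rng = np.random.default_rng(1)
psi = rng.normal(size=2**n) + 1j * rng.normal(size=2**n)
psi /= np.linalg.norm(psi)

# Complex conjugation is antilinear on the n qubits, but on the encoded
# (n+1)-qubit rebit state it is the linear, real unitary I (tensor) Z.
IZ = np.kron(np.eye(2**n), np.diag([1.0, -1.0]))
phi = IZ @ encode_rebit(psi)
assert np.allclose(decode_rebit(phi), psi.conj())
```

This is the "bottom-up" flavor the abstract describes: a physically available operator (a real unitary on one extra qubit) simulates a non-physical one (an antilinear map on the original register).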

  12. Bottom-Up Synthesis and Sensor Applications of Biomimetic Nanostructures

    Directory of Open Access Journals (Sweden)

    Li Wang

    2016-01-01

    Full Text Available The combination of nanotechnology, biology, and bioengineering greatly improved the developments of nanomaterials with unique functions and properties. Biomolecules as the nanoscale building blocks play very important roles for the final formation of functional nanostructures. Many kinds of novel nanostructures have been created by using the bioinspired self-assembly and subsequent binding with various nanoparticles. In this review, we summarized the studies on the fabrications and sensor applications of biomimetic nanostructures. The strategies for creating different bottom-up nanostructures by using biomolecules like DNA, protein, peptide, and virus, as well as microorganisms like bacteria and plant leaf are introduced. In addition, the potential applications of the synthesized biomimetic nanostructures for colorimetry, fluorescence, surface plasmon resonance, surface-enhanced Raman scattering, electrical resistance, electrochemistry, and quartz crystal microbalance sensors are presented. This review will promote the understanding of relationships between biomolecules/microorganisms and functional nanomaterials in one way, and in another way it will guide the design and synthesis of biomimetic nanomaterials with unique properties in the future.

  13. Thinking about the Weather: How Display Salience and Knowledge Affect Performance in a Graphic Inference Task

    Science.gov (United States)

    Hegarty, Mary; Canham, Matt S.; Fabrikant, Sara I.

    2010-01-01

    Three experiments examined how bottom-up and top-down processes interact when people view and make inferences from complex visual displays (weather maps). Bottom-up effects of display design were investigated by manipulating the relative visual salience of task-relevant and task-irrelevant information across different maps. Top-down effects of…

  14. Top-down and bottom-up aspects of active search in a real-world environment.

    Science.gov (United States)

    Foulsham, Tom; Chapman, Craig; Nasiopoulos, Eleni; Kingstone, Alan

    2014-03-01

Visual search has been studied intensively in the laboratory, but lab search often differs from search in the real world in many respects. Here, we used a mobile eye tracker to record the gaze of participants engaged in a realistic, active search task. Participants were asked to walk into a mailroom and locate a target mailbox among many similar mailboxes. This procedure allowed control of bottom-up cues (by making the target mailbox more salient; Experiment 1) and top-down instructions (by informing participants about the cue; Experiment 2). The bottom-up salience of the target had no effect on the overall time taken to search for the target, although the salient target was more likely to be fixated and found once it was within the central visual field. Top-down knowledge of target appearance had a larger effect, reducing the need for multiple head and body movements, and meaning that the target was fixated earlier and from further away. Although there remains much to be discovered in complex real-world search, this study demonstrates that principles from visual search in the laboratory influence gaze in natural behaviour, and provides a bridge between these laboratory studies and research examining vision in natural tasks.

  15. Combining Top-down and Bottom-up Accountability: Evidence from a Bribery Experiment.

    OpenAIRE

    Danila Serra

    2008-01-01

    Monitoring corruption typically relies on top-down interventions aimed at increasing the probability of external controls and the severity of punishment. An alternative approach to fighting corruption is to induce bottom-up pressure for reform. Recent studies have shown that both top-down and bottom-up mechanisms are rarely able to keep service providers accountable. This paper investigates the effectiveness of an accountability system that combines bottom-up monitoring and top-down auditing ...

  16. Saliency of color image derivatives: a comparison between computational models and human perception

    NARCIS (Netherlands)

    Vazquez, E.; Gevers, T.; Lucassen, M.; van de Weijer, J.; Baldrich, R.

    2010-01-01

    In this paper, computational methods are proposed to compute color edge saliency based on the information content of color edges. The computational methods are evaluated on bottom-up saliency in a psychophysical experiment, and on a more complex task of salient object detection in real-world images.

  17. Fabricating ordered functional nanostructures onto polycrystalline substrates from the bottom-up

    International Nuclear Information System (INIS)

    Torres, María; Pardo, Lorena; Ricote, Jesús; Fuentes-Cobas, Luís E.; Rodriguez, Brian J.; Calzada, M. Lourdes

    2012-01-01

Microemulsion-mediated synthesis has emerged as a powerful bottom-up procedure for the preparation of ferroelectric nanostructures onto substrates. However, periodical order has yet to be achieved onto polycrystalline Pt-coated Si substrates. Here, we report a new methodology that involves microemulsion-mediated synthesis and the controlled modification of the surface of the substrate by coating it with a template-layer of water-micelles. This layer modifies the surface tension of the substrate and yields a periodic arrangement of ferroelectric crystalline nanostructures. The size of the nanostructures is decreased to the sub-50 nm range and they show a hexagonal order up to the third neighbors, which corresponds to a density of 275 Gb in⁻². The structural analysis of the nanostructures by synchrotron X-ray diffraction confirms that the nanostructures have a PbTiO3 perovskite structure, with lattice parameters of a = b = 3.890(0) Å and c = 4.056(7) Å. Piezoresponse force microscopy confirmed the ferro-piezoelectric character of the nanostructures. This simple methodology is valid for the self-assembly of other functional oxides onto polycrystalline substrates, enabling their reliable integration into micro/nano devices.

  18. Mapping practices of project management – merging top-down and bottom-up perspectives

    DEFF Research Database (Denmark)

    Thuesen, Christian

    2015-01-01

    This paper presents a new methodology for studying different accounts of project management practices based on network mapping and analysis. Drawing upon network mapping and visualization as an analytical strategy top-down and bottom-up accounts of project management practice are analysed...... and compared. The analysis initially reveals a substantial difference between the top-down and bottom-up accounts of practice. Furthermore it identifies a soft side of project management that is central in the bottom-up account but absent from the top-down. Finally, the study shows that network mapping...

  19. Trophic cascades of bottom-up and top-down forcing on nutrients and plankton in the Kattegat, evaluated by modelling

    DEFF Research Database (Denmark)

    Petersen, Marcell Elo; Maar, Marie; Larsen, Janus

    2017-01-01

The aim of the study was to investigate the relative importance of bottom-up and top-down forcing on trophic cascades in the pelagic food-web and the implications for water quality indicators (summer phytoplankton biomass and winter nutrients) in relation to management. The 3D ecological model....... On an annual basis, the system was more bottom-up than top-down controlled. Microzooplankton was found to play an important role in the pelagic food web as a mediator of nutrient and energy fluxes. This study demonstrated that the best scenario for improved water quality was a combined reduction in nutrient...

  20. Saccade generation by the frontal eye fields in rhesus monkeys is separable from visual detection and bottom-up attention shift.

    Science.gov (United States)

    Lee, Kyoung-Min; Ahn, Kyung-Ha; Keller, Edward L

    2012-01-01

    The frontal eye fields (FEF), originally identified as an oculomotor cortex, have also been implicated in perceptual functions, such as constructing a visual saliency map and shifting visual attention. Further dissecting the area's role in the transformation from visual input to oculomotor command has been difficult because of spatial confounding between stimuli and responses and consequently between intermediate cognitive processes, such as attention shift and saccade preparation. Here we developed two tasks in which the visual stimulus and the saccade response were dissociated in space (the extended memory-guided saccade task), and bottom-up attention shift and saccade target selection were independent (the four-alternative delayed saccade task). Reversible inactivation of the FEF in rhesus monkeys disrupted, as expected, contralateral memory-guided saccades, but visual detection was demonstrated to be intact at the same field. Moreover, saccade behavior was impaired when a bottom-up shift of attention was not a prerequisite for saccade target selection, indicating that the inactivation effect was independent of the previously reported dysfunctions in bottom-up attention control. These findings underscore the motor aspect of the area's functions, especially in situations where saccades are generated by internal cognitive processes, including visual short-term memory and long-term associative memory.

  1. Saccade generation by the frontal eye fields in rhesus monkeys is separable from visual detection and bottom-up attention shift.

    Directory of Open Access Journals (Sweden)

    Kyoung-Min Lee

Full Text Available The frontal eye fields (FEF), originally identified as an oculomotor cortex, have also been implicated in perceptual functions, such as constructing a visual saliency map and shifting visual attention. Further dissecting the area's role in the transformation from visual input to oculomotor command has been difficult because of spatial confounding between stimuli and responses and consequently between intermediate cognitive processes, such as attention shift and saccade preparation. Here we developed two tasks in which the visual stimulus and the saccade response were dissociated in space (the extended memory-guided saccade task), and bottom-up attention shift and saccade target selection were independent (the four-alternative delayed saccade task). Reversible inactivation of the FEF in rhesus monkeys disrupted, as expected, contralateral memory-guided saccades, but visual detection was demonstrated to be intact at the same field. Moreover, saccade behavior was impaired when a bottom-up shift of attention was not a prerequisite for saccade target selection, indicating that the inactivation effect was independent of the previously reported dysfunctions in bottom-up attention control. These findings underscore the motor aspect of the area's functions, especially in situations where saccades are generated by internal cognitive processes, including visual short-term memory and long-term associative memory.

  2. A Bottom up Initiative: Meditation & Mindfulness 'Eastern' Practices in the "Western" Academia

    DEFF Research Database (Denmark)

    Singla, Rashmi

    a case of bottom up initiative, where the students themselves have demanded inclusion of non- conventional psychosocial interventions illustrated by meditation and mindfulness as Eastern psychological practices, thus filling the gap related to the existential, spiritual approaches. The western...

  3. The updated bottom up solution applied to atmospheric pressure photoionization and electrospray ionization mass spectrometry

    Science.gov (United States)

    The Updated Bottom Up Solution (UBUS) was recently applied to atmospheric pressure chemical ionization (APCI) mass spectrometry (MS) of triacylglycerols (TAGs). This report demonstrates that the UBUS applies equally well to atmospheric pressure photoionization (APPI) MS and to electrospray ionizatio...

  4. Affective salience can reverse the effects of stimulus-driven salience on eye movements in complex scenes

    Directory of Open Access Journals (Sweden)

Yaqing Niu

    2012-09-01

    Full Text Available In natural vision both stimulus features and cognitive/affective factors influence an observer's attention. However, the relationship between stimulus-driven (bottom-up and cognitive/affective (top-down factors remains controversial: Can affective salience counteract strong visual stimulus signals and shift attention allocation irrespective of bottom-up features? Is there any difference between negative and positive scenes in terms of their influence on attention deployment? Here we examined the impact of affective factors on eye movement behavior, to understand the competition between visual stimulus-driven salience and affective salience and how they affect gaze allocation in complex scene viewing. Building on our previous research, we compared predictions generated by a visual salience model with measures indexing participant-identified emotionally meaningful regions of each image. To examine how eye movement behaviour differs for negative, positive, and neutral scenes, we examined the influence of affective salience in capturing attention according to emotional valence. Taken together, our results show that affective salience can override stimulus-driven salience and overall emotional valence can determine attention allocation in complex scenes. These findings are consistent with the hypothesis that cognitive/affective factors play a dominant role in active gaze control.

  5. Bottom-up synthetic biology: modular design for making artificial platelets

    Science.gov (United States)

    Majumder, Sagardip; Liu, Allen P.

    2018-01-01

Engineering artificial cells to mimic one or multiple fundamental cell biological functions is an emerging area of synthetic biology. Reconstituting functional modules from biological components in vitro is a challenging yet important essence of bottom-up synthetic biology. Here we describe the concept of building artificial platelets using bottom-up synthetic biology and the four functional modules that together could enable such an ambitious effort.

  6. The Chicago Fire of 1871: A Bottom Up Approach to Disaster Relief

    OpenAIRE

    Skarbek, Emily C.

    2014-01-01

    Can bottom-up relief efforts lead to recovery after disasters? Conventional wisdom and contemporary public policy suggest that major crises require centralized authority to provide disaster relief goods. Using a novel set of comprehensive donation and expenditure data collected from archival records, this paper examines a bottom-up relief effort following one of the most devastating natural disasters of the nineteenth century: the Chicago Fire of 1871. Findings show that while there was no ce...

  7. Bottom-up vs. top-down effects on terrestrial insect herbivores: a meta-analysis.

    Science.gov (United States)

    Vidal, Mayra C; Murphy, Shannon M

    2018-01-01

    Primary consumers are under strong selection from resource ('bottom-up') and consumer ('top-down') controls, but the relative importance of these selective forces is unknown. We performed a meta-analysis to compare the strength of top-down and bottom-up forces on consumer fitness, considering multiple predictors that can modulate these effects: diet breadth, feeding guild, habitat/environment, type of bottom-up effects, type of top-down effects and how consumer fitness effects are measured. We focused our analyses on the most diverse group of primary consumers, herbivorous insects, and found that in general top-down forces were stronger than bottom-up forces. Notably, chewing, sucking and gall-making herbivores were more affected by top-down than bottom-up forces, top-down forces were stronger than bottom-up in both natural and controlled (cultivated) environments, and parasitoids and predators had equally strong top-down effects on insect herbivores. Future studies should broaden the scope of focal consumers, particularly in understudied terrestrial systems, guilds, taxonomic groups and top-down controls (e.g. pathogens), and test for more complex indirect community interactions. Our results demonstrate the surprising strength of forces exerted by natural enemies on herbivorous insects, and thus the necessity of using a tri-trophic approach when studying insect-plant interactions. © 2017 John Wiley & Sons Ltd/CNRS.

  8. New, national bottom-up estimate for tree-based biological ...

    Science.gov (United States)

    Nitrogen is a limiting nutrient in many ecosystems, but is also a chief pollutant from human activity. Quantifying human impacts on the nitrogen cycle and investigating natural ecosystem nitrogen cycling both require an understanding of the magnitude of nitrogen inputs from biological nitrogen fixation (BNF). A bottom-up approach to estimating BNF—scaling rates up from measurements to broader scales—is attractive because it is rooted in actual BNF measurements. However, bottom-up approaches have been hindered by scaling difficulties, and a recent top-down approach suggested that the previous bottom-up estimate was much too large. Here, we used a bottom-up approach for tree-based BNF, overcoming scaling difficulties with the systematic, immense (>70,000 N-fixing trees) Forest Inventory and Analysis (FIA) database. We employed two approaches to estimate species-specific BNF rates: published ecosystem-scale rates (kg N ha-1 yr-1) and published estimates of the percent of N derived from the atmosphere (%Ndfa) combined with FIA-derived growth rates. Species-specific rates can vary for a variety of reasons, so for each approach we examined how different assumptions influenced our results. Specifically, we allowed BNF rates to vary with stand age, N-fixer density, and canopy position (since N-fixation is known to require substantial light).Our estimates from this bottom-up technique are several orders of magnitude lower than previous estimates indicating

  9. Bottom-up modeling of oil production: A review of approaches

    International Nuclear Information System (INIS)

    Jakobsson, Kristofer; Söderbergh, Bengt; Snowden, Simon; Aleklett, Kjell

    2014-01-01

Bottom-up models of oil production are continuously being used to guide investments and policymaking. Compared to simpler top-down models, bottom-up models have a number of advantages due to their modularity, flexibility and concreteness. The purpose of this paper is to identify the crucial modeling challenges, compare the different ways in which nine existing models handle them, assess the appropriateness of these models, and point to possibilities of further development. The conclusions are that the high level of detail in bottom-up models is of questionable value for predictive accuracy, but of great value for identifying areas of uncertainty and new research questions. There is a potential for improved qualitative insights through systematic sensitivity analysis. This potential is at present largely unrealized. - Highlights: • Bottom-up models are influential in the study of the oil production supply chain. • Nine existing bottom-up models are reviewed. • The high level of detail is of questionable value for predictive accuracy. • There is a potential for more systematic sensitivity analysis
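The modularity that the review credits to bottom-up models, and the systematic sensitivity analysis it finds largely unrealized, can be sketched in miniature (every field, rate, and decline parameter below is invented for illustration):

```python
import numpy as np

def field_profile(start, peak, decline, years):
    """Hypothetical per-field module: plateau at `peak` for 3 years after
    start-up, then exponential decline at rate `decline`."""
    t = np.arange(years)
    return np.where(t < start, 0.0,
                    np.where(t < start + 3, peak,
                             peak * np.exp(-decline * (t - start - 3))))

def total_production(fields, years=30):
    """Bottom-up aggregation: regional output is just the sum of its modules."""
    return sum(field_profile(s, p, d, years) for s, p, d in fields)

fields = [(0, 1.0, 0.10), (5, 0.8, 0.15), (12, 0.5, 0.08)]  # (start, peak, decline)
base = total_production(fields)

# Systematic sensitivity analysis: perturb all decline rates by +/-20 %
fast = total_production([(s, p, 1.2 * d) for s, p, d in fields])
slow = total_production([(s, p, 0.8 * d) for s, p, d in fields])
assert np.all(fast <= base) and np.all(base <= slow)
```

The point of the sketch is structural: because the model is a sum of independent modules, any parameter can be perturbed and the effect on the aggregate traced directly, which is exactly the kind of sensitivity study the review recommends.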

  10. Increased performance in a bottom-up designed robot by experimentally guided redesign

    DEFF Research Database (Denmark)

    Larsen, Jørgen Christian

    2013-01-01

Purpose – Using a bottom-up, model-free approach when building robots is often seen as a less scientific way, compared to a top-down model-based approach, because the results are not easily generalizable to other systems. The authors, however, hypothesize that this problem may be addressed by using...... the bottom-up, model-free approach, the authors used the robotic construction kit, LocoKit. This construction kit allows researchers to construct legged robots, without having a mathematical model beforehand. The authors used no specific mathematical model to design the robot, but instead used intuition...... solid experimental methods. The purpose of this paper is to show how well-known experimental methods from bio-mechanics are used to measure and locate weaknesses in a bottom-up, model-free implementation of a quadruped walker and come up with a better solution. Design/methodology/approach – To study...

  11. When endogenous spatial attention improves conscious perception: effects of alerting and bottom-up activation.

    Science.gov (United States)

    Botta, Fabiano; Lupiáñez, Juan; Chica, Ana B

    2014-01-01

Recent studies have consistently demonstrated that conscious perception interacts with exogenous attentional orienting, but it can be dissociated from endogenous attentional orienting (Chica, Lasaponara, et al., 2011; Wyart & Tallon-Baudry, 2008). It has been hypothesized that enhanced conscious processing at exogenously attended locations results from a synergistic action of spatial orienting, bottom-up activation, and phasic alerting induced by the abrupt onset of the exogenous cue (Chica, Lasaponara, et al., 2011). Instead, as endogenous cues need more time to be interpreted, the phasic alerting they produce may have dissipated when the target appears. Furthermore, endogenous cues presumably elicit a weak bottom-up activation at the cued location. Consistent with these hypotheses, we observed that endogenous attention modulated conscious perception, but only when phasic alerting or bottom-up activation was increased. Results are discussed in the context of recent theoretical models of consciousness (Dehaene, Changeux, Naccache, Sackur, & Sergent, 2006). Copyright © 2013 Elsevier Inc. All rights reserved.

  12. Bottom-up learning of hierarchical models in a class of deterministic POMDP environments

    Directory of Open Access Journals (Sweden)

    Itoh Hideaki

    2015-09-01

Full Text Available The theory of partially observable Markov decision processes (POMDPs) is a useful tool for developing various intelligent agents, and learning hierarchical POMDP models is one of the key approaches for building such agents when the environments of the agents are unknown and large. To learn hierarchical models, bottom-up learning methods in which learning takes place in a layer-by-layer manner from the lowest to the highest layer are already extensively used in some research fields such as hidden Markov models and neural networks. However, little attention has been paid to bottom-up approaches for learning POMDP models. In this paper, we present a novel bottom-up learning algorithm for hierarchical POMDP models and prove that, by using this algorithm, a perfect model (i.e., a model that can perfectly predict future observations) can be learned at least in a class of deterministic POMDP environments.

  13. Hydrophobic Interaction Chromatography for Bottom-Up Proteomics Analysis of Single Proteins and Protein Complexes.

    Science.gov (United States)

    Rackiewicz, Michal; Große-Hovest, Ludger; Alpert, Andrew J; Zarei, Mostafa; Dengjel, Jörn

    2017-06-02

    Hydrophobic interaction chromatography (HIC) is a robust standard analytical method to purify proteins while preserving their biological activity. It is widely used to study post-translational modifications of proteins and drug-protein interactions. In the current manuscript we employed HIC to separate proteins, followed by bottom-up LC-MS/MS experiments. We used this approach to fractionate antibody species followed by comprehensive peptide mapping as well as to study protein complexes in human cells. HIC-reversed-phase chromatography (RPC)-mass spectrometry (MS) is a powerful alternative to fractionate proteins for bottom-up proteomics experiments making use of their distinct hydrophobic properties.

  14. Integrated Assessment of Energy Policies: A Decomposition of Top-Down and Bottom-Up

    Energy Technology Data Exchange (ETDEWEB)

    Boehringer, Christoph (Univ. of Oldenburg (Germany)); Rutherford, Thomas F. (ETH Zuerich (Switzerland))

    2008-01-15

The formulation of market equilibrium problems as mixed complementarity problems (MCP) permits integration of bottom-up programming models of the energy system into top-down general equilibrium models of the overall economy. Yet, in practice the MCP approach loses analytical tractability of income effects when the energy system includes upper and lower bounds on many decision variables. We therefore advocate the use of complementarity methods to solve only the top-down economic equilibrium model and employ quadratic programming to solve the underlying bottom-up energy supply model. A simple iterative procedure reconciles the equilibrium prices and quantities between both models.
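The advocated decomposition, complementarity methods for the top-down economy plus a programming model for the bottom-up energy system, reconciled by a simple iteration, can be caricatured as a one-dimensional fixed-point loop (toy linear demand and marginal-cost curves, nothing like the paper's actual models):

```python
# Toy decomposition in the spirit of the abstract: the "top-down" economy
# sets energy demand given a price; the "bottom-up" energy model returns the
# marginal-cost price for supplying that quantity; iterate to consistency.
# All coefficients are made up for illustration.

def demand(p):          # top-down: linear demand from the economy
    return 10.0 - 0.5 * p

def supply_price(q):    # bottom-up: marginal cost of supplying quantity q
    return 2.0 + 0.8 * q

p = 1.0
for _ in range(100):
    p_new = supply_price(demand(p))
    if abs(p_new - p) < 1e-10:
        break
    p = p_new

q = demand(p)
assert abs(p - supply_price(q)) < 1e-8  # prices and quantities reconciled
```

The loop converges here because the composed map is a contraction; in the real procedure each side is a full equilibrium or optimization solve, but the reconciliation logic is the same.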

  15. Bottom-up and top-down effects on plant communities

    DEFF Research Database (Denmark)

    Souza, Lara; Zelikova, Tamara Jane; Sanders, Nate

    2016-01-01

    -down) and soil nitrogen (bottom-up) were manipulated over six years in an existing old-field community. We tracked plant α and β diversity - within plot richness and among plot biodiversity- and aboveground net primary productivity (ANPP) over the course of the experiment. We found that bottom-up factors...... affected ANPP while top-down factors influenced plant community structure. Across years, while N reduction lowered ANPP by 10%, N reduction did not alter ANPP relative to control plots. Further, N reduction lowered ANPP by 20% relative to N addition plots. On the other hand, the reduction of insect...... community composition via shifts in plant dominance....

  16. Roll-to-roll UV imprint for bottom-up transistor fabrication

    NARCIS (Netherlands)

    Maury, P.; Turkenburg, D.H.; Stroeks, N.; Giesen, P.; Wijnen, M.; Tacken, R.; Meinders, E.R.; Werf, R. van der

    2011-01-01

We propose a design to fabricate transistors on flexible substrates in a bottom-up fashion using R2R UV-imprint lithography. The design consists of a template composed of multilevel as well as gray level features, the latter used to facilitate device interconnection. A hard mold is fabricated by LBR

  17. Coupling 2D Finite Element Models and Circuit Equations Using a Bottom-Up Methodology

    Science.gov (United States)

    2002-11-01

EQUATIONS USING A BOTTOM-UP METHODOLOGY. E. Gómez, J. Roger-Folch, A. Gabaldón and A. Molina. Dpto. de Ingeniería Eléctrica, Universidad Polit... de Ingeniería Eléctrica, ETSII, Universidad Politécnica de Valencia, PO Box 22012, 46071 Valencia, Spain. E-mail: jroger@die.upv.es. ABSTRACT: The

  18. The Girlfriends Project: Evaluating a Promising Community-Based Intervention from a Bottom-Up Perspective

    Science.gov (United States)

    Hawk, Mary

    2015-01-01

    Randomized controlled trials are the gold standard in research but may not fully explain or predict outcome variations in community-based interventions. Demonstrating efficacy of externally driven programs in well-controlled environments may not translate to community-based implementation where resources and priorities vary. A bottom-up evaluation…

An integrated top-down and bottom-up strategy for characterization of protein isoforms and modifications

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Si; Tolic, Nikola; Tian, Zhixin; Robinson, Errol W.; Pasa-Tolic, Ljiljana

    2011-04-15

Bottom-up and top-down strategies are two commonly used methods for mass spectrometry (MS) based protein identification; each method has its own advantages and disadvantages. In this chapter, we describe an integrated top-down and bottom-up approach facilitated by concurrent liquid chromatography-mass spectrometry (LC-MS) analysis and fraction collection for comprehensive high-throughput intact protein profiling. The approach employs a high resolution reversed phase (RP) LC separation coupled with LC eluent fraction collection and concurrent on-line MS with a high field (12 Tesla) Fourier-transform ion cyclotron resonance (FTICR) mass spectrometer. Protein elution profiles and tentative modified protein identifications are made using detected intact protein mass in conjunction with bottom-up protein identifications from the enzymatic digestion and analysis of corresponding LC fractions. Specific proteins of biological interest are incorporated into a target ion list for subsequent off-line gas-phase fragmentation that uses an aliquot of the original collected LC fraction, an aliquot of which was also used for bottom-up analysis.

  20. Bottom-up GGM algorithm for constructing multiple layered hierarchical gene regulatory networks

    Science.gov (United States)

    Multilayered hierarchical gene regulatory networks (ML-hGRNs) are very important for understanding genetics regulation of biological pathways. However, there are currently no computational algorithms available for directly building ML-hGRNs that regulate biological pathways. A bottom-up graphic Gaus...
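The record is truncated, but the Gaussian graphical model (GGM) machinery it builds on is standard: in a GGM, an edge between two genes corresponds to a non-zero partial correlation, which can be read off the inverse covariance (precision) matrix. A toy sketch with a simulated three-gene regulatory chain (our example, not the paper's algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated regulatory chain g1 -> g2 -> g3: g1 and g3 are correlated, but
# conditionally independent given g2, so the GGM should contain no g1-g3 edge.
n = 50000
g1 = rng.normal(size=n)
g2 = 0.8 * g1 + rng.normal(size=n)
g3 = 0.8 * g2 + rng.normal(size=n)
X = np.stack([g1, g2, g3], axis=1)

prec = np.linalg.inv(np.cov(X, rowvar=False))
d = np.sqrt(np.diag(prec))
pcor = -prec / np.outer(d, d)     # partial correlations; edges where |pcor| >> 0

assert abs(pcor[0, 2]) < 0.05                     # no direct g1-g3 edge
assert pcor[0, 1] > 0.3 and pcor[1, 2] > 0.3      # chain edges recovered
```

A multilayered hierarchical construction would presumably repeat this kind of conditional-dependence inference level by level over a pathway's genes; the sketch only shows the single-layer GGM step.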

  1. Bottom-up and top-down emotion generation: implications for emotion regulation

    Science.gov (United States)

    Misra, Supriya; Prasad, Aditya K.; Pereira, Sean C.; Gross, James J.

    2012-01-01

    Emotion regulation plays a crucial role in adaptive functioning and mounting evidence suggests that some emotion regulation strategies are often more effective than others. However, little attention has been paid to the different ways emotions can be generated: from the ‘bottom-up’ (in response to inherently emotional perceptual properties of the stimulus) or ‘top-down’ (in response to cognitive evaluations). Based on a process priming principle, we hypothesized that the mode of emotion generation would interact with subsequent emotion regulation. Specifically, we predicted that top-down emotions would be more successfully regulated by a top-down regulation strategy than bottom-up emotions. To test this hypothesis, we induced bottom-up and top-down emotions, and asked participants to decrease the negative impact of these emotions using cognitive reappraisal. We observed the predicted interaction between generation and regulation in two measures of emotional responding. As measured by self-reported affect, cognitive reappraisal was more successful on top-down generated emotions than bottom-up generated emotions. Neurally, reappraisal of bottom-up generated emotions resulted in a paradoxical increase of amygdala activity. This interaction between mode of emotion generation and subsequent regulation should be taken into account when comparing the efficacy of different types of emotion regulation, as well as when reappraisal is used to treat different types of clinical disorders. PMID:21296865

  2. Electrodeposition in capillaries: Bottom-up micro and nanopatterning of functional materials on conductive substrates

    NARCIS (Netherlands)

    George, A.; Maijenburg, A.W.; Maas, M.G.; Blank, David H.A.; ten Elshof, Johan E.

    2011-01-01

    A cost-effective and versatile methodology for bottom-up patterned growth of inorganic and metallic materials on the micro- and nanoscale is presented. Pulsed electrodeposition was employed to deposit arbitrary patterns of Ni, ZnO, and FeO(OH) of high quality, with lateral feature sizes down to

  3. Oriented bottom-up growth of armchair graphene nanoribbons on germanium

    Science.gov (United States)

    Arnold, Michael Scott; Jacobberger, Robert Michael

    2016-03-15

    Graphene nanoribbon arrays, methods of growing graphene nanoribbon arrays and electronic and photonic devices incorporating the graphene nanoribbon arrays are provided. The graphene nanoribbons in the arrays are formed using a scalable, bottom-up, chemical vapor deposition (CVD) technique in which the (001) facet of the germanium is used to orient the graphene nanoribbon crystals along the [110] directions of the germanium.

  4. Comparing Top-Down with Bottom-Up Approaches: Teaching Data Modeling

    Science.gov (United States)

    Kung, Hsiang-Jui; Kung, LeeAnn; Gardiner, Adrian

    2013-01-01

    Conceptual database design is a difficult task for novice database designers, such as students, and is also therefore particularly challenging for database educators to teach. In the teaching of database design, two general approaches are frequently emphasized: top-down and bottom-up. In this paper, we present an empirical comparison of students'…

  5. Bottom-up and Top-down: An alternate classification of LD authoring approaches

    NARCIS (Netherlands)

    Sodhi, Tim; Miao, Yongwu; Brouns, Francis; Koper, Rob

    2007-01-01

    Sodhi, T., Miao, Y., Brouns, F., & Koper, E. J. R. (2007). Bottom-up and Top-down: An alternate classification of LD authoring approaches. Paper presented at the TENCompetence Open Workshop on Current research on IMS Learning Design and Lifelong Competence Development Infrastructures. June, 21-22,

  6. Enhanced Photon Extraction from a Nanowire Quantum Dot Using a Bottom-Up Photonic Shell

    DEFF Research Database (Denmark)

    Jeannin, Mathieu; Cremel, Thibault; Häyrynen, Teppo

    2017-01-01

    Semiconductor nanowires offer the possibility to grow high-quality quantum-dot heterostructures, and, in particular, CdSe quantum dots inserted in ZnSe nanowires have demonstrated the ability to emit single photons up to room temperature. In this paper, we demonstrate a bottom-up approach...

  7. Bottom-up processes influence the demography and life-cycle phenology of Hawaiian bird communities

    Science.gov (United States)

    Jared D. Wolfe; C. John Ralph; Andrew Wiegardt

    2017-01-01

    Changes in climate can indirectly regulate populations at higher trophic levels by influencing the availability of food resources in the lower reaches of the food web. As such, species that rely on fruit and nectar food resources may be particularly sensitive to these bottom-up perturbations due to the strength of their trophic linkages with climatically-...

  8. Iodine versus Bromine Functionalization for Bottom-Up Graphene Nanoribbon Growth

    DEFF Research Database (Denmark)

    Bronner, Christopher; Marangoni, Tomas; Rizzo, Daniel J.

    2017-01-01

    Deterministic bottom-up approaches for synthesizing atomically well-defined graphene nanoribbons (GNRs) largely rely on the surface-catalyzed activation of selected labile bonds in a molecular precursor followed by step-growth polymerization and cyclodehydrogenation. While the majority of success...

  9. Modeling Bottom-Up Visual Attention Using Dihedral Group D4

    Directory of Open Access Journals (Sweden)

    Puneet Sharma

    2016-08-01

    In this paper, we first briefly describe the dihedral group D4 that serves as the basis for calculating saliency in our proposed model. Second, our saliency model makes two major changes to a recent state-of-the-art model known as group-based asymmetry. First, based on the properties of the dihedral group D4, we simplify the asymmetry calculations associated with the measurement of saliency. This results in an algorithm that reduces the number of calculations by at least half, making it the fastest among the six best algorithms used in this research article. Second, in order to maximize the information across different chromatic and multi-resolution features, the color image space is de-correlated. We evaluate our algorithm against 10 state-of-the-art saliency models. Our results show that by using optimal parameters for a given dataset, our proposed model can outperform the best saliency algorithm in the literature. However, as the differences among the few best saliency models are small, we suggest that our proposed model is among the best, and the fastest among the best. Finally, as future work, we suggest that our approach to saliency can be extended to three-dimensional image data.
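
The record above computes saliency from asymmetries under the dihedral group D4. The paper's actual algorithm is not reproduced here; as an illustrative sketch only (the function names and the asymmetry score are hypothetical), the eight D4 transforms of a square image patch can be enumerated like this:

```python
import numpy as np

def d4_orbit(patch):
    """The 8 transforms of a square patch under the dihedral group D4:
    four rotations and their mirror images."""
    rots = [np.rot90(patch, k) for k in range(4)]
    return rots + [np.fliplr(r) for r in rots]

def d4_asymmetry(patch):
    """Toy asymmetry score: mean absolute deviation of each D4 transform
    from the original patch (zero for a perfectly D4-symmetric patch)."""
    patch = np.asarray(patch, dtype=float)
    return float(np.mean([np.abs(patch - t).mean() for t in d4_orbit(patch)]))
```

A patch invariant under all eight transforms scores zero; in group-based-asymmetry models, larger deviations from symmetry are treated as more salient.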

  10. Comparison between bottom-up and top-down approaches in the estimation of measurement uncertainty.

    Science.gov (United States)

    Lee, Jun Hyung; Choi, Jee-Hye; Youn, Jae Saeng; Cha, Young Joo; Song, Woonheung; Park, Ae Ja

    2015-06-01

    Measurement uncertainty is a metrological concept used to quantify the variability of measurement results. There are two approaches to estimating measurement uncertainty. In this study, we sought to provide practical and detailed examples of the two approaches and to compare the bottom-up and top-down approaches to estimating measurement uncertainty. We estimated the measurement uncertainty of the concentration of glucose according to the CLSI EP29-A guideline. Two different approaches were used. First, we performed a bottom-up approach: we identified the sources of uncertainty, made an uncertainty budget, assessed the measurement functions, determined the uncertainties of each element, and combined them. Second, we performed a top-down approach using internal quality control (IQC) data for 6 months; we then estimated and corrected systematic bias using a certified reference material for glucose (NIST SRM 965b). The expanded uncertainties at the low glucose concentration (5.57 mmol/L) by the bottom-up and top-down approaches were ±0.18 mmol/L and ±0.17 mmol/L, respectively (all k=2). Those at the high glucose concentration (12.77 mmol/L) by the bottom-up and top-down approaches were ±0.34 mmol/L and ±0.36 mmol/L, respectively (all k=2). We presented practical and detailed examples of estimating measurement uncertainty by the two approaches. The uncertainties by the bottom-up approach were quite similar to those by the top-down approach. Thus, we demonstrated that the two approaches are approximately equivalent and interchangeable, and concluded that clinical laboratories can determine measurement uncertainty by the simpler top-down approach.
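
As a rough numerical illustration of the two approaches described above (not the study's actual uncertainty budget; all component values here are invented), the combination and expansion steps might be sketched as:

```python
import math

def bottom_up_uncertainty(components, k=2.0):
    """Bottom-up: combine independent standard-uncertainty components in
    quadrature, then expand with coverage factor k (k=2 ~ 95 % coverage)."""
    u_c = math.sqrt(sum(u ** 2 for u in components))
    return k * u_c

def top_down_uncertainty(iqc_sd, u_bias=0.0, k=2.0):
    """Top-down: use the long-term IQC imprecision (SD), optionally combined
    with the uncertainty of a bias correction from a certified reference
    material, and expand with coverage factor k."""
    return k * math.sqrt(iqc_sd ** 2 + u_bias ** 2)

# Invented components: calibrator, repeatability (same units as the result)
U_bottom_up = bottom_up_uncertainty([0.06, 0.06])
U_top_down = top_down_uncertainty(0.08, u_bias=0.03)
```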

  11. The interplay of bottom-up and top-down mechanisms in visual guidance during object naming.

    Science.gov (United States)

    Coco, Moreno I; Malcolm, George L; Keller, Frank

    2014-01-01

    An ongoing issue in visual cognition concerns the roles played by low- and high-level information in guiding visual attention, with current research remaining inconclusive about the interaction between the two. In this study, we bring fresh evidence into this long-standing debate by investigating visual saliency and contextual congruency during object naming (Experiment 1), a task in which visual processing interacts with language processing. We then compare the results of this experiment to data from a memorization task using the same stimuli (Experiment 2). In Experiment 1, we find that both saliency and congruency influence visual and naming responses and interact with linguistic factors. In particular, incongruent objects are fixated later and less often than congruent ones. However, saliency is a significant predictor of object naming, with salient objects being named earlier in a trial. Furthermore, the saliency and congruency of a named object interact with the lexical frequency of the associated word and mediate the time-course of fixations at naming. In Experiment 2, we find a similar overall pattern in the eye-movement responses, but only the congruency of the target is a significant predictor, with incongruent targets fixated less often than congruent targets. Crucially, this finding contrasts with claims in the literature that incongruent objects are more informative than congruent objects because they deviate from scene context and hence need longer processing. Overall, this study suggests that different sources of information are interactively used to guide visual attention on the targets to be named and raises new questions for existing theories of visual attention.

  12. Agricultural ammonia emissions in China: reconciling bottom-up and top-down estimates

    Directory of Open Access Journals (Sweden)

    L. Zhang

    2018-01-01

    Current estimates of agricultural ammonia (NH3) emissions in China differ by more than a factor of 2, hindering our understanding of their environmental consequences. Here we apply both bottom-up statistical and top-down inversion methods to quantify NH3 emissions from agriculture in China for the year 2008. We first assimilate satellite observations of NH3 column concentration from the Tropospheric Emission Spectrometer (TES) using the GEOS-Chem adjoint model to optimize Chinese anthropogenic NH3 emissions at the 1∕2° × 2∕3° horizontal resolution for March–October 2008. Optimized emissions show a strong summer peak, with emissions about 50 % higher in summer than in spring and fall, which is underestimated in current bottom-up NH3 emission estimates. To reconcile the latter with the top-down results, we revisit the processes of agricultural NH3 emissions and develop an improved bottom-up inventory of Chinese NH3 emissions from fertilizer application and livestock waste at the 1∕2° × 2∕3° resolution. Our bottom-up emission inventory includes more detailed information on crop-specific fertilizer application practices and better accounts for meteorological modulation of NH3 emission factors in China. We find that annual anthropogenic NH3 emissions are 11.7 Tg for 2008, with 5.05 Tg from fertilizer application and 5.31 Tg from livestock waste. The two sources together account for 88 % of total anthropogenic NH3 emissions in China. Our bottom-up emission estimates also show a distinct seasonality peaking in summer, consistent with top-down results from the satellite-based inversion. Further evaluations using surface network measurements show that the model driven by our bottom-up emissions reproduces the observed spatial and seasonal variations of NH3 gas concentrations and ammonium (NH4+) wet deposition fluxes over China well, providing additional credibility to the improvements we have made to our…
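
A bottom-up inventory of the kind described above is, at its core, activity data multiplied by emission factors, with the factors modulated by meteorology. The sketch below is purely illustrative; the function, factor values, and Q10-style temperature scaling are assumptions for demonstration, not the paper's parameterization:

```python
def nh3_emission(n_input_kg, base_ef, temp_c, ref_temp_c=15.0, q10=2.0):
    """Hypothetical bottom-up estimate for one source and one month:
    nitrogen input times an emission factor, where the factor is scaled
    with temperature (volatilization assumed to double per 10 C here)."""
    ef = base_ef * q10 ** ((temp_c - ref_temp_c) / 10.0)
    return n_input_kg * ef
```

Summing such terms over crops, livestock categories, grid cells, and months yields a gridded, seasonally resolved inventory like the one described in the abstract, with the summer peak arising from the temperature dependence of the emission factor.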

  13. A top-down bottom-up modeling approach to climate change policy analysis

    International Nuclear Information System (INIS)

    Tuladhar, Sugandha D.; Yuan, Mei; Bernstein, Paul; Montgomery, W. David; Smith, Anne

    2009-01-01

    This paper analyzes macroeconomic impacts of U.S. climate change policies for three different emissions pathways using a top-down bottom-up integrated model. The integrated model couples a technology-rich, bottom-up model of the U.S. electricity sector with a fully dynamic, forward-looking general equilibrium model of the U.S. economy. Our model provides a unique and consistent modeling framework for climate change analysis. Because of the model's detail and flexibility, we use it to examine additional scenarios to analyze many of the major uncertainties surrounding the implementation and impact of climate change policies - the role of command-and-control measures, loss in flexibility mechanisms such as banking, limits on low-emitting technology, and availability of offsets. The results consistently demonstrate that those policies that combine market-oriented abatement incentives with full flexibility are the most cost-effective. (author)

  14. The generation of myricetin-nicotinamide nanococrystals by top down and bottom up technologies.

    Science.gov (United States)

    Liu, Mingyu; Hong, Chao; Li, Guowen; Ma, Ping; Xie, Yan

    2016-09-30

    Myricetin-nicotinamide (MYR-NIC) nanococrystal preparation methods were developed and optimized using both top-down and bottom-up approaches. The grinding (top-down) method successfully achieved nanococrystals, but some micrometer-range particles and aggregation remained. The key consideration for the grinding technology was to control the milling time so as to balance particle size against size distribution. In contrast, a modified bottom-up approach based on a solution method in conjunction with sonochemistry resulted in uniform MYR-NIC nanococrystals, as confirmed by powder X-ray diffraction, scanning electron microscopy, dynamic light scattering, and differential scanning calorimetry, and the particle dissolution rate and amount were significantly greater than those of the MYR-NIC cocrystal. Notably, this was a simple method without the addition of any non-solvent. We anticipate our findings will provide some guidance for future nanococrystal preparation as well as its application in both the chemical and pharmaceutical areas.

  15. A Bottom-up Approach to Environmental Cost-Benefit Analysis

    DEFF Research Database (Denmark)

    Carolus, Johannes Friedrich; Hanley, Nick; Olsen, Søren Bøye

    Cost-Benefit Analysis is a method to assess the effects of policies and projects on social welfare. CBAs are usually applied in a top-down approach, in the sense that a decision-making body first decides on which policies or projects are to be considered, and then applies a set of uniform criteria… A bottom-up CBA instead starts with the underlying environmental problem, and then assesses costs and benefits of various strategies and solutions suggested by local and directly affected stakeholders. For empirical case studies concerning two river catchments in Sweden and Latvia, the bottom-up CBA approach utilises local knowledge, assesses… plans which are not only developed for local conditions but are also likely to be more acceptable to local society, and sheds additional light on possible distributional effects. By not only benefitting from, but also supporting, participative environmental planning, bottom-up CBA is in line…

  16. Top-down versus bottom-up processing of influence diagrams in probabilistic analysis

    International Nuclear Information System (INIS)

    Timmerman, R.D.; Burns, T.J.; Dodds, H.L. Jr.

    1986-01-01

    Recent work by Phillips and Selby has shown that influence diagram methodology can be a useful analytical tool in reactor safety studies. In some instances an influence diagram can be used as a graphical representation of probabilistic dependence within a system or event sequence. Under these circumstances, Bayesian statistics is employed to transform the relationships depicted in the influence diagram into the correct expression for a desired marginal probability (e.g. the top node). Top-down and bottom-up algorithms have emerged as the dominant methods for quantifying influence diagrams. The purpose of this paper is to demonstrate a potential error in employing the bottom-up algorithm when dealing with interdependencies
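
As background to the abstract above, obtaining a marginal probability from the dependencies encoded in an influence diagram can be illustrated on a minimal two-node diagram (the probabilities are invented for illustration; this is not the paper's example):

```python
# Minimal influence diagram A -> B with invented probabilities.
p_a = {True: 0.3, False: 0.7}                   # P(A)
p_b_given_a = {True: {True: 0.9, False: 0.1},   # P(B | A=true)
               False: {True: 0.2, False: 0.8}}  # P(B | A=false)

def marginal_b(b):
    """Marginal probability of the top node B, obtained by summing the
    joint P(A, B) = P(A) * P(B | A) over the states of A."""
    return sum(p_a[a] * p_b_given_a[a][b] for a in (True, False))
```

Top-down and bottom-up quantification algorithms are different orderings of this same summation over a larger diagram; as the abstract notes, interdependencies between nodes are where a naive bottom-up ordering can go wrong.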

  17. Mechanisms of knowledge flows in bottom-up and top-down cluster initiatives

    Directory of Open Access Journals (Sweden)

    Wojciech Dyba

    2016-01-01

    Knowledge flows are widely believed to be a phenomenon of clusters, and inducing them is one of the chief objectives in establishing and promoting cluster initiatives (CIs). However, few studies discuss how these flows and their effects may differ depending on the mode of CI creation and on the role of public authorities in this process. The main aim of this article is to compare mechanisms of knowledge flows in bottom-up and top-down cluster initiatives. The results of empirical research involving two case studies in western Poland, obtained through the use of Social Network Analysis (SNA), show that in bottom-up cluster initiatives, firms which were innovation leaders played a prime role in disseminating technological and business knowledge, while in top-down initiatives the most important actors were representatives of universities and research centres as well as formal coordinators of cooperation. Policy implications stemming from these results are identified.

  18. Top-down versus bottom-up processing of influence diagrams in probabilistic analysis

    International Nuclear Information System (INIS)

    Timmerman, R.D.; Burns, T.J.; Dodds, H.L. Jr.

    1984-01-01

    Recent work by Phillips et al. and Selby et al. has shown that influence diagram methodology can be a useful analytical tool in reactor safety studies. An influence diagram is a graphical representation of probabilistic dependence within a system or event sequence. Bayesian statistics are employed to transform the relationships depicted in the influence diagram into the correct expression for a desired marginal probability (e.g. the top event). As with fault trees, top-down and bottom-up algorithms have emerged as the dominant methods for quantifying influence diagrams. The purpose of this paper is to demonstrate a potential error in employing the bottom-up algorithm when dealing with interdependencies. In addition, the computing efficiency of both methods is discussed.

  19. Learning affects top down and bottom up modulation of eye movements in decision making

    DEFF Research Database (Denmark)

    Orquin, Jacob Lund; Bagger, Martin; Mueller Loose, Simone

    2013-01-01

    Repeated decision making is subject to changes over time such as decreases in decision time and information use and increases in decision accuracy. We show that a traditional strategy selection view of decision making cannot account for these temporal dynamics without relaxing main assumptions about what defines a decision strategy. As an alternative view we suggest that temporal dynamics in decision making are driven by attentional and perceptual processes and that this view has been expressed in the information reduction hypothesis. We test the information reduction hypothesis by integrating it in a broader framework of top down and bottom up processes and derive the predictions that repeated decisions increase top down control of attention capture, which in turn leads to a reduction in bottom up attention capture. To test our hypotheses we conducted a repeated discrete choice experiment with three…

  20. A balance of bottom-up and top-down in linking climate policies

    Science.gov (United States)

    Green, Jessica F.; Sterner, Thomas; Wagner, Gernot

    2014-12-01

    Top-down climate negotiations embodied by the Kyoto Protocol have all but stalled, chiefly because of disagreements over targets and objections to financial transfers. To avoid those problems, many have shifted their focus to linkage of bottom-up climate policies such as regional carbon markets. This approach is appealing, but we identify four obstacles to successful linkage: different levels of ambition; competing domestic policy objectives; objections to financial transfers; and the difficulty of close regulatory coordination. Even with a more decentralized approach, overcoming the 'global warming gridlock' of the intergovernmental negotiations will require close international coordination. We demonstrate how a balance of bottom-up and top-down elements can create a path toward an effective global climate architecture.

  1. A constraint-based bottom-up counterpart to definite clause grammars

    DEFF Research Database (Denmark)

    Christiansen, Henning

    2004-01-01

    A new grammar formalism, CHR Grammars (CHRG), is proposed that provides a constraint-solving approach to language analysis, built on top of the programming language of Constraint Handling Rules in the same way as Definite Clause Grammars (DCG) are built on Prolog. CHRG works bottom-up and adds the following…, integrity constraints, operators a la assumption grammars, and the ability to incorporate other constraint solvers. (iv) Context-sensitive rules that apply for disambiguation, coordination in natural language and tagger-like rules.

  2. Top down and bottom up selection drives variations in frequency and form of a visual signal

    OpenAIRE

    Yeh, Chien-Wei; Blamires, Sean J.; Liao, Chen-Pan; Tso, I.-Min

    2015-01-01

    The frequency and form of visual signals can be shaped by selection from predators, prey or both. When a signal simultaneously attracts predators and prey, selection may favour a strategy that minimizes risks while attracting prey. Accordingly, varying the frequency and form of the silken decorations added to their web may be a way that Argiope spiders minimize predation while attracting prey. Nonetheless, the role of extraneous factors renders the influences of top down and bottom up selecti...

  3. Dissociable effects of top-down and bottom-up attention during episodic encoding

    Science.gov (United States)

    Uncapher, Melina R.; Hutchinson, J. Benjamin; Wagner, Anthony D.

    2011-01-01

    It is well established that the formation of memories for life’s experiences—episodic memory—is influenced by how we attend to those experiences, yet the neural mechanisms by which attention shapes episodic encoding are still unclear. We investigated how top-down and bottom-up attention contribute to memory encoding of visual objects in humans by manipulating both types of attention during functional magnetic resonance imaging (fMRI) of episodic memory formation. We show that dorsal parietal cortex—specifically, intraparietal sulcus (IPS)—was engaged during top-down attention and was also recruited during the successful formation of episodic memories. By contrast, bottom-up attention engaged ventral parietal cortex—specifically, temporoparietal junction (TPJ)—and was also more active during encoding failure. Functional connectivity analyses revealed further dissociations in how top-down and bottom-up attention influenced encoding: while both IPS and TPJ influenced activity in perceptual cortices thought to represent the information being encoded (fusiform/lateral occipital cortex), they each exerted opposite effects on memory encoding. Specifically, during a preparatory period preceding stimulus presentation, a stronger drive from IPS was associated with a higher likelihood that the subsequently attended stimulus would be encoded. By contrast, during stimulus processing, stronger connectivity with TPJ was associated with a lower likelihood the stimulus would be successfully encoded. These findings suggest that during encoding of visual objects into episodic memory, top-down and bottom-up attention can have opposite influences on perceptual areas that subserve visual object representation, suggesting that one manner in which attention modulates memory is by altering the perceptual processing of to-be-encoded stimuli. PMID:21880922

  4. Bottom-Up Technologies for Reuse: Automated Extractive Adoption of Software Product Lines

    OpenAIRE

    Martinez, Jabier; Ziadi, Tewfik; Bissyandé, Tegawendé; Klein, Jacques; Le Traon, Yves

    2017-01-01

    Adopting Software Product Line (SPL) engineering principles demands a high up-front investment. Bottom-Up Technologies for Reuse (BUT4Reuse) is a generic and extensible tool that aims to leverage existing similar software products in order to help in extractive SPL adoption. The envisioned users are 1) SPL adopters and 2) integrators of techniques and algorithms to provide automation in SPL adoption activities. We present the methodology it implies for both types of users ...

  5. Quantifying the Flexibility of Residential Electricity Demand in 2050: a Bottom-Up Approach

    OpenAIRE

    van Stiphout, Arne; Engels, Jonas; Guldentops, Dries; Deconinck, Geert

    2015-01-01

    This work presents a new method to quantify the flexibility of automatic demand response applied to residential electricity demand using price elasticities. A stochastic bottom-up model of flexible electricity demand in 2050 is presented. Three types of flexible devices are implemented: electrical heating, electric vehicles and wet appliances. Each house schedules its flexible demand w.r.t. a varying price signal, in order to minimize electricity cost. Own- and cross-price elasticities are ob...
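
The price elasticities mentioned above relate relative changes in demand to relative changes in price. A minimal sketch of an arc (midpoint) elasticity calculation, with invented numbers; the paper's own estimation method is not reproduced here:

```python
def own_price_elasticity(q0, q1, p0, p1):
    """Arc (midpoint) own-price elasticity: percent change in quantity
    demanded per percent change in price, using midpoint bases so the
    result is the same in either direction of the change."""
    dq = (q1 - q0) / ((q0 + q1) / 2.0)
    dp = (p1 - p0) / ((p0 + p1) / 2.0)
    return dq / dp

# Demand falls from 100 to 90 units as price rises from 10 to 12:
e = own_price_elasticity(100, 90, 10, 12)
```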

  6. Selective spatial attention modulates bottom-up informational masking of speech

    OpenAIRE

    Carlile, Simon; Corkhill, Caitlin

    2015-01-01

    To hear out a conversation against other talkers, listeners overcome energetic and informational masking. Largely attributed to top-down processes, informational masking has also been demonstrated using unintelligible speech and amplitude-modulated maskers, suggesting bottom-up processes. We examined the role of speech-like amplitude modulations in informational masking using a spatial masking release paradigm. Separating a target talker from two masker talkers produced a 20 dB improvement in speec...

  7. Identifying prognostic features by bottom-up approach and correlating to drug repositioning.

    Directory of Open Access Journals (Sweden)

    Wei Li

    Traditionally, a top-down method has been used to identify prognostic features in cancer research. That is to say, genes differentially expressed in cancer versus normal tissue were identified and then tested for survival-prediction power. The problem is that prognostic features identified from one set of patient samples can rarely be transferred to other datasets. We apply a bottom-up approach in this study: survival-correlated or clinical-stage-correlated genes are selected first and additionally prioritized by their network topology, so that a small set of features can be used as a prognostic signature. Gene expression profiles of a cohort of 221 hepatocellular carcinoma (HCC) patients were used as a training set; the 'bottom-up' approach was applied to discover gene-expression signatures associated with survival in both tumor and adjacent non-tumor tissues, and compared with the 'top-down' approach. The results were validated in a second cohort of 82 patients, which was used as a testing set. Two sets of gene signatures, separately identified in tumor and adjacent non-tumor tissues by the bottom-up approach, were developed in the training cohort. These two signatures were associated with overall survival times of HCC patients; the robustness of each was validated in the testing set, and the predictive performance of each was better than that of previously reported gene-expression signatures. Moreover, genes in these two prognostic signatures give some indications for drug repositioning in HCC: some approved drugs targeting these markers have alternative indications for hepatocellular carcinoma. Using the bottom-up approach, we have developed two prognostic gene signatures with a limited number of genes that are associated with overall survival times of patients with HCC. Furthermore, prognostic markers in these two signatures have the potential to be therapeutic targets.
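
The 'bottom-up' selection described above starts from survival-correlated genes rather than from tumor-vs-normal differential expression. A toy sketch of that first step (the threshold and helper names are hypothetical, and the paper additionally prioritizes genes by network topology, which is omitted here):

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def bottom_up_features(expr, survival, threshold=0.5):
    """'Bottom-up' selection: keep genes whose expression correlates with
    survival time above a threshold, instead of starting from a list of
    differentially expressed genes."""
    return [gene for gene, values in expr.items()
            if abs(pearson(values, survival)) >= threshold]
```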

  8. Bottom-up laboratory testing of the DKIST Visible Broadband Imager (VBI)

    Science.gov (United States)

    Ferayorni, Andrew; Beard, Andrew; Cole, Wes; Gregory, Scott; Wöeger, Friedrich

    2016-08-01

    The Daniel K. Inouye Solar Telescope (DKIST) is a 4-meter solar observatory under construction at Haleakala, Hawaii [1]. The Visible Broadband Imager (VBI) is a first-light instrument that will record images at the highest possible spatial and temporal resolution of the DKIST at a number of scientifically important wavelengths [2]. The VBI is a pathfinder for DKIST instrumentation and a test bed for developing processes and procedures in the areas of unit, systems integration, and user acceptance testing. These test procedures have been developed and repeatedly executed during VBI construction in the lab as part of a "test early and test often" philosophy aimed at identifying and resolving issues early, thus saving cost during integration, test, and commissioning on the summit. The VBI team recently completed a bottom-up end-to-end system test of the instrument in the lab that allowed the instrument's functionality, performance, and usability to be validated against documented system requirements. The bottom-up testing approach includes four levels of testing, each introducing another layer in the control hierarchy that is tested before moving to the next level. First, the instrument mechanisms are tested for positioning accuracy and repeatability using a laboratory position-sensing detector (PSD). Second, the real-time motion controls are used to drive the mechanisms to verify that speed and timing synchronization requirements are being met. Next, the high-level software is introduced and the instrument is driven through a series of end-to-end tests that exercise the mechanisms, cameras, and simulated data processing. Finally, user acceptance testing is performed on operational and engineering use cases through the instrument engineering graphical user interface (GUI). In this paper we present the VBI bottom-up test plan, procedures, example test cases and tools used, as well as results from test execution in the laboratory. We will also discuss the benefits realized

  9. Bottom-up approach to sustainable urban development in Lebanon: The case of Zouk Mosbeh

    OpenAIRE

    El Asmar, Jean-Pierre; Ebohon, O. J.; Taki, A. H.

    2012-01-01

    In contrast with the “top-down” approach to development, the dominant methodology in Lebanon, I emphasize instead the “bottom-up” approach, where all stakeholders have equal opportunities to participate in policy formulation and implementation. The bottom-up or participatory approach to sustainable development has hardly been tested for urban development and management in Lebanon. This research concerns the sustainable rehabilitation of the built environment in the area of Zouk Mosbeh (ZM) in ...

  10. Protein chimerism: novel source of protein diversity in humans adds complexity to bottom-up proteomics.

    Science.gov (United States)

    Casado-Vela, Juan; Lacal, Juan Carlos; Elortza, Felix

    2013-01-01

Three main molecular mechanisms are considered to contribute to expanding the repertoire and diversity of proteins present in living organisms: first, at the DNA level (gene polymorphisms and single nucleotide polymorphisms); second, at the messenger RNA (pre-mRNA and mRNA) level, including alternative splicing (also termed differential splicing or cis-splicing); finally, at the protein level, mainly driven through PTMs and specific proteolytic cleavages. Chimeric mRNAs constitute an alternative source of protein diversity, which can be generated either by chromosomal translocations or by trans-splicing events. The occurrence of chimeric mRNAs and proteins is a frequent event in cells of the immune system and cancer cells, mainly as a consequence of gene rearrangements. Recent reports support that chimeric proteins may also be expressed at low levels under normal physiological circumstances, thus representing a novel source of protein diversity. Notably, recent publications demonstrate that chimeric protein products can be successfully identified through bottom-up proteomic analyses. Several questions remain unresolved, such as the physiological role and impact of such chimeric proteins or the potential occurrence of chimeric proteins in higher eukaryotic organisms other than humans. The occurrence of chimeric proteins certainly seems to be another unforeseen source of complexity for the proteome. It is a process to keep in mind not only when performing bottom-up proteomic analyses in cancer studies but also in general bottom-up proteomics experiments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Hierarchical time series bottom-up approach for forecast the export value in Central Java

    Science.gov (United States)

    Mahkya, D. A.; Ulama, B. S.; Suhartono

    2017-10-01

The purpose of this study is to obtain the best model for predicting the export value of Central Java using hierarchical time series. Export value is an injection variable in the economy of a country, meaning that if the export value of the country increases, the country's economy grows as well. Appropriate modeling is therefore needed to predict the export value, especially in Central Java. Export value in Central Java is grouped into 21 commodities, each with a different pattern. One time series approach that can handle such grouping is a hierarchical approach; here the bottom-up hierarchical time series method is used. The individual series at all levels are forecast using Autoregressive Integrated Moving Average (ARIMA), Radial Basis Function Neural Network (RBFNN), and hybrid ARIMA-RBFNN models. The best models are selected using the Symmetric Mean Absolute Percentage Error (sMAPE). The results of the analysis showed that, for the export value of Central Java, the bottom-up approach with hybrid ARIMA-RBFNN modeling can be used for long-term predictions, while for short- and medium-term predictions the bottom-up approach with RBFNN modeling can be used. Overall, the bottom-up approach with RBFNN modeling gives the best results.
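The bottom-up aggregation and the sMAPE criterion used above can be sketched as follows; the commodity series and the simple mean forecast (standing in for the ARIMA/RBFNN models) are hypothetical illustrations:

```python
# Bottom-up hierarchical forecasting sketch: forecast each bottom-level
# series, then sum the forecasts upward to obtain the total-level forecast.

def mean_forecast(series, horizon):
    """Naive stand-in model: forecast the historical mean for each step."""
    avg = sum(series) / len(series)
    return [avg] * horizon

def smape(actual, forecast):
    """Symmetric Mean Absolute Percentage Error, in percent."""
    terms = [
        2 * abs(f - a) / (abs(a) + abs(f))
        for a, f in zip(actual, forecast)
        if (abs(a) + abs(f)) > 0
    ]
    return 100 * sum(terms) / len(terms)

# Export values for three hypothetical commodity series.
commodities = {
    "textiles": [10, 12, 11, 13],
    "furniture": [5, 6, 7, 8],
    "machinery": [20, 19, 21, 22],
}
horizon = 2

# Bottom-up: model each bottom series, then aggregate to the top level.
bottom_forecasts = {k: mean_forecast(v, horizon) for k, v in commodities.items()}
total_forecast = [sum(f[h] for f in bottom_forecasts.values()) for h in range(horizon)]

actual_total = [37, 41]  # hypothetical held-out totals
print(total_forecast)
print(round(smape(actual_total, total_forecast), 2))
```

Model selection then amounts to repeating this for each candidate model class and keeping the one with the lowest sMAPE.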

  12. Daylighting performance evaluation of a bottom-up motorized roller shade

    Energy Technology Data Exchange (ETDEWEB)

    Kapsis, K.; Athienitis, A.K.; Zmeureanu, R.G. [Department of Building, Civil and Environmental Engineering, Concordia University, Montreal, QC (Canada); Tzempelikos, A. [School of Civil Engineering, Purdue University, West Lafayette, IN (United States)

    2010-12-15

    This paper presents an experimental and simulation study for quantifying the daylighting performance of bottom-up roller shades installed in office spaces. The bottom-up shade is a motorized roller shade that opens from top to bottom operating in the opposite direction of a conventional roller shade, so as to cover the bottom part of the window, while allowing daylight to enter from the top part of the window, reaching deeper into the room. A daylighting simulation model, validated with full-scale experiments, was developed in order to establish correlations between the shade position, outdoor illuminance and work plane illuminance for different outdoor conditions. Then, a shading control algorithm was developed for application in any location and orientation. The validated model was employed for a sensitivity analysis of the impact of shade optical properties and control on the potential energy savings due to the use of daylighting. The results showed that Daylight Autonomy for the bottom-up shade is 8-58% higher compared to a conventional roller shade, with a difference of 46% further away from the facade, where the use of electric lighting is needed most of the time. The potential reduction in energy consumption for lighting is 21-41%. (author)

  13. A Bottom-up Trend in Research of Management of Technology

    Directory of Open Access Journals (Sweden)

    Yoko Ishino

    2014-12-01

Management of Technology (MOT) is defined as an academic discipline of management that enables organizations to manage their technological fundamentals to create competitive advantage. MOT covers a wide range of contents including administrative strategy, R&D management, manufacturing management, technology transfer, production control, marketing, accounting, finance, business ethics, and others. For each topic, researchers have conducted MOT research at various levels. However, the practical and pragmatic side of MOT surely affects its research trends. Identifying changes in MOT research trends, or the chronological transitions of principal subjects, can help in understanding the key concepts of current MOT. This paper studied a bottom-up trend in MOT research fields by applying a text-mining method to the conference proceedings of IAMOT (International Association for Management of Technology). First, focusing only on nouns, several keywords were found that emerge more frequently over time in the IAMOT proceedings. Then, expanding the scope to other parts of speech placed the keywords in a natural context. Finally, it was found that the use of an important keyword has extended qualitatively and quantitatively over time. In conclusion, a bottom-up trend in MOT research was detected and the effects of the social situation on the trend were discussed. Keywords: Management of Technology; Text Mining; Research Trend; Bottom-up Trend; Patent
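The keyword-trend idea can be sketched as follows; the toy corpus and the rising-frequency test below are illustrative assumptions, not the IAMOT data or the paper's actual text-mining method:

```python
from collections import Counter

# Count keyword frequencies per proceedings year and flag terms whose
# frequency grows over time (a crude stand-in for trend detection).
proceedings = {
    2008: "technology transfer strategy strategy management",
    2011: "patent strategy management innovation patent",
    2014: "patent patent innovation innovation management patent",
}

counts = {year: Counter(text.split()) for year, text in proceedings.items()}
years = sorted(counts)

def is_rising(term):
    """True if the term's yearly count never decreases and grows overall."""
    series = [counts[y][term] for y in years]
    return all(a <= b for a, b in zip(series, series[1:])) and series[-1] > series[0]

vocab = sorted({t for c in counts.values() for t in c})
rising = [t for t in vocab if is_rising(t)]
print(rising)
```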

  14. Top-down or bottom-up modelling. An application to CO2 abatement

    International Nuclear Information System (INIS)

    Laroui, F.; Van Leeuwen, M.J.

    1995-06-01

In four articles a comparison is made of bottom-up, or engineers', models and top-down models, which comprise macro-econometric models, computable general equilibrium models, and models in the system dynamics tradition. In the first article the history of economic modelling is outlined. In the second article the multi-sector macro-economic computable general equilibrium model for the Netherlands is described. It can be used to study the long-term effects of fiscal policy measures on economic and environmental indicators, in particular the effects on the level of CO2 emissions. The aim of article 3 is to describe the structure of the electricity supply industry in the UK and how it can be represented in a bottom-up sub-model within a more general E3 sectoral model of the UK economy. The objective of the last paper (4) is mainly a methodological discussion of integrating top-down and bottom-up models that can be used to assess the impacts of CO2 abatement policies on economic activity.

  15. Nanomaterial processing using self-assembly-bottom-up chemical and biological approaches

    International Nuclear Information System (INIS)

    Thiruvengadathan, Rajagopalan; Gangopadhyay, Keshab; Gangopadhyay, Shubhra; Korampally, Venumadhav; Ghosh, Arkasubhra; Chanda, Nripen

    2013-01-01

Nanotechnology is touted as the next logical step in technological evolution. This has led to a substantial surge in research activities pertaining to the development and fundamental understanding of processes and assembly at the nanoscale. Both top-down and bottom-up fabrication approaches may be used to realize a range of well-defined nanostructured materials with desirable physical and chemical attributes. Among these, the bottom-up self-assembly process offers the most realistic solution toward the fabrication of next-generation functional materials and devices. Here, we present a comprehensive review on the physical basis behind self-assembly and the processes reported in recent years to direct the assembly of nanoscale functional blocks into hierarchically ordered structures. This paper emphasizes assembly in the synthetic domain as well as in the biological domain, underscoring the importance of biomimetic approaches toward novel materials. In particular, two important classes of directed self-assembly, namely, (i) self-assembly among nanoparticle–polymer systems and (ii) external field-guided assembly, are highlighted. The spontaneous self-assembling behavior observed in nature that leads to complex, multifunctional, hierarchical structures within biological systems is also discussed in this review. Recent research undertaken to synthesize hierarchically assembled functional materials has underscored the need for, as well as the benefits harvested in, synergistically combining top-down fabrication methods with bottom-up self-assembly. (review article)

  16. Bottom-Up Tri-gate Transistors and Submicrosecond Photodetectors from Guided CdS Nanowalls.

    Science.gov (United States)

    Xu, Jinyou; Oksenberg, Eitan; Popovitz-Biro, Ronit; Rechav, Katya; Joselevich, Ernesto

    2017-11-08

Tri-gate transistors offer better performance than planar transistors by exerting additional gate control over a channel from two lateral sides of semiconductor nanowalls (or "fins"). Here we report the bottom-up assembly of aligned CdS nanowalls by a simultaneous combination of horizontal catalytic vapor-liquid-solid growth and vertical facet-selective noncatalytic vapor-solid growth and their parallel integration into tri-gate transistors and photodetectors at wafer scale (cm^2) without postgrowth transfer or alignment steps. These tri-gate transistors act as enhancement-mode transistors with an on/off current ratio on the order of 10^8, 4 orders of magnitude higher than the best results ever reported for planar enhancement-mode CdS transistors. The response time of the photodetector is reduced to the submicrosecond level, 1 order of magnitude shorter than the best results ever reported for photodetectors made of bottom-up semiconductor nanostructures. Guided semiconductor nanowalls open new opportunities for high-performance 3D nanodevices assembled from the bottom up.

  17. Top-down and bottom-up definitions of human failure events in human reliability analysis

    International Nuclear Information System (INIS)

    Boring, Ronald Laurids

    2014-01-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditionally human factors driven approaches would tend to look at opportunities for human errors first in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down - defined as a subset of the PRA - whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up - derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  18. Uncertainty quantification for radiation measurements: Bottom-up error variance estimation using calibration information

    International Nuclear Information System (INIS)

    Burr, T.; Croft, S.; Krieger, T.; Martin, K.; Norman, C.; Walsh, S.

    2016-01-01

    One example of top-down uncertainty quantification (UQ) involves comparing two or more measurements on each of multiple items. One example of bottom-up UQ expresses a measurement result as a function of one or more input variables that have associated errors, such as a measured count rate, which individually (or collectively) can be evaluated for impact on the uncertainty in the resulting measured value. In practice, it is often found that top-down UQ exhibits larger error variances than bottom-up UQ, because some error sources are present in the fielded assay methods used in top-down UQ that are not present (or not recognized) in the assay studies used in bottom-up UQ. One would like better consistency between the two approaches in order to claim understanding of the measurement process. The purpose of this paper is to refine bottom-up uncertainty estimation by using calibration information so that if there are no unknown error sources, the refined bottom-up uncertainty estimate will agree with the top-down uncertainty estimate to within a specified tolerance. Then, in practice, if the top-down uncertainty estimate is larger than the refined bottom-up uncertainty estimate by more than the specified tolerance, there must be omitted sources of error beyond those predicted from calibration uncertainty. The paper develops a refined bottom-up uncertainty approach for four cases of simple linear calibration: (1) inverse regression with negligible error in predictors, (2) inverse regression with non-negligible error in predictors, (3) classical regression followed by inversion with negligible error in predictors, and (4) classical regression followed by inversion with non-negligible errors in predictors. Our illustrations are of general interest, but are drawn from our experience with nuclear material assay by non-destructive assay. The main example we use is gamma spectroscopy that applies the enrichment meter principle. 
Previous papers that ignore error in predictors
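As a rough illustration of bottom-up uncertainty estimation from calibration information, the sketch below propagates calibration uncertainty into an inverse-prediction estimate for the simplest of the four cases (classical regression followed by inversion, negligible error in predictors). All numbers are hypothetical, and the first-order variance expression is the standard delta-method result for inverse prediction, not the paper's refined estimator:

```python
import math

# Calibration data: known standards x and measured responses y (hypothetical).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.1, 9.9]

# Classical least-squares fit of y = a + b * x.
n = len(xs)
xbar = sum(xs) / n
ybar = sum(ys) / n
sxx = sum((x - xbar) ** 2 for x in xs)
b = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / sxx
a = ybar - b * xbar

# Residual variance of the calibration fit (n - 2 degrees of freedom).
s2 = sum((y - (a + b * x)) ** 2 for x, y in zip(xs, ys)) / (n - 2)

# New measurement: invert the fitted line to estimate x0 from y0.
y0 = 7.0
x0 = (y0 - a) / b

# First-order (delta-method) variance of x0, combining the new-measurement
# error with the calibration (slope/intercept) uncertainty.
var_x0 = (s2 / b**2) * (1 + 1 / n + (x0 - xbar) ** 2 / sxx)
print(round(x0, 3), round(math.sqrt(var_x0), 3))
```

Comparing such a bottom-up standard deviation with a top-down estimate from repeated fielded measurements is then a consistency check: a top-down value exceeding it by more than the tolerance signals omitted error sources.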

  19. Stress testing hydrologic models using bottom-up climate change assessment

    Science.gov (United States)

    Stephens, C.; Johnson, F.; Marshall, L. A.

    2017-12-01

    Bottom-up climate change assessment is a promising approach for understanding the vulnerability of a system to potential future changes. The technique has been utilised successfully in risk-based assessments of future flood severity and infrastructure vulnerability. We find that it is also an ideal tool for assessing hydrologic model performance in a changing climate. In this study, we applied bottom-up climate change to compare the performance of two different hydrologic models (an event-based and a continuous model) under increasingly severe climate change scenarios. This allowed us to diagnose likely sources of future prediction error in the two models. The climate change scenarios were based on projections for southern Australia, which indicate drier average conditions with increased extreme rainfall intensities. We found that the key weakness in using the event-based model to simulate drier future scenarios was the model's inability to dynamically account for changing antecedent conditions. This led to increased variability in model performance relative to the continuous model, which automatically accounts for the wetness of a catchment through dynamic simulation of water storages. When considering more intense future rainfall events, representation of antecedent conditions became less important than assumptions around (non)linearity in catchment response. The linear continuous model we applied may underestimate flood risk in a future climate with greater extreme rainfall intensity. In contrast with the recommendations of previous studies, this indicates that continuous simulation is not necessarily the key to robust flood modelling under climate change. By applying bottom-up climate change assessment, we were able to understand systematic changes in relative model performance under changing conditions and deduce likely sources of prediction error in the two models.

  20. Visual anticipation biases conscious perception but not bottom-up visual processing

    Directory of Open Access Journals (Sweden)

    Paul F.M.J. Verschure

    2015-01-01

Theories of consciousness can be grouped with respect to their stance on embodiment, sensori-motor contingencies, prediction, and integration. In this list prediction plays a key role, and it is not clear which aspects of prediction are most prominent in the conscious scene. An evolving view of the brain is that it can be seen as a prediction machine that optimizes its ability to predict states of the world and the self through the top-down propagation of predictions and the bottom-up presentation of prediction errors. There are competing views, though, on whether prediction or prediction errors dominate the conscious scene. Yet, due to the lack of efficient indirect measures, the dynamic effects of prediction on perception, decision making, and consciousness have been difficult to assess and to model. We propose a novel mathematical framework and psychophysical paradigm that allow us to assess the hierarchical structuring of perceptual consciousness, its content, and the impact of predictions and/or errors on the conscious scene. Using a displacement detection task combined with reverse correlation, we reveal signatures of the use of prediction at three different levels of perception: bottom-up early saccades, top-down-driven late saccades, and conscious decisions. Our results suggest that the brain employs multiple parallel mechanisms at different levels of information processing to restrict the sensory field using predictions. We observe that cognitive load has a quantifiable effect on this dissociation of the bottom-up sensory and top-down predictive processes. We propose a probabilistic data association model from dynamical systems theory to model this predictive bias at different information-processing levels.

  1. Visual anticipation biases conscious decision making but not bottom-up visual processing.

    Science.gov (United States)

    Mathews, Zenon; Cetnarski, Ryszard; Verschure, Paul F M J

    2014-01-01

Prediction plays a key role in the control of attention, but it is not clear which aspects of prediction are most prominent in conscious experience. An evolving view of the brain is that it can be seen as a prediction machine that optimizes its ability to predict states of the world and the self through the top-down propagation of predictions and the bottom-up presentation of prediction errors. There are competing views, though, on whether prediction or prediction errors dominate the formation of conscious experience. Yet, the dynamic effects of prediction on perception, decision making, and consciousness have been difficult to assess and to model. We propose a novel mathematical framework and a psychophysical paradigm that allow us to assess the hierarchical structuring of perceptual consciousness, its content, and the impact of predictions and/or errors on conscious experience, attention, and decision-making. Using a displacement detection task combined with reverse correlation, we reveal signatures of the use of prediction at three different levels of perceptual processing: bottom-up fast saccades, top-down-driven slow saccades, and conscious decisions. Our results suggest that the brain employs multiple parallel mechanisms at different levels of perceptual processing in order to shape effective sensory consciousness within a predicted perceptual scene. We further observe that bottom-up sensory and top-down predictive processes can be dissociated through cognitive load. We propose a probabilistic data association model from dynamical systems theory to model the predictive multi-scale bias in perceptual processing that we observe and its role in the formation of conscious experience. We propose that these results support the hypothesis that consciousness provides a time-delayed description of a task that is used to prospectively optimize real-time control structures, rather than being engaged in the real-time control of behavior itself.

  2. The potential of LCM to mainstream bottom-up eco-innovation and alternative thinking

    DEFF Research Database (Denmark)

    De Rosa, Michele; Ghose, Agneta

    2015-01-01

. For this reason, under the LCM framework, a number of bottom-up eco-innovations and non-traditional approaches can be categorized, arising often in difficult economic contexts. However, it is not because of LCM that alternative solutions were found in these cases, but due to necessity. The potential of LCM and its...... level. The first example is Can Decreix in Cerbere, a social experiment that intends to demonstrate how the entire society can be managed in an alternative way, cleaner and more equitable, consuming less and sharing more. Activities involve frugal technologies, agroecology, educational workshops...

  3. Integrating Top-down and Bottom-up Cybersecurity Guidance using XML.

    Science.gov (United States)

    Lubell, Joshua

    2016-08-01

    This paper describes a markup-based approach for synthesizing disparate information sources and discusses a software implementation of the approach. The implementation makes it easier for people to use two complementary, but differently structured, guidance specifications together: the (top-down) Cybersecurity Framework and the (bottom-up) National Institute of Standards and Technology Special Publication 800-53 security control catalog. An example scenario demonstrates how the software implementation can help a security professional select the appropriate safeguards for restricting unauthorized access to an Industrial Control System. The implementation and example show the benefits of this approach and suggest its potential application to disciplines other than cybersecurity.
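A minimal sketch of the markup idea, assuming a simplified schema that is not the paper's actual one: a Cybersecurity Framework subcategory element cross-references an SP 800-53 control, and the combined document can then be queried programmatically:

```python
import xml.etree.ElementTree as ET

# Hypothetical XML cross-reference between a top-down Framework
# subcategory and a bottom-up SP 800-53 control.
xml_doc = """
<mapping>
  <subcategory id="PR.AC-4">
    <description>Access permissions are managed, incorporating least privilege</description>
    <control catalog="SP800-53" id="AC-6"/>
  </subcategory>
</mapping>
"""

root = ET.fromstring(xml_doc)
controls = [c.get("id") for c in root.iter("control")]
print(controls)  # control identifiers referenced by the subcategory
```

A security professional could extend such a query to pull, for each applicable subcategory, the full set of candidate safeguards from the control catalog.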

  4. Integrating Top-down and Bottom-up Cybersecurity Guidance using XML

    Science.gov (United States)

    Lubell, Joshua

    2016-01-01

    This paper describes a markup-based approach for synthesizing disparate information sources and discusses a software implementation of the approach. The implementation makes it easier for people to use two complementary, but differently structured, guidance specifications together: the (top-down) Cybersecurity Framework and the (bottom-up) National Institute of Standards and Technology Special Publication 800-53 security control catalog. An example scenario demonstrates how the software implementation can help a security professional select the appropriate safeguards for restricting unauthorized access to an Industrial Control System. The implementation and example show the benefits of this approach and suggest its potential application to disciplines other than cybersecurity. PMID:27795810

  5. Unsupervised tattoo segmentation combining bottom-up and top-down cues

    Science.gov (United States)

    Allen, Josef D.; Zhao, Nan; Yuan, Jiangbo; Liu, Xiuwen

    2011-06-01

Tattoo segmentation is challenging due to the complexity and large variance of tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin, and then distinguish tattoo from the other skin via a top-down prior in the image itself. Tattoo segmentation with an unknown number of clusters is thereby transformed into a figure-ground segmentation. We have applied our segmentation algorithm to a tattoo dataset, and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purposes.
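The split-merge idea can be sketched on a toy 1-D "image"; the 2-means split, the skin-tone prior, and all pixel values below are illustrative assumptions, not the paper's algorithm:

```python
# Bottom-up split (simple 2-means clustering of intensities), then a
# top-down skin prior selects the skin cluster; the rest is labeled tattoo.

def two_means(values, iters=20):
    """Cluster scalar values into two groups; return (centers, labels)."""
    c = [min(values), max(values)]
    labels = [0] * len(values)
    for _ in range(iters):
        labels = [0 if abs(v - c[0]) <= abs(v - c[1]) else 1 for v in values]
        for k in (0, 1):
            members = [v for v, l in zip(values, labels) if l == k]
            if members:
                c[k] = sum(members) / len(members)
    return c, labels

# Hypothetical intensities: dark tattoo ink (~0.1) on lighter skin (~0.8).
pixels = [0.82, 0.80, 0.12, 0.10, 0.79, 0.15, 0.81, 0.78]
centers, labels = two_means(pixels)

SKIN_PRIOR = 0.75  # top-down cue: expected skin brightness
skin_cluster = min((0, 1), key=lambda k: abs(centers[k] - SKIN_PRIOR))
tattoo_mask = [int(l != skin_cluster) for l in labels]
print(tattoo_mask)  # 1 marks tattoo pixels (figure), 0 marks skin (ground)
```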

  6. Coupled multi-physics simulation frameworks for reactor simulation: A bottom-up approach

    International Nuclear Information System (INIS)

    Tautges, Timothy J.; Caceres, Alvaro; Jain, Rajeev; Kim, Hong-Jun; Kraftcheck, Jason A.; Smith, Brandon M.

    2011-01-01

A 'bottom-up' approach to multi-physics frameworks is described, where first common interfaces to simulation data are developed, then existing physics modules are adapted to communicate through those interfaces. Physics modules read and write data through those common interfaces, which also provide access to common simulation services like parallel IO, mesh partitioning, etc. Multi-physics codes are assembled as a combination of physics modules, services, interface implementations, and driver code which coordinates calling these various pieces. Examples of various physics modules and services connected to this framework are given. (author)
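The framework pattern described above can be sketched with a hypothetical API (not the paper's actual interfaces): physics modules exchange data only through a common interface object, and driver code coordinates the calls:

```python
# Hypothetical "common data interface" sketch: modules never talk to each
# other directly, only through shared mesh fields.

class MeshInterface:
    """Shared simulation-data access point for all physics modules."""
    def __init__(self):
        self.fields = {}

    def write(self, name, values):
        self.fields[name] = list(values)

    def read(self, name):
        return list(self.fields[name])

class NeutronicsModule:
    def solve(self, mesh):
        # Toy stand-in: produce a power-density field on the mesh.
        mesh.write("power", [1.0, 2.0, 3.0])

class ThermalModule:
    def solve(self, mesh):
        # Consume the other module's output via the common interface only.
        power = mesh.read("power")
        mesh.write("temperature", [300.0 + 10.0 * p for p in power])

# Driver code coordinates calling the pieces, as in the bottom-up framework.
mesh = MeshInterface()
NeutronicsModule().solve(mesh)
ThermalModule().solve(mesh)
print(mesh.read("temperature"))
```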

  7. Public engagement as a field of tension between bottom-up and top-down strategies

    DEFF Research Database (Denmark)

    Horsbøl, Anders; Lassen, Inger

    2012-01-01

    In the ongoing debate about climate change, public engagement is given increasing prominence as a possible solution to a general lack of citizen participation in climate change mitigation efforts. Recent years have seen a surge in public engagement initiatives in many countries in the Western world....... These initiatives often have to deal with dilemmas between participatory aspects and other considerations such as planning efficiency, dilemmas that potentially bring about tension between bottom-up and top-down strategies. Literature on climate change issues has addressed the failure of public response, which has...... knowledge and information about climate change has not significantly changed people’s behaviour towards higher involvement....

  8. Towards a field-theory interpretation of bottom-up holography

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, V.P.J.; Grubinskas, S.; Stoof, H.T.C. [Institute for Theoretical Physics and Center for Extreme Matter and Emergent Phenomena,Utrecht University,Leuvenlaan 4, 3584 CE Utrecht (Netherlands)

    2015-04-08

    We investigate recent results for the electrical conductivity and the fermionic self-energy, obtained in a holographic bottom-up model for a relativistic charge-neutral conformal field theory. We present two possible field-theoretic derivations of these results, using either a semiholographic or a holographic point of view. In the semiholographic interpretation, we also show how, in general, the conductivity should be calculated in agreement with Ward identities. The resulting field-theory interpretation may lead to a better understanding of the holographic dictionary in applied AdS/CMT.

  9. NEMO. Netherlands Energy demand MOdel. A top-down model based on bottom-up information

    International Nuclear Information System (INIS)

    Koopmans, C.C.; Te Velde, D.W.; Groot, W.; Hendriks, J.H.A.

    1999-06-01

The title model links energy use to other production factors, (physical) production, energy prices, technological trends and government policies. It uses a 'putty-semiputty' vintage production structure, in which new investments, adaptations to existing capital goods (retrofit) and 'good-housekeeping' are discerned. Price elasticities are relatively large in the long term and small in the short term. Most predictions of energy use are based on either econometric models or on 'bottom-up information', i.e. disaggregated lists of technical possibilities for and costs of saving energy. Typically, one predicts more energy-efficiency improvements using bottom-up information than using econometric ('top-down') models. We bridged this so-called 'energy-efficiency gap' by designing our macro/meso model NEMO in such a way that we can use bottom-up (micro) information to estimate most model parameters. In our view, reflected in NEMO, the energy-efficiency gap arises for two reasons. The first is that firms and households use a fairly high discount rate of 15% when evaluating the profitability of energy-efficiency improvements. The second is that our bottom-up information ('ICARUS') for most economic sectors does not (as NEMO does) take account of the fact that implementation of new, energy-efficient technology in capital stock takes place only gradually. Parameter estimates for 19 sectors point to a long-term technological energy-efficiency improvement trend in Netherlands final energy use of 0.8% per year. The long-term price elasticity is estimated to be 0.29. These values are comparable to other studies based on time series data. Simulations of the effects of the oil price shocks in the seventies and the subsequent fall of oil prices show that NEMO's price elasticities are consistent with historical data. However, the present pace at which new technologies become available (reflected in NEMO) appears to be lower than in the seventies and eighties. 
This suggests that it

  10. Co-financing of bottom-up approaches towards Broadband Infrastructure Development

    DEFF Research Database (Denmark)

    Williams, Idongesit

    2016-01-01

with financial injection and the other did not, due to low revenue. This paper, based on these cases, proposes the utilization and reintroduction of Universal Service Funds in developing countries to aid these small networks. This is a qualitative study; the Grounded Theory approach was adopted to gather...... networks – leading to the demise of some of these initiatives. This paper proposes co-financing of these networks as a means of sustaining bottom-up broadband networks. The argument of this paper is anchored on two developing-country cases, one in India and the other in Ghana. One survived

  11. A bottom-up approach for the synthesis of highly ordered fullerene-intercalated graphene hybrids

    Directory of Open Access Journals (Sweden)

    Dimitrios eGournis

    2015-02-01

Much of the research effort on graphene focuses on its use as a building block for the development of new hybrid nanostructures with well-defined dimensions and properties suitable for applications such as gas storage, heterogeneous catalysis, gas/liquid separations, nanosensing and biomedicine. Towards this aim, here we describe a new bottom-up approach, which combines self-assembly with the Langmuir-Schaefer deposition technique, to synthesize graphene-based layered hybrid materials hosting fullerene molecules within the interlayer space. Our film preparation consists of a bottom-up layer-by-layer process that proceeds via the formation of a hybrid organo-graphene oxide Langmuir film. The structure and composition of these hybrid fullerene-containing thin multilayers deposited on hydrophobic substrates were characterized by a combination of X-ray diffraction, Raman and X-ray photoelectron spectroscopies, atomic force microscopy and conductivity measurements. The latter revealed that the presence of C60 within the interlayer spacing leads to an increase in the electrical conductivity of the hybrid material as compared to the organo-graphene matrix alone.

  12. Top down and bottom up selection drives variations in frequency and form of a visual signal.

    Science.gov (United States)

    Yeh, Chien-Wei; Blamires, Sean J; Liao, Chen-Pan; Tso, I-Min

    2015-03-30

The frequency and form of visual signals can be shaped by selection from predators, prey, or both. When a signal simultaneously attracts predators and prey, selection may favour a strategy that minimizes risks while attracting prey. Accordingly, varying the frequency and form of the silken decorations added to their webs may be a way that Argiope spiders minimize predation while attracting prey. Nonetheless, the role of extraneous factors renders the influences of top-down and bottom-up selection on decoration frequency and form variation difficult to discern. Here we used dummy spiders and decorations to simulate four possible strategies that the spider Argiope aemula may choose and measured the prey- and predator-attraction consequences of each in the field. The strategy of decorating at a high frequency with a variable form attracted the most prey, while that of decorating at a high frequency with a fixed form attracted the most predators. These results suggest that mitigating the cost of attracting predators while maintaining prey attraction drives the use of variation in decoration form by many Argiope spp. when decorating frequently. Our study highlights the importance of considering top-down and bottom-up selection pressures when devising evolutionary ecology experiments.

  13. Selective spatial attention modulates bottom-up informational masking of speech.

    Science.gov (United States)

    Carlile, Simon; Corkhill, Caitlin

    2015-03-02

    To hear out a conversation against other talkers, listeners must overcome energetic and informational masking. Although largely attributed to top-down processes, informational masking has also been demonstrated using unintelligible speech and amplitude-modulated maskers, suggesting bottom-up processes. We examined the role of speech-like amplitude modulations in informational masking using a spatial masking release paradigm. Spatially separating a target talker from two masker talkers produced a 20 dB improvement in speech reception threshold, 40% of which was attributed to a release from informational masking. When the across-frequency temporal modulations of the masker talkers are decorrelated, the speech becomes unintelligible, although the within-frequency modulation characteristics remain identical. With this masker, informational masking accounted for 37% of the spatial unmasking. Such an unintelligible yet highly differentiable masker is unlikely to engage top-down processes. These data provide strong evidence of bottom-up masking involving speech-like, within-frequency modulations, and show that this presumably low-level process can be modulated by selective spatial attention.

  14. The bottom-up approach to integrative validity: a new perspective for program evaluation.

    Science.gov (United States)

    Chen, Huey T

    2010-08-01

    The Campbellian validity model and the traditional top-down approach to validity have had a profound influence on research and evaluation. That model includes the concepts of internal and external validity and, within it, the preeminence of internal validity as demonstrated in the top-down approach. Evaluators and researchers have, however, increasingly recognized that over-emphasis on internal validity reduces an evaluation's usefulness and contributes to the gulf between academic and practical communities regarding interventions. This article examines the limitations of the Campbellian validity model and the top-down approach and provides a comprehensive alternative, known as the integrative validity model for program evaluation. The integrative validity model includes the concept of viable validity, which is predicated on a bottom-up approach to validity. This approach better reflects stakeholders' evaluation views and concerns, makes external validity workable, and is therefore a preferable alternative for the evaluation of health promotion/social betterment programs. The integrative validity model and the bottom-up approach enable evaluators to meet scientific and practical requirements, facilitate advancing external validity, and gain a new perspective on methods. The new perspective also furnishes a balanced view of credible evidence and offers an alternative perspective for funding. Copyright (c) 2009 Elsevier Ltd. All rights reserved.

  15. A bottom-up model to describe consumers’ preferences towards late season peaches

    Energy Technology Data Exchange (ETDEWEB)

    Groot, E.; Albisu, L.M.

    2015-07-01

    Peaches have been consumed in Mediterranean countries since ancient times. Nowadays there are few areas in Europe that produce peaches with a Protected Designation of Origin (PDO), and the Calanda area is one of them. The aim of this work is to describe consumers’ preferences towards late-season PDO Calanda peaches in the city of Zaragoza, Spain, using a bottom-up model. The bottom-up model provides a greater amount of information than top-down models: one utility function is estimated per consumer, so it is not necessary to make assumptions about preference distributions and correlations across respondents. It was observed that preferences were neither normally nor independently distributed; had they been estimated by top-down models, conclusions would have been biased. This paper also explores a new way to describe preferences through individual utility functions. Results show that the largest behavioural group comprised origin-sensitive consumers. Their utility increased if the peaches were produced in the Calanda area and, especially, when peaches carried the PDO Calanda brand. The second most valuable attribute for consumers was price. Peach size and packaging were less important in the purchase decision. Nevertheless, it is advisable to avoid trading the smallest peaches (weighing around 160 g/fruit). Traders also have to be careful when using active packaging: a group of consumers disliked this kind of product, probably because they perceived it as less natural. (Author)
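
    The core of the bottom-up estimation described above (one utility function per respondent, no pooling) can be illustrated with a small sketch. This is not the authors' model; the linear-utility form, the attribute coding and the toy ratings are invented for the example:

    ```python
    # Bottom-up preference estimation sketch: fit one linear utility function
    # per respondent instead of pooling everyone into a single model.
    # Columns of the design matrix: intercept, PDO-Calanda label (0/1),
    # price (EUR/kg). All values are invented for illustration.
    import numpy as np

    profiles = np.array([
        [1, 0, 2.0],
        [1, 1, 2.0],
        [1, 0, 3.0],
        [1, 1, 3.0],
    ], dtype=float)

    # Toy ratings from two respondents with opposite sensitivities.
    ratings = {
        "origin_sensitive": np.array([4.0, 8.0, 3.0, 7.0]),
        "price_sensitive": np.array([7.0, 7.5, 3.0, 3.5]),
    }

    def individual_utilities(X, y_by_respondent):
        """Bottom-up: one coefficient vector per respondent (no pooling)."""
        return {name: np.linalg.lstsq(X, y, rcond=None)[0]
                for name, y in y_by_respondent.items()}

    betas = individual_utilities(profiles, ratings)
    for name, b in betas.items():
        print("%s: PDO effect %.2f, price effect %.2f" % (name, b[1], b[2]))
    ```

    With pooled (top-down) estimation the two respondents' opposing part-worths would average out, which is exactly the bias the abstract warns about.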

  16. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation

    Directory of Open Access Journals (Sweden)

    Rahul Deb Das

    2016-11-01

    Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers’ smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries and for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Segmentation approaches are also not suited to real-time interpretation of open-ended segments, and cannot cope with the frequent gaps in location traces. To address these challenges, a novel state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and iterates progressively until a new state is found. The research investigates how an atomic state-based approach can be developed so that it works in real-time, near-real-time and offline modes and in different environmental conditions with varying quality of sensor traces. The results show the proposed bottom-up model outperforms existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness of information delivery pertinent to automated travel behavior interpretation.
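
    The atomic, state-based idea described above can be sketched as: cut the trace into fixed-size atomic windows, label each window with a state, and grow a segment as long as the state is unchanged. The speed thresholds and mode labels below are illustrative assumptions, not the paper's actual classifier:

    ```python
    # Bottom-up trajectory segmentation sketch: fixed atomic windows of a
    # homogeneous state are merged while the state stays the same, instead of
    # searching for change-point "events". Thresholds are invented.

    def label_state(speeds):
        """Label one atomic window by its mean speed (m/s)."""
        mean = sum(speeds) / len(speeds)
        if mean < 2.0:
            return "walk"
        if mean < 8.0:
            return "bike"
        return "vehicle"

    def segment(speed_trace, atom_size=5):
        """Merge consecutive same-state atomic windows into segments."""
        segments = []  # list of (state, start_index, end_index)
        for start in range(0, len(speed_trace), atom_size):
            window = speed_trace[start:start + atom_size]
            state = label_state(window)
            if segments and segments[-1][0] == state:
                prev_state, prev_start, _ = segments[-1]
                segments[-1] = (prev_state, prev_start, start + len(window))
            else:
                segments.append((state, start, start + len(window)))
        return segments

    trace = [1.2, 1.0, 1.4, 1.1, 1.3,       # walking
             6.0, 6.5, 7.0, 6.8, 6.2,       # cycling
             12.0, 13.5, 14.0, 12.8, 13.1]  # vehicle
    print(segment(trace))
    # → [('walk', 0, 5), ('bike', 5, 10), ('vehicle', 10, 15)]
    ```

    Because each atomic window is classified as it arrives, the same loop works in real time on an open-ended stream, which is the advantage the abstract claims over event-based segmentation.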

  17. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.

    Science.gov (United States)

    Das, Rahul Deb; Winter, Stephan

    2016-11-23

    Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries and for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic and, thus, existing approaches are subjective and involve significant vagueness and uncertainty in activity transitions in space and time. Segmentation approaches are also not suited to real-time interpretation of open-ended segments, and cannot cope with the frequent gaps in location traces. To address these challenges, a novel state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and iterates progressively until a new state is found. The research investigates how an atomic state-based approach can be developed so that it works in real-time, near-real-time and offline modes and in different environmental conditions with varying quality of sensor traces. The results show the proposed bottom-up model outperforms existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness of information delivery pertinent to automated travel behavior interpretation.

  18. Bottom-up approach for decentralised energy planning. Case study of Tumkur district in India

    Energy Technology Data Exchange (ETDEWEB)

    Hiremath, Rahul B. [Walchand Institute of Technology, Solapur 413006 (India); Kumar, Bimlesh [Civil Engineering, Indian Institute of Technology, Guwahati 781039 (India); Balachandra, P. [Energy Technology Innovation Policy, Belfer Center for Science and International Affairs, Harvard Kennedy School, Harvard University, Cambridge, MA 02138 (United States); Ravindranath, N.H. [CST, IISc, Bangalore 560012 (India)

    2010-02-15

    Decentralized Energy Planning (DEP) is one of the options to meet rural and small-scale energy needs in a reliable, affordable and environmentally sustainable way. The main aspect of energy planning at the decentralized level is to prepare an area-based DEP to meet energy needs, and to develop alternate energy sources, at least cost to the economy and the environment. The present work uses a goal-programming method to analyze DEP through a bottom-up approach, planning from the lowest scale upwards within Tumkur district in India. The scales of analysis were the village level (Ungra), the panchayat (local council) level (Yedavani), the block level (Kunigal) and the district level (Tumkur). The bottom-up (village-to-district) approach was adopted to allow a detailed description of energy services and the resulting demand for energy forms and supply technologies. Different scenarios at the four decentralized scales are considered for the year 2005 and developed and analyzed for the year 2020. Decentralized bioenergy systems producing biogas and electricity from local biomass resources are shown to promote development better than other renewables. This is because, apart from meeting energy needs, multiple goals could be achieved, such as self-reliance, local employment and land reclamation, in addition to CO2 emissions reduction. (author)
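
    The goal-programming method mentioned above is typically formulated as a linear programme with deviation variables: each goal gets an "under" and "over" variable, and the objective penalizes unwanted deviations. A minimal sketch with one invented village-scale example (all coefficients, targets and weights are assumptions, not the paper's data):

    ```python
    # Weighted goal-programming sketch: allocate 100 MWh of demand between a
    # biogas plant and a diesel set, minimizing weighted overshoots of a cost
    # goal and a CO2 goal. All numbers are invented for illustration.
    from scipy.optimize import linprog

    # Variables: [x_biogas, x_diesel, cost_under, cost_over, em_under, em_over]
    c = [0, 0, 0, 1.0, 0, 100.0]  # penalize only overshoots; CO2 weighted 100x

    A_eq = [
        [1,   1,   0,  0,  0,  0],   # energy balance: supply meets demand
        [50,  30,  1, -1,  0,  0],   # cost goal:     cost + under - over = target
        [0.1, 0.8, 0,  0,  1, -1],   # emission goal: em   + under - over = target
    ]
    b_eq = [100, 4000, 30]  # demand (MWh), cost target, emission target

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 6)
    print("biogas %.1f MWh, diesel %.1f MWh, weighted penalty %.1f"
          % (res.x[0], res.x[1], res.fun))
    ```

    With the heavy CO2 weight, the solver leans on biogas and accepts a cost overshoot instead, which mirrors the multi-goal trade-off (cost vs. environment vs. self-reliance) the abstract describes.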

  19. Bottom-up approach for decentralised energy planning: Case study of Tumkur district in India

    Energy Technology Data Exchange (ETDEWEB)

    Hiremath, Rahul B., E-mail: rahulhiremath@gmail.co [Walchand Institute of Technology Solapur 413006 (India); Kumar, Bimlesh, E-mail: bimk@iitg.ernet.i [Civil Engineering, Indian Institute of Technology, Guwahati 781039 (India); Balachandra, P., E-mail: balachandra_patil@hks.harvard.ed [Energy Technology Innovation Policy, Belfer Center for Science and International Affairs, Harvard Kennedy School, Harvard University, Cambridge, MA 02138 (United States); Ravindranath, N.H., E-mail: ravi@ces.iisc.ernet.i [CST, IISc, Bangalore 560012 (India)

    2010-02-15

    Decentralized Energy Planning (DEP) is one of the options to meet rural and small-scale energy needs in a reliable, affordable and environmentally sustainable way. The main aspect of energy planning at the decentralized level is to prepare an area-based DEP to meet energy needs, and to develop alternate energy sources, at least cost to the economy and the environment. The present work uses a goal-programming method to analyze DEP through a bottom-up approach, planning from the lowest scale upwards within Tumkur district in India. The scales of analysis were the village level (Ungra), the panchayat (local council) level (Yedavani), the block level (Kunigal) and the district level (Tumkur). The bottom-up (village-to-district) approach was adopted to allow a detailed description of energy services and the resulting demand for energy forms and supply technologies. Different scenarios at the four decentralized scales are considered for the year 2005 and developed and analyzed for the year 2020. Decentralized bioenergy systems producing biogas and electricity from local biomass resources are shown to promote development better than other renewables. This is because, apart from meeting energy needs, multiple goals could be achieved, such as self-reliance, local employment and land reclamation, in addition to CO2 emissions reduction.

  20. Top-down (Prior Knowledge) and Bottom-up (Perceptual Modality) Influences on Spontaneous Interpersonal Synchronization.

    Science.gov (United States)

    Gipson, Christina L; Gorman, Jamie C; Hessler, Eric E

    2016-04-01

    Coordination with others is such a fundamental part of human activity that it can happen unintentionally. This unintentional coordination can manifest as synchronization and is observed in physical and human systems alike. We investigated the role of top-down influences (prior knowledge of the perceptual modality one's partner is using) and bottom-up factors (the combination of perceptual modalities) in spontaneous interpersonal synchronization. We examine this phenomenon with respect to two theoretical perspectives that differently emphasize top-down and bottom-up factors in interpersonal synchronization: joint-action/shared-cognition theories and ecological-interactive theories. In an empirical study, twelve dyads performed a finger oscillation task while attending to each other's movements through visual, auditory, or combined visual and auditory perceptual modalities. Half of the participants were given prior knowledge of their partner's perceptual capabilities for coordinating across these different perceptual modality combinations. We found that the effect of top-down influence depends on the perceptual modality combination between two individuals: when people used the same perceptual modalities, top-down influence resulted in less synchronization, and when people used different perceptual modalities, top-down influence resulted in more synchronization. Furthermore, the persistence of the change in behavior resulting from having perceptual information about each other ('social memory') was stronger when this top-down influence was present.

  1. Comparing effectiveness of top-down and bottom-up strategies in containing influenza.

    Directory of Open Access Journals (Sweden)

    Achla Marathe

    This research compares the performance of bottom-up, self-motivated behavioral interventions with top-down interventions targeted at controlling an "influenza-like illness". Both types of interventions use a variant of the ring strategy. In the first case, when the fraction of a person's direct contacts who are diagnosed exceeds a threshold, that person decides to seek prophylaxis, e.g., vaccine or antivirals; in the second case, we consider two intervention protocols, denoted Block and School: when the fraction of people who are diagnosed in a Census Block (resp., School) exceeds the threshold, the entire Block (resp., School) is prophylaxed. Results show that the bottom-up strategy outperforms the top-down strategies under our parameter settings. Even in situations where the Block strategy reduces the overall attack rate well, it incurs a much higher cost. These findings lend credence to the notion that if people used antivirals effectively, making them available quickly on demand to private citizens could be a very effective way to control an outbreak.
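
    The bottom-up ("self-motivated") decision rule described above can be sketched in a few lines: each person watches only their own contacts and seeks prophylaxis once the fraction of diagnosed contacts crosses a threshold. The tiny contact graph and the threshold value are invented for the example:

    ```python
    # Bottom-up intervention sketch: each individual independently decides to
    # seek prophylaxis when the fraction of their own diagnosed contacts
    # reaches a threshold. Graph and threshold are invented.

    contacts = {
        "ann": ["bob", "cat", "dan"],
        "bob": ["ann", "cat"],
        "cat": ["ann", "bob", "dan"],
        "dan": ["ann", "cat"],
    }
    diagnosed = {"bob", "cat"}

    def seeks_prophylaxis(person, threshold=0.5):
        """True if the fraction of a person's diagnosed contacts >= threshold."""
        nbrs = contacts[person]
        frac = sum(n in diagnosed for n in nbrs) / len(nbrs)
        return frac >= threshold

    takers = sorted(p for p in contacts
                    if p not in diagnosed and seeks_prophylaxis(p))
    print(takers)  # → ['ann', 'dan']
    ```

    The top-down Block/School protocols differ only in the aggregation unit: the same threshold test is applied to a whole Census Block or School, and everyone in the unit is prophylaxed at once, which is why they incur higher cost.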

  2. Resolving the chemical nature of nanodesigned silica surface obtained via a bottom-up approach.

    Science.gov (United States)

    Rahma, Hakim; Buffeteau, Thierry; Belin, Colette; Le Bourdon, Gwenaëlle; Degueil, Marie; Bennetau, Bernard; Vellutini, Luc; Heuzé, Karine

    2013-08-14

    The covalent grafting onto silica surfaces of a functional dendritic organosilane coupling agent, inserted in a long alkyl-chain monolayer, is described. In this paper, we show that depending on experimental parameters, particularly the solvent, it is possible to obtain a nanodesigned surface via a bottom-up approach. Thus, we succeed in forming both a homogeneous dense monolayer and a heterogeneous dense monolayer, the latter characterized by a nanosized volcano-type pattern (4-6 nm in height, 100 nm in width, and around 3 volcanoes/μm²) randomly distributed over the surface. The dendritic attribute of the grafted silylated coupling agent affords enough anchoring sites to covalently immobilize functional gold nanoparticles (GNPs), coated with an amino-PEG polymer, used to resolve the chemical nature of the surfaces and especially the volcano-type nanopattern structures of the heterogeneous monolayer. The versatile surface chemistry developed herein is particularly attractive, as the nanodesign is achieved straightforwardly in a bottom-up approach without any specific lithography device.

  3. Top-down versus bottom-up processing of influence diagrams in probabilistic analysis

    International Nuclear Information System (INIS)

    Timmerman, R.D.; Burns, T.J.; Dodds, H.L. Jr.

    1986-01-01

    Recent work by Phillips et al. and Selby et al. has shown that influence-diagram methodology can be a useful analytical tool in reactor safety studies. In some instances, an influence diagram can be used as a graphical representation of probabilistic dependence within a system or event sequence. Under these circumstances, Bayesian statistics is employed to transform the relationships depicted in the influence diagram into the correct expression for a desired marginal probability (e.g., of the top node). In the references cited above, the authors demonstrated the usefulness of influence diagrams for assessing the reliability of operator performance during pressurized thermal shock transients. In addition, the use of influence diagrams identified the critical variables that had the greatest impact on operator reliability for a particular scenario (e.g., control room design, procedures, etc.). Top-down and bottom-up algorithms have emerged as the dominant methods for quantifying influence diagrams. The purpose of this paper is to demonstrate a potential error in employing the bottom-up algorithm when dealing with interdependencies.
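
    The Bayesian step referred to above, obtaining the marginal probability of a top node, is just a sum over the parents' states weighted by their probabilities. A toy two-node illustration; the diagram and all probabilities are invented, not taken from the cited studies:

    ```python
    # Marginalization sketch for a two-node influence diagram:
    # P(error) = sum over designs d of P(error | d) * P(d).
    # All probabilities are invented for illustration.

    p_design = {"good": 0.7, "poor": 0.3}   # P(control-room design)
    p_error_given = {                        # P(operator error | design)
        "good": 0.01,
        "poor": 0.10,
    }

    p_error = sum(p_error_given[d] * p_design[d] for d in p_design)
    print(round(p_error, 4))  # → 0.037
    ```

    The top-down and bottom-up quantification algorithms differ in the order in which such conditional tables are combined; as the abstract notes, a careless bottom-up evaluation can go wrong when the parent nodes are themselves interdependent.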

  4. A combined bottom-up/top-down approach to prepare a sterile injectable nanosuspension.

    Science.gov (United States)

    Hu, Xi; Chen, Xi; Zhang, Ling; Lin, Xia; Zhang, Yu; Tang, Xing; Wang, Yanjiao

    2014-09-10

    To prepare a uniform nanosuspension of strongly hydrophobic riboflavin laurate (RFL) that allows sterile filtration, physical modification (bottom-up) was combined with a high-pressure homogenization (top-down) method. Unlike other bottom-up approaches, physical modification with surfactants (TPGS and PL-100) by lyophilization controlled crystallization and compensated for the poor wettability of RFL. On one hand, crystal growth and aggregation during freezing were restricted by a stabilizer layer adsorbed on the drug surface through hydrophobic interaction. On the other hand, subsequent crystallization of the drug in the sublimation process was limited to the interstitial spaces between solvent crystals. After lyophilization, modified drug with a smaller particle size and better wettability was obtained. When surfactant solution is added, water molecules pass between the hydrophilic groups of the surface-active molecules and activate the polymer chains, allowing them to stretch into the water. The coarse suspension was crushed into a nanosuspension (MP = 162 nm) by high-pressure homogenization. For long-term stability, lyophilization was applied again to solidify the nanosuspension (sorbitol as cryoprotectant). A slight crystal growth to about 600 nm was permitted to allow slow release for a sustained effect after intramuscular administration. Moreover, the absence of paw-licking responses and only very slight muscular inflammation demonstrated the excellent biocompatibility of this long-acting RFL injection. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Dynamic formulation of a top-down and bottom-up merging energy policy model

    International Nuclear Information System (INIS)

    Frei, Christoph W.; Haldi, P.-A.; Sarlos, G.Gerard

    2003-01-01

    The impact of energy policy measures is not restricted to the energy system and should therefore be analysed within an economy-wide framework, while keeping the essential details of the energy sector. The aim of this paper is to present new developments in the consistent evaluation of indicators for the sustainability assessment of energy policy measures. Starting from the static concept of Boehringer (Energy Econ. 20 (1998) 233), this paper shows how the complementarity format can be used in computable general equilibrium (CGE) modelling for a dynamic formulation of models that merge the bottom-up and top-down approaches. While a hybrid approach increases the credibility of CGE models in energy policy analysis by replacing the energy sector's generic functional forms with a bottom-up activity analysis based on specific technologies, the endogenous formulation of investment decisions makes possible an explicit description of evolving specific capital stocks and technology mixes. Both features are essential when assessing the effects of policy measures that may be affected by structural change, which is typically the case in the long-term assessment of energy policy measures.

  6. Painful faces-induced attentional blink modulated by top-down and bottom-up mechanisms

    Directory of Open Access Journals (Sweden)

    Chun Zheng

    2015-06-01

    Pain-related stimuli can capture attention in an automatic (bottom-up) or intentional (top-down) fashion. Previous studies have examined attentional capture by pain-related information using spatial attention paradigms that involve mainly a bottom-up mechanism. In the current study, we investigated the pain information-induced attentional blink (AB) using a rapid serial visual presentation (RSVP) task, and compared the effects of task-irrelevant and task-relevant pain distractors. Relationships between accuracy of target identification and individual traits (i.e., empathy and catastrophizing thinking about pain) were also examined. The results demonstrated that task-relevant painful faces produced a significant pain information-induced AB effect, whereas task-irrelevant painful faces showed a near-significant trend of this effect, supporting the notion that pain-related stimuli can influence the temporal dynamics of attention. Furthermore, we found a significant negative correlation between response accuracy and pain catastrophizing score in task-relevant trials. These findings suggest that active scanning of environmental information related to pain produces greater deficits in cognition than does unintentional attention toward pain, which may reflect the different ways in which healthy individuals and patients with chronic pain process pain-relevant information. These results may provide insight into the understanding of maladaptive attentional processing in patients with chronic pain.

  7. Feature-based attention: it is all bottom-up priming.

    Science.gov (United States)

    Theeuwes, Jan

    2013-10-19

    Feature-based attention (FBA) enhances the representation of image characteristics throughout the visual field, a mechanism that is particularly useful when searching for a specific stimulus feature. Even though most theories of visual search implicitly or explicitly assume that FBA is under top-down control, we argue that the role of top-down processing in FBA may be limited. Our review of the literature indicates that all behavioural and neuro-imaging studies investigating FBA suffer from the shortcoming that they cannot rule out an effect of priming. Merely attending to a feature enhances the mandatory processing of that feature across the visual field, an effect that is likely to occur in an automatic, bottom-up way. Studies that have investigated the feasibility of FBA by means of cueing paradigms (e.g. 'prepare for red') suggest that the role of top-down processing in FBA is limited. Instead, the actual processing of the stimulus is needed to cause the mandatory tuning of responses throughout the visual field. We conclude that it is likely that all FBA effects reported previously are the result of bottom-up priming.

  8. Bottom-up realization and electrical characterization of a graphene-based device

    International Nuclear Information System (INIS)

    Maffucci, A; Micciulla, F; Cataldo, A; Bellucci, S; Miano, G

    2016-01-01

    We propose a bottom-up procedure to fabricate an easy-to-engineer graphene-based device, consisting of a microstrip-like circuit where few-layer graphene nanoplatelets are used to contact two copper electrodes. The graphene nanoplatelets are obtained by the microwave irradiation of intercalated graphite, i.e., an environmentally friendly, fast and low-cost procedure. The contact is created by a bottom-up process, driven by the application of a DC electrical field in the gap between the electrodes, yielding the formation of a graphene carpet. The electrical resistance of the device has been measured as a function of the gap length and device temperature. The possible use of this device as a gas sensor is demonstrated by measuring the sensitivity of its electrical resistance to the presence of gas. The measured results demonstrate a good degree of reproducibility in the fabrication process, and the competitive performance of devices, thus making the proposed technique potentially attractive for industrial applications. (paper)

  9. Integrating the bottom-up and top-down approach to energy economy modelling. The case of Denmark

    DEFF Research Database (Denmark)

    Klinge Jacobsen, Henrik

    1998-01-01

    This paper presents results from an integration project covering Danish models based on bottom-up and top-down approaches to energy-economy modelling. The purpose of the project was to identify theoretical and methodological problems in integrating existing models for Denmark and to implement an integration of the models. The integration was established through a number of links between energy bottom-up modules and a macroeconomic model. In this integrated model it is possible to analyse both top-down instruments, such as taxes, and bottom-up instruments, such as regulation of technology...

  10. Evaluations of carbon fluxes estimated by top-down and bottom-up approaches

    Science.gov (United States)

    Murakami, K.; Sasai, T.; Kato, S.; Hiraki, K.; Maksyutov, S. S.; Yokota, T.; Nasahara, K.; Matsunaga, T.

    2013-12-01

    There are two types of approaches for estimating carbon fluxes from satellite observation data, referred to as top-down and bottom-up. Many uncertainties, however, still remain in these carbon flux estimations, because the true values of carbon flux are still unclear and estimations vary with the type of model (e.g. a transport model, a process-based model) and the input data. The CO2 fluxes in these approaches are estimated using different satellite data, such as the distribution of CO2 concentration in the top-down approach and land cover information (e.g. leaf area, surface temperature) in the bottom-up approach. Satellite-based CO2 flux estimations with reduced uncertainty can be used efficiently for identification of large emission areas and of the carbon stocks of forest areas. In this study, we evaluated the carbon flux estimates from the two approaches by comparing them with each other. The Greenhouse gases Observing SATellite (GOSAT) has been observing atmospheric CO2 concentrations since 2009. The GOSAT L4A data product provides monthly CO2 flux estimations for 64 sub-continental regions, estimated using GOSAT FTS SWIR L2 XCO2 data and an atmospheric tracer transport model. We used the GOSAT L4A CO2 flux as the top-down estimation and the net ecosystem production (NEP) estimated by the diagnostic-type biosphere model BEAMS as the bottom-up estimation. BEAMS NEP covers only the natural land CO2 flux, so we used the GOSAT L4A CO2 flux after subtracting anthropogenic CO2 emissions and the oceanic CO2 flux. We compared the two approaches in the temperate north-east Asia region. This region is covered by grassland and crop land (about 60%), forest (about 20%) and bare ground (about 20%). The temporal variation over a one-year period showed similar trends between the two approaches. Furthermore, we show the comparison of CO2 flux estimations in other sub-continental regions.
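
    The comparison step described above can be sketched numerically: remove anthropogenic and oceanic contributions from the top-down regional flux, then correlate the residual natural-land flux with the bottom-up NEP series. The twelve monthly values below are synthetic, not GOSAT or BEAMS data, and the flat anthropogenic/oceanic terms are a simplification:

    ```python
    # Top-down vs bottom-up flux comparison sketch. Sign convention assumed:
    # positive flux = emission to the atmosphere, positive NEP = uptake,
    # hence NEP is negated before comparison. All values are synthetic.

    top_down = [2.1, 1.8, 1.2, 0.3, -0.9, -1.8, -2.2, -1.9, -0.7, 0.5, 1.4, 2.0]
    anthro   = [0.6] * 12    # anthropogenic emissions (flat, for simplicity)
    ocean    = [-0.1] * 12   # oceanic uptake

    # Residual natural-land flux from the top-down estimate.
    natural_land = [t - a - o for t, a, o in zip(top_down, anthro, ocean)]

    # Bottom-up NEP (positive in the growing season).
    nep = [-1.5, -1.2, -0.6, 0.4, 1.5, 2.4, 2.8, 2.5, 1.2, 0.1, -0.9, -1.4]

    def pearson(x, y):
        """Pearson correlation of two equal-length series."""
        mx, my = sum(x) / len(x), sum(y) / len(y)
        cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
        var_x = sum((a - mx) ** 2 for a in x)
        var_y = sum((b - my) ** 2 for b in y)
        return cov / (var_x * var_y) ** 0.5

    r = pearson(natural_land, [-n for n in nep])
    print("monthly correlation of natural-land flux with -NEP: %.3f" % r)
    ```

    A high correlation over the seasonal cycle is the kind of "similar trend" agreement the abstract reports for temperate north-east Asia.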

  11. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of the different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence that verifies why this should be. Additionally, while the literature has examined the mechanics of the different approaches, less attention has been paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down and informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but also to gain insight into what psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down ones. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that, despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies.

  12. Elucidating the role of D4 receptors in mediating attributions of salience to incentive stimuli on Pavlovian conditioned approach and conditioned reinforcement paradigms.

    Science.gov (United States)

    Cocker, P J; Vonder Haar, C; Winstanley, C A

    2016-10-01

    The power of drug-associated cues to instigate drug 'wanting' and consequently promote drug seeking is a cornerstone of contemporary theories of addiction. Gambling disorder has recently been added to the pantheon of addictive disorders due to the phenomenological similarities between the conditions. However, the neurobiological mechanism that may mediate increased sensitivity towards conditioned stimuli in addictive disorders is unclear. We have previously demonstrated, using a rodent analogue of a simple slot machine, that the dopamine D4 receptor is critically engaged in controlling animals' attribution of salience to stimuli associated with reward in this paradigm, and consequently may represent a target for the treatment of gambling disorder. Here, we investigated the role of acute administration of a D4 receptor agonist on animals' responsivity to conditioned stimuli in both a Pavlovian conditioned approach (autoshaping) and a conditioned reinforcement paradigm. Following training on one of the two tasks, separate cohorts of rats (male and female) were administered a dose of PD168077 (10 mg/kg) shown to be maximally effective at precipitating errors in reward expectancy on the rat slot machine task. However, augmenting the activity of D4 receptors in this manner did not alter behaviour on either task. These data therefore provide novel evidence that the D4 receptor does not alter incentive motivation in response to cues on simple behavioural tasks. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Word-of-mouth marketing and enterprise strategies: a bottom-up diffusion model

    Directory of Open Access Journals (Sweden)

    Marco REMONDINO

    2011-06-01

    Full Text Available A comprehensive simulation model is presented, aimed at showing the dynamics of social diffusion based on word of mouth (e.g., viral marketing) over a social network of interconnected individuals. The model is built following a bottom-up approach and the agent-based paradigm; this means that the dynamics of the diffusion is simulated in real time and generated at the micro level, not calculated using mathematical formulas. This makes it possible both to follow the emergent process step by step and, if needed, to add complex behaviour for the agents and analyze how this impacts the phenomenon at the macro level.
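
    The micro-level, step-by-step diffusion dynamic described in this abstract can be sketched in a few lines. This is a minimal illustration only, not the authors' implementation: the random-network construction, parameter names, and all numeric values below are assumptions.

    ```python
    import random

    def simulate_wom(n_agents=200, avg_degree=6, p_transmit=0.1, seeds=3, steps=50, rng=None):
        """Step-by-step word-of-mouth diffusion over a random social network.

        Micro-level rule: at each step, every adopter independently tries to
        convince each non-adopting neighbour with probability p_transmit.
        The macro-level adoption curve emerges from these interactions rather
        than from a closed-form formula.
        """
        rng = rng or random.Random(42)
        # Illustrative Erdos-Renyi-style network standing in for a real social graph.
        neighbours = {i: set() for i in range(n_agents)}
        p_edge = avg_degree / (n_agents - 1)
        for i in range(n_agents):
            for j in range(i + 1, n_agents):
                if rng.random() < p_edge:
                    neighbours[i].add(j)
                    neighbours[j].add(i)
        adopted = set(rng.sample(range(n_agents), seeds))
        curve = [len(adopted)]  # emergent adoption curve, observed step by step
        for _ in range(steps):
            new = set()
            for agent in adopted:
                for nb in neighbours[agent]:
                    if nb not in adopted and rng.random() < p_transmit:
                        new.add(nb)
            adopted |= new
            curve.append(len(adopted))
        return curve

    curve = simulate_wom()
    print(curve)  # number of adopters after each simulation step
    ```

    Because the agents' behaviour is an explicit rule rather than an equation, it can be enriched (e.g., with heterogeneous persuasion thresholds) without changing the surrounding machinery, which is the flexibility the abstract attributes to the bottom-up approach.
    
    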

  14. Manufacturing at Nanoscale: Top-Down, Bottom-up and System Engineering

    International Nuclear Information System (INIS)

    Zhang Xiang; Sun Cheng; Fang, Nicholas

    2004-01-01

    The current nano-technology revolution is facing several major challenges: to manufacture nanodevices below 20 nm, to fabricate three-dimensional complex nano-structures, and to heterogeneously integrate multiple functionalities. To tackle these grand challenges, the Center for Scalable and Integrated NAno-Manufacturing (SINAM), an NSF Nanoscale Science and Engineering Center, has set its goal to establish a new manufacturing paradigm that integrates an array of new nano-manufacturing technologies, including plasmonic imaging lithography and ultramolding imprint lithography aiming toward critical resolutions of 1-10 nm, and hybrid top-down and bottom-up technologies to achieve massively parallel integration of heterogeneous nanoscale components into higher-order structures and devices. Furthermore, SINAM will develop system engineering strategies to scale up these nano-manufacturing technologies. SINAM's integrated research and education platform will shed light on a broad range of potential applications in computing, telecommunication, photonics, biotechnology, health care, and national security.

  15. Bottom-up control of geomagnetic secular variation by the Earth's inner core

    DEFF Research Database (Denmark)

    Aubert, Julien; Finlay, Chris; Fournier, Alexandre

    2013-01-01

    Temporal changes in the Earth's magnetic field, known as geomagnetic secular variation, occur most prominently at low latitudes in the Atlantic hemisphere (that is, from −90 degrees east to 90 degrees east), whereas in the Pacific hemisphere there is comparatively little activity… of geomagnetic secular variation. Here we show that it can be reproduced provided that two mechanisms relying on the inner core are jointly considered. First, gravitational coupling aligns the inner core with the mantle, forcing the flow of liquid metal in the outer core into a giant, westward-drifting sheet… release in the outer core, which in turn distorts the gyre, forcing it to become eccentric, in agreement with recent core flow inversions. This bottom-up heterogeneous driving of core convection dominates top-down driving from mantle thermal heterogeneities, and localizes magnetic variations…

  16. Bottom-Up Cost Analysis of a High Concentration PV Module; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, K.; Woodhouse, M.; Lee, H.; Smestad, G.

    2015-04-13

    We present a bottom-up cost model of III-V multi-junction cells, as well as of a high-concentration PV (HCPV) module. We calculate $0.65/Wp(DC) manufacturing costs for our model HCPV module design with today's capabilities, and find that reducing cell costs and increasing module efficiency offer the most promising pathways for future cost reductions. Cell costs could be significantly reduced via an increase in manufacturing scale, substrate reuse, and improved manufacturing yields. We also identify several other significant drivers of HCPV module costs, including the Fresnel lens primary optic, the module housing, thermal management, and the receiver board. These costs could potentially be lowered by employing innovative module designs.

  17. A bottom-up perspective on leadership of collaborative innovation in the public sector

    DEFF Research Database (Denmark)

    Hansen, Jesper Rohr

    The thesis investigates how new forms of public leadership can contribute to solving complex problems in today's welfare societies through innovation. A bottom-up type of leadership for collaborative innovation addressing wicked problems is theorised, displaying a social constructive process approach to leadership; a theoretical model emphasises that leadership emerges through social processes of recognition. Leadership is recognised by utilising the uncertainty of a wicked problem and innovation to influence collaborators' sensemaking processes. The empirical setting is the City of Copenhagen. … A crucial condition for success is iterative leadership adaptation. In conclusion, the thesis finds that specialized professionals are indeed able to develop politically viable, innovative and collaborative solutions to wicked problems, and that such professionals are able to transform themselves…

  18. Bottom-Up Nanofabrication of Supported Noble Metal Alloy Nanoparticle Arrays for Plasmonics

    DEFF Research Database (Denmark)

    Nugroho, Ferry A. A.; Iandolo, Beniamino; Wagner, Jakob Birkedal

    2016-01-01

    Mixing different elements at the nanoscale to obtain alloy nanostructures with fine-tuned physical and chemical properties offers appealing opportunities for nanotechnology and nanoscience. However, despite the widespread successful application of alloy nanoparticles made by colloidal synthesis in heterogeneous catalysis, nanoalloy systems have been used very rarely in solid-state devices and nanoplasmonics-related applications. One reason is that such applications require integration in arrays on a surface, with compelling demands on nanoparticle arrangement, uniformity in surface coverage, and optimization of the surface density. These cannot be fulfilled even using state-of-the-art self-assembly strategies for colloids. As a solution, we present here a generic bottom-up, nanolithography-compatible fabrication approach for large-area arrays of alloy nanoparticles on surfaces. To illustrate…

  19. Mesoporous ZSM-5 Zeolites in Acid Catalysis: Top-Down vs. Bottom-Up Approach

    Directory of Open Access Journals (Sweden)

    Pit Losch

    2017-07-01

    Full Text Available A top-down desilication of Al-rich ZSM-5 zeolites and a bottom-up mesopore-creation method were evaluated in this study. Three liquid–solid and one gas–solid heterogeneously-catalysed reactions were chosen to establish relationships between the zeolites' textural properties and their catalytic behaviour in acid-catalysed model reactions influenced by shape selectivity: the Diels-Alder cyclization between isoprene and methyl acrylate, the Methanol-to-Olefins (MTO) reaction, the chlorination of iodobenzene with trichloroisocyanuric acid (TCCA), and the Friedel-Crafts acylation of anisole by carboxylic acids of differing sizes. It was found, among other things, that no single mesoporosity is optimal for all the different reactions; depending on the chosen application, a specific treatment has to be selected to achieve high activity/selectivity and stability.

  20. Bottom-up effects of climate on fish populations: data from the Continuous Plankton Recorder

    DEFF Research Database (Denmark)

    Pitois, S.G.; Lynam, C.P.; Jansen, Teunis

    2012-01-01

    The Continuous Plankton Recorder (CPR) dataset on fish larvae has an extensive spatio-temporal coverage that allows the responses of fish populations to past changes in climate variability, including abrupt changes such as regime shifts, to be investigated. The newly available dataset offers… in the plankton ecosystem, while the larvae of migratory species such as Atlantic mackerel responded more to hydrographic changes. Climate variability seems more likely to influence fish populations through bottom-up control via a cascading effect from changes in the North Atlantic Oscillation (NAO) impacting… with fishing effects interacting with climate effects, and this study supports furthering our understanding of such interactions before attempting to predict how fish populations respond to climate variability…

  1. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    Full Text Available This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers) engage people in challenging undertakings (e.g., innovation) that require everyone's commitment to such a degree that they have to go beyond what could reasonably be expected in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than with solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, affinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a complementary theory to other strategic management and leadership theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  2. Electrodeposition in capillaries: bottom-up micro- and nanopatterning of functional materials on conductive substrates.

    Science.gov (United States)

    George, Antony; Maijenburg, A Wouter; Maas, Michiel G; Blank, Dave H A; Ten Elshof, Johan E

    2011-09-01

    A cost-effective and versatile methodology for bottom-up patterned growth of inorganic and metallic materials on the micro- and nanoscale is presented. Pulsed electrodeposition was employed to deposit arbitrary patterns of Ni, ZnO, and FeO(OH) of high quality, with lateral feature sizes down to 200-290 nm. The pattern was defined by an oxygen plasma-treated patterned PDMS mold in conformal contact with a conducting substrate and immersed in an electrolyte solution, so that the solid phases were deposited from the solution in the channels of the patterned mold. It is important that the distance between the entrance of the channels and the location where deposition is needed is kept small. The as-formed patterns were characterized by high-resolution scanning electron microscopy, energy-dispersive X-ray analysis, atomic force microscopy, and X-ray diffraction.

  3. MaxSynBio - Avenues towards creating cells from the bottom up.

    Science.gov (United States)

    Schwille, Petra; Spatz, Joachim; Landfester, Katharina; Bodenschatz, Eberhard; Herminghaus, Stephan; Sourjik, Victor; Erb, Tobias; Bastiaens, Philippe; Lipowsky, Reinhard; Hyman, Anthony; Dabrock, Peter; Baret, Jean-Christophe; Vidakovic-Koch, Tanja; Bieling, Peter; Dimova, Rumiana; Mutschler, Hannes; Robinson, Tom; Tang, Dora; Wegner, Seraphine; Sundmacher, Kai

    2018-05-11

    A large Max Planck-based German research consortium ('MaxSynBio') was formed to investigate living systems from a fundamental perspective. The research program of MaxSynBio relies solely on the bottom-up approach to Synthetic Biology. MaxSynBio focuses on the detailed analysis and understanding of essential processes of life, via their modular reconstitution in minimal synthetic systems. The ultimate goal is to construct a basic living unit entirely from non-living components. The fundamental insights gained from the activities in MaxSynBio can eventually be utilized for establishing a new generation of biotechnological processes, which would be based on synthetic cell constructs that replace natural cells currently used in conventional biotechnology. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. Bottom-up effects of a no-take zone on endangered penguin demographics.

    Science.gov (United States)

    Sherley, Richard B; Winker, Henning; Altwegg, Res; van der Lingen, Carl D; Votier, Stephen C; Crawford, Robert J M

    2015-07-01

    Marine no-take zones can have positive impacts for target species and are increasingly important management tools. However, whether they indirectly benefit higher order predators remains unclear. The endangered African penguin (Spheniscus demersus) depends on commercially exploited forage fish. We examined how chick survival responded to an experimental 3-year fishery closure around Robben Island, South Africa, controlling for variation in prey biomass and fishery catches. Chick survival increased by 18% when the closure was initiated, which alone led to a predicted 27% higher population compared with continued fishing. However, the modelled population continued to decline, probably because of high adult mortality linked to poor prey availability over larger spatial scales. Our results illustrate that small no-take zones can have bottom-up benefits for highly mobile marine predators, but are only one component of holistic, ecosystem-based management regimes. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  5. The Ideological Divide Concerning Climate Change Opinion: Integrating Top-Down and Bottom-Up Approaches

    Directory of Open Access Journals (Sweden)

    Jennifer eJacquet

    2014-12-01

    Full Text Available The United States wields disproportionate global influence in terms of carbon dioxide emissions and international climate policy. This renders it an especially important context in which to examine the interplay among social, psychological, and political factors in shaping attitudes and behaviors about climate change. In this article, we review the emerging literature addressing the liberal-conservative divide in the U.S. with respect to thought, communication, and action concerning climate change. Because of its theoretical and practical significance, we focus on the motivational basis for skepticism and inaction on the part of some, including top-down institutional forces, such as corporate strategy, and bottom-up psychological factors, such as ego, group, and system justification. Although more research is needed to elucidate fully the social, cognitive, and motivational bases of environmental attitudes and behavior, a great deal has been learned in just a few years by focusing on specific ideological factors in addition to general psychological principles.

  6. Sequential bottom-up assembly of mechanically stabilized synthetic cells by microfluidics

    Science.gov (United States)

    Weiss, Marian; Frohnmayer, Johannes Patrick; Benk, Lucia Theresa; Haller, Barbara; Janiesch, Jan-Willi; Heitkamp, Thomas; Börsch, Michael; Lira, Rafael B.; Dimova, Rumiana; Lipowsky, Reinhard; Bodenschatz, Eberhard; Baret, Jean-Christophe; Vidakovic-Koch, Tanja; Sundmacher, Kai; Platzman, Ilia; Spatz, Joachim P.

    2018-01-01

    Compartments for the spatially and temporally controlled assembly of biological processes are essential for cellular life. Synthetic mimics of cellular compartments based on lipid-based protocells lack the mechanical and chemical stability to allow their manipulation into a complex and fully functional synthetic cell. Here, we present a high-throughput microfluidic method to generate stable liposomes of defined size, termed 'droplet-stabilized giant unilamellar vesicles' (dsGUVs). The enhanced stability of dsGUVs enables the sequential loading of these compartments with biomolecules, namely purified transmembrane and cytoskeleton proteins, by microfluidic pico-injection technology. This constitutes an experimental demonstration of a successful bottom-up assembly of a compartment with contents that would not self-assemble to full functionality when simply mixed together. Following assembly, the stabilizing oil phase and droplet shells are removed to release functional, self-supporting protocells into an aqueous phase, enabling them to interact with physiologically relevant matrices.

  7. Differential recolonization of Atlantic intertidal habitats after disturbance reveals potential bottom-up community regulation.

    Science.gov (United States)

    Petzold, Willy; Scrosati, Ricardo A

    2014-01-01

    In the spring of 2014, abundant sea ice that drifted out of the Gulf of St. Lawrence caused extensive disturbance in rocky intertidal habitats on the northern Atlantic coast of mainland Nova Scotia, Canada. To monitor recovery of intertidal communities, we surveyed two wave-exposed locations in the early summer of 2014. Barnacle recruitment and the abundance of predatory dogwhelks were low at one location (Tor Bay Provincial Park) but more than 20 times higher at the other location (Whitehead). Satellite data indicated that the abundance of coastal phytoplankton (the main food source for barnacle larvae) was consistently higher at Whitehead just before the barnacle recruitment season, when barnacle larvae were in the water column. These observations suggest bottom-up forcing of intertidal communities. The underlying mechanisms and their intensity along the NW Atlantic coast could be investigated through studies done at local and regional scales.

  8. A bottom-up route to enhance thermoelectric figures of merit in graphene nanoribbons

    DEFF Research Database (Denmark)

    Sevincli, Haldun; Sevik, Cem; Cagin, Tahir

    2013-01-01

    We propose a hybrid nano-structuring scheme for tailoring the thermal and thermoelectric transport properties of graphene nanoribbons. Geometrical structuring and isotope cluster engineering are the elements that constitute the proposed scheme. Using first-principles-based force constants and Hamiltonians, we show that the thermal conductance of graphene nanoribbons can be reduced by 98.8% at room temperature and the thermoelectric figure of merit, ZT, can be as high as 3.25 at T = 800 K. The proposed scheme relies on a recently developed bottom-up fabrication method, which is proven to be feasible…

  9. Bottom-Up Engineering of Well-Defined 3D Microtissues Using Microplatforms and Biomedical Applications.

    Science.gov (United States)

    Lee, Geon Hui; Lee, Jae Seo; Wang, Xiaohong; Lee, Sang Hoon

    2016-01-07

    During the last decades, the engineering of well-defined 3D tissues has attracted great attention because it provides an in vivo-mimicking environment and can serve as a building block for the engineering of bioartificial organs. In this Review, diverse methods for engineering 3D tissues using microscale devices are introduced. Recent progress in microtechnologies has enabled the development of microplatforms for the bottom-up assembly of diversely shaped 3D tissues consisting of various cells. Micro hanging-drop plates, microfluidic chips, and arrayed microwells are typical examples. The encapsulation of cells in hydrogel microspheres and microfibers allows the engineering of 3D microtissues with diverse shapes. Applications of 3D microtissues in biomedical fields are described, and the future direction of microplatform-based engineering of 3D microtissues is discussed. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Unsupervised Tattoo Segmentation Combining Bottom-Up and Top-Down Cues

    Energy Technology Data Exchange (ETDEWEB)

    Allen, Josef D [ORNL

    2011-01-01

    Tattoo segmentation is challenging due to the complexity and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin, and then distinguish tattoo from the other skin via a top-down prior in the image itself. Tattoo segmentation with an unknown number of clusters is transferred to a figure-ground segmentation. We have applied our segmentation algorithm on a tattoo dataset and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purposes.

  11. Bottom-up production of meta-atoms for optical magnetism in visible and NIR light

    Science.gov (United States)

    Barois, Philippe; Ponsinet, Virginie; Baron, Alexandre; Richetti, Philippe

    2018-02-01

    Many unusual optical properties of metamaterials arise from the magnetic response of engineered structures of sub-wavelength size (meta-atoms) exposed to light. The top-down approach, whereby engineered nanostructures of well-defined morphology are engraved on a surface, has proved successful for the generation of strong optical magnetism. It faces, however, the limitations of high cost and small active area in visible light, where nanometre resolution is needed. The bottom-up approach, whereby metamaterials of large volume or large area are fabricated by combining nanochemistry and self-assembly techniques, may constitute a cost-effective alternative. This approach nevertheless requires the large-scale production of functional building blocks (meta-atoms) bearing a strong magnetic optical response. We propose in this paper a few routes that lead to the large-scale synthesis of magnetic metamaterials operating in visible or near-IR light.

  12. Implementing collaborative improvement - top-down, bottom-up or both?

    DEFF Research Database (Denmark)

    Kaltoft, Rasmus; Boer, Harry; Caniato, Federico

    2007-01-01

    … the study identifies three different implementation approaches. The bottom-up learning-by-doing approach starts at a practical level, with simple improvement activities, and aims at gradually developing a wide range of CoI knowledge, skills and initiatives. The top-down directive approach starts with aligning the partners' CoI objectives and an assessment of their collaboration and CoI maturity in order to provide a common platform before actually starting improvement activities. The laissez-faire approach builds on shared goals/vision, meetings on equal terms and joint work, in a non-directive and non-facilitated way. The article demonstrates how and why the different approaches have different effects on the development of collaborative improvement.

  13. Bottom-up construction of a superstructure in a porous uranium-organic crystal

    Energy Technology Data Exchange (ETDEWEB)

    Li, Peng; Vermeulen, Nicolaas A.; Malliakas, Christos D.; Gómez-Gualdrón, Diego A.; Howarth, Ashlee J.; Mehdi, B. Layla; Dohnalkova, Alice; Browning, Nigel D.; O'Keeffe, Michael; Farha, Omar K.

    2017-04-20

    Bottom-up construction of highly intricate structures from simple building blocks remains one of the most difficult challenges in chemistry. We report a structurally complex, mesoporous uranium-based metal-organic framework (MOF) made from simple starting components. The structure comprises 10 uranium nodes and seven tricarboxylate ligands (both crystallographically nonequivalent), resulting in a 173.3-angstrom cubic unit cell enclosing 816 uranium nodes and 816 organic linkers—the largest unit cell found to date for any nonbiological material. The cuboctahedra organize into pentagonal and hexagonal prismatic secondary structures, which then form tetrahedral and diamond quaternary topologies with unprecedented complexity. This packing results in the formation of colossal icosidodecahedral and rectified hexakaidecahedral cavities with internal diameters of 5.0 nanometers and 6.2 nanometers, respectively—ultimately giving rise to the lowest-density MOF reported to date.

  14. Bottom-up and Top-down Input Augment the Variability of Cortical Neurons

    Science.gov (United States)

    Nassi, Jonathan J.; Kreiman, Gabriel; Born, Richard T.

    2016-01-01

    Neurons in the cerebral cortex respond inconsistently to a repeated sensory stimulus, yet they underlie our stable sensory experiences. Although the nature of this variability is unknown, its ubiquity has encouraged the general view that each cell produces random spike patterns that noisily represent its response rate. In contrast, here we show that reversibly inactivating distant sources of either bottom-up or top-down input to cortical visual areas in the alert primate reduces both the spike train irregularity and the trial-to-trial variability of single neurons. A simple model in which a fraction of the pre-synaptic input is silenced can reproduce this reduction in variability, provided that there exist temporal correlations primarily within, but not between, excitatory and inhibitory input pools. A large component of the variability of cortical neurons may therefore arise from synchronous input produced by signals arriving from multiple sources. PMID:27427459
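
    The silencing argument in this abstract can be illustrated with a toy calculation. This is only a sketch loosely inspired by the model the abstract describes: the pool size, correlation level, Gaussian input statistics, and all parameter names below are illustrative assumptions, not the authors' model.

    ```python
    import numpy as np

    def trial_variability(n_inputs=200, silenced_frac=0.0, rho=0.3, trials=2000, seed=0):
        """Trial-to-trial variance of the summed input to a model neuron.

        Inputs within one pool share a common fluctuation (pairwise correlation
        ~rho) plus private noise. Silencing a fraction of the pool removes part
        of the synchronous component, which dominates the variance of the sum,
        so variability drops more than proportionally.
        """
        rng = np.random.default_rng(seed)
        n_active = int(n_inputs * (1.0 - silenced_frac))
        shared = rng.normal(size=(trials, 1))           # correlated component
        private = rng.normal(size=(trials, n_active))   # independent component
        inputs = np.sqrt(rho) * shared + np.sqrt(1 - rho) * private
        summed = inputs.sum(axis=1)
        return summed.var()

    v_full = trial_variability(silenced_frac=0.0)
    v_cut = trial_variability(silenced_frac=0.5)
    print(v_full > v_cut)  # silencing correlated input reduces variability
    ```

    Because the shared component contributes variance proportional to the square of the number of active inputs, while the private noise contributes only linearly, most of the trial-to-trial variance in this sketch comes from the synchronous input, mirroring the abstract's conclusion.
    
    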

  15. Achieving social-ecological fit through bottom-up collaborative governance: an empirical investigation

    Directory of Open Access Journals (Sweden)

    Angela M. Guerrero

    2015-12-01

    Full Text Available Significant benefits can arise from collaborative forms of governance that foster self-organization and flexibility. Likewise, governance systems that fit the extent and complexity of the system under management are considered essential to our ability to solve environmental problems. However, from an empirical perspective the fundamental question of whether self-organized (bottom-up) collaborative forms of governance are able to accomplish adequate fit is unresolved. We used new theory and methodological approaches underpinned by interdisciplinary network analysis to address this gap by investigating three governance challenges that relate to the problem of fit: shared management of ecological resources, management of interconnected ecological resources, and cross-scale management. We first identified a set of social-ecological network configurations that represent the hypothesized ways in which collaborative arrangements can contribute to addressing these challenges. Using social and ecological data from a large-scale biodiversity conservation initiative in Australia, we empirically determined how well the observed patterns of stakeholder interactions reflect these network configurations. We found that stakeholders collaborate to manage individual parcels of native vegetation, but not for the management of interconnected parcels. In addition, our data show that the collaborative arrangements enable management across different scales (local, regional, supraregional). Our study provides empirical support for the ability of collaborative forms of governance to address the problem of fit, but also suggests that in some cases the establishment of bottom-up collaborative arrangements would likely benefit from specific guidance to facilitate collaborations that better align with the ways ecological resources are interconnected across the landscape. In our case study region, this would improve the capacity of stakeholders to…

  16. Top-down instead of bottom-up estimates of uncertainty in INAA results?

    International Nuclear Information System (INIS)

    Bode, P.; De Nadai Fernandes, E.A.

    2005-01-01

    The initial publication of the ISO Guide to the Expression of Uncertainty in Measurement (GUM) and many related documents has resulted in a worldwide awareness of the importance of a realistic estimate of the value reported after the +/- sign. The evaluation of uncertainty in measurement, as introduced by the GUM, is derived from the principles applied in physical measurements. Many testing laboratories have experienced large problems in applying these principles in, e.g., (bio)chemical measurements, resulting in time-consuming evaluations and costly additional experiments. Other, more pragmatic and less costly approaches have been proposed to obtain a realistic estimate of the range in which the true value of the measurement may be found with a certain degree of probability. One of these approaches, the 'top-down' method, is based on the standard deviation of the results of intercomparison data. This approach is much easier for tests for which it is either difficult to establish a full measurement equation, or for which, e.g., matrix-matching reference materials are absent. It has been demonstrated that the GUM 'bottom-up' approach of evaluating uncertainty in measurement can easily be applied in instrumental neutron activation analysis (INAA), as all significant sources of uncertainty can be evaluated. INAA is therefore a valuable technique for testing the validity of the top-down approach. In this contribution, examples of the top-down evaluation of uncertainty in INAA, derived from participation in intercomparison rounds and proficiency testing schemes, are presented. The results are compared with the bottom-up evaluation of uncertainty, and the ease of applicability, validity and usefulness of both approaches are discussed.
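
    The contrast between the two evaluation strategies described above can be sketched in a few lines: the bottom-up (GUM) route combines individual uncertainty components in quadrature, while the top-down route takes the spread of intercomparison results. The component values and laboratory results below are hypothetical placeholders, not data from the contribution.

    ```python
    import math
    import statistics

    def bottom_up_uncertainty(components):
        """GUM-style combined standard uncertainty: components summed in quadrature."""
        return math.sqrt(sum(u ** 2 for u in components))

    def top_down_uncertainty(lab_results):
        """Top-down estimate: standard deviation of intercomparison results."""
        return statistics.stdev(lab_results)

    # Hypothetical relative components for one INAA result
    # (e.g., counting statistics, flux gradient, geometry).
    u_bottom_up = bottom_up_uncertainty([0.02, 0.015, 0.01])

    # Hypothetical proficiency-test results (mg/kg) from several laboratories.
    u_top_down = top_down_uncertainty([10.2, 9.8, 10.1, 10.4, 9.9])

    print(u_bottom_up, u_top_down)
    ```

    Comparing the two numbers for the same measurand is the essence of the validity check the abstract proposes: if the top-down spread greatly exceeds the bottom-up budget, some uncertainty source has likely been missed.
    
    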

  17. Top-down and bottom-up approaches for cost estimating new reactor designs

    International Nuclear Information System (INIS)

    Berbey, P.; Gautier, G.M.; Duflo, D.; Rouyer, J.L.

    2007-01-01

    For several years, Generation-4 designs will be 'pre-conceptual' for the less mature concepts and 'preliminary' for the more mature concepts. In this situation, appropriate data for some of the plant systems may be lacking to develop a bottom-up cost estimate. Therefore, a more global approach, the Top-Down Approach (TDA), is needed to help designers and decision makers compare design options. It utilizes more or less simple models for cost estimating the different parts of a design. The TDA cost estimating effort applies to a whole functional element, whose cost is approached through similar estimations coming from existing data, ratios and models, for a given range of variation of parameters. Modeling is used when direct analogy is not possible. There are two types of models, global and specific ones. Global models are applied to cost modules related to the Code Of Account. Exponential formulae such as Ci = Ai + (Bi × Pi^n) are used when there are cost data for comparable modules in nuclear or other industries. Specific cost models are developed for major specific components of the plant: - process equipment such as the reactor vessel, steam generators or large heat exchangers. - buildings, with formulae estimating the construction cost from a base cost per m3 of building volume. - systems, when unit costs, cost ratios and models are used, depending on the level of detail of the design. The Bottom-Up Approach (BUA), which is based on unit prices coming from similar equipment or from manufacturer consultation, is very valuable and gives better cost estimations than TDA when it can be applied, that is, at a rather late stage of the design. Both approaches are complementary when some parts of the design are detailed enough to be estimated by BUA, and when BUA results are used to check TDA results and to improve TDA models. This methodology is applied to the HTR (High Temperature Reactor) concept and to an advanced PWR design
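
    A minimal sketch of the exponential TDA formula Ci = Ai + (Bi × Pi^n) applied module by module. All module names and parameter values below are entirely hypothetical and chosen only to show the shape of the calculation.

    ```python
    def tda_cost(modules):
        """Top-down plant cost: sum of per-module estimates Ci = Ai + Bi * Pi**n.

        Each module is a tuple (A, B, P, n): fixed cost A, scaling coefficient B,
        capacity parameter P, and scaling exponent n. An exponent n < 1 means
        cost grows sub-linearly with capacity (economies of scale).
        """
        return sum(A + B * P ** n for (A, B, P, n) in modules)

    # Hypothetical cost modules: (fixed, coefficient, capacity, exponent)
    plant = [
        (5.0, 2.0, 100.0, 0.6),   # e.g., reactor vessel
        (3.0, 1.5, 100.0, 0.7),   # e.g., steam generators
        (8.0, 0.8, 100.0, 0.9),   # e.g., buildings (cost scales with volume)
    ]
    print(tda_cost(plant))  # global top-down estimate for the whole plant
    ```

    In practice the coefficients would be fitted to cost data from comparable modules in nuclear or other industries, and a BUA estimate for any sufficiently detailed module can replace its TDA term directly, which is how the two approaches complement each other.
    
    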

  18. The changing Chinese SEA indicator guidelines: Top-down or bottom-up?

    International Nuclear Information System (INIS)

    Gao, Jingjing; Christensen, Per; Kørnøv, Lone

    2014-01-01

    In the last decades, China has introduced a set of indicators to guide Strategic Environmental Assessment (SEA) practice. The most recent indicator system, proposed in 2009, is based on sector-specific guidelines and found its justification in past negative experiences with more general guidelines (from 2003), which were mostly inspired by, or copied from, international experiences. Based on interviews with practitioners, researchers and administrators, we map and analyse the change in the national guidelines. This analysis is based on a description of the indicators that makes it possible to discern different aggregation levels of indicators and then trace the changes occurring under the two sets of guidelines. The analysis also reveals the reasons and rationales behind the changes found in the guidelines. This analysis is inspired by implementation theory and a description of some of the more general trends in the development of SEA and other environmental policies in a recent Chinese context. Besides a more top-down, intentional approach specifying indicators for different sectors based on Chinese experiences from the preceding years, another significant change following the new guidelines is a more bottom-up approach, which gives more discretion to practitioners. This entails a call for practitioners to make decisions on indicators, which involves an interpretation of the ones present in sector guidance. Highlights: • Focusing on the new Chinese national SEA guidelines proposed in 2009 • Mapping and analysing the most recent change in the indicator system • Revealing the reasons and rationales behind the changes found in the new guidelines • A top-down intention specifying indicators for different sectors • A bottom-up effect in giving discretion and interpretation of using indicators

  19. Preferential effect of isoflurane on top-down vs. bottom-up pathways in sensory cortex.

    Science.gov (United States)

    Raz, Aeyal; Grady, Sean M; Krause, Bryan M; Uhlrich, Daniel J; Manning, Karen A; Banks, Matthew I

    2014-01-01

    The mechanism of loss of consciousness (LOC) under anesthesia is unknown. Because consciousness depends on activity in the cortico-thalamic network, anesthetic actions on this network are likely critical for LOC. Competing theories stress the importance of anesthetic actions on bottom-up "core" thalamo-cortical (TC) vs. top-down cortico-cortical (CC) and matrix TC connections. We tested these models using laminar recordings in rat auditory cortex in vivo and murine brain slices. We selectively activated bottom-up vs. top-down afferent pathways using sensory stimuli in vivo and electrical stimulation in brain slices, and compared effects of isoflurane on responses evoked via the two pathways. Auditory stimuli in vivo and core TC afferent stimulation in brain slices evoked short latency current sinks in middle layers, consistent with activation of core TC afferents. By contrast, visual stimuli in vivo and stimulation of CC and matrix TC afferents in brain slices evoked responses mainly in superficial and deep layers, consistent with projection patterns of top-down afferents that carry visual information to auditory cortex. Responses to auditory stimuli in vivo and core TC afferents in brain slices were significantly less affected by isoflurane compared to responses triggered by visual stimuli in vivo and CC/matrix TC afferents in slices. At a just-hypnotic dose in vivo, auditory responses were enhanced by isoflurane, whereas visual responses were dramatically reduced. At a comparable concentration in slices, isoflurane suppressed both core TC and CC/matrix TC responses, but the effect on the latter responses was far greater than on core TC responses, indicating that at least part of the differential effects observed in vivo were due to local actions of isoflurane in auditory cortex. These data support a model in which disruption of top-down connectivity contributes to anesthesia-induced LOC, and have implications for understanding the neural basis of consciousness.

  20. Diversity has stronger top-down than bottom-up effects on decomposition.

    Science.gov (United States)

    Srivastava, Diane S; Cardinale, Bradley J; Downing, Amy L; Duffy, J Emmett; Jouseau, Claire; Sankaran, Mahesh; Wright, Justin P

    2009-04-01

    The flow of energy and nutrients between trophic levels is affected by both the trophic structure of food webs and the diversity of species within trophic levels. However, the combined effects of trophic structure and diversity on trophic transfer remain largely unknown. Here we ask whether changes in consumer diversity have the same effect as changes in resource diversity on rates of resource consumption. We address this question by focusing on consumer-resource dynamics for the ecologically important process of decomposition. This study compares the top-down effect of consumer (detritivore) diversity on the consumption of dead organic matter (decomposition) with the bottom-up effect of resource (detrital) diversity, based on a compilation of 90 observations reported in 28 studies. We did not detect effects of either detrital or consumer diversity on measures of detrital standing stock, and effects on consumer standing stock were equivocal. However, our meta-analysis indicates that reductions in detritivore diversity result in significant reductions in the rate of decomposition. Detrital diversity has both positive and negative effects on decomposition, with no overall trend. This difference between top-down and bottom-up effects of diversity is robust to different effect size metrics and could not be explained by differences in experimental systems or designs between detritivore and detrital manipulations. Our finding that resource diversity has no net effect on consumption in "brown" (detritus-consumer) food webs contrasts with previous findings from "green" (plant-herbivore) food webs and suggests that effects of plant diversity on consumption may fundamentally change after plant death.

  1. The changing Chinese SEA indicator guidelines: Top-down or bottom-up?

    Energy Technology Data Exchange (ETDEWEB)

    Gao, Jingjing, E-mail: Jingjing@plan.aau.dk; Christensen, Per; Kørnøv, Lone

    2014-01-15

    In recent decades, China has introduced a set of indicators to guide the Strategic Environmental Assessment (SEA) practice. The most recent indicator system proposed in 2009 is based on sector-specific guidelines and it found its justification in past negative experiences with more general guidelines (from 2003), which were mostly inspired by, or copied from, international experiences. Based on interviews with practitioners, researchers and administrators, we map and analyse the change in the national guidelines. This analysis is based on a description of the indicators that makes it possible to discern different aggregation levels of indicators and then trace the changes occurring under two sets of guidelines. The analysis also reveals the reasons and rationales behind the changes found in the guidelines. This analysis is inspired by implementation theory and a description of some of the more general trends in the development of SEA and other environmental policies in a recent Chinese context. Besides a more top-down, intentional approach specifying indicators for different sectors based on Chinese experiences from the preceding years, another significant change, following the new guidelines, is a more bottom-up approach which gives more discretion to practitioners. This entails a call for practitioners to make decisions on indicators, which involves an interpretation of the ones present in sector guidance. Highlights: • Focusing on the new Chinese national SEA guidelines proposed in 2009 • Mapping and analysing the most recent change in the indicator system • Revealing the reasons and rationales behind the changes found in the new guidelines • A top-down intention specifying indicators for different sectors • A bottom-up effect in giving discretion and interpretation of using indicators.

  2. Platform dependencies in bottom-up hydrogen/deuterium exchange mass spectrometry.

    Science.gov (United States)

    Burns, Kyle M; Rey, Martial; Baker, Charles A H; Schriemer, David C

    2013-02-01

    Hydrogen/deuterium exchange mass spectrometry (HDX-MS) is an important method for protein structure-function analysis. The bottom-up approach uses protein digestion to localize deuteration to higher resolution, and the essential measurement involves centroid mass determinations on a very large set of peptides. In the course of evaluating systems for various projects, we established two HDX-MS platforms that consisted of an FT-MS and a high-resolution QTOF mass spectrometer, each with matched front-end fluidic systems. Digests of proteins spanning a 20-110 kDa range were deuterated to equilibrium, and figures-of-merit for a typical bottom-up HDX-MS experiment were compared for each platform. The Orbitrap Velos identified 64% more peptides than the 5600 QTOF, with a 42% overlap between the two systems, independent of protein size. Precision in deuterium measurements using the Orbitrap marginally exceeded that of the QTOF, depending on the Orbitrap resolution setting. However, the unique nature of FT-MS data generates situations where deuteration measurements can be inaccurate, because of destructive interference arising from mismatches in elemental mass defects. This is shown through the analysis of the peptides common to both platforms, where deuteration values can be as low as 35% of the expected values, depending on FT-MS resolution, peptide length and charge state. These findings are supported by simulations of Orbitrap transients, and highlight that caution should be exercised in deriving centroid mass values from FT transients that do not support baseline separation of the full isotopic composition.
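    The centroid-mass measurement this abstract turns on is simply an intensity-weighted mean of the peaks in an isotopic envelope; deuterium uptake is then the shift of that centroid between the deuterated and undeuterated spectra. A minimal sketch with invented peak lists (illustrative values only, not data from this study):

```python
def centroid_mz(peaks):
    """Intensity-weighted mean m/z of an isotopic envelope.

    peaks: iterable of (mz, intensity) pairs.
    """
    total_intensity = sum(i for _, i in peaks)
    return sum(mz * i for mz, i in peaks) / total_intensity

# Hypothetical isotopic envelopes for one singly charged peptide:
undeuterated = [(800.40, 100.0), (801.40, 55.0), (802.41, 18.0)]
deuterated = [(800.40, 20.0), (801.41, 80.0), (802.41, 70.0), (803.42, 30.0)]

# Deuterium uptake (in Da, for charge 1+) = centroid shift on deuteration.
uptake = centroid_mz(deuterated) - centroid_mz(undeuterated)
```

    The interference effect described in the abstract distorts the relative intensities within the envelope, which is exactly why a centroid computed this way from unresolved FT transients can come out low.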

  3. Conventional-Flow Liquid Chromatography-Mass Spectrometry for Exploratory Bottom-Up Proteomic Analyses.

    Science.gov (United States)

    Lenčo, Juraj; Vajrychová, Marie; Pimková, Kristýna; Prokšová, Magdaléna; Benková, Markéta; Klimentová, Jana; Tambor, Vojtěch; Soukup, Ondřej

    2018-04-17

    Due to its sensitivity and productivity, bottom-up proteomics based on liquid chromatography-mass spectrometry (LC-MS) has become the core approach in the field. The de facto standard LC-MS platform for proteomics operates at sub-μL/min flow rates, and nanospray is required for efficiently introducing peptides into a mass spectrometer. Although this is almost a "dogma", this view is being reconsidered in light of developments in highly efficient chromatographic columns, and especially with the introduction of exceptionally sensitive MS instruments. Although conventional-flow LC-MS platforms have recently penetrated targeted proteomics successfully, their possibilities in discovery-oriented proteomics have not yet been thoroughly explored. Our objective was to determine what extra costs are incurred and what optimizations and adjustments of a conventional-flow LC-MS system are needed to identify a number of proteins comparable to that identified on a nanoLC-MS system. We demonstrate that the amount of a complex tryptic digest needed for comparable proteome coverage can be roughly 5-fold greater, provided the column dimensions are properly chosen, extra-column peak dispersion is minimized, column temperature and flow rate are set to levels appropriate for peptide separation, and the composition of mobile phases is fine-tuned. Indeed, we identified 2,835 proteins from 2 μg of HeLa cell tryptic digest separated during a 60 min gradient at 68 μL/min on a 1.0 mm × 250 mm column held at 55 °C and using aqueous-acetonitrile mobile phases containing 0.1% formic acid, 0.4% acetic acid, and 3% dimethyl sulfoxide. Our results document that conventional-flow LC-MS is an attractive alternative for bottom-up exploratory proteomics.

  4. How interactions between top-down and bottom-up controls on carbon cycling affect fluxes within and from lakes

    Science.gov (United States)

    Sadro, S.; Piovia-Scott, J.; Nelson, C.; Sickman, J. O.; Knapp, R.

    2017-12-01

    While the role of inland waters in global carbon cycling has grown clearer in recent decades, the extent to which top-down and bottom-up mechanisms interact to regulate dynamics at the catchment scale is not well understood. The degree to which lakes process, export, or store terrestrial carbon is influenced by hydrological variability, variation in the magnitude of terrestrial organic matter (t-OM) entering a system, the efficiency with which such material is metabolized by bacterioplankton, the extent to which it is incorporated into secondary consumer biomass, and by the effects of food-web structure, such as the presence or absence of top predators. However, how these processes interact to mediate carbon fluxes between terrestrial, aquatic, and atmospheric reservoirs remains unclear. We develop a conceptual model that explores how interactions among these factors ultimately affect carbon dynamics using data from lakes located in the Sierra Nevada mountains of California. The Sierra are an excellent system for studies of carbon cycling because elevation-induced landscape gradients in soil development and vegetation cover provide large natural variation in terrestrial inputs to lakes, while variation in confounding factors such as lake morphometry or trophic state is comparatively small. Dissolved organic carbon concentrations increase 100-fold in lakes spanning the alpine to montane elevation gradient found in the Sierra, and fluorescence characteristics reflect an increasingly terrestrial signature with decreasing elevation. Bacterioplankton make up a large proportion of total ecosystem metabolism in these systems, and their metabolic efficiency is tightly coupled to the composition of dissolved organic matter. Stable isotope food web data (δ13C, Δ14C, and δ2H) and measurements of pCO2 from lakes indicate the magnitude of allochthony, rates of carbon cycling, and ecosystem heterotrophy all increase with the increasingly terrestrial signature of dissolved

  5. Saliency Detection via Absorbing Markov Chain With Learnt Transition Probability.

    Science.gov (United States)

    Lihe Zhang; Jianwu Ai; Bowen Jiang; Huchuan Lu; Xiukui Li

    2018-02-01

    In this paper, we propose a bottom-up saliency model based on an absorbing Markov chain (AMC). First, a sparsely connected graph is constructed to capture the local context information of each node. All image boundary nodes and other nodes are, respectively, treated as the absorbing nodes and transient nodes in the absorbing Markov chain. Then, the expected number of visits from each transient node to all other transient nodes can be used to represent the saliency value of that node. The absorbed time depends on the weights on the path and their spatial coordinates, which are completely encoded in the transition probability matrix. Considering the importance of this matrix, we adopt different hierarchies of deep features extracted from fully convolutional networks and learn a transition probability matrix, which is called the learnt transition probability matrix. Although this significantly improves performance, salient objects are still not uniformly highlighted. To solve this problem, an angular embedding technique is investigated to refine the saliency results. Based on pairwise local orderings, which are produced by the saliency maps of AMC and boundary maps, we rearrange the global orderings (saliency value) of all nodes. Extensive experiments demonstrate that the proposed algorithm outperforms the state-of-the-art methods on six publicly available benchmark data sets.
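    In absorbing-Markov-chain terms, the "absorbed time" described above comes from the fundamental matrix N = (I − Q)⁻¹, where Q is the transient-to-transient block of the transition matrix; the row sums of N give the expected number of steps before absorption. A toy NumPy sketch on a hypothetical 4-node graph (not the paper's learnt matrix):

```python
import numpy as np

# Toy graph: nodes 0-2 are transient (image regions), node 3 is an
# absorbing boundary node. P is the row-stochastic transition matrix.
P = np.array([
    [0.0, 0.5, 0.3, 0.2],
    [0.4, 0.0, 0.4, 0.2],
    [0.1, 0.1, 0.0, 0.8],
    [0.0, 0.0, 0.0, 1.0],  # absorbing node: stays put
])

Q = P[:3, :3]                      # transient-to-transient block
N = np.linalg.inv(np.eye(3) - Q)   # fundamental matrix: expected visit counts
absorbed_time = N.sum(axis=1)      # expected steps before absorption

# Nodes that take longer to reach the boundary absorbing node are more
# salient; normalise the absorbed times to a [0, 1] saliency score.
saliency = (absorbed_time - absorbed_time.min()) / np.ptp(absorbed_time)
```

    Node 2, which leaks 0.8 of its probability straight to the boundary, is absorbed fastest and thus gets the lowest saliency, matching the intuition that boundary-like regions are background.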

  6. The case for refining bottom-up methane emission inventories using top-down measurements

    Science.gov (United States)

    Kelly, Bryce F. J.; Iverach, Charlotte P.; Ginty, Elisa; Bashir, Safdar; Lowry, Dave; Fisher, Rebecca E.; France, James L.; Nisbet, Euan G.

    2017-04-01

    Bottom-up global methane emission estimates are important for guiding policy development and mitigation strategies. Such inventories enable rapid and consistent proportioning of emissions by industrial sectors and land use at various scales from city to country to global. There has been limited use of top-down measurements to guide refining emission inventories. Here we compare the EDGAR gridmap data version 4.2 with over 5000 km of daytime ground level mobile atmospheric methane surveys in eastern Australia. The landscapes and industries surveyed include: urban environments, dryland farming, intensive livestock farming (both beef and lamb), irrigation agriculture, open cut and underground coal mining, and coal seam gas production. Daytime mobile methane surveys over a 2-year period show that at the landscape scale there is a high level of repeatability for the mole fraction of methane measured in the ground level atmosphere. Such consistency in the mole fraction of methane indicates that these data can be used as a proxy for flux. A scatter plot of the EDGAR emission gridmap Log[ton substance / 0.1 degree x 0.1 degree / year] versus the median mole fraction of methane / 0.1 degree x 0.1 degree in the ground level atmosphere highlights that the extent of elevated methane emissions associated with coal mining in the Hunter coalfields, which covers an area of 56 km by 24 km, has been under-represented in the EDGAR input data. Our results also show that methane emissions from country towns (population poor information on the extent of urban gas leaks. Given the uncertainties associated with the base land use and industry data for each country, we generalise the Australian observations to the global inventory with caution. The extensive comparison of top-down measurements versus the EDGAR version 4.2 methane gridmaps highlights the need for adjustments to the base resource data and/or the emission factors applied for coal mining, especially emissions from underground

  7. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Directory of Open Access Journals (Sweden)

    Merger Eduard

    2012-08-01

    Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs) ranging between US$ 4.5 - 12.2 tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs in a range of US$ 0.06 – 0.11 tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to
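    The per-tCO2 cost elements quoted above combine into a back-of-envelope total. The sketch below aggregates the midpoints of the ranges given in the abstract (a rough illustration, not the study's project-level accounting):

```python
# Per-tCO2 cost elements (US$/tCO2), taking the midpoint of each range
# reported in the abstract.
opportunity = (10.1 + 12.5) / 2     # mean opportunity cost
implementation = (4.5 + 12.2) / 2   # implementation cost
transaction = (0.21 + 1.46) / 2     # MRV and compliance cost
institutional = (0.06 + 0.11) / 2   # institutional cost

total = opportunity + implementation + transaction + institutional

# Implementation's share of project costs excluding opportunity costs;
# the abstract reports this share as 89-95%.
non_opportunity = implementation + transaction + institutional
impl_share = implementation / non_opportunity
```

    With these midpoints the total lands near US$ 20.6/tCO2 and the implementation share near 90%, consistent with the 89-95% range reported.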

  8. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Science.gov (United States)

    2012-01-01

    Background Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+ we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs) ranging between US$ 4.5 - 12.2 tCO2 for a period of 30 years. Transaction costs for measurement, reporting, verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 - 1.46 tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs in a range of US$ 0.06 – 0.11 tCO2. Conclusions The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate approach to provide policy and decision-makers robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to provide information on the

  9. Merging Bottom-Up with Top-Down: Continuous Lamellar Networks and Block Copolymer Lithography

    Science.gov (United States)

    Campbell, Ian Patrick

    Block copolymer lithography is an emerging nanopatterning technology with capabilities that may complement and eventually replace those provided by existing optical lithography techniques. This bottom-up process relies on the parallel self-assembly of macromolecules composed of covalently linked, chemically distinct blocks to generate periodic nanostructures. Among the myriad potential morphologies, lamellar structures formed by diblock copolymers with symmetric volume fractions have attracted the most interest as a patterning tool. When confined to thin films and directed to assemble with interfaces perpendicular to the substrate, two-dimensional domains are formed between the free surface and the substrate, and selective removal of a single block creates a nanostructured polymeric template. The substrate exposed between the polymeric features can subsequently be modified through standard top-down microfabrication processes to generate novel nanostructured materials. Despite tremendous progress in our understanding of block copolymer self-assembly, continuous two-dimensional materials have not yet been fabricated via this robust technique, which may enable nanostructured material combinations that cannot be fabricated through bottom-up methods. This thesis aims to study the effects of block copolymer composition and processing on the lamellar network morphology of polystyrene-block-poly(methyl methacrylate) (PS-b-PMMA) and utilize this knowledge to fabricate continuous two-dimensional materials through top-down methods. First, block copolymer composition was varied through homopolymer blending to explore the physical phenomena surrounding lamellar network continuity. After establishing a framework for tuning the continuity, the effects of various processing parameters were explored to engineer the network connectivity via defect annihilation processes. Precisely controlling the connectivity and continuity of lamellar networks through defect engineering and

  10. The Early Anthropogenic Hypothesis: Top-Down and Bottom-up Evidence

    Science.gov (United States)

    Ruddiman, W. F.

    2014-12-01

    Two complementary lines of evidence support the early anthropogenic hypothesis. Top-down evidence comes from comparing Holocene greenhouse-gas trends with those during equivalent intervals of previous interglaciations. The increases in CO2 and CH4 during the late Holocene are anomalous compared to the decreasing trends in a stacked average of previous interglaciations, thereby supporting an anthropogenic origin. During interglacial stage 19, the closest Holocene insolation analog, CO2 fell to 245 ppm by the time equivalent to the present, in contrast to the observed pre-industrial rise to 280-285 ppm. The 245-ppm level measured in stage 19 falls at the top of the natural range predicted by the original anthropogenic hypothesis of Ruddiman (2003). Bottom-up evidence comes from a growing list of archeological and other compilations showing major early anthropogenic transformations of Earth's surface. Key examples include: efforts by Dorian Fuller and colleagues mapping the spread of irrigated rice agriculture across southern Asia and its effects on CH4 emissions prior to the industrial era; an additional effort by Fuller showing the spread of methane-emitting domesticated livestock across Asia and Africa (coincident with the spread of fertile crescent livestock across Europe); historical compilations by Jed Kaplan and colleagues documenting very high early per-capita forest clearance in Europe, thus underpinning simulations of extensive pre-industrial clearance and large CO2 emissions; and wide-ranging studies by Erle Ellis and colleagues of early anthropogenic land transformations in China and elsewhere.

  11. People-centred health systems, a bottom-up approach: where theory meets empery.

    Science.gov (United States)

    Sturmberg, Joachim P; Njoroge, Alice

    2017-04-01

    Health systems are complex and constantly adapt to changing demands. These complex-adaptive characteristics are rarely considered in the current bureaucratic top-down approaches to health system reforms aimed at constraining demand and expenditure growth. The economic focus fails to address the needs of patients, providers and communities, and ultimately results in declining effectiveness and efficiency of the health care system as well as the health of the wider community. A needs-focused complex-adaptive health system can be represented by the 'healthcare vortex' model; how to build a needs-focused complex-adaptive health system is illustrated by Eastern Deanery AIDS Relief Program approaches in the poor neighbourhoods of Nairobi, Kenya. A small group of nurses and community health workers focused on the care of terminally ill HIV/AIDS patients. This work identified additional problems: tuberculosis (TB) was underdiagnosed and undertreated, so a local TB-technician was trained to run a local lab; a courier service helped to reach all in need; collaboration with the Ministry of Health established local TB and HIV treatment programmes; and philanthropists helped to supplement treatment with nutrition support. Maternal-to-child HIV-prevention and adolescent counselling services addressed additional needs. The 'theory of the healthcare vortex' indeed matches the 'empery of the real world experiences'. Locally developed and delivered adaptive, people-centred health systems, a bottom-up community and provider initiated approach, deliver highly effective and sustainable health care despite significant resource constraints. © 2016 John Wiley & Sons, Ltd.

  12. Metabolic Network Discovery by Top-Down and Bottom-Up Approaches and Paths for Reconciliation

    Energy Technology Data Exchange (ETDEWEB)

    Çakır, Tunahan, E-mail: tcakir@gyte.edu.tr [Computational Systems Biology Group, Department of Bioengineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey); Khatibipour, Mohammad Jafar [Computational Systems Biology Group, Department of Bioengineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey); Department of Chemical Engineering, Gebze Technical University (formerly known as Gebze Institute of Technology), Gebze (Turkey)

    2014-12-03

    The primary focus in the network-centric analysis of cellular metabolism by systems biology approaches is to identify the active metabolic network for the condition of interest. Two major approaches are available for the discovery of the condition-specific metabolic networks. One approach starts from genome-scale metabolic networks, which cover all possible reactions known to occur in the related organism in a condition-independent manner, and applies methods such as the optimization-based Flux-Balance Analysis to elucidate the active network. The other approach starts from the condition-specific metabolome data, and processes the data with statistical or optimization-based methods to extract information content of the data such that the active network is inferred. These approaches, termed bottom-up and top-down, respectively, are currently employed independently. However, considering that both approaches have the same goal, they can both benefit from each other paving the way for the novel integrative analysis methods of metabolome data- and flux-analysis approaches in the post-genomic era. This study reviews the strengths of constraint-based analysis and network inference methods reported in the metabolic systems biology field; then elaborates on the potential paths to reconcile the two approaches to shed better light on how the metabolism functions.
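    The bottom-up branch named above, Flux-Balance Analysis, rests on the steady-state constraint S·v = 0 over a stoichiometric matrix S. A toy NumPy sketch of that constraint for a hypothetical three-metabolite linear pathway (not a genome-scale model):

```python
import numpy as np

# Hypothetical toy network: metabolites A, B, C and reactions
# R1: -> A,  R2: A -> B,  R3: B -> C,  R4: C ->
# Stoichiometric matrix S (rows = metabolites, columns = reactions).
S = np.array([
    [1, -1,  0,  0],   # A: produced by R1, consumed by R2
    [0,  1, -1,  0],   # B: produced by R2, consumed by R3
    [0,  0,  1, -1],   # C: produced by R3, consumed by R4
], dtype=float)

# FBA's core constraint: at steady state, S @ v = 0 for flux vector v.
v_ok = np.array([2.0, 2.0, 2.0, 2.0])   # every metabolite balanced
residual_ok = S @ v_ok                  # all zeros

v_bad = np.array([2.0, 2.0, 1.0, 1.0])  # R3 slower than R2:
residual_bad = S @ v_bad                 # B accumulates (nonzero entry)
```

    Full FBA then maximises a linear objective (e.g. biomass flux) over all v satisfying this constraint plus flux bounds, which is the optimisation step the abstract refers to.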

  13. Innovative Sol-Gel Routes for the Bottom-up Preparation of Heterogeneous Catalysts.

    Science.gov (United States)

    Debecker, Damien P

    2017-12-11

    Heterogeneous catalysts can be prepared by different methods offering various levels of control over the final properties of the solid. In this account, we exemplify bottom-up preparation routes that are based on sol-gel chemistry and make it possible to tailor some decisive properties of solid catalysts. First, an emulsion templating strategy is shown to lead to macrocellular self-standing monoliths with a macroscopic 3D structure. The latter can be used as catalysts or catalyst supports in flow chemistry, without requiring any subsequent shaping step. Second, the aerosol-assisted sol-gel process allows for the one-step and continuous production of porous mixed oxides. Tailored textural properties can be obtained together with excellent control over composition and homogeneity. Third, the application of non-hydrolytic sol-gel routes, in the absence of water, leads to mixed oxides with outstanding textural properties and with peculiar surface chemistry. In all cases, the resulting catalytic performance can be correlated with the specificities of the preparation routes presented. This is exemplified in catalytic reactions in the fields of biomass conversion, petrochemistry, enantioselective organic synthesis, and air pollution mitigation. © 2017 The Chemical Society of Japan & Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Bottom-up synthesis of nitrogen-doped graphene sheets for ultrafast lithium storage.

    Science.gov (United States)

    Tian, Lei-Lei; Wei, Xian-Yong; Zhuang, Quan-Chao; Jiang, Chen-Hui; Wu, Chao; Ma, Guang-Yao; Zhao, Xing; Zong, Zhi-Min; Sun, Shi-Gang

    2014-06-07

    A facile bottom-up strategy was developed to fabricate nitrogen-doped graphene sheets (NGSs) from glucose using a sacrificial template synthesis method. Three main types of nitrogen dopants (pyridinic, pyrrolic and graphitic nitrogens) were introduced into the graphene lattice, and an inimitable microporous structure of NGS with a high specific surface area of 504 m(2) g(-1) was obtained. Particularly, with hybrid features of lithium ion batteries and Faradic capacitors at a low rate and features of Faradic capacitors at a high rate, the NGS presents a superior lithium storage performance. During electrochemical cycling, the NGS electrode afforded an enhanced reversible capacity of 832.4 mA h g(-1) at 100 mA g(-1) and an excellent cycling stability of 750.7 mA h g(-1) after 108 discharge-charge cycles. Furthermore, an astonishing rate capability of 333 mA h g(-1) at 10,000 mA g(-1) and a high rate cycle performance of 280.6 mA h g(-1) even after 1200 cycles were also achieved, highlighting the significance of nitrogen doping on the maximum utilization of graphene-based materials for advanced lithium storage.

  15. Bottom-up synthesis of nitrogen-doped graphene sheets for ultrafast lithium storage

    Science.gov (United States)

    Tian, Lei-Lei; Wei, Xian-Yong; Zhuang, Quan-Chao; Jiang, Chen-Hui; Wu, Chao; Ma, Guang-Yao; Zhao, Xing; Zong, Zhi-Min; Sun, Shi-Gang

    2014-05-01

    A facile bottom-up strategy was developed to fabricate nitrogen-doped graphene sheets (NGSs) from glucose using a sacrificial template synthesis method. Three main types of nitrogen dopants (pyridinic, pyrrolic and graphitic nitrogens) were introduced into the graphene lattice, and an inimitable microporous structure of NGS with a high specific surface area of 504 m2 g-1 was obtained. Particularly, with hybrid features of lithium ion batteries and Faradic capacitors at a low rate and features of Faradic capacitors at a high rate, the NGS presents a superior lithium storage performance. During electrochemical cycling, the NGS electrode afforded an enhanced reversible capacity of 832.4 mA h g-1 at 100 mA g-1 and an excellent cycling stability of 750.7 mA h g-1 after 108 discharge-charge cycles. Furthermore, an astonishing rate capability of 333 mA h g-1 at 10 000 mA g-1 and a high rate cycle performance of 280.6 mA h g-1 even after 1200 cycles were also achieved, highlighting the significance of nitrogen doping on the maximum utilization of graphene-based materials for advanced lithium storage.

  16. Visionmaker NYC: A bottom-up approach to finding shared socioeconomic pathways in New York City

    Science.gov (United States)

    Sanderson, E. W.; Fisher, K.; Giampieri, M.; Barr, J.; Meixler, M.; Allred, S. B.; Bunting-Howarth, K. E.; DuBois, B.; Parris, A. S.

    2015-12-01

    Visionmaker NYC is a free, public, participatory, bottom-up web application for developing and sharing climate mitigation and adaptation strategies for New York City neighborhoods. The goal is to develop shared socioeconomic pathways by allowing a broad swath of community members - from schoolchildren to architects and developers to the general public - to input their concepts for a desired future. Visions comprise climate scenarios, lifestyle choices, and ecosystem arrangements, where ecosystems are broadly defined to include built ecosystems (e.g., apartment buildings, single-family homes), transportation infrastructure (e.g., highways, connector roads, sidewalks), and natural land-cover types (e.g., wetlands, forests, estuaries). Metrics of water flows, carbon cycling, biodiversity patterns, and population are estimated for the user's vision, for the same neighborhood today, and for that neighborhood as it existed in the pre-development state, based on the Welikia Project (welikia.org). Users can keep visions private, share them with self-defined groups of other users, or distribute them publicly. Users can also propose "challenges" - specific desired states of metrics for specific parts of the city - and others can post visions in response. Visionmaker contributes by combining scenario planning, scientific modelling, and social media to create new, wide-open possibilities for discussion, collaboration, and imagination regarding future shared socioeconomic pathways.

  17. Electrical transport of bottom-up grown single-crystal Si1-xGex nanowire

    International Nuclear Information System (INIS)

    Yang, W F; Lee, S J; Liang, G C; Whang, S J; Kwong, D L

    2008-01-01

    In this work, we fabricated Si1-xGex nanowire (NW) metal-oxide-semiconductor field-effect transistors (MOSFETs) by using bottom-up grown single-crystal Si1-xGex NWs integrated with an HfO2 gate dielectric, a TaN/Ta gate electrode and Pd Schottky source/drain electrodes, and investigated the electrical transport properties of Si1-xGex NWs. It is found that both undoped and phosphorus-doped Si1-xGex NW MOSFETs exhibit p-MOS operation, while enhanced performance, with a higher Ion of ∼100 nA and Ion/Ioff of ∼10^5, is achieved from phosphorus-doped Si1-xGex NWs, which can be attributed to the reduction of the effective Schottky barrier height (SBH). Further improvement in gate control, with a subthreshold slope of 142 mV dec^-1, was obtained by reducing the HfO2 gate dielectric thickness. A comprehensive study of the SBH between the Si1-xGex NW channel and the Pd source/drain shows that a doped Si1-xGex NW has a lower effective SBH due to a thinner depletion width at the junction, and that the gate oxide thickness has a negligible effect on the effective SBH.

  18. Sustainability and Uncertainty: Bottom-Up and Top-Down Approaches

    Directory of Open Access Journals (Sweden)

    K. Klint Jensen

    2010-04-01

    Full Text Available The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one's mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from this requirement. Another line (top-down) starts from an economic interpretation of the Brundtland Commission's suggestion that the present generation's need-satisfaction should not compromise the need-satisfaction of future generations. It then measures sustainability at the level of society and infers prescriptions from this requirement. These two approaches may conflict, and in this conflict the top-down approach has the upper hand, ethically speaking. However, the implicit goal of the top-down approach, justice between generations, needs to be refined in several dimensions. But even given a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable are decisions under uncertainty. There might be different judgments on likelihoods; but even given some set of probabilities, there might be disagreement on the right level of precaution in the face of the uncertainty.

  19. Bottom-Up, Wet Chemical Technique for the Continuous Synthesis of Inorganic Nanoparticles

    Directory of Open Access Journals (Sweden)

    Annika Betke

    2014-01-01

    Full Text Available Continuous wet chemical approaches to the production of inorganic nanoparticles are important for large-scale production of nanoparticles. Here we describe a bottom-up, wet chemical method applying a microjet reactor. This technique allows the separation of nucleation and growth in a continuous reactor environment. Zinc oxide (ZnO), magnetite (Fe3O4) and brushite (CaHPO4·2H2O) particles with a small particle size distribution can be obtained continuously by using the rapid mixing of two precursor solutions and the fast removal of the nuclei from the reaction environment. The final particles were characterized by FT-IR, TGA, DLS, XRD and SEM techniques. Systematic studies of the influence of the different process parameters, such as flow rate and process temperature, show that the particle size can be influenced. Zinc oxide was obtained with particle sizes between 44 nm and 102 nm. The obtained magnetite particles have particle sizes in the range of 46 nm to 132 nm. Brushite behaves differently; the obtained particles were shaped like small plates with edge lengths between 100 nm and 500 nm.

  20. A Bottom-Up Geospatial Data Update Mechanism for Spatial Data Infrastructure Updating

    Science.gov (United States)

    Tian, W.; Zhu, X.; Liu, Y.

    2012-08-01

    Currently, the top-down spatial data update mechanism has made great progress and is widely applied in many SDIs (spatial data infrastructures). However, this mechanism still has some issues. For example, the update schedule is tied to professional departments' projects and is usually too long for end-users; moving data from collection to publication costs professional departments considerable time and effort; and the geospatial information often lacks sufficient attribute detail. Addressing these problems has therefore become a pressing goal. Emerging Internet technology, 3S techniques, and the geographic knowledge now widespread among the public have promoted the rapid development of volunteered geographic information (VGI) in the geosciences. VGI is a current hotspot, attracting many researchers to study its data quality, credibility, accuracy, sustainability, social benefit, applications, and so on. In addition, a few scholars have examined the value of VGI for supporting SDI updating. On that basis, this paper presents a bottom-up update mechanism from VGI to SDI, which includes the processes of matching homonymous elements between VGI and SDI vector data, change detection, SDI spatial database updating, and publication of new data products to end-users. The feasibility of the proposed update cycle is then discussed in depth: it can detect changed elements in time, shorten the update period, provide more accurate geometry and attribute data for the spatial data infrastructure, and support update propagation.
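The first step of the proposed update cycle, matching homonymous elements between VGI and SDI vector data, can be sketched as a toy name-plus-distance matcher. The feature names, coordinates and 50 m threshold below are illustrative assumptions, not taken from the paper:

```python
import math

def match_homonymous(vgi, sdi, max_dist=50.0):
    """Toy matcher: pair VGI and SDI point features that share a name
    and lie within max_dist metres; everything else is flagged as a
    candidate change for the SDI update."""
    matched, changed = [], []
    for name, (x, y) in vgi.items():
        if name in sdi:
            sx, sy = sdi[name]
            dist = math.hypot(x - sx, y - sy)
            (matched if dist <= max_dist else changed).append(name)
        else:
            changed.append(name)  # new element reported by volunteers
    return matched, changed

vgi = {"Bridge": (10.0, 10.0), "School": (500.0, 500.0)}
sdi = {"Bridge": (12.0, 14.0)}
print(match_homonymous(vgi, sdi))  # (['Bridge'], ['School'])
```

A real matcher would also compare geometry type and attribute values; this sketch only illustrates the control flow of separating matched elements from change candidates.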

  1. Two Paths to Transforming Markets through Public Sector EnergyEfficiency: Bottom Up versus Top Down

    Energy Technology Data Exchange (ETDEWEB)

    Van Wie McGrory, Laura; Coleman, Philip; Fridley, David; Harris, Jeffrey; Villasenor Franco, Edgar

    2006-05-10

    The evolution of government purchasing initiatives in Mexico and China, part of the PEPS (Promoting an Energy-efficient Public Sector) program, demonstrates the need for flexibility in designing energy-efficiency strategies in the public sector. Several years of pursuing a top-down (federally led) strategy in Mexico produced few results, and it was not until the program was restructured in 2004 to focus on municipal-level purchasing that the program gained momentum. Today, a new partnership with the Mexican federal government is leading to an intergovernmental initiative with strong support at the federal level. By contrast, the PEPS purchasing initiative in China was successfully initiated and led at the central government level with strategic support from international experts. The very different success trajectories in these two countries provide valuable lessons for designing country-specific public sector energy-efficiency initiatives. Enabling conditions for any successful public sector purchasing initiative include the existence of mandatory energy-efficiency performance standards, an effective energy-efficiency endorsement labeling program, an immediate need for energy conservation, a simple pilot phase (focusing on a limited number of strategically chosen products), and specialized technical assistance. Top-down purchasing programs are likely to be more successful where there is high-level political endorsement and a national procurement law in place, supported by a network of trained purchasers. Bottom-up (municipally led) purchasing programs require that municipalities have the authority to set their own purchasing policies, and also benefit from existing networks of cities, supported by motivated municipal leaders and trained purchasing officials.

  2. Bottom-up fabrication of zwitterionic polymer brushes on intraocular lens for improved biocompatibility

    Directory of Open Access Journals (Sweden)

    Han Y

    2016-12-01

    Full Text Available Yuemei Han,1,* Xu Xu,1,* Junmei Tang,1,* Chenghui Shen,2 Quankui Lin,1,2 Hao Chen1,2 1School of Ophthalmology & Optometry, Eye Hospital, Wenzhou Medical University, 2Wenzhou Institute of Biomaterials and Engineering, Chinese Academy of Sciences, Wenzhou, People’s Republic of China *These authors contributed equally to this work Abstract: The intraocular lens (IOL) is an efficient implantable device commonly used for treating cataracts. However, bioadhesion of bacteria or residual lens epithelial cells on the IOL surface after surgery causes postoperative complications, such as endophthalmitis or posterior capsular opacification, and leads to loss of sight again. In the present study, zwitterionic polymer brushes were fabricated on the IOL surface via a bottom-up grafting procedure. Attenuated total reflection-Fourier transform infrared and contact angle measurements indicated successful surface modification, as well as excellent hydrophilicity. The hydrophilic zwitterionic polymer coating effectively decreased the bioadhesion of lens epithelial cells and bacteria. In vivo intraocular implantation results showed the good in vivo biocompatibility of the zwitterionic IOL and its effectiveness against postoperative complications. Keywords: RAFT, surface modification, endophthalmitis, PCO, in vivo

  3. Enhanced Photon Extraction from a Nanowire Quantum Dot Using a Bottom-Up Photonic Shell

    Science.gov (United States)

    Jeannin, Mathieu; Cremel, Thibault; Häyrynen, Teppo; Gregersen, Niels; Bellet-Amalric, Edith; Nogues, Gilles; Kheng, Kuntheak

    2017-11-01

    Semiconductor nanowires offer the possibility to grow high-quality quantum-dot heterostructures, and, in particular, CdSe quantum dots inserted in ZnSe nanowires have demonstrated the ability to emit single photons up to room temperature. In this paper, we demonstrate a bottom-up approach to fabricating a photonic fiberlike structure around such nanowire quantum dots by depositing an oxide shell using atomic-layer deposition. Simulations suggest that the intensity collected in our NA = 0.6 microscope objective can be increased by a factor of 7 with respect to the bare-nanowire case. Combining microphotoluminescence, decay-time measurements, and numerical simulations, we obtain a fourfold increase in the collected photoluminescence from the quantum dot. We show that this improvement is due to an increase of the quantum-dot emission rate and a redirection of the emitted light. Our ex situ fabrication technique allows precise and reproducible fabrication on a large scale. Its improved extraction efficiency is compared to state-of-the-art top-down devices.

  4. Rational design of modular circuits for gene transcription: A test of the bottom-up approach

    Directory of Open Access Journals (Sweden)

    Giordano Emanuele

    2010-11-01

    Full Text Available Abstract Background Most synthetic circuits developed so far have been designed by an ad hoc approach, using a small number of components (i.e., LacI, TetR) and a trial-and-error strategy. We are at the point where an increasing number of modular, interchangeable and well-characterized components is needed to expand the construction of synthetic devices and to allow a rational approach to the design. Results We used interchangeable modular biological parts to create a set of novel synthetic devices for controlling gene transcription, and we developed a mathematical model of the modular circuits. Model parameters were identified by experimental measurements from a subset of modular combinations. The model revealed an unexpected feature of the lactose repressor system, i.e., a residual binding affinity for the operator site by induced lactose repressor molecules. Once this residual affinity was taken into account, the model properly reproduced the experimental data from the training set. The parameters identified in the training set allowed the prediction of the behavior of networks not included in the identification procedure. Conclusions This study provides new quantitative evidence that the use of independent and well-characterized biological parts and mathematical modeling, what is called a bottom-up approach to the construction of gene networks, can allow the design of new and different devices re-using the same modular parts.
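The residual-affinity finding can be illustrated with a minimal Hill-type repression term in which induced repressor retains a fraction alpha of its operator affinity. The function, parameter names and values below are hypothetical illustrations, not the paper's identified model:

```python
def transcription_rate(v_max, r_free, r_induced, k_d, alpha, n=2):
    """Steady-state transcription from a repressed promoter: induced
    repressor molecules still bind the operator with residual affinity
    alpha (0 <= alpha <= 1), so they contribute alpha * r_induced to
    the effective repressor pool competing for the operator."""
    occupancy = (r_free + alpha * r_induced) / k_d
    return v_max / (1.0 + occupancy ** n)

# Fully induced culture: no free repressor, 100 units of induced repressor.
print(transcription_rate(1.0, 0.0, 100.0, 10.0, alpha=0.0))  # 1.0 -- no residual binding
print(transcription_rate(1.0, 0.0, 100.0, 10.0, alpha=0.1))  # 0.5 -- residual repression
```

Even a small residual affinity halves the output here, which is why neglecting it would make the model overestimate induced expression.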

  5. Metabolic Network Discovery by Top-Down and Bottom-Up Approaches and Paths for Reconciliation

    International Nuclear Information System (INIS)

    Çakır, Tunahan; Khatibipour, Mohammad Jafar

    2014-01-01

    The primary focus in the network-centric analysis of cellular metabolism by systems biology approaches is to identify the active metabolic network for the condition of interest. Two major approaches are available for the discovery of condition-specific metabolic networks. One approach starts from genome-scale metabolic networks, which cover all possible reactions known to occur in the related organism in a condition-independent manner, and applies methods such as optimization-based Flux Balance Analysis to elucidate the active network. The other approach starts from condition-specific metabolome data and processes the data with statistical or optimization-based methods to extract the information content of the data such that the active network is inferred. These approaches, termed bottom-up and top-down, respectively, are currently employed independently. However, considering that both approaches have the same goal, each can benefit from the other, paving the way for novel methods that integrate metabolome-data analysis and flux analysis in the post-genomic era. This study reviews the strengths of the constraint-based analysis and network inference methods reported in the metabolic systems biology field, and then elaborates on potential paths to reconcile the two approaches to shed better light on how metabolism functions.

  6. Succumbing to Bottom-Up Biases on Task Choice Predicts Increased Switch Costs in the Voluntary Task Switching Paradigm

    Science.gov (United States)

    Orr, Joseph M.; Weissman, Daniel H.

    2010-01-01

    Bottom-up biases are widely thought to influence task choice in the voluntary task switching paradigm. Definitive support for this hypothesis is lacking, however, because task choice and task performance are usually confounded. We therefore revisited this hypothesis using a paradigm in which task choice and task performance are temporally separated. As predicted, participants tended to choose the task that was primed by bottom-up biases. Moreover, such choices were linked to increased switch costs during subsequent task performance. These findings provide compelling evidence that bottom-up biases influence voluntary task choice. They also suggest that succumbing to such biases reflects a reduction of top-down control that persists to influence upcoming task performance. PMID:21713192

  7. Perceptual salience affects the contents of working memory during free-recollection of objects from natural scenes

    Directory of Open Access Journals (Sweden)

    Tiziana ePedale

    2015-02-01

    Full Text Available One of the most important issues in the study of cognition is to understand which factors determine the internal representation of the external world. Previous literature has begun to highlight the impact of low-level sensory features (indexed by saliency maps) in driving attention selection, hence increasing the probability that objects presented in complex, natural scenes will be successfully encoded into working memory (WM) and then correctly remembered. Here we asked whether the probability of retrieving high-saliency objects modulates the overall contents of WM by decreasing the probability of retrieving other, lower-saliency objects. We presented pictures of natural scenes for 4 s. After a retention period of 8 s, we asked participants to verbally report as many objects/details of the previous scenes as possible. We then computed how many times the objects located at the peaks of maximal and minimal saliency in the scene (as indexed by a saliency map; Itti et al., 1998) were recollected by participants. Results showed that maximal-saliency objects were recollected more often, and earlier in the stream of successfully reported items, than minimal-saliency objects. This indicates that bottom-up sensory salience increases recollection probability and facilitates access to memory representations at retrieval. Moreover, recollection of the maximal- (but not the minimal-) saliency objects predicted the overall number of successfully recollected objects: the higher the probability of having successfully reported the most salient object in the scene, the lower the number of recollected objects. These findings highlight that bottom-up sensory saliency modulates the current contents of WM during recollection of objects from natural scenes, most likely by reducing the resources available to encode and then retrieve other (lower-saliency) objects.
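Locating the maximal- and minimal-saliency objects, given a saliency map in the style of Itti et al. (1998), amounts to taking the argmax and argmin of the map. A minimal sketch, with an illustrative 3x3 toy map:

```python
import numpy as np

def saliency_peaks(saliency_map):
    """Return (row, col) indices of the maximal- and minimal-saliency
    locations in a 2-D saliency map."""
    s = np.asarray(saliency_map, dtype=float)
    max_loc = tuple(int(i) for i in np.unravel_index(np.argmax(s), s.shape))
    min_loc = tuple(int(i) for i in np.unravel_index(np.argmin(s), s.shape))
    return max_loc, min_loc

# Toy 3x3 map: centre most salient, bottom-left corner least salient.
demo = [[0.1, 0.2, 0.1],
        [0.2, 0.9, 0.2],
        [0.0, 0.2, 0.1]]
max_loc, min_loc = saliency_peaks(demo)
print(max_loc, min_loc)  # (1, 1) (2, 0)
```

In the study, the objects sitting at these two locations are the ones whose recollection rates are compared.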

  8. Addressing the Misuse Potential of Life Science Research-Perspectives From a Bottom-Up Initiative in Switzerland.

    Science.gov (United States)

    Oeschger, Franziska M; Jenal, Ursula

    2018-01-01

    Codes of conduct have received wide attention as a bottom-up approach to foster responsibility for dual use aspects of life science research within the scientific community. In Switzerland, a series of discussion sessions led by the Swiss Academy of Sciences with over 40 representatives of most Swiss academic life science research institutions has revealed that while a formal code of conduct was considered too restrictive, a bottom-up approach toward awareness raising and education and demonstrating scientists' responsibility toward society was highly welcomed. Consequently, an informational brochure on "Misuse potential and biosecurity in life sciences research" was developed to provide material for further discussions and education.

  9. C-STrap Sample Preparation Method--In-Situ Cysteinyl Peptide Capture for Bottom-Up Proteomics Analysis in the STrap Format.

    Science.gov (United States)

    Zougman, Alexandre; Banks, Rosamonde E

    2015-01-01

    Recently we introduced the concept of Suspension Trapping (STrap) for bottom-up proteomics sample processing, which is based upon SDS-mediated protein extraction, swift detergent removal and rapid reactor-type protein digestion in a quartz depth filter trap. As the depth filter surface is made of silica, it is readily modifiable with various functional groups using silane coupling chemistries. Thus, during the digest, peptides possessing specific features can be targeted for enrichment by the functionalized depth filter material, while non-targeted peptides can be collected as a distinct unbound fraction after the digest. In the example presented here, the quartz depth filter surface is functionalized with the pyridyldithiol group, thereby enabling reversible in-situ capture of the cysteine-containing peptides generated during the STrap-based digest. The described C-STrap method retains all advantages of the original STrap methodology and provides a robust foundation for targeted in-situ peptide fractionation in the STrap format for bottom-up proteomics. The presented data support the method's use in qualitative and semi-quantitative proteomics experiments.

  10. C-STrap Sample Preparation Method--In-Situ Cysteinyl Peptide Capture for Bottom-Up Proteomics Analysis in the STrap Format.

    Directory of Open Access Journals (Sweden)

    Alexandre Zougman

    Full Text Available Recently we introduced the concept of Suspension Trapping (STrap) for bottom-up proteomics sample processing, which is based upon SDS-mediated protein extraction, swift detergent removal and rapid reactor-type protein digestion in a quartz depth filter trap. As the depth filter surface is made of silica, it is readily modifiable with various functional groups using silane coupling chemistries. Thus, during the digest, peptides possessing specific features can be targeted for enrichment by the functionalized depth filter material, while non-targeted peptides can be collected as a distinct unbound fraction after the digest. In the example presented here, the quartz depth filter surface is functionalized with the pyridyldithiol group, thereby enabling reversible in-situ capture of the cysteine-containing peptides generated during the STrap-based digest. The described C-STrap method retains all advantages of the original STrap methodology and provides a robust foundation for targeted in-situ peptide fractionation in the STrap format for bottom-up proteomics. The presented data support the method's use in qualitative and semi-quantitative proteomics experiments.

  11. Importance of bottom-up approach in water management - sustainable development of catchment areas in Croatia

    Science.gov (United States)

    Pavic, M.; Cosic-Flajsig, G.; Petricec, M.; Blazevic, Z.

    2012-04-01

    The Association for the Preservation of Croatian Waters and the Sea (SLAP) is a non-governmental organization (NGO) that gathers more than 150 scientists, hydrologists and civil engineers. SLAP was established in 2006 and has since organized many conferences and participated in projects dealing with water management. We started our work by developing plans to secure the water supply of 22 (21) villages in the rural parts of the Dubrovnik (Pozega) area, and through the years we have accumulated knowledge and experience in dealing with stakeholders in hydrology and water management. Within this paper we present the importance of a bottom-up approach to stakeholders in water management in Croatia through two case studies: (1) management of the River Trebizat catchment area - irrigation of the Imotsko-Bekijsko rural parts; (2) development of multipurpose water reservoirs in the River Orljava catchment area. Both projects were designed in the mid and late 1980s but were put on hold and forgotten because of the war. The River Trebizat meanders between Croatia and Bosnia and Herzegovina and requires joint management by both countries. In 2010 and 2011 SLAP organized conferences in both countries, gathering all the relevant stakeholders, from representatives of local and state governments, water management companies and development agencies to scientists and interested NGOs. The conferences gave a firm scientific background to the topic, including presentations of all previous studies, measurements and model results, delivered in a manner appropriate for the stakeholders. The main result of the conferences was a contribution to the development of a joint cross-border project submitted to the EU Pre-Accession funds in December 2011, with the aim of strengthening the capacities of both countries and preparing a larger project dealing with management of the whole Trebizat catchment area for EU structural funds once Croatia enters the EU in 2013. A similar approach was taken for the Orljava catchment in the northern

  12. Top-down or bottom-up: Contrasting perspectives on psychiatric diagnoses

    Directory of Open Access Journals (Sweden)

    Willem MA Verhoeven

    2008-09-01

    Full Text Available Willem MA Verhoeven1,2, Siegfried Tuinier1, Ineke van der Burgt31Vincent van Gogh Institute for Psychiatry, Venray, The Netherlands; 2Department of Psychiatry, Erasmus University Medical Centre, Rotterdam, The Netherlands; 3Department of Human Genetics, Radboud University Medical Centre, Nijmegen, The NetherlandsAbstract: Clinical psychiatry is confronted with the expanding knowledge of medical genetics. Most of the research into the genetic underpinnings of major mental disorders as described in the categorical taxonomies, however, did reveal linkage with a variety of chromosomes. This heterogeneity of results is most probably due to the assumption that the nosological categories as used in these studies are disease entities with clear boundaries. If the reverse way of looking, the so-called bottom-up approach, is applied, it becomes clear that genetic abnormalities are in most cases not associated with a single psychiatric disorder but with a certain probability to develop a variety of aspecific psychiatric symptoms. The adequacy of the categorical taxonomy, the so-called top-down approach, seems to be inversely related to the amount of empirical etiological data. This is illustrated by four rather prevalent genetic syndromes, fragile X syndrome, Prader-Willi syndrome, 22q11 deletion syndrome, and Noonan syndrome, as well as by some cases with rare chromosomal abnormalities. From these examples, it becomes clear that psychotic symptoms as well as mood, anxiety, and autistic features can be found in a great variety of different genetic syndromes. A psychiatric phenotype exists, but comprises, apart from the chance to present several psychiatric symptoms, all elements from developmental, neurocognitive, and physical characteristics.Keywords: genetic disorders, psychiatric symptoms, phenotype, mental disorders

  13. A bottom-up model of spatial attention predicts human error patterns in rapid scene recognition.

    Science.gov (United States)

    Einhäuser, Wolfgang; Mundhenk, T Nathan; Baldi, Pierre; Koch, Christof; Itti, Laurent

    2007-07-20

    Humans demonstrate a peculiar ability to detect complex targets in rapidly presented natural scenes. Recent studies suggest that (nearly) no focal attention is required for overall performance in such tasks. Little is known, however, of how detection performance varies from trial to trial and which stages in the processing hierarchy limit performance: bottom-up visual processing (attentional selection and/or recognition) or top-down factors (e.g., decision-making, memory, or alertness fluctuations)? To investigate the relative contribution of these factors, eight human observers performed an animal detection task in natural scenes presented at 20 Hz. Trial-by-trial performance was highly consistent across observers, far exceeding the prediction of independent errors. This consistency demonstrates that performance is not primarily limited by idiosyncratic factors but by visual processing. Two statistical stimulus properties, contrast variation in the target image and the information-theoretical measure of "surprise" in adjacent images, predict performance on a trial-by-trial basis. These measures are tightly related to spatial attention, demonstrating that spatial attention and rapid target detection share common mechanisms. To isolate the causal contribution of the surprise measure, eight additional observers performed the animal detection task in sequences that were reordered versions of those all subjects had correctly recognized in the first experiment. Reordering increased surprise before and/or after the target while keeping the target and distractors themselves unchanged. Surprise enhancement impaired target detection in all observers. Consequently, and contrary to several previously published findings, our results demonstrate that attentional limitations, rather than target recognition alone, affect the detection of targets in rapidly presented visual sequences.
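The "surprise" measure here is Bayesian surprise in the sense of Itti and Baldi: the KL divergence between the observer's posterior belief after an image and the prior belief before it. A minimal sketch over a discrete hypothesis space; the four-hypothesis prior and the likelihood values are illustrative assumptions, not the study's actual visual model:

```python
import numpy as np

def kl_divergence(p, q):
    """KL(p || q) in bits for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

def surprise(prior, likelihood):
    """Bayesian surprise of an observation: KL divergence between the
    posterior (prior updated by the likelihood of the new frame) and
    the prior itself."""
    prior = np.asarray(prior, float)
    posterior = prior * np.asarray(likelihood, float)
    posterior /= posterior.sum()
    return kl_divergence(posterior, prior)

uniform = [0.25, 0.25, 0.25, 0.25]
print(surprise(uniform, [1, 1, 1, 1]))      # 0.0 -- expected frame, no belief change
print(surprise(uniform, [9, 1, 1, 1]) > 0)  # True -- unexpected frame shifts beliefs
```

A frame that leaves beliefs unchanged carries zero surprise; the reordering manipulation in the experiment raises surprise around the target by making adjacent frames shift the model's beliefs more strongly.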

  14. Representing Farmer Irrigation Decisions in Northern India: Model Development from the Bottom Up.

    Science.gov (United States)

    O'Keeffe, J.; Buytaert, W.; Brozovic, N.; Mijic, A.

    2017-12-01

    The plains of northern India are among the most intensely populated and irrigated regions of the world. Sustaining water demand has been made possible by exploiting the vast and hugely productive aquifers underlying the Indo-Gangetic basin. However, increasing demand from a growing population and highly variable socio-economic and environmental conditions mean present resources may not be sustainable, making water security one of India's biggest challenges. Unless solutions are found that take into consideration the region's evolving anthropogenic and environmental conditions, the outlook for the sustainability of India's water resources is bleak. Understanding water users' decisions and their potential outcomes is important for the development of suitable water resource management options. Computational models are commonly used to assist water-use decision making, and typically represent natural processes well. The inclusion of human decision making, however, one of the dominant drivers of change, has lagged behind. Improved representation of irrigation water-user behaviour within models provides more accurate, relevant information for irrigation management. This research conceptualizes and proceduralizes observed farmer irrigation practices, highlighting feedbacks between the environment and livelihood. It is developed using a bottom-up approach, informed through field experience and stakeholder interaction in Uttar Pradesh, northern India. Real-world insights are incorporated through collected information, creating a realistic representation of field conditions and providing a useful tool for policy analysis and water management. The modelling framework is applied to four districts. Results suggest the predicted future climate will have little direct impact on water resources, crop yields or farmer income. In addition, increased abstraction may be sustainable in some areas under carefully managed conditions. By simulating dynamic decision making, feedbacks and interactions

  15. Bottom-up modeling approach for the quantitative estimation of parameters in pathogen-host interactions.

    Science.gov (United States)

    Lehnert, Teresa; Timme, Sandra; Pollmächer, Johannes; Hünniger, Kerstin; Kurzai, Oliver; Figge, Marc Thilo

    2015-01-01

    Opportunistic fungal pathogens can cause bloodstream infection and severe sepsis upon entering the blood stream of the host. The early immune response in human blood comprises the elimination of pathogens by antimicrobial peptides and innate immune cells, such as neutrophils or monocytes. Mathematical modeling is a predictive method to examine these complex processes and to quantify the dynamics of pathogen-host interactions. Since model parameters are often not directly accessible from experiment, their estimation is required by calibrating model predictions with experimental data. Depending on the complexity of the mathematical model, parameter estimation can be associated with excessively high computational costs in terms of run time and memory. We apply a strategy for reliable parameter estimation where different modeling approaches with increasing complexity are used that build on one another. This bottom-up modeling approach is applied to an experimental human whole-blood infection assay for Candida albicans. Aiming for the quantification of the relative impact of different routes of the immune response against this human-pathogenic fungus, we start from a non-spatial state-based model (SBM), because this level of model complexity allows estimating a priori unknown transition rates between various system states by the global optimization method simulated annealing. Building on the non-spatial SBM, an agent-based model (ABM) is implemented that incorporates the migration of interacting cells in three-dimensional space. The ABM takes advantage of estimated parameters from the non-spatial SBM, leading to a decreased dimensionality of the parameter space. This space can be scanned using a local optimization approach, i.e., least-squares error estimation based on an adaptive regular grid search, to predict cell migration parameters that are not accessible in experiment. In the future, spatio-temporal simulations of whole-blood samples may enable timely
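The global-optimization step, fitting an a priori unknown transition rate to experimental data by simulated annealing, can be sketched for a single parameter. The one-parameter exponential "killing" model, the true rate k = 0.7 and all tuning constants below are illustrative assumptions, not the paper's state-based model:

```python
import math
import random

def simulated_annealing(cost, x0, step=0.1, t0=1.0, cooling=0.995,
                        iters=2000, seed=1):
    """Minimize cost(x) over a single non-negative parameter by
    simulated annealing: always accept downhill moves, accept uphill
    moves with probability exp(-delta/T) while T cools geometrically."""
    rng = random.Random(seed)
    x, c = x0, cost(x0)
    best_x, best_c = x, c
    t = t0
    for _ in range(iters):
        cand = abs(x + rng.gauss(0.0, step))  # keep the rate non-negative
        cand_c = cost(cand)
        if cand_c < c or rng.random() < math.exp((c - cand_c) / t):
            x, c = cand, cand_c
            if c < best_c:
                best_x, best_c = x, c
        t *= cooling
    return best_x

# Toy "experiment": exponential pathogen elimination at a true rate k = 0.7;
# the annealer recovers k by least squares against four noise-free points.
times = [0.5, 1.0, 2.0, 4.0]
data = [math.exp(-0.7 * t) for t in times]
sse = lambda k: sum((math.exp(-k * t) - d) ** 2 for t, d in zip(times, data))
k_hat = simulated_annealing(sse, x0=2.0)
print(round(k_hat, 3))  # should land close to the true rate 0.7
```

In the paper's bottom-up strategy, rates estimated this way on the cheap non-spatial model are then fixed in the agent-based model, shrinking the remaining parameter space to the migration parameters.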

  16. Construction of mammographic examination process ontology using bottom-up hierarchical task analysis.

    Science.gov (United States)

    Yagahara, Ayako; Yokooka, Yuki; Jiang, Guoqian; Tsuji, Shintarou; Fukuda, Akihisa; Nishimoto, Naoki; Kurowarabi, Kunio; Ogasawara, Katsuhiko

    2018-03-01

    Describing complex mammography examination processes is important for improving the quality of mammograms. It is often difficult for experienced radiologic technologists to explain the process because their techniques depend on their experience and intuition. In our previous study, we analyzed the process using a new bottom-up hierarchical task analysis and identified key components of the process. Leveraging the results of the previous study, the purpose of this study was to construct a mammographic examination process ontology to formally describe the relationships between the process and image evaluation criteria to improve the quality of mammograms. First, we identified and created root classes: task, plan, and clinical image evaluation (CIE). Second, we described an "is-a" relation referring to the result of the previous study and the structure of the CIE. Third, the procedural steps in the ontology were described using the new properties: "isPerformedBefore," "isPerformedAfter," and "isPerformedAfterIfNecessary." Finally, the relationships between tasks and CIEs were described using the "isAffectedBy" property to represent the influence of the process on image quality. In total, there were 219 classes in the ontology. By introducing new properties related to the process flow, a sophisticated mammography examination process could be visualized. In relationships between tasks and CIEs, it became clear that the tasks affecting the evaluation criteria related to positioning were greater in number than those for image quality. We developed a mammographic examination process ontology that makes knowledge explicit for a comprehensive mammography process. Our research will support education and help promote knowledge sharing about mammography examination expertise.

  17. Pharmacy-based statewide naloxone distribution: A novel "top-down, bottom-up" approach.

    Science.gov (United States)

    Morton, Kate J; Harrand, Brianna; Floyd, Carly Cloud; Schaefer, Craig; Acosta, Julie; Logan, Bridget Claire; Clark, Karen

    To highlight New Mexico's multifaceted approach to widespread pharmacy naloxone distribution and to share the interventions as a tool for improving pharmacy-based naloxone practices in other states. New Mexico had the second-highest drug overdose death rate in 2014; 53% of those deaths were related to prescription opioids. Opioid overdose death is preventable through the use of naloxone, a safe and effective medication that reverses the effects of prescription opioids and heroin. Pharmacists can play an important role in providing naloxone to individuals who use prescription opioids. Not applicable. Not applicable. A multifaceted approach was utilized in New Mexico from the top down, with legislative passage of provisions for a statewide standing order and New Mexico Department of Health support for pharmacy-based naloxone delivery. A bottom-up approach was also initiated with the development and implementation of a training program for pharmacists and pharmacy technicians. Naloxone Medicaid claims were used to illustrate statewide distribution and utilization of the pharmacist statewide standing order for naloxone. The percentage of pharmacies dispensing naloxone in each county was calculated. Trained pharmacy staff completed a program evaluation form. Questions about the quality of instruction and the ability of the trainer to meet stated objectives were rated on a Likert scale. There were 808 naloxone Medicaid claims from 100 outpatient pharmacies during the first half of 2016, a 9-fold increase over 2014. The "A Dose of Rxeality" training program evaluation indicated that participants felt the training was free from bias and met all stated objectives (4 out of 4 on the Likert scale). A multi-pronged approach coupling state and community collaboration was successful in overcoming barriers and challenges associated with pharmacy naloxone distribution and ensured its success as an effective avenue for naloxone acquisition in urban and rural communities.

  18. Bottom-up assembly of salivary gland microtissues for assessing myoepithelial cell function.

    Science.gov (United States)

    Ozdemir, Tugba; Srinivasan, Padma Pradeepa; Zakheim, Daniel R; Harrington, Daniel A; Witt, Robert L; Farach-Carson, Mary C; Jia, Xinqiao; Pradhan-Bhatt, Swati

    2017-10-01

    Myoepithelial cells are flat, stellate cells present in exocrine tissues including the salivary glands. While myoepithelial cells have been studied extensively in mammary and lacrimal gland tissues, less is known of the function of myoepithelial cells derived from human salivary glands. Several groups have isolated tumorigenic myoepithelial cells from cancer specimens; however, only one report has demonstrated isolation of the normal human salivary myoepithelial cells needed for salivary gland tissue engineering applications. Establishing a functional organoid model consisting of myoepithelial and secretory acinar cells is therefore necessary for understanding the coordinated action of these two cell types in unidirectional fluid secretion. Here, we developed a bottom-up approach for generating salivary gland microtissues using primary human salivary myoepithelial cells (hSMECs) and stem/progenitor cells (hS/PCs) isolated from normal salivary gland tissues. Phenotypic characterization of isolated hSMECs confirmed that a myoepithelial cell phenotype consistent with that from other exocrine tissues was maintained over multiple passages of culture. Additionally, hSMECs secreted basement membrane proteins, expressed adrenergic and cholinergic neurotransmitter receptors, and released intracellular calcium ([Ca2+]i) in response to parasympathetic agonists. In a collagen I contractility assay, activation of contractile machinery was observed in isolated hSMECs treated with parasympathetic agonists. Recombination of hSMECs with assembled hS/PC spheroids in a microwell system was used to create microtissues resembling secretory complexes of the salivary gland. We conclude that the engineered salivary gland microtissue complexes provide a physiologically relevant model both for mechanistic studies and as a building block for the successful engineering of the salivary gland for restoration of salivary function in patients suffering from hyposalivation.

  19. Second Language Listening Instruction: Comparing a Strategies-Based Approach with an Interactive, Strategies/Bottom-Up Skills Approach

    Science.gov (United States)

    Yeldham, Michael

    2016-01-01

    This quasi-experimental study compared a strategies approach to second language listening instruction with an interactive approach, one combining a roughly equal balance of strategies and bottom-up skills. The participants were lower-intermediate-level Taiwanese university EFL learners, who were taught for 22 hours over one and a half semesters.…

  20. Enhancing criterion-related validity through bottom-up contextualization of personality inventories: The construction of an ecological conscientiousness scale

    NARCIS (Netherlands)

    dr René Butter; Marise Born

    2011-01-01

    In this paper the concept of "ecological personality scales" is introduced. These are contextualized inventories with a high ecological validity. They are developed in a bottom-up or qualitative way and combine a relatively high trait specificity with a relatively high situational specificity. An

  1. The Interaction of Top-Down and Bottom-Up Statistics in the Resolution of Syntactic Category Ambiguity

    Science.gov (United States)

    Gibson, Edward

    2006-01-01

    This paper investigates how people resolve syntactic category ambiguities when comprehending sentences. It is proposed that people combine: (a) context-dependent syntactic expectations (top-down statistical information) and (b) context-independent lexical-category frequencies of words (bottom-up statistical information) in order to resolve…

  2. Citizenship Policy from the Bottom-Up: The Linguistic and Semiotic Landscape of a Naturalization Field Office

    Science.gov (United States)

    Loring, Ariel

    2015-01-01

    This article follows a bottom-up approach to language policy (Ramanathan, 2005; Wodak, 2006) in an analysis of citizenship in policy and practice. It compares representations of citizenship in and around a regional branch of the United States Citizenship and Immigration Services (USCIS), with a focus on citizenship swearing-in ceremonies for…

  3. Bottom-up estimation of joint moments during manual lifting using orientation sensors instead of position sensors

    NARCIS (Netherlands)

    Faber, G.S.; Kingma, I.; van Dieen, J.H.

    2010-01-01

    L5/S1, hip and knee moments during manual lifting tasks are, in a laboratory environment, frequently established by bottom-up inverse dynamics, using force plates to measure ground reaction forces (GRFs) and an optoelectronic system to measure segment positions and orientations. For field

  4. Assessing the Gap Between Top-down and Bottom-up Measured Methane Emissions in Indianapolis, IN.

    Science.gov (United States)

    Prasad, K.; Lamb, B. K.; Cambaliza, M. O. L.; Shepson, P. B.; Stirm, B. H.; Salmon, O. E.; Lavoie, T. N.; Lauvaux, T.; Ferrara, T.; Howard, T.; Edburg, S. L.; Whetstone, J. R.

    2014-12-01

    Releases of methane (CH4) from the natural gas supply chain in the United States account for approximately 30% of the total US CH4 emissions. However, large questions remain regarding the accuracy of current emission inventories for methane emissions from natural gas usage. In this paper, we describe results from top-down and bottom-up measurements of methane emissions from the large isolated city of Indianapolis. The top-down results are based on aircraft mass balance and tower-based inverse modeling methods, while the bottom-up results are based on direct component sampling at metering and regulating stations, surface enclosure measurements of surveyed pipeline leaks, and tracer/modeling methods for other urban sources. Mobile mapping of urban methane concentrations was also used to identify significant sources and to show a city-wide low-level enhancement of methane levels. The residual difference between top-down and bottom-up measured emissions is large and cannot be fully explained in terms of the uncertainties in top-down and bottom-up emission measurements and estimates. Thus, the residual appears to be attributable, at least in part, to a significant widespread diffusive source. Analyses are included to estimate the size and nature of this diffusive source.

  5. The landscape of fear: The missing link to understand top-down and bottom-up controls of prey abundance?

    Science.gov (United States)

    Identifying factors that may be responsible for affecting and possibly regulating the size of animal populations is a cornerstone of understanding population ecology. The main factors thought to influence population size are either resources (bottom-up), predation (top-down), or interspec...

  6. Using classic methods in a networked manner: seeing volunteered spatial information in a bottom-up fashion

    NARCIS (Netherlands)

    Carton, L.J.; Ache, P.M.

    2014-01-01

    Using new social media and ICT infrastructures for self-organization, more and more citizen networks and business sectors organize themselves voluntarily around sustainability themes. The paper traces and evaluates one emerging innovation in such a bottom-up, networked form of sustainable

  7. Initial Clinician Reports of the Bottom-Up Dissemination of an Evidence-Based Intervention for Early Childhood Trauma

    Science.gov (United States)

    David, Paula; Schiff, Miriam

    2018-01-01

    Background: Bottom-up dissemination (BUD) of evidence based treatments (EBT), entailing the spread of an intervention through a peer network in a decentralized manner, is an under-reported phenomenon in the professional literature. Objective: This paper presents findings from a study researching the feasibility of BUD of an evidence-based…

  8. Students' Perceptions about Online Teaching Effectiveness: A Bottom-Up Approach for Identifying Online Instructors' Roles

    Science.gov (United States)

    Gómez-Rey, Pilar; Barbera, Elena; Fernández-Navarro, Francisco

    2018-01-01

    The topic of online instructors' roles has been of interest to the educational community since the late twentieth century. In previous studies, the identification of online instructors' roles was done using a top-down (deductive) approach. This study applied a bottom-up (inductive) procedure to examine not only the roles of online instructors from…

  9. Grain size engineering of bcc refractory metals: Top-down and bottom-up-Application to tungsten

    International Nuclear Information System (INIS)

    Kecskes, L.J.; Cho, K.C.; Dowding, R.J.; Schuster, B.E.; Valiev, R.Z.; Wei, Q.

    2007-01-01

    We have used two general methodologies for the production of ultrafine-grained (UFG) and nanocrystalline (NC) tungsten (W) metal samples: top-down and bottom-up. In the first, equal channel angular extrusion (ECAE) coupled with warm rolling was used to fabricate UFG W, and high-pressure torsion (HPT) was used to fabricate NC W. We demonstrate an abrupt shift in the deformation mechanism, particularly under dynamic compressive loading, in UFG and NC W. This novel deformation mechanism, a dramatic transition from a uniform deformation mode to localized shearing, is shared by other UFG and NC body-centered cubic (bcc) metals. We have also conducted a series of bottom-up experiments to consolidate powdered UFG W precursors into solid bodies. The bottom-up approach relies on rapid, high-temperature consolidation specifically designed for UFG and NC W powders. The mechanical property results from the top-down UFG and NC W were used as minimum property benchmarks to guide and design the experimental protocols and parameters for the bottom-up procedures. Preliminary results showed rapid grain growth during the consolidation cycle, and full density was not achieved in the W samples. Further development of high-purity W nanopowders and appropriate grain-growth inhibitors (e.g., Zener pinning) will be required to successfully produce bulk-sized UFG and NC W samples

  10. Evaluating the Resilience of the Bottom-up Method used to Detect and Benchmark the Smartness of University Campuses

    NARCIS (Netherlands)

    Giovannella, Carlo; Andone, Diana; Dascalu, Mihai; Popescu, Elvira; Rehm, Matthias; Mealha, Oscar

    2017-01-01

    A new method to perform a bottom-up extraction and benchmark of the perceived multilevel smartness of complex ecosystems has been recently described and applied to territories and learning ecosystems like university campuses and schools. In this paper we study the resilience of our method

  11. Two-dimensional combinatorial screening enables the bottom-up design of a microRNA-10b inhibitor.

    Science.gov (United States)

    Velagapudi, Sai Pradeep; Disney, Matthew D

    2014-03-21

    The RNA motifs that bind guanidinylated kanamycin A (G-KanA) and guanidinylated neomycin B (G-NeoB) were identified via two-dimensional combinatorial screening (2DCS). The results of these studies enabled the "bottom-up" design of a small-molecule inhibitor of oncogenic microRNA-10b.

  12. Reconciling Top-Down and Bottom-Up Estimates of Oil and Gas Methane Emissions in the Barnett Shale

    Science.gov (United States)

    Hamburg, S.

    2015-12-01

    Top-down approaches that use aircraft, tower, or satellite-based measurements of well-mixed air to quantify regional methane emissions have typically estimated higher emissions from the natural gas supply chain when compared to bottom-up inventories. A coordinated research campaign in October 2013 used simultaneous top-down and bottom-up approaches to quantify total and fossil methane emissions in the Barnett Shale region of Texas. Research teams have published individual results including aircraft mass-balance estimates of regional emissions and a bottom-up, 25-county region spatially-resolved inventory. This work synthesizes data from the campaign to directly compare top-down and bottom-up estimates. A new analytical approach uses statistical estimators to integrate facility emission rate distributions from unbiased and targeted high emission site datasets, which more rigorously incorporates the fat-tail of skewed distributions to estimate regional emissions of well pads, compressor stations, and processing plants. The updated spatially-resolved inventory was used to estimate total and fossil methane emissions from spatial domains that match seven individual aircraft mass balance flights. Source apportionment of top-down emissions between fossil and biogenic methane was corroborated with two independent analyses of methane and ethane ratios. Reconciling top-down and bottom-up estimates of fossil methane emissions leads to more accurate assessment of natural gas supply chain emission rates and the relative contribution of high emission sites. These results increase our confidence in our understanding of the climate impacts of natural gas relative to more carbon-intensive fossil fuels and the potential effectiveness of mitigation strategies.
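    The statistical-estimator idea described above can be pictured as a stratified combination of the unbiased facility survey (the body of the emission-rate distribution) and the targeted high-emission-site sample (the fat tail). A minimal sketch with invented numbers, not campaign data:

```python
import statistics

def regional_emissions(n_sites, unbiased_sample, high_emitters, p_high):
    """Stratified estimate of regional emissions (kg CH4/h).

    n_sites: total facility count in the region.
    unbiased_sample: emission rates from randomly surveyed sites (body).
    high_emitters: rates from targeted high-emission sites (fat tail).
    p_high: assumed fraction of sites in the high-emitting stratum.
    """
    mean_body = statistics.mean(unbiased_sample)
    mean_tail = statistics.mean(high_emitters)
    per_site = (1 - p_high) * mean_body + p_high * mean_tail
    return n_sites * per_site

# Illustrative inputs: most sites emit little, a few percent emit a lot.
body = [0.5, 1.2, 0.8, 2.0, 0.3, 1.5]   # unbiased survey (kg/h)
tail = [40.0, 75.0, 120.0]              # targeted high emitters (kg/h)
total = regional_emissions(n_sites=10000, unbiased_sample=body,
                           high_emitters=tail, p_high=0.02)
```

    Even with only 2% of sites in the tail stratum, the tail contributes more than half of the estimated total, which is why rigorously incorporating the fat tail matters for reconciliation.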

  13. The Comparative Effect of Top-down Processing and Bottom-up Processing through TBLT on Extrovert and Introvert EFL

    Directory of Open Access Journals (Sweden)

    Pezhman Nourzad Haradasht

    2013-09-01

    This research examines the effect of two models of reading comprehension, namely top-down and bottom-up processing, on the reading comprehension of extrovert and introvert EFL learners. To do this, 120 learners were selected from a total of 170 intermediate learners at Iran Mehr English Language School, all first taking a Preliminary English Test (PET) for homogenization prior to the study. They also answered the Eysenck Personality Inventory (EPI), which categorized them into introvert and extrovert subgroups within each reading model. In all, there were four subgroups: 30 introverts and 30 extroverts undergoing the top-down processing treatment, and 30 introverts and 30 extroverts experiencing the bottom-up processing treatment. The aforementioned PET was administered as the post-test of the study after each group was exposed to the treatment for 18 sessions over six weeks. After the instruction finished, the mean scores of all four groups on this post-test were computed and a two-way ANOVA was run to test the four hypotheses raised in this study. The results showed that while learners generally benefitted more from the bottom-up processing setting than from the top-down one, the extrovert group was better off receiving top-down instruction. Furthermore, introverts outperformed extroverts in the bottom-up group, yet no difference was seen between the two personality subgroups in the top-down setting. A predictable pattern of benefitting from the teaching procedures could not be drawn for introverts, as in both top-down and bottom-up settings they benefitted more than extroverts.

  14. A bottom-up approach to urban metabolism: the perspective of BRIDGE

    Science.gov (United States)

    Chrysoulakis, N.; Borrego, C.; San Josè, R.; Grimmond, S. B.; Jones, M. B.; Magliulo, V.; Klostermann, J.; Santamouris, M.

    2011-12-01

    Urban metabolism considers a city as a system and usually distinguishes between energy and material flows as its components. "Metabolic" studies are usually top-down approaches that assess the inputs and outputs of food, water, energy, and pollutants from a city, or that compare the changing metabolic processes of several cities. In contrast, bottom-up approaches are based on quantitative estimates of urban metabolism components at local to regional scales. Such approaches consider the urban metabolism as the 3D exchange and transformation of energy and matter between a city and its environment. The city is considered as a system, and the physical flows between this system and its environment are quantitatively estimated. The transformation of landscapes from primarily agricultural and forest uses to urbanized landscapes can greatly modify energy and material exchanges and is, therefore, an important aspect of an urban area. Here we focus on the exchanges and transformation of energy, water, carbon and pollutants. Recent advances in bio-physical sciences have led to new methods and models to estimate local-scale energy, water, carbon and pollutant fluxes. However, there is often poor communication of new knowledge and its implications to end-users, such as planners, architects and engineers. The FP7 Project BRIDGE (SustainaBle uRban plannIng Decision support accountinG for urban mEtabolism) aims at bridging this gap and at illustrating the advantages of considering environmental issues in urban planning. BRIDGE does not perform a complete life cycle analysis or calculate whole-system urban metabolism, but rather focuses on specific metabolism components (energy, water, carbon and pollutants). Its main goal is the development of a Decision Support System (DSS) with the potential to select planning actions that better fit the goal of changing the metabolism of urban systems towards sustainability. BRIDGE evaluates how planning alternatives can modify the physical

  15. Manipulation, salience, and nudges.

    Science.gov (United States)

    Noggle, Robert

    2018-03-01

    Cass Sunstein and Richard Thaler recommend helping people make better decisions by employing 'nudges', which they define as noncoercive methods of influencing choice for the better. Not surprisingly, healthcare practitioners and public policy professionals have become interested in whether nudges might be a promising method of improving health-related behaviors without resorting to heavy-handed methods such as coercion, deception, or government regulation. Many nudges seem unobjectionable, as they merely improve the quality and quantity of information available to the decision-maker. However, other nudges influence decision-making in ways that do not involve providing more and better information. Nudges of this sort raise concerns about manipulation. This paper will focus on noninformational nudges that operate by changing the salience of various options. It will survey two approaches to understanding manipulation, one of which sees manipulation as a kind of pressure, and one of which sees it as a kind of trickery. On the pressure view, salience nudges do not appear to be manipulative. However, on the trickery view (which the author favors), salience nudges will be manipulative if they increase the salience of a fact so that it is disproportionate to that fact's true relevance and importance for the decision at hand. By contrast, salience nudges will not be manipulative if they merely highlight some fact that is true and important for the decision at hand. The paper concludes by providing examples of both manipulative and nonmanipulative salience nudges. © 2017 John Wiley & Sons Ltd.

  16. Bottom-up and top-down attentional contributions to the size congruity effect.

    Science.gov (United States)

    Sobel, Kenith V; Puri, Amrita M; Faulkenberry, Thomas J

    2016-07-01

    The size congruity effect refers to the interaction between the numerical and physical (i.e., font) sizes of digits in a numerical (or physical) magnitude selection task. Although various accounts of the size congruity effect have attributed this interaction to either an early representational stage or a late decision stage, only Risko, Maloney, and Fugelsang (Attention, Perception, & Psychophysics, 75, 1137-1147, 2013) have asserted a central role for attention. In the present study, we used a visual search paradigm to further study the role of attention in the size congruity effect. In Experiments 1 and 2, we showed that manipulating top-down attention (via the task instructions) had a significant impact on the size congruity effect. The interaction between numerical and physical size was larger for numerical size comparison (Exp. 1) than for physical size comparison (Exp. 2). In the remaining experiments, we boosted the feature salience by using a unique target color (Exp. 3) or by increasing the display density by using three-digit numerals (Exps. 4 and 5). As expected, a color singleton target abolished the size congruity effect. Searching for three-digit targets based on numerical size (Exp. 4) resulted in a large size congruity effect, but search based on physical size (Exp. 5) abolished the effect. Our results reveal a substantial role for top-down attention in the size congruity effect, which we interpreted as support for a shared-decision account.

  17. Merging bottom-up and top-down precipitation products using a stochastic error model

    Science.gov (United States)

    Maggioni, Viviana; Massari, Christian; Brocca, Luca; Ciabatta, Luca

    2017-04-01

    Accurate quantitative precipitation estimation is of great importance for water resources management, agricultural planning, and the forecasting and monitoring of natural hazards such as flash floods and landslides. In situ observations are limited around the Earth, especially in remote areas (e.g., complex terrain, dense vegetation), but currently available satellite precipitation products are able to provide global precipitation estimates with an accuracy that depends upon many factors (e.g., type of storms, temporal sampling, season, etc.). Recently, Brocca et al. (2014) have proposed an alternative approach (i.e., SM2RAIN) that allows rainfall to be estimated from space using satellite soil moisture observations. In contrast with classical satellite precipitation products, which sense cloud properties to retrieve the instantaneous precipitation, this new bottom-up approach makes use of two consecutive soil moisture measurements to obtain an estimate of the precipitation fallen within the interval between two satellite passes. As a result, the nature of the measurement is different from and complementary to that of classical precipitation products, and could provide a valid new perspective for improving current satellite rainfall estimates via appropriate integration between the products (i.e., SM2RAIN plus a classical satellite rainfall product). However, whether or not SM2RAIN is able to improve the performance of any state-of-the-art satellite rainfall product depends upon an adequate quantification and characterization of the relative errors of the products. In this study, the stochastic rainfall error model SREM2D (Hossain et al. 2006) is used to characterize the retrieval error of both SM2RAIN and a state-of-the-art satellite precipitation product (i.e., 3B42RT). The error characterization serves for an optimal integration between SM2RAIN and 3B42RT for enhancing the capability of the resulting integrated product (i.e. SM2RAIN+3B42RT) in
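    As a toy illustration of why error characterization enables merging (this is not the actual SREM2D scheme, which models spatially correlated, rain-rate-dependent errors), an inverse-variance weighted average shows how the better-characterized product dominates the merged estimate:

```python
def merge_estimates(r_sm2rain, var_sm2rain, r_3b42rt, var_3b42rt):
    """Inverse-variance weighted merge of two rainfall estimates (mm).

    Each product's error is reduced here to a single variance; in practice
    a stochastic error model would supply a full error characterization.
    """
    w1 = 1.0 / var_sm2rain
    w2 = 1.0 / var_3b42rt
    return (w1 * r_sm2rain + w2 * r_3b42rt) / (w1 + w2)

# If the soil-moisture-based estimate has half the error variance of the
# cloud-based one, it receives two-thirds of the weight.
merged = merge_estimates(10.0, 1.0, 16.0, 2.0)  # → 12.0 mm
```

    The complementary nature of the two measurements is what makes such a combination attractive: their errors arise from different physics and are therefore largely independent.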

  18. Price elasticities, policy measures and actual developments in household energy consumption - A bottom up analysis for the Netherlands

    International Nuclear Information System (INIS)

    Boonekamp, Piet G.M.

    2007-01-01

    In the Netherlands it seems likely that the large number of new policy measures in the past decade has influenced the response of households to changing prices. To investigate this issue, the energy trends in the period 1990-2000 have been simulated with a bottom-up model, applied earlier for scenario studies, and extensive data from surveys. For a number of alternative price cases, the elasticity values found are explained using the bottom-up changes in energy trends. One finding is that the specific set of saving options largely determines the price response. The price effect has also been analysed in combination with the policy measures of standards, subsidies and energy taxes. The simulation results indicate that the elasticity value could be 30-40% higher without these measures. (author)
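    The elasticity values discussed above come from comparing simulated household consumption under alternative price cases. A minimal sketch of that calculation, with made-up numbers rather than the Dutch survey data:

```python
import math

def price_elasticity(e0, e1, p0, p1):
    """Arc (log-log) price elasticity from two bottom-up simulation runs:
    baseline consumption e0 at price p0, and e1 at the altered price p1.
    """
    return (math.log(e1) - math.log(e0)) / (math.log(p1) - math.log(p0))

# A 10% price rise that cuts simulated consumption by 2% implies an
# elasticity of roughly -0.21.
eps = price_elasticity(e0=100.0, e1=98.0, p0=1.0, p1=1.1)
```

    Running the same comparison with and without standards, subsidies, and taxes switched on in the model is how one would quantify the 30-40% difference the abstract reports.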

  19. Cyclization of the N-Terminal X-Asn-Gly Motif during Sample Preparation for Bottom-Up Proteomics

    DEFF Research Database (Denmark)

    Zhang, Xumin; Højrup, Peter

    2010-01-01

    We herein report a novel -17 Da peptide modification corresponding to an N-terminal cyclization of peptides possessing the N-terminal motif X-Asn-Gly. The cyclization occurs spontaneously during sample preparation for bottom-up proteomics studies. Distinct from the two well-known N-terminal cyclizations, those of N-terminal glutamine and S-carbamoylmethylcysteine, it is dependent on pH instead of [NH4+]. The data set from our recent study on large-scale N(α)-modified peptides revealed a sequence requirement for the cyclization event similar to the well-known deamidation of Asn to iso...
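    The -17 Da shift is consistent with loss of ammonia upon cyclization. A quick monoisotopic-mass check (from standard atomic masses) confirms the magnitude, which is the value one would enter as a variable modification in a bottom-up database search:

```python
# Monoisotopic atomic masses (Da), standard reference values.
MONO = {"H": 1.0078250319, "N": 14.0030740052}

# Neutral loss of NH3 from the peptide N-terminus.
nh3_loss = -(MONO["N"] + 3 * MONO["H"])  # ≈ -17.0265 Da
```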

  20. Hong Kong protests: A quantitative and bottom-up account of resistance against Chinese social media (Sina Weibo) censorship

    Directory of Open Access Journals (Sweden)

    Jingyi Zhao

    2017-06-01

    Although Chinese online censorship has been deeply explored by many scholars from a top-down perspective, mostly concentrated on the macro level, there appear to be few, if any, existing studies that feature a bottom-up perspective and explore the micro-level aspects of online media censorship. To fill this research gap, this article uses the Occupy movement in Hong Kong as a research case to analyze social media users' resistance under conditions of heavy censorship from a bottom-up perspective. That is, the research questions seek to uncover what novel ways Weibo users employ to try to circumvent Weibo censorship. It is confirmed that microbloggers tend to use embedded pictures and user ID names, instead of text messages, to camouflage sensitive information to share with other users, and that Weibo users tend to create new accounts once their original ones have been closed or monitored.

  1. Linking electricity prices and costs in bottom-up top-down coupling under changing market environments

    OpenAIRE

    Maire, Sophie

    2016-01-01

    Electricity market liberalization is altering pricing mechanisms in wholesale electricity markets, which will affect the effectiveness of climate and energy policies. Models used to simulate such policies must be responsive to pricing rules. We show how this can be done and simulate a tightening of climate and energy policies. We use a soft-coupled framework composed of a top-down dynamic computable general equilibrium model and a bottom-up dynamic electricity supply model. The first simulate...

  2. Incentives for Collaborative Governance: Top-Down and Bottom-Up Initiatives in the Swedish Mountain Region

    Directory of Open Access Journals (Sweden)

    Katarina Eckerberg

    2015-08-01

    Governance collaborations between public and private partners are increasingly used to promote sustainable mountain development, yet information is limited on their nature and precise extent. This article analyzes collaboration on environment and natural resource management in Swedish mountain communities to critically assess the kinds of issues these efforts address, how they evolve, who leads them, and what functional patterns they exhibit, based on Margerum's (2008) typology of action, organizational, and policy collaboration. Based on official documents, interviews, and the records of 245 collaborative projects, we explore the role of the state, how perceptions of policy failure may inspire collaboration, and the opportunities that European Union funds have created. Bottom-up collaborations, most of which are relatively recent, usually have an action and sometimes an organizational function. Top-down collaborations, however, are usually organizational or policy oriented. Our findings suggest that top-down and bottom-up collaborations are complementary in situations with considerable conflict over time and where public policies have partly failed, such as for nature protection and reindeer grazing. In less contested areas, such as rural development, improving tracks and access, recreation, and fishing, there is more bottom-up, action-oriented collaboration. State support, especially in the form of funding, is central to explaining the emergence of bottom-up action collaboration. Our findings show that the state both initiates and coordinates policy networks and retains a great deal of power over the nature and functioning of collaborative governance. A practical consequence is that there is great overlap, aggravated by sectorized approaches, which creates a heavy workload for some regional partners.

  3. Tisza, Transmission and Innovation: An Innovative Bottom-up Model for Transmission and Promotion of Tisza Cultural Heritage

    OpenAIRE

    Barberis Rami, Matías Ezequiel; Berić, Dejan; Mátai, Anikó; Opriş, Lavinia-Ioana; Ricci, Giulia; Rustja, Dritan

    2015-01-01

    The project aims to promote and preserve both tangible and intangible cultural heritage in a particular region of the Danube river basin, the Tisza Region (TR). The TR cultural heritage is less well known in the rest of Europe and is at risk of being lost or forgotten if not preserved and supported. This project presents an innovative, strategic bottom-up model that allows local people to manage how their heritage is disseminated through transmission and promotion of their ...

  4. Effects of pollutants on bottom-up and top-down processes in insect-plant interactions

    International Nuclear Information System (INIS)

    Butler, Casey D.; Trumble, John T.

    2008-01-01

    Bottom-up (host plant quality) and top-down (natural enemies) forces both influence the fitness and population dynamics of herbivores. However, the impact of pollutants acting on these forces has not been examined, which prompted us to review the literature to test hypotheses regarding this area of research. A comprehensive literature search found 126 references which examined fitness components and population dynamics of 203 insect herbivores. One hundred and fifty-three of the 203 herbivores (75.4%) had fitness impacted due to bottom-up factors in polluted environments. In contrast, only 20 of the 203 (9.9%) had fitness significantly impacted due to top-down factors in polluted environments. The paucity of results for top-down factors impacting fitness does not necessarily mean that top-down factors are less important, but rather that fewer studies include natural enemies. We provide a synthesis of available data by pollution type and herbivore guild, and suggest future research to address this issue. - Pollutants can affect insect herbivores through bottom-up and, possibly, top-down processes

  5. Sponge communities on Caribbean coral reefs are structured by factors that are top-down, not bottom-up.

    Science.gov (United States)

    Pawlik, Joseph R; Loh, Tse-Lynn; McMurray, Steven E; Finelli, Christopher M

    2013-01-01

    Caribbean coral reefs have been transformed in the past few decades with the demise of reef-building corals, and sponges are now the dominant habitat-forming organisms on most reefs. Competing hypotheses propose that sponge communities are controlled primarily by predatory fishes (top-down) or by the availability of picoplankton to suspension-feeding sponges (bottom-up). We tested these hypotheses on Conch Reef, off Key Largo, Florida, by placing sponges inside and outside predator-excluding cages at sites with less and more planktonic food availability (15 m vs. 30 m depth). There was no evidence of a bottom-up effect on the growth of any of 5 sponge species, and 2 of 5 species grew more when caged at the shallow site with lower food abundance. There was, however, a strong effect of predation by fishes on sponge species that lacked chemical defenses. Sponges with chemical defenses grew slower than undefended species, demonstrating a resource trade-off between growth and the production of secondary metabolites. Surveys of the benthic community on Conch Reef similarly did not support a bottom-up effect, with higher sponge cover at the shallower depth. We conclude that the structure of sponge communities on Caribbean coral reefs is primarily top-down, and predict that removal of sponge predators by overfishing will shift communities toward faster-growing, undefended species that better compete for space with threatened reef-building corals.

  6. Sponge communities on Caribbean coral reefs are structured by factors that are top-down, not bottom-up.

    Directory of Open Access Journals (Sweden)

    Joseph R Pawlik

    Full Text Available Caribbean coral reefs have been transformed in the past few decades with the demise of reef-building corals, and sponges are now the dominant habitat-forming organisms on most reefs. Competing hypotheses propose that sponge communities are controlled primarily by predatory fishes (top-down) or by the availability of picoplankton to suspension-feeding sponges (bottom-up). We tested these hypotheses on Conch Reef, off Key Largo, Florida, by placing sponges inside and outside predator-excluding cages at sites with less and more planktonic food availability (15 m vs. 30 m depth). There was no evidence of a bottom-up effect on the growth of any of 5 sponge species, and 2 of 5 species grew more when caged at the shallow site with lower food abundance. There was, however, a strong effect of predation by fishes on sponge species that lacked chemical defenses. Sponges with chemical defenses grew slower than undefended species, demonstrating a resource trade-off between growth and the production of secondary metabolites. Surveys of the benthic community on Conch Reef similarly did not support a bottom-up effect, with higher sponge cover at the shallower depth. We conclude that the structure of sponge communities on Caribbean coral reefs is primarily top-down, and predict that removal of sponge predators by overfishing will shift communities toward faster-growing, undefended species that better compete for space with threatened reef-building corals.

  7. An integrated top-down and bottom-up proteomic approach to characterize the antigen-binding fragment of antibodies.

    Science.gov (United States)

    Dekker, Lennard; Wu, Si; Vanduijn, Martijn; Tolić, Nikolai; Stingl, Christoph; Zhao, Rui; Luider, Theo; Paša-Tolić, Ljiljana

    2014-05-01

    We have previously shown that different individuals exposed to the same antigen produce antibodies with identical mutations in their complementarity determining regions (CDR), suggesting that CDR tryptic peptides can serve as biomarkers for disease diagnosis and prognosis. Complete Fabs derived from disease specific antibodies have even higher potential; they could potentially be used for disease treatment and are required to identify the antigens toward which the antibodies are directed. However, complete Fab sequence characterization via LC-MS analysis of tryptic peptides (i.e. bottom-up) has proven to be impractical for mixtures of antibodies. To tackle this challenge, we have developed an integrated bottom-up and top-down MS approach, employing 2D chromatography coupled with Fourier transform mass spectrometry (FTMS), and applied this approach for full characterization of the variable parts of two pharmaceutical monoclonal antibodies with sensitivity comparable to the bottom-up standard. These efforts represent an essential step toward the identification of disease specific antibodies in patient samples with potentially significant clinical impact. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Top-down and bottom-up attention-to-memory: mapping functional connectivity in two distinct networks that underlie cued and uncued recognition memory.

    Science.gov (United States)

    Burianová, Hana; Ciaramelli, Elisa; Grady, Cheryl L; Moscovitch, Morris

    2012-11-15

    The objective of this study was to examine the functional connectivity of brain regions active during cued and uncued recognition memory to test the idea that distinct networks would underlie these memory processes, as predicted by the attention-to-memory (AtoM) hypothesis. The AtoM hypothesis suggests that dorsal parietal cortex (DPC) allocates effortful top-down attention to memory retrieval during cued retrieval, whereas ventral parietal cortex (VPC) mediates spontaneous bottom-up capture of attention by memory during uncued retrieval. To identify networks associated with these two processes, we conducted a functional connectivity analysis of a left DPC and a left VPC region, both identified by a previous analysis of task-related regional activations. We hypothesized that the two parietal regions would be functionally connected with distinct neural networks, reflecting their engagement in the differential mnemonic processes. We found two spatially dissociated networks that overlapped only in the precuneus. During cued trials, DPC was functionally connected with dorsal attention areas, including the superior parietal lobules, right precuneus, and premotor cortex, as well as relevant memory areas, such as the left hippocampus and the middle frontal gyri. During uncued trials, VPC was functionally connected with ventral attention areas, including the supramarginal gyrus, cuneus, and right fusiform gyrus, as well as the parahippocampal gyrus. In addition, activity in the DPC network was associated with faster response times for cued retrieval. This is the first study to show a dissociation of the functional connectivity of posterior parietal regions during episodic memory retrieval, characterized by a top-down AtoM network involving DPC and a bottom-up AtoM network involving VPC. Copyright © 2012 Elsevier Inc. All rights reserved.

  9. A fusion of top-down and bottom-up modeling techniques to constrain regional scale carbon budgets

    Science.gov (United States)

    Goeckede, M.; Turner, D. P.; Michalak, A. M.; Vickers, D.; Law, B. E.

    2009-12-01

    The effort to constrain regional scale carbon budgets benefits from assimilating as many high quality data sources as possible in order to reduce uncertainties. Two of the most common approaches used in this field, bottom-up and top-down techniques, both have their strengths and weaknesses, and partly build on very different sources of information to train, drive, and validate the models. Within the context of the ORCA2 project, we follow both bottom-up and top-down modeling strategies with the ultimate objective of reconciling their surface flux estimates. The ORCA2 top-down component builds on a coupled WRF-STILT transport module that resolves the footprint function of a CO2 concentration measurement in high temporal and spatial resolution. Datasets involved in the current setup comprise GDAS meteorology, remote sensing products, VULCAN fossil fuel inventories, boundary conditions from CarbonTracker, and high-accuracy time series of atmospheric CO2 concentrations. Surface fluxes of CO2 are normally provided through a simple diagnostic model which is optimized against atmospheric observations. For the present study, we replaced the simple model with fluxes generated by an advanced bottom-up process model, Biome-BGC, which uses state-of-the-art algorithms to resolve plant-physiological processes, and 'grow' a biosphere based on biogeochemical conditions and climate history. This approach provides a more realistic description of biomass and nutrient pools than is the case for the simple model. The process model ingests various remote sensing data sources as well as high-resolution reanalysis meteorology, and can be trained against biometric inventories and eddy-covariance data. Linking the bottom-up flux fields to the atmospheric CO2 concentrations through the transport module allows evaluating the spatial representativeness of the BGC flux fields, and in that way assimilates more of the available information than either of the individual modeling techniques alone.

  10. Assessing Top-Down and Bottom-Up Contributions to Auditory Stream Segregation and Integration With Polyphonic Music

    Directory of Open Access Journals (Sweden)

    Niels R. Disbergen

    2018-03-01

    Full Text Available Polyphonic music listening well exemplifies processes typically involved in daily auditory scene analysis situations, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes, however real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments. One of the prominent bottom-up cues contributing to multi-instrument music perception is their timbre difference. In this work, we introduce and validate a novel paradigm designed to investigate, within naturalistic musical auditory scenes, attentive modulation as well as its interaction with bottom-up processes. Two psychophysical experiments are described, employing custom-composed two-voice polyphonic music pieces within a framework implementing a behavioral performance metric to validate listener instructions requiring either integration or segregation of scene elements. In Experiment 1, the listeners' locus of attention was switched between individual instruments or the aggregate (i.e., both instruments together), via a task requiring the detection of temporal modulations (i.e., triplets) incorporated within or across instruments. Subjects responded post-stimulus whether triplets were present in the to-be-attended instrument(s). Experiment 2 introduced the bottom-up manipulation by adding a three-level morphing of instrument timbre distance to the attentional framework. The task was designed to be used within neuroimaging paradigms; Experiment 2 was additionally validated behaviorally in the functional Magnetic Resonance Imaging (fMRI) environment. Experiment 1 subjects (N = 29, non-musicians) completed the task at high levels of accuracy, showing no group differences between any experimental conditions. Nineteen

  11. Assessing Top-Down and Bottom-Up Contributions to Auditory Stream Segregation and Integration With Polyphonic Music.

    Science.gov (United States)

    Disbergen, Niels R; Valente, Giancarlo; Formisano, Elia; Zatorre, Robert J

    2018-01-01

    Polyphonic music listening well exemplifies processes typically involved in daily auditory scene analysis situations, relying on an interactive interplay between bottom-up and top-down processes. Most studies investigating scene analysis have used elementary auditory scenes, however real-world scene analysis is far more complex. In particular, music, contrary to most other natural auditory scenes, can be perceived by either integrating or, under attentive control, segregating sound streams, often carried by different instruments. One of the prominent bottom-up cues contributing to multi-instrument music perception is their timbre difference. In this work, we introduce and validate a novel paradigm designed to investigate, within naturalistic musical auditory scenes, attentive modulation as well as its interaction with bottom-up processes. Two psychophysical experiments are described, employing custom-composed two-voice polyphonic music pieces within a framework implementing a behavioral performance metric to validate listener instructions requiring either integration or segregation of scene elements. In Experiment 1, the listeners' locus of attention was switched between individual instruments or the aggregate (i.e., both instruments together), via a task requiring the detection of temporal modulations (i.e., triplets) incorporated within or across instruments. Subjects responded post-stimulus whether triplets were present in the to-be-attended instrument(s). Experiment 2 introduced the bottom-up manipulation by adding a three-level morphing of instrument timbre distance to the attentional framework. The task was designed to be used within neuroimaging paradigms; Experiment 2 was additionally validated behaviorally in the functional Magnetic Resonance Imaging (fMRI) environment. Experiment 1 subjects (N = 29, non-musicians) completed the task at high levels of accuracy, showing no group differences between any experimental conditions. Nineteen listeners also

  12. Saliency of social comparison dimensions

    NARCIS (Netherlands)

    Kuyper, H.

    2007-01-01

    The present article discusses a theory of the saliency of social comparison dimensions and presents the results of an experiment about the effects of two different experimental situations on the saliency of exterior, task-related and socio-emotional dimensions. Saliency was operationalized with a

  13. Untangling Linguistic Salience

    NARCIS (Netherlands)

    Boswijk, Vincent; Coler, Matt; Loerts, Hanneke; Hilton, Nanna

    2018-01-01

    The concept of linguistic salience is broadly used within sociolinguistics to account for processes as diverse as language change (Kerswill & Williams, 2002) and language acquisition (Ellis, 2016) in that salient forms are e.g. more likely to undergo change, or are often acquired earlier than other

  14. Binaural beat salience

    Science.gov (United States)

    Grose, John H.; Buss, Emily; Hall, Joseph W.

    2012-01-01

    Previous studies of binaural beats have noted individual variability and response lability, but little attention has been paid to the salience of the binaural beat percept. The purpose of this study was to gauge the strength of the binaural beat percept by matching its salience to that of sinusoidal amplitude modulation (SAM), and to then compare rate discrimination for the two types of fluctuation. Rate discrimination was measured for standard rates of 4, 8, 16, and 32 Hz – all in the 500-Hz carrier region. Twelve normal-hearing adults participated in this study. The results indicated that discrimination acuity for binaural beats is similar to that for SAM tones whose depths of modulation have been adjusted to provide equivalent modulation salience. The matched-salience SAM tones had relatively shallow depths of modulation, suggesting that the perceptual strength of binaural beats is relatively weak, although all listeners perceived them. The Weber fraction for detection of an increase in binaural beat rate is roughly constant across beat rates, at least for rates above 4 Hz, as is rate discrimination for SAM tones. PMID:22326292

  15. Binaural beat salience.

    Science.gov (United States)

    Grose, John H; Buss, Emily; Hall, Joseph W

    2012-03-01

    Previous studies of binaural beats have noted individual variability and response lability, but little attention has been paid to the salience of the binaural beat percept. The purpose of this study was to gauge the strength of the binaural beat percept by matching its salience to that of sinusoidal amplitude modulation (SAM), and to then compare rate discrimination for the two types of fluctuation. Rate discrimination was measured for standard rates of 4, 8, 16, and 32 Hz - all in the 500-Hz carrier region. Twelve normal-hearing adults participated in this study. The results indicated that discrimination acuity for binaural beats is similar to that for SAM tones whose depths of modulation have been adjusted to provide equivalent modulation salience. The matched-salience SAM tones had relatively shallow depths of modulation, suggesting that the perceptual strength of binaural beats is relatively weak, although all listeners perceived them. The Weber fraction for detection of an increase in binaural beat rate is roughly constant across beat rates, at least for rates above 4 Hz, as is rate discrimination for SAM tones. Copyright © 2012 Elsevier B.V. All rights reserved.
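    The constant Weber fraction reported in the two records above has a compact computational form. The sketch below is illustrative only: the threshold values are hypothetical, chosen to show what a flat Weber fraction looks like, and are not the study's data.

    ```python
    # Illustrative sketch: a Weber fraction is the just-noticeable increment
    # divided by the standard value. A constant Weber fraction across standard
    # rates means the discrimination threshold scales with the rate itself.

    def weber_fraction(standard_rate_hz: float, jnd_hz: float) -> float:
        """Just-noticeable rate increment relative to the standard rate."""
        return jnd_hz / standard_rate_hz

    # Hypothetical thresholds (NOT from the study), chosen so the fraction is flat:
    standards = [4.0, 8.0, 16.0, 32.0]   # standard beat rates in Hz
    jnds = [1.0, 2.0, 4.0, 8.0]          # hypothetical JNDs in Hz

    fractions = [weber_fraction(s, j) for s, j in zip(standards, jnds)]
    ```

    With thresholds that grow in proportion to the standard rate, every entry of `fractions` is the same value, which is what "roughly constant across beat rates" means operationally.
    
    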

  16. On the distribution of saliency.

    Science.gov (United States)

    Berengolts, Alexander; Lindenbaum, Michael

    2006-12-01

    Detecting salient structures is a basic task in perceptual organization. Saliency algorithms typically mark edge-points with some saliency measure, which grows with the length and smoothness of the curve on which these edge-points lie. Here, we propose a modified saliency estimation mechanism that is based on probabilistically specified grouping cues and on curve length distributions. In this framework, the Shashua and Ullman saliency mechanism may be interpreted as a process for detecting the curve with maximal expected length. Generalized types of saliency naturally follow. We propose several specific generalizations (e.g., gray-level-based saliency) and rigorously derive the limitations on generalized saliency types. We then carry out a probabilistic analysis of expected length saliencies. Using ergodicity and asymptotic analysis, we derive the saliency distributions associated with the main curves and with the rest of the image. We then extend this analysis to finite-length curves. Using the derived distributions, we derive the optimal threshold on the saliency for discriminating between figure and background and bound the saliency-based figure-from-ground performance.

  17. Prediction of visual saliency in video with deep CNNs

    Science.gov (United States)

    Chaabouni, Souad; Benois-Pineau, Jenny; Hadar, Ofer

    2016-09-01

    Prediction of visual saliency in images and video is a highly researched topic. Target applications include quality assessment of multimedia services in a mobile context, video compression techniques, recognition of objects in video streams, etc. In the framework of mobile and egocentric perspectives, visual saliency models cannot be founded only on bottom-up features, as suggested by feature integration theory. The central bias hypothesis is not respected either. In this case, the top-down component of human visual attention becomes prevalent. Visual saliency can be predicted on the basis of seen data. Deep Convolutional Neural Networks (CNNs) have proven to be a powerful tool for prediction of salient areas in stills. In our work we also focus on the sensitivity of the human visual system to residual motion in a video. A deep CNN architecture is designed in which we incorporate input primary maps as color values of pixels and the magnitude of local residual motion. Complementary contrast maps allow for a slight increase in accuracy compared to the use of color and residual motion only. The experiments show that the choice of input features for the deep CNN depends on the visual task: for interest in dynamic content, the 4K model with residual motion is more efficient, and for object recognition in egocentric video the pure spatial input is more appropriate.

  18. Temperature regulation of marine heterotrophic prokaryotes increases latitudinally as a breach between bottom-up and top-down controls

    KAUST Repository

    Moran, Xose Anxelu G.

    2017-04-19

    Planktonic heterotrophic prokaryotes make up the largest living biomass and process most organic matter in the ocean. Determining when and where the biomass and activity of heterotrophic prokaryotes are controlled by resource availability (bottom-up), predation and viral lysis (top-down) or temperature will help in future carbon cycling predictions. We conducted an extensive survey across subtropical and tropical waters of the Atlantic, Indian and Pacific Oceans during the Malaspina 2010 Global Circumnavigation Expedition and assessed indices for these three types of controls at 109 stations (mostly from the surface to 4000 m depth). Temperature control was approached by the apparent activation energy in eV (ranging from 0.46 to 3.41), bottom-up control by the slope of the log-log relationship between biomass and production rate (ranging from -0.12 to 1.09) and top-down control by an index that considers the relative abundances of heterotrophic nanoflagellates and viruses (ranging from 0.82 to 4.83). We conclude that temperature becomes dominant (i.e. activation energy >1.5 eV) within a narrow window of intermediate values of bottom-up (0.3-0.6) and top-down (0.8-1.2) controls. A pervasive latitudinal pattern of decreasing temperature regulation towards the Equator, regardless of the oceanic basin, suggests that the impact of global warming on marine microbes and their biogeochemical function will be more intense at higher latitudes. Our analysis predicts that 1°C ocean warming will result in increased biomass of heterotrophic prokaryoplankton only in waters with <26°C of mean annual surface temperature. This article is protected by copyright. All rights reserved.
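    Two of the three indices in the abstract above have standard computational definitions: the apparent activation energy is the slope of an Arrhenius plot (ln rate against inverse thermal energy), and the bottom-up index is the slope of log biomass versus log production. The sketch below illustrates both on synthetic data; all data and helper names are illustrative, and the top-down index (built from nanoflagellate and virus abundances) is omitted because the abstract does not give its exact formula.

    ```python
    # Sketch of two of the three control indices, on synthetic data.
    import math

    K_B = 8.617e-5  # Boltzmann constant in eV/K

    def ols_slope(xs, ys):
        """Ordinary least-squares slope of ys regressed on xs."""
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        sxx = sum((x - mx) ** 2 for x in xs)
        return sxy / sxx

    def activation_energy_ev(temps_c, rates):
        """Apparent activation energy (eV): slope of ln(rate) vs -1/(kB*T)."""
        xs = [-1.0 / (K_B * (t + 273.15)) for t in temps_c]
        return ols_slope(xs, [math.log(r) for r in rates])

    def bottom_up_index(biomass, production):
        """Slope of log10(biomass) vs log10(production); near 1 suggests strong bottom-up control."""
        return ols_slope([math.log10(p) for p in production],
                         [math.log10(b) for b in biomass])

    # Synthetic check: rates generated from an Arrhenius law with Ea = 1.0 eV
    # should yield an estimated activation energy of 1.0 eV.
    temps = [5.0, 10.0, 15.0, 20.0, 25.0]
    rates = [math.exp(-1.0 / (K_B * (t + 273.15))) for t in temps]
    ```

    The 1.5 eV dominance threshold quoted in the abstract would then be a cutoff applied to `activation_energy_ev` station by station.
    
    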

  19. Salient region detection by fusing bottom-up and top-down features extracted from a single image.

    Science.gov (United States)

    Tian, Huawei; Fang, Yuming; Zhao, Yao; Lin, Weisi; Ni, Rongrong; Zhu, Zhenfeng

    2014-10-01

    Recently, some global contrast-based salient region detection models have been proposed based on only the low-level feature of color. It is necessary to consider both color and orientation features to overcome their limitations, and thus improve the performance of salient region detection for images with low-contrast in color and high-contrast in orientation. In addition, the existing fusion methods for different feature maps, like the simple averaging method and the selective method, are not effective sufficiently. To overcome these limitations of existing salient region detection models, we propose a novel salient region model based on the bottom-up and top-down mechanisms: the color contrast and orientation contrast are adopted to calculate the bottom-up feature maps, while the top-down cue of depth-from-focus from the same single image is used to guide the generation of final salient regions, since depth-from-focus reflects the photographer's preference and knowledge of the task. A more general and effective fusion method is designed to combine the bottom-up feature maps. According to the degree-of-scattering and eccentricities of feature maps, the proposed fusion method can assign adaptive weights to different feature maps to reflect the confidence level of each feature map. The depth-from-focus of the image as a significant top-down feature for visual attention in the image is used to guide the salient regions during the fusion process; with its aid, the proposed fusion method can filter out the background and highlight salient regions for the image. Experimental results show that the proposed model outperforms the state-of-the-art models on three public available data sets.
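    The adaptive fusion idea in the abstract above, weighting each feature map by how spatially concentrated its response is, can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: `scatter` is a simple degree-of-scattering proxy (mean squared distance of map mass from its centroid), and the paper's eccentricity term and depth-from-focus guidance are omitted.

    ```python
    # Illustrative sketch of scatter-weighted fusion of saliency feature maps.
    # Maps are 2-D lists of non-negative responses on the same grid.

    def scatter(fmap):
        """Mean squared distance of map mass from its centroid (lower = more concentrated)."""
        total = sum(v for row in fmap for v in row)
        cy = sum(i * v for i, row in enumerate(fmap) for v in row) / total
        cx = sum(j * v for row in fmap for j, v in enumerate(row)) / total
        return sum(((i - cy) ** 2 + (j - cx) ** 2) * v
                   for i, row in enumerate(fmap) for j, v in enumerate(row)) / total

    def fuse(maps, eps=1e-6):
        """Weighted sum of feature maps; weights fall off with each map's scatter."""
        raw = [1.0 / (eps + scatter(m)) for m in maps]
        z = sum(raw)
        weights = [w / z for w in raw]
        h, w = len(maps[0]), len(maps[0][0])
        return [[sum(wt * m[i][j] for wt, m in zip(weights, maps))
                 for j in range(w)] for i in range(h)]
    ```

    A map with all of its mass in one location has zero scatter and dominates the fusion, while a uniform map contributes little, which captures the intuition that a tightly clustered feature response is a more confident saliency cue.
    
    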

  20. Evaluating the Resilience of the Bottom-up Method used to Detect and Benchmark the Smartness of University Campuses

    DEFF Research Database (Denmark)

    Giovannella, Carlo; Andone, Diana; Dascalu, Mihai

    2016-01-01

    A new method to perform a bottom-up extraction and benchmark of the perceived multilevel smartness of complex ecosystems has been recently described and applied to territories and learning ecosystems like university campuses and schools. In this paper we study the resilience of our method...... by comparing and integrating the data collected in several European Campuses during two different academic years, 2014-15 and 2015-16. The overall results are: a) a more adequate and robust definition of the orthogonal multidimensional space of representation of the smartness, and b) the definition...

  1. Benchmarking energy scenarios for China: perspectives from top-down, economic and bottom-up, technical modelling

    DEFF Research Database (Denmark)

    This study uses a soft-linking methodology to harmonise two complex global top-down and bottom-up models with a regional China focus. The baseline follows the GDP and demographic trends of the Shared Socio-economic Pathways (SSP2) scenario, down-scaled for China, while the carbon tax scenario fol......-specific modelling results further. These new sub-regional China features can now be used for a more detailed analysis of China's regional developments in a global context....

  2. Bottoms up design of the Elmo Bumpy Torus - proof of principle (EBT-P) fusion research facility

    International Nuclear Information System (INIS)

    Erickson, D.T.

    1981-01-01

    The McDonnell Douglas Astronautics Company, under subcontract to the Union Carbide Corporation Nuclear Division at the DOE Oak Ridge National Laboratory has committed to furnish the EBT-P research facility. Gilbert Associates, Inc. of Reading, Pennsylvania, as the Architect and Engineering subcontractor has been selected for design and construction of this facility. The bottoms up effort to provide the EBT-P facility is now alive and well, with the property purchased, dedication ceremonies conducted, the Preliminary Design effort completed and detail design currently active

  3. A bottom-up approach for optimization of friction stir processing parameters; a study on aluminium 2024-T3 alloy

    International Nuclear Information System (INIS)

    Nadammal, Naresh; Kailas, Satish V.; Suwas, Satyam

    2015-01-01

    Highlights: • An experimental bottom-up approach has been developed for optimizing the process parameters for friction stir processing. • Optimum parameter processed samples were tested and characterized in detail. • Ultimate tensile strength of 1.3 times the base metal strength was obtained. • Residual stresses on the processed surface were only 10% of the yield strength of base metal. • Microstructure observations revealed fine equi-axed grains with precipitate particles at the grain boundaries. - Abstract: Friction stir processing (FSP) is emerging as one of the most competent severe plastic deformation (SPD) method for producing bulk ultra-fine grained materials with improved properties. Optimizing the process parameters for a defect free process is one of the challenging aspects of FSP to mark its commercial use. For the commercial aluminium alloy 2024-T3 plate of 6 mm thickness, a bottom-up approach has been attempted to optimize major independent parameters of the process such as plunge depth, tool rotation speed and traverse speed. Tensile properties of the optimum friction stir processed sample were correlated with the microstructural characterization done using Scanning Electron Microscope (SEM) and Electron Back-Scattered Diffraction (EBSD). Optimum parameters from the bottom-up approach have led to a defect free FSP having a maximum strength of 93% the base material strength. Micro tensile testing of the samples taken from the center of processed zone has shown an increased strength of 1.3 times the base material. Measured maximum longitudinal residual stress on the processed surface was only 30 MPa which was attributed to the solid state nature of FSP. Microstructural observation reveals significant grain refinement with less variation in the grain size across the thickness and a large amount of grain boundary precipitation compared to the base metal. The proposed experimental bottom-up approach can be applied as an effective method for

  4. Bottom-Up Nano-heteroepitaxy of Wafer-Scale Semipolar GaN on (001) Si

    KAUST Repository

    Hus, Jui Wei

    2015-07-15

    Semipolar {101̄1} InGaN quantum wells are grown on (001) Si substrates with an Al-free buffer and wafer-scale uniformity. The novel structure is achieved by bottom-up nano-heteroepitaxy employing self-organized ZnO nanorods as the strain-relieving layer. This ZnO nanostructure circumvents the problems encountered with the conventional AlN-based buffer, which grows slowly and contaminates the growth chamber. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Associative Learning Through Acquired Salience.

    Science.gov (United States)

    Treviño, Mario

    2015-01-01

    Most associative learning studies describe the salience of stimuli as a fixed learning-rate parameter. Presumptive saliency signals, however, have also been linked to motivational and attentional processes. An interesting possibility, therefore, is that discriminative stimuli could also acquire salience as they become powerful predictors of outcomes. To explore this idea, we first characterized and extracted the learning curves from mice trained with discriminative images offering varying degrees of structural similarity. Next, we fitted a linear model of associative learning coupled to a series of mathematical representations for stimulus salience. We found that the best prediction, from the set of tested models, was one in which the visual salience depended on stimulus similarity and a non-linear function of the associative strength. Therefore, these analytic results support the idea that the net salience of a stimulus depends both on the items' effective salience and the motivational state of the subject that learns about it. Moreover, this dual salience model can explain why learning about a stimulus not only depends on the effective salience during acquisition but also on the specific learning trajectory that was used to reach this state. Our mathematical description could be instrumental for understanding aberrant salience acquisition under stressful situations and in neuropsychiatric disorders like schizophrenia, obsessive-compulsive disorder, and addiction.

  6. Venom Proteomics of Indonesian King Cobra, Ophiophagus hannah: Integrating Top-Down and Bottom-Up Approaches.

    Science.gov (United States)

    Petras, Daniel; Heiss, Paul; Süssmuth, Roderich D; Calvete, Juan J

    2015-06-05

    We report on the first application of top-down mass spectrometry in snake venomics. De novo sequence tags generated from combined collision-based dissociations (CID and HCD) recorded in a hybrid LTQ Orbitrap instrument in data-dependent mode, analyzed with support from ProSight Lite, identified a number of proteins from different toxin families in Indonesian king cobra venom, namely, 11 three-finger toxins (7-7.9 kDa), a Kunitz-type inhibitor (6.3 kDa), ohanin (11.9 kDa), a novel phospholipase A2 molecule (13.8 kDa), and the cysteine-rich secretory protein (CRISP) ophanin (25 kDa). Complementary bottom-up MS/MS analyses contributed to the completion of a locus-resolved venom phenotypic map for Ophiophagus hannah, the world's longest venomous snake and a species of medical concern across its wide distribution range in forests from India to Southeast Asia. Its venom composition, comprising 32-35 proteins/peptides from 10 protein families, is dominated by α-neurotoxins and convincingly explains the main neurotoxic effects of human envenoming caused by king cobra bite. The integration of efficient chromatographic separation of the venom's components with locus-resolved toxin identification through top-down and bottom-up MS/MS-based species-specific database searching and de novo sequencing holds great promise for the field of venom research.
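As a toy illustration of the database-searching step described above (all sequences and protein names below are invented for the example, not data from the study), short de novo sequence tags can be matched against a species-specific protein database by simple substring search:

```python
# Hypothetical mini-database of toxin sequences (illustrative only).
database = {
    "three_finger_toxin": "LKCNKLVPLFYKTCPAGKNLCYKMFM",
    "kunitz_inhibitor":   "RPDFCLEPPYTGPCKARIIRYFYNAK",
    "ohanin":             "SPPGNVVVTPDVNHVIVKEEGKLQVK",
}

def match_tags(tags, db):
    """Map each de novo sequence tag to the proteins whose sequence
    contains it; an empty list means the tag found no match."""
    return {tag: [name for name, seq in db.items() if tag in seq]
            for tag in tags}

hits = match_tags(["CPAGK", "VIVKE", "XXXXX"], database)
```

Real pipelines score tags against mass-spectral evidence rather than exact substrings, but the lookup structure is the same.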

  7. Orchestrated structure evolution: accelerating direct-write nanomanufacturing by combining top-down patterning with bottom-up growth

    Energy Technology Data Exchange (ETDEWEB)

    Kitayaporn, Sathana; Baneyx, Francois; Schwartz, Daniel T [Department of Chemical Engineering, University of Washington, Seattle, WA 98195-1750 (United States); Hoo, Ji Hao; Boehringer, Karl F, E-mail: dts@uw.edu [Department of Electrical Engineering, University of Washington, Seattle, WA 98195-1750 (United States)

    2010-05-14

    Direct-write nanomanufacturing with scanning beams and probes is flexible and can produce high quality products, but it is normally slow and expensive to raster point-by-point over a pattern. We demonstrate the use of an accelerated direct-write nanomanufacturing method called 'orchestrated structure evolution' (OSE), where a direct-write tool patterns a small number of growth 'seeds' that subsequently grow into the final thin film pattern. Through control of seed size and spacing, it is possible to vary the ratio of 'top-down' to 'bottom-up' character of the patterning process, ranging from conventional top-down raster patterning to nearly pure bottom-up space-filling via seed growth. Electron beam lithography (EBL) and copper electrodeposition were used to demonstrate trade-offs between process time and product quality over nano- to micro-length scales. OSE can reduce process times for high-cost EBL patterning by orders of magnitude, at the expense of longer (but inexpensive) copper electrodeposition processing times. We quantify the degradation of pattern quality that accompanies fast OSE patterning by measuring deviations from the desired patterned area and perimeter. We also show that the density of OSE-induced grain boundaries depends upon the seed separation and size. As the seed size is reduced, the uniformity of an OSE film becomes more dependent on details of seed nucleation processes than normally seen for conventionally patterned films.
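The quality metrics mentioned above, deviations of the patterned area and perimeter from the target, can be sketched on binary masks; the tiny grids and the 4-neighbor perimeter rule here are illustrative assumptions, not the authors' measurement procedure:

```python
def area(mask):
    """Number of filled cells in a binary grid."""
    return sum(sum(row) for row in mask)

def perimeter(mask):
    """Count exposed edges of filled cells (4-neighbor rule)."""
    h, w = len(mask), len(mask[0])
    p = 0
    for y in range(h):
        for x in range(w):
            if mask[y][x]:
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if not (0 <= ny < h and 0 <= nx < w) or not mask[ny][nx]:
                        p += 1
    return p

target = [[1, 1], [1, 1]]   # desired 2x2 square: area 4, perimeter 8
grown  = [[1, 1], [1, 0]]   # fast-grown result missing one cell

area_dev = abs(area(grown) - area(target)) / area(target)
perim_dev = abs(perimeter(grown) - perimeter(target)) / perimeter(target)
```

Note that the L-shaped result here happens to match the target perimeter while deviating in area, which is why both metrics are reported together.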

  8. Evidence of bottom-up limitations in nearshore marine systems based on otolith proxies of fish growth

    Science.gov (United States)

    von Biela, Vanessa R.; Kruse, Gordon H.; Mueter, Franz J.; Black, Bryan A.; Douglas, David C.; Helser, Thomas E.; Zimmerman, Christian E.

    2015-01-01

    Fish otolith growth increments were used as indices of annual production at nine nearshore sites within the Alaska Coastal Current (downwelling region) and California Current (upwelling region) systems (~36–60°N). Black rockfish (Sebastes melanops) and kelp greenling (Hexagrammos decagrammus) were identified as useful indicators in pelagic and benthic nearshore food webs, respectively. To examine the support for bottom-up limitations, common oceanographic indices of production [sea surface temperature (SST), upwelling, and chlorophyll-a concentration] during summer (April–September) were compared to spatial and temporal differences in fish growth using linear mixed models. The relationship between pelagic black rockfish growth and SST was positive in the cooler Alaska Coastal Current and negative in the warmer California Current. These contrasting growth responses to SST among current systems are consistent with the optimal stability window hypothesis in which pelagic production is maximized at intermediate levels of water column stability. Increased growth rates of black rockfish were associated with higher chlorophyll concentrations in the California Current only, but black rockfish growth was unrelated to the upwelling index in either current system. Benthic kelp greenling growth rates were positively associated with warmer temperatures and relaxation of downwelling (upwelling index near zero) in the Alaska Coastal Current, while none of the oceanographic indices were related to their growth in the California Current. Overall, our results are consistent with bottom-up forcing of nearshore marine ecosystems—light and nutrients constrain primary production in pelagic food webs, and temperature constrains benthic food webs.
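The contrasting growth responses described above can be illustrated with synthetic numbers (not the study's data) and plain least squares in place of the linear mixed models actually used, which additionally include site and fish random effects:

```python
def ols_slope(x, y):
    """Ordinary least-squares slope of y on x."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Hypothetical mean summer SST (deg C) and growth-increment indices.
alaska_sst, alaska_growth = [6, 7, 8, 9, 10], [0.8, 0.9, 1.0, 1.1, 1.2]
calif_sst, calif_growth = [12, 13, 14, 15, 16], [1.2, 1.1, 1.0, 0.9, 0.8]

slope_alaska = ols_slope(alaska_sst, alaska_growth)        # positive
slope_california = ols_slope(calif_sst, calif_growth)      # negative
```

Opposite slope signs across the two current systems are the pattern the optimal stability window hypothesis predicts: growth peaks at intermediate water-column stability, so warming helps in the cool system and hurts in the warm one.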

  9. Neutrino mixing and R_K anomaly in U(1)_X models: a bottom-up approach

    Energy Technology Data Exchange (ETDEWEB)

    Bhatia, Disha; Chakraborty, Sabyasachi; Dighe, Amol [Tata Institute of Fundamental Research,Mumbai 400005 (India)

    2017-03-22

    We identify a class of U(1)_X models which can explain the R_K anomaly and the neutrino mixing pattern, using a bottom-up approach. The different X-charges of the lepton generations account for the lepton universality violation required to explain R_K. In addition to the three right-handed neutrinos needed for the Type-I seesaw mechanism, these minimal models introduce only an additional doublet Higgs and a singlet scalar. While the former helps in reproducing the quark mixing structure, the latter gives masses to the neutrinos and the new gauge boson Z′. Our bottom-up approach determines the X-charges of all particles using theoretical consistency and experimental constraints. We find the parameter space allowed by the constraints from neutral meson mixing, rare b→s decays and direct collider searches for Z′. Such a Z′ may be observable at the ongoing run of the Large Hadron Collider with a few hundred fb⁻¹ of integrated luminosity.

  10. Galvanostatic bottom-up filling of TSV-like trenches: Choline-based leveler containing two quaternary ammoniums

    International Nuclear Information System (INIS)

    Kim, Myung Jun; Seo, Youngran; Kim, Hoe Chul; Lee, Yoonjae; Choe, Seunghoe; Kim, Young Gyu; Cho, Sung Ki; Kim, Jae Jeong

    2015-01-01

    Highlights: • The choline-based leveler having two quaternary ammoniums was synthesized. • The adsorption of this leveler with suppressor and accelerator was examined. • Galvanostatic Cu bottom-up filling was achieved with the three-additive system. • The mechanism of gap-filling was elucidated based on the additive adsorption. - Abstract: Through-Silicon Via (TSV) technology is essential to accomplish 3-dimensional packaging of electronics. Hence, more reliable and faster TSV filling by Cu electrodeposition is required. Our approach to improving Cu gap-filling in TSVs is based on the development of new organic additives for feature filling. Here, we present our achievements from the synthesis of a choline-based leveler through to feature filling using the synthesized leveler. The choline-based leveler, which includes two quaternary ammoniums at both ends of the molecule, is synthesized from glutaric acid. The characteristics of the choline-based additive are examined by electrochemical analyses, and it is confirmed that the choline-based leveler shows a convection-dependent adsorption behavior, which is essential for leveling. The interactions between the polymeric suppressor, the accelerator, and the choline-based leveler are also investigated by changing the convection condition. Using the combination of suppressor, accelerator, and the choline-based leveler, extreme bottom-up filling of Cu in trenches with dimensions similar to TSVs is achieved. The mechanism of Cu gap-filling is elucidated based on the results of the electrochemical analyses and feature filling.

  11. From Cascade to Bottom-Up Ecosystem Services Model: How Does Social Cohesion Emerge from Urban Agriculture?

    Directory of Open Access Journals (Sweden)

    Anna Petit-Boix

    2018-03-01

    Full Text Available Given the expansion of urban agriculture (UA), we need to understand how this system provides ecosystem services, including foundational societal needs such as social cohesion, i.e., people's willingness to cooperate with one another. Although social cohesion in UA has been documented, there is no framework for its emergence and how it can be modeled within a sustainability framework. In this study, we address this literature gap by showing how the popular cascade ecosystem services model can be modified to include social structures. We then transform the cascade model into a bottom-up causal framework for UA. In this bottom-up framework, basic biophysical (e.g., land availability) and social (e.g., leadership) ecosystem structures and processes lead to human activities (e.g., learning) that can foster specific human attitudes and feelings (e.g., trust). These attitudes and feelings, when aggregated (e.g., into a social network), generate an ecosystem value of social cohesion. These cause-effect relationships can support the development of causality pathways in social life cycle assessment (S-LCA) and further our understanding of the mechanisms behind social impacts and benefits. The framework also supports UA studies by showing the sustainability of UA as an emergent food supplier in cities.

  12. Bottom-up and top-down human impacts interact to affect a protected coastal Chilean marsh.

    Science.gov (United States)

    Fariña, José M; He, Qiang; Silliman, Brian R; Bertness, Mark D

    2016-03-01

    Many ecosystems, even in protected areas, experience multiple anthropogenic impacts. While anthropogenic modification of bottom-up (e.g., eutrophication) and top-down (e.g., livestock grazing) forcing often co-occurs, whether these factors counteract or have additive or synergistic effects on ecosystems is poorly understood. In a Chilean bio-reserve, we examined the interactive impacts of eutrophication and illegal livestock grazing on plant growth with a 4-yr fertilization × cattle exclusion experiment. Cattle grazing generally decreased plant biomass, but had synergistic, additive, and antagonistic interactions with fertilization in the low, middle, and high marsh zones, respectively. In the low marsh, fertilization increased plant biomass by 112%, cattle grazing decreased it by 96%, and together they decreased plant biomass by 77%. In the middle marsh, fertilization increased plant biomass by 47%, cattle grazing decreased it by 37%, and together they did not affect plant biomass. In the high marsh, fertilization and cattle grazing decreased plant biomass by 81% and 92%, respectively, but together they increased plant biomass by 42%. These interactions were also found to be species-specific. Different responses of plants to fertilization and cattle grazing were likely responsible for these variable interactions. Thus, common bottom-up and top-down human impacts can interact in different ways to affect communities even within a single ecosystem. Incorporating this knowledge into conservation actions will improve ecosystem management in a time when ecosystems are increasingly challenged by multiple interacting human impacts.
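The interaction types reported above can be sketched by comparing the observed combined effect with the additive expectation from the two single-factor effects. The proportional biomass changes below come from the abstract; the classification rule and its tolerance are illustrative assumptions, not the authors' statistical test:

```python
def classify(fert, graze, combined, tol=0.15):
    """Classify a two-factor interaction on proportional effects:
    near the additive expectation -> 'additive'; biomass pushed further
    down than expected -> 'synergistic'; less far down (or up) than
    expected -> 'antagonistic'. Rule and tolerance are illustrative."""
    expected = fert + graze               # additive expectation
    if abs(combined - expected) <= tol:
        return "additive"
    return "synergistic" if combined < expected else "antagonistic"

# Proportional changes from the abstract: +1.12 means +112% biomass, etc.
low    = classify(+1.12, -0.96, -0.77)   # low marsh zone
middle = classify(+0.47, -0.37, 0.00)    # middle marsh zone
high   = classify(-0.81, -0.92, +0.42)   # high marsh zone
```

With these numbers the rule reproduces the zone-by-zone labels given in the abstract.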

  13. Bottom-up driven involuntary auditory evoked field change: constant sound sequencing amplifies but does not sharpen neural activity.

    Science.gov (United States)

    Okamoto, Hidehiko; Stracke, Henning; Lagemann, Lothar; Pantev, Christo

    2010-01-01

    The capability of involuntarily tracking certain sound signals during the simultaneous presence of noise is essential in human daily life. Previous studies have demonstrated that top-down auditory focused attention can enhance excitatory and inhibitory neural activity, resulting in sharpening of frequency tuning of auditory neurons. In the present study, we investigated bottom-up driven involuntary neural processing of sound signals in noisy environments by means of magnetoencephalography. We contrasted two sound signal sequencing conditions: "constant sequencing" versus "random sequencing." Based on a pool of 16 different frequencies, either identical (constant sequencing) or pseudorandomly chosen (random sequencing) test frequencies were presented blockwise together with band-eliminated noises to nonattending subjects. The results demonstrated that the auditory evoked fields elicited in the constant sequencing condition were significantly enhanced compared with the random sequencing condition. However, the enhancement was not significantly different between different band-eliminated noise conditions. Thus the present study confirms that by constant sound signal sequencing under nonattentive listening the neural activity in human auditory cortex can be enhanced, but not sharpened. Our results indicate that bottom-up driven involuntary neural processing may mainly amplify excitatory neural networks, but may not effectively enhance inhibitory neural circuits.

  14. Approaches in studying the pharmacology of Chinese Medicine formulas: bottom-up, top-down-and meeting in the middle.

    Science.gov (United States)

    Huang, Tao; Zhong, Linda L D; Lin, Chen-Yuan; Zhao, Ling; Ning, Zi-Wan; Hu, Dong-Dong; Zhang, Man; Tian, Ke; Cheng, Chung-Wah; Bian, Zhao-Xiang

    2018-01-01

    Investigating the pharmacology is key to the modernization of Chinese Medicine (CM) formulas. However, identifying which are the active compound(s) of CM formulas, which biological entities they target, and through which signaling pathway(s) they act to modify disease symptoms are still difficult tasks for researchers, even when equipped with an arsenal of advanced modern technologies. Multiple approaches, including network pharmacology, pharmaco-genomics, -proteomics, and -metabolomics, have been developed to study the pharmacology of CM formulas. They fall into two general categories in terms of how they tackle a problem: bottom-up and top-down. In this article, we compared these two different approaches in several dimensions by using the case of MaZiRenWan (MZRW, also known as Hemp Seed Pill), a CM herbal formula for functional constipation. Multiple hypotheses are easy to propose in the bottom-up approach (e.g. network pharmacology), but these hypotheses are usually false positives and hard to test. In contrast, it is hard to suggest hypotheses in the top-down approach (e.g. pharmacometabolomics); however, once a hypothesis is proposed, it is much easier to test. Merging these two approaches could result in a powerful combined approach, which could become the new paradigm for the pharmacological study of CM formulas.

  15. Controlled synthesis of organic single-crystalline nanowires via the synergy approach of the bottom-up/top-down processes.

    Science.gov (United States)

    Zhuo, Ming-Peng; Zhang, Ye-Xin; Li, Zhi-Zhou; Shi, Ying-Li; Wang, Xue-Dong; Liao, Liang-Sheng

    2018-03-15

    The controlled fabrication of organic single-crystalline nanowires (OSCNWs) with a uniform diameter at the nanoscale via the bottom-up approach, which relies only on weak intermolecular interactions, is a great challenge. Herein, we utilize the synergy approach of the bottom-up and the top-down processes to fabricate OSCNWs with diameters of 120 ± 10 nm through stepwise evolution processes. Specifically, the evolution proceeds from the self-assembled organic micro-rods with a quadrangular pyramid-like end-structure bounded by {111}s and {11-1}s crystal planes, to the "top-down" synthesized organic micro-rods with the flat cross-sectional {002}s plane, to the organic micro-tubes with a wall thickness of ∼115 nm, and finally to the organic nanowires. Notably, the anisotropic etching process caused by the protic solvent molecules (such as ethanol) is crucial for the evolution of the morphology throughout the whole top-down process. Therefore, our demonstration opens a new avenue for the controlled fabrication of organic nanowires, and also contributes to the development of nanowire-based organic optoelectronics such as organic nanowire lasers.

  16. From the Bottom-Up: Chemotherapy and Gut-Brain Axis Dysregulation.

    Science.gov (United States)

    Bajic, Juliana E; Johnston, Ian N; Howarth, Gordon S; Hutchinson, Mark R

    2018-01-01

    The central nervous system and gastrointestinal tract form the primary targets of chemotherapy-induced toxicities. Symptoms associated with damage to these regions have been clinically termed chemotherapy-induced cognitive impairment and mucositis. Whilst extensive literature outlines the complex etiology of each pathology, to date neither chemotherapy-induced side-effect has considered the potential impact of one on the pathogenesis of the other disorder. This is surprising considering the close bidirectional relationship shared between each organ; the gut-brain axis. There are complex multiple pathways linking the gut to the brain and vice versa in both normal physiological function and disease. For instance, psychological and social factors influence motility and digestive function, symptom perception, and behaviors associated with illness and pathological outcomes. On the other hand, visceral pain affects central nociception pathways, mood and behavior. Recent interest highlights the influence of functional gut disorders, such as inflammatory bowel diseases and irritable bowel syndrome in the development of central comorbidities. Gut-brain axis dysfunction and microbiota dysbiosis have served as key portals in understanding the potential mechanisms associated with these functional gut disorders and their effects on cognition. In this review we will present the role gut-brain axis dysregulation plays in the chemotherapy setting, highlighting peripheral-to-central immune signaling mechanisms and their contribution to neuroimmunological changes associated with chemotherapy exposure. Here, we hypothesize that dysregulation of the gut-brain axis plays a major role in the intestinal, psychological and neurological complications following chemotherapy. We pay particular attention to evidence surrounding microbiota dysbiosis, the role of intestinal permeability, damage to nerves of the enteric and peripheral nervous systems and vagal and humoral mediated changes.

  17. Saliency Changes Appearance

    Science.gov (United States)

    Kerzel, Dirk; Schönhammer, Josef; Burra, Nicolas; Born, Sabine; Souto, David

    2011-01-01

    Numerous studies have suggested that the deployment of attention is linked to saliency. In contrast, very little is known about how salient objects are perceived. To probe the perception of salient elements, observers compared two horizontally aligned stimuli in an array of eight elements. One of them was salient because of its orientation or direction of motion. We observed that the perceived luminance contrast or color saturation of the salient element increased: the salient stimulus looked even more salient. We explored the possibility that changes in appearance were caused by attention. We chose an event-related potential indexing attentional selection, the N2pc, to answer this question. The absence of an N2pc to the salient object provides preliminary evidence against involuntary attentional capture by the salient element. We suggest that signals from a master saliency map flow back into individual feature maps. These signals boost the perceived feature contrast of salient objects, even on perceptual dimensions different from the one that initially defined saliency. PMID:22162760

  18. Economic burden associated with alcohol dependence in a German primary care sample: a bottom-up study

    Directory of Open Access Journals (Sweden)

    Jakob Manthey

    2016-08-01

    Full Text Available Abstract Background A considerable economic burden has been repeatedly associated with alcohol dependence (AD) – mostly calculated using aggregate data and alcohol-attributable fractions (top-down approach). However, this approach is limited by a number of assumptions, which are hard to test. Thus, cost estimates should ideally be validated with studies using individual data to estimate the same costs (bottom-up approach). However, bottom-up studies on the economic burden associated with AD are lacking. Our study aimed to fill this gap using the bottom-up approach to examine costs for AD, and also stratified the results by the following subgroups: sex, age, diagnostic approach and severity of AD, as relevant variations could be expected by these factors. Methods Sample: 1356 primary health care patients, representative for two German regions. AD was diagnosed by a standardized instrument and by treating physicians. Individual costs were calculated by combining resource use and productivity data representing a period of six months prior to the time of interview with unit costs derived from the literature or official statistics. The economic burden associated with AD was determined via excess costs by comparing utilization of various health care resources and impaired productivity between people with and without AD, controlling for relevant confounders. Additional analyses for several AD characteristics were performed. Results Mean costs among alcohol-dependent patients were 50 % higher compared to the remaining patients, resulting in 1836 € excess costs per alcohol-dependent patient in 6 months. More than half of these excess costs incurred through increased productivity loss among alcohol-dependent patients. Treatment for alcohol problems represents only 6 % of these costs. The economic burden associated with AD incurred mainly among males and among 30- to 49-year-old patients. Both diagnostic approaches were significantly related to the
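The bottom-up costing logic described above can be sketched minimally: per-patient six-month costs are summed from resource use times unit costs, then compared between patients with and without AD. All unit costs and resource counts below are invented for illustration; the study additionally adjusted for confounders:

```python
def patient_cost(resource_use, unit_costs):
    """Six-month cost = sum over resources of quantity x unit cost."""
    return sum(resource_use[r] * unit_costs[r] for r in resource_use)

# Hypothetical unit costs in euros (illustrative only).
unit_costs = {"gp_visit": 30.0, "hospital_day": 500.0, "lost_workday": 100.0}

ad_patients = [
    {"gp_visit": 6, "hospital_day": 2, "lost_workday": 10},
    {"gp_visit": 4, "hospital_day": 0, "lost_workday": 20},
]
controls = [
    {"gp_visit": 3, "hospital_day": 0, "lost_workday": 2},
    {"gp_visit": 2, "hospital_day": 1, "lost_workday": 0},
]

mean_ad = sum(patient_cost(p, unit_costs) for p in ad_patients) / len(ad_patients)
mean_ctrl = sum(patient_cost(p, unit_costs) for p in controls) / len(controls)
excess = mean_ad - mean_ctrl   # excess cost per AD patient over 6 months
```

In this toy sample, as in the study, productivity loss dominates the excess rather than treatment costs.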

  19. Saliency U-Net: A regional saliency map-driven hybrid deep learning network for anomaly segmentation

    Science.gov (United States)

    Karargyros, Alex; Syeda-Mahmood, Tanveer

    2018-02-01

    Deep learning networks are gaining popularity in many medical image analysis tasks due to their generalized ability to automatically extract relevant features from raw images. However, this can make the learning problem unnecessarily harder requiring network architectures of high complexity. In case of anomaly detection, in particular, there is often sufficient regional difference between the anomaly and the surrounding parenchyma that could be easily highlighted through bottom-up saliency operators. In this paper we propose a new hybrid deep learning network using a combination of raw image and such regional maps to more accurately learn the anomalies using simpler network architectures. Specifically, we modify a deep learning network called U-Net using both the raw and pre-segmented images as input to produce joint encoding (contraction) and expansion paths (decoding) in the U-Net. We present results of successfully delineating subdural and epidural hematomas in brain CT imaging and liver hemangioma in abdominal CT images using such network.
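The input construction described above can be sketched as follows; the simple local-contrast saliency operator, grid sizes, and two-channel stacking are our assumptions for illustration, not the authors' implementation:

```python
def contrast_saliency(img, k=5):
    """Per-pixel |value - local k x k mean|, a simple bottom-up contrast
    operator; edge cells use the in-bounds part of the window."""
    h, w = len(img), len(img[0])
    pad = k // 2
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            ys = range(max(0, y - pad), min(h, y + pad + 1))
            xs = range(max(0, x - pad), min(w, x + pad + 1))
            vals = [img[yy][xx] for yy in ys for xx in xs]
            out[y][x] = abs(img[y][x] - sum(vals) / len(vals))
    return out

# Synthetic 32x32 image with a bright "anomaly" region.
img = [[0.0] * 32 for _ in range(32)]
for y in range(12, 18):
    for x in range(12, 18):
        img[y][x] = 1.0

sal = contrast_saliency(img)
net_input = [img, sal]   # two-channel input: raw image + regional saliency map
```

The saliency channel is high at the anomaly boundary and zero deep inside homogeneous regions, so the network receives the regional difference pre-highlighted rather than having to learn it from raw intensities alone.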

  20. The faith of a physicist reflections of a bottom-up thinker : the Gifford lectures for 1993-4

    CERN Document Server

    Polkinghorne, John C

    1994-01-01

    Is it possible to think like a scientist and yet have the faith of a Christian? Although many Westerners might say no, there are also many critically minded individuals who entertain what John Polkinghorne calls a "wistful wariness" toward religion--they feel unable to accept religion on rational grounds yet cannot dismiss it completely. Polkinghorne, both a particle physicist and Anglican priest, here explores just what rational grounds there could be for Christian beliefs, maintaining that the quest for motivated understanding is a concern shared by scientists and religious thinkers alike. Anyone who assumes that religion is based on unquestioning certainties, or that it need not take into account empirical knowledge, will be challenged by Polkinghorne's bottom-up examination of Christian beliefs about events ranging from creation to the resurrection. The author organizes his inquiry around the Nicene Creed, an early statement that continues to summarize Christian beliefs. He applies to each of its tenets ...

  1. Construction of membrane-bound artificial cells using microfluidics: a new frontier in bottom-up synthetic biology.

    Science.gov (United States)

    Elani, Yuval

    2016-06-15

    The quest to construct artificial cells from the bottom-up using simple building blocks has received much attention over recent decades and is one of the grand challenges in synthetic biology. Cell mimics that are encapsulated by lipid membranes are a particularly powerful class of artificial cells due to their biocompatibility and the ability to reconstitute biological machinery within them. One of the key obstacles in the field centres on the following: how can membrane-based artificial cells be generated in a controlled way and in high-throughput? In particular, how can they be constructed to have precisely defined parameters including size, biomolecular composition and spatial organization? Microfluidic generation strategies have proved instrumental in addressing these questions. This article will outline some of the major principles underpinning membrane-based artificial cells and their construction using microfluidics, and will detail some recent landmarks that have been achieved. © 2016 The Author(s).

  2. Closing the gap? Top-down versus bottom-up projections of China's regional energy use and CO2 emissions

    DEFF Research Database (Denmark)

    Dai, Hancheng; Mischke, Peggy; Xie, Xuxuan

    2016-01-01

    As the world's largest CO2 emitter, China is a prominent case study for scenario analysis. This study uses two newly developed global top-down and bottom-up models with a regional China focus to compare China's future energy and CO2 emission pathways toward 2050. By harmonizing the economic and demographic trends as well as a carbon tax pathway, we explore how both models respond to these identical exogenous inputs. Then a soft-linking methodology is applied to "narrow the gap" between the results computed by these models. The study finds that it is beneficial to soft-link complex global models under harmonized assumptions. Although this study fails to "close the gap" between the two models completely, the experiences and insights shared here will be beneficial for researchers and policy makers that are drawing conclusions from such models. We find for example that without soft-linking, China's baseline CO2 emissions

  3. Bottom-up fabrication of paper-based microchips by blade coating of cellulose microfibers on a patterned surface.

    Science.gov (United States)

    Gao, Bingbing; Liu, Hong; Gu, Zhongze

    2014-12-23

    We report a method for the bottom-up fabrication of paper-based capillary microchips by the blade coating of cellulose microfibers on a patterned surface. The fabrication process is similar to the paper-making process in which an aqueous suspension of cellulose microfibers is used as the starting material and is blade-coated onto a polypropylene substrate patterned using an inkjet printer. After water evaporation, the cellulose microfibers form a porous, hydrophilic, paperlike pattern that wicks aqueous solution by capillary action. This method enables simple, fast, inexpensive fabrication of paper-based capillary channels with both width and height down to about 10 μm. Using this method, a capillary microfluidic chip for the colorimetric detection of glucose and total protein was fabricated; the assay requires only 0.30 μL of sample, 240 times less than for paper devices fabricated using photolithography.

  4. Costs of CO2 abatement in Egypt using both bottom-up and top-down approaches

    International Nuclear Information System (INIS)

    El Mahgary, Y.; Ibrahim, A.-F.; Shama, M.A.-F.

    1994-01-01

    Within the frame of UNEP's project on the Methodologies of Determining the Costs of Abatement of GHG emissions, a case study on Egypt was undertaken by the Technical Research Centre of Finland (VTT) in cooperation with the Egyptian Environment Affairs Authority (EEAA), together with an expert team from different Egyptian organizations. Both bottom-up and top-down approaches were used. Several measures/technologies, including energy conservation, fuel switching, use of renewable energy and material replacement, were considered to decrease CO2 emissions. It was found that most of the measures were cost-effective, as a considerable potential for energy conservation exists in Egypt. The impact of energy conservation measures on the economy of the country was found to be positive using a macroeconomic model. (author)
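The bottom-up side of such a study can be sketched as a per-measure cost-effectiveness calculation: each measure has an annualized cost (negative means net savings, as with much energy conservation) and an abatement volume, giving a cost per tonne of CO2 avoided. All measures and numbers below are illustrative, not the study's results:

```python
measures = [
    # (name, annual cost in dollars, abated tonnes of CO2 per year)
    ("energy_conservation", -50_000.0, 10_000.0),   # net savings
    ("fuel_switching",       20_000.0,  5_000.0),
    ("renewables",          150_000.0, 12_000.0),
]

def cost_per_tonne(cost, abated):
    """Cost-effectiveness of a measure in currency per tonne CO2 avoided."""
    return cost / abated

# Rank measures from cheapest to most expensive per tonne abated,
# as one would when building a marginal abatement cost curve.
ranked = sorted(measures, key=lambda m: cost_per_tonne(m[1], m[2]))
cheapest = ranked[0][0]
negative_cost = [m[0] for m in measures if cost_per_tonne(m[1], m[2]) < 0]
```

Measures with negative cost per tonne are "cost-effective" in the sense used in the abstract: they pay for themselves while reducing emissions.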

  5. Effects of High-Pressure Treatment on the Muscle Proteome of Hake by Bottom-Up Proteomics.

    Science.gov (United States)

    Carrera, Mónica; Fidalgo, Liliana G; Saraiva, Jorge A; Aubourg, Santiago P

    2018-05-02

    A bottom-up proteomics approach was applied to study the effects of high-pressure (HP) treatment on the muscle proteome of fish. The performance of the approach was established for a prior HP treatment (150-450 MPa for 2 min) of frozen (up to 5 months at -10 °C) European hake (Merluccius merluccius). Concerning possible protein biomarkers of quality changes, significant degradation after applying a pressure ≥430 MPa could be observed for phosphoglycerate mutase-1, enolase, creatine kinase, fructose bisphosphate aldolase, triosephosphate isomerase, and nucleoside diphosphate kinase; conversely, electrophoretic bands assigned to tropomyosin, glyceraldehyde-3-phosphate dehydrogenase, and beta parvalbumin increased in intensity after applying a pressure ≥430 MPa. This repository of potential protein biomarkers may be very useful for further HP investigations related to fish quality.

  6. Methodology to characterize a residential building stock using a bottom-up approach: a case study applied to Belgium

    Directory of Open Access Journals (Sweden)

    Samuel Gendebien

    2014-06-01

    Full Text Available In the last ten years, the development and implementation of measures to mitigate climate change have become of major importance. In Europe, the residential sector accounts for 27% of the final energy consumption [1], and therefore contributes significantly to CO2 emissions. Roadmaps towards energy-efficient buildings have been proposed [2]. In such a context, the detailed characterization of residential building stocks in terms of age, type of construction, insulation level, energy vector, and evolution prospects appears to be a useful contribution to the assessment of the impact of implementation of energy policies. In this work, a methodology to develop a tree structure characterizing a residential building stock is presented in the frame of a bottom-up approach that aims to model and simulate domestic energy use. The methodology is applied to the Belgian case for the current situation and up to the 2030 horizon. The potential applications of the developed tool are outlined.
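The tree structure described above can be sketched as nested partitions of the stock along successive attributes (here age band, then building type, then insulation level), with dwelling counts at the leaves. Categories and counts below are illustrative, not the Belgian data:

```python
# Hypothetical building-stock tree: age band -> type -> insulation -> count.
stock_tree = {
    "pre-1945": {
        "detached": {"uninsulated": 120_000, "insulated": 30_000},
        "terraced": {"uninsulated": 200_000, "insulated": 50_000},
    },
    "1946-1990": {
        "detached": {"uninsulated": 80_000, "insulated": 140_000},
    },
}

def total(node):
    """Sum dwelling counts over all leaves of a (sub)tree."""
    if isinstance(node, dict):
        return sum(total(child) for child in node.values())
    return node

n_total = total(stock_tree)                  # whole stock
n_pre1945 = total(stock_tree["pre-1945"])    # one branch
```

Any branch of the tree defines a homogeneous segment to which a representative dwelling model and energy-use simulation can be attached.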

  7. The end of cheap oil: Bottom-up economic and geologic modeling of aggregate oil production curves

    International Nuclear Information System (INIS)

    Jakobsson, Kristofer; Bentley, Roger; Söderbergh, Bengt; Aleklett, Kjell

    2012-01-01

    There is a lively debate between ‘concerned’ and ‘unconcerned’ analysts regarding the future availability and affordability of oil. We critically examine two interrelated and seemingly plausible arguments for an unconcerned view: (1) there is a growing amount of remaining reserves; (2) there is a large amount of oil with a relatively low average production cost. These statements are unconvincing on both theoretical and empirical grounds. Oil availability is about flows rather than stocks, and average cost is not relevant in the determination of price and output. We subsequently implement a bottom-up model of regional oil production with micro-foundations in both natural science and economics. An oil producer optimizes net present value under the constraints of reservoir dynamics, technological capacity and economic circumstances. Optimal production profiles for different reservoir drives and economic scenarios are derived. The field model is then combined with a discovery model of random sampling from a lognormal field size-frequency distribution. Regional discovery and production scenarios are generated. Our approach does not rely on the simple assumptions of top-down models such as the Hubbert curve; however, it leads to the same qualitative result that production peaks when a substantial fraction of the recoverable resource remains in-ground. - Highlights: ► Remaining oil reserves and average costs are of limited use in forecasting. ► We present a bottom-up approach to the modeling of regional oil production. ► Producers maximize net present value under technological and physical constraints. ► Exploration is modeled as random sampling from a lognormal field size distribution. ► Regional production starts declining before half of the recoverable oil is produced.
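The discovery-plus-field-model structure described above can be sketched in a few lines: field sizes are drawn from a lognormal size-frequency distribution and each field follows a plateau-then-decline profile. Everything here is a toy illustration; the parameter values, the plateau/decline rule, and the one-field-per-year discovery schedule are assumptions of this sketch, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_field_sizes(n_fields, mu=3.0, sigma=1.2):
    """Draw recoverable resource per field from a lognormal distribution."""
    return rng.lognormal(mean=mu, sigma=sigma, size=n_fields)

def field_production(size, plateau_rate=0.05, decline=0.10, years=60):
    """Produce at a fixed fraction of the resource per year, then decline
    exponentially once cumulative output passes half the resource."""
    out, cum, rate = [], 0.0, plateau_rate * size
    for _ in range(years):
        if cum > 0.5 * size:
            rate *= (1.0 - decline)
        q = min(rate, size - cum)   # never produce more than what is left
        out.append(q)
        cum += q
    return np.array(out)

# Regional profile: sum staggered field profiles (one new field per year).
sizes = sample_field_sizes(40)
years = 120
regional = np.zeros(years)
for i, s in enumerate(sizes):
    prof = field_production(s)
    regional[i:i + len(prof)] += prof[: years - i]

peak_year = int(np.argmax(regional))
# Compare the fraction produced by the peak year with the Hubbert
# benchmark of one half.
print(peak_year, regional[: peak_year + 1].sum() / regional.sum())
```

The authors' model replaces the ad hoc plateau/decline rule with net-present-value optimization under reservoir dynamics; the aggregation step, however, has this same additive structure.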

  8. A harmonized calculation model for transforming EU bottom-up energy efficiency indicators into empirical estimates of policy impacts

    International Nuclear Information System (INIS)

    Horowitz, Marvin J.; Bertoldi, Paolo

    2015-01-01

    This study is an impact analysis of European Union (EU) energy efficiency policy that employs both top-down energy consumption data and bottom-up energy efficiency statistics or indicators. As such, it may be considered a contribution to the effort called for in the EU's 2006 Energy Services Directive (ESD) to develop a harmonized calculation model. Although this study does not estimate the realized savings from individual policy measures, it does provide estimates of realized energy savings for energy efficiency policy measures in aggregate. Using fixed effects panel models, the annual cumulative savings in 2011 of combined household and manufacturing sector electricity and natural gas usage attributed to EU energy efficiency policies since 2000 are estimated to be 1136 PJ; the savings attributed to energy efficiency policies since 2006 are estimated to be 807 PJ, or the equivalent of 5.6% of 2011 EU energy consumption. As well as its contribution to energy efficiency policy analysis, this study adds to the development of methods that can improve the quality of information provided by standardized energy efficiency and sustainable resource indexes. - Highlights: • Impact analysis of European Union energy efficiency policy. • Harmonization of top-down energy consumption and bottom-up energy efficiency indicators. • Fixed effects models for Member States for household and manufacturing sectors and combined electricity and natural gas usage. • EU energy efficiency policies since 2000 are estimated to have saved 1136 Petajoules. • Energy savings attributed to energy efficiency policies since 2006 are 5.6 percent of 2011 combined electricity and natural gas usage.
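A minimal sketch of the fixed-effects ("within") panel estimator this kind of study relies on: Member State fixed effects are swept out by demeaning each variable within its panel unit before an ordinary least-squares fit. The data-generating process and variable names below are invented for illustration; the paper's actual specification is richer.

```python
import numpy as np

rng = np.random.default_rng(0)
n_states, n_years = 10, 12
state = np.repeat(np.arange(n_states), n_years)
x = rng.normal(size=n_states * n_years)      # hypothetical efficiency indicator
alpha = rng.normal(size=n_states)            # unobserved state fixed effects
y = alpha[state] - 0.8 * x + rng.normal(scale=0.1, size=x.size)  # consumption

def within_demean(v, groups):
    """Subtract each group's mean (sweeps out the fixed effects)."""
    sums = np.zeros(groups.max() + 1)
    np.add.at(sums, groups, v)
    counts = np.bincount(groups)
    return v - (sums / counts)[groups]

# OLS on the demeaned data recovers the slope despite the fixed effects.
beta = np.linalg.lstsq(
    within_demean(x, state)[:, None], within_demean(y, state), rcond=None
)[0][0]
print(round(beta, 2))  # close to the true coefficient of -0.8
```

The same within-transformation underlies library implementations (e.g. `PanelOLS` with entity effects); it is shown explicitly here to make the mechanics visible.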

  9. Soil and land use research in Europe: Lessons learned from INSPIRATION bottom-up strategic research agenda setting.

    Science.gov (United States)

    Bartke, Stephan; Boekhold, Alexandra E; Brils, Jos; Grimski, Detlef; Ferber, Uwe; Gorgon, Justyna; Guérin, Valérie; Makeschin, Franz; Maring, Linda; Nathanail, C Paul; Villeneuve, Jacques; Zeyer, Josef; Schröter-Schlaack, Christoph

    2018-05-01

    We introduce the INSPIRATION bottom-up approach for the development of a strategic research agenda for spatial planning, land use and soil-sediment-water-system management in Europe. Research and innovation needs were identified by more than 500 European funders, end-users, scientists, policy makers, public administrators and consultants. We report both on the concept and on the implementation of the bottom-up approach, provide a critique of the process and draw key lessons for the development of research agendas in the future. Based on the identified strengths and weaknesses, we identified the following key opportunities and threats: 1) a high ranking and attentiveness for the research topics on the political agenda, in press and media or in public awareness, 2) availability of funding for research, 3) the resources available for creating the agenda itself, 4) the role of the sponsor of the agenda development, and 5) the continuity of stakeholder engagement as bases for identifying windows of opportunity, creating ownership for the agenda and facilitating its implementation. Our key recommendations are: 1) a clear definition of the area for which the agenda is to be developed and of the targeted user, 2) a conceptual model to structure the agenda, 3) making clear the expected roles, tasks and input formats regarding the involvement of, and communication with, the stakeholders and project partners, 4) a sufficient number of iterations and checks of the agenda with stakeholders to ensure completeness, relevance and creation of co-ownership for the agenda, and 5) preparing, from the beginning, the infrastructure for the network that will implement the agenda. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency.

    Science.gov (United States)

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-05-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on the Riemannian manifold. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary drawn from the image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, the initial result is improved by calculating the reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to that in the first stage, but is able to effectively highlight the salient objects uniformly against the background. Finally, three post-processing methods (a highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms state-of-the-art methods in terms of precision, recall and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
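The two-stage reconstruction-error idea can be sketched compactly. Note the simplifications: the paper codes region covariance matrices with Log-Euclidean kernels, whereas this sketch uses plain feature vectors and unregularized least-squares coding, and the toy "superpixels" are random vectors rather than image regions.

```python
import numpy as np

rng = np.random.default_rng(1)

def recon_error(features, dictionary):
    """Per-sample residual norm when coding features on the dictionary."""
    codes, *_ = np.linalg.lstsq(dictionary.T, features.T, rcond=None)
    residual = features.T - dictionary.T @ codes
    return np.linalg.norm(residual, axis=0)

# Toy superpixels: 50 background-like, 10 foreground-like (shifted mean).
bg = rng.normal(size=(50, 8))
fg = rng.normal(loc=3.0, size=(10, 8))
feats = np.vstack([bg, fg])

# Stage 1: dictionary from image-border superpixels (here: background rows);
# a high reconstruction error marks a candidate salient region.
stage1 = recon_error(feats, bg[:5])

# Stage 2: dictionary from the most salient stage-1 regions; a low error on
# this foreground dictionary now indicates a salient region.
fg_dict = feats[np.argsort(stage1)[-5:]]
stage2 = recon_error(feats, fg_dict)
saliency = stage1 - stage2   # crude fusion: high bg-error, low fg-error

# Foreground regions reconstruct poorly on the background dictionary.
print(stage1[50:].mean() > stage1[:50].mean())  # True
```

The real algorithm works at multiple superpixel levels and fuses the resulting maps; this sketch shows only the core error-on-dictionary mechanism.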

  11. Characterizing the effects of feature salience and top-down attention in the early visual system.

    Science.gov (United States)

    Poltoratski, Sonia; Ling, Sam; McCormack, Devin; Tong, Frank

    2017-07-01

    The visual system employs a sophisticated balance of attentional mechanisms: salient stimuli are prioritized for visual processing, yet observers can also ignore such stimuli when their goals require directing attention elsewhere. A powerful determinant of visual salience is local feature contrast: if a local region differs from its immediate surround along one or more feature dimensions, it will appear more salient. We used high-resolution functional MRI (fMRI) at 7T to characterize the modulatory effects of bottom-up salience and top-down voluntary attention within multiple sites along the early visual pathway, including visual areas V1-V4 and the lateral geniculate nucleus (LGN). Observers viewed arrays of spatially distributed gratings, where one of the gratings immediately to the left or right of fixation differed from all other items in orientation or motion direction, making it salient. To investigate the effects of directed attention, observers were cued to attend to the grating to the left or right of fixation, which was either salient or nonsalient. Results revealed reliable additive effects of top-down attention and stimulus-driven salience throughout visual areas V1-hV4. In comparison, the LGN exhibited significant attentional enhancement but was not reliably modulated by orientation- or motion-defined salience. Our findings indicate that top-down effects of spatial attention can influence visual processing at the earliest possible site along the visual pathway, including the LGN, whereas the processing of orientation- and motion-driven salience primarily involves feature-selective interactions that take place in early cortical visual areas. NEW & NOTEWORTHY While spatial attention allows for specific, goal-driven enhancement of stimuli, salient items outside of the current focus of attention must also be prioritized. We used 7T fMRI to compare salience and spatial attentional enhancement along the early visual hierarchy. 
We report additive effects of top-down attention and stimulus-driven salience in early visual cortex, whereas the LGN showed attentional enhancement without reliable salience modulation.

  12. Preference for Well-Balanced Saliency in Details Cropped from Photographs

    Science.gov (United States)

    Abeln, Jonas; Fresz, Leonie; Amirshahi, Seyed Ali; McManus, I. Chris; Koch, Michael; Kreysa, Helene; Redies, Christoph

    2016-01-01

    Photographic cropping is the act of selecting part of a photograph to enhance its aesthetic appearance or visual impact. It is common practice with both professional (expert) and amateur (non-expert) photographers. In a psychometric study, McManus et al. (2011b) showed that participants cropped photographs confidently and reliably. Experts tended to select details from a wider range of positions than non-experts, but other croppers did not generally prefer details that were selected by experts. It remained unclear, however, on what grounds participants selected particular details from a photograph while avoiding other details. One of the factors contributing to cropping decisions may be visual saliency. Indeed, various saliency-based computer algorithms are available for the automatic cropping of photographs. However, careful experimental studies on the relation between saliency and cropping are lacking to date. In the present study, we re-analyzed the data from the studies by McManus et al. (2011a,b), focusing on statistical image properties. We calculated saliency-based measures for details selected and details avoided during cropping. As expected, we found that selected details contain regions of higher saliency than avoided details on average. Moreover, the saliency center-of-mass was closer to the geometrical center in selected details than in avoided details. Results were confirmed in an eye-tracking study with the same dataset of images. Interestingly, the observed regularities in cropping behavior were less pronounced for experts than for non-experts. In summary, our results suggest that, during cropping, participants tend to select salient regions and place them in an image composition that is well-balanced with respect to the distribution of saliency. Our study contributes to the knowledge of perceptual bottom-up features that are germane to aesthetic decisions in photography and their variability in non-experts and experts. PMID:26793086
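The saliency-balance measure discussed above (distance of the saliency center-of-mass from the geometric center of the crop) can be sketched directly; the toy maps below stand in for a real saliency model's output.

```python
import numpy as np

def saliency_balance(saliency):
    """Distance from the saliency center-of-mass to the image center,
    normalized by half the image diagonal (0 = perfectly balanced)."""
    h, w = saliency.shape
    ys, xs = np.mgrid[0:h, 0:w]
    total = saliency.sum()
    com = np.array([(ys * saliency).sum(), (xs * saliency).sum()]) / total
    center = np.array([(h - 1) / 2, (w - 1) / 2])
    return np.linalg.norm(com - center) / (np.linalg.norm(center) or 1.0)

balanced = np.ones((9, 9))                  # uniform saliency: centered mass
skewed = np.zeros((9, 9))
skewed[0, 0] = 1.0                          # all saliency in one corner
print(saliency_balance(balanced), saliency_balance(skewed))  # 0.0 1.0
```

Under the study's interpretation, crops selected by participants would tend to score closer to 0 on a measure like this than crops they avoided.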

  13. The effect of visual salience on memory-based choices.

    Science.gov (United States)

    Pooresmaeili, Arezoo; Bach, Dominik R; Dolan, Raymond J

    2014-02-01

    Deciding whether a stimulus is the "same" as or "different" from a previously presented one involves integrating incoming sensory information, working memory, and perceptual decision making. Visual selective attention plays a crucial role in selecting the relevant information that informs a subsequent course of action. Previous studies have mainly investigated the role of visual attention during the encoding phase of working memory tasks. In this study, we investigate whether manipulating bottom-up attention by changing stimulus visual salience impacts later stages of memory-based decisions. In two experiments, we asked subjects to identify whether a stimulus had the same or a different feature to that of a memorized sample. We manipulated the visual salience of the test stimuli by varying a task-irrelevant feature contrast. Subjects chose a visually salient item more often when they looked for matching features and less often when they looked for a nonmatch. This pattern of results indicates that salient items are more likely to be identified as a match. We interpret the findings in terms of capacity limitations at a comparison stage, where a visually salient item is more likely to exhaust resources, leading it to be prematurely parsed as a match.

  14. A Comparative Prospective Study of Two Different Treatment Sequences, i.e. Bottom Up-Inside Out and Top Down-Outside In, in the Treatment of Panfacial Fractures.

    Science.gov (United States)

    Degala, Saikrishna; Sundar, S Shyam; Mamata, K S

    2015-12-01

    To compare the bottom-up inside-out sequence with the top-down outside-in sequence in the treatment of panfacial fractures, and to evaluate the outcome of these approaches. Data from 11 patients with panfacial fractures were prospectively analysed. Five cases were treated with the bottom-up approach and six with the top-down approach. All 11 patients were male (six in the top-down approach and five in the bottom-up approach), ranging in age from 24 to 50 years. All injuries were the result of road traffic accidents (RTA) (n = 11, 100 %). The final treatment outcome in the top-down approach was excellent in 3 (50 %), good in 1 (16 %) and fair in 2 (32 %) cases; in the bottom-up approach it was excellent in 3 (60 %) and fair in 2 (40 %) cases. Choice of the bottom-up inside-out or top-down outside-in sequence should be made according to the pattern of fractures and the preference of the surgeon. However, further controlled clinical trials and comparative studies with larger sample sizes would better evaluate the final clinical outcome of the individual techniques.

  15. Analysis of the Economic Impact of Large-Scale Deployment of Biomass Resources for Energy and Materials in the Netherlands. Appendix 1. Bottom-up Scenarios

    International Nuclear Information System (INIS)

    Hoefnagels, R.; Dornburg, V.; Faaij, A.; Banse, M.

    2009-03-01

    The Bio-based Raw Materials Platform (PGG), part of the Energy Transition in The Netherlands, commissioned the Agricultural Economics Research Institute (LEI) and the Copernicus Institute of Utrecht University to conduct research on the macro-economic impact of large scale deployment of biomass for energy and materials in the Netherlands. Two model approaches were applied based on a consistent set of scenario assumptions: a bottom-up study including techno-economic projections of fossil and bio-based conversion technologies and a top-down study including macro-economic modelling of (global) trade of biomass and fossil resources. The results of the top-down and bottom-up modelling work are reported separately. The results of the synthesis of the modelling work are presented in the main report. This report (part 1) presents scenarios for future biomass use for energy and materials, and analyses the consequences for energy supply, chemical production, costs and greenhouse gas (GHG) emissions with a bottom-up approach. The bottom-up projections, as presented in this report, form the basis for modelling work using the top-down macro-economic model (LEITAP) to assess the economic impact of substituting fossil-based energy carriers with biomass in the Netherlands. The results of the macro-economic modelling work, and the linkage between the results of the bottom-up and top-down work, will be presented in the top-down economic part and synthesis report of this study.

  16. Remediation Performance and Mechanism of Heavy Metals by a Bottom Up Activation and Extraction System Using Multiple Biochemical Materials.

    Science.gov (United States)

    Xiao, Kemeng; Li, Yunzhen; Sun, Yang; Liu, Ruyue; Li, Junjie; Zhao, Yun; Xu, Heng

    2017-09-13

    Soil contamination with heavy metals has caused serious environmental problems and increased the risks to humans and biota. Herein, we developed an effective bottom-up metal removal system based on the synergy between the activation of immobilized metal-resistant bacteria and the extraction of a bioaccumulator material (Stropharia rugosoannulata). In this system, the advantages of biochar produced at 400 °C and sodium alginate were integrated to immobilize bacteria. Optimized by response surface methodology, the biochar and bacterial suspension were mixed at a ratio of 1:20 (w:v) for 12 h, with 2.5% sodium alginate added to the mixture. Results demonstrated that the system significantly increased the proportion of acid-soluble Cd and Cu and improved the soil microecology (microbial counts, soil respiration, and enzyme activities). The maximum extractions of Cd and Cu were 8.79 and 77.92 mg kg−1, respectively. Moreover, details of the possible mechanistic insight into the metal removal are discussed, which indicate a positive correlation with the acetic acid-extractable metals and soil microecology. Meanwhile, the "dilution effect" in S. rugosoannulata probably plays an important role in the metal removal process. Furthermore, the metal-resistant bacteria in this system were successfully colonized, and the soil bacterial community was evaluated to understand the microbial diversity in metal-contaminated soil after remediation.

  17. Calculating systems-scale energy efficiency and net energy returns: A bottom-up matrix-based approach

    International Nuclear Information System (INIS)

    Brandt, Adam R.; Dale, Michael; Barnhart, Charles J.

    2013-01-01

    In this paper we expand the work of Brandt and Dale (2011) on ERRs (energy return ratios) such as EROI (energy return on investment). This paper describes a “bottom-up” mathematical formulation which uses matrix-based computations adapted from the LCA (life cycle assessment) literature. The framework allows multiple energy pathways and flexible inclusion of non-energy sectors. This framework is then used to define a variety of ERRs that measure the amount of energy supplied by an energy extraction and processing pathway compared to the amount of energy consumed in producing the energy. ERRs that were previously defined in the literature are cast in our framework for calculation and comparison. For illustration, our framework is applied to include oil production and processing and generation of electricity from PV (photovoltaic) systems. Results show that ERR values will decline as system boundaries expand to include more processes. NERs (net energy return ratios) tend to be lower than GERs (gross energy return ratios). External energy return ratios (such as net external energy return, or NEER (net external energy ratio)) tend to be higher than their equivalent total energy return ratios. - Highlights: • An improved bottom-up mathematical method for computing net energy return metrics is developed. • Our methodology allows arbitrary numbers of interacting processes acting as an energy system. • Our methodology allows much more specific and rigorous definition of energy return ratios such as EROI or NER
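The matrix-based formulation can be illustrated with a Leontief-style calculation of the kind used in LCA: given a matrix of inter-process energy requirements, the gross outputs needed to meet a final demand follow from the inverse of (I − A). The three-process system, all coefficients, and the simple gross-return ratio at the end are invented for illustration, not taken from the paper.

```python
import numpy as np

# A[i, j]: output of process i consumed per unit output of process j.
# Rows/columns: crude oil extraction, refining, PV electricity (hypothetical).
A = np.array([
    [0.00, 0.05, 0.02],
    [0.08, 0.00, 0.01],
    [0.01, 0.03, 0.00],
])
demand = np.array([0.0, 1.0, 0.0])   # deliver 1 unit of refined fuel

# Gross output of every process needed to meet the final demand,
# solving (I - A) x = demand (the Leontief total-requirements relation).
x = np.linalg.solve(np.eye(3) - A, demand)

# A simple gross energy return ratio: energy delivered over energy
# consumed internally by the pathway (direct plus indirect inputs).
internal_use = (A @ x).sum()
ger = demand.sum() / internal_use
print(x, ger)
```

Expanding the system boundary corresponds to adding rows and columns to A, which only adds terms to `internal_use`; this is the mechanical reason the paper's ratios decline as boundaries widen.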

  18. Bottom-up meets top-down: tailored raspberry-like Fe3O4-Pt nanocrystal superlattices.

    Science.gov (United States)

    Qiu, Fen; Vervuurt, René H J; Verheijen, Marcel A; Zaia, Edmond W; Creel, Erin B; Kim, Youngsang; Urban, Jeffrey J; Bol, Ageeth A

    2018-03-29

    Supported catalysts are widely used in industry and can be optimized by tuning the composition, chemical structure, and interface of the nanoparticle catalyst and oxide support. Here we combine, for the first time, a bottom-up colloidal synthesis method with a top-down atomic layer deposition (ALD) process to achieve raspberry-like Pt-decorated Fe3O4 (Fe3O4-Pt) nanoparticle superlattices. This nanocomposite ensures the precision of the catalyst/support interface, improving the catalytic efficiency of the Fe3O4-Pt nanocomposite system. The morphology of the hybrid nanocomposites resulting from different numbers of ALD cycles was monitored by scanning transmission electron microscopy, giving insight into the nucleation and growth mechanism of the ALD process. X-ray photoelectron spectroscopy studies confirm the anticipated electron transfer from Fe3O4 to Pt through the nanocomposite interface. Photocurrent measurements further suggest that Fe3O4 superlattices with controlled decoration of Pt have substantial promise for the energy-efficient photoelectrocatalytic oxygen evolution reaction. This work opens a new avenue for designing supported catalyst architectures via precisely controlled decoration of single-component superlattices with noble metals.

  19. Understanding agent-based models of financial markets: A bottom-up approach based on order parameters and phase diagrams

    Science.gov (United States)

    Lye, Ribin; Tan, James Peng Lung; Cheong, Siew Ann

    2012-11-01

    We describe a bottom-up framework, based on the identification of appropriate order parameters and the determination of phase diagrams, for understanding progressively refined agent-based models and simulations of financial markets. We illustrate this framework by starting with a deterministic toy model, whereby N independent traders buy and sell M stocks through an order book that acts as a clearing house. The price of a stock increases whenever it is bought and decreases whenever it is sold. Price changes are updated by the order book before the next transaction takes place. In this deterministic model, all traders base their buy decisions on a call utility function and their sell decisions on a put utility function. We then make the agent-based model more realistic by either having a fraction fb of traders buy a random stock on offer, or a fraction fs of traders sell a random stock in their portfolio. Based on our simulations, we find that it is possible to identify useful order parameters from the steady-state price distributions of all three models. Using these order parameters as a guide, we find three phases in the phase diagram of the deterministic model: (i) the dead market; (ii) the boom market; and (iii) the jammed market. Comparing the phase diagrams of the stochastic models against that of the deterministic model, we realize that the primary effect of stochasticity is to eliminate the dead market phase.
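The toy model described above can be sketched as follows. The specific "call/put utility" rules used here (buy the cheapest stock, sell the dearest), the multiplicative price step, and all parameter values are assumptions of this sketch, not the authors' specification; the `f_random_buy`/`f_random_sell` arguments mimic the fb and fs fractions of the stochastic variants.

```python
import random

def simulate(n_traders=50, n_stocks=5, steps=100, step_size=0.01,
             f_random_buy=0.0, f_random_sell=0.0, seed=0):
    """Order-book toy market: a stock's price rises when bought and
    falls when sold; the book updates prices between transactions."""
    rng = random.Random(seed)
    prices = [1.0] * n_stocks
    for _ in range(steps):
        for _trader in range(n_traders):
            if rng.random() < f_random_buy:        # stochastic variant (fb)
                buy = rng.randrange(n_stocks)
            else:                                  # deterministic "call" rule
                buy = prices.index(min(prices))
            prices[buy] *= 1.0 + step_size         # buying pushes price up
            if rng.random() < f_random_sell:       # stochastic variant (fs)
                sell = rng.randrange(n_stocks)
            else:                                  # deterministic "put" rule
                sell = prices.index(max(prices))
            prices[sell] *= 1.0 - step_size        # selling pushes price down
    return prices

# Steady-state price distributions like these are the raw material from
# which order parameters are extracted.
print(simulate())                      # deterministic model
print(simulate(f_random_buy=0.3))      # stochastic buyers
```

Running many such simulations across a grid of (fb, fs) and summarizing each steady-state price distribution with a scalar statistic is what produces the phase diagrams the abstract refers to.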

  20. Bioenergy decision-making of farms in Northern Finland. Combining the bottom-up and top-down perspectives

    International Nuclear Information System (INIS)

    Snaekin, Juha-Pekka; Muilu, Toivo; Pesola, Tuomo

    2010-01-01

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues to favor large-scale energy visions, additional investment support for agriculture will stay modest. To fully utilize the energy potential of farms, we analyze the farmers' decision-making environment. First, we present an overview of Finnish energy policy and economy and their effect on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided into three groups - investors, entrepreneurs and hobbyists - that have different levels of commitment to their energy businesses. This further stresses the importance of getting quality business services from numerous service providers. (author)

  1. Bottom-up control of consumers leads to top-down indirect facilitation of invasive annual herbs in semiarid Chile.

    Science.gov (United States)

    Madrigal, Jaime; Kelt, Douglas A; Meserve, Peter L; Gutierrez, Julio R; Squeo, Francisco A

    2011-02-01

    The abundance of exotic plants is thought to be limited by competition with resident species (including plants and generalist herbivores). In contrast, observations in semiarid Chile suggest that a native generalist rodent, the degu (Octodon degus), may be facilitating the expansion of exotic annual plants. We tested this hypothesis with a 20-year data set from a World Biosphere Reserve in mediterranean Chile. In this semiarid environment, rainfall varies annually and dramatically influences cover by both native and exotic annual plants; degu population density affects the composition and cover of exotic and native annual plants. In low-rainfall years, cover of both native and exotic herbs is extremely low. Higher levels of precipitation result in proportional increases in cover of all annual plants (exotic and native species), leading in turn to increases in degu population densities, at which point they impact native herbs in proportion to their greater cover, indirectly favoring the expansion of exotic plants. We propose that bottom-up control of consumers at our site results in top-down indirect facilitation of invasive annual herbs, and that this pattern may be general to other semiarid ecosystems.

  2. Simple rules describe bottom-up and top-down control in food webs with alternative energy pathways.

    Science.gov (United States)

    Wollrab, Sabine; Diehl, Sebastian; De Roos, André M

    2012-09-01

    Many human influences on the world's ecosystems have their largest direct impacts at either the top or the bottom of the food web. To predict their ecosystem-wide consequences we must understand how these impacts propagate. A long-standing, but so far elusive, problem in this endeavour is how to reduce food web complexity to a mathematically tractable, but empirically relevant system. Simplification to main energy channels linking primary producers to top consumers has been recently advocated. Following this approach, we propose a general framework for the analysis of bottom-up and top-down forcing of ecosystems by reducing food webs to two energy pathways originating from a limiting resource shared by competing guilds of primary producers (e.g. edible vs. defended plants). Exploring dynamical models of such webs we find that their equilibrium responses to nutrient enrichment and top consumer harvesting are determined by only two easily measurable topological properties: the lengths of the component food chains (odd-odd, odd-even, or even-even) and presence vs. absence of a generalist top consumer reconnecting the two pathways (yielding looped vs. branched webs). Many results generalise to other looped or branched web structures and the model can be easily adapted to include a detrital pathway. © 2012 Blackwell Publishing Ltd/CNRS.
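The chain-length parity rule the authors build on can be encoded in one function: under nutrient enrichment, the top level of a simple food chain and every level an even number of steps below it increase at equilibrium, while the intervening levels are held in check by their consumers. This sketch covers only the single-chain textbook case, not the paper's looped two-pathway webs or harvesting responses.

```python
def enrichment_response(chain_length):
    """Sign of each trophic level's equilibrium response to enrichment
    in a simple food chain (level 1 = producer, chain_length = top):
    '+' for levels an even number of steps below the top, '0' otherwise."""
    return {level: '+' if (chain_length - level) % 2 == 0 else '0'
            for level in range(1, chain_length + 1)}

# In a three-level chain, producers and top predators benefit while
# herbivores are held constant by predation.
print(enrichment_response(3))  # {1: '+', 2: '0', 3: '+'}
```

The paper's contribution is to show that for two coupled chains, the analogous equilibrium responses are still determined by the parity combination of the two chain lengths (odd-odd, odd-even, even-even) plus the looped-versus-branched topology.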

  3. Recent patterns of methane and nitrous oxide fluxes in the terrestrial biosphere: The bottom-up approach (Invited)

    Science.gov (United States)

    Tian, H.

    2013-12-01

    Accurately estimating methane and nitrous oxide emissions from terrestrial ecosystems is critical for resolving global budgets of these greenhouse gases (GHGs) and continuing to mitigate climate warming. In this study, we use a bottom-up approach to estimate annual budgets of both methane and nitrous oxide in global terrestrial ecosystem during 1981-2010 and analyze the underlying mechanisms responsible for spatial and temporal variations in these GHGs. Both methane and nitrous oxide emissions significantly increased from 1981 to 2010, primarily owing to increased air temperature, nitrogen fertilizer use, and land use change. Methane and nitrous oxide emissions increased the fastest in Asia due to the more prominent environmental changes compared to other continents. The cooling effects by carbon dioxide sink in the terrestrial biosphere might be completely offset by increasing methane and nitrous oxide emissions, resulting in a positive global warming potential. Asia and South America were the largest contributors to increasing global warming potential. This study suggested that current management practices might not be effective enough to reduce future global warming.

  4. A bottom-up, scientist-based initiative for the communication of climate sciences with the general public

    Science.gov (United States)

    Bourqui, Michel; Bolduc, Cassandra; Paul, Charbonneau; Marie, Charrière; Daniel, Hill; Angelica, Lopez; Enrique, Loubet; Philippe, Roy; Barbara, Winter

    2015-04-01

    This talk introduces a new, scientist-initiated online platform whose aim is to contribute to making climate sciences public knowledge. It takes a unique bottom-up approach, strictly founded on individual participation, high scientific standards and independence. The main purpose is to build an open-access, multilingual and peer-reviewed journal publishing short climate articles in non-scientific language. The targeted public includes journalists, teachers, students, local politicians, economists, members of the agriculture sector, and any other citizens from around the world with an interest in climate sciences. This journal is meant to offer a simple and direct channel for scientists wishing to disseminate their research to the general public. A high standard of climate articles is ensured through: a) requiring that the main author is an active climate scientist, and b) an innovative peer-review process involving scientific and non-scientific referees with distinct roles. The platform fosters the direct participation of non-scientists through co-authoring, peer-reviewing, and language translation. It furthermore engages the general public in scientific inquiry by allowing non-scientists to invite manuscripts to be written on topics of their concern. The platform is currently being developed by a community of scientists and non-scientists. In this talk, I will present the basic ideas behind this new online platform, its current state and the plans for the near future. The beta version of the platform is available at: http://www.climateonline.bourquiconsulting.ch

  5. Beyond Defining the Smart City. Meeting Top-Down and Bottom-Up Approaches in the Middle

    Directory of Open Access Journals (Sweden)

    Jonas Breuer

    2014-05-01

    Full Text Available This paper aims to better frame the discussion and the various, divergent operationalisations and interpretations of the Smart City concept. We start by explicating top-down approaches to the Smart City, followed by what purely bottom-up initiatives can look like. We provide a clear overview of stakeholders’ different viewpoints on the city of tomorrow. Particularly the consequences and potential impacts of these differing interpretations and approaches should be of specific interest to researchers, policy makers, city administrations, private actors and anyone involved and concerned with life in cities. Therefore the goal of this article is not so much answering the question of what the Smart City is, but rather what the concept can mean for different stakeholders as well as the consequences of their interpretation. We do this by assembling an eclectic overview, bringing together definitions, examples and operationalisations from academia, policy and industry as well as identifying major trends and approaches to realizing the Smart City. We add to the debate by proposing a different approach that starts from the collective, collaboration and context when researching Smart City initiatives.

  6. Bottom-up priority setting revised. A second evaluation of an institutional intervention in a Swedish health care organisation.

    Science.gov (United States)

    Waldau, Susanne

    2015-09-01

    Transparent priority setting in health care based on specific ethical principles has been requested by the Swedish Parliament since 1997, but implementation has been limited. In this case, transparent priority setting was performed for a second time and engaged an entire health care organisation. The objectives were to refine a bottom-up priority setting process, reach a political decision on service limits to make reallocation towards higher-prioritised services possible, and raise systems knowledge. An action research approach was chosen. The national model for priority setting was used with the addition of the dimensions costs, volumes, gender distribution and feasibility. The intervention included a three-step process and specific procedures for each step, which were created, revised and evaluated regarding factual and functional aspects. Evaluation methods included analyses of documents, recordings and surveys. Vertical and horizontal priority setting occurred and resources were reallocated. Participants' attitudes remained positive, however less so than in the first priority setting round. Identifying low-priority services was perceived as difficult, causing resentment and strategic behaviour. The horizontal stage served to raise the quality of the knowledge base, level out differences in the ranking of services and raise systems knowledge. Existing health care management systems do not meet institutional requirements for transparent priority setting. Introducing transparent priority setting constitutes a complex institutional reform, which needs to be driven by management/administration. Strong managerial commitment is required. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  7. The Impact of Bottom-Up Parking Information Provision in a Real-Life Context: The Case of Antwerp

    Directory of Open Access Journals (Sweden)

    Geert Tasseron

    2017-01-01

    Full Text Available A number of studies have analyzed the possible impacts of bottom-up parking information or parking reservation systems on parking dynamics in abstract simulation environments. In this paper, we take these efforts one step further by investigating the impacts of these systems in a real-life context: the center of the city of Antwerp, Belgium. In our simulation, we assume that all on-street and off-street parking places are equipped with technology able to transmit their occupancy status to so-called smart cars, which can receive information and reserve a parking place. We employ PARKAGENT, an agent-based simulation model, to simulate the behavior of smart and regular cars. We obtain detailed data on parking demand from FEATHERS, an activity-based transport model. The simulation results show that parking information and reservation hardly impact search time but do reduce walking distance for smart cars, leading to a reduction in total parking time, that is, the sum of search time and walking time. Reductions in search time occur only in zones with high occupancy rates, while a drop in walking distance is especially observed in low occupancy areas. Societal benefits of parking information and reservation are limited, because of the low impact on search time and the possible negative health effects of reduced walking distance.

  8. Integration of bottom-up and top-down models for the energy system. A practical case for Denmark

    International Nuclear Information System (INIS)

    Jacobsen, H.; Morthorst, P.E.; Nielsen, L.; Stephensen, P.

    1996-07-01

    The main objective of the project was to integrate the Danish macro economic model ADAM with elements from the energy simulation model BRUS, developed at Risoe. The project has been carried out by Risoe National Laboratory with assistance from the Ministry of Finance. A theoretical part focuses on the differences between top-down and bottom-up modelling of the energy-economy interaction. A combined hybrid model seems a relevant alternative to the two traditional approaches. The hybrid model developed is called Hybris and includes models for: supply of electricity and heat, household demand for electricity, and household demand for heat. These three models interact in an iterative procedure with the macro economic model ADAM through a number of links. A reference case as well as a number of scenarios illustrating the capabilities of the model has been set up. Hybris is a simulation model capable of analyzing combined CO2 reduction initiatives, such as regulation of the energy supply system and a CO2 tax, in an integrated and consistent way. (au) 32 tabs., 98 ills., 55 refs
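
The iterative coupling of a macro economic model with an energy model, as described in this record, can be sketched as a fixed-point iteration in which the two models exchange link variables until they agree. The two toy linear "models" below are invented stand-ins for illustration only; they are not ADAM, BRUS, or Hybris.

```python
# Hedged sketch of an iterative top-down/bottom-up model coupling.
# Both model functions are illustrative placeholders.

def macro_model(energy_price):
    """Toy macro model: economic activity falls as the energy price rises."""
    return 100.0 - 20.0 * energy_price

def energy_model(activity):
    """Toy energy model: the energy price rises with activity-driven demand."""
    return 0.5 + 0.01 * activity

# Iterate until the exchanged link variable (the price) stops changing.
price = 1.0
for _ in range(100):
    activity = macro_model(price)
    new_price = energy_model(activity)
    if abs(new_price - price) < 1e-9:
        break
    price = new_price

print(round(activity, 2), round(price, 2))  # converged activity and price
```

Because the toy feedback loop is a contraction (the combined slope has magnitude 0.2), the iteration converges quickly; a real hybrid coupling would need to check convergence for its own link equations.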

  9. Estimation of Emissions from Sugarcane Field Burning in Thailand Using Bottom-Up Country-Specific Activity Data

    Directory of Open Access Journals (Sweden)

    Wilaiwan Sornpoon

    2014-09-01

    Full Text Available Open burning in sugarcane fields is recognized as a major source of air pollution. However, the assessment of its emission intensity in many regions of the world still lacks information, especially regarding country-specific activity data, including biomass fuel load and combustion factor. A site survey was conducted covering 13 sugarcane plantations subject to different farm management practices and climatic conditions. The results showed that pre-harvest and post-harvest burning are the two main practices followed in Thailand. In 2012, the total production of sugarcane biomass fuel, i.e., dead, dry and fresh leaves, amounted to 10.15 million tonnes, equivalent to a fuel density of 0.79 kg∙m−2. The average combustion factor for the pre-harvest and post-harvest burning systems was determined to be 0.64 and 0.83, respectively. Emissions from sugarcane field burning were estimated using the bottom-up country-specific values from the site survey of this study, and the results were compared with those obtained using default values from the 2006 IPCC Guidelines. The comparison showed that the use of default values leads to underestimating the overall emissions by up to 30%, because emissions from post-harvest burning are not accounted for even though it is the second most common practice followed in Thailand.
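
The bottom-up calculation behind this kind of inventory is emission = burned area × fuel load × combustion factor × emission factor. The fuel density (0.79 kg∙m−2) and the two combustion factors (0.64 pre-harvest, 0.83 post-harvest) come from the abstract; the burned area and the PM2.5 emission factor below are illustrative placeholders, not survey data.

```python
# Hedged sketch of a bottom-up field-burning emission estimate.
# Fuel density and combustion factors are from the abstract; the
# area and emission factor are invented for illustration.

def burning_emissions(area_m2, fuel_density_kg_m2, combustion_factor, ef_g_per_kg):
    """Emission (g) = area x fuel load x combustion factor x emission factor."""
    fuel_burned_kg = area_m2 * fuel_density_kg_m2 * combustion_factor
    return fuel_burned_kg * ef_g_per_kg

# Illustrative inputs: 1 hectare burned, PM2.5 emission factor 5 g per kg dry fuel.
pre = burning_emissions(10_000, 0.79, 0.64, 5.0)
post = burning_emissions(10_000, 0.79, 0.83, 5.0)
print(f"pre-harvest:  {pre / 1000:.2f} kg PM2.5")
print(f"post-harvest: {post / 1000:.2f} kg PM2.5")
```

The higher post-harvest combustion factor is why omitting that practice, as the abstract notes, biases the total downward.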

  10. Conservative and dissipative force field for simulation of coarse-grained alkane molecules: A bottom-up approach

    Energy Technology Data Exchange (ETDEWEB)

    Trément, Sébastien; Rousseau, Bernard, E-mail: bernard.rousseau@u-psud.fr [Laboratoire de Chimie-Physique, UMR 8000 CNRS, Université Paris-Sud, Orsay (France); Schnell, Benoît; Petitjean, Laurent; Couty, Marc [Manufacture Française des Pneumatiques MICHELIN, Centre de Ladoux, 23 place des Carmes, 63000 Clermont-Ferrand (France)

    2014-04-07

    We apply operational procedures available in the literature to the construction of coarse-grained conservative and friction forces for use in dissipative particle dynamics (DPD) simulations. The full procedure relies on a bottom-up approach: large molecular dynamics trajectories of n-pentane and n-decane modeled with an anisotropic united atom model serve as input for the force field generation. As a consequence, the coarse-grained model is expected to reproduce at least semi-quantitatively the structural and dynamical properties of the underlying atomistic model. Two different coarse-graining levels are studied, corresponding to five and ten carbon atoms per DPD bead. The influence of the coarse-graining level on the generated force field contributions, namely, the conservative and the friction parts, is discussed. It is shown that the coarse-grained model of n-pentane correctly reproduces the self-diffusion and viscosity coefficients of real n-pentane, while the fully coarse-grained model for n-decane at ambient temperature over-predicts diffusion by a factor of 2. However, when the n-pentane coarse-grained model is used as a building block for larger molecules (e.g., n-decane as a two-blob model), much better agreement with experimental data is obtained, suggesting that the force field constructed is transferable to large macro-molecular systems.

  11. Comparison of the top-down and bottom-up approach to fabricate nanowire-based Silicon/Germanium heterostructures

    International Nuclear Information System (INIS)

    Wolfsteller, A.; Geyer, N.; Nguyen-Duc, T.-K.; Das Kanungo, P.; Zakharov, N.D.; Reiche, M.; Erfurth, W.; Blumtritt, H.; Werner, P.; Goesele, U.

    2010-01-01

    Silicon nanowires (NWs) and vertical nanowire-based Si/Ge heterostructures are expected to be building blocks for future devices, e.g. field-effect transistors or thermoelectric elements. In principle, two approaches can be applied to synthesise these NWs: the 'bottom-up' and the 'top-down' approach. The most common method for the former is the vapour-liquid-solid (VLS) mechanism, which can also be applied to grow NWs by molecular beam epitaxy (MBE). Although MBE allows precise growth control under highly reproducible conditions, the general nature of the growth process via a eutectic droplet prevents the synthesis of heterostructures with sharp interfaces and high Ge concentrations. We compare the VLS NW growth with two different top-down methods. The first is a combination of colloidal lithography and metal-assisted wet chemical etching, which is an inexpensive and fast method and results in large arrays of homogeneous Si NWs with adjustable diameters down to 50 nm. The second top-down method combines the growth of Si/Ge superlattices by MBE with electron beam lithography and reactive ion etching. Again, large and homogeneous arrays of NWs were created, this time with a diameter of 40 nm and the Si/Ge superlattice inside.

  12. Discrepancies and Uncertainties in Bottom-up Gridded Inventories of Livestock Methane Emissions for the Contiguous United States

    Science.gov (United States)

    Randles, C. A.; Hristov, A. N.; Harper, M.; Meinen, R.; Day, R.; Lopes, J.; Ott, T.; Venkatesh, A.

    2017-12-01

    In this analysis we used a spatially-explicit, bottom-up approach, based on animal inventories, feed intake, and feed intake-based emission factors, to estimate county-level enteric (cattle) and manure (cattle, swine, and poultry) livestock methane emissions for the contiguous United States. Combined enteric and manure emissions were highest for counties in California's Central Valley. Overall, this analysis yielded total livestock methane emissions for 2012 (8,916 Gg/yr; lower and upper bounds of 6,423 and 11,840 Gg/yr, respectively) that are comparable to the current USEPA estimates for 2012 (9,295 Gg/yr) and to estimates from the global gridded Emission Database for Global Atmospheric Research (EDGAR) inventory (8,728 Gg/yr), used previously in a number of top-down studies. However, the spatial distribution of emissions developed in this analysis differed significantly from that of EDGAR. As an example, methane emissions from livestock in Texas and California (the highest contributors to the national total) in this study were 36% lower and 100% higher, respectively, than the estimates by EDGAR. The spatial distribution of emissions in gridded inventories (e.g., EDGAR) likely strongly impacts the conclusions of top-down approaches that use them, especially in the source attribution of resulting (posterior) emissions, and hence conclusions from such studies should be interpreted with caution.
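
The inventory logic this record describes reduces, per county, to head counts multiplied by intake-based emission factors and summed over animal categories. The sketch below follows that shape; every number in it is an illustrative placeholder, not a value from the study.

```python
# Hedged sketch of a bottom-up, county-level livestock methane inventory.
# Emission factors (kg CH4 per head per year) and head counts are invented
# placeholders, not the study's feed-intake-based values.

EF_KG_CH4_PER_HEAD_YR = {"dairy_cattle": 120.0, "beef_cattle": 60.0, "swine": 1.5}

def county_methane_kg(head_counts):
    """Sum head count x per-head emission factor over animal categories."""
    return sum(EF_KG_CH4_PER_HEAD_YR[cat] * n for cat, n in head_counts.items())

counties = {
    "county_a": {"dairy_cattle": 10_000, "swine": 50_000},
    "county_b": {"beef_cattle": 80_000},
}

total_gg = sum(county_methane_kg(h) for h in counties.values()) / 1e6  # kg -> Gg
print(f"{total_gg:.3f} Gg CH4/yr")
```

Keeping the estimate at county resolution, rather than collapsing it to a national total, is what allows the spatial comparison against gridded inventories such as EDGAR.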

  13. 2D FT-ICR MS of Calmodulin: A Top-Down and Bottom-Up Approach.

    Science.gov (United States)

    Floris, Federico; van Agthoven, Maria; Chiron, Lionel; Soulby, Andrew J; Wootton, Christopher A; Lam, Yuko P Y; Barrow, Mark P; Delsuc, Marc-André; O'Connor, Peter B

    2016-09-01

    Two-dimensional Fourier transform ion cyclotron resonance mass spectrometry (2D FT-ICR MS) allows data-independent fragmentation of all ions in a sample and correlation of fragment ions to their precursors through the modulation of precursor ion cyclotron radii prior to fragmentation. Previous results show that implementation of 2D FT-ICR MS with infrared multi-photon dissociation (IRMPD) and electron capture dissociation (ECD) has turned this method into a useful analytical tool. In this work, IRMPD tandem mass spectrometry of calmodulin (CaM) has been performed both in one-dimensional and two-dimensional FT-ICR MS using top-down and bottom-up approaches. 2D IRMPD FT-ICR MS is used to achieve extensive inter-residue bond cleavage and assignment for CaM, using its unique features for fragment identification in an experiment that consumes less time and sample than sequential MS/MS experiments.

  14. Sensitivity quantification of airport concrete pavement stress responses associated with top-down and bottom-up cracking

    Directory of Open Access Journals (Sweden)

    Adel Rezaei-Tarahomi

    2017-09-01

    Full Text Available The Federal Aviation Administration’s (FAA’s) rigid pavement design standard employs the NIKE3D-FAA software to compute critical pavement responses of concrete airport pavement structures. NIKE3D-FAA is a modification of the original NIKE3D three-dimensional finite element analysis program developed by the Lawrence Livermore National Laboratory (LLNL) of the U.S. Department of Energy, and is currently used in the FAA’s FAARFIELD program. This study evaluated the sensitivity of NIKE3D-FAA rigid pavement responses with respect to top-down and bottom-up cracking. The analysis was conducted by positioning a Boeing 777-300ER (B777-300ER) aircraft at different locations (interior, corner, and edge of slab) as the baseline while varying other NIKE3D-FAA inputs, including rigid pavement geometric features, mechanical properties of paving and foundation materials, and the equivalent temperature gradient and thermal coefficient of Portland Cement Concrete (PCC) layers. Several sensitivity charts were developed by examining the sensitivity of critical pavement responses to each input variation. Sensitivity evaluations were performed using a normalized sensitivity index (NSI) as the quantitative metric. Using this sensitivity evaluation, the most significant NIKE3D-FAA input parameters were identified for generating an effective synthetic database that will lower the computational cost of future modeling developments. Keywords: Sensitivity analysis, Airfield concrete pavement, Finite element analysis, Top-down cracking
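
A normalized sensitivity index of the kind this record uses is commonly computed as the relative change in a response divided by the relative change in the input. The exact NSI definition in the study may differ, and the numbers below are invented for illustration, not pavement data.

```python
# Hedged sketch of a normalized sensitivity index (NSI): the percent change
# in a response per percent change in an input. Inputs below are placeholders.

def normalized_sensitivity_index(x_base, x_pert, y_base, y_pert):
    """Relative change in response divided by relative change in input."""
    return ((y_pert - y_base) / y_base) / ((x_pert - x_base) / x_base)

# Example: a 10% increase in an input (300 -> 330) that reduces a critical
# stress response by 20% (2.5 -> 2.0) gives NSI = -2, a highly sensitive input.
nsi = normalized_sensitivity_index(300.0, 330.0, 2.5, 2.0)
print(round(nsi, 2))
```

Because both changes are normalized, NSI values can be compared across inputs with different units, which is what makes the index useful for ranking parameters.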

  15. A bottom-up approach to identifying the maximum operational adaptive capacity of water resource systems to a changing climate

    Science.gov (United States)

    Culley, S.; Noble, S.; Yates, A.; Timbs, M.; Westra, S.; Maier, H. R.; Giuliani, M.; Castelletti, A.

    2016-09-01

    Many water resource systems have been designed assuming that the statistical characteristics of future inflows are similar to those of the historical record. This assumption is no longer valid due to large-scale changes in the global climate, potentially causing declines in water resource system performance, or even complete system failure. Upgrading system infrastructure to cope with climate change can require substantial financial outlay, so it might be preferable to optimize existing system performance when possible. This paper builds on decision scaling theory by proposing a bottom-up approach to designing optimal feedback control policies for a water system exposed to a changing climate. This approach not only describes optimal operational policies for a range of potential climatic changes but also enables an assessment of a system's upper limit of its operational adaptive capacity, beyond which upgrades to infrastructure become unavoidable. The approach is illustrated using the Lake Como system in Northern Italy—a regulated system with a complex relationship between climate and system performance. By optimizing system operation under different hydrometeorological states, it is shown that the system can continue to meet its minimum performance requirements for more than three times as many states as it can under current operations. Importantly, a single management policy, no matter how robust, cannot fully utilize existing infrastructure as effectively as an ensemble of flexible management policies that are updated as the climate changes.

  16. The Challenges of Bottom-up Approach of Natural-Social Integration in China Highland Pasture Management

    Science.gov (United States)

    Ai, Likun

    2017-04-01

    Pasture land covers two fifths of the total Chinese land area, mainly distributed across the western highlands of Inner Mongolia, Xinjiang, Tibet, Qinghai, Gansu and Sichuan Provinces. China's pasture land not only provides food resources to regional people, but also plays an important role in highland ecosystem services and biodiversity. With global warming and intensified grazing activity, 90% of China's pasture land faces the threat of land degradation. Since the mid-1990s, the Chinese government has released a series of pasture land conservation policies to prevent further environmental degradation. At the same time, many pasture ecosystem and environmental change studies have been supported by national and regional funding agencies. In this study, by monitoring and investigating this top-down approach to pasture land research and policy making, we aim to identify the gaps and problems in current research and policy making on China's pasture land conservation, focusing especially on the possibility of establishing a bottom-up approach of natural-social science integration to support pasture land conservation and sustainable pasture land management in highland China.

  17. Single, aligned carbon nanotubes in 3D nanoscale architectures enabled by top-down and bottom-up manufacturable processes

    International Nuclear Information System (INIS)

    Kaul, Anupama B; Megerian, Krikor G; Von Allmen, Paul; Baron, Richard L

    2009-01-01

    We have developed manufacturable approaches for forming single, vertically aligned carbon nanotubes, where the tubes are centered precisely, and placed within a few hundred nm of 1-1.5 μm deep trenches. These wafer-scale approaches were enabled by using chemically amplified resists and high density, low pressure plasma etching techniques to form the 3D nanoscale architectures. The tube growth was performed using dc plasma-enhanced chemical vapor deposition (PECVD), and the materials used in the pre-fabricated 3D architectures were chemically and structurally compatible with the high temperature (700 deg. C) PECVD synthesis of our tubes, in an ammonia and acetylene ambient. Such scalable, high throughput top-down fabrication processes, when integrated with the bottom-up tube synthesis techniques, should accelerate the development of plasma grown tubes for a wide variety of applications in electronics, such as nanoelectromechanical systems, interconnects, field emitters and sensors. Tube characteristics were also engineered to some extent, by adjusting the Ni catalyst thickness, as well as the pressure and plasma power during growth.

  18. Top-Down-Assisted Bottom-Up Method for Homologous Protein Sequencing: Hemoglobin from 33 Bird Species

    Science.gov (United States)

    Song, Yang; Laskay, Ünige A.; Vilcins, Inger-Marie E.; Barbour, Alan G.; Wysocki, Vicki H.

    2015-11-01

    Ticks are vectors for disease transmission because they are indiscriminate in their feeding on multiple vertebrate hosts, transmitting pathogens between their hosts. Identifying the hosts on which ticks have fed is important for disease prevention and intervention. We have previously shown that hemoglobin (Hb) remnants from a host on which a tick fed can be used to reveal the host's identity. For the present research, blood was collected from 33 bird species that are common in the U.S. as hosts for ticks but that have unknown Hb sequences. A top-down-assisted bottom-up mass spectrometry approach with a customized searching database, based on variability in known bird hemoglobin sequences, has been devised to facilitate fast and complete sequencing of hemoglobin from birds with unknown sequences. These hemoglobin sequences will be added to a hemoglobin database and used for tick host identification. The general approach has the potential to sequence any set of homologous proteins completely in a rapid manner.

  19. Top-down expectancy versus bottom-up guidance in search for known color-form conjunctions.

    Science.gov (United States)

    Anderson, Giles M; Humphreys, Glyn W

    2015-11-01

    We assessed the effects of pairing a target object with its familiar color on eye movements in visual search, under conditions where the familiar color could or could not be predicted. In Experiment 1 participants searched for a yellow- or purple-colored corn target amongst aubergine distractors, half of which were yellow and half purple. Search was more efficient when the color of the target was familiar and early eye movements more likely to be directed to targets carrying a familiar color than an unfamiliar color. Experiment 2 introduced cues which predicted the target color at 80 % validity. Cue validity did not affect whether early fixations were to the target. Invalid cues, however, disrupted search efficiency for targets in an unfamiliar color whilst there was little cost to search efficiency for targets in their familiar color. These results generalized across items with different colors (Experiment 3). The data are consistent with early processes in selection being automatically modulated in a bottom-up manner to targets in their familiar color, even when expectancies are set for other colors.

  20. A bottom-up partnership of Andean institutions to improve hydrological interventions using a participatory network of research basins

    Science.gov (United States)

    Buytaert, W.; Ochoa-Tocachi, B. F.; De Bièvre, B.

    2017-12-01

    Many watershed interventions in remote, data-scarce areas respond to information gaps by extrapolating conventional approaches based on very limited local evidence. However, most interventions, including conservation strategies and adaptation measures, have not been evaluated properly for their hydrological benefits. This is particularly the case for the Andean region, where complex climatic and hydrological characteristics, combined with very dynamic anthropogenic disturbance, require better monitoring. Here, we present the experience of a partnership of academic and non-governmental institutions that pioneered participatory hydrological monitoring in the Andes. Established in 2009, the Regional Initiative for Hydrological Monitoring of Andean Ecosystems (iMHEA) is a bottom-up initiative that complements the national monitoring networks and more conventional scientific observatories. Using a design based on a trading-space-for-time approach, over 30 paired catchments with a variety of watershed interventions are currently being monitored by 18 local stakeholders in 15 sites in the tropical Andes. Pooling these data into a hydrological impact model allowed the consortium to make more robust predictions about the effectiveness of catchment interventions to improve water resources management and to reduce risks. The collaborative nature of iMHEA has several strengths, the most important of which are the ability to: (i) standardize monitoring practices; (ii) ensure quality and technical support; (iii) share responsibility for monitoring activities; (iv) obtain project co-funding and complementarity; and (v) promote engagement between decision makers and scientists. As a result, this network has started to deliver useful information to multi-scale and multi-stakeholder decision making arenas. For example, in the context of growing investment in hydrological ecosystem services in Peru, the sites provide a new generation of hydrological information that allows for evidence

  1. A Facile Bottom-Up Approach to Construct Hybrid Flexible Cathode Scaffold for High-Performance Lithium-Sulfur Batteries.

    Science.gov (United States)

    Ghosh, Arnab; Manjunatha, Revanasiddappa; Kumar, Rajat; Mitra, Sagar

    2016-12-14

    Lithium-sulfur batteries mostly suffer from low utilization of sulfur, poor cycle life, and low rate performance. The prime factors affecting performance are the enormous volume change of the electrode, soluble intermediate product formation, and the poor electronic and ionic conductivity of S and of the end discharge products (i.e., Li2S2 and Li2S). An attractive way to mitigate these challenges lies in the fabrication of a sulfur nanocomposite electrode consisting of different nanoparticles with distinct properties of lithium storage capability, mechanical reinforcement, and ionic as well as electronic conductivity, leading to a mechanically robust and mixed-conductive (ionic and electronic) sulfur electrode. Herein, we report a novel bottom-up approach to synthesize a unique freestanding, flexible cathode scaffold made of porous reduced graphene oxide, nanosized sulfur, and Mn3O4 nanoparticles, all three-dimensionally interconnected by a hybrid polyaniline/sodium alginate (PANI-SA) matrix, with each component serving an individual purpose. A capacity of 1098 mAh g−1 is achieved against lithium after 200 cycles at a current rate of 2 A g−1, retaining 97.6% of the initial capacity at the same current rate, demonstrating the extreme stability and cycling performance of this electrode. Interestingly, at the higher current density of 5 A g−1, the composite electrode exhibited an initial capacity of 1015 mAh g−1 and retained 71% of the original capacity after 500 cycles. An in situ Raman study confirms the polysulfide absorption capability of Mn3O4. This work provides a new strategy to design a mechanically robust, mixed-conductive nanocomposite electrode for high-performance lithium-sulfur batteries, a strategy that can also be used to develop flexible large power storage devices.

  2. Bottom-up fabrication of artery-mimicking tubular co-cultures in collagen-based microchannel scaffolds.

    Science.gov (United States)

    Tan, A; Fujisawa, K; Yukawa, Y; Matsunaga, Y T

    2016-10-20

    We developed a robust bottom-up approach to construct open-ended, tubular co-culture constructs that simulate the human vascular morphology and microenvironment. By design, these three-dimensional artificial vessels mimic the basic architecture of an artery: a collagen-rich extracellular matrix (as the tunica externa), smooth muscle cells (SMCs) (as the tunica media), and an endothelial cell (EC) lining (as the tunica interna). A versatile needle-based fabrication technique was employed to achieve controllable arterial layouts within a PDMS-hosted collagen microchannel scaffold (330 ± 10 μm in diameter): (direct co-culture) a SMC/EC bilayer to follow the structure of an arteriole-like segment; and (encapsulated co-culture) a lateral SMC multilayer covered by an EC monolayer lining to simulate the architecture of a larger artery. Optical and fluorescence microscopy images clearly evidenced the progressive cell elongation and sprouting behavior of SMCs and ECs along the collagen gel contour and within the gel matrix under static co-culture conditions. The progressive cell growth patterns effectively led to the formation of a tubular co-culture with an internal endothelial lining expressing prominent CD31 (cluster of differentiation 31) intercellular junction markers. During a 4-day static maturation period, the artery constructs showed modest alteration in the luminal diameters (i.e. less than 10% changes from the initial measurements). This argues in favor of stable and predictable arterial architecture achieved via the proposed fabrication protocols. Both co-culture models showed a high glucose metabolic rate during the initial proliferation phase, followed by a temporary quiescent (and thus, mature) stage. These proof-of-concept models with a controllable architecture create an important foundation for advanced vessel manipulations such as the integration of relevant physiological functionality or remodeling into a vascular disease-mimicking tissue.

  3. Systematic Correlation Matrix Evaluation (SCoMaE) - a bottom-up, science-led approach to identifying indicators

    Science.gov (United States)

    Mengis, Nadine; Keller, David P.; Oschlies, Andreas

    2018-01-01

    This study introduces the Systematic Correlation Matrix Evaluation (SCoMaE) method, a bottom-up approach which combines expert judgment and statistical information to systematically select transparent, nonredundant indicators for a comprehensive assessment of the state of the Earth system. The method consists of two basic steps: (1) the calculation of a correlation matrix among variables relevant for a given research question and (2) the systematic evaluation of the matrix, to identify clusters of variables with similar behavior and respective mutually independent indicators. Optional further analysis steps include (3) the interpretation of the identified clusters, enabling a learning effect from the selection of indicators, (4) testing the robustness of identified clusters with respect to changes in forcing or boundary conditions, (5) enabling a comparative assessment of varying scenarios by constructing and evaluating a common correlation matrix, and (6) the inclusion of expert judgment, for example to prescribe indicators, allowing for considerations other than statistical consistency. The example application of the SCoMaE method to Earth system model output forced by different CO2 emission scenarios reveals the necessity of reevaluating indicators identified in a historical scenario simulation for an accurate assessment of an intermediate-high, as well as a business-as-usual, climate change scenario simulation. This necessity arises from changes in prevailing correlations in the Earth system under varying climate forcing. For a comparative assessment of the three climate change scenarios, we construct and evaluate a common correlation matrix, in which we identify robust correlations between variables across the three considered scenarios.
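
The two basic SCoMaE steps can be sketched with synthetic data, under the simplifying assumption that "similar behavior" means a Pearson correlation above a threshold; the variable names, the threshold, and the greedy grouping rule below are illustrative choices, not the paper's exact evaluation procedure.

```python
import numpy as np

# Hedged sketch of SCoMaE's two basic steps on synthetic data:
# (1) correlation matrix among candidate variables,
# (2) grouping of strongly correlated variables, one indicator per group.

rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
data = {
    "co2": t + rng.normal(0, 0.1, t.size),             # rising trend
    "temperature": 0.8 * t + rng.normal(0, 0.1, t.size),  # tracks the trend
    "ice_volume": -t + rng.normal(0, 0.1, t.size),     # anti-correlated trend
    "enso_index": rng.normal(0, 1, t.size),            # uncorrelated variability
}
names = list(data)
X = np.array([data[n] for n in names])

# Step 1: correlation matrix among the candidate variables.
R = np.corrcoef(X)

# Step 2: greedy grouping -- a variable with |r| >= 0.9 to a cluster's
# first member joins that cluster; each cluster's first member is its indicator.
clusters = []
for i, name in enumerate(names):
    for c in clusters:
        if abs(R[i, c[0]]) >= 0.9:
            c.append(i)
            break
    else:
        clusters.append([i])

indicators = [names[c[0]] for c in clusters]
print(indicators)
```

The trend-following variables collapse into one cluster represented by a single indicator, while the uncorrelated variable remains its own indicator, which is the nonredundancy property the method aims for.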

  4. Multifaceted roles for low-frequency oscillations in bottom-up and top-down processing during navigation and memory.

    Science.gov (United States)

    Ekstrom, Arne D; Watrous, Andrew J

    2014-01-15

    A prominent and replicated finding is the correlation between running speed and increases in low-frequency oscillatory activity in the hippocampal local field potential. A more recent finding concerns low-frequency oscillations that increase in coherence between the hippocampus and neocortical brain areas such as prefrontal cortex during memory-related behaviors (i.e., remembering the correct location to visit). In this review, we tie together movement-related and memory-related low-frequency oscillations in the rodent with similar findings in humans. We argue that although movement-related low-frequency oscillations, in particular, may have slightly different characteristics in humans than rodents, placing important constraints on our thinking about this issue, both phenomena have similar functional foundations. We review four prominent theoretical models that provide partially conflicting accounts of movement-related low-frequency oscillations. We attempt to tie together these theoretical proposals, and existing data in rodents and humans, with memory-related low-frequency oscillations. We propose that movement-related low-frequency oscillations and memory-related low-frequency oscillatory activity, both of which show significant coherence with oscillations in other brain regions, represent different facets of "spectral fingerprints," or different resonant frequencies within the same brain networks underlying different cognitive processes. Together, movement-related and memory-related low-frequency oscillatory coupling may be linked by their distinct contributions to bottom-up, sensorimotor driven processing and top-down, controlled processing characterizing aspects of memory encoding and retrieval. Copyright © 2013. Published by Elsevier Inc.

  5. A Bottom-up Vulnerability Analysis of Water Systems with Decentralized Decision Making and Demographic Shifts- the Case of Jordan.

    Science.gov (United States)

    Lachaut, T.; Yoon, J.; Klassert, C. J. A.; Talozi, S.; Mustafa, D.; Knox, S.; Selby, P. D.; Haddad, Y.; Gorelick, S.; Tilmant, A.

    2016-12-01

    Probabilistic approaches to uncertainty in water systems management can face challenges of several types: non-stationary climate, sudden shocks such as conflict-driven migrations, or the internal complexity and dynamics of large systems. There has been a rising trend in the development of bottom-up methods that place the focus on the decision side rather than on probability distributions and climate scenarios. These approaches are based on defining acceptability thresholds for the decision makers and considering the entire range of possibilities over which such thresholds are crossed. We aim to improve knowledge of the applicability and relevance of this approach by enlarging its scope beyond climate uncertainty and single decision makers, thus including demographic shifts, internal system dynamics, and multiple stakeholders at different scales. This vulnerability analysis is part of the Jordan Water Project and makes use of an ambitious multi-agent model developed by its teams with the extensive cooperation of the Ministry of Water and Irrigation of Jordan. The case of Jordan is a relevant example for migration spikes, rapid social changes, resource depletion and climate change impacts. The multi-agent modeling framework used provides a consistent structure to assess the vulnerability of complex water resources systems with distributed acceptability thresholds and stakeholder interaction. A proof of concept and preliminary results are presented for a non-probabilistic vulnerability analysis that involves different types of stakeholders, uncertainties other than climatic and the integration of threshold-based indicators. For each stakeholder (agent) a vulnerability matrix is constructed over a multi-dimensional domain, which includes various hydrologic and/or demographic variables.

  6. Motivation and drives in bottom-up developments in natural hazards management: multiple-use of adaptation strategies in Austria

    Science.gov (United States)

    Thaler, Thomas; Fuchs, Sven

    2015-04-01

    Losses from extreme hydrological events, such as those recently experienced in Europe, have focused the attention of policymakers as well as researchers on vulnerability to natural hazards. In parallel, the context of changing flood risks under climate and societal change is driving transformation in the role of the state in responsibility sharing and individual responsibilities for risk management and precaution. The new policy agenda enhances the responsibilities of local authorities and private citizens in hazard management and reduces the role of central governments. The objective is to place added responsibility on local organisations and citizens to determine locally-based strategies for risk reduction. A major challenge of modelling adaptation is to represent the complexity of coupled human-environmental systems and particularly the feedback loops between environmental dynamics and human decision-making processes on different scales. This paper focuses on bottom-up initiatives to flood risk management which are, by definition, different from the mainstream. These initiatives are clearly influenced (positively or negatively) by a number of factors, where the combination of these interdependences can create specific conditions that alter the opportunity for effective governance arrangements in a local scheme approach. In total, this study identified six general drivers which encourage the implementation of flood storages, such as direct relation to recent major flood frequency and history, the initiative of individual stakeholders (promoters), political pressures from outside (e.g. business companies, private households) and a strong solidarity attitude of municipalities and the stakeholders involved. Although a partnership approach may be seen as an 'optimal' solution for flood risk management, in practice there are many limitations and barriers in establishing these collaborations and making them effective (especially in the long term) with the consequences

  7. Regime shift from phytoplankton to macrophyte dominance in a large river: Top-down versus bottom-up effects

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Carles, E-mail: carles.ibanez@irta.cat [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alcaraz, Carles; Caiola, Nuno; Rovira, Albert; Trobajo, Rosa [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alonso, Miguel [United Research Services S.L., Urgell 143, 08036 Barcelona, Catalonia (Spain); Duran, Concha [Confederacion Hidrografica del Ebro, Sagasta 24-26, 50071 Zaragoza, Aragon (Spain); Jimenez, Pere J. [Grup Natura Freixe, Major 56, 43750 Flix, Catalonia (Spain); Munne, Antoni [Agencia Catalana de l' Aigua, Provenca 204-208, 08036 Barcelona, Catalonia (Spain); Prat, Narcis [Departament d' Ecologia, Universitat de Barcelona, Diagonal 645, 08028 Barcelona Catalonia (Spain)

    2012-02-01

    The lower Ebro River (Catalonia, Spain) has recently undergone a regime shift from a phytoplankton-dominated to a macrophyte-dominated system. This shift is well known in shallow lakes but apparently it has never been documented in rivers. Two initial hypotheses to explain the collapse of the phytoplankton were considered: a) the diminution of nutrients (bottom-up); b) the filtering effect due to the colonization of the zebra mussel (top-down). Data on water quality, hydrology and biological communities (phytoplankton, macrophytes and zebra mussel) was obtained both from existing data sets and new surveys. Results clearly indicate that the decrease in phosphorus is the main cause of a dramatic decrease in chlorophyll and large increase in water transparency, triggering the subsequent colonization of macrophytes in the river bed. A Generalized Linear Model analysis showed that the decrease in dissolved phosphorus had a relative importance 14 times higher than the increase in zebra mussel density to explain the variation of total chlorophyll. We suggest that the described changes in the lower Ebro River can be considered a novel ecosystem shift. This shift is triggering remarkable changes in the biological communities beyond the decrease of phytoplankton and the proliferation of macrophytes, such as massive colonization of Simulidae (black fly) and other changes in the benthic invertebrate communities that are currently investigated. - Highlights: ► We show a regime shift in a large river from phytoplankton to macrophyte dominance. ► Two main hypotheses are considered: nutrient decrease and zebra mussel grazing. ► Phosphorus depletion is found to be the main cause of the phytoplankton decline. ► We conclude that oligotrophication triggered the colonization of macrophytes. ► This new regime shift in a river is similar to that described in shallow lakes.

  8. Facile fabrication of uniaxial nanopatterns on shape memory polymer substrates using a complete bottom-up approach

    Science.gov (United States)

    Chen, Zhongbi; Krishnaswamy, Sridhar

    2014-03-01

    In earlier work, we have demonstrated an assisted self-assembly fabrication method for unidirectional submicron patterns using pre-programmed shape memory polymers (SMP) as the substrate in an organic/inorganic bilayer structure. In this paper, we propose a complete bottom-up method for fabrication of uniaxial wrinkles whose wavelength is below 300 nm. The method starts with using the aforementioned self-assembled bi-layer wrinkled surface as the template to make a replica of surface wrinkles on a PDMS layer which is spin-coated on a pre-programmed SMP substrate. When the shape recovery of the substrate is triggered by heating it to its transition temperature, the substrate has been programmed in such a way that it shrinks uniaxially to return to its permanent shape. Consequently, the wrinkle wavelength on PDMS reduces accordingly. A subsequent contact molding process is carried out on the PDMS layer spin-coated on another pre-programmed SMP substrate, but using the wrinkled PDMS surface obtained in the previous step as the master. By activating the shape recovery of the substrate, the wrinkle wavelength is further reduced a second time in a similar fashion. Our experiments showed that the starting wavelength of 640 nm decreased to 290 nm after two cycles of recursive molding. We discuss the advantages and limitations of our recursive molding approach compared to the prevalent top-down fabrication methods represented by lithography. The present study is expected to offer a simple and cost-effective fabrication method of nano-scale uniaxial wrinkle patterns with the potential for large-scale mass-production.

  9. A Bottom-up Energy Efficiency Improvement Roadmap for China’s Iron and Steel Industry up to 2050

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Qi [Northeastern Univ., Shenyang (China); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hasanbeigi, Ali [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Lynn [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lu, Hongyou [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Arens, Marlene [Fraunhofer Inst. for Systems and Innovation Research (ISI), Karlsruhe (Germany)

    2016-09-01

    Iron and steel manufacturing is energy intensive in China and in the world. China is the world's largest steel producer, accounting for around half of the world's steel production. In this study, we use a bottom-up energy consumption model to analyze four steel-production and energy-efficiency scenarios and evaluate the potential for energy savings from energy-efficient technologies in China’s iron and steel industry between 2010 and 2050. The results show that China’s steel production will rise and peak in the year 2020 at 860 million tons (Mt) per year for the base-case scenario and 680 Mt for the advanced energy-efficiency scenario. From 2020 on, production will gradually decrease to about 510 Mt and 400 Mt in 2050, for the base-case and advanced scenarios, respectively. Energy intensity will decrease from 21.2 gigajoules per ton (GJ/t) in 2010 to 12.2 GJ/t and 9.9 GJ/t in 2050 for the base-case and advanced scenarios, respectively. In the near term, decreases in iron and steel industry energy intensity will come from adoption of energy-efficient technologies. In the long term, a shift in the production structure of China’s iron and steel industry, reducing the share of blast furnace/basic oxygen furnace production and increasing the share of electric-arc furnace production while reducing the use of pig iron as a feedstock to electric-arc furnaces, will continue to reduce the sector’s energy consumption. We discuss barriers to achieving these energy-efficiency gains and make policy recommendations to support improved energy efficiency and a shift in the nature of iron and steel production in China.
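The bottom-up accounting that underlies such a scenario model multiplies activity (crude steel production) by technology-specific energy intensities, weighted by production-route shares. A minimal sketch, with illustrative route shares and intensities rather than the study's calibrated values:

```python
def total_energy_pj(production_mt, shares, intensities_gj_per_t):
    """Bottom-up accounting: total energy (PJ) = production (Mt) times the
    production-share-weighted energy intensity (GJ/t); Mt x GJ/t = PJ."""
    blended = sum(shares[r] * intensities_gj_per_t[r] for r in shares)
    return production_mt * blended

# Illustrative route shares and intensities (not the study's values)
shares = {"BF-BOF": 0.9, "EAF": 0.1}        # blast furnace vs electric arc
intensities = {"BF-BOF": 22.0, "EAF": 9.0}  # GJ per tonne of crude steel
print(round(total_energy_pj(860, shares, intensities), 1))   # → 17802.0
```

Shifting share from the BF-BOF route to the EAF route lowers the blended intensity, which is the structural effect the abstract describes for the long term.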

  10. Wave disturbance overwhelms top-down and bottom-up control of primary production in California kelp forests.

    Science.gov (United States)

    Reed, Daniel C; Rassweiler, Andrew; Carr, Mark H; Cavanaugh, Kyle C; Malone, Daniel P; Siegel, David A

    2011-11-01

    We took advantage of regional differences in environmental forcing and consumer abundance to examine the relative importance of nutrient availability (bottom-up), grazing pressure (top-down), and storm waves (disturbance) in determining the standing biomass and net primary production (NPP) of the giant kelp Macrocystis pyrifera in central and southern California. Using a nine-year data set collected from 17 sites we show that, despite high densities of sea urchin grazers and prolonged periods of low nutrient availability in southern California, NPP by giant kelp was twice that of central California where nutrient concentrations were consistently high and sea urchins were nearly absent due to predation by sea otters. Waves associated with winter storms were consistently higher in central California, and the loss of kelp biomass to winter wave disturbance was on average twice that of southern California. These observations suggest that the more intense wave disturbance in central California limited NPP by giant kelp under otherwise favorable conditions. Regional patterns of interannual variation in NPP were similar to those of wave disturbance in that year-to-year variation in disturbance and NPP were both greater in southern California. Our findings provide strong evidence that regional differences in wave disturbance overwhelmed those of nutrient supply and grazing intensity to determine NPP by giant kelp. The important role of disturbance in controlling NPP revealed by our study is likely not unique to giant kelp forests, as vegetation dynamics in many systems are dominated by post-disturbance succession with climax communities being relatively uncommon. The effects of disturbance frequency may be easier to detect in giant kelp because it is fast growing and relatively short lived, with cycles of disturbance and recovery occurring on time scales of years. Much longer data sets (decades to centuries) will likely be needed to properly evaluate the role of

  11. River food webs: an integrative approach to bottom-up flow webs, top-down impact webs, and trophic position.

    Science.gov (United States)

    Benke, Arthur C

    2018-03-31

    The majority of food web studies are based on connectivity, top-down impacts, bottom-up flows, or trophic position (TP), and ecologists have argued for decades which is best. Rarely have any two been considered simultaneously. The present study uses a procedure that integrates the last three approaches based on taxon-specific secondary production and gut analyses. Ingestion flows are quantified to create a flow web and the same data are used to quantify TP for all taxa. An individual predator's impacts also are estimated using the ratio of its ingestion (I) of each prey to prey production (P) to create an I/P web. This procedure was applied to 41 invertebrate taxa inhabiting submerged woody habitat in a southeastern U.S. river. A complex flow web starting with five basal food resources had 462 flows >1 mg·m⁻²·yr⁻¹, providing far more information than a connectivity web. Total flows from basal resources to primary consumers/omnivores were dominated by allochthonous amorphous detritus and ranged from 1 to >50,000 mg·m⁻²·yr⁻¹. Most predator-prey flows were much lower (<1,000 mg·m⁻²·yr⁻¹). The I/P web showed that 83% of individual predator impacts were weak (90%). Quantitative estimates of TP ranged from 2 to 3.7, contrasting sharply with seven integer-based trophic levels based on longest feeding chain. Traditional omnivores (TP = 2.4-2.9) played an important role by consuming more prey and exerting higher impacts on primary consumers than strict predators (TP ≥ 3). This study illustrates how simultaneous quantification of flow pathways, predator impacts, and TP together provide an integrated characterization of natural food webs. © 2018 by the Ecological Society of America.
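The quantitative trophic-position estimate mentioned above (TP from diet proportions rather than integer trophic levels) can be illustrated with a small fixed-point iteration; the taxa and diet fractions below are made up for illustration, not drawn from the study's 41 taxa:

```python
def trophic_positions(diet):
    """Fixed-point iteration for trophic position: TP = 1 for basal
    resources, otherwise TP_i = 1 + sum_j (diet fraction of prey j) * TP_j."""
    tp = {taxon: 1.0 for taxon in diet}
    for _ in range(50):                     # converges quickly for small webs
        for taxon, prey in diet.items():
            if prey:
                tp[taxon] = 1.0 + sum(p * tp[q] for q, p in prey.items())
    return tp

# Hypothetical three-taxon web (not the study's data)
diet = {
    "amorphous detritus": {},                                  # basal, TP = 1
    "chironomid": {"amorphous detritus": 1.0},                 # TP = 2
    "hellgrammite": {"chironomid": 0.6, "amorphous detritus": 0.4},
}
tp = trophic_positions(diet)
print(round(tp["hellgrammite"], 1))   # → 2.6
```

The fractional TP of 2.6 for the part-predator, part-detritivore taxon is exactly the kind of "traditional omnivore" value (TP = 2.4-2.9) the abstract contrasts with integer trophic levels.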

  12. Regime shift from phytoplankton to macrophyte dominance in a large river: Top-down versus bottom-up effects

    International Nuclear Information System (INIS)

    Ibáñez, Carles; Alcaraz, Carles; Caiola, Nuno; Rovira, Albert; Trobajo, Rosa; Alonso, Miguel; Duran, Concha; Jiménez, Pere J.; Munné, Antoni; Prat, Narcís

    2012-01-01

    The lower Ebro River (Catalonia, Spain) has recently undergone a regime shift from a phytoplankton-dominated to a macrophyte-dominated system. This shift is well known in shallow lakes but apparently it has never been documented in rivers. Two initial hypotheses to explain the collapse of the phytoplankton were considered: a) the diminution of nutrients (bottom-up); b) the filtering effect due to the colonization of the zebra mussel (top-down). Data on water quality, hydrology and biological communities (phytoplankton, macrophytes and zebra mussel) was obtained both from existing data sets and new surveys. Results clearly indicate that the decrease in phosphorus is the main cause of a dramatic decrease in chlorophyll and large increase in water transparency, triggering the subsequent colonization of macrophytes in the river bed. A Generalized Linear Model analysis showed that the decrease in dissolved phosphorus had a relative importance 14 times higher than the increase in zebra mussel density to explain the variation of total chlorophyll. We suggest that the described changes in the lower Ebro River can be considered a novel ecosystem shift. This shift is triggering remarkable changes in the biological communities beyond the decrease of phytoplankton and the proliferation of macrophytes, such as massive colonization of Simulidae (black fly) and other changes in the benthic invertebrate communities that are currently investigated. - Highlights: ► We show a regime shift in a large river from phytoplankton to macrophyte dominance. ► Two main hypotheses are considered: nutrient decrease and zebra mussel grazing. ► Phosphorus depletion is found to be the main cause of the phytoplankton decline. ► We conclude that oligotrophication triggered the colonization of macrophytes. ► This new regime shift in a river is similar to that described in shallow lakes.

  13. Independent effects of bottom-up temporal expectancy and top-down spatial attention. An audiovisual study using rhythmic cueing.

    Directory of Open Access Journals (Sweden)

    Alexander eJones

    2015-01-01

    Selective attention to a spatial location has been shown to enhance perception and facilitate behaviour for events at attended locations. However, selection relies not only on where but also on when an event occurs. Recently, interest has turned to how intrinsic neural oscillations in the brain entrain to rhythms in our environment, and stimuli appearing in or out of sync with a rhythm have been shown to modulate perception and performance. Temporal expectations created by rhythms and spatial attention are two processes which have independently been shown to affect stimulus processing, but it remains largely unknown how, and if, they interact. In four separate tasks, this study investigated the effects of voluntary spatial attention and bottom-up temporal expectations created by rhythms in both unimodal and crossmodal conditions. In each task the participant used an informative cue, either colour or pitch, to direct their covert spatial attention to the left or right, and respond as quickly as possible to a target. The lateralized target (visual or auditory) was then presented at the attended or unattended side. Importantly, although not task relevant, the cue was a rhythm of either flashes or beeps. The target was presented in or out of sync (early or late) with the rhythmic cue. The results showed participants were faster responding to spatially attended compared to unattended targets in all tasks. Moreover, there was an effect of rhythmic cueing upon response times in both unimodal and crossmodal conditions. Responses were faster to targets presented in sync with the rhythm compared to when they appeared too early in both crossmodal tasks. That is, rhythmic stimuli in one modality influenced the temporal expectancy in the other modality, suggesting temporal expectancies created by rhythms are crossmodal. Interestingly, there was no interaction between top-down spatial attention and rhythmic cueing in any task suggesting these two processes largely influenced

  14. Fixation and saliency during search of natural scenes: the case of visual agnosia.

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2009-07-01

    Models of eye movement control in natural scenes often distinguish between stimulus-driven processes (which guide the eyes to visually salient regions) and those based on task and object knowledge (which depend on expectations or identification of objects and scene gist). In the present investigation, the eye movements of a patient with visual agnosia were recorded while she searched for objects within photographs of natural scenes and compared to those made by students and age-matched controls. Agnosia is assumed to disrupt the top-down knowledge available in this task, and so may increase the reliance on bottom-up cues. The patient's deficit in object recognition was seen in poor search performance and inefficient scanning. The low-level saliency of target objects had an effect on responses in visual agnosia, and the most salient region in the scene was more likely to be fixated by the patient than by controls. An analysis of model-predicted saliency at fixation locations indicated a closer match between fixations and low-level saliency in agnosia than in controls. These findings are discussed in relation to saliency-map models and the balance between high and low-level factors in eye guidance.
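The "model-predicted saliency at fixation locations" analysis described above can be sketched as a simple ratio of saliency sampled at fixations to the map-wide average; the toy map and fixation coordinates below are hypothetical, and this is not the authors' exact pipeline:

```python
import numpy as np

def fixation_saliency_ratio(saliency_map, fixations):
    """Mean model-predicted saliency at fixated (x, y) locations, divided by
    the map-wide mean; ratios above 1 indicate saliency-guided fixations."""
    vals = [saliency_map[y, x] for x, y in fixations]
    return float(np.mean(vals) / saliency_map.mean())

# Toy map with one salient region; two fixations land on it, one elsewhere
smap = np.full((100, 100), 0.1)
smap[40:60, 40:60] = 1.0
ratio = fixation_saliency_ratio(smap, [(50, 50), (45, 55), (10, 10)])
print(ratio > 1.0)   # → True
```

Under this kind of measure, the reported closer match between fixations and low-level saliency in agnosia would appear as a higher ratio for the patient than for controls.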

  15. Bottom-up effects on top-down regulation of a floating aquatic plant by two weevil species: the context-specific nature of biological control

    Science.gov (United States)

    1. Plant nutrition (bottom-up effects) impacts a plant’s ability to sustain herbivory (top-down effects) and affects phytophagous insect fecundity. These factors potentially confound efficacy predictions for biological control projects. We investigated the relative importance of these two forces wi...

  16. Flow cell coupled dynamic light scattering for real-time monitoring of nanoparticle size during liquid phase bottom-up synthesis

    NARCIS (Netherlands)

    Meulendijks, N.; van Ee, R.; Stevens, R.; Mourad, M.; Verheijen, M.A.; Kambly, N.; Armenta, R.; Buskens, P.

    2018-01-01

    To tailor the properties of nanoparticles and nanocomposites, precise control over particle size is of vital importance. Real-time monitoring of particle size during bottom-up synthesis in liquids would allow a detailed study of particle nucleation and growth, which provides valuable insights in the

  17. The changing contribution of top-down and bottom-up limitation of mesopredators during 220 years of land use and climate change

    NARCIS (Netherlands)

    Pasanen-Mortensen, Marianne; Elmhagen, Bodil; Lindén, Harto; Bergström, Roger; Wallgren, Märtha; van der Velde, Ype; Cousins, Sara A.O.

    2017-01-01

    Apex predators may buffer bottom-up driven ecosystem change, as top-down suppression may dampen herbivore and mesopredator responses to increased resource availability. However, theory suggests that for this buffering capacity to be realized, the equilibrium abundance of apex predators must

  18. Bottom-up effects on biomass versus top-down effects on identity: a multiple-lake fish community manipulation experiment

    NARCIS (Netherlands)

    Lemmens, P.; Declerck, S.A.J.; Tuytens, K.; Vanderstukken, M.; De Meester, L.

    2018-01-01

    The extent to which ecosystems are regulated by top-down relative to bottom-up control has been a dominant paradigm in ecology for many decades. For lakes, it has been shown that predation by fish is an important determinant of variation in zooplankton and phytoplankton community characteristics.

  19. Illuminating the dark side of DOM: A bottom up approach to understanding the structure and composition of DOM.

    Science.gov (United States)

    Zito, P.; Tarr, M. A.; Spencer, R. G.; Podgorski, D. C.

    2017-12-01

    Dissolved organic matter (DOM) is one of the most complex natural mixtures on Earth. It is generally comprised of hydrocarbons incorporating a diverse subset of oxygen-containing functional groups along with a small amount of nitrogen, sulfur and phosphorous heteroatoms, all of which make it very difficult to chromatographically separate. The only way to directly characterize and quantify these structural and compositional changes is by separating the DOM continuum into defined bins of structure and chemistry. In this study, we take an alternative bottom-up approach that utilizes petroleum to work toward identifying the molecular structures of DOM. Although petroleum is the most structurally diverse mixture in nature, it is almost exclusively comprised of hydrocarbons with only trace quantities of heteroatoms, including oxygen. Here, crude oil was chromatographically separated into bins based on the number of aromatic rings to be used as a starting carbon source. Photochemically produced DOM from these aromatic ring bins provides unique opportunities to gain insight into the compositional controls associated with transport, processing and fate of DOM in natural systems. Here, we present EEMs data from individual ring fractions that were subjected to 24 hours of sunlight to use as a model to fingerprint specific aromatic regions in the DOM fraction. Results illustrate that the 1-, 2-, 3-, 4- and 5- ring fractions exhibit a wide range of structurally dependent excitation and emission spectra. A well-known red-shift in the emission and excitation occurs as the number of rings increases. In order to understand changes in the elemental composition of the data, ultra high-resolution mass spectrometry was used to obtain molecular level information. Together, these data will provide a tool to help understand the relationship of the composition and structure of DOM released into the environment in terms of aromaticity. It is well known that aromaticity is an important indicator

  20. Research on ethics in two large Human Biomonitoring projects ECNIS and NewGeneris: a bottom up approach

    Directory of Open Access Journals (Sweden)

    Casteleyn Ludwine

    2008-01-01

    Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research involving human subjects. Ethical reference values often are extrapolated from clinical settings, where emphasis lies on decisional autonomy and protection of individual's privacy. The question arises whether this set of values used in clinical research can be considered as relevant references for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectancies on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in ethical assessments of similar research activities between different countries and even within one country. All this is causing delay and putting the researcher in situations in which it is unclear how to act in accordance with necessary legal requirements. Therefore, analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed by a bottom-up approach, based on a methodology for comparative analysis of determinants in ethical reasoning, allowing taking into account different social, cultural, political and historical traditions, in view of safeguarding common EU values. Based on information collected in real life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly discuss possible obstacles and to identify options for improvement in regulation. The material collected will allow developing an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach

  1. Research on ethics in two large Human Biomonitoring projects ECNIS and NewGeneris: a bottom up approach.

    Science.gov (United States)

    Dumez, Birgit; Van Damme, Karel; Casteleyn, Ludwine

    2008-06-05

    Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research involving human subjects. Ethical reference values often are extrapolated from clinical settings, where emphasis lies on decisional autonomy and protection of individual's privacy. The question arises whether this set of values used in clinical research can be considered as relevant references for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers are facing new challenges and expectancies on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in ethical assessments of similar research activities between different countries and even within one country. All this is causing delay and putting the researcher in situations in which it is unclear how to act in accordance with necessary legal requirements. Therefore, analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed by a bottom-up approach, based on a methodology for comparative analysis of determinants in ethical reasoning, allowing taking into account different social, cultural, political and historical traditions, in view of safeguarding common EU values. Based on information collected in real life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly discuss possible obstacles and to identify options for improvement in regulation. The material collected will allow developing an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach. This will not only increase

  2. Assisted editing of SensorML with EDI. A bottom-up scenario towards the definition of sensor profiles.

    Science.gov (United States)

    Oggioni, Alessandro; Tagliolato, Paolo; Fugazza, Cristiano; Bastianini, Mauro; Pavesi, Fabio; Pepe, Monica; Menegon, Stefano; Basoni, Anna; Carrara, Paola

    2015-04-01

    A by-product of this ongoing work is an emerging archive of predefined sensor descriptions. This information is being collected in order to further ease metadata creation in the next phase of the project. Users will be able to choose among a number of sensor and sensor platform prototypes; these will be specific instances on which it will be possible to define, in a bottom-up approach, "sensor profiles". We report on the outcome of this activity.

  3. Stochastic error model corrections to improve the performance of bottom-up precipitation products for hydrologic applications

    Science.gov (United States)

    Maggioni, V.; Massari, C.; Ciabatta, L.; Brocca, L.

    2016-12-01

    Accurate quantitative precipitation estimation is of great importance for water resources management, agricultural planning, and forecasting and monitoring of natural hazards such as flash floods and landslides. In situ observations are limited around the Earth, especially in remote areas (e.g., complex terrain, dense vegetation), but currently available satellite precipitation products are able to provide global precipitation estimates with an accuracy that depends upon many factors (e.g., type of storms, temporal sampling, season, etc.). The recent SM2RAIN approach proposes to estimate rainfall by using satellite soil moisture observations. As opposed to traditional satellite precipitation methods, which sense cloud properties to retrieve instantaneous estimates, this new bottom-up approach makes use of two consecutive soil moisture measurements for obtaining an estimate of the fallen precipitation within the interval between two satellite overpasses. As a result, the nature of the measurement is different and complementary to the one of classical precipitation products and could provide a different valid perspective to substitute or improve current rainfall estimates. However, uncertainties in the SM2RAIN product are still not well known and could represent a limitation in utilizing this dataset for hydrological applications. Therefore, quantifying the uncertainty associated with SM2RAIN is necessary for enabling its use. The study is conducted over the Italian territory for a 5-yr period (2010-2014). A number of satellite precipitation error properties, typically used in error modeling, are investigated and include probability of detection, false alarm rates, missed events, spatial correlation of the error, and hit biases. After this preliminary uncertainty analysis, the potential of applying the stochastic rainfall error model SREM2D to correct SM2RAIN and to improve its performance in hydrologic applications is investigated. The use of SREM2D for
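The bottom-up idea behind SM2RAIN, inverting a soil water balance between two consecutive satellite overpasses, can be sketched as follows; the layer depth and drainage parameters here are placeholder values for illustration, not the calibrated SM2RAIN ones:

```python
def sm2rain_estimate(s_prev, s_curr, dt_h, Z=80.0, a=2.0, b=1.5):
    """Illustrative inversion of the soil water balance between two
    satellite overpasses: rainfall ~ storage change + drainage losses.
    Z (mm) is the active soil layer depth; a, b parameterize drainage.
    s_prev, s_curr are relative saturations in [0, 1]; dt_h is hours."""
    storage_change = Z * (s_curr - s_prev)                  # mm added to soil
    drainage = a * (0.5 * (s_prev + s_curr)) ** b * dt_h    # mm lost below
    return max(storage_change + drainage, 0.0)              # no negative rain

# Soil wetting from 0.30 to 0.45 over a 12 h revisit interval
print(round(sm2rain_estimate(0.30, 0.45, dt_h=12), 1))   # → 17.5
# A drying interval yields no detectable rainfall
print(sm2rain_estimate(0.45, 0.30, dt_h=12))             # → 0.0
```

The drying-interval case illustrates why such products need an error model like SREM2D: rainfall that infiltrates and drains between overpasses is simply missed.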

  4. The social perceptual salience effect.

    Science.gov (United States)

    Inderbitzin, Martin P; Betella, Alberto; Lanatá, Antonio; Scilingo, Enzo P; Bernardet, Ulysses; Verschure, Paul F M J

    2013-02-01

    Affective processes appraise the salience of external stimuli, preparing the agent for action. So far, the relationship between stimuli, affect, and action has mainly been studied in highly controlled laboratory conditions. In order to test whether this relationship generalizes to social interaction, we assess the influence of the salience of social stimuli on human interaction. We constructed a ball game in a mixed reality space in which pairs of people collaborated in order to compete with an opposing team. We coupled the players with team members of varying social salience by using both physical and virtual representations of remote players (i.e., avatars). We observe that, irrespective of team composition, winners and losers display significantly different inter- and intrateam spatial behaviors. We show that subjects regulate their interpersonal distance to both virtual and physical team members in similar ways, but in proportion to the vividness of the stimulus. As an independent validation of this social salience effect, we show that the behavioral effect is also displayed in physiological correlates of arousal. In addition, we found a strong correlation between performance, physiology, and the subjective reports of the subjects. Our results show that proxemics is consistent with affective responses, confirming the existence of a social salience effect. This provides further support for the so-called law of apparent reality and generalizes it to the social realm, where it can be used to design more efficient social artifacts. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  5. Reprint of "Two-stage sparse coding of region covariance via Log-Euclidean kernels to detect saliency".

    Science.gov (United States)

    Zhang, Ying-Ying; Yang, Cai; Zhang, Ping

    2017-08-01

    In this paper, we present a novel bottom-up saliency detection algorithm from the perspective of covariance matrices on a Riemannian manifold. Each superpixel is described by a region covariance matrix on the Riemannian manifold. We carry out a two-stage sparse coding scheme via Log-Euclidean kernels to extract salient objects efficiently. In the first stage, given a background dictionary drawn from the image borders, sparse coding of each region covariance via Log-Euclidean kernels is performed. The reconstruction error on the background dictionary is regarded as the initial saliency of each superpixel. In the second stage, the initial result is improved by calculating the reconstruction errors of the superpixels on a foreground dictionary, which is extracted from the first-stage saliency map. The sparse coding in the second stage is similar to that of the first stage, but is able to highlight the salient objects uniformly against the background. Finally, three post-processing methods (a highlight-inhibition function, context-based saliency weighting, and graph cut) are adopted to further refine the saliency map. Experiments on four public benchmark datasets show that the proposed algorithm outperforms state-of-the-art methods in terms of precision, recall, and mean absolute error, and demonstrate the robustness and efficiency of the proposed method. Copyright © 2017 Elsevier Ltd. All rights reserved.
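
    To make the reconstruction-error idea concrete, here is a heavily simplified stand-in for the first stage: instead of Log-Euclidean kernel sparse coding of covariance matrices, each region is a plain feature vector and is "reconstructed" by projecting onto single background atoms; the smallest residual is taken as saliency. All vectors below are invented for illustration.

```python
import math

def residual(x, atom):
    # Best one-atom "reconstruction": project x onto the atom and
    # measure the norm of what is left unexplained.
    dot = sum(a * b for a, b in zip(x, atom))
    norm2 = sum(a * a for a in atom) or 1.0
    coef = dot / norm2
    return math.sqrt(sum((xi - coef * ai) ** 2 for xi, ai in zip(x, atom)))

def saliency(region, background_dict):
    # Saliency = smallest reconstruction error over the background
    # dictionary: regions no background atom explains well are salient.
    return min(residual(region, atom) for atom in background_dict)

# Border regions (the assumed background) are dark and textureless;
# a bright region is poorly reconstructed, hence salient.
background = [[0.1, 0.1, 0.1], [0.2, 0.1, 0.0]]
print(saliency([0.1, 0.1, 0.1], background))  # ~0: explained by background
print(saliency([0.9, 0.1, 0.8], background))  # large: salient
```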

  6. Energy-environment policy modeling of endogenous technological change with personal vehicles. Combining top-down and bottom-up methods

    International Nuclear Information System (INIS)

    Jaccard, Mark; Murphy, Rose; Rivers, Nic

    2004-01-01

    The transportation sector offers substantial potential for greenhouse gas (GHG) emission abatement, but widely divergent cost estimates complicate policy making: energy-economy policy modelers apply top-down and bottom-up cost definitions and different assumptions about future technologies and the preferences of firms and households. Our hybrid energy-economy policy model is technology-rich, like a bottom-up model, but has empirically estimated behavioral parameters for risk and technology preferences, like a top-down model. Unlike typical top-down models, however, it simulates technological change endogenously, with functions that relate the financial costs of technologies to cumulative production and adjust technology preferences as market shares change. We apply it to the choice of personal vehicles to indicate, first, the effect of divergent cost definitions on cost estimates and, second, the possible response to policies that require a minimum market share for low-emission vehicles.
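
    The kind of behavioral market-share function used in such hybrid models can be sketched as a logit-style allocation in which a technology's share of new sales falls with its perceived life-cycle cost raised to a heterogeneity exponent. The functional form and all numbers below are illustrative assumptions, not the authors' calibrated model.

```python
def market_shares(costs, v=8.0):
    """Share of new-vehicle sales captured by each technology when shares
    are allocated in inverse proportion to cost**v (a logit-style form).
    A higher v means buyers act more uniformly on cost alone; the value
    here is an illustrative assumption."""
    weights = [c ** (-v) for c in costs]
    total = sum(weights)
    return [w / total for w in weights]

# Illustrative perceived annualized costs: conventional vs. low-emission
# vehicle. A 10% cost premium costs the cleaner vehicle market share:
shares = market_shares([5000.0, 5500.0])
print([round(s, 3) for s in shares])
```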

  7. Abundance and size structure of planktonic protist communities in a Neotropical floodplain: effects of top-down and bottom-up controls

    Directory of Open Access Journals (Sweden)

    Bianca Ramos de Meira

    2017-12-01

    Abstract: Aim: We aimed to assess the influence of bottom-up and top-down control mechanisms on the abundance and size structure of protist communities (heterotrophic flagellates and ciliates). We formulated the following hypothesis: bottom-up control mechanisms, related to the availability of resources in the environment, are responsible for structuring the abundance of these communities, whereas top-down control mechanisms, related to predation effects, determine the size pattern of these organisms. Methods: Samples of planktonic organisms were taken in 20 shallow lakes belonging to the upper Paraná River floodplain. We evaluated linear regression models and selected the best model predicting the observed patterns according to the Akaike Information Criterion. Results: The best models selected to explain the abundance of heterotrophic flagellates included negative relationships with picophytoplankton abundance and positive relationships with rotifer abundance, while for their size structure, negative relationships were found with heterotrophic bacteria, ciliate, and rotifer biovolumes. Ciliate abundances were positively related to rotifer and picophytoplankton abundances and negatively related to heterotrophic bacteria abundance. For ciliate size structure, on the other hand, the best models selected strong negative relationships with microcrustacean biovolumes, in addition to relationships with the different fractions of the phytoplankton. Conclusion: For both flagellates and ciliates, abundance is mainly regulated by a bottom-up control mechanism, whereas for size structure the results showed that both food resources and predators were important, indicating that bottom-up and top-down mechanisms act simultaneously in determining the size of these microorganisms.
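
    The model-selection procedure the authors describe, fitting candidate linear regressions and ranking them by the Akaike Information Criterion, can be sketched as follows; the data and the candidate predictors are invented for illustration.

```python
import math

def fit_slr(x, y):
    """Ordinary least squares for y = a + b*x; returns the residual sum
    of squares of the fitted model."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    b = sxy / sxx
    a = my - b * mx
    return sum((yi - (a + b * xi)) ** 2 for xi, yi in zip(x, y))

def aic(rss, n, k):
    # Gaussian-likelihood AIC up to an additive constant: n*ln(RSS/n) + 2k.
    return n * math.log(rss / n) + 2 * k

# Invented data: does flagellate abundance track picophytoplankton
# abundance or temperature better?
flagellates = [2.0, 3.1, 4.2, 4.9, 6.1]
picophyto = [1.0, 2.0, 3.0, 4.0, 5.0]
temperature = [20.0, 23.0, 21.0, 25.0, 22.0]
models = {name: aic(fit_slr(x, flagellates), len(flagellates), k=3)
          for name, x in [("picophyto", picophyto), ("temperature", temperature)]}
best = min(models, key=models.get)
print(best)  # the predictor whose model minimizes AIC
```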

  8. Disentangling the Role of Cortico-Basal Ganglia Loops in Top-Down and Bottom-Up Visual Attention: An Investigation of Attention Deficits in Parkinson Disease.

    Science.gov (United States)

    Tommasi, Giorgio; Fiorio, Mirta; Yelnik, Jérôme; Krack, Paul; Sala, Francesca; Schmitt, Emmanuelle; Fraix, Valérie; Bertolasi, Laura; Le Bas, Jean-François; Ricciardi, Giuseppe Kenneth; Fiaschi, Antonio; Theeuwes, Jan; Pollak, Pierre; Chelazzi, Leonardo

    2015-06-01

    It is solidly established that top-down (goal-driven) and bottom-up (stimulus-driven) attention mechanisms depend on distributed cortical networks, including prefrontal and frontoparietal regions. On the other hand, it is less clear whether the BG also contribute to one or the other of these mechanisms, or to both. The current study was principally undertaken to clarify this issue. Parkinson disease (PD), a neurodegenerative disorder primarily affecting the BG, has proven to be an effective model for investigating the contribution of the BG to different brain functions; therefore, we set out to investigate deficits of top-down and bottom-up attention in a selected cohort of PD patients. With this objective in mind, we compared the performance on three computerized tasks of two groups of 12 parkinsonian patients (assessed without any treatment), one otherwise pharmacologically treated and the other also surgically treated, with that of a group of controls. The main behavioral tool for our study was an attentional capture task, which enabled us to tap the competition between top-down and bottom-up mechanisms of visual attention. This task was suitably combined with a choice RT and a simple RT task to isolate any specific deficit of attention from deficits in motor response selection and initiation. In the two groups of patients, we found an equivalent increase of attentional capture but also comparable delays in target selection in the absence of any salient distractor (reflecting impaired top-down mechanisms) and movement initiation compared with controls. In contrast, motor response selection processes appeared to be prolonged only in the operated patients. Our results confirm that the BG are involved in both motor and cognitive domains. Specifically, damage to the BG, as it occurs in PD, leads to a distinct deficit of top-down control of visual attention, and this can account, albeit indirectly, for the enhancement of attentional capture, reflecting weakened

  9. Top-down and bottom-up influences on the left ventral occipito-temporal cortex during visual word recognition: an analysis of effective connectivity.

    Science.gov (United States)

    Schurz, Matthias; Kronbichler, Martin; Crone, Julia; Richlan, Fabio; Klackl, Johannes; Wimmer, Heinz

    2014-04-01

    The functional role of the left ventral occipito-temporal cortex (vOT) in visual word processing has been studied extensively. A prominent observation is higher activation for unfamiliar but pronounceable letter strings compared to regular words in this region. Some functional accounts have interpreted this finding as driven by top-down influences (e.g., Dehaene and Cohen [2011]: Trends Cogn Sci 15:254-262; Price and Devlin [2011]: Trends Cogn Sci 15:246-253), while others have suggested a difference in bottom-up processing (e.g., Glezer et al. [2009]: Neuron 62:199-204; Kronbichler et al. [2007]: J Cogn Neurosci 19:1584-1594). We used dynamic causal modeling for fMRI data to test bottom-up and top-down influences on the left vOT during visual processing of regular words and unfamiliar letter strings. Regular words (e.g., taxi) and unfamiliar letter strings of pseudohomophones (e.g., taksi) were presented in the context of a phonological lexical decision task (i.e., "Does the item sound like a word?"). We found no differences in top-down signaling, but a strong increase in bottom-up signaling from the occipital cortex to the left vOT for pseudohomophones compared to words. This finding can be linked to functional accounts which assume that the left vOT contains neurons tuned to complex orthographic features such as morphemes or words [e.g., Dehaene and Cohen [2011]: Trends Cogn Sci 15:254-262; Kronbichler et al. [2007]: J Cogn Neurosci 19:1584-1594]: For words, bottom-up signals converge onto a matching orthographic representation in the left vOT. For pseudohomophones, the propagated signals do not converge, but (partially) activate multiple orthographic word representations, reflected in increased effective connectivity. Copyright © 2013 Wiley Periodicals, Inc.

  10. Integrated Bottom-Up and Top-Down Liquid Chromatography-Mass Spectrometry for Characterization of Recombinant Human Growth Hormone Degradation Products.

    Science.gov (United States)

    Wang, Yu Annie; Wu, Di; Auclair, Jared R; Salisbury, Joseph P; Sarin, Richa; Tang, Yang; Mozdzierz, Nicholas J; Shah, Kartik; Zhang, Anna Fan; Wu, Shiaw-Lin; Agar, Jeffery N; Love, J Christopher; Love, Kerry R; Hancock, William S

    2017-12-05

    With the advent of biosimilars to the U.S. market, it is important to have better analytical tools to ensure product quality from batch to batch. In addition, given the recent popularity of continuous processes for the production of biopharmaceuticals, the traditional bottom-up method alone is no longer sufficient for product characterization and quality analysis. The bottom-up method requires large amounts of material for analysis and is labor-intensive and time-consuming. Additionally, in this analysis, digestion of the protein with enzymes such as trypsin can introduce artifacts and modifications that increase the complexity of the analysis. On the other hand, a top-down method requires a minimal amount of sample and allows for analysis of the intact protein mass and of the sequence generated from fragmentation within the instrument. However, fragmentation usually occurs at the N-terminal and C-terminal ends of the protein, with less internal fragmentation. Herein, we combine the complementary top-down and bottom-up methods for the characterization of human growth hormone degradation products. Notably, our approach required small amounts of sample, a requirement imposed by the sample constraints of small-scale manufacturing. Using this approach, we were able to characterize various protein variants, including post-translational modifications such as oxidation and deamidation, residual leader sequence, and proteolytic cleavage. Thus, we were able to highlight the complementarity of top-down and bottom-up approaches, which achieved the characterization of a wide range of product variants in samples of human growth hormone secreted from Pichia pastoris.

  11. Top-down and bottom-up lipidomic analysis of rabbit lipoproteins under different metabolic conditions using flow field-flow fractionation, nanoflow liquid chromatography and mass spectrometry.

    Science.gov (United States)

    Byeon, Seul Kee; Kim, Jin Yong; Lee, Ju Yong; Chung, Bong Chul; Seo, Hong Seog; Moon, Myeong Hee

    2015-07-31

    This study demonstrated the performances of top-down and bottom-up approaches in lipidomic analysis of lipoproteins from rabbits raised under different metabolic conditions: healthy controls, carrageenan-induced inflammation, dehydration, high cholesterol (HC) diet, and highest cholesterol diet with inflammation (HCI). In the bottom-up approach, the high density lipoproteins (HDL) and the low density lipoproteins (LDL) were size-sorted and collected on a semi-preparative scale using a multiplexed hollow fiber flow field-flow fractionation (MxHF5), followed by nanoflow liquid chromatography-ESI-MS/MS (nLC-ESI-MS/MS) analysis of the lipids extracted from each lipoprotein fraction. In the top-down method, size-fractionated lipoproteins were directly infused to MS for quantitative analysis of targeted lipids using chip-type asymmetrical flow field-flow fractionation-electrospray ionization-tandem mass spectrometry (cAF4-ESI-MS/MS) in selected reaction monitoring (SRM) mode. The comprehensive bottom-up analysis yielded 122 and 104 lipids from HDL and LDL, respectively. Rabbits within the HC and HCI groups had lipid patterns that contrasted most substantially from those of controls, suggesting that HC diet significantly alters the lipid composition of lipoproteins. Among the identified lipids, 20 lipid species that exhibited large differences (>10-fold) were selected as targets for the top-down quantitative analysis in order to compare the results with those from the bottom-up method. Statistical comparison of the results from the two methods revealed that the results were not significantly different for most of the selected species, except for those species with only small differences in concentration between groups. The current study demonstrated that top-down lipid analysis using cAF4-ESI-MS/MS is a powerful high-speed analytical platform for targeted lipidomic analysis that does not require the extraction of lipids from blood samples. Copyright © 2015 Elsevier B

  12. Multi-scale image segmentation method with visual saliency constraints and its application

    Science.gov (United States)

    Chen, Yan; Yu, Jie; Sun, Kaimin

    2018-03-01

    Object-based image analysis methods have many advantages over pixel-based methods, making them one of the current research hotspots. Obtaining image objects by multi-scale image segmentation is an essential prerequisite for object-based image analysis. The currently popular image segmentation methods mainly share the bottom-up segmentation principle, which is simple to realize and yields accurate object boundaries. However, the macro statistical characteristics of image areas are difficult to take into account, and fragmented (over-segmented) results are difficult to avoid. In addition, when it comes to information extraction, target recognition, and other applications, image targets are not equally important; some specific targets or target groups with particular features deserve more attention than the others. To avoid the problem of over-segmentation and highlight the targets of interest, this paper proposes a multi-scale image segmentation method with visual saliency constraints. Visual saliency theory and a typical feature extraction method are adopted to obtain the visual saliency information, especially the macroscopic information to be analyzed. The visual saliency information is used as a distribution map of homogeneity weight, in which each pixel is given a weight. This weight acts as one of the merging constraints in the multi-scale image segmentation. As a result, pixels that macroscopically belong to the same object but are locally different can more likely be assigned to the same object. In addition, due to the constraint of the visual saliency model, the constraining ability over local-macroscopic characteristics can be well controlled during the segmentation process for different objects. These controls improve the completeness of visually salient areas in the segmentation results while diluting the controlling effect for non-salient background areas. Experiments show that this method works
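
    A minimal sketch of the homogeneity-weight idea, assuming a merge cost in which per-region saliency weights act as an extra merging constraint; the cost function and all numbers are illustrative assumptions, not the paper's actual criterion.

```python
def merge_cost(mean_a, mean_b, w_a, w_b, alpha=2.0):
    """Cost of merging two adjacent regions in a bottom-up segmentation.
    mean_a/mean_b: region mean intensities; w_a/w_b: mean visual-saliency
    weights in [0, 1] taken from a saliency map. Merging is cheaper when
    the two regions agree in saliency (they likely belong to one object)
    and dearer when saliency disagrees. Purely illustrative."""
    spectral = abs(mean_a - mean_b)
    saliency_disagreement = abs(w_a - w_b)
    return spectral * (1.0 + alpha * saliency_disagreement)

# Two locally different regions that both fall inside a salient object
# merge more cheaply than a salient/background pair of equal contrast:
print(merge_cost(0.4, 0.6, 0.9, 0.85) < merge_cost(0.4, 0.6, 0.9, 0.1))  # True
```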

  13. Cost development of future technologies for power generation-A study based on experience curves and complementary bottom-up assessments

    International Nuclear Information System (INIS)

    Neij, Lena

    2008-01-01

    Technology foresight studies have become an important tool in identifying realistic ways of reducing the impact of modern energy systems on the climate and the environment. Studies on the future cost development of advanced energy technologies are of special interest. One approach widely adopted for the analysis of future cost is the experience curve approach. The question is, however, how robust this approach is and which experience curves should be used in energy foresight analysis. This paper presents an analytical framework for the analysis of the future cost development of new energy technologies for electricity generation; the framework is based on an assessment of available experience curves, complemented with bottom-up analysis of the sources of cost reductions and, for some technologies, judgmental expert assessments of long-term development paths. The results of these three methods agree in most cases, i.e. the cost (price) reductions described by the experience curves match the incremental cost reductions described in the bottom-up analysis and the judgmental expert assessments. For some technologies, the bottom-up analysis reveals large uncertainties in future cost development not captured by the experience curves. Experience curves with learning rates ranging from 0% to 20% are suggested for the analysis of future cost development.
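
    The experience-curve model referred to above can be written as a power law of cumulative production, with the learning rate giving the fractional cost reduction per doubling. A short sketch with illustrative numbers:

```python
import math

def experience_curve_cost(c0, x0, x, learning_rate):
    """Unit cost after cumulative production grows from x0 to x, under the
    experience-curve model: each doubling of cumulative production cuts
    unit cost by `learning_rate` (e.g. 0.20 for a 20% learning rate)."""
    b = -math.log2(1.0 - learning_rate)  # progress-ratio exponent
    return c0 * (x / x0) ** (-b)

# Illustrative numbers: a technology at 1000 $/kW after 1 GW cumulative
# production, with a 20% learning rate, after three doublings (8 GW).
# Algebraically this is 1000 * 0.8**3:
print(experience_curve_cost(1000.0, 1.0, 8.0, 0.20))
```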

  14. Modeling Technical Change in Energy System Analysis: Analyzing the Introduction of Learning-by-Doing in Bottom-up Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Berglund, Christer; Soederholm, Patrik [Luleaa Univ. of Technology (Sweden). Div. of Economics

    2005-02-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aiming at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research and embeds important policy implications, not least concerning the cost and timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options, which is absent in many top-down models, they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they fail to capture strategic technology diffusion behavior in the energy sector, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R&D support to the energy sector). For these reasons, bottom-up and top-down models with induced technical change should be viewed not as substitutes but as complements.

  15. Modeling technical change in energy system analysis: analyzing the introduction of learning-by-doing in bottom-up energy models

    International Nuclear Information System (INIS)

    Berglund, Christer; Soederholm, Patrik

    2006-01-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aimed at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research and embeds important policy implications, not least concerning the cost and timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options, which is absent in many top-down models, they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they often fail to capture strategic technology diffusion behavior in the energy sector as well as the energy sector's endogenous responses to policy, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R&D support to the energy sector). Some suggestions on how innovation and diffusion modeling in bottom-up analysis can be improved are put forward.

  16. Stakeholder Salience in ERP Projects

    OpenAIRE

    Salhotra, Eashan

    2014-01-01

    The aim of this study is to examine stakeholder involvement in an Enterprise Resource Planning (ERP) System project that involves implementation and improvement of the implemented system. The study targets stakeholders, their classification, and their degree of importance during different phases of an ERP project life cycle, i.e. planning, implementation, stabilisation and improvement. The study shows that stakeholder involvement and their salience vary along the ERP project life cycle a...

  17. Age-Related Inter-region EEG Coupling Changes during the Control of Bottom-up and Top-down Attention

    Directory of Open Access Journals (Sweden)

    Ling Li

    2015-12-01

    We investigated age-related changes in electroencephalographic (EEG) coupling of theta-, alpha-, and beta-frequency bands during bottom-up and top-down attention. Arrays were presented with either automatic pop-out (bottom-up) or effortful search (top-down) behavior to younger and older participants. The phase-locking value (PLV) was used to estimate coupling strength between scalp recordings. Behavioral performance decreased with age, with a greater age-related decline in accuracy for the search than for the pop-out condition. Aging was associated with declined coupling strength of the theta and alpha frequency bands, with a greater age-related decline in whole-brain coupling values for the search than for the pop-out condition. Specifically, prefronto-frontal coupling in the theta- and alpha-bands and fronto-parietal and parieto-occipital couplings in the beta-band showed a right hemispheric dominance in the younger group, which was reduced with aging to compensate for inhibitory dysfunction. Pop-out target detection, by contrast, was mainly associated with greater parieto-occipital beta-coupling strength compared with the search condition, regardless of age. Furthermore, prefronto-frontal coupling in the theta-, alpha-, and beta-bands and parieto-occipital coupling in the beta-band functioned as predictors of behavior for both groups. Taken together, these findings provide evidence that prefronto-frontal coupling of the theta-, alpha-, and beta-bands may serve as a possible marker of aging during visual attention, whereas parieto-occipital coupling in the beta-band could serve a bottom-up function and be vulnerable to top-down attention control in younger and older groups.
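
    The phase-locking value used here has a compact definition: the modulus of the mean unit phasor of the phase difference between two channels. A minimal sketch with synthetic phases:

```python
import cmath
import math
import random

def plv(phases_a, phases_b):
    """Phase-locking value between two channels: |mean(exp(i*(pa - pb)))|.
    1 means perfectly locked phases; values near 0 mean no locking."""
    n = len(phases_a)
    return abs(sum(cmath.exp(1j * (pa - pb))
                   for pa, pb in zip(phases_a, phases_b)) / n)

random.seed(0)
locked_a = [random.uniform(0, 2 * math.pi) for _ in range(2000)]
locked_b = [p + 0.3 for p in locked_a]  # constant phase lag: full locking
unlocked_b = [random.uniform(0, 2 * math.pi) for _ in range(2000)]

print(round(plv(locked_a, locked_b), 3))   # 1.0 for a constant lag
print(round(plv(locked_a, unlocked_b), 3)) # near zero for independent phases
```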

  18. Is the preference of natural versus man-made scenes driven by bottom-up processing of the visual features of nature?

    Directory of Open Access Journals (Sweden)

    Omid Kardan

    2015-04-01

    Previous research has shown that viewing images of nature scenes can have a beneficial effect on memory, attention, and mood. In this study we aimed to determine whether the preference for natural versus man-made scenes is driven by bottom-up processing of the low-level visual features of nature. We used participants' ratings of perceived naturalness as well as aesthetic preference for 307 images with varied natural and urban content. We then quantified ten low-level image features for each image (a combination of spatial and color properties). These features were used to predict aesthetic preference for the images, and to decompose perceived naturalness into its predictable (modelled by the low-level visual features) and non-modelled aspects. Interactions of these separate aspects of naturalness with the time it took to make a preference judgment showed that naturalness based on low-level features related more to preference when the judgment was faster (bottom-up). On the other hand, perceived naturalness that was not modelled by low-level features related more to preference when the judgment was slower. A quadratic discriminant classification analysis showed how relevant each aspect of naturalness (modelled and non-modelled) was to predicting preference ratings, as were the image features on their own. Finally, we compared the effect of color-related and structure-related modelled naturalness, and of the remaining unmodelled naturalness, in predicting aesthetic preference. In summary, both the bottom-up (color and spatial) properties of natural images captured by our features and the non-modelled naturalness are important to aesthetic judgments of natural and man-made scenes, with each predicting unique variance.

  19. Using Top-down and Bottom-up Costing Approaches in LMICs: The Case for Using Both to Assess the Incremental Costs of New Technologies at Scale.

    Science.gov (United States)

    Cunnama, Lucy; Sinanovic, Edina; Ramma, Lebogang; Foster, Nicola; Berrie, Leigh; Stevens, Wendy; Molapo, Sebaka; Marokane, Puleng; McCarthy, Kerrigan; Churchyard, Gavin; Vassall, Anna

    2016-02-01

    Estimating the incremental costs of scaling up novel technologies in low-income and middle-income countries is a methodologically challenging and substantial empirical undertaking in the absence of routine cost data collection. We demonstrate a best-practice pragmatic approach to estimating the incremental costs of new technologies in low-income and middle-income countries, using the example of costing the scale-up of Xpert Mycobacterium tuberculosis (MTB)/resistance to rifampicin (RIF) in South Africa. We estimate costs by applying the two distinct approaches of bottom-up and top-down costing, together with an assessment of processes and capacity. The unit costs measured using bottom-up and top-down costing, respectively, are $US16.9 and $US33.5 for Xpert MTB/RIF, and $US6.3 and $US8.5 for microscopy. The incremental cost of Xpert MTB/RIF is estimated to be between $US14.7 and $US17.7. While the average cost of Xpert MTB/RIF was higher than in previous studies using standard methods, the incremental cost of Xpert MTB/RIF was found to be lower. Cost estimates are highly dependent on the method used, so an approach that clearly identifies whether resource-use data were collected from a bottom-up or top-down perspective, together with capacity measurement, is recommended as a pragmatic way to capture true incremental cost where routine cost data are scarce. © 2016 The Authors. Health Economics published by John Wiley & Sons Ltd.
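
    The two costing perspectives can be illustrated in a few lines: top-down divides total site expenditure by total output, while bottom-up prices the resources actually observed per test. The figures below are invented for illustration; the paper's actual estimates are the ones quoted in the abstract, and the gap between the two methods there partly reflects under-used capacity.

```python
def top_down_unit_cost(total_annual_cost, annual_tests):
    # Top-down: divide everything the site spent by everything it produced.
    return total_annual_cost / annual_tests

def bottom_up_unit_cost(resources):
    # Bottom-up: price out each (quantity, unit_price) resource observed
    # to be consumed per test.
    return sum(quantity * unit_price for quantity, unit_price in resources)

# Hypothetical per-test resources: one cartridge at $9.98, 10 staff
# minutes at $0.25/min, and a $2.00 overhead allocation.
cartridge, staff_min, overhead = (1, 9.98), (10, 0.25), (1, 2.0)
bu = bottom_up_unit_cost([cartridge, staff_min, overhead])
td = top_down_unit_cost(120000.0, 4000)  # $120k spent, 4000 tests run
print(round(bu, 2), round(td, 2))  # top-down exceeds bottom-up when
                                   # capacity is under-used
```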

  20. ANALYSIS OF BOTTOM-UP STRATEGIES IN PORTUGUESE-FOR-FOREIGNERS TEXTBOOKS ACCORDING TO ACTIVITY THEORY

    OpenAIRE

    Cândida Martins Pinto

    2008-01-01

    This study investigates how authors of Portuguese-for-foreigners textbooks handle the reading skill with respect to the use of bottom-up strategies. In order to examine the relationship between reading skill and linguistic competence, I chose to analyze the reading sections of four Portuguese-for-foreigners textbooks that have teacher's editions and are currently being used in the classroom. In this context, the methodolog...

  1. Analysis of top-down and bottom-up North American CO2 and CH4 emissions estimates in the second State of the Carbon Cycle Report

    Science.gov (United States)

    Miller, J. B.; Jacobson, A. R.; Bruhwiler, L.; Michalak, A.; Hayes, D. J.; Vargas, R.

    2017-12-01

    In just ten years since publication of the original State of the Carbon Cycle Report in 2007, global CO2 concentrations have risen by more than 22 ppm to 405 ppm. This represents 18% of the increase over the preindustrial level of 280 ppm. This increase is being driven unequivocally by fossil fuel combustion, with North American emissions comprising roughly 20% of the global total over the past decade. At the global scale, we know by comparing well-known fossil fuel inventories and rates of atmospheric CO2 increase that about half of all emissions are absorbed at Earth's surface. For North America, however, we cannot apply a simple mass balance to determine sources and sinks. Instead, contributions from ecosystems must be estimated using top-down and bottom-up methods. SOCCR-2 estimates North American net CO2 uptake from ecosystems as 577 +/- 433 TgC/yr using bottom-up (inventory) methods and 634 +/- 288 TgC/yr from top-down atmospheric inversions. Although the global terrestrial carbon sink is not precisely known, these values represent possibly 30% of the global values. As with the net sink estimates reported in SOCCR, these new top-down and bottom-up estimates are statistically consistent with one another. However, the uncertainties on each of these estimates are now substantially smaller, giving us more confidence about where the truth lies. Atmospheric inversions also yield estimates of interannual variations (IAV) in CO2 and CH4 fluxes. Our syntheses suggest that the IAV of ecosystem CO2 fluxes is of order 100 TgC/yr, mainly originating in the conterminous US, with lower variability in boreal and arctic regions. Moreover, this variability is much larger than for the inventory-based fluxes reported by the US to the UNFCCC. Unlike CO2, bottom-up CH4 emissions are larger than those derived from large-scale atmospheric data, with the continental discrepancy resulting primarily from differences in arctic and boreal regions. In addition to the current state of the science, we
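
    The global mass balance mentioned above can be worked through with round numbers, using the standard conversion of about 2.124 PgC of carbon per ppm of atmospheric CO2; the flux values below are illustrative, not the report's figures.

```python
PGC_PER_PPM = 2.124  # ~PgC of carbon per ppm of atmospheric CO2

def airborne_fraction(fossil_emissions_pgc, delta_co2_ppm):
    """Fraction of emitted carbon that stayed in the atmosphere; the rest
    was absorbed at Earth's surface. This is the simple global mass
    balance the abstract applies; it cannot be used regionally because
    the atmosphere mixes across continental boundaries."""
    atmospheric_increase = delta_co2_ppm * PGC_PER_PPM
    return atmospheric_increase / fossil_emissions_pgc

# Roughly 10 PgC/yr emitted and ~2.3 ppm/yr growth gives about half
# airborne, half absorbed:
f = airborne_fraction(10.0, 2.3)
print(round(f, 2), round(1 - f, 2))
```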

  2. Influence of top-down and bottom-up manipulations on the R-BT065 subcluster of Betaproteobacteria, an abundant group in bacterioplankton of a freshwater reservoir

    Czech Academy of Sciences Publication Activity Database

    Šimek, Karel; Horňák, Karel; Jezbera, Jan; Mašín, Michal; Nedoma, Jiří; Gasol, J. M.; Schauer, M.

    2005-01-01

    Roč. 71, č. 5 (2005), s. 2381-2390 ISSN 0099-2240 R&D Projects: GA ČR(CZ) GA206/05/0007; GA ČR(CZ) GA206/02/0003 Grant - others:CSIC(ES) DGICYT REN2001-2120/MAR; EU(XE) EVK3-CT-2002-00078; Austrian Science Foundation(AT) P15655 Institutional research plan: CEZ:AV0Z60170517 Keywords : reservoir * top-down and bottom-up control * microbial food webs * bacterivory * bacterial community composition Subject RIV: EE - Microbiology, Virology Impact factor: 3.818, year: 2005

  3. The baseline in bottom-up energy efficiency and saving calculations - A concept for its formalisation and a discussion of relevant options

    International Nuclear Information System (INIS)

    Reichl, Johannes; Kollmann, Andrea

    2011-01-01

    One of the central variables in bottom-up energy efficiency and saving calculations is the energy consumption baseline. In the evaluation of energy efficiency measures, developing this baseline is a challenging task, which may involve serious problems, especially if the energy service of the analysed subject has changed while the energy efficiency measure was being implemented. In this paper we present a formalised concept of the process of developing the baseline that is flexible enough to deal with various difficulties, such as changes in the levels of the energy services involved. We also discuss the most relevant options for deriving the necessary variables.
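    The baseline idea can be sketched minimally as follows; the linear adjustment for a changed energy-service level is a purely illustrative simplification, while the paper's formalised concept is more general:

```python
def energy_savings(baseline_kwh, actual_kwh, service_before, service_after):
    """Savings relative to a baseline, normalised for a change in the level
    of energy service (e.g. floor area heated, units produced).
    The linear normalisation is a simplifying assumption."""
    adjusted_baseline = baseline_kwh * (service_after / service_before)
    return adjusted_baseline - actual_kwh

# Service level grew 10% while the efficiency measure was implemented:
savings = energy_savings(1000.0, 950.0, 100.0, 110.0)  # adjusted baseline 1100 kWh
```

    Without the service-level adjustment, the naive calculation (1000 - 950 = 50 kWh) would understate the savings threefold in this example.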

  4. Tensor-based spatiotemporal saliency detection

    Science.gov (United States)

    Dou, Hao; Li, Bin; Deng, Qianqian; Zhang, LiRui; Pan, Zhihong; Tian, Jinwen

    2018-03-01

    This paper proposes an effective tensor-based spatiotemporal saliency computation model for saliency detection in videos. First, we construct the tensor representation of video frames. Then, the spatiotemporal saliency can be directly computed by the tensor distance between different tensors, which can preserve the complete temporal and spatial structure information of object in the spatiotemporal domain. Experimental results demonstrate that our method can achieve encouraging performance in comparison with the state-of-the-art methods.
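    As a rough illustration of the idea (not the paper's model, which builds a full tensor representation preserving spatial and temporal structure), a patch-wise spatiotemporal saliency map based on a Frobenius-norm distance between corresponding patches of consecutive frames might look like this; all names are assumptions:

```python
import numpy as np

def tensor_distance(a, b):
    """Frobenius (Euclidean) distance between two equally shaped tensors;
    a simple stand-in for the tensor distance of the abstract."""
    return float(np.sqrt(np.sum((a.astype(float) - b.astype(float)) ** 2)))

def saliency_map(prev_frame, frame, patch=8):
    """Per-patch spatiotemporal saliency: distance between corresponding
    patches of consecutive frames; large motion/appearance change -> salient."""
    h, w = frame.shape[:2]
    sal = np.zeros((h // patch, w // patch))
    for i in range(sal.shape[0]):
        for j in range(sal.shape[1]):
            ys, xs = i * patch, j * patch
            sal[i, j] = tensor_distance(frame[ys:ys + patch, xs:xs + patch],
                                        prev_frame[ys:ys + patch, xs:xs + patch])
    return sal
```

    A patch that changes between frames receives a high score; static background patches score zero.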

  5. Can bottom-up processes of attention be a source of 'interference' in situations where top-down control of attention is crucial?

    Science.gov (United States)

    Nikolla, Dritan; Edgar, Graham; Catherwood, Dianne; Matthews, Tristan

    2018-02-01

    In this study, we investigate whether emotionally engaged bottom-up processes of attention can be a source of 'interference' in situations where top-down control of attention is necessary. Participants were asked to monitor and report on a video of a war scenario showing a developing battle in two conditions: emotionally positive and emotionally negative. Half of the participants (n = 15) were exposed to task-irrelevant pictures of positive emotional valence embedded within the scenario; the other half were exposed to task-irrelevant pictures of negative emotional valence. Sensitivity and Bias scores were calculated using signal detection theory. Overall, task accuracy scores were dependent upon the valence; negative pictures had an adverse effect on performance, whereas positive pictures improved performance. We concluded that negative emotional pictures interfered with top-down control of attention by attracting competing bottom-up processes of attention. We found the opposite effect for positive emotional stimuli. © 2017 The British Psychological Society.
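    The sensitivity and bias analysis follows standard signal detection theory. A minimal sketch of how sensitivity (d') and the criterion (c) might be computed from response counts; the counts and the log-linear correction are illustrative, not the study's data:

```python
from statistics import NormalDist

def dprime_and_bias(hits, misses, false_alarms, correct_rejections):
    """Sensitivity (d') and criterion (c) from raw counts, with a standard
    log-linear correction to avoid infinite z-scores at rates of 0 or 1."""
    hr = (hits + 0.5) / (hits + misses + 1)
    far = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1)
    z = NormalDist().inv_cdf
    return z(hr) - z(far), -0.5 * (z(hr) + z(far))

# Hypothetical participant: 40 hits / 10 misses, 10 false alarms / 40 correct rejections
d, c = dprime_and_bias(40, 10, 10, 40)
```

    Here a drop in d' under negative-valence pictures would indicate the "interference" effect the study reports, independent of any shift in response bias c.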

  6. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    Directory of Open Access Journals (Sweden)

    Matthieu de Stampa

    2010-02-01

    Background: Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose: To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results: In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out through continuous and flexible leadership throughout the process, initially mixed (clinician and researcher) and later dual (clinician and service managers) during the implementation phase. Conclusion: The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support changes in practices and service arrangements.

  7. Measurements of traffic emissions over a medium-sized city using long-path measurements and comparison against bottom-up city estimates

    Science.gov (United States)

    Waxman, E.; Cossel, K.; Truong, G. W.; Giorgetta, F.; Swann, W.; Coddington, I.; Newbury, N.

    2017-12-01

    Understanding emissions from cities is increasingly important as a growing fraction of the world's population moves to cities. Here we use a novel technology, dual frequency comb spectroscopy, to measure city emissions using a long outdoor open path. We simultaneously measured CO2, CH4, and H2O over the city of Boulder, Colorado and over a clean-air reference path for two months in the fall of 2016. Because of the spatial coverage of our measurements, the layout of the city and power plant locations, and the predominant wind direction, our measurements primarily pick up vehicle emissions. We choose two days with consistent CO2 enhancements over the city relative to the reference path and use a simple 0-D box model to calculate city emissions for these days. We scale these up to annual emissions and compare our measurements with the City of Boulder bottom-up vehicle emissions inventory based on total vehicle miles traveled, fuel efficiency, and vehicle type distribution. We find good agreement (within about a factor of two) between our top-down measurements and the city's bottom-up inventory value.
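    The 0-D box model used for such top-down estimates can be sketched as follows. The function, parameter names, and the molar-density default are illustrative assumptions, not the authors' implementation:

```python
def box_model_flux(delta_co2_ppm, wind_speed_m_s, mixing_height_m, box_width_m,
                   air_density_mol_m3=40.0):
    """City CO2 emission rate (mol/s) from a 0-D box model: the enhancement
    over the clean-air background is swept out of the box by the wind.
    The molar density of air (~40 mol/m^3 near the surface) is approximate."""
    mole_fraction = delta_co2_ppm * 1e-6
    ventilation = wind_speed_m_s * mixing_height_m * box_width_m  # m^3/s
    return mole_fraction * air_density_mol_m3 * ventilation

# Hypothetical inputs: 5 ppm enhancement, 2 m/s wind, 500 m mixed layer, 10 km box
flux_mol_s = box_model_flux(5.0, 2.0, 500.0, 10000.0)
```

    Scaling such an instantaneous flux to an annual total, as the abstract describes, is where much of the factor-of-two uncertainty enters.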

  8. Assessing the role of "bottom-up" emissions and simplified chemical mechanisms in reconciling CESM2.0 with TOGA observations from the ORCAS and ATom-2 campaigns

    Science.gov (United States)

    Asher, E.; Emmons, L. K.; Kinnison, D. E.; Tilmes, S.; Hills, A. J.; Hornbrook, R. S.; Stephens, B. B.; Apel, E. C.

    2017-12-01

    Surface albedo and precipitation over the Southern Ocean are sensitive to parameterizations of aerosol formation and cloud dynamics in global climate models. Observations of precursor gases for natural aerosols can help constrain the uncertainty in these parameterizations, if used in conjunction with an appropriately simplified chemical mechanism. We implement current oceanic "bottom-up" emission climatologies of dimethyl sulfide (DMS) and isoprene in CESM2.0 (Lana et al. 2016; Archer et al. 2009) and compare modeled constituents from two separate chemical mechanisms with data obtained from the Trace Organic Gas Analyzer (TOGA) on the O2/N2 Ratios and CO2 Airborne Study in the Southern Ocean (ORCAS) and the Atmospheric Tomography Mission 2 (ATom-2). We use ORCAS measurements of DMS, isoprene, methyl vinyl ketone (MVK) and methacrolein (MACR) from over 10 flights in Jan. - Feb. 2016 as a training dataset to improve "bottom-up" emissions. Thereafter, we evaluate the scaled "top-down" emissions in CESM with TOGA data obtained from the Atmospheric Tomography Mission (ATom-2) in Feb. 2017. Recent laboratory studies at NCAR confirm that TOGA surpasses proton transfer reaction mass spectrometry (PTR-MS) and commercial gas chromatography (GC) instruments with respect to accurate measurements of oxygenated VOCs in low nitrogen oxide (NO) environments, such as MVK and MACR.

  9. Top-Down and Bottom-Up Approaches in Engineering 1T Phase Molybdenum Disulfide (MoS2): Towards Highly Catalytically Active Materials.

    Science.gov (United States)

    Chua, Chun Kiang; Loo, Adeline Huiling; Pumera, Martin

    2016-09-26

    The metallic 1T phase of MoS2 has been widely identified to be responsible for the improved performances of MoS2 in applications including hydrogen evolution reactions and electrochemical supercapacitors. To this aim, various synthetic methods have been reported to obtain 1T phase-rich MoS2. Here, the aim is to evaluate the efficiencies of the bottom-up (hydrothermal reaction) and top-down (chemical exfoliation) approaches in producing 1T phase MoS2. It is established in this study that the 1T phase MoS2 produced through the bottom-up approach contains a high proportion of 1T phase and demonstrates excellent electrochemical and electrical properties. Its performance in the hydrogen evolution reaction and electrochemical supercapacitors also surpassed that of 1T phase MoS2 produced through a top-down approach. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Top-down and bottom-up characterization of nitrated birch pollen allergen Bet v 1a with CZE hyphenated to an Orbitrap mass spectrometer.

    Science.gov (United States)

    Gusenkov, Sergey; Stutz, Hanno

    2018-02-01

    Tyrosine (Tyr) residues of the major pollen allergen of birch Betula verrucosa, Bet v 1a, were nitrated by peroxynitrite. This modification enhances the allergenicity. Modified tyrosines were identified by analyzing intact allergen variants in combination with top-down and bottom-up approaches. To this end, a laboratory-built sheath-liquid assisted ESI interface was applied for hyphenation of CE to an Orbitrap mass spectrometer to localize individual nitration sites. The major focus was on identification of primary nitration sites. The top-down approach unambiguously identified Tyr 5 as the most prominent modification site. Fragments from the allergen core and the C-terminal part each carried up to three potential nitration sites. Thus, a bottom-up approach with tryptic digest was used as a complementary strategy, which allowed for the unambiguous localization of nitration sites within the respective peptides. Nitration propensity for individual Tyr residues was addressed by comparison of MS signals of nitrated peptides relative to all cognates of homologous primary sequence. Combined data identified the surface-exposed Tyr 5 and Tyr 66 as major nitration sites, followed by the less accessible Tyr 158, whereas Tyr 81, 83 and 150 possess a lower nitration tendency and are apparently modified only in variants with higher nitration levels. © 2018 The Authors. Electrophoresis published by Wiley-VCH Verlag GmbH & Co. KGaA.

  11. An expanded framework to define and measure shared decision-making in dialogue: A 'top-down' and 'bottom-up' approach.

    Science.gov (United States)

    Callon, Wynne; Beach, Mary Catherine; Links, Anne R; Wasserman, Carly; Boss, Emily F

    2018-03-11

    We aimed to develop a comprehensive, descriptive framework to measure shared decision making (SDM) in clinical encounters. We combined a top-down (theoretical) approach with a bottom-up approach based on audio-recorded dialogue to identify all communication processes related to decision making. We coded 55 pediatric otolaryngology visits using the framework and report interrater reliability. We identified 14 clinician behaviors and 5 patient behaviors that have not been previously described, and developed a new SDM framework that is descriptive (what does happen) rather than normative (what should happen). Through the bottom-up approach we identified three broad domains not present in other SDM frameworks: socioemotional support, understandability of clinician dialogue, and recommendation-giving. We also specify the ways in which decision-making roles are assumed implicitly rather than discussed explicitly. Interrater reliability was >75% for 92% of the coded behaviors. This SDM framework allows for a more expansive understanding and analysis of how decision making takes place in clinical encounters, including new domains and behaviors not present in existing measures. We hope that this new framework will bring attention to a broader conception of SDM and allow researchers to further explore the new domains and behaviors identified. Copyright © 2018. Published by Elsevier B.V.
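    Interrater reliability is reported here as raw percent agreement. A chance-corrected complement often used alongside it is Cohen's kappa, sketched below with made-up ratings (not the study's data):

```python
def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters over the same items.
    kappa = (observed agreement - expected agreement) / (1 - expected)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    categories = set(rater_a) | set(rater_b)
    expected = sum((rater_a.count(c) / n) * (rater_b.count(c) / n)
                   for c in categories)
    return (observed - expected) / (1 - expected)

# Hypothetical binary codings of six utterances by two raters:
kappa = cohens_kappa([1, 1, 0, 1, 0, 0], [1, 1, 0, 1, 1, 0])
```

    Kappa discounts the agreement two raters would reach by chance alone, which matters when, as here, some coded behaviors are rare.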

  12. Tax Salience, Voting, and Deliberation

    DEFF Research Database (Denmark)

    Sausgruber, Rupert; Tyran, Jean-Robert

    Tax incentives can be more or less salient, i.e. noticeable or cognitively easy to process. Our hypothesis is that taxes on consumers are more salient to consumers than equivalent taxes on sellers because consumers underestimate the extent of tax shifting in the market. We show that tax salience biases consumers' voting on tax regimes, and that experience is an effective de-biasing mechanism in the experimental laboratory. Pre-vote deliberation makes initially held opinions more extreme rather than correct and does not eliminate the bias in the typical committee. Yet, if voters can discuss their experience with the tax regimes, they are less likely to be biased.

  13. Toward consistency between trends in bottom-up CO2 emissions and top-down atmospheric measurements in the Los Angeles megacity

    Directory of Open Access Journals (Sweden)

    S. Newman

    2016-03-01

    Large urban emissions of greenhouse gases result in large atmospheric enhancements relative to background that are easily measured. Using CO2 mole fractions and Δ14C and δ13C values of CO2 in the Los Angeles megacity observed in inland Pasadena (2006–2013) and on the coastal Palos Verdes peninsula (autumn 2009–2013), we have determined time series of the CO2 contributions from fossil fuel combustion (Cff) for both sites and, for Pasadena, broken those down into contributions from petroleum and/or gasoline and natural gas burning. We find a 10 % reduction in Pasadena Cff during the Great Recession of 2008–2010, which is consistent with the bottom-up inventory determined by the California Air Resources Board. The isotopic variations and total atmospheric CO2 from our observations are used to infer the seasonality of natural gas and petroleum combustion. The trend of CO2 contributions to the atmosphere from natural gas combustion is out of phase with the seasonal cycle of total natural gas combustion in bottom-up inventories but is consistent with the seasonality of natural gas usage by the area's electricity-generating power plants. For petroleum, the inferred seasonality of CO2 contributions from burning petroleum is delayed by several months relative to usage indicated by statewide gasoline taxes. Using the high-resolution Hestia-LA data product to compare Cff from parts of the basin sampled by winds at different times of year, we find that variations in observed fossil fuel CO2 reflect seasonal variations in wind direction. The seasonality of the local CO2 excess from fossil fuel combustion along the coast, on the Palos Verdes peninsula, is higher in autumn and winter than in spring and summer, almost completely out of phase with that from Pasadena, again because of the annual variations of winds in the region. Variations in fossil fuel CO2 signals are consistent with sampling the bottom-up Hestia-LA fossil CO2 emissions product for sub

  14. HCFC-142b emissions in China: An inventory for 2000 to 2050 based on bottom-up and top-down methods

    Science.gov (United States)

    Han, Jiarui; Li, Li; Su, Shenshen; Hu, Jianxin; Wu, Jing; Wu, Yusheng; Fang, Xuekun

    2014-05-01

    1-Chloro-1,1-difluoroethane (HCFC-142b) is both an ozone-depleting substance included in the Montreal Protocol on Substances that Deplete the Ozone Layer (Montreal Protocol) and a potent greenhouse gas with a high global warming potential. As one of the major HCFC-142b consumption and production countries in the world, China's control actions will contribute both to mitigating climate change and to protecting the ozone layer. Estimating China's HCFC-142b emissions is a crucial step in understanding their status, drawing up a phase-out plan and evaluating mitigation effects. Both bottom-up and top-down methods were adopted in this research to estimate HCFC-142b emissions from China, and the results were compared to test the effectiveness of the two methods and validate the inventory's reliability. First, a national bottom-up emission inventory of HCFC-142b for China during 2000-2012 was established based on the 2006 IPCC Guidelines for National Greenhouse Gas Inventories and the Montreal Protocol, showing that, in contrast to the downward trend revealed by existing results, HCFC-142b emissions kept increasing from 0.1 kt/yr in 2000 to a peak of 14.4 kt/yr in 2012. Meanwhile, a top-down emission estimate was also developed using the interspecies correlation method. By correlating atmospheric mixing ratio data of HCFC-142b and the reference substance HCFC-22 sampled in four representative cities (Beijing, Hangzhou, Lanzhou and Guangzhou, for northern, eastern, western and southern China, respectively), China's HCFC-142b emission in 2012 was calculated to be 16.24 (13.90-18.58) kt, equivalent to 1.06 kt ODP and 37 Tg CO2-eq, accounting for 9.78% (ODP) of total HCFC emissions in China and 30.5% of global HCFC-142b emissions. This result is 12.7% higher than the bottom-up inventory; possible explanations are discussed. The consistency of the two results lends credibility to the effectiveness of the methods and the reliability of the results.
Finally, future HCFC-142b emissions were projected to 2050
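    The interspecies correlation method scales a well-constrained reference emission by the fitted slope of the two species' correlated enhancements. A sketch with hypothetical inputs (the slope and reference emission below are illustrative, not the paper's fitted values; the molecular weights are standard):

```python
def interspecies_emission(slope_mol_per_mol, emission_ref_kt, mw_target, mw_ref):
    """Top-down emission of a target gas from the emission of a reference gas
    and the fitted slope of their mole-fraction enhancements, converted to a
    mass basis with the molecular weights."""
    return emission_ref_kt * slope_mol_per_mol * (mw_target / mw_ref)

# Hypothetical slope of HCFC-142b vs HCFC-22 enhancements and a hypothetical
# reference HCFC-22 emission; MW(HCFC-142b) ~ 100.5, MW(HCFC-22) ~ 86.5 g/mol.
e_142b_kt = interspecies_emission(slope_mol_per_mol=0.07,
                                  emission_ref_kt=200.0,
                                  mw_target=100.5, mw_ref=86.5)
```

    The method assumes the two gases are co-emitted and well mixed over the sampling footprint, so the slope is insensitive to dilution.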

  15. Top-down or bottom-up? Assessing crevassing directions on surging glaciers and developments for physically testing glacier crevassing models.

    Science.gov (United States)

    Rea, B.; Evans, D. J. A.; Benn, D. I.; Brennan, A. J.

    2012-04-01

    Networks of crevasse squeeze ridges (CSRs) preserved on the forelands of many surging glaciers attest to extensive full-depth crevassing. Full-depth connections have been inferred from turbid water up-welling in crevasses and the formation of concertina eskers; however, it has not been clearly established whether the crevasses formed from the top down or the bottom up. A Linear Elastic Fracture Mechanics (LEFM) approach is used to determine the likely propagation direction of Mode I crevasses on seven surging glaciers. Results indicate that the high extensional surface strain rates are insufficient to promote top-down full-depth crevasses but have sufficient magnitude to penetrate to depths of 4-12 m, explaining the extensive surface breakup accompanying glacier surges. Top-down, full-depth crevassing is only possible when water depth approaches 97% of the crevasse depth; however, the provision of sufficient meltwater is problematic due to the aforementioned extensive shallow surface crevassing. Full-depth, bottom-up crevassing can occur provided basal water pressures exceed 80-90% of flotation, which is typical during surging; on occasion water pressures may even become artesian. Therefore CSRs, found across many surging glacier forelands and ice margins, most likely result from the infilling of basal crevasses formed, for the most part, by bottom-up hydrofracturing. Despite the importance of crevassing for meltwater routing and calving dynamics, physically testing numerical crevassing models remains problematic due to technological limitations, changing stress regimes and difficulties associated with working in crevasse zones on glaciers. Mapping of CSR spacing and matching to surface crevasse patterns can facilitate quantitative comparison between the LEFM model and observed basal crevasses, provided ice dynamics are known.
However, assessing full-depth top-down crevasse propagation is much harder to monitor in the field, and no geomorphological record is
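    A much simpler zero-stress (Nye-type) estimate, often used alongside LEFM, reproduces shallow dry-crevasse depths in the 4-12 m range quoted above and shows how water deepens a crevasse. The constants are standard; the stress and water-depth values are illustrative, not the paper's:

```python
# Zero-stress estimate of surface crevasse depth, with the common extension
# for a water-filled crevasse: d = 2*sigma/(rho_i*g) + (rho_w/rho_i)*d_w.
# A simplified alternative to the paper's LEFM treatment, for illustration.
RHO_ICE, RHO_WATER, G = 917.0, 1000.0, 9.81  # kg/m^3, kg/m^3, m/s^2

def crevasse_depth(tensile_stress_pa, water_depth_m=0.0):
    return (2 * tensile_stress_pa / (RHO_ICE * G)
            + (RHO_WATER / RHO_ICE) * water_depth_m)

dry = crevasse_depth(50e3)                       # ~11 m for a 50 kPa tensile stress
wet = crevasse_depth(50e3, water_depth_m=30.0)   # water drives the crack far deeper
```

    The strong sensitivity to water depth in the second call is the same physics behind the abstract's finding that near-full water filling is needed for top-down full-depth propagation.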

  16. The synthesis of bottom-up and top-down approaches to climate policy modeling: Electric power technology detail in a social accounting framework

    International Nuclear Information System (INIS)

    Sue Wing, Ian

    2008-01-01

    "Hybrid" climate policy simulations have sought to bridge the gap between "bottom-up" engineering and "top-down" macroeconomic models by integrating the former's energy technology detail into the latter's macroeconomic framework. Construction of hybrid models is complicated by the need to numerically calibrate them to multiple, incommensurate sources of economic and engineering data. I develop a solution to this problem following Howitt's [Howitt, R.E., 1995. Positive Mathematical Programming, American Journal of Agricultural Economics 77: 329-342] positive mathematical programming approach. Using data for the U.S., I illustrate how the inputs to the electricity sector in a social accounting matrix may be allocated among discrete types of generation so as to be consistent with both the technologies' input shares from engineering cost estimates and the zero-profit and market-clearance conditions of the sector's macroeconomic production structure. (author)

  17. Parallel- and serial-contact electrochemical metallization of monolayer nanopatterns: A versatile synthetic tool en route to bottom-up assembly of electric nanocircuits

    Directory of Open Access Journals (Sweden)

    Jonathan Berson

    2012-02-01

    Contact electrochemical transfer of silver from a metal-film stamp (parallel process) or a metal-coated scanning probe (serial process) is demonstrated to allow site-selective metallization of monolayer template patterns of any desired shape and size created by constructive nanolithography. The precise nanoscale control of metal delivery to predefined surface sites, achieved as a result of the selective affinity of the monolayer template for electrochemically generated metal ions, provides a versatile synthetic tool en route to the bottom-up assembly of electric nanocircuits. These findings offer direct experimental support to the view that, in electrochemical metal deposition, charge is carried across the electrode–solution interface by ion migration to the electrode rather than by electron transfer to hydrated ions in solution.

  18. Bottom-up processing of thermoelectric nanocomposites from colloidal nanocrystal building blocks: the case of Ag2Te–PbTe

    International Nuclear Information System (INIS)

    Cadavid, Doris; Ibáñez, Maria; Gorsse, Stéphane; López, Antonio M.; Cirera, Albert; Morante, Joan Ramon; Cabot, Andreu

    2012-01-01

    Nanocomposites are highly promising materials to enhance the efficiency of current thermoelectric devices. A straightforward and at the same time highly versatile and controllable approach to produce nanocomposites is the assembly of solution-processed nanocrystal building blocks. The convenience of this bottom-up approach to produce nanocomposites with homogeneous phase distributions and adjustable composition is demonstrated here by blending Ag2Te and PbTe colloidal nanocrystals to form Ag2Te–PbTe bulk nanocomposites. The thermoelectric properties of these nanocomposites are analyzed in the temperature range from 300 to 700 K. The evolution of their electrical conductivity and Seebeck coefficient is discussed in terms of the blend composition and the characteristics of the constituent materials.

  19. Bottom-up processing of thermoelectric nanocomposites from colloidal nanocrystal building blocks: the case of Ag2Te-PbTe

    Energy Technology Data Exchange (ETDEWEB)

    Cadavid, Doris [Catalonia Institute for Energy Research, IREC (Spain); Ibanez, Maria [Universitat de Barcelona, Departament d'Electronica (Spain); Gorsse, Stephane [Universite de Bordeaux, ICMCB, CNRS (France); Lopez, Antonio M. [Universitat Politecnica de Catalunya, Departament d'Enginyeria Electronica (Spain); Cirera, Albert [Universitat de Barcelona, Departament d'Electronica (Spain); Morante, Joan Ramon; Cabot, Andreu, E-mail: acabot@irec.cat [Catalonia Institute for Energy Research, IREC (Spain)

    2012-12-15

    Nanocomposites are highly promising materials to enhance the efficiency of current thermoelectric devices. A straightforward and at the same time highly versatile and controllable approach to produce nanocomposites is the assembly of solution-processed nanocrystal building blocks. The convenience of this bottom-up approach to produce nanocomposites with homogeneous phase distributions and adjustable composition is demonstrated here by blending Ag2Te and PbTe colloidal nanocrystals to form Ag2Te-PbTe bulk nanocomposites. The thermoelectric properties of these nanocomposites are analyzed in the temperature range from 300 to 700 K. The evolution of their electrical conductivity and Seebeck coefficient is discussed in terms of the blend composition and the characteristics of the constituent materials.

  20. The synthesis of bottom-up and top-down approaches to climate policy modeling: Electric power technologies and the cost of limiting US CO2 emissions

    International Nuclear Information System (INIS)

    Wing, Ian Sue

    2006-01-01

    In the US, the bulk of CO2 abatement induced by carbon taxes comes from electric power. This paper incorporates technology detail into the electricity sector of a computable general equilibrium model of the US economy to characterize electric power's technological margins of adjustment to carbon taxes and to elucidate their general equilibrium effects. Compared to the top-down production-function representation of the electricity sector, the technology-rich hybrid specification produces less abatement at a higher welfare cost, suggesting that bottom-up models do not necessarily generate lower costs of abatement than top-down models. This result is shown to be sensitive to the elasticity with which technologies' generating capacities adjust to relative prices.

  1. Guidelines for bottom-up approach of nanocarbon film formation from pentacene using heated tungsten on quartz substrate without metal catalyst

    Science.gov (United States)

    Heya, Akira; Matsuo, Naoto

    2018-04-01

    Guidelines for a bottom-up approach to nanographene formation from pentacene using heated tungsten were investigated via a novel method called hot mesh deposition (HMD). In this method, a heated W mesh was set between a pentacene source and a quartz substrate. Pentacene molecules were decomposed by the heated W mesh, and the resulting pentacene-based decomposed precursors were then deposited on the quartz substrate. The pentacene dimer (peripentacene) was obtained from pentacene by HMD using two heated catalysts. As expected from density functional theory calculations in the literature, it was confirmed that the pentacene dimer can be formed by a reaction between pentacene and 6,13-dihydropentacene. This technique can be applied to the formation of novel nanographene on various substrates without metal catalysts.

  2. Chitosan microspheres with an extracellular matrix-mimicking nanofibrous structure as cell-carrier building blocks for bottom-up cartilage tissue engineering

    Science.gov (United States)

    Zhou, Yong; Gao, Huai-Ling; Shen, Li-Li; Pan, Zhao; Mao, Li-Bo; Wu, Tao; He, Jia-Cai; Zou, Duo-Hong; Zhang, Zhi-Yuan; Yu, Shu-Hong

    2015-12-01

    Scaffolds for tissue engineering (TE) which closely mimic the physicochemical properties of the natural extracellular matrix (ECM) have been proven to advantageously favor cell attachment, proliferation, migration and new tissue formation. Recently, as a valuable alternative, a bottom-up TE approach utilizing cell-loaded micrometer-scale modular components as building blocks to reconstruct a new tissue in vitro or in vivo has been shown to offer a number of desirable advantages compared with the traditional bulk-scaffold-based top-down TE approach. Nevertheless, micro-components with an ECM-mimicking nanofibrous structure are still very scarce and highly desirable. Chitosan (CS), an accessible natural polymer, has demonstrated appealing intrinsic properties and promising application potential for TE, especially cartilage tissue regeneration. Against this background, we report here the fabrication of chitosan microspheres with an ECM-mimicking nanofibrous structure for the first time, based on a physical gelation process. By combining this physical fabrication procedure with microfluidic technology, uniform CS microspheres (CMS) with controlled nanofibrous microstructure and tunable sizes can be facilely obtained. Notably, no potentially toxic or denaturing chemical crosslinking agent was introduced into the products. In vitro chondrocyte culture tests revealed enhanced cell attachment and proliferation, and a macroscopic 3D geometrically shaped cartilage-like composite can easily be constructed with the nanofibrous CMS (NCMS) and chondrocytes, demonstrating the significant application potential of NCMS as bottom-up cell-carrier components for cartilage tissue engineering.

  3. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Directory of Open Access Journals (Sweden)

    Sebastian McBride

    Computational visual attention systems have been constructed so that robots and other devices can detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: (1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, and (2) implementation and validation of the model in robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: (1) transformation of retinotopic to egocentric mappings, (2) spatial memory for the purposes of medium-term inhibition of return, (3) synchronization of 'where' and 'what' information from the two visual streams, (4) convergence of top-down and bottom-up information at a centralized point of information processing, (5) a threshold function to elicit saccade action, (6) a function to represent task relevance as a ratio of excitation and inhibition, and (7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.
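    Two of the requirements identified in this abstract, convergence onto a priority map and a saccade threshold, can be sketched as follows. The linear weighting and all names are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np

def priority_map(bottom_up, top_down, inhibition, w_td=0.5):
    """Combine a bottom-up saliency map with top-down task relevance and
    subtract inhibition-of-return at recently attended locations; a simple
    linear convergence scheme, clipped to non-negative priorities."""
    p = (1 - w_td) * bottom_up + w_td * top_down - inhibition
    return np.clip(p, 0.0, None)

def next_saccade(pmap, threshold=0.5):
    """Elicit a saccade only if the peak priority exceeds the threshold;
    returns the (row, col) target, or None if nothing is salient enough."""
    idx = np.unravel_index(np.argmax(pmap), pmap.shape)
    return idx if pmap[idx] >= threshold else None

# Example: a single bottom-up hotspot, no task bias, no inhibition yet
bu = np.zeros((4, 4)); bu[1, 2] = 1.0
target = next_saccade(priority_map(bu, np.zeros((4, 4)), np.zeros((4, 4))))
```

    Adding inhibition at the attended location after each saccade implements the medium-term inhibition of return of requirement (2), so attention moves on rather than locking onto one peak.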

  4. "Disorganized in time": impact of bottom-up and top-down negative emotion generation on memory formation among healthy and traumatized adolescents.

    Science.gov (United States)

    Guillery-Girard, Bérengère; Clochon, Patrice; Giffard, Bénédicte; Viard, Armelle; Egler, Pierre-Jean; Baleyte, Jean-Marc; Eustache, Francis; Dayan, Jacques

    2013-09-01

    "Travelling in time," a central feature of episodic memory, is severely affected among individuals with Post Traumatic Stress Disorder (PTSD), with two opposite effects: vivid traumatic memories are disorganized in temporality (bottom-up processes), while non-traumatic personal memories tend to lack spatio-temporal details and false recognitions occur more frequently than in the general population (top-down processes). To test the effect of these two types of processes (i.e. bottom-up and top-down) on emotional memory, we conducted two studies in healthy and traumatized adolescents, a period of life in which vulnerability to emotion is particularly high. Using negative and neutral images selected from the International Affective Picture System (IAPS), stimuli were divided into perceptual images (emotion generated by perceptual details) and conceptual images (emotion generated by the general meaning of the material). Both categories of stimuli were then used, along with neutral pictures, in a memory task with two phases (encoding and recognition). In both populations, we observed a differential effect of the emotional material on encoding and recognition. Negative perceptual scenes induced an attentional capture effect during encoding and enhanced recollective distinctiveness. Conversely, the encoding of conceptual scenes was similar to that of neutral ones, but conceptual relatedness induced false memories at retrieval. Among individuals with PTSD, however, two subgroups of patients were identified. The first subgroup processed the scenes faster than controls, except for the perceptual scenes, and obtained performances similar to controls in the recognition task. The second subgroup demonstrated an attentional deficit in the encoding task, with no benefit from the distinctiveness associated with negative perceptual scenes on memory performance. These findings provide a new perspective on how negative emotional information may have opposite influences on memory in

  5. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories.

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G; Quinn, Paul C; Hu, Chao S; Qian, Miao; Fu, Genyue; Lee, Kang

    2015-02-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese, Caucasian, and racially ambiguous faces. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Science.gov (United States)

    McBride, Sebastian; Huelse, Martin; Lee, Mark

    2013-01-01

    Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.
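
Requirements 4-6 in the list above lend themselves to a compact sketch. The map sizes, weights, and threshold below are invented for illustration; the code only shows one way a priority map might combine the two streams, how task relevance can be expressed as an excitation/inhibition ratio, and how a threshold gates the saccade:

```python
import numpy as np

def task_relevance(excitation, inhibition, eps=1e-6):
    # Requirement 6: task relevance expressed as a ratio of excitation to inhibition.
    return excitation / (inhibition + eps)

def priority_map(bottom_up, top_down, w_bu=0.5, w_td=0.5):
    # Requirement 4: converge bottom-up and top-down maps at a single point
    # of information processing (the 'priority map').
    return w_bu * bottom_up + w_td * top_down

def select_saccade(priority, threshold=0.8):
    # Requirement 5: a threshold function that elicits a saccade only when
    # the peak of the priority map is strong enough.
    peak = np.unravel_index(np.argmax(priority), priority.shape)
    return peak if priority[peak] >= threshold else None

bu = np.zeros((4, 4)); bu[1, 2] = 0.9   # bottom-up salience peak
td = np.zeros((4, 4)); td[1, 2] = 0.9   # top-down (task) bias at the same location
target = select_saccade(priority_map(bu, td))   # saccade target (1, 2)
```

With a lower peak or a higher threshold, `select_saccade` returns `None` and no saccade is elicited, which is the gating behaviour the fifth requirement describes.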

  7. Top-Down and Bottom-Up Identification of Proteins by Liquid Extraction Surface Analysis Mass Spectrometry of Healthy and Diseased Human Liver Tissue

    Science.gov (United States)

    Sarsby, Joscelyn; Martin, Nicholas J.; Lalor, Patricia F.; Bunch, Josephine; Cooper, Helen J.

    2014-09-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies.

  8. Heat recovery with heat pumps in non-energy intensive industry: A detailed bottom-up model analysis in the French food and drink industry

    International Nuclear Information System (INIS)

    Seck, Gondia Sokhna; Guerassimoff, Gilles; Maïzi, Nadia

    2013-01-01

    Highlights: • First bottom-up energy model for NEI at 4-digit level of NACE for energy analysis. • Energy end-use modelling due to the unsuitability of the end-product/process approach. • Analysis of heat recovery with HP on industrial processes up to 2020 in French F&D. • Energy consumption and emissions drop by 10% relative to 2001 and 9% relative to 1990, respectively. • Results only achieved at heat temperatures below 100 °C, concentrated in 1/3 of F&D sectors. - Abstract: Rising energy prices and environmental impacts inevitably encourage industrial actors to promote energy efficiency and emissions reductions. To achieve this goal, we have developed the first detailed bottom-up energy model for the Non-Energy Intensive industry (NEI) to study its global energy efficiency and its potential for CO2 emissions reduction at a 4-digit level of the NACE classification. The NEI, which is generally neglected in energy analyses, is expected to play an important role in reducing industrial energy intensity in the long term owing to its economic and energy significance and relatively high growth rate. In this paper, the NEI is modelled by energy end-use, owing to the unsuitability of the end-product/process approach used in Energy Intensive industry modelling. As an example, we analysed the impact of heat recovery with heat pumps (HP) on industrial processes up to 2020, in terms of energy savings and CO2 emissions reductions, in the French food and drink industry (F&D), the biggest NEI sector. The results showed that HP could be an excellent and very promising energy recovery technology. For further detailed analysis, the depiction of HP investment cost payments is given per temperature range for each F&D subsector. This model constitutes a useful decision-making tool for assessing potential energy savings from investing in efficient technologies at the highest level of disaggregation, as well as a better subsectoral screening
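
The per-temperature-range depiction of HP investment cost can be illustrated with a toy payback calculation. All figures (investment, recovered heat, energy prices, COPs) are invented placeholders, not values from the model; the only structural assumption is that a higher heat-sink temperature lowers the heat pump's COP, which lengthens the payback:

```python
def hp_simple_payback(invest_eur, heat_mwh_per_year, fuel_price, elec_price, cop):
    # Annual saving = displaced fuel cost minus the electricity bought to run
    # the heat pump (heat delivered divided by COP).
    annual_saving = heat_mwh_per_year * fuel_price - (heat_mwh_per_year / cop) * elec_price
    return invest_eur / annual_saving

# Illustrative figures: 100 kEUR investment, 1000 MWh/yr of recovered heat,
# fuel at 40 EUR/MWh, electricity at 80 EUR/MWh.
paybacks = {temp: hp_simple_payback(100_000, 1_000, 40.0, 80.0, cop)
            for temp, cop in [("<60 degC", 4.5), ("60-100 degC", 3.0)]}
```

With these numbers the lower temperature range pays back in 4.5 years versus 7.5 years for the higher range, mirroring why the modelled gains concentrate at heat temperatures below 100 °C.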

  9. Bottom-up and Top-down Approaches to Explore Sodium Dodecyl Sulfate and Soluplus on the Crystallization Inhibition and Dissolution of Felodipine Extrudates.

    Science.gov (United States)

    Chen, Jiali; Chen, Yuqi; Huang, Wencong; Wang, Hanning; Du, Yang; Xiong, Subin

    2018-05-05

    The objectives of this study were to explore the effects of sodium dodecyl sulfate (SDS) and Soluplus on the crystallization inhibition and dissolution of felodipine (FLDP) extrudates by bottom-up and top-down approaches. FLDP extrudates with Soluplus and/or SDS were prepared by hot melt extrusion (HME) and characterized by PLM, DSC and FT-IR. Results indicated that Soluplus inhibited FLDP crystallization, and the wholly amorphous solid dispersions (ASDs) were the binary FLDP-Soluplus (1:3) and ternary FLDP-Soluplus-SDS (1:2:0.15∼0.3 and 1:3:0.2∼0.4) extrudates. Internal SDS (5%-10%) decreased the Tg values of the FLDP-Soluplus-SDS ternary ASDs without presenting molecular interactions with FLDP or Soluplus. An enhanced dissolution rate of the binary and ternary Soluplus-rich ASDs was achieved in the non-sink condition of 0.05% SDS. The bottom-up approach indicated that Soluplus was a much stronger crystallization inhibitor of supersaturated FLDP in solution than SDS. The top-down approach demonstrated that SDS enhanced the dissolution of Soluplus-rich ASDs via wettability and complexation with Soluplus, accelerating the medium uptake and erosion kinetics of the extrudates, but induced FLDP recrystallization and resulted in incomplete dissolution of FLDP-rich extrudates. In conclusion, the top-down approach is a promising strategy for exploring the mechanisms of ASDs' dissolution, and a small amount of SDS enhances the dissolution rate of polymer-rich ASDs in the non-sink condition. Copyright © 2018. Published by Elsevier Inc.

  10. Exploring the structure of fucosylated chondroitin sulfate through bottom-up nuclear magnetic resonance and electrospray ionization-high-resolution mass spectrometry approaches.

    Science.gov (United States)

    Santos, Gustavo RC; Porto, Ana CO; Soares, Paulo AG; Vilanova, Eduardo; Mourão, Paulo AS

    2017-07-01

    Fucosylated chondroitin sulfate (FCS) from sea cucumbers is composed of a chondroitin sulfate (CS) central core and branches of sulfated fucose. The structure of this complex glycosaminoglycan is usually investigated via nuclear magnetic resonance (NMR) analyses of the intact molecule, ergo through a top-down approach, which often yield spectra with intricate sets of signals. Here we employed a bottom-up approach to analyze the FCSs from the sea cucumbers Isostichopus badionotus and Ludwigothurea grisea from their basic constituents, viz. CS cores and sulfated fucose branches, obtained via systematic fragmentation through mild acid hydrolysis. Oligosaccharides derived from the central CS core were analyzed via NMR spectroscopy and the disaccharides produced using chondroitin sulfate lyase via SAX-HPLC. The CS cores from the two species were similar, showing only slight differences in the proportions of 4- or 6-monosulfated and 4,6-disulfated β-d-GalNAc. Sulfated fucose units released from the FCSs were analyzed via NMR spectroscopy and ESI-HRMS. The fucose units from each species presented extensive qualitative differences, but quantitative assessments of these units were hindered, mostly because of their extensive desulfation during the hydrolysis. The bottom-up analysis performed here has proved useful to explore the structure of FCS through a sum-of-the-parts approach in a qualitative manner. We further demonstrate that under specific acidification conditions particular fucose branches can be removed preferentially from FCS. Preparation of derivatives enriched with particular fucose branches could be useful for studies on "structure vs. biological function" of FCS. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  11. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: Contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G.; Quinn, Paul C.; Hu, Chao S.; Qian, Miao; Fu, Genyue; Lee, Kang

    2014-01-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese faces, Caucasian faces, and racially ambiguous morphed face stimuli. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time. PMID:25497461

  12. Disease-mediated bottom-up regulation: An emergent virus affects a keystone prey, and alters the dynamics of trophic webs.

    Science.gov (United States)

    Monterroso, Pedro; Garrote, Germán; Serronha, Ana; Santos, Emídio; Delibes-Mateos, Miguel; Abrantes, Joana; Perez de Ayala, Ramón; Silvestre, Fernando; Carvalho, João; Vasco, Inês; Lopes, Ana M; Maio, Elisa; Magalhães, Maria J; Mills, L Scott; Esteves, Pedro J; Simón, Miguel Ángel; Alves, Paulo C

    2016-10-31

    Emergent diseases may alter the structure and functioning of ecosystems by creating new biotic interactions and modifying existing ones, producing cascading processes along trophic webs. Recently, a new variant of the rabbit haemorrhagic disease virus (RHDV2 or RHDVb) arguably caused widespread declines in a keystone prey in Mediterranean ecosystems - the European rabbit (Oryctolagus cuniculus). We quantitatively assess the impact of RHDV2 on natural rabbit populations and in two endangered apex predator populations: the Iberian lynx (Lynx pardinus) and the Spanish Imperial eagle (Aquila adalberti). We found 60-70% declines in rabbit populations, followed by decreases of 65.7% in Iberian lynx and 45.5% in Spanish Imperial eagle fecundities. A revision of the web of trophic interactions among rabbits and their dependent predators suggests that RHDV2 acts as a keystone species, and may steer Mediterranean ecosystems to management-dependent alternative states, dominated by simplified mesopredator communities. This model system stresses the importance of diseases as functional players in the dynamics of trophic webs.

  13. Aberrant salience, self-concept clarity, and interview-rated psychotic-like experiences.

    Science.gov (United States)

    Cicero, David C; Docherty, Anna R; Becker, Theresa M; Martin, Elizabeth A; Kerns, John G

    2015-02-01

    Many social-cognitive models of psychotic-like symptoms posit a role for self-concept and aberrant salience. Previous work has shown that the interaction between aberrant salience and self-concept clarity is associated with self-reported psychotic-like experiences. In the current research with two structured interviews, the interaction between aberrant salience and self-concept clarity was found to be associated with interview-rated psychotic-like experiences. The interaction was associated with psychotic-like experiences composite scores, delusional ideation, grandiosity, and perceptual anomalies. In all cases, self-concept clarity was negatively associated with psychotic-like experiences at high levels of aberrant salience, but unassociated with psychotic-like experiences at low levels of aberrant salience. The interaction was specific to positive psychotic-like experiences and not present for negative or disorganized ratings. The interaction was not mediated by self-esteem levels. These results provide further evidence that aberrant salience and self-concept clarity play an important role in the generation of psychotic-like experiences.
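
The moderation pattern described here (self-concept clarity relating to psychotic-like experiences only at high aberrant salience) corresponds to a regression with an interaction term followed by simple-slopes analysis. The sketch below fabricates data with that pattern; sample size, coefficients, and variable names are illustrative, not the study's values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
aberrant = rng.normal(size=n)    # aberrant salience (standardized)
clarity = rng.normal(size=n)     # self-concept clarity (standardized)
# Fabricate an outcome with the reported pattern: clarity matters mainly
# when aberrant salience is high (a negative interaction coefficient).
psychotic = 0.3 * aberrant - 0.25 * aberrant * clarity + rng.normal(scale=0.5, size=n)

# OLS with an interaction term: y ~ b0 + b1*A + b2*C + b3*A*C
X = np.column_stack([np.ones(n), aberrant, clarity, aberrant * clarity])
beta, *_ = np.linalg.lstsq(X, psychotic, rcond=None)

# Simple slopes of clarity at low/high aberrant salience (-1 SD / +1 SD):
slope_low = beta[2] + beta[3] * (-1.0)
slope_high = beta[2] + beta[3] * (+1.0)
```

A negative `slope_high` alongside a near-zero `slope_low` reproduces the reported shape: clarity is negatively associated with psychotic-like experiences only at high levels of aberrant salience.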

  14. Assessing the construct validity of aberrant salience

    Directory of Open Access Journals (Sweden)

    Kristin Schmidt

    2009-12-01

    Full Text Available We sought to validate the psychometric properties of a recently developed paradigm that aims to measure salience attribution processes proposed to contribute to positive psychotic symptoms, the Salience Attribution Test (SAT. The “aberrant salience” measure from the SAT showed good face validity in previous results, with elevated scores both in high-schizotypy individuals, and in patients with schizophrenia suffering from delusions. Exploring the construct validity of salience attribution variables derived from the SAT is important, since other factors, including latent inhibition/learned irrelevance, attention, probabilistic reward learning, sensitivity to probability, general cognitive ability and working memory could influence these measures. Fifty healthy participants completed schizotypy scales, the SAT, a learned irrelevance task, and a number of other cognitive tasks tapping into potentially confounding processes. Behavioural measures of interest from each task were entered into a principal components analysis, which yielded a five-factor structure accounting for ~75% of the variance in behaviour. Implicit aberrant salience was found to load onto its own factor, which was associated with elevated “Introvertive Anhedonia” schizotypy, replicating our previous finding. Learned irrelevance loaded onto a separate factor, which also included implicit adaptive salience, but was not associated with schizotypy. Explicit adaptive and aberrant salience, along with a measure of probabilistic learning, loaded onto a further factor, though this also did not correlate with schizotypy. These results suggest that the measures of learned irrelevance and implicit adaptive salience might be based on similar underlying processes, which are dissociable both from implicit aberrant salience and explicit measures of salience.
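
The analysis pipeline (standardize the behavioural measures, extract principal components, inspect loadings) can be sketched as follows. The data are random placeholders shaped like the study (50 participants, here 8 hypothetical measures); the five-factor result and ~75% variance figure come from the real data, not from this toy:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for the behavioural measures entered into the PCA: rows are the 50
# participants, columns are 8 task variables (e.g. implicit aberrant salience,
# learned irrelevance, working memory). Values here are random placeholders.
X = rng.normal(size=(50, 8))
X = (X - X.mean(axis=0)) / X.std(axis=0)   # standardize each measure

# PCA via eigendecomposition of the correlation matrix.
corr = (X.T @ X) / (len(X) - 1)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]          # sort components by variance explained
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

explained = eigvals / eigvals.sum()
k = 5                                      # the five-factor structure reported
loadings = eigvecs[:, :k] * np.sqrt(eigvals[:k])   # variable-by-factor loadings
```

Inspecting which measures load heavily on which factor is then what supports statements such as "implicit aberrant salience loads onto its own factor".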

  15. A new method for calibrating perceptual salience across dimensions in infants: the case of color vs. luminance.

    Science.gov (United States)

    Kaldy, Zsuzsa; Blaser, Erik A; Leslie, Alan M

    2006-09-01

    We report a new method for calibrating differences in perceptual salience across feature dimensions, in infants. The problem of inter-dimensional salience arises in many areas of infant studies, but a general method for addressing the problem has not previously been described. Our method is based on a preferential looking paradigm, adapted to determine the relative salience of two stimuli. We report here on the case of stimuli differing in color and luminance, though the method has wider potential. We were able to determine on a psychophysical curve the point at which a color contrast was equally salient to infants as a given luminance contrast. We then used these calibrated, 'iso-salient' stimuli in an object memory study. Results showed that 6.5-month-old infants noticed a color, but not a luminance, change while tracking an occluded object. Our method should have numerous applications in the study of bottom-up effects on infant attention and visual working memory.
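
The calibration idea, locating the colour contrast at which looking preference is exactly 50% against a fixed luminance contrast, can be sketched with a logistic psychometric fit. The looking-preference numbers below are fabricated; only the procedure (fit a curve, then read off the 50% point) reflects the method described:

```python
import numpy as np

# Fabricated preferential-looking data: proportion of looking time directed
# at the colour stimulus when paired with a fixed luminance contrast.
color_contrast = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6])
pref_color = np.array([0.20, 0.30, 0.45, 0.60, 0.72, 0.85])

def logistic(x, mu, s):
    # Psychometric function: preference for colour as a function of contrast;
    # mu is the 50% point, i.e. the iso-salient colour contrast.
    return 1.0 / (1.0 + np.exp(-(x - mu) / s))

# Least-squares fit by a coarse grid search over (mu, s).
grid = [(mu, s) for mu in np.linspace(0.05, 0.65, 121)
                for s in np.linspace(0.02, 0.50, 97)]
mu_fit, s_fit = min(grid, key=lambda p: np.sum(
    (logistic(color_contrast, *p) - pref_color) ** 2))
iso_salient_contrast = mu_fit   # colour contrast equally salient as the luminance contrast
```

Stimuli built at `iso_salient_contrast` are the calibrated 'iso-salient' pairs that the memory study then uses.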

  16. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    Science.gov (United States)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    Recent interest in offshore renewable energy within the United States has amplified the need for marine spatial planning to direct management strategies and address competing user demands. To assist this effort in Rhode Island, benthic habitat classification maps were developed for two sites in offshore waters being considered for wind turbine installation. Maps characterizing and representing the distribution and extent of benthic habitats are valuable tools for improving understanding of ecosystem patterns and processes, and promoting scientifically-sound management decisions. This project presented the opportunity to conduct a comparison of the methodologies and resulting map outputs of two classification approaches, “top-down” and “bottom-up” in the two study areas. This comparison was undertaken to improve understanding of mapping methodologies and their applicability, including the bottom-up approach in offshore environments where data density tends to be lower, as well as to provide case studies for scientists and managers to consider for their own areas of interest. Such case studies can offer guidance for future work for assessing methodologies and translating them to other areas. The traditional top-down mapping approach identifies biological community patterns based on communities occurring within geologically defined habitat map units, under the concept that geologic environments contain distinct biological assemblages. Alternatively, the bottom-up approach aims to establish habitat map units centered on biological similarity and then uses statistics to identify relationships with associated environmental parameters and determine habitat boundaries. When applied to the two study areas, both mapping approaches produced habitat classes with distinct macrofaunal assemblages and each established statistically strong and significant biotic-abiotic relationships with geologic features, sediment characteristics, water depth, and/or habitat
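
A minimal sketch of the 'bottom-up' logic, grouping stations by faunal similarity first and then checking the grouping against an environmental variable, is shown below with invented count data and a naive two-group split standing in for the full hierarchical clustering and statistics:

```python
import numpy as np

def bray_curtis(a, b):
    # Bray-Curtis dissimilarity between two abundance vectors (0 = identical).
    return np.abs(a - b).sum() / (a + b).sum()

# 6 stations x 4 taxa, invented counts: stations 0-2 share one macrofaunal
# assemblage, stations 3-5 another.
counts = np.array([[30, 5, 0, 1], [28, 6, 1, 0], [25, 4, 0, 2],
                   [2, 1, 20, 15], [0, 3, 22, 14], [1, 2, 18, 16]])
depth = np.array([10.0, 12.0, 11.0, 40.0, 42.0, 39.0])   # water depth, m

n = len(counts)
dist = np.array([[bray_curtis(counts[i], counts[j]) for j in range(n)]
                 for i in range(n)])

# Naive two-group split (stand-in for the clustering used in practice):
# assign each station to whichever of two seed stations it is nearer to.
groups = np.where(dist[:, 0] < dist[:, 3], 0, 1)
mean_depth = [depth[groups == g].mean() for g in (0, 1)]   # biotic-abiotic link
```

The biologically defined groups separating cleanly by depth is the kind of biotic-abiotic relationship the bottom-up approach then tests statistically to set habitat boundaries.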

  17. Top-down and bottom-up: Front to back. Comment on "Move me, astonish me... delight my eyes and brain: The Vienna Integrated Model of top-down and bottom-up processes in Art Perception (VIMAP) and corresponding affective, evaluative, and neurophysiological correlates" by Matthew Pelowski et al.

    Science.gov (United States)

    Nadal, Marcos; Skov, Martin

    2017-07-01

    The model presented here [1] is the latest in an evolving series of psychological models aimed at explaining the experience of art, first proposed by Leder and colleagues [2]. The aim of this new version is to "explicitly connect early bottom-up, artwork-derived processing sequence and outputs to top-down, viewer-derived contribution to the processing sequence" [1, p. 5f & 6]. The "meeting" of these two processing sequences, the authors contend, is crucial to the understanding of people's responses to art [sections 3.6ff & 4], and is therefore the new model's principal motivation.

  18. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Directory of Open Access Journals (Sweden)

    W. J. F. Acton

    2016-06-01

    Full Text Available This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton-transfer-reaction mass spectrometer (PTR-MS) and a proton-transfer-reaction time-of-flight mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (using PTR-MS) and eddy covariance (using PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean daytime flux of 1.9 mg m−2 h−1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28-day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight overestimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m−2 h−1 was calculated using the Model of Emissions of Gases and Aerosols from Nature (MEGAN) isoprene emission algorithms (Guenther et al., 2006). A detailed tree-species distribution map for the site enabled the leaf-level emission of isoprene and monoterpenes recorded using gas-chromatography mass spectrometry (GC–MS) to be scaled up to produce a bottom-up canopy-scale flux. This was compared with the top-down canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant-species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.

  19. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Science.gov (United States)

    Acton, W. Joe F.; Schallhart, Simon; Langford, Ben; Valach, Amy; Rantala, Pekka; Fares, Silvano; Carriero, Giulia; Tillmann, Ralf; Tomlinson, Sam J.; Dragosits, Ulrike; Gianelle, Damiano; Hewitt, C. Nicholas; Nemitz, Eiko

    2016-06-01

    This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton-transfer-reaction mass spectrometer (PTR-MS) and a proton-transfer-reaction time-of-flight mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (using PTR-MS) and eddy covariance (using PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean daytime flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28-day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight overestimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the Model of Emissions of Gases and Aerosols from Nature (MEGAN) isoprene emission algorithms (Guenther et al., 2006). A detailed tree-species distribution map for the site enabled the leaf-level emission of isoprene and monoterpenes recorded using gas-chromatography mass spectrometry (GC-MS) to be scaled up to produce a bottom-up canopy-scale flux. This was compared with the top-down canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant-species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.
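
The bottom-up scale-up amounts to weighting each species' leaf-level emission rate by its foliar biomass and its share of the flux footprint, then summing. The numbers below are invented placeholders, not the measured values; only the scaling arithmetic reflects the described method:

```python
# Illustrative leaf-level emission factors, foliar biomass densities, and
# footprint shares (all assumed, not the paper's data).
leaf_emission = {"oak": 10.0, "hornbeam": 0.1}       # ug g-1 h-1 (isoprene)
foliar_biomass = {"oak": 300.0, "hornbeam": 250.0}   # g m-2
footprint_fraction = {"oak": 0.6, "hornbeam": 0.4}   # share of flux footprint

canopy_flux_ug = sum(leaf_emission[s] * foliar_biomass[s] * footprint_fraction[s]
                     for s in leaf_emission)         # ug m-2 h-1
canopy_flux_mg = canopy_flux_ug / 1000.0             # -> mg m-2 h-1
```

Comparing such a bottom-up estimate against the eddy-covariance (top-down) flux, footprint by footprint, is what revealed the isoprene underestimation reported above.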

  20. Linking top-down and bottom-up approaches for assessing the vulnerability of a 100 % renewable energy system in Northern-Italy

    Science.gov (United States)

    Borga, Marco; Francois, Baptiste; Hingray, Benoit; Zoccatelli, Davide; Creutin, Jean-Dominique; brown, Casey

    2016-04-01

    Due to their variable and un-controllable features, the integration of Variable Renewable Energies (e.g. solar power, wind power and hydropower, denoted VRE) into the electricity network implies higher production variability and an increased risk of not meeting demand. Two approaches are commonly used for assessing this risk, and especially its evolution in a global change context (i.e. climate and societal changes): top-down and bottom-up approaches. The general idea of a top-down approach is to assess the effects of global change, or of some of its key aspects (e.g. the effects of COP 21, of the deployment of Smart Grids, or of climate change), on the system of interest with chains of loosely linked simulation models within a predictive framework. The bottom-up approach aims to improve understanding of the dependencies between the vulnerability of regional systems and large-scale phenomena from knowledge gained through detailed exploration of the response to change of the system of interest, which may reveal vulnerability thresholds and tipping points as well as potential opportunities. Brown et al. (2012) defined an analytical framework to merge these two approaches. The objective is to build a set of Climate Response Functions (CRFs) combining 1) indicators of desired states ("success") and undesired states ("failure") of a system, defined in collaboration with stakeholders, 2) an exhaustive exploration of the effects of uncertain forcings and imperfect system understanding on the response of the system to a plausible set of possible changes, implemented with a multi-dimensionally consistent "stress test" algorithm, and 3) a set of "ex post" hydroclimatic and socioeconomic scenarios that provide insight into the differential effectiveness of alternative policies and serve as entry points for the provision of climate information to inform policy evaluation and choice.
We adapted this approach for analyzing a 100 % renewable energy system within a region
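    The stress-test step of such a Climate Response Function can be sketched as a scan of a system-performance indicator over a grid of perturbed forcings. The sketch below is purely illustrative: the production and demand series, the perturbation ranges, and the 0.9 reliability threshold are hypothetical assumptions, not values from the study.

```python
import numpy as np

def reliability(prod_factor, demand_factor, base_prod, base_demand):
    """Fraction of time steps in which scaled VRE production meets scaled demand."""
    return float(np.mean(prod_factor * base_prod >= demand_factor * base_demand))

# Hypothetical baseline: one year of daily VRE production (MW) and flat demand.
rng = np.random.default_rng(0)
base_prod = np.clip(100.0 + 30.0 * rng.standard_normal(365), 0.0, None)
base_demand = np.full(365, 90.0)

# Stress-test grid: -30%..+30% production change x 0%..+40% demand growth.
prod_factors = np.linspace(0.7, 1.3, 7)
demand_factors = np.linspace(1.0, 1.4, 5)
crf = np.array([[reliability(p, d, base_prod, base_demand)
                 for d in demand_factors] for p in prod_factors])

# "Failure" domain: grid cells below a stakeholder-defined reliability threshold.
failure = crf < 0.9
```

    The resulting `crf` surface, contoured against the success/failure threshold, is the kind of object "ex post" climate scenarios can then be projected onto.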

  1. Optimization of mass spectrometric parameters improve the identification performance of capillary zone electrophoresis for single-shot bottom-up proteomics analysis.

    Science.gov (United States)

    Zhang, Zhenbin; Dovichi, Norman J

    2018-02-25

    The effects of MS1 injection time, MS2 injection time, dynamic exclusion time, intensity threshold, and isolation width were investigated on the numbers of peptide and protein identifications for single-shot bottom-up proteomics analysis using CZE-MS/MS analysis of a Xenopus laevis tryptic digest. An electrokinetically pumped nanospray interface was used to couple a linear-polyacrylamide coated capillary to a Q Exactive HF mass spectrometer. A sensitive method that used a 1.4 Th isolation width, 60,000 MS2 resolution, 110 ms MS2 injection time, and a top 7 fragmentation produced the largest number of identifications when the CZE loading amount was less than 100 ng. A programmable autogain control method (pAGC) that used a 1.4 Th isolation width, 15,000 MS2 resolution, 110 ms MS2 injection time, and top 10 fragmentation produced the largest number of identifications for CZE loading amounts greater than 100 ng; 7218 unique peptides and 1653 protein groups were identified from 200 ng by using the pAGC method. The effect of mass spectrometer conditions on the performance of UPLC-MS/MS was also investigated. A fast method that used a 1.4 Th isolation width, 30,000 MS2 resolution, 45 ms MS2 injection time, and top 12 fragmentation produced the largest number of identifications for 200 ng UPLC loading amount (6025 unique peptides and 1501 protein groups). This is the first report where the identification number for CZE surpasses that of the UPLC at the 200 ng loading level. However, more peptides (11476) and protein groups (2378) were identified by using UPLC-MS/MS when the sample loading amount was increased to 2 μg with the fast method. To exploit the fast scan speed of the Q-Exactive HF mass spectrometer, higher sample loading amounts are required for single-shot bottom-up proteomics analysis using CZE-MS/MS. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Evaluation of Modeling NO2 Concentrations Driven by Satellite-Derived and Bottom-Up Emission Inventories Using In-Situ Measurements Over China

    Science.gov (United States)

    Liu, Fei; van der A, Ronald J.; Eskes, Henk; Ding, Jieying; Mijling, Bas

    2018-01-01

    Chemical transport models together with emission inventories are widely used to simulate NO2 concentrations over China, but validation of the simulations with in situ measurements has been extremely limited. Here we use ground measurements obtained from the air quality monitoring network recently developed by the Ministry of Environmental Protection of China to validate modeling surface NO2 concentrations from the CHIMERE regional chemical transport model driven by the satellite-derived DECSO and the bottom-up MIX emission inventories. We applied a correction factor to the observations to account for the interferences of other oxidized nitrogen compounds (NOz), based on the modeled ratio of NO2 to NOz. The model accurately reproduces the spatial variability in NO2 from in situ measurements, with a spatial correlation coefficient of over 0.7 for simulations based on both inventories. A negative and positive bias is found for the simulation with the DECSO (slope = 0.74 and 0.64 for the daily mean and daytime only) and the MIX (slope = 1.3 and 1.1) inventories, respectively, suggesting an underestimation and overestimation of NOx emissions from corresponding inventories. The bias between observed and modeled concentrations is reduced, with the slope dropping from 1.3 to 1.0 when the spatial distribution of NOx emissions in the DECSO inventory is applied as the spatial proxy for the MIX inventory, which suggests an improvement of the distribution of emissions between urban and suburban or rural areas in the DECSO inventory compared to that used in the bottom-up inventory. A rough estimate indicates that the observed concentrations, from sites predominantly placed in the populated urban areas, may be 10-40% higher than the corresponding model grid cell mean. This reduces the estimate of the negative bias of the DECSO-based simulation to the range of -30 to 0% on average and more firmly establishes that the MIX inventory is biased high over major cities. 
The performance of

  3. Evaluation of modeling NO2 concentrations driven by satellite-derived and bottom-up emission inventories using in situ measurements over China

    Science.gov (United States)

    Liu, Fei; van der A, Ronald J.; Eskes, Henk; Ding, Jieying; Mijling, Bas

    2018-03-01

    Chemical transport models together with emission inventories are widely used to simulate NO2 concentrations over China, but validation of the simulations with in situ measurements has been extremely limited. Here we use ground measurements obtained from the air quality monitoring network recently developed by the Ministry of Environmental Protection of China to validate modeling surface NO2 concentrations from the CHIMERE regional chemical transport model driven by the satellite-derived DECSO and the bottom-up MIX emission inventories. We applied a correction factor to the observations to account for the interferences of other oxidized nitrogen compounds (NOz), based on the modeled ratio of NO2 to NOz. The model accurately reproduces the spatial variability in NO2 from in situ measurements, with a spatial correlation coefficient of over 0.7 for simulations based on both inventories. A negative and positive bias is found for the simulation with the DECSO (slope = 0.74 and 0.64 for the daily mean and daytime only) and the MIX (slope = 1.3 and 1.1) inventories, respectively, suggesting an underestimation and overestimation of NOx emissions from corresponding inventories. The bias between observed and modeled concentrations is reduced, with the slope dropping from 1.3 to 1.0 when the spatial distribution of NOx emissions in the DECSO inventory is applied as the spatial proxy for the MIX inventory, which suggests an improvement of the distribution of emissions between urban and suburban or rural areas in the DECSO inventory compared to that used in the bottom-up inventory. A rough estimate indicates that the observed concentrations, from sites predominantly placed in the populated urban areas, may be 10-40 % higher than the corresponding model grid cell mean. This reduces the estimate of the negative bias of the DECSO-based simulation to the range of -30 to 0 % on average and more firmly establishes that the MIX inventory is biased high over major cities. The
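    The slope diagnostic reported above (modeled versus observed concentrations) can be illustrated with a least-squares fit through the origin; the station values below are hypothetical, chosen only so the fits reproduce the quoted daily-mean slopes.

```python
import numpy as np

def slope_through_origin(obs, mod):
    """Least-squares slope of modeled vs observed concentrations (zero intercept):
    slope < 1 suggests the inventory underestimates NOx emissions,
    slope > 1 suggests overestimation."""
    obs = np.asarray(obs, float)
    mod = np.asarray(mod, float)
    return float(np.dot(obs, mod) / np.dot(obs, obs))

# Hypothetical station-mean NO2 concentrations (ug/m3).
obs = np.array([10.0, 20.0, 30.0, 40.0])
mod_decso = 0.74 * obs   # DECSO-like low bias
mod_mix = 1.30 * obs     # MIX-like high bias
```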

  4. A two-step combination of top-down and bottom-up fire emission estimates at regional and global scales: strengths and main uncertainties

    Science.gov (United States)

    Sofiev, Mikhail; Soares, Joana; Kouznetsov, Rostislav; Vira, Julius; Prank, Marje

    2016-04-01

    Top-down emission estimation via inverse dispersion modelling is used for various problems where bottom-up approaches are difficult or highly uncertain. One such area is the estimation of emissions from wild-land fires. In combination with dispersion modelling, satellite and/or in-situ observations can, in principle, be used to efficiently constrain the emission values. This is the main strength of the approach: the a priori values of the emission factors (based on laboratory studies) are refined for real-life situations using the inverse-modelling technique. However, the approach also has major uncertainties, which are illustrated here with a few examples from the Integrated System for wild-land Fires (IS4FIRES). IS4FIRES generates the smoke emission and injection profile from MODIS and SEVIRI active-fire radiative energy observations. The emission calculation includes two steps: (i) initial top-down calibration of emission factors via inverse dispersion problem solution, made once using a training dataset from the past, and (ii) application of the obtained emission coefficients to individual-fire radiative energy observations, thus leading to a bottom-up emission compilation. For such a procedure, the major classes of uncertainties include: (i) imperfect information on fires, (ii) simplifications in the fire description, (iii) inaccuracies in the smoke observations and modelling, and (iv) inaccuracies of the inverse problem solution. Using examples of the fire seasons of 2010 in Russia, 2012 in Eurasia, 2007 in Australia, etc., it is pointed out that the top-down system calibration performed for a limited number of comparatively moderate cases (often the best-observed ones) may lead to errors when applied to extreme events. For instance, the total emission of the 2010 Russian fires is likely to be over-estimated by up to 50% if the calibration is based on the 2006 season and the fire description is simplified. 
Longer calibration period and more sophisticated parameterization
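    The two-step procedure can be caricatured as follows; here the full inverse dispersion problem of step (i) is collapsed to a one-parameter least-squares fit, and all numbers and function names are hypothetical.

```python
import numpy as np

def calibrate_emission_factor(frp_train, emission_obs):
    """Step (i), top-down: least-squares fit of alpha in E = alpha * FRP against
    observation-constrained emissions (a scalar stand-in for the full inverse
    dispersion problem solved once over a past training dataset)."""
    frp = np.asarray(frp_train, float)
    obs = np.asarray(emission_obs, float)
    return float(np.dot(frp, obs) / np.dot(frp, frp))

def bottom_up_emissions(alpha, frp_fires):
    """Step (ii), bottom-up: apply the calibrated coefficient to individual
    fire radiative energy observations."""
    return alpha * np.asarray(frp_fires, float)

# Hypothetical training fires (FRP in arbitrary units) and a new fire season.
alpha = calibrate_emission_factor([1.0, 2.0, 3.0], [2.1, 3.9, 6.0])
season_emissions = bottom_up_emissions(alpha, [10.0, 5.0, 0.5])
```

    The failure mode described in the abstract corresponds to fitting `alpha` on moderate training fires and then extrapolating it to fires far outside that range.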

  5. Isolating the Incentive Salience of Reward-Associated Stimuli: Value, Choice, and Persistence

    Science.gov (United States)

    Beckmann, Joshua S.; Chow, Jonathan J.

    2015-01-01

    Sign- and goal-tracking are differentially associated with drug abuse-related behavior. Recently, it has been hypothesized that sign- and goal-tracking behavior are mediated by different neurobehavioral valuation systems, including differential incentive salience attribution. Herein, we used different conditioned stimuli to preferentially elicit…

  6. MPEG-4 AVC saliency map computation

    Science.gov (United States)

    Ammar, M.; Mitrea, M.; Hasnaoui, M.

    2014-02-01

    A saliency map provides information about the regions inside some visual content (image, video, ...) at which a human observer will spontaneously look. For saliency map computation, current research studies consider the uncompressed (pixel) representation of the visual content and extract various types of information (intensity, color, orientation, motion energy) which are then fused. This paper goes one step further and computes the saliency map directly from the MPEG-4 AVC stream syntax elements, with minimal decoding operations. In this respect, an a priori in-depth study of the MPEG-4 AVC syntax elements is first carried out so as to identify the entities that attract visual attention. Secondly, the MPEG-4 AVC reference software is completed with software tools allowing the parsing of these elements and their subsequent usage in objective benchmarking experiments. This way, it is demonstrated that an MPEG-4 AVC saliency map can be given by a combination of static saliency and motion maps. This saliency map is experimentally validated under a robust watermarking framework. When included in an m-QIM (multiple-symbol Quantization Index Modulation) insertion method, average PSNR gains of 2.43 dB, 2.15 dB, and 2.37 dB are obtained for data payloads of 10, 20 and 30 watermarked blocks per I frame, i.e. about 30, 60, and 90 bits/second, respectively. These quantitative results are obtained from processing 2 hours of heterogeneous video content.
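    The final combination of static and motion maps can be sketched as a weighted fusion of normalised maps. The min-max normalisation and the 0.5 weight below are illustrative assumptions, not the fusion rule of the paper.

```python
import numpy as np

def fuse_saliency(static_map, motion_map, w_static=0.5):
    """Weighted fusion of a static and a motion saliency map, each min-max
    normalised to [0, 1] first; the weight is an illustrative choice."""
    def norm(m):
        m = np.asarray(m, float)
        span = m.max() - m.min()
        return (m - m.min()) / span if span > 0 else np.zeros_like(m)
    return w_static * norm(static_map) + (1.0 - w_static) * norm(motion_map)

# Tiny hypothetical 2x2 maps (e.g. one derived from intra-prediction modes,
# one from motion-vector magnitudes of the parsed stream).
static = np.array([[0.0, 2.0], [4.0, 8.0]])
motion = np.array([[1.0, 1.0], [1.0, 3.0]])
fused = fuse_saliency(static, motion)
```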

  7. Bottom-up-then-up-down Route for Multi-level Construction of Hierarchical Bi2S3 Superstructures with Magnetism Alteration

    Science.gov (United States)

    Wei, Chengzhen; Wang, Lanfang; Dang, Liyun; Chen, Qun; Lu, Qingyi; Gao, Feng

    2015-01-01

    A bottom-up-then-up-down route was proposed to construct multi-level Bi2S3 hierarchical architectures assembled from two-dimensional (2D) Bi2S3 sheet-like networks. BiOCOOH hollow spheres and flower-like structures, both assembled from 2D BiOCOOH nanosheets, were first prepared by a “bottom-up” route through a “quasi-emulsion” mechanism. The BiOCOOH hierarchical structures were then transformed into hierarchical Bi2S3 architectures through an “up-down” route by an ion-exchange method. The obtained Bi2S3 nanostructures retain the hollow-spherical and flower-like morphologies of the precursors, but the building blocks are changed to 2D sheet-like networks interwoven from Bi2S3 nanowires. The close matching of crystal lattices between Bi2S3 and BiOCOOH is believed to be the key reason for the topotactic transformation from BiOCOOH nanosheets to 2D Bi2S3 sheet-like nanowire networks. Magnetism studies reveal that, unlike the diamagnetism of comparative Bi2S3 nanostructures, the obtained multi-level Bi2S3 structures display S-type hysteresis and ferromagnetism at low field, which might result from the ordered structure of the 2D networks. PMID:26028331

  8. A Vulnerability-Based, Bottom-up Assessment of Future Riverine Flood Risk Using a Modified Peaks-Over-Threshold Approach and a Physically Based Hydrologic Model

    Science.gov (United States)

    Knighton, James; Steinschneider, Scott; Walter, M. Todd

    2017-12-01

    There is a chronic disconnection among purely probabilistic flood frequency analysis of flood hazards, flood risks, and hydrological flood mechanisms, which hampers our ability to assess future flood impacts. We present a vulnerability-based approach to estimating riverine flood risk that accommodates a more direct linkage between decision-relevant metrics of risk and the dominant mechanisms that cause riverine flooding. We adapt the conventional peaks-over-threshold (POT) framework to be used with extreme precipitation from different climate processes and rainfall-runoff-based model output. We quantify the probability that at least one adverse hydrologic threshold, potentially defined by stakeholders, will be exceeded within the next N years. This approach allows us to consider flood risk as the summation of risk from separate atmospheric mechanisms, and supports a more direct mapping between hazards and societal outcomes. We perform this analysis within a bottom-up framework to consider the relevance and consequences of information, with varying levels of credibility, on changes to atmospheric patterns driving extreme precipitation events. We demonstrate our proposed approach using a case study for Fall Creek in Ithaca, NY, USA, where we estimate the risk of stakeholder-defined flood metrics from three dominant mechanisms: summer convection, tropical cyclones, and spring rain and snowmelt. Using downscaled climate projections, we determine how flood risk associated with a subset of mechanisms may change in the future, and the resultant shift in annual flood risk. The flood risk approach we propose can provide powerful new insights into future flood threats.
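    Treating flood risk as the combination of independent mechanism-specific threshold exceedances, the headline quantity (probability of at least one exceedance within N years) reduces to a one-line formula. The independence assumption and the annual probabilities below are illustrative, not the paper's fitted values.

```python
def risk_at_least_one(p_mech, n_years):
    """Probability of at least one threshold exceedance within n_years, given
    independent annual exceedance probabilities for each flood-generating
    mechanism (e.g. summer convection, tropical cyclones, spring rain/snowmelt)."""
    p_none_per_year = 1.0
    for p in p_mech:
        p_none_per_year *= 1.0 - p
    return 1.0 - p_none_per_year ** n_years

# Hypothetical annual exceedance probabilities for three mechanisms over a
# 30-year planning horizon.
total_risk = risk_at_least_one([0.01, 0.005, 0.02], 30)
```

    Because the mechanisms enter separately, the effect of a climate-driven change in any one of them (say, tropical-cyclone rainfall) propagates directly to the total.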

  9. Toward improved prediction of the bedrock depth underneath hillslopes: Bayesian inference of the bottom-up control hypothesis using high-resolution topographic data

    Science.gov (United States)

    Gomes, Guilherme J. C.; Vrugt, Jasper A.; Vargas, Eurípedes A.

    2016-04-01

    The depth to bedrock controls a myriad of processes by influencing subsurface flow paths, erosion rates, soil moisture, and water uptake by plant roots. As hillslope interiors are very difficult and costly to illuminate and access, the topography of the bedrock surface is largely unknown. This essay is concerned with the prediction of spatial patterns in the depth to bedrock (DTB) using high-resolution topographic data, numerical modeling, and Bayesian analysis. Our DTB model builds on the bottom-up control on fresh-bedrock topography hypothesis of Rempe and Dietrich (2014) and includes a mass movement and bedrock-valley morphology term to extend the usefulness and general applicability of the model. We reconcile the DTB model with field observations using Bayesian analysis with the DREAM algorithm. We investigate explicitly the benefits of using spatially distributed parameter values to account implicitly, and in a relatively simple way, for rock mass heterogeneities that are very difficult, if not impossible, to characterize adequately in the field. We illustrate our method using an artificial data set of bedrock depth observations and then evaluate our DTB model with real-world data collected at the Papagaio river basin in Rio de Janeiro, Brazil. Our results demonstrate that the DTB model accurately predicts the observed bedrock depth data. The posterior mean DTB simulation is shown to be in good agreement with the measured data. The posterior prediction uncertainty of the DTB model can be propagated forward through hydromechanical models to derive probabilistic estimates of factors of safety.
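    The Bayesian reconciliation step can be sketched with a random-walk Metropolis sampler standing in for DREAM, fitted to a deliberately simplified linear DTB-versus-hillslope-position model with synthetic observations. Everything here (the model form, noise level, step size, burn-in) is a hypothetical illustration, not the paper's setup.

```python
import numpy as np

def metropolis(log_post, theta0, n_iter=5000, step=0.1, seed=1):
    """Minimal random-walk Metropolis sampler (a stand-in for DREAM) drawing
    from the posterior of the DTB-model parameters."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, float)
    lp = log_post(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Synthetic data: depth to bedrock (m) decreasing linearly with normalised
# distance upslope, observed with 0.2 m Gaussian noise.
x = np.linspace(0.0, 1.0, 20)
dtb_obs = 5.0 - 3.0 * x + 0.2 * np.random.default_rng(0).standard_normal(20)

def log_post(theta):
    a, b = theta                        # flat prior, Gaussian likelihood
    resid = dtb_obs - (a - b * x)
    return -0.5 * np.sum(resid ** 2) / 0.2 ** 2

chain = metropolis(log_post, [1.0, 1.0])
a_hat, b_hat = chain[2500:].mean(axis=0)   # posterior means after burn-in
```

    The spread of the post-burn-in chain plays the role of the posterior prediction uncertainty that the abstract proposes to propagate through hydromechanical models.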

  10. Bottom-up derivation of conservative and dissipative interactions for coarse-grained molecular liquids with the conditional reversible work method

    International Nuclear Information System (INIS)

    Deichmann, Gregor; Marcon, Valentina; Vegt, Nico F. A. van der

    2014-01-01

    Molecular simulations of soft matter systems have been performed in recent years using a variety of systematically coarse-grained models. With these models, structural or thermodynamic properties can be quite accurately represented while the prediction of dynamic properties remains difficult, especially for multi-component systems. In this work, we use constraint molecular dynamics simulations for calculating dissipative pair forces which are used together with conditional reversible work (CRW) conservative forces in dissipative particle dynamics (DPD) simulations. The combined CRW-DPD approach aims to extend the representability of CRW models to dynamic properties and uses a bottom-up approach. Dissipative pair forces are derived from fluctuations of the direct atomistic forces between mapped groups. The conservative CRW potential is obtained from a similar series of constraint dynamics simulations and represents the reversible work performed to couple the direct atomistic interactions between the mapped atom groups. Neopentane, tetrachloromethane, cyclohexane, and n-hexane have been considered as model systems. These molecular liquids are simulated with atomistic molecular dynamics, coarse-grained molecular dynamics, and DPD. We find that the CRW-DPD models reproduce the liquid structure and diffusive dynamics of the liquid systems in reasonable agreement with the atomistic models when using single-site mapping schemes with beads containing five or six heavy atoms. For a two-site representation of n-hexane (3 carbons per bead), time scale separation can no longer be assumed and the DPD approach consequently fails to reproduce the atomistic dynamics.
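    The three standard DPD pair-force contributions discussed here can be sketched as follows. The soft linear conservative force and the constant friction coefficient are generic DPD stand-ins for the tabulated CRW potential and the fluctuation-derived dissipative force of the paper, and all parameter values are illustrative.

```python
import numpy as np

def dpd_pair_force(r_vec, v_rel, a=25.0, gamma=4.5, kBT=1.0, dt=0.01, rc=1.0,
                   rng=np.random.default_rng(2)):
    """Conservative + dissipative + random DPD force on one bead of a pair.
    sigma**2 = 2*gamma*kBT enforces the fluctuation-dissipation theorem."""
    r = np.linalg.norm(r_vec)
    if r >= rc:
        return np.zeros(3)               # all contributions vanish beyond cutoff
    e = r_vec / r                        # unit vector between bead centres
    w = 1.0 - r / rc                     # standard DPD weight function w_R
    f_cons = a * w * e
    f_diss = -gamma * w ** 2 * np.dot(e, v_rel) * e   # w_D = w_R**2
    sigma = np.sqrt(2.0 * gamma * kBT)
    f_rand = sigma * w * rng.standard_normal() / np.sqrt(dt) * e
    return f_cons + f_diss + f_rand

# With kBT = 0 the force is deterministic, which makes the sketch easy to check.
f = dpd_pair_force(np.array([0.5, 0.0, 0.0]), np.array([1.0, 0.0, 0.0]), kBT=0.0)
```

    In the CRW-DPD scheme, `f_cons` would be replaced by the reversible-work potential and `gamma * w**2` by the distance-dependent friction extracted from constrained atomistic force fluctuations.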

  11. Effects of bottom-up and top-down intervention principles in emergent literacy in children at risk of developmental dyslexia: a longitudinal study.

    Science.gov (United States)

    Helland, Turid; Tjus, Tomas; Hovden, Marit; Ofte, Sonja; Heimann, Mikael

    2011-01-01

    This longitudinal study focused on the effects of two different principles of intervention in children at risk of developing dyslexia from 5 to 8 years old. The children were selected on the basis of a background questionnaire given to parents and preschool teachers, with cognitive and functional magnetic resonance imaging results substantiating group differences in neuropsychological processes associated with phonology, orthography, and phoneme-grapheme correspondence (i.e., alphabetic principle). The two principles of intervention were bottom-up (BU), "from sound to meaning", and top-down (TD), "from meaning to sound." Thus, four subgroups were established: risk/BU, risk/TD, control/BU, and control/TD. Computer-based training took place for 2 months every spring, and cognitive assessments were performed each fall of the project period. Measures of preliteracy skills for reading and spelling were phonological awareness, working memory, verbal learning, and letter knowledge. Literacy skills were assessed by word reading and spelling. At project end the control group scored significantly above age norm, whereas the risk group scored within the norm. In the at-risk group, training based on the BU principle had the strongest effects on phonological awareness and working memory scores, whereas training based on the TD principle had the strongest effects on verbal learning, letter knowledge, and literacy scores. It was concluded that appropriate, specific, data-based intervention starting in preschool can mitigate literacy impairment and that interventions should contain BU training for preliteracy skills and TD training for literacy training.

  12. When top-down becomes bottom up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Science.gov (United States)

    Orihuela, Gabriela; Terborgh, John; Ceballos, Natalia; Glander, Kenneth

    2014-01-01

    Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto) where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana) offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviours are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment.

  13. Synthesis of a Cementitious Material Nanocement Using Bottom-Up Nanotechnology Concept: An Alternative Approach to Avoid CO2 Emission during Production of Cement

    Directory of Open Access Journals (Sweden)

    Byung Wan Jo

    2014-01-01

    The world’s increasing need is to develop smart and sustainable construction materials that generate minimal climate-changing gas during their production. Bottom-up nanotechnology has established itself as a promising alternative technique for the production of cementitious material. The present investigation deals with the chemical synthesis of cementitious material using nanosilica, sodium aluminate, sodium hydroxide, and calcium nitrate as reacting phases. The characteristic properties of the chemically synthesized nanocement were verified by chemical composition analysis, setting time measurement, particle size distribution, fineness analysis, and SEM and XRD analyses. Finally, the performance of the nanocement was ensured by the fabrication and characterization of nanocement-based mortar. Comparing the results with a commercially available cement product, it is demonstrated that the chemically synthesized nanocement not only shows better physical and mechanical performance, but also brings several encouraging impacts to society, including the reduction of CO2 emission and the development of sustainable construction material. A plausible reaction scheme has been proposed to explain the synthesis and the overall performance of the nanocement.

  14. On the advantages of spring magnets compared to pure FePt: Strategy for rare-earth free permanent magnets following a bottom-up approach

    Energy Technology Data Exchange (ETDEWEB)

    Pousthomis, M.; Garnero, C. [Université de Toulouse, UMR 5215 INSA, CNRS, UPS, Laboratoire de Physique et Chimie des Nano-Objets, 135 avenue de Rangueil, F-31077 Toulouse Cedex 4 (France); Marcelot, C.G. [Université de Toulouse, UMR 5215 INSA, CNRS, UPS, Laboratoire de Physique et Chimie des Nano-Objets, 135 avenue de Rangueil, F-31077 Toulouse Cedex 4 (France); Centre d’Elaboration de Matériaux et d’Etudes Structurales, CEMES-CNRS, 29 rue Jeanne Marvig, B.P. 94347, 31055 Toulouse (France); Blon, T.; Cayez, S. [Université de Toulouse, UMR 5215 INSA, CNRS, UPS, Laboratoire de Physique et Chimie des Nano-Objets, 135 avenue de Rangueil, F-31077 Toulouse Cedex 4 (France); Cassignol, C.; Du, V.A.; Krispin, M. [Siemens AG, Corporate Technology, Munich (Germany); Arenal, R. [Transpyrenean Advanced Laboratory for Electron Microscopy (TALEM), INSA - INA, CNRS - Universidad de Zaragoza, 30155 Toulouse (France); Laboratorio de Microscopias Avanzadas (LMA), Instituto de Nanociencia de Aragon (INA), U. Zaragoza, C/Mariano Esquillor s/n, 50018 Zaragoza (Spain); Fundacion ARAID, 50018 Zaragoza (Spain); Soulantica, K.; Viau, G. [Université de Toulouse, UMR 5215 INSA, CNRS, UPS, Laboratoire de Physique et Chimie des Nano-Objets, 135 avenue de Rangueil, F-31077 Toulouse Cedex 4 (France); Lacroix, L.-M., E-mail: lmlacroi@insa-toulouse.fr [Université de Toulouse, UMR 5215 INSA, CNRS, UPS, Laboratoire de Physique et Chimie des Nano-Objets, 135 avenue de Rangueil, F-31077 Toulouse Cedex 4 (France); Transpyrenean Advanced Laboratory for Electron Microscopy (TALEM), INSA - INA, CNRS - Universidad de Zaragoza, 30155 Toulouse (France)

    2017-02-15

    Nanostructured magnets benefiting from efficient exchange-coupling between hard and soft grains represent an appealing approach for integrated miniaturized magnetic power sources. Using a bottom-up approach, nanostructured materials were prepared from binary assemblies of bcc FeCo and fcc FePt nanoparticles and compared with pure L1{sub 0}-FePt materials. The use of a bifunctional mercapto benzoic acid yields homogeneous assemblies of the two types of particles while reducing the amount of organic matter. The 650 °C thermal annealing, mandatory to allow the L1{sub 0}-FePt phase transition, led to significant interdiffusion and thus drastically decreased the amount of soft phase present in the final composites. The analysis of recoil curves, however, evidenced an efficient interphase exchange coupling, which yields better magnetic performance than pure L1{sub 0}-FePt materials, with an energy product above 100 kJ m{sup −3} estimated for a Pt content of only 33%. These results clearly demonstrate the interest of chemically grown nanoparticles for the preparation of high-performance spring magnets, opening promising perspectives for integrated subcentimetric magnets with optimized properties.

  15. Bottom-up derivation of conservative and dissipative interactions for coarse-grained molecular liquids with the conditional reversible work method

    Energy Technology Data Exchange (ETDEWEB)

    Deichmann, Gregor; Marcon, Valentina; Vegt, Nico F. A. van der, E-mail: vandervegt@csi.tu-darmstadt.de [Center of Smart Interfaces, Technische Universität Darmstadt, Alarich-Weiss-Straße 10, 64287 Darmstadt (Germany)

    2014-12-14

    Molecular simulations of soft matter systems have been performed in recent years using a variety of systematically coarse-grained models. With these models, structural or thermodynamic properties can be quite accurately represented while the prediction of dynamic properties remains difficult, especially for multi-component systems. In this work, we use constraint molecular dynamics simulations for calculating dissipative pair forces which are used together with conditional reversible work (CRW) conservative forces in dissipative particle dynamics (DPD) simulations. The combined CRW-DPD approach aims to extend the representability of CRW models to dynamic properties and uses a bottom-up approach. Dissipative pair forces are derived from fluctuations of the direct atomistic forces between mapped groups. The conservative CRW potential is obtained from a similar series of constraint dynamics simulations and represents the reversible work performed to couple the direct atomistic interactions between the mapped atom groups. Neopentane, tetrachloromethane, cyclohexane, and n-hexane have been considered as model systems. These molecular liquids are simulated with atomistic molecular dynamics, coarse-grained molecular dynamics, and DPD. We find that the CRW-DPD models reproduce the liquid structure and diffusive dynamics of the liquid systems in reasonable agreement with the atomistic models when using single-site mapping schemes with beads containing five or six heavy atoms. For a two-site representation of n-hexane (3 carbons per bead), time scale separation can no longer be assumed and the DPD approach consequently fails to reproduce the atomistic dynamics.

  16. The bottom-up approach to defining life : deciphering the functional organization of biological cells via multi-objective representation of biological complexity from molecules to cells

    Directory of Open Access Journals (Sweden)

    Sathish ePeriyasamy

    2013-12-01

    In silico representation of cellular systems needs to represent the adaptive dynamics of biological cells, recognizing a cell’s multi-objective topology formed by spatially and temporally cohesive intracellular structures. The design of these models needs to address the hierarchical and concurrent nature of cellular functions and incorporate the ability to self-organise in response to transitions between healthy and pathological phases, and adapt accordingly. The functions of biological systems are constantly evolving, due to the ever-changing demands of their environment. Biological systems meet these demands by pursuing objectives, aided by their constituents, giving rise to biological functions. A biological cell is organised into an objective/task hierarchy. This objective hierarchy corresponds to the nested nature of temporally cohesive structures, and representing it will facilitate the study of pleiotropy and polygeny by modeling causalities propagating across multiple interconnected intracellular processes. Although biological adaptations occur on physiological, developmental and reproductive timescales, the paper is focused on adaptations that occur within physiological timescales, where the biomolecular activities contributing to functional organisation play a key role in cellular physiology. The paper proposes a multi-scale and multi-objective modelling approach from the bottom up by representing temporally cohesive structures for multi-tasking of intracellular processes. Further, the paper characterises the properties and constraints that are consequential to the organisational and adaptive dynamics in biological cells.

  17. Manipulation of the Geometry and Modulation of the Optical Response of Surfactant-Free Gold Nanostars: A Systematic Bottom-Up Synthesis.

    Science.gov (United States)

    De Silva Indrasekara, Agampodi S; Johnson, Sean F; Odion, Ren A; Vo-Dinh, Tuan

    2018-02-28

    Among plasmonic nanoparticles, surfactant-free branched gold nanoparticles have exhibited exceptional properties as a nanoplatform for a wide variety of applications, ranging from surface-enhanced Raman scattering sensing and imaging to photothermal treatment and photoimmunotherapy for cancer. The effectiveness and reliability of branched gold nanoparticles in biomedical applications strongly rely on the consistency and reproducibility of the physical, chemical, optical, and therapeutic properties of the nanoparticles, which are mainly governed by their morphological features. Herein, we present an optimized bottom-up synthesis that improves the reproducibility and homogeneity of branched gold nanoparticles with the desired morphological features and optical properties. We identified that the order of reagent addition is crucial for improved homogeneity of the branched nature of the nanoparticles, enabling high batch-to-batch reproducibility and reliability. In addition, different combinations of the synthesis parameters, in particular additive halides and the concentration ratios of reactive Au to Ag and of Au to Au seeds, were found to yield branched nanoparticles with similar localized surface plasmon resonances but distinguishable changes in the dimensions of the branches. Overall, our study introduces design parameters for the purpose-tailored manufacturing of surfactant-free gold nanostars in a reliable manner.

  18. Perceived Effects of Pornography on the Couple Relationship: Initial Findings of Open-Ended, Participant-Informed, "Bottom-Up" Research.

    Science.gov (United States)

    Kohut, Taylor; Fisher, William A; Campbell, Lorne

    2017-02-01

    The current study adopted a participant-informed, "bottom-up," qualitative approach to identifying perceived effects of pornography on the couple relationship. A large sample (N = 430) of men and women in heterosexual relationships in which pornography was used by at least one partner was recruited through online (e.g., Facebook, Twitter, etc.) and offline (e.g., newspapers, radio, etc.) sources. Participants responded to open-ended questions regarding perceived consequences of pornography use for each couple member and for their relationship in the context of an online survey. In the current sample of respondents, "no negative effects" was the most commonly reported impact of pornography use. Among remaining responses, positive perceived effects of pornography use on couple members and their relationship (e.g., improved sexual communication, more sexual experimentation, enhanced sexual comfort) were reported frequently; negative perceived effects of pornography (e.g., unrealistic expectations, decreased sexual interest in partner, increased insecurity) were also reported, albeit with considerably less frequency. The results of this work suggest new research directions that require more systematic attention.

  19. Middle-Out Approaches to Reform of University Teaching and Learning: Champions striding between the top-down and bottom-up approaches

    Directory of Open Access Journals (Sweden)

    Rick Cummings

    2005-03-01

    Full Text Available In recent years, Australian universities have been driven by a diversity of external forces, including funding cuts, massification of higher education, and changing student demographics, to reform their relationship with students and improve teaching and learning, particularly for those studying off-campus or part-time. Many universities have responded to these forces either through formal strategic plans developed top-down by executive staff or through organic developments arising from staff in a bottom-up approach. By contrast, much of Murdoch University’s response has been led by a small number of staff who have middle management responsibilities and who have championed the reform of key university functions, largely in spite of current policy or accepted practice. This paper argues that the ‘middle-out’ strategy has both a basis in change management theory and practice, and a number of strengths, including low risk, low cost, and high sustainability. Three linked examples of middle-out change management in teaching and learning at Murdoch University are described and the outcomes analyzed to demonstrate the benefits and pitfalls of this approach.

  20. Quantitative Analysis on the Energy and Environmental Impact of the Korean National Energy R&D Roadmap a Using Bottom-Up Energy System Model

    Directory of Open Access Journals (Sweden)

    Sang Jin Choi

    2017-03-01

    Full Text Available According to the Paris Agreement at the 21st Conference of the Parties, 196 member states are obliged to submit their Intended Nationally Determined Contributions (INDCs) every 5 years. As a member, South Korea has already proposed its reduction target and will need to report its achievements resulting from policies and efforts in the near future. In this paper, a Korean bottom-up energy system model supporting the low-carbon national energy R&D roadmap is introduced, and the mid- to long-term impacts on energy consumption and CO2 emissions are analyzed through the modeling of various scenarios. The results of the analysis showed that, assuming R&D investments in the 11 types of technologies, savings of 13.7% in final energy consumption compared to the baseline scenario would be feasible by 2050. Furthermore, in the field of power generation, the generation share of new and renewable energy is expected to increase from 3.0% as of 2011 to 19.4% by 2050. This research also suggests that analysis based on the Energy Technology R&D Roadmap model can be used not only for overall impact analysis and R&D portfolio establishment, but also for the development of detailed R&D strategies.

  1. Gender effects in human brain responses to bottom-up and top-down attention using the EEG 3D-Vector Field Tomography.

    Science.gov (United States)

    Kosmidou, Vasiliki E; Adam, Aikaterini; Papadaniil, Chrysa D; Tsolaki, Magda; Hadjileontiadis, Leontios J; Kompatsiaris, Ioannis

    2015-01-01

    The effect of gender on rapidly allocating attention to objects, features or locations, as reflected in brain activity, is examined in this study. A visual-attention task, consisting of bottom-up (visual pop-out) and top-down (visual search) conditions with stimuli of four triangles, i.e., a target and three distractors, was employed. In the pop-out condition, both the color and orientation of the distractors differed from the target, while in the search condition they differed only in orientation. During the task, high-density EEG (256 channels) data were recorded and analyzed by means of behavioral analysis, event-related potentials, i.e., the P300 component, and brain source localization analysis using 3D-Vector Field Tomography (3D-VFT). Twenty subjects (half female; 32±4.7 years old) participated in the experiments, performing 60 trials for each condition. Behavioral analysis revealed that both females and males performed better in the pop-out condition than in the search one, with respect to accuracy and reaction time, whereas no gender-related statistically significant differences were found. Nevertheless, in the search condition, higher P300 amplitudes were detected for females than for males. In females, source localization placed the activity in the left inferior frontal and superior temporal gyri, whereas in males it was found in the right inferior frontal and superior temporal gyri. Overall, the experimental results show that visual attention depends on contributions from gender-linked differences in brain lateralization, posing important implications for studying developmental disorders characterized by gender differences.
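
    The ERP analysis described above — averaging stimulus-locked trials and measuring the P300 within a post-stimulus window — can be sketched as follows. The sampling rate, window, and synthetic data are illustrative assumptions, not the study's actual pipeline.

```python
import numpy as np

def p300_amplitude(epochs, fs, window=(0.25, 0.50)):
    """Average epochs (trials x samples, stimulus-locked) and return the
    peak amplitude of the grand-average ERP within the P300 window (s)."""
    erp = epochs.mean(axis=0)                      # grand-average ERP
    i0, i1 = int(window[0] * fs), int(window[1] * fs)
    return erp[i0:i1].max()                        # peak positivity

# Synthetic demo: 60 trials, 1 s at 250 Hz, positive deflection near 350 ms
fs = 250
t = np.arange(fs) / fs
rng = np.random.default_rng(0)
trials = rng.normal(0.0, 1.0, (60, fs)) \
         + 5.0 * np.exp(-((t - 0.35) ** 2) / (2 * 0.03 ** 2))
amp = p300_amplitude(trials, fs)
```

    Comparing `amp` between conditions or groups (e.g., females vs. males in the search condition) is the kind of contrast the amplitude analysis above performs.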

  2. Fabrication of electrodes for transport control and alignment at micro- and nanoscales using bottom-up and top-down techniques

    Directory of Open Access Journals (Sweden)

    Darwin Rodríguez

    2014-12-01

    Full Text Available The continuing advance of applications in self-assembly, positioning, sensor, and actuator devices that allow the controlled manipulation of micro- and nanostructures has generated broad interest in developing methodologies to optimize the fabrication of devices for control and manipulation at micro- and nanoscales. This project explores electrode fabrication techniques with the aim of finding an optimal, reproducible technique. The performance of each technique is compared, and cleaning and safety protocols are described. Three geometries are designed and implemented to mobilize and position iron micro- and nanoparticles in a natural oil solution. Finally, electric fields are generated for electrophoresis in order to find the curve describing particle displacement as a function of the applied potential. These results have a strong impact on current bottom-up fabrication efforts (using fields to control placement and mobility) in electronic devices. The fabrication of planar geometries with electrodes opens the possibility of integrating particle motion into the integrated circuits manufactured today.

  3. Photosensitive nanostructured TiO{sub 2} grown at room temperature by novel 'bottom-up' approached CBD method

    Energy Technology Data Exchange (ETDEWEB)

    Patil, U.M.; Kulkarni, S.B.; Deshmukh, P.R.; Salunkhe, R.R. [Thin Film Physics Laboratory, Department of Physics, Shivaji University, Kolhapur 416 004 (India); Lokhande, C.D., E-mail: l_chandrakant@yahoo.com [Thin Film Physics Laboratory, Department of Physics, Shivaji University, Kolhapur 416 004 (India)

    2011-05-26

    Research highlights: > The TiO{sub 2} nanoflake-like morphology has been controllably synthesized by the CBD method with NH{sub 4}Cl as a complexing agent. > The XRD and FTIR studies confirmed the formation of anatase TiO{sub 2} thin films. > The SEM images reveal an improvement in the flower size of TiO{sub 2} as an effect of heat treatment. > Despite the wide band gap of this electrode, its PEC performance motivates testing its feasibility in DSSC devices. - Abstract: The present paper employs a 'bottom-up' chemical bath deposition (CBD) method to grow titanium dioxide (TiO{sub 2}) nanostructures at room temperature on glass and stainless steel substrates. The room-temperature-deposited TiO{sub 2} films are heat treated at 673 K for 1 h in air, and the corresponding changes in structural, morphological and optical properties are studied by means of X-ray diffraction (XRD), Fourier transform infrared (FTIR) spectroscopy, scanning electron microscopy (SEM), and UV-VIS-NIR spectrophotometry. The heat-treated films are utilized as a photocathode in a photoelectrochemical (PEC) cell in 1 M NaOH electrolyte. The experimental results show that the CBD method allows the formation of photosensitive anatase TiO{sub 2} thin films, which can potentially be tuned for many functional applications.

  4. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems, Using a Bottom-Up Approach and Installer Survey - Second Edition

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, B.; Ardani, K.; Feldman, D.; Citron, R.; Margolis, R.; Zuboy, J.

    2013-10-01

    This report presents results from the second U.S. Department of Energy (DOE)-sponsored, bottom-up data collection and analysis of non-hardware balance-of-system costs -- often referred to as 'business process' or 'soft' costs -- for U.S. residential and commercial photovoltaic (PV) systems. In service to DOE's SunShot Initiative, annual expenditure and labor-hour-productivity data are analyzed to benchmark 2012 soft costs related to (1) customer acquisition and system design and (2) permitting, inspection, and interconnection (PII). We also include an in-depth analysis of costs related to financing, overhead, and profit. Soft costs are both a major challenge and a major opportunity for reducing PV system prices and stimulating SunShot-level PV deployment in the United States. The data and analysis in this series of benchmarking reports are a step toward the more detailed understanding of PV soft costs required to track and accelerate these price reductions.

  5. When top-down becomes bottom-up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Directory of Open Access Journals (Sweden)

    Gabriela Orihuela

    Full Text Available Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviors are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment.

  6. Sensitivity of California water supply to changes in runoff magnitude and timing: A bottom-up assessment of vulnerabilities and adaptation strategies

    Science.gov (United States)

    Fefer, M.; Dogan, M. S.; Herman, J. D.

    2017-12-01

    Long-term shifts in the timing and magnitude of reservoir inflows will potentially have significant impacts on water supply reliability in California, though projections remain uncertain. Here we assess the vulnerability of the statewide system to changes in total annual runoff (a function of precipitation) and the fraction of runoff occurring during the winter months (primarily a function of temperature). An ensemble of scenarios is sampled using a bottom-up approach and compared to the most recent available streamflow projections from the state's 4th Climate Assessment. We evaluate these scenarios using a new open-source version of the CALVIN model, a network flow optimization model encompassing roughly 90% of the urban and agricultural water demands in California, which is capable of running scenario ensembles on a high-performance computing cluster. The economic representation of water demand in the model yields several advantages for this type of analysis: optimized reservoir operating policies to minimize shortage cost and the marginal value of adaptation opportunities, defined by shadow prices on infrastructure and regulatory constraints. Results indicate a shift in optimal reservoir operations and high marginal value of additional reservoir storage in the winter months. The collaborative management of reservoirs in CALVIN yields increased storage in downstream reservoirs to store the increased winter runoff. This study contributes an ensemble evaluation of a large-scale network model to investigate uncertain climate projections, and an approach to interpret the results of economic optimization through the lens of long-term adaptation strategies.
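
    The economic optimization at the heart of a network flow model like CALVIN can be illustrated with a toy linear program: deliveries are limited by supply, shortages carry economic penalties, and the dual value (shadow price) on the supply constraint plays the role of the marginal value of additional water described above. The supply, demands, and penalty costs below are hypothetical.

```python
import numpy as np
from scipy.optimize import linprog

# Decision variables: [d1, d2, s1, s2] = deliveries and shortages (water units)
penalty = np.array([0.0, 0.0, 100.0, 200.0])   # $ per unit of shortage
A_ub = np.array([[1.0, 1.0, 0.0, 0.0]])        # total delivery limited by supply
b_ub = np.array([80.0])
A_eq = np.array([[1.0, 0.0, 1.0, 0.0],         # delivery + shortage = demand
                 [0.0, 1.0, 0.0, 1.0]])
b_eq = np.array([50.0, 60.0])
res = linprog(penalty, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              method="highs")
shadow_price = -res.ineqlin.marginals[0]       # value of one more unit of supply
```

    Here the optimizer allocates the 30-unit deficit to the cheaper shortage, and the shadow price reports how much one extra unit of supply would reduce total shortage cost.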

  7. Combined top-down and bottom-up climate change impact assessment for the hydrological system in the Vu Gia- Thu Bon River Basin.

    Science.gov (United States)

    Tra, Tran Van; Thinh, Nguyen Xuan; Greiving, Stefan

    2018-07-15

    Vu Gia-Thu Bon (VGTB) River Basin, located in the Central Coastal zone of Viet Nam, currently faces water shortage. Climate change is expected to exacerbate the challenge; there is therefore a need to study its impacts on water shortage in the river basin. The study adopts a combined top-down and bottom-up assessment to address the impacts of climate change on water shortage in the VGTB River Basin. A MIKE BASIN water balance model for the river basin was established to simulate the response of the hydrological system. Simulations were performed by parametrically varying temperature and precipitation to determine the vulnerability space of water shortage. General Circulation Models (GCMs) were then utilized to provide climate projections for the river basin. The output from the GCMs was then mapped onto the vulnerability space determined earlier. In total, 9 out of 55 water demand nodes in the simulation are expected to face problematic conditions as the future climate changes. Copyright © 2018 Elsevier B.V. All rights reserved.
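
    The bottom-up step — building a vulnerability space by parametrically varying temperature and precipitation, then mapping GCM projections onto it — can be sketched as below. The response function and GCM values are invented for illustration and bear no relation to the MIKE BASIN results.

```python
import numpy as np

def shortage_index(dT, dP, demand=100.0, base_supply=120.0):
    """Toy response: supply falls with warming (more evapotranspiration)
    and with reduced precipitation; shortage is unmet demand."""
    supply = base_supply * (1.0 + dP / 100.0) * (1.0 - 0.02 * dT)
    return max(0.0, demand - supply)

# Vulnerability space: sweep temperature (+0..+4 C) and precipitation (-20..+20 %)
dTs = np.linspace(0.0, 4.0, 9)
dPs = np.linspace(-20.0, 20.0, 9)
space = np.array([[shortage_index(dT, dP) for dP in dPs] for dT in dTs])

# Map hypothetical GCM projections onto the space: (dT, dP) pairs
gcms = [(2.0, -10.0), (1.5, 5.0), (3.0, -15.0)]
problematic = [shortage_index(dT, dP) > 0.0 for dT, dP in gcms]
```

    Each GCM projection lands at one point of the precomputed vulnerability space, so the climate-model step only reads off, rather than re-simulates, the system response.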

  8. Employment impacts of EU biofuels policy. Combining bottom-up technology information and sectoral market simulations in an input-output framework

    International Nuclear Information System (INIS)

    Neuwahl, Frederik; Mongelli, Ignazio; Delgado, Luis; Loeschel, Andreas

    2008-01-01

    This paper analyses the employment consequences of policies aimed to support biofuels in the European Union. The promotion of biofuel use has been advocated as a means to promote the sustainable use of natural resources and to reduce greenhouse gas emissions originating from transport activities on the one hand, and to reduce dependence on imported oil and thereby increase security of the European energy supply on the other hand. The employment impacts of increasing biofuels shares are calculated by taking into account a set of elements comprising the demand for capital goods required to produce biofuels, the additional demand for agricultural feedstock, higher fuel prices or reduced household budget in the case of price subsidisation, price effects ensuing from a hypothetical world oil price reduction linked to substitution in the EU market, and price impacts on agro-food commodities. The calculations refer to scenarios for the year 2020 targets as set out by the recent Renewable Energy Roadmap. Employment effects are assessed in an input-output framework taking into account bottom-up technology information to specify biofuels activities and linked to partial equilibrium models for the agricultural and energy sectors. The simulations suggest that biofuels targets on the order of 10-15% could be achieved without adverse net employment effects. (author)
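
    The input-output employment calculation described above rests on the Leontief quantity model, x = (I - A)^-1 d, with employment obtained from sectoral labour coefficients. The 3-sector technical coefficients and labour intensities below are hypothetical, chosen only to show the mechanics.

```python
import numpy as np

# Hypothetical 3-sector economy: agriculture, biofuels, rest-of-economy
A = np.array([[0.10, 0.30, 0.02],   # input from sector i per unit output of j
              [0.00, 0.05, 0.01],
              [0.20, 0.25, 0.30]])
labour = np.array([8.0, 3.0, 5.0])  # jobs per unit of sectoral output

def employment(final_demand):
    """Total output x solves x = A x + d, i.e. x = (I - A)^-1 d; jobs = l . x."""
    x = np.linalg.solve(np.eye(3) - A, final_demand)
    return labour @ x

base = employment(np.array([10.0, 1.0, 50.0]))
biofuel_push = employment(np.array([10.0, 2.0, 50.0]))  # +1 unit biofuel demand
net_jobs = biofuel_push - base
```

    The net figure captures both direct jobs in the biofuel sector and indirect jobs induced upstream (e.g., in agricultural feedstock), which is the channel the paper's simulations quantify.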

  9. Evolutionary Steps in the Emergence of Life Deduced from the Bottom-Up Approach and GADV Hypothesis (Top-Down Approach).

    Science.gov (United States)

    Ikehara, Kenji

    2016-01-26

    It is no doubt quite difficult to solve the riddle of the origin of life. So, firstly, I would like to point out the kinds of obstacles there are in solving this riddle and how we should tackle these difficult problems, reviewing the studies that have been conducted so far. After that, I will propose that the consecutive evolutionary steps in a timeline can be rationally deduced by using a common event as a juncture, which is obtained by two counter-directional approaches: one is the bottom-up approach through which many researchers have studied the origin of life, and the other is the top-down approach, through which I established the [GADV]-protein world hypothesis or GADV hypothesis on the origin of life starting from a study on the formation of entirely new genes in extant microorganisms. Last, I will describe the probable evolutionary process from the formation of Earth to the emergence of life, which was deduced by using a common event-the establishment of the first genetic code encoding [GADV]-amino acids-as a juncture for the results obtained from the two approaches.

  10. Evolutionary Steps in the Emergence of Life Deduced from the Bottom-Up Approach and GADV Hypothesis (Top-Down Approach

    Directory of Open Access Journals (Sweden)

    Kenji Ikehara

    2016-01-01

    Full Text Available It is no doubt quite difficult to solve the riddle of the origin of life. So, firstly, I would like to point out the kinds of obstacles there are in solving this riddle and how we should tackle these difficult problems, reviewing the studies that have been conducted so far. After that, I will propose that the consecutive evolutionary steps in a timeline can be rationally deduced by using a common event as a juncture, which is obtained by two counter-directional approaches: one is the bottom-up approach through which many researchers have studied the origin of life, and the other is the top-down approach, through which I established the [GADV]-protein world hypothesis or GADV hypothesis on the origin of life starting from a study on the formation of entirely new genes in extant microorganisms. Last, I will describe the probable evolutionary process from the formation of Earth to the emergence of life, which was deduced by using a common event—the establishment of the first genetic code encoding [GADV]-amino acids—as a juncture for the results obtained from the two approaches.

  11. Ultra rapidly dissolving repaglinide nanosized crystals prepared via bottom-up and top-down approach: influence of food on pharmacokinetics behavior.

    Science.gov (United States)

    Gadadare, Rahul; Mandpe, Leenata; Pokharkar, Varsha

    2015-08-01

    The present work was undertaken with the objectives of improving the dissolution velocity and related oral bioavailability, and of minimizing the fasted/fed state variability, of repaglinide, a poorly water-soluble anti-diabetic agent, by exploring the principles of nanotechnology. Nanocrystal formulations were prepared by both top-down and bottom-up approaches. These approaches were compared in light of their ability to provide formulation stability in terms of particle size. Soluplus® was used as a stabilizer and Kolliphor™ E-TPGS was used as an oral absorption enhancer. In vitro dissolution profiles were investigated in distilled water and in fasted- and fed-state simulated gastric fluid, and compared with pure repaglinide. In vivo pharmacokinetics was studied in both the fasted and fed state using Wistar rats. Oral hypoglycemic activity was also assessed in streptozotocin-induced diabetic rats. Nanocrystals TD-A and TD-B showed 19.86- and 25.67-fold increases in saturation solubility, respectively, compared with pure repaglinide. Almost 10-fold (TD-A) and 15-fold (TD-B) enhancements in the oral bioavailability of the nanocrystals were observed regardless of the fasted/fed state, compared to pure repaglinide. The nanocrystal formulations also demonstrated significant (p < 0.001) hypoglycemic activity with a faster onset (less than 30 min) and prolonged duration (up to 8 h) compared to pure repaglinide (after 60 min and up to 4 h, respectively).
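
    A fold-enhancement in bioavailability of the kind reported above is conventionally computed as a ratio of areas under the plasma concentration-time curves (AUC, trapezoidal rule). A minimal sketch with invented concentration profiles, not the study's data:

```python
import numpy as np

def auc(times, conc):
    """Area under the concentration-time curve by the trapezoidal rule."""
    return float(((conc[1:] + conc[:-1]) / 2.0 * np.diff(times)).sum())

# Hypothetical concentration profiles (ng/mL) at matched sampling times (h)
t = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
pure_drug = np.array([0.0, 5.0, 8.0, 6.0, 3.0, 1.0])
nanocrystal = np.array([0.0, 60.0, 90.0, 70.0, 35.0, 10.0])

fold_increase = auc(t, nanocrystal) / auc(t, pure_drug)
```

    A relative-bioavailability comparison would normalize the two AUCs by dose before taking the ratio; the doses are equal in this sketch.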

  12. Top-down impact through a bottom-up mechanism: the effect of limpet grazing on growth, productivity and carbon allocation of Zostera marina L. (eelgrass).

    Science.gov (United States)

    Zimmerman, Richard C; Kohrs, Donald G; Alberte, Randall S

    1996-09-01

    The unusual appearance of a commensal eelgrass limpet [Tectura depicta (Berry)] from southern California at high density (up to 10 shoot⁻¹) has coincided with the catastrophic decline of a subtidal Zostera marina L. meadow in Monterey Bay, California. Some commensal limpets graze the chloroplast-rich epidermis of eelgrass leaves, but were not known to affect seagrass growth or productivity. We evaluated the effect on eelgrass productivity of grazing by limpets maintained at natural densities (8±2 shoot⁻¹) in a natural-light mesocosm for 45 days. Growth rates, carbon reserves, root proliferation and net photosynthesis of grazed plants were 50-80% below those of ungrazed plants, but biomass-specific respiration was unaffected. The daily period of irradiance-saturated photosynthesis (H_sat) needed to maintain positive carbon balance in grazed plants approached 13.5 h, compared with 5-6 h for ungrazed plants. The amount of carbon allocated to roots of ungrazed plants was 800% higher than for grazed plants. By grazing the chlorophyll-rich epidermis, T. depicta induced carbon limitation in eelgrass growing in an otherwise light-replete environment. Continued northward movement of T. depicta may have significant impacts on eelgrass production and population dynamics in the northeast Pacific, even though this limpet consumes very little plant biomass. This interaction is a dramatic example of top-down control (grazing/predation) of eelgrass productivity and survival operating via a bottom-up mechanism (photosynthesis limitation).
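
    The H_sat figures quoted above follow from a simple daily carbon balance: photosynthetic gain over H hours of saturated irradiance must offset 24 h of respiration, so H_sat = 24R/P. A sketch with illustrative rates chosen only to reproduce the reported ~6 h and ~13.5 h values (not the study's measured rates):

```python
def hsat_required(p_sat, respiration):
    """Hours per day of irradiance-saturated photosynthesis needed so that
    daily carbon gain (p_sat * H) offsets 24 h of respiration."""
    return 24.0 * respiration / p_sat

# Illustrative rates (e.g., mg C g^-1 h^-1); grazing cuts saturated
# photosynthesis sharply while respiration is unchanged, as reported above.
ungrazed = hsat_required(p_sat=4.0, respiration=1.0)
grazed = hsat_required(p_sat=1.78, respiration=1.0)
```

    With respiration fixed, halving (or worse) the photosynthetic rate more than doubles the daily light-saturated period required, which is why grazed plants fall into carbon deficit even in a light-replete habitat.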

  13. Climate change, pink salmon, and the nexus between bottom-up and top-down forcing in the subarctic Pacific Ocean and Bering Sea.

    Science.gov (United States)

    Springer, Alan M; van Vliet, Gus B

    2014-05-06

    Climate change in the last century was associated with spectacular growth of many wild Pacific salmon stocks in the North Pacific Ocean and Bering Sea, apparently through bottom-up forcing linking meteorology to ocean physics, water temperature, and plankton production. One species in particular, pink salmon, became so numerous by the 1990s that they began to dominate other species of salmon for prey resources and to exert top-down control in the open ocean ecosystem. Information from long-term monitoring of seabirds in the Aleutian Islands and Bering Sea reveals that the sphere of influence of pink salmon is much larger than previously known. Seabirds, pink salmon, other species of salmon, and by extension other higher-order predators, are tightly linked ecologically and must be included in international management and conservation policies for sustaining all species that compete for common, finite resource pools. These data further emphasize that the unique 2-y cycle in abundance of pink salmon drives interannual shifts between two alternate states of a complex marine ecosystem.

  14. Dissociated roles of the inferior frontal gyrus and superior temporal sulcus in audiovisual processing: top-down and bottom-up mismatch detection.

    Science.gov (United States)

    Uno, Takeshi; Kawai, Kensuke; Sakai, Katsuyuki; Wakebe, Toshihiro; Ibaraki, Takuya; Kunii, Naoto; Matsuo, Takeshi; Saito, Nobuhito

    2015-01-01

    Visual inputs can distort auditory perception, and accurate auditory processing requires the ability to detect and ignore visual input that is simultaneous with, and incongruent with, auditory information. However, whereas the integration of audiovisual inputs has been intensively researched, the neural basis of this auditory selection from audiovisual information is unknown. Here, we tested the hypothesis that the inferior frontal gyrus (IFG) and superior temporal sulcus (STS) are involved in top-down and bottom-up processing, respectively, of target auditory information from audiovisual inputs. We recorded high gamma activity (HGA), which is associated with neuronal firing in local brain regions, using electrocorticography while patients with epilepsy judged the syllable spoken by a voice while looking at a voice-congruent or -incongruent lip movement from the speaker. The STS exhibited stronger HGA when patients were presented with large audiovisual incongruence than with small incongruence, especially when the auditory information was correctly identified. On the other hand, the IFG exhibited stronger HGA in trials with small audiovisual incongruence when patients correctly perceived the auditory information than when they incorrectly perceived it due to the mismatched visual information. These results indicate that the IFG and STS have dissociated roles in selective auditory processing, and suggest that the neural basis of selective auditory processing changes dynamically in accordance with the degree of incongruity between auditory and visual information.
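
    High gamma activity is typically quantified as spectral power in roughly the 70-150 Hz band. A minimal sketch of such a band-power comparison on synthetic signals; the band edges and data are assumptions for illustration, not the study's recordings:

```python
import numpy as np

def band_power(signal, fs, band=(70.0, 150.0)):
    """Mean power within a frequency band (high gamma by default) via the FFT."""
    spectrum = np.abs(np.fft.rfft(signal)) ** 2 / signal.size
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].mean()

# Synthetic 1 s trials at 1 kHz with weak vs. strong 100 Hz activity
fs = 1000
t = np.arange(fs) / fs
rng = np.random.default_rng(2)
noise = rng.normal(0.0, 0.1, fs)
small_incongruence = noise + 0.2 * np.sin(2 * np.pi * 100 * t)
large_incongruence = noise + 1.0 * np.sin(2 * np.pi * 100 * t)
stronger_hga = band_power(large_incongruence, fs) > band_power(small_incongruence, fs)
```

    In practice ECoG analyses use time-resolved estimates (e.g., short windows or wavelets) rather than one whole-trial FFT, but the band-restricted power contrast is the same idea.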

  15. Bottom-up coarse-grained models with predictive accuracy and transferability for both structural and thermodynamic properties of heptane-toluene mixtures.

    Science.gov (United States)

    Dunn, Nicholas J H; Noid, W G

    2016-05-28

    This work investigates the promise of a "bottom-up" extended ensemble framework for developing coarse-grained (CG) models that provide predictive accuracy and transferability for describing both structural and thermodynamic properties. We employ a force-matching variational principle to determine system-independent, i.e., transferable, interaction potentials that optimally model the interactions in five distinct heptane-toluene mixtures. Similarly, we employ a self-consistent pressure-matching approach to determine a system-specific pressure correction for each mixture. The resulting CG potentials accurately reproduce the site-site rdfs, the volume fluctuations, and the pressure equations of state that are determined by all-atom (AA) models for the five mixtures. Furthermore, we demonstrate that these CG potentials provide similar accuracy for additional heptane-toluene mixtures that were not included in their parameterization. Surprisingly, the extended ensemble approach improves not only the transferability but also the accuracy of the calculated potentials. Additionally, we observe that the required pressure corrections strongly correlate with the intermolecular cohesion of the system-specific CG potentials. Moreover, this cohesion correlates with the relative "structure" within the corresponding mapped AA ensemble. Finally, the appendix demonstrates that the self-consistent pressure-matching approach corresponds to minimizing an appropriate relative entropy.
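
    For a linear force basis, force matching of the kind described above reduces to a least-squares fit of basis coefficients against reference all-atom forces. The 1D two-term basis and noise level below are illustrative assumptions, not the paper's actual basis set:

```python
import numpy as np

def basis(r):
    """Hypothetical two-term pair-force basis evaluated at separations r."""
    return np.column_stack([1.0 / r**2, 1.0 / r**7])

# Synthetic "all-atom" reference forces from a known pair force plus noise
rng = np.random.default_rng(1)
r = np.linspace(0.9, 2.5, 200)
true_force = 3.0 / r**2 - 0.5 / r**7
f_ref = true_force + rng.normal(0.0, 0.01, r.size)

# Variational force matching: minimize ||F_ref - basis(r) @ c||^2 over c
coeffs, *_ = np.linalg.lstsq(basis(r), f_ref, rcond=None)
```

    In the extended ensemble setting, the residuals from several systems are stacked into one design matrix before solving, which is what makes the fitted coefficients system-independent.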

  16. Object recognition with hierarchical discriminant saliency networks.

    Science.gov (United States)

    Han, Sunhyoung; Vasconcelos, Nuno

    2014-01-01

    The benefits of integrating attention and object recognition are investigated. While attention is frequently modeled as a pre-processor for recognition, we investigate the hypothesis that attention is an intrinsic component of recognition and vice-versa. This hypothesis is tested with a recognition model, the hierarchical discriminant saliency network (HDSN), whose layers are top-down saliency detectors, tuned for a visual class according to the principles of discriminant saliency. As a model of neural computation, the HDSN has two possible implementations. In a biologically plausible implementation, all layers comply with the standard neurophysiological model of visual cortex, with sub-layers of simple and complex units that implement a combination of filtering, divisive normalization, pooling, and non-linearities. In a convolutional neural network implementation, all layers are convolutional and implement a combination of filtering, rectification, and pooling. The rectification is performed with a parametric extension of the now popular rectified linear units (ReLUs), whose parameters can be tuned for the detection of target object classes. This enables a number of functional enhancements over neural network models that lack a connection to saliency, including optimal feature denoising mechanisms for recognition, modulation of saliency responses by the discriminant power of the underlying features, and the ability to detect both feature presence and absence. In either implementation, each layer has a precise statistical interpretation, and all parameters are tuned by statistical learning. Each saliency detection layer learns more discriminant saliency templates than its predecessors and higher layers have larger pooling fields. This enables the HDSN to simultaneously achieve high selectivity to target object classes and invariance. The performance of the network in saliency and object recognition tasks is compared to those of models from the biological and
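
    The parametric extension of ReLUs mentioned above can be illustrated with a rectifier whose negative slope and threshold are tunable parameters; this is a generic sketch of the idea, not the HDSN's exact parameterization:

```python
import numpy as np

def parametric_relu(x, slope=0.1, threshold=0.0):
    """Rectification with a tunable negative slope and threshold.
    With slope=0 and threshold=0 it reduces to the standard ReLU."""
    shifted = x - threshold
    return np.where(shifted > 0, shifted, slope * shifted)

x = np.array([-2.0, -0.5, 0.0, 0.5, 2.0])
standard = parametric_relu(x, slope=0.0)   # plain ReLU
leaky = parametric_relu(x, slope=0.1)      # passes scaled negative responses
```

    Allowing a nonzero negative slope lets a unit signal feature *absence* as well as presence, which is one of the functional enhancements the passage attributes to tuning the rectification parameters per target class.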

  17. A Local Texture-Based Superpixel Feature Coding for Saliency Detection Combined with Global Saliency

    Directory of Open Access Journals (Sweden)

    Bingfei Nan

    2015-12-01

    Full Text Available Because saliency can be used as prior knowledge of image content, saliency detection has been an active research area in image segmentation, object detection, image semantic understanding and other relevant image-based applications. In the case of saliency detection from cluttered scenes, the salient object/region detected needs not only to be distinguished clearly from the background, but preferably also to be informative in terms of complete contour and local texture details to facilitate successive processing. In this paper, a Local Texture-based Region Sparse Histogram (LTRSH) model is proposed for saliency detection from cluttered scenes. This model uses a combination of local texture patterns and color distribution, as well as contour information, to encode the superpixels, characterizing the local features of the image for region-contrast computation. Combining the region contrast thus computed with the global saliency probability, a full-resolution saliency map, in which the salient object/region detected adheres more closely to its inherent features, is obtained on the basis of the corresponding high-level saliency spatial distribution as well as pixel-level saliency enhancement. Quantitative comparisons with five state-of-the-art saliency detection methods on benchmark datasets are carried out, and the comparative results show that the proposed method improves detection performance in terms of the corresponding measurements.
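
    The region-contrast stage — scoring each superpixel by its feature distance to all others, weighted by spatial proximity — can be sketched as follows. The feature vectors, positions, and Gaussian spatial weighting are illustrative assumptions, not the LTRSH encoding itself:

```python
import numpy as np

def region_contrast(features, positions, sigma=0.5):
    """Contrast-based saliency: each region's score is its feature distance
    to all other regions, weighted by spatial proximity, then normalized."""
    n = len(features)
    sal = np.zeros(n)
    for i in range(n):
        d_feat = np.linalg.norm(features - features[i], axis=1)
        d_pos = np.linalg.norm(positions - positions[i], axis=1)
        w = np.exp(-d_pos**2 / (2 * sigma**2))   # nearer regions weigh more
        sal[i] = (w * d_feat).sum()
    return sal / sal.max()

# Four superpixels: three similar background regions and one distinct object
feats = np.array([[0.1, 0.1], [0.12, 0.1], [0.1, 0.12], [0.9, 0.8]])
pos = np.array([[0.2, 0.2], [0.8, 0.2], [0.2, 0.8], [0.5, 0.5]])
saliency = region_contrast(feats, pos)
```

    The distinct region receives the highest score; a full pipeline would then fuse such region scores with a global saliency prior and project them back to pixels.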

  18. Reducing energy consumption and CO2 emissions by energy efficiency measures and international trading: A bottom-up modeling for the U.S. iron and steel sector

    International Nuclear Information System (INIS)

    Karali, Nihan; Xu, Tengfang; Sathaye, Jayant

    2014-01-01

    Highlights: • Use ISEEM to evaluate energy and emission reduction in the U.S. iron and steel sector. • ISEEM is a new bottom-up optimization model for industry-sector energy planning. • Energy and emission reduction includes efficiency measures and international trading. • International trading includes commodity and carbon trading among the U.S., China and India. • Projects annual energy use, CO2 emissions, production, and costs from 2010 to 2050. - Abstract: Using the ISEEM modeling framework, we analyzed the roles of energy efficiency measures, steel commodity trading, and international carbon trading in achieving specific CO2 emission reduction targets in the U.S. iron and steel sector from 2010 to 2050. We modeled how steel demand is balanced under three alternative emission reduction scenarios designed to include national energy efficiency measures, commodity trading, and international carbon trading as key instruments to meet a particular emission restriction target in the U.S. iron and steel sector, and how production, process structure, energy supply, and system costs change under those scenarios. The results advance our understanding of the long-term impacts of different energy policy options designed to reduce energy consumption and CO2 emissions in the U.S. iron and steel sector, and generate insights into the policy implications for the sector's environmentally and economically sustainable development. The alternative scenarios associated with a 20% emission-reduction target are projected to result in approximately 11–19% annual energy reduction in the medium term (i.e., 2030) and 9–20% annual energy reduction in the long term (i.e., 2050) compared to the Base scenario.

  19. Setting an ecological baseline prior to the bottom-up establishment of a marine protected area in Santorini island, Aegean Sea

    Directory of Open Access Journals (Sweden)

    M. SALOMIDI

    2016-08-01

    Since 2010, a bottom-up initiative has been under way on Santorini Island (Aegean Sea, Eastern Mediterranean) for the establishment of the first fully protected marine protected area in the Cyclades, aiming to improve fisheries and promote responsible recreational use of the sea. Following discussions with local small-scale fishers and divers, two sites along the southern and southeastern coasts of the island were suggested as suitable for this purpose. In 2012, a baseline study was conducted in these areas to assess their state and provide an ecological snapshot that would enable sound designation and monitoring. Several ad hoc indices and metrics were applied, taking into account structural and functional features of the upper-infralittoral algae and Posidonia oceanica beds. An integrated assessment of the infralittoral fish assemblages and their associated benthic communities was also performed. Our most important findings were: (i) the low total fish biomass and the absence of adult top predators, indicating overfishing; (ii) the overgrazing effects of the overabundant alien herbivorous spinefoot fishes (Siganus spp.), as reflected in the abnormal structure of the algal communities; (iii) the scarcity of signs of pollution or other direct anthropogenic pressures, as indicated by the good environmental status of the P. oceanica meadows and the upper-infralittoral vegetation; and (iv) the presence of a rich diversity of species and habitats, especially along the Akrotiri Peninsula and the wider volcanic Caldera. These findings provide useful insights into the strengths and weaknesses of the study area and are discussed together with their implications for protection and management.

  20. Top-down/bottom-up description of electricity sector for Switzerland using the GEM-E3 computable general equilibrium model

    International Nuclear Information System (INIS)

    Krakowski, R. A.

    2006-06-01

    Participation of the Paul Scherrer Institute (PSI) in the advancement and extension of the multi-region Computable General Equilibrium (CGE) model GEM-E3 (CES/KUL, 2002) focused primarily on two top-level facets: a) extension of the model database and model calibration, particularly as related to the second component of this study, which is b) advancement of the dynamics of innovation and investment, primarily through the incorporation of Exogenous Technical Learning (ETL) into the Bottom-Up (BU, technology-based) part of the dynamic upgrade; this latter activity also included the completion of the dynamic coupling of the BU description of the electricity sector with the 'Top-Down' (TD, econometric) description of the economy inherent to the GEM-E3 CGE model. The results of this two-component study are described in two parts that have been combined in this single summary report: Part I describes the methodology and gives illustrative results from the BU-TD integration, as well as describing the approach to, and giving preliminary results from, incorporating an ETL description into the BU component of the overall model; Part II reports on the calibration component of the task in terms of: a) formulating a BU technology database for Switzerland based on previous work; b) incorporating that database into the GEM-E3 model; and c) calibrating the BU database against the TD database embodied in the (Swiss) Social Accounting Matrix (SAM). The BU-TD coupling and the ETL incorporation described in Part I represent the major effort of this investigation, but this effort could not have been completed without the calibration preamble reported herein as Part II. A brief summary of the scope of each of these key study components is given. (author)

  1. From War to Tolerance? Bottom-up and Top-down Approaches to (Re)building Interethnic Ties in the Areas of the Former Yugoslavia

    Directory of Open Access Journals (Sweden)

    Boris Banovac

    2014-01-01

    In recent history, the Balkans passed through periods of conflict and violence typical of many post-imperial nation-states that are unable to establish lateral links with their neighbors without, or outside of, the central (imperial) connection. In a way, these states imitated the historical path of imperial conquests. In this regard, the ethnic conflicts that escalated into the wars of the former Yugoslavia can be taken as examples of an erratic transformation of post-imperial states into modern nation-states that are eager to build up democracy at home and develop peaceful coexistence with others in the international environment. Nevertheless, not all multiethnic areas were caught up in violence (e.g., the instances of "peace enclaves" in multiethnic areas in Croatia, Bosnia and Herzegovina, and in Kosovo). Through such examples, which will be illustrated with results of empirical research, we recognize potentials for building tolerance from below. On the other hand, in most other places peace was a follow-up of post-conflict processes. In these cases, local potentials for ethnic tolerance were rather weak. The paper provides some examples illustrating regional differences in this regard within Croatia. In fact, the whole process of normalizing ethnic relations in peaceful terms is far from linear and is hardly going smoothly. Some parts of the national elites foster distance and antagonism toward the "others". On the other hand, especially following the EU accession of Croatia, nationalistic rhetoric has significantly receded at the level of official politics. The question is then whether the impact of policies in the institutional sphere, both national and international, i.e., the top-down approach, is decisive in shaping inter-ethnic relations. The conclusion is that institutional, top-down arrangements for peace and tolerance cannot be sustainable without concomitant bottom-up processes at the micro level, which theoretically corresponds to a "conformant policy" against

  2. The control of automatic imitation based on bottom-up and top-down cues to animacy: insights from brain and behavior.

    Science.gov (United States)

    Klapper, André; Ramsey, Richard; Wigboldus, Daniël; Cross, Emily S

    2014-11-01

    Humans automatically imitate other people's actions during social interactions, building rapport and social closeness in the process. Although the behavioral consequences and neural correlates of imitation have been studied extensively, little is known about the neural mechanisms that control imitative tendencies. For example, the degree to which an agent is perceived as human-like influences automatic imitation, but it is not known how perception of animacy influences brain circuits that control imitation. In the current fMRI study, we examined how the perception and belief of animacy influence the control of automatic imitation. Using an imitation-inhibition paradigm that involves suppressing the tendency to imitate an observed action, we manipulated both bottom-up (visual input) and top-down (belief) cues to animacy. Results show divergent patterns of behavioral and neural responses. Behavioral analyses show that automatic imitation is equivalent when one or both cues to animacy are present but is reduced when both are absent. By contrast, right TPJ showed sensitivity to the presence of both animacy cues. Thus, we demonstrate that right TPJ is biologically tuned to control imitative tendencies when the observed agent both looks like and is believed to be human. The results suggest that right TPJ may be involved in a specialized capacity to control automatic imitation of human agents, rather than a universal process of conflict management, which would be more consistent with generalist theories of imitative control. Evidence for specialized neural circuitry that "controls" imitation offers new insight into developmental disorders that involve atypical processing of social information, such as autism spectrum disorders.

  3. Costs and potentials of energy conservation in China's coal-fired power industry: A bottom-up approach considering price uncertainties

    International Nuclear Information System (INIS)

    Chen, Hao; Kang, Jia-Ning; Liao, Hua; Tang, Bao-Jun; Wei, Yi-Ming

    2017-01-01

    Energy conservation technologies in the coal-fired power sector are important solutions to environmental pollution and climate change. However, a unified framework for estimating their costs and potentials is still needed, given the wide range of technology choices, especially when their economic feasibility under fuel- and carbon-price uncertainties is considered. This study therefore employs a bottom-up approach, combining the conservation supply curve (CSC) approach with break-even analysis, to analyze the costs and potentials of newly promoting 32 key technologies during the 13th Five-Year Plan period (2016–2020). Findings show that (1) the 32 technologies have a total coal conservation potential of 275.77 Mt at a cost of 238.82 billion yuan, and their break-even coal price is 866 yuan/ton; (2) the steam-water circulation system has the largest energy conservation potential in the coal-fired power industry; (3) considering the co-benefits will facilitate these technologies' promotion, because their break-even coal prices decrease by 2.35 yuan/ton when the carbon price increases by 1 yuan/ton; and (4) discount rates have the largest impact on the technologies' cost-effectiveness, while the future generation level affects their energy conservation potentials the most. - Highlights: • The 32 technologies can save 275.77 Mt of coal at a cost of 238.82 billion yuan. • The steam-water circulation system has the largest energy conservation potential. • Considering the co-benefits will facilitate the technology promotions. • Discount rates have the largest impacts on the technologies' cost-effectiveness.
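The break-even logic in the abstract above can be sketched as follows. All numbers and the `co2_per_ton_coal` factor are illustrative assumptions, not the paper's data, although the reported 2.35 yuan/ton sensitivity is consistent with an emission factor of roughly 2.35 t CO2 per ton of coal.

```python
def break_even_coal_price(annualized_cost_yuan, coal_saved_tons,
                          carbon_price_yuan_per_tco2=0.0,
                          co2_per_ton_coal=2.35):
    """Coal price (yuan/ton) at which a measure's fuel savings exactly cover
    its annualized cost. A carbon price turns avoided CO2 into an extra
    revenue stream, so the break-even coal price drops by co2_per_ton_coal
    yuan for each yuan/tCO2 of carbon price (hypothetical emission factor)."""
    return (annualized_cost_yuan / coal_saved_tons
            - carbon_price_yuan_per_tco2 * co2_per_ton_coal)

# Illustrative measure: costs 866 million yuan/yr and saves 1 Mt of coal/yr
p0 = break_even_coal_price(866e6, 1e6)                          # no carbon price
p1 = break_even_coal_price(866e6, 1e6, carbon_price_yuan_per_tco2=10.0)
print(p0, p1)  # 866.0 842.5
```

A conservation supply curve then simply sorts all measures by such unit costs against their cumulative savings.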

  4. The macroeconomic effects of ambitious energy efficiency policy in Germany – Combining bottom-up energy modelling with a non-equilibrium macroeconomic model

    International Nuclear Information System (INIS)

    Hartwig, Johannes; Kockat, Judit; Schade, Wolfgang; Braungardt, Sibylle

    2017-01-01

    Energy efficiency is one of the fastest and most cost-effective contributions to a sustainable, secure and affordable energy system. Furthermore, the so-called “non-energy benefits”, “co-benefits” or “multiple benefits” of energy efficiency are receiving increased interest from policy makers and the scientific community. Among the various non-energy benefits of energy efficiency initiatives, the macroeconomic benefits play an important role. Our study presents a detailed analysis of the long-term macroeconomic effects of German energy efficiency policy including the industry and service sectors as well as residential energy demand. We quantify the macroeconomic effects of an ambitious energy efficiency scenario by combining bottom-up models with an extended dynamic input-output model. We study sectoral shifts within the economy regarding value added and employment compared to the baseline scenario. We provide an in-depth analysis of the effects of energy efficiency policy on consumers, individual industry sectors, and the economy as a whole. We find significant positive macroeconomic effects resulting from energy efficiency initiatives, with growth effects for both GDP and employment ranging between 0.88% and 3.38%. Differences in sectoral gains lead to a shift in the economy. Our methodological approach provides a comprehensive framework for analyzing the macroeconomic benefits of energy efficiency. - Highlights: • Integration of detailed sectoral models for energy demand with macroeconomic model. • Detailed assessment of effects of ambitious energy efficiency targets for Germany. • Positive macroeconomic effects can support policymaking and reduce uncertainty.
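The input-output machinery referenced above rests on the textbook Leontief quantity model; the study uses an extended dynamic variant, so the static sketch below, with hypothetical two-sector data, shows only the core building block.

```python
import numpy as np

# A[i, j] is the input from sector i required per unit of sector j's output
# (hypothetical two-sector technology matrix, not the study's data).
A = np.array([[0.1, 0.2],
              [0.3, 0.1]])
final_demand = np.array([100.0, 50.0])

# Total output x satisfies x = A @ x + d, i.e. x = (I - A)^-1 d.
x = np.linalg.solve(np.eye(2) - A, final_demand)
print(x)  # gross output needed to serve final demand, per sector
```

Sectoral shifts of the kind the paper reports would appear here as changes in `A` and `final_demand` between the baseline and the efficiency scenario.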

  5. Top-down/bottom-up description of electricity sector for Switzerland using the GEM-E3 computable general equilibrium model

    Energy Technology Data Exchange (ETDEWEB)

    Krakowski, R. A

    2006-06-15

    Participation of the Paul Scherrer Institute (PSI) in the advancement and extension of the multi-region Computable General Equilibrium (CGE) model GEM-E3 (CES/KUL, 2002) focused primarily on two top-level facets: a) extension of the model database and model calibration, particularly as related to the second component of this study, which is b) advancement of the dynamics of innovation and investment, primarily through the incorporation of Exogenous Technical Learning (ETL) into the Bottom-Up (BU, technology-based) part of the dynamic upgrade; this latter activity also included the completion of the dynamic coupling of the BU description of the electricity sector with the 'Top-Down' (TD, econometric) description of the economy inherent to the GEM-E3 CGE model. The results of this two-component study are described in two parts that have been combined in this single summary report: Part I describes the methodology and gives illustrative results from the BU-TD integration, as well as describing the approach to, and giving preliminary results from, incorporating an ETL description into the BU component of the overall model; Part II reports on the calibration component of the task in terms of: a) formulating a BU technology database for Switzerland based on previous work; b) incorporating that database into the GEM-E3 model; and c) calibrating the BU database against the TD database embodied in the (Swiss) Social Accounting Matrix (SAM). The BU-TD coupling and the ETL incorporation described in Part I represent the major effort of this investigation, but this effort could not have been completed without the calibration preamble reported herein as Part II. A brief summary of the scope of each of these key study components is given. (author)

  6. Top-down model estimates, bottom-up inventories, and future projections of global natural and anthropogenic emissions of nitrous oxide

    Science.gov (United States)

    Davidson, E. A.; Kanter, D.

    2013-12-01

    Nitrous oxide (N2O) is the third most abundantly emitted greenhouse gas and the largest remaining emitted ozone depleting substance. It is a product of nitrifying and denitrifying bacteria in soils, sediments and water bodies. Humans began to disrupt the N cycle in the preindustrial era as they expanded agricultural land, used fire for land clearing and management, and cultivated leguminous crops that carry out biological N fixation. This disruption accelerated after the industrial revolution, especially as the use of synthetic N fertilizers became common after 1950. Here we present findings from a new United Nations Environment Programme report, in which we constrain estimates of the anthropogenic and natural emissions of N2O and consider scenarios for future emissions. Inventory-based estimates of natural emissions from terrestrial, marine and atmospheric sources range from 10 to 12 Tg N2O-N/yr. Similar values can be derived for global N2O emissions that were predominantly natural before the industrial revolution. While there was inter-decadal variability, there was little or no consistent trend in atmospheric N2O concentrations between 1730 and 1850, allowing us to assume near steady state. Assuming an atmospheric lifetime of 120 years, the 'top-down' estimate of pre-industrial emissions of 11 Tg N2O-N/yr is consistent with the bottom-up inventories for natural emissions, although the former includes some modest pre-industrial anthropogenic effects (probably business-as-usual scenarios over the period 2013-2050 is ~102 Tg N2O-N; equivalent to ~48 Gt CO2e or ~2730 kt ozone depleting potential. The impact of growing demand for biofuels is highly uncertain, ranging from trivial to the most significant N2O source to date, depending on the types of plants, their nutrient management, the amount of land used for their cultivation, and the fates of their waste products.
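The 'top-down' arithmetic above is a steady-state budget: if the atmospheric N2O burden was roughly constant before industrialization, emissions must have balanced the sink, i.e. burden divided by lifetime. The burden figure below is an assumed value chosen to be consistent with the abstract's numbers, not one taken from the UNEP report; the 120-year lifetime is the one quoted.

```python
# Steady-state budget: emissions ~ atmospheric burden / lifetime.
ASSUMED_BURDEN_TG_N = 1320.0   # assumed pre-industrial burden, Tg N2O-N
LIFETIME_YEARS = 120.0         # atmospheric lifetime quoted in the abstract

top_down_emissions = ASSUMED_BURDEN_TG_N / LIFETIME_YEARS
print(f"{top_down_emissions:.0f} Tg N2O-N/yr")  # 11, within the 10-12 bottom-up range
```

Agreement between this top-down figure and the bottom-up inventories is what lets the authors constrain natural emissions.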

  7. SALIENCY BASED SEGMENTATION OF SATELLITE IMAGES

    Directory of Open Access Journals (Sweden)

    A. Sharma

    2015-03-01

    Saliency captures the way humans view an image, and saliency-based segmentation can ultimately be helpful in psychovisual image interpretation. With this in view, a few saliency models are used along with a segmentation algorithm, and only the salient segments of the image are extracted. The work is carried out for terrestrial images as well as for satellite images. The methodology extracts those segments of the segmented image whose saliency value is greater than or equal to a threshold value. Salient and non-salient regions of the image become the foreground and background, respectively, and the image is thus separated. For this work, a dataset of terrestrial images and WorldView-2 satellite images (sample data) is used. Results show that saliency models that work well for terrestrial images are not good enough for satellite images in terms of foreground and background separation. Foreground and background separation in terrestrial images is based on salient objects visible in the image, whereas in satellite images this separation is based on salient areas rather than salient objects.

  8. Energetic Bottomup in the Low Countries. Energy transition from the bottom-up. On Happy energetic civilians, Solar and wind cooperatives, New utility companies; Energieke BottomUp in Lage Landen. De Energietransitie van Onderaf. Over Vrolijke energieke burgers, Zon- en windcooperaties, Nieuwe nuts

    Energy Technology Data Exchange (ETDEWEB)

    Schwencke, A.M.

    2012-08-15

    This essay is an outline of the 'energy transition from the bottom-up'. Its leading questions are: (1) what are the actual initiatives; (2) who is involved; (3) how do they work (organization, business models); (4) why are people active in this field; (5) does it make a real difference; (6) what is the aim? The essay is based on public information sources (websites, blogs, publications) and interviews with people involved.

  9. Visual Saliency Models for Text Detection in Real World.

    Directory of Open Access Journals (Sweden)

    Renwu Gao

    This paper evaluates the degree of saliency of texts in natural scenes using visual saliency models. A large-scale scene image database with pixel-level ground truth is created for this purpose. Using this scene image database and five state-of-the-art models, visual saliency maps that represent the degree of saliency of the objects are calculated. The receiver operating characteristic curve is employed to evaluate the saliency of scene texts as calculated by the visual saliency models. A visualization is given of the distribution of scene texts and non-texts in the space constructed by three kinds of saliency maps, calculated using Itti's visual saliency model with intensity, color, and orientation features. This visualization indicates that text characters are more salient than their non-text neighbors and can be distinguished from the background; scene texts can therefore be extracted from scene images. With this in mind, a new visual saliency architecture, named the hierarchical visual saliency model, is proposed. It is based on Itti's model and consists of two stages. In the first stage, Itti's model is used to calculate the saliency map, and Otsu's global thresholding algorithm is applied to extract the salient region of interest. In the second stage, Itti's model is applied to that salient region to calculate the final saliency map. An experimental evaluation demonstrates that the proposed model outperforms Itti's model in terms of captured scene texts.
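The first stage of the two-stage architecture described above, Otsu's global threshold applied to a saliency map, can be sketched as below. The saliency map here is synthetic; in the paper it would come from Itti's model, which is not reimplemented in this sketch.

```python
import numpy as np

def otsu_threshold(values, bins=256):
    """Otsu's method: pick the threshold maximizing between-class variance."""
    hist, edges = np.histogram(values, bins=bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    omega = np.cumsum(p)               # class-0 probability up to each bin
    mu = np.cumsum(p * edges[:-1])     # class-0 cumulative mean (left edges)
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    sigma_b = np.nan_to_num(sigma_b)   # empty classes contribute nothing
    return edges[np.argmax(sigma_b)]

# Hypothetical saliency map: a bright salient blob on a dark background
rng = np.random.default_rng(0)
smap = rng.uniform(0.0, 0.2, size=(64, 64))
smap[20:40, 20:40] = rng.uniform(0.7, 1.0, size=(20, 20))

t = otsu_threshold(smap.ravel())
salient_mask = smap >= t  # stage-1 salient region; stage 2 would re-run the
                          # saliency model on this region only
print(t)                  # threshold lands between the two intensity clusters
```

Otsu's method assumes a roughly bimodal saliency histogram, which is why it pairs naturally with a salient-object map.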

  10. A novel visual saliency detection method for infrared video sequences

    Science.gov (United States)

    Wang, Xin; Zhang, Yuzhen; Ning, Chen

    2017-12-01

    Infrared video applications such as target detection and recognition, moving target tracking, and so forth can benefit a lot from visual saliency detection, which is essentially a method to automatically localize the "important" content in videos. In this paper, a novel visual saliency detection method for infrared video sequences is proposed. Specifically, for infrared video saliency detection, both the spatial saliency and temporal saliency are considered. For spatial saliency, we adopt a mutual consistency-guided spatial cues combination-based method to capture the regions with obvious luminance contrast and contour features. For temporal saliency, a multi-frame symmetric difference approach is proposed to discriminate salient moving regions of interest from background motions. Then, the spatial saliency and temporal saliency are combined to compute the spatiotemporal saliency using an adaptive fusion strategy. Besides, to highlight the spatiotemporal salient regions uniformly, a multi-scale fusion approach is embedded into the spatiotemporal saliency model. Finally, a Gestalt theory-inspired optimization algorithm is designed to further improve the reliability of the final saliency map. Experimental results demonstrate that our method outperforms many state-of-the-art saliency detection approaches for infrared videos under various backgrounds.
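The multi-frame symmetric difference idea can be sketched in its simplest three-frame form: a pixel is flagged as moving only if the middle frame differs from both its neighbors, which suppresses the "ghost" left at the old position by one-sided differencing. This minimal version is an illustration, not the paper's full multi-frame scheme.

```python
import numpy as np

def symmetric_difference_saliency(prev, curr, nxt):
    """Temporal saliency via symmetric frame differencing: take the minimum
    of the backward and forward absolute differences, so only pixels that
    differ from BOTH neighboring frames survive."""
    d_back = np.abs(curr.astype(float) - prev.astype(float))
    d_fwd = np.abs(curr.astype(float) - nxt.astype(float))
    temporal = np.minimum(d_back, d_fwd)
    return temporal / (temporal.max() + 1e-8)

# Toy infrared-like frames: a bright point target moving one pixel per frame
frames = np.zeros((3, 32, 32))
for t, x in enumerate((10, 11, 12)):
    frames[t, 15, x] = 1.0

sal = symmetric_difference_saliency(*frames)
print(sal[15, 11] > sal[15, 10])  # True: only the middle-frame position stands out
```

In the paper this temporal map is then fused adaptively with a spatial saliency map; a fixed weighted sum would be the simplest stand-in for that fusion step.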

  11. Enhancing the wettability of high aspect-ratio through-silicon vias lined with LPCVD silicon nitride or PE-ALD titanium nitride for void-free bottom-up copper electroplating

    NARCIS (Netherlands)

    Saadaoui, M.; Zeijl, H. van; Wien, W.H.A.; Pham, H.T.M.; Kwakernaak, C.; Knoops, H.C.M.; Erwin Kessels, W.M.M.; Sanden, R.M.C.M. van de; Voogt, F.C.; Roozeboom, F.; Sarro, P.M.

    2011-01-01

    One of the critical steps toward producing void-free and uniform bottom-up copper electroplating in high aspect-ratio (AR) through-silicon vias (TSVs) is the ability of the copper electrolyte to spontaneously flow through the entire depth of the via. This can be accomplished by reducing the

  12. Enhancing the Wettability of High Aspect-Ratio Through-Silicon Vias Lined with LPCVD Silicon Nitride or PE-ALD Titanium Nitride for Void-Free Bottom-Up Copper Electroplating

    NARCIS (Netherlands)

    Saadaoui, M.; van Zeijl, H.; Wien, W. H. A.; Pham, H. T. M.; Kwakernaak, C.; Knoops, H. C. M.; Kessels, W. M. M.; R. van de Sanden,; Voogt, F. C.; Roozeboom, F.; Sarro, P. M.

    2011-01-01

    One of the critical steps toward producing void-free and uniform bottom-up copper electroplating in high aspect-ratio (AR) through-silicon vias (TSVs) is the ability of the copper electrolyte to spontaneously flow through the entire depth of the via. This can be accomplished by reducing the

  13. Analysis of Academic and Non-Academic Outcomes from a Bottom-up Comprehensive School Reform in the Absence of Student Level Data through Simulation Methods: A Mixed Methods Case Study

    Science.gov (United States)

    Sondergeld, Toni A.

    2009-01-01

    This dissertation examines the efficacy of a bottom-up comprehensive school reform (CSR) program by evaluating its impact on student achievement, attendance, and behavior outcomes through an explanatory mixed methods design. The CSR program (Gear Up) was implemented in an urban junior high school over the course of seven years allowing for…

  14. The protection pyramid approach : A contribution to the protection of internally displaced persons by combining bottom up coping mechanisms and top down protection strategies into a partnership approach to protection

    NARCIS (Netherlands)

    Janssen, Laura

    2017-01-01

    In the current debate on the protection of Internally Displaced Persons (IDPs), two groups of actors play a role: on the one hand, the displaced persons themselves (referred to as Bottom-Up actors); on the other, the State, non-State actors and other aid providers (Top-Down actors). Traditionally, more attention

  15. Culturally divergent responses to mortality salience.

    Science.gov (United States)

    Ma-Kellams, Christine; Blascovich, Jim

    2011-08-01

    Two experiments compared the effects of death thoughts, or mortality salience, on European and Asian Americans. Research on terror management theory has demonstrated that in Western cultural groups, individuals typically employ self-protective strategies in the face of death-related thoughts. Given fundamental East-West differences in self-construal (i.e., the independent vs. interdependent self), we predicted that members of Eastern cultural groups would affirm other people, rather than defend and affirm the self, after encountering conditions of mortality salience. We primed European Americans and Asian Americans with either a death or a control prime and examined the effect of this manipulation on attitudes about a person who violates cultural norms (Study 1) and on attributions about the plight of an innocent victim (Study 2). Mortality salience promoted culturally divergent responses, leading European Americans to defend the self and Asian Americans to defend other people.

  16. Discrimination learning with variable stimulus 'salience'

    Directory of Open Access Journals (Sweden)

    Treviño Mario

    2011-08-01

    Abstract. Background: In nature, sensory stimuli are organized in heterogeneous combinations. Salient items from these combinations 'stand out' from their surroundings and determine what and how we learn. Yet the relationship between varying stimulus salience and discrimination learning remains unclear. Presentation of the hypothesis: A rigorous formulation of the problem of discrimination learning should account for varying salience effects. We hypothesize that structural variations in the environment in which the conditioned stimulus (CS) is embedded will be a significant determinant of learning rate and retention level. Testing the hypothesis: Using numerical simulations, we show how a modified version of the Rescorla-Wagner model, an influential theory of associative learning, predicts relevant interactions between varying salience and discrimination learning. Implications of the hypothesis: If supported by empirical data, our model will help to interpret critical experiments addressing the relations between attention, discrimination and learning.
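The abstract does not specify the authors' modification, but the standard Rescorla-Wagner update it builds on is easy to simulate, and it already shows how salience shapes learning: stimuli trained in compound share one prediction error, and each acquires associative strength in proportion to its salience. The salience values below are arbitrary illustrations.

```python
import numpy as np

def rescorla_wagner(alphas, beta=0.1, lam=1.0, trials=200):
    """Standard Rescorla-Wagner updates for CSs trained in compound.
    `alphas` are the stimulus saliences: a higher salience gives faster
    learning and a larger share of the total associative strength."""
    V = np.zeros(len(alphas))
    history = []
    for _ in range(trials):
        error = lam - V.sum()                  # prediction error shared by all CSs
        V = V + np.asarray(alphas) * beta * error
        history.append(V.copy())
    return np.array(history)

# Two CSs differing only in salience (hypothetical values)
h = rescorla_wagner(alphas=[0.5, 0.1])
print(h[-1])  # the salient CS overshadows the weak one
```

Because every update is proportional to a stimulus's own alpha, the asymptotic strengths split in the ratio of the saliences (here 5:1), which is the classic overshadowing result.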

  17. Object recognition with hierarchical discriminant saliency networks

    Directory of Open Access Journals (Sweden)

    Sunhyoung eHan

    2014-09-01

    The benefits of integrating attention and object recognition are investigated. While attention is frequently modeled as a pre-processor for recognition, we investigate the hypothesis that attention is an intrinsic component of recognition and vice-versa. This hypothesis is tested with a recognition model, the hierarchical discriminant saliency network (HDSN), whose layers are top-down saliency detectors, tuned for a visual class according to the principles of discriminant saliency. The HDSN has two possible implementations. In a biologically plausible implementation, all layers comply with the standard neurophysiological model of visual cortex, with sub-layers of simple and complex units that implement a combination of filtering, divisive normalization, pooling, and non-linearities. In a convolutional neural network implementation, all layers are convolutional and implement a combination of filtering, rectification, and pooling. The rectification is performed with a parametric extension of the now popular rectified linear units (ReLUs), whose parameters can be tuned for the detection of target object classes. This enables a number of functional enhancements over neural network models that lack a connection to saliency, including optimal feature denoising mechanisms for recognition, modulation of saliency responses by the discriminant power of the underlying features, and the ability to detect both feature presence and absence. In either implementation, each layer has a precise statistical interpretation, and all parameters are tuned by statistical learning. Each saliency detection layer learns more discriminant saliency templates than its predecessors and higher layers have larger pooling fields. This enables the HDSN to simultaneously achieve high selectivity to target object classes and invariance. The resulting performance demonstrates benefits for all the functional enhancements of the HDSN.
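The HDSN derives its rectification from discriminant-saliency statistics, and its exact form is given in the paper. A generic parametric rectifier of the kind being described, a ReLU with a tunable threshold and negative-side slope (both hypothetical parameters here, not the HDSN's), can be sketched as:

```python
import numpy as np

def parametric_relu(x, theta=0.0, slope=0.0):
    """Generic parametric rectifier: shift the ReLU hinge to `theta` and leak
    with `slope` below it. With theta=0 and slope=0 this is a plain ReLU;
    tuning the parameters per class mimics the kind of class-specific
    rectification the HDSN learns (illustrative form, not the paper's)."""
    shifted = np.asarray(x, dtype=float) - theta
    return np.maximum(shifted, 0.0) + slope * np.minimum(shifted, 0.0)

x = np.array([-2.0, -0.5, 0.5, 2.0])
print(parametric_relu(x))                        # plain ReLU: [0. 0. 0.5 2.]
print(parametric_relu(x, theta=0.5, slope=0.1))  # [-0.25 -0.1 0. 1.5]
```

In the HDSN the analogous parameters would be fit by statistical learning per target class rather than set by hand.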

  18. Objective correlates of pitch salience using pupillometry

    DEFF Research Database (Denmark)

    Bianchi, Federica; Santurette, Sébastien; Wendt, Dorothea

    2014-01-01

    the frequency region and F0, were considered. Pupil size was measured for each condition, while the subjects' task was to detect the deviants by pressing a response button. The expected trend was that pupil size would increase with decreasing salience. Results for musically trained listeners showed the expected trend, whereby pupil size significantly increased with decreasing salience of the stimuli. Non-musically trained listeners, however, showed a smaller pupil size for the least salient condition than for a medium-salient condition, probably due to a too demanding task...

  19. Top-down and bottom-up aerosol-cloud closure: towards understanding sources of uncertainty in deriving cloud shortwave radiative flux

    Science.gov (United States)

    Sanchez, Kevin J.; Roberts, Gregory C.; Calmer, Radiance; Nicoll, Keri; Hashimshoni, Eyal; Rosenfeld, Daniel; Ovadnevaite, Jurgita; Preissler, Jana; Ceburnis, Darius; O'Dowd, Colin; Russell, Lynn M.

    2017-08-01

    Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head Atmospheric Research Station in Galway, Ireland, in August 2015. This study is part of the BACCHUS (Impact of Biogenic versus Anthropogenic emissions on Clouds and Climate: towards a Holistic UnderStanding) European collaborative project, with the goal of understanding key processes affecting aerosol-cloud shortwave radiative flux closures to improve future climate predictions and develop sustainable policies for Europe. Instrument platforms include ground-based unmanned aerial vehicles (UAVs) and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1-D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction or a five-hole probe for 3-D wind vectors. UAV cloud measurements are rare and have only become possible in recent years through the miniaturization of instrumentation. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 and 60 W m-2. After accounting for entrainment

  20. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems Using a Bottom-Up Approach and Installer Survey

    Energy Technology Data Exchange (ETDEWEB)

    Ardani, Kristen [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Feldman, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, Sean [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-11-01

    This report presents results from the first U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs—often referred to as “business process” or “soft” costs—for residential and commercial photovoltaic (PV) systems. Annual expenditure and labor-hour-productivity data are analyzed to benchmark 2010 soft costs related to the DOE priority areas of (1) customer acquisition; (2) permitting, inspection, and interconnection; (3) installation labor; and (4) installer labor for arranging third-party financing. Annual expenditure and labor-hour data were collected from 87 PV installers. After eliminating outliers, the survey sample consists of 75 installers, representing approximately 13% of all residential PV installations and 4% of all commercial installations added in 2010. Including assumed permitting fees, in 2010 the average soft costs benchmarked in this analysis total $1.50/W for residential systems (ranging from $0.66/W to $1.66/W between the 20th and 80th percentiles). For commercial systems, the median 2010 benchmarked soft costs (including assumed permitting fees) are $0.99/W for systems smaller than 250 kW (ranging from $0.51/W to $1.45/W between the 20th and 80th percentiles) and $0.25/W for systems larger than 250 kW (ranging from $0.17/W to $0.78/W between the 20th and 80th percentiles). Additional soft costs not benchmarked in the present analysis (e.g., installer profit, overhead, financing, and contracting) are significant and would add to these figures. The survey results provide a benchmark for measuring—and helping to accelerate—progress over the next decade toward achieving the DOE SunShot Initiative’s soft-cost-reduction targets. We conclude that the selected non-hardware business processes add considerable cost to U.S. PV systems, constituting 23% of residential PV system price, 17% of small commercial system price, and 5% of large commercial system price (in 2010

  1. Bottom-up and top-down triggers of diversification: A new look at the evolutionary ecology of scavenging amphipods in the deep sea

    Science.gov (United States)

    Havermans, Charlotte; Smetacek, Victor

    2018-05-01

    The initial, anthropocentric view of the deep ocean was that of a hostile environment inhabited by organisms rendered lethargic by constant high pressure, low temperature and sparse food supply, hence evolving slowly. This conceptual framework of a spatially and temporally homogeneous, connected, strongly bottom-up controlled habitat implied a strong constraint on, or poor incentive for, speciation. Hence, the discovery in the late 1960s of high species diversity of abyssal benthic invertebrates came as a surprise. Since then, the slow-motion view of deep-sea ecology and evolution has speeded up and diversified in the light of increasing evidence accumulating from in situ visual observations complemented by molecular and other tools. The emerging picture is that of a much livelier, highly diversified and more complex deep-sea fauna than previously assumed. In this review we examine the consequences of the incoming information for developing a broader view of evolutionary ecology in the deep sea, and for scavenging amphipods in particular. We revisit the food supply to the deep-sea floor and hypothesize that the dead bodies of animals, ranging from zooplankton to large fish are likely to be a more important source of food than their friable faeces. Camera observations of baited traps indicate that amphipod carrion-feeders arrive within hours at the bait which continues to draw new individuals for days to months later, presumably by scent trails in tidal currents. We explore the different stages of food acquisition upon which natural selection may have acted, from detection to ingestion, and discuss the possibility of a broader range of food acquisition strategies, including predation and specializations. Although currently neglected in deep-sea ecology, top-down factors are likely to play a more important role in the evolution of deep-sea organisms. 
Predation on amphipods at baits by bathyal and abyssal fishes, and large predatory crustaceans in the hadal zone, is

  2. A bottom-up approach in estimating the measurement uncertainty and other important considerations for quantitative analyses in drug testing for horses.

    Science.gov (United States)

    Leung, Gary N W; Ho, Emmie N M; Kwok, W Him; Leung, David K K; Tang, Francis P W; Wan, Terence S M; Wong, April S Y; Wong, Colton H F; Wong, Jenny K Y; Yu, Nola H

    2007-09-07

    Quantitative determination, particularly for threshold substances in biological samples, is much more demanding than qualitative identification. A proper assessment of any quantitative determination is the measurement uncertainty (MU) associated with the determined value. The International Standard ISO/IEC 17025, "General requirements for the competence of testing and calibration laboratories", has more prescriptive requirements on the MU than its superseded document, ISO/IEC Guide 25. Under the 2005 or 1999 versions of the new standard, an estimation of the MU is mandatory for all quantitative determinations. To comply with the new requirement, a protocol was established in the authors' laboratory in 2001. The protocol has since evolved based on our practical experience, and a refined version was adopted in 2004. This paper describes our approach in establishing the MU, as well as some other important considerations, for the quantification of threshold substances in biological samples as applied in the area of doping control for horses. The testing of threshold substances can be viewed as a compliance test (or testing to a specified limit). As such, it should only be necessary to establish the MU at the threshold level. The steps in a "Bottom-Up" approach adopted by us are similar to those described in the EURACHEM/CITAC guide, "Quantifying Uncertainty in Analytical Measurement". They involve first specifying the measurand, including the relationship between the measurand and the input quantities upon which it depends. This is followed by identifying all applicable uncertainty contributions using a "cause and effect" diagram. The magnitude of each uncertainty component is then calculated and converted to a standard uncertainty. A recovery study is also conducted to determine if the method bias is significant and whether a recovery (or correction) factor needs to be applied. 
All standard uncertainties with values greater than 30% of the largest one are then used to
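
    The combination step truncated above typically follows the EURACHEM/CITAC guide: the retained standard uncertainties are summed in quadrature to give the combined standard uncertainty. A minimal sketch of the 30% screening rule and the quadrature sum, with hypothetical component names and values:

```python
import math

def combined_uncertainty(components, cutoff=0.30):
    """Combine standard uncertainties in quadrature, keeping only those
    greater than `cutoff` times the largest component (the 30% rule
    mentioned in the abstract). Component names/values are hypothetical."""
    largest = max(components.values())
    retained = {k: u for k, u in components.items() if u > cutoff * largest}
    return math.sqrt(sum(u ** 2 for u in retained.values()))

# Hypothetical relative standard uncertainties at the threshold level
components = {
    "calibration": 0.040,
    "recovery":    0.025,
    "precision":   0.060,
    "volume":      0.005,  # below 30% of the largest (0.060), so dropped
}
u_c = combined_uncertainty(components)
```

    The combined value would then be multiplied by a coverage factor (commonly k = 2) to obtain an expanded uncertainty for the compliance decision at the threshold.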

  3. Benefits of China's efforts in gaseous pollutant control indicated by the bottom-up emissions and satellite observations 2000-2014

    Science.gov (United States)

    Xia, Yinmin; Zhao, Yu; Nielsen, Chris P.

    2016-07-01

    To evaluate the effectiveness of national air pollution control policies, the emissions of SO2, NOX, CO and CO2 in China are estimated using bottom-up methods for the most recent 15-year period (2000-2014). Vertical column densities (VCDs) from satellite observations are used to test the temporal and spatial patterns of emissions and to explore the ambient levels of gaseous pollutants across the country. The inter-annual trends in emissions and VCDs match well except for SO2. Such comparison is improved with an optimistic assumption in emission estimation that the emission standards for given industrial sources issued after 2010 have been fully enforced. Underestimation of emission abatement and enhanced atmospheric oxidization likely contribute to the discrepancy between SO2 emissions and VCDs. As suggested by VCDs and emissions estimated under the assumption of full implementation of emission standards, the control of SO2 in the 12th Five-Year Plan period (12th FYP, 2011-2015) is estimated to be more effective than that in the 11th FYP period (2006-2010), attributed to improved use of flue gas desulfurization in the power sector and implementation of new emission standards in key industrial sources. The opposite was true for CO, as energy efficiency improved more significantly from 2005 to 2010 due to closures of small industrial plants. Iron & steel production is estimated to have had particularly strong influence on temporal and spatial patterns of CO. In contrast to fast growth before 2011 driven by increased coal consumption and limited controls, NOX emissions decreased from 2011 to 2014 due to the penetration of selective catalytic/non-catalytic reduction systems in the power sector. This led to reduced NO2 VCDs, particularly in relatively highly polluted areas such as the eastern China and Pearl River Delta regions. 
In developed areas, transportation is playing an increasingly important role in air pollution, as suggested by the increased ratio of NO2 to SO

  4. Top-down and Bottom-up aerosol-cloud closure: towards understanding sources of uncertainty in deriving cloud radiative flux

    Science.gov (United States)

    Sanchez, K.; Roberts, G.; Calmer, R.; Nicoll, K.; Hashimshoni, E.; Rosenfeld, D.; Ovadnevaite, J.; Preissler, J.; Ceburnis, D.; O'Dowd, C. D. D.; Russell, L. M.

    2017-12-01

    Top-down and bottom-up aerosol-cloud shortwave radiative flux closures were conducted at the Mace Head atmospheric research station in Galway, Ireland in August 2015. Instrument platforms include ground-based, unmanned aerial vehicles (UAV), and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1D microphysical aerosol-cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction, or a 5-hole probe for 3D wind vectors. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in-situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 W m-2 and 60 W m-2. After accounting for entrainment, satellite-derived cloud droplet number concentrations (CDNC) were within 30% of simulated CDNC. In cases with a well-mixed boundary layer, δRF is no greater than 20 W m-2 after accounting for cloud-top entrainment, and up to 50 W m-2 when entrainment is not taken into account. In cases with a decoupled boundary layer, cloud microphysical properties are inconsistent with ground-based aerosol measurements, as expected, and δRF is as high as 88 W m-2, even high (> 30 W m-2) after

  5. The role of ethnic school segregation for adolescents’ religious salience

    OpenAIRE

    Van der Bracht, Koen; D'hondt, Fanny; Van Houtte, Mieke; Van de Putte, Bart; Stevens, Peter

    2016-01-01

    Public concerns over the possible effects of school segregation on immigrant and ethnic majority religiosity have been on the rise over the last few years. In this paper we focus on (1) the association between ethnic school composition and religious salience, (2) intergenerational differences in religious salience and (3) the role of ethnic school composition for intergenerational differences in religious salience. We perform analyses on religious salience, one five-point Likert scale item me...

  6. Visualization of neural networks using saliency maps

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.; Kjems, Ulrik; Hansen, Lars Kai

    1995-01-01

    The saliency map is proposed as a new method for understanding and visualizing the nonlinearities embedded in feedforward neural networks, with emphasis on the ill-posed case, where the dimensionality of the input-field by far exceeds the number of examples. Several levels of approximations...

  7. Referent Salience Affects Second Language Article Use

    Science.gov (United States)

    Trenkic, Danijela; Pongpairoj, Nattama

    2013-01-01

    The effect of referent salience on second language (L2) article production in real time was explored. Thai (-articles) and French (+articles) learners of English described dynamic events involving two referents, one visually cued to be more salient at the point of utterance formulation. Definiteness marking was made communicatively redundant with…

  8. Transcriptomics-guided bottom-up and top-down venomics of neonate and adult specimens of the arboreal rear-fanged Brown Treesnake, Boiga irregularis, from Guam.

    Science.gov (United States)

    Pla, Davinia; Petras, Daniel; Saviola, Anthony J; Modahl, Cassandra M; Sanz, Libia; Pérez, Alicia; Juárez, Elena; Frietze, Seth; Dorrestein, Pieter C; Mackessy, Stephen P; Calvete, Juan J

    2018-03-01

    neonate and adult B. irregularis from Guam, further highlighting evolutionary trends in venom composition among rear-fanged venomous snakes. The Brown Treesnake (Boiga irregularis) has caused extensive ecological and economic damage to the island of Guam where it has become a classic example of the negative impacts of invasive species. In the current study, we report the first combined transcriptomic and proteomic analysis of B. irregularis venom of Guam origin. The transcriptome of an adult snake contained toxin sequences belonging to 18 protein families, with three-finger toxin (3FTx) isoforms being the most abundant and representing 94% of all venom protein transcript reads. Our bottom-up and top-down venomic analyses confirmed that 3FTxs are the major components of B. irregularis venom, and a comparative analysis of neonate and adult venoms demonstrate a clear ontogenetic shift in toxin abundance, likely driven by dietary variation between the two age classes. Second-generation antivenomics and Western blot analysis using purified anti-Brown Treesnake rabbit serum IgGs (anti-BTS IgGs) showed strong immunoreactivity toward B. irregularis venom. Interestingly, our anti-BTS IgGs did not cross-react with 3FTxs found in several other rear-fanged snake venoms, or against 3FTxs in the venom of the elapid Ophiophagus hannah, indicating that epitopes in these 3FTx molecules are quite distinct. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Top-down and bottom-up aerosol–cloud closure: towards understanding sources of uncertainty in deriving cloud shortwave radiative flux

    Directory of Open Access Journals (Sweden)

    K. J. Sanchez

    2017-08-01

    Full Text Available Top-down and bottom-up aerosol–cloud shortwave radiative flux closures were conducted at the Mace Head Atmospheric Research Station in Galway, Ireland, in August 2015. This study is part of the BACCHUS (Impact of Biogenic versus Anthropogenic emissions on Clouds and Climate: towards a Holistic UnderStanding) European collaborative project, with the goal of understanding key processes affecting aerosol–cloud shortwave radiative flux closures to improve future climate predictions and develop sustainable policies for Europe. Instrument platforms include ground-based unmanned aerial vehicles (UAVs) and satellite measurements of aerosols, clouds and meteorological variables. The ground-based and airborne measurements of aerosol size distributions and cloud condensation nuclei (CCN) concentration were used to initiate a 1-D microphysical aerosol–cloud parcel model (ACPM). UAVs were equipped for a specific science mission, with an optical particle counter for aerosol distribution profiles, a cloud sensor to measure cloud extinction or a five-hole probe for 3-D wind vectors. UAV cloud measurements are rare and have only become possible in recent years through the miniaturization of instrumentation. These are the first UAV measurements at Mace Head. ACPM simulations are compared to in situ cloud extinction measurements from UAVs to quantify closure in terms of cloud shortwave radiative flux. Two out of seven cases exhibit sub-adiabatic vertical temperature profiles within the cloud, which suggests that entrainment processes affect cloud microphysical properties and lead to an overestimate of simulated cloud shortwave radiative flux. Including an entrainment parameterization and explicitly calculating the entrainment fraction in the ACPM simulations both improved cloud-top radiative closure. Entrainment reduced the difference between simulated and observation-derived cloud-top shortwave radiative flux (δRF) by between 25 and 60 W m−2. After

  10. The impact of napping on memory for future-relevant stimuli: Prioritization among multiple salience cues.

    Science.gov (United States)

    Bennion, Kelly A; Payne, Jessica D; Kensinger, Elizabeth A

    2016-06-01

    Prior research has demonstrated that sleep enhances memory for future-relevant information, including memory for information that is salient due to emotion, reward, or knowledge of a later memory test. Although sleep has been shown to prioritize information with any of these characteristics, the present study investigates the novel question of how sleep prioritizes information when multiple salience cues exist. Participants encoded scenes that were future-relevant based on emotion (emotional vs. neutral), reward (rewarded vs. unrewarded), and instructed learning (intentionally vs. incidentally encoded), preceding a delay consisting of a nap, an equivalent time period spent awake, or a nap followed by wakefulness (to control for effects of interference). Recognition testing revealed that when multiple dimensions of future relevance co-occur, sleep prioritizes top-down, goal-directed cues (instructed learning, and to a lesser degree, reward) over bottom-up, stimulus-driven characteristics (emotion). Further, results showed that these factors interact; the effect of a nap on intentionally encoded information was especially strong for neutral (relative to emotional) information, suggesting that once one cue for future relevance is present, there are diminishing returns with additional cues. Sleep may binarize information based on whether it is future-relevant or not, preferentially consolidating memory for the former category. Potential neural mechanisms underlying these selective effects and the implications of this research for educational and vocational domains are discussed. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  11. A bottom-up perspective on leadership of collaborative innovation in the public sector:The social construction of leadership for disadvantaged city districts in The City of Copenhagen

    OpenAIRE

    Hansen, Jesper Rohr

    2014-01-01

    The thesis investigates how new forms of public leadership can contribute to solving complex problems in today’s welfare societies through innovation. A bottom-up type of leadership for collaborative innovation addressing wicked problems is theorised, displaying a social constructive process approach to leadership; a theoretical model emphasises that leadership emerges through social processes of recognition. Leadership is recognised by utilising the uncertainty of a wicked problem and innova...

  12. Reconciling Basin-Scale Top-Down and Bottom-Up Methane Emission Measurements for Onshore Oil and Gas Development: Cooperative Research and Development Final Report, CRADA Number CRD-14-572

    Energy Technology Data Exchange (ETDEWEB)

    Heath, Garvin A. [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-12-04

    The overall objective of the Research Partnership to Secure Energy for America (RPSEA)-funded research project is to develop independent estimates of methane emissions using top-down and bottom-up measurement approaches and then to compare the estimates, including consideration of uncertainty. Such approaches will be applied at two scales: basin and facility. At facility scale, multiple methods will be used to measure methane emissions of the whole facility (controlled dual tracer and single tracer releases, aircraft-based mass balance and Gaussian back-trajectory), which are considered top-down approaches. The bottom-up approach will sum emissions from identified point sources measured using appropriate source-level measurement techniques (e.g., high-flow meters). At basin scale, the top-down estimate will come from boundary layer airborne measurements upwind and downwind of the basin, using a regional mass balance model plus approaches to separate atmospheric methane emissions attributed to the oil and gas sector. The bottom-up estimate will result from statistical modeling (also known as scaling up) of measurements made at selected facilities, with gaps filled through measurements and other estimates based on other studies. The relative comparison of the bottom-up and top-down estimates made at both scales will help improve understanding of the accuracy of the tested measurement and modeling approaches. The subject of this CRADA is NREL's contribution to the overall project. This project resulted from winning a competitive solicitation no. RPSEA RFP2012UN001, proposal no. 12122-95, which is the basis for the overall project. This Joint Work Statement (JWS) details the contributions of NREL and Colorado School of Mines (CSM) in performance of the CRADA effort.

  13. Exploring the underlying structure of mental disorders: cross-diagnostic differences and similarities from a network perspective using both a top-down and a bottom-up approach.

    Science.gov (United States)

    Wigman, J T W; van Os, J; Borsboom, D; Wardenaar, K J; Epskamp, S; Klippel, A; Viechtbauer, W; Myin-Germeys, I; Wichers, M

    2015-08-01

    It has been suggested that the structure of psychopathology is best described as a complex network of components that interact in dynamic ways. The goal of the present paper was to examine the concept of psychopathology from a network perspective, combining complementary top-down and bottom-up approaches using momentary assessment techniques. A pooled Experience Sampling Method (ESM) dataset of three groups (individuals with a diagnosis of depression, psychotic disorder or no diagnosis) was used (pooled N = 599). The top-down approach explored the network structure of mental states across different diagnostic categories. For this purpose, networks of five momentary mental states ('cheerful', 'content', 'down', 'insecure' and 'suspicious') were compared between the three groups. The complementary bottom-up approach used principal component analysis to explore whether empirically derived network structures yield meaningful higher order clusters. Individuals with a clinical diagnosis had more strongly connected moment-to-moment network structures, especially the depressed group. This group also showed more interconnections specifically between positive and negative mental states than the psychotic group. In the bottom-up approach, all possible connections between mental states were clustered into seven main components that together captured the main characteristics of the network dynamics. Our combination of (i) comparing network structure of mental states across three diagnostically different groups and (ii) searching for trans-diagnostic network components across all pooled individuals showed that these two approaches yield different, complementary perspectives in the field of psychopathology. The network paradigm therefore may be useful to map transdiagnostic processes.
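
    As a rough sketch of the top-down part of this approach, one can estimate a lag-1 (moment-to-moment) association network over the five momentary mental states and summarize its overall connectivity per group. The simulated data and estimation choices below are illustrative only, not those of the study:

```python
import numpy as np

STATES = ["cheerful", "content", "down", "insecure", "suspicious"]

def lag1_network(series):
    """Estimate a lag-1 association network from momentary ratings:
    entry [i, j] is the correlation between state j at time t and
    state i at time t+1. A simplified stand-in for the study's models."""
    t0, t1 = series[:-1], series[1:]
    n = series.shape[1]
    net = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            net[i, j] = np.corrcoef(t1[:, i], t0[:, j])[0, 1]
    return net

def mean_connectivity(net):
    """Overall connection strength: mean absolute off-diagonal weight.
    Comparing this across groups mimics the 'more strongly connected
    networks in clinical groups' contrast."""
    mask = ~np.eye(net.shape[0], dtype=bool)
    return np.abs(net[mask]).mean()

# Illustrative: simulated momentary ratings for one group of participants
rng = np.random.default_rng(0)
series = rng.normal(size=(200, len(STATES)))
net = lag1_network(series)
density = mean_connectivity(net)
```

    In the study's design, a higher `density` for the depressed group than for controls would correspond to the reported stronger moment-to-moment interconnections.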

  14. Decision salience signals in posterior cingulate cortex

    Directory of Open Access Journals (Sweden)

    Sarah eHeilbronner

    2011-04-01

    Full Text Available Despite its phylogenetic antiquity and clinical importance, the posterior cingulate cortex (CGp) remains an enigmatic nexus of attention, memory, motivation, and decision making. Here we show that CGp neurons track decision salience—the degree to which an option differs from a standard—but not the subjective value of a decision. To do this, we recorded the spiking activity of CGp neurons in monkeys choosing between options varying in reward-related risk, delay to reward, and social outcomes, each of which varied in level of decision salience. Firing rates were higher when monkeys chose the risky option, consistent with their risk-seeking preferences, but were also higher when monkeys chose the delayed and social options, contradicting their preferences. Thus, across decision contexts, neuronal activity was uncorrelated with how much monkeys valued a given option, as inferred from choice. Instead, neuronal activity signaled the deviation of the chosen option from the standard, independently of how it differed. The observed decision salience signals suggest a role for CGp in the flexible allocation of neural resources to motivationally significant information, akin to the role of attention in selective processing of sensory inputs.

  15. The visual attention saliency map for movie retrospection

    Science.gov (United States)

    Rogalska, Anna; Napieralski, Piotr

    2018-04-01

    The visual saliency map is becoming important and challenging for many scientific disciplines (robotic systems, psychophysics, cognitive neuroscience and computer science). The map created by the model indicates possible salient regions by taking into consideration face presence and motion, which are essential in motion pictures. By combining these cues we can obtain a credible saliency map with a low computational cost.
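
    The abstract does not specify how the face and motion cues are combined; a common low-cost choice is a weighted sum of normalized conspicuity maps, sketched here with hypothetical weights:

```python
import numpy as np

def normalize(m):
    """Scale a map to [0, 1]; an all-constant map becomes all zeros."""
    span = m.max() - m.min()
    return (m - m.min()) / span if span > 0 else np.zeros_like(m)

def combine_saliency(face_map, motion_map, w_face=0.6, w_motion=0.4):
    """Weighted fusion of face and motion conspicuity maps.
    The weights are illustrative assumptions, not values from the paper."""
    return w_face * normalize(face_map) + w_motion * normalize(motion_map)

# Toy 4x4 frame: one detected face region, one moving region
face = np.zeros((4, 4)); face[1, 1] = 1.0
motion = np.zeros((4, 4)); motion[2, 3] = 2.0
sal = combine_saliency(face, motion)
```

    With these weights the face region dominates the fused map, matching the intuition that faces draw gaze most strongly in film viewing.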

  16. The visual attention saliency map for movie retrospection

    Directory of Open Access Journals (Sweden)

    Rogalska Anna

    2018-04-01

    Full Text Available The visual saliency map is becoming important and challenging for many scientific disciplines (robotic systems, psychophysics, cognitive neuroscience and computer science). The map created by the model indicates possible salient regions by taking into consideration face presence and motion, which are essential in motion pictures. By combining these cues we can obtain a credible saliency map with a low computational cost.

  17. Mortality salience increases adherence to salient norms and values

    NARCIS (Netherlands)

    Gailliot, M.T.; Stillman, T.F.; Schmeichel, B.J.; Maner, J.K.; Plant, E.A.

    2008-01-01

    Four studies indicate that mortality salience increases adherence to social norms and values, but only when cultural norms and values are salient. In Study 1, mortality salience coupled with a reminder about cultural values of egalitarianism reduced prejudice toward Blacks among non-Black

  18. The Aberrant Salience Inventory: A New Measure of Psychosis Proneness

    Science.gov (United States)

    Cicero, David C.; Kerns, John G.; McCarthy, Denis M.

    2010-01-01

    Aberrant salience is the unusual or incorrect assignment of salience, significance, or importance to otherwise innocuous stimuli and has been hypothesized to be important for psychosis and psychotic disorders such as schizophrenia. Despite the importance of this concept in psychosis research, no questionnaire measures are available to assess…

  19. What Drives Farmers to Make Top-Down or Bottom-Up Adaptation to Climate Change and Fluctuations? A Comparative Study on 3 Cases of Apple Farming in Japan and South Africa

    Science.gov (United States)

    Fujisawa, Mariko; Kobayashi, Kazuhiko; Johnston, Peter; New, Mark

    2015-01-01

    Agriculture is one of the most vulnerable sectors to climate change. Farmers have been exposed to multiple stressors including climate change, and they have managed to adapt to those risks. The adaptation actions undertaken by farmers and their decision making are, however, only poorly understood. By studying adaptation practices undertaken by apple farmers in three regions: Nagano and Kazuno in Japan and Elgin in South Africa, we categorize the adaptation actions into two types: farmer initiated bottom-up adaptation and institution led top-down adaptation. We found that the driver which differentiates the type of adaptation likely adopted was strongly related to the farmers’ characteristics, particularly their dependence on the institutions, e.g. the farmers’ cooperative, in selling their products. The farmers who rely on the farmers’ cooperative for their sales are likely to adopt the institution-led adaptation, whereas the farmers who have established their own sales channels tend to start innovative actions by bottom-up. We further argue that even though the two types have contrasting features, the combinations of the both types of adaptations could lead to more successful adaptation particularly in agriculture. This study also emphasizes that more farm-level studies for various crops and regions are warranted to provide substantial feedbacks to adaptation policy. PMID:25822534

  1. Improving Saliency Models by Predicting Human Fixation Patches

    KAUST Repository

    Dubey, Rachit

    2015-04-16

    There is growing interest in studying the Human Visual System (HVS) to supplement and improve the performance of computer vision tasks. A major challenge for current visual saliency models is predicting saliency in cluttered scenes (i.e. a high false positive rate). In this paper, we propose a fixation patch detector that predicts image patches that contain human fixations with high probability. Our proposed model detects sparse fixation patches with an accuracy of 84% and eliminates non-fixation patches with an accuracy of 84%, demonstrating that low-level image features can indeed be used to short-list and identify human fixation patches. We then show how these detected fixation patches can be used as saliency priors for popular saliency models, thus reducing false positives while maintaining true positives. Extensive experimental results show that our proposed approach allows state-of-the-art saliency methods to achieve better prediction performance on benchmark datasets.
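
    The patch-level idea can be illustrated with a toy detector. The sketch below scores non-overlapping patches by a single hypothetical low-level feature (local RMS contrast) and thresholds them to produce a boolean prior grid; it is an illustration of the patch-scoring concept, not the authors' trained model, and the patch size and threshold are made-up parameters.

```python
import numpy as np

def contrast_score(patch):
    """Simple low-level feature: local RMS contrast of an image patch."""
    return float(np.std(patch))

def detect_fixation_patches(image, patch=8, thresh=0.1):
    """Score non-overlapping patches and flag likely fixation patches.

    Returns a boolean patch grid that could act as a saliency prior,
    e.g. by multiplying it into an existing saliency map.
    """
    h, w = image.shape
    gh, gw = h // patch, w // patch
    prior = np.zeros((gh, gw), dtype=bool)
    for i in range(gh):
        for j in range(gw):
            blk = image[i * patch:(i + 1) * patch, j * patch:(j + 1) * patch]
            prior[i, j] = contrast_score(blk) > thresh
    return prior

# A flat background with one textured (high-contrast) region:
rng = np.random.default_rng(0)
img = np.zeros((32, 32))
img[8:16, 8:16] = rng.random((8, 8))   # textured patch at grid cell (1, 1)
prior = detect_fixation_patches(img, patch=8, thresh=0.1)
```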

  2. Measuring and modeling salience with the theory of visual attention.

    Science.gov (United States)

    Krüger, Alexander; Tünnermann, Jan; Scharlau, Ingrid

    2017-08-01

    For almost three decades, the theory of visual attention (TVA) has been successful in mathematically describing and explaining a wide variety of phenomena in visual selection and recognition with high quantitative precision. Interestingly, the influence of feature contrast on attention has been included in TVA only recently, although it has been extensively studied outside the TVA framework. The present approach further develops this extension of TVA's scope by measuring and modeling salience. An empirical measure of salience is achieved by linking different (orientation and luminance) contrasts to a TVA parameter. In the modeling part, the function relating feature contrasts to salience is described mathematically and tested against alternatives by Bayesian model comparison. This model comparison reveals that the power function is an appropriate model of salience growth in the dimensions of orientation and luminance contrast. Furthermore, if contrasts from the two dimensions are combined, salience adds up additively.
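
    The reported functional form can be sketched directly: salience grows as a power function of feature contrast, and contributions from different dimensions combine additively. The coefficients below are invented for illustration, not fitted TVA parameter estimates.

```python
import numpy as np

def salience(contrast, a, b):
    """Power-function growth of salience with feature contrast."""
    return a * contrast ** b

def combined_salience(orient_c, lum_c, a_o=1.0, b_o=0.5, a_l=0.8, b_l=0.6):
    """Additive combination of orientation- and luminance-contrast salience.

    All coefficients here are hypothetical placeholders.
    """
    return salience(orient_c, a_o, b_o) + salience(lum_c, a_l, b_l)

s_orient_only = combined_salience(0.25, 0.0)   # orientation contrast alone
s_both = combined_salience(0.25, 0.1)          # both dimensions present
```

Under this additive model, adding luminance contrast to an orientation-contrast target can only increase the combined salience.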

  4. Object detection system based on multimodel saliency maps

    Science.gov (United States)

    Guo, Ya'nan; Luo, Chongfan; Ma, Yide

    2017-03-01

    Detection of visually salient image regions is extensively applied in computer vision and computer graphics, for tasks such as object detection, adaptive compression, and object recognition, but any single model has its limitations across the variety of images. In our work, we therefore establish a method based on multimodel saliency maps to detect objects, which intelligently absorbs the merits of various individual saliency detection models to achieve promising results. The method can be roughly divided into three steps: first, we propose a decision-making system that evaluates saliency maps obtained by seven competitive methods and selects only the three most valuable ones; second, we introduce a heterogeneous PCNN algorithm to obtain three prime foregrounds, and a self-designed nonlinear fusion method then merges these saliency maps; finally, an adaptive improved and simplified PCNN (SPCNN) model is used to detect the object. Our proposed method can constitute an object detection system for different occasions; it requires no training and is simple and highly efficient. The proposed saliency fusion technique shows good performance over a broad range of images and broadens applicability by fusing different individual saliency models, making the proposed system a strong one. Moreover, the proposed adaptive improved SPCNN model stems from Eckhorn's neuron model, which is well suited to image segmentation because of its biological background, and all of its parameters adapt to image information.
    We extensively appraise our algorithm on a classical salient object detection database, and the experimental results demonstrate that the aggregation of saliency maps outperforms the best single saliency model in all cases, yielding the highest precision of 89.90%, better recall of 98.20%, the greatest F-measure of 91.20%, and the lowest mean absolute error of 0.057; the value of the proposed saliency evaluation
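
    The fusion step can be sketched generically. The rule below (normalize each map, apply a power nonlinearity to suppress weak responses, average, renormalize) is a hypothetical stand-in for the paper's self-designed nonlinear fusion, shown only to make the map-combination idea concrete.

```python
import numpy as np

def fuse_saliency_maps(maps, gamma=2.0):
    """Fuse several saliency maps with a simple nonlinear rule.

    Each map is min-max normalized to [0, 1], raised to a power so weak
    responses are suppressed, then the results are averaged and the fused
    map is renormalized. gamma is an illustrative choice, not a value
    from the paper.
    """
    fused = np.zeros_like(maps[0], dtype=float)
    for m in maps:
        m = m.astype(float)
        span = m.max() - m.min()
        if span > 0:
            m = (m - m.min()) / span
        fused += m ** gamma
    fused /= len(maps)
    if fused.max() > 0:
        fused /= fused.max()
    return fused

# Three tiny maps that agree on the bottom-right pixel being salient:
m1 = np.array([[0.0, 0.2], [0.1, 1.0]])
m2 = np.array([[0.1, 0.1], [0.0, 0.9]])
m3 = np.array([[0.0, 0.3], [0.2, 0.8]])
fused = fuse_saliency_maps([m1, m2, m3])
```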

  5. Mediatization

    DEFF Research Database (Denmark)

    Hjarvard, Stig

    2017-01-01

    Mediatization research shares media effects studies' ambition of answering the difficult questions with regard to whether and how media matter and influence contemporary culture and society. The two approaches nevertheless differ fundamentally in that mediatization research seeks answers...... to these general questions by distinguishing between two concepts: mediation and mediatization. The media effects tradition generally considers the effects of the media to be a result of individuals being exposed to media content, i.e. effects are seen as an outcome of mediated communication. Mediatization...... research is concerned with long-term structural changes involving media, culture, and society, i.e. the influences of the media are understood in relation to how media are implicated in social and cultural changes and how these processes come to create new conditions for human communication and interaction...

  6. Salience of the lambs: a test of the saliency map hypothesis with pictures of emotive objects.

    Science.gov (United States)

    Humphrey, Katherine; Underwood, Geoffrey; Lambert, Tony

    2012-01-25

    Humans have an ability to rapidly detect emotive stimuli. However, many emotional objects in a scene are also highly visually salient, which raises the question of how dependent the effects of emotionality are on visual saliency and whether the presence of an emotional object changes the power of a more visually salient object in attracting attention. Participants were shown a set of positive, negative, and neutral pictures and completed recall and recognition memory tests. Eye movement data revealed that visual saliency does influence eye movements, but the effect is reliably reduced when an emotional object is present. Pictures containing negative objects were recognized more accurately and recalled in greater detail, and participants fixated more on negative objects than positive or neutral ones. Initial fixations were more likely to be on emotional objects than more visually salient neutral ones, suggesting that the processing of emotional features occurs at a very early stage of perception.

  7. Distribution of attention modulates salience signals in early visual cortex

    NARCIS (Netherlands)

    Mulckhuyse, M.; Belopolsky, A.V.; Heslenfeld, D.J.; Talsma, D.; Theeuwes, J.

    2011-01-01

    Previous research has shown that the extent to which people spread attention across the visual field plays a crucial role in visual selection and the occurrence of bottom-up driven attentional capture. Consistent with previous findings, we show that when attention was diffusely distributed across

  8. How longer saccade latencies lead to a competition for salience

    NARCIS (Netherlands)

    de Vries, Jelmer P.; Hooge, Ignace T.C.; Wiering, Marco A.; Verstraten, Frans A.J.

    It has been suggested that independent bottom-up and top-down processes govern saccadic selection. However, recent findings are hard to explain in such terms. We hypothesized that differences in visual-processing time can explain these findings, and we tested this using search displays containing

  9. Saliency detection by conditional generative adversarial network

    Science.gov (United States)

    Cai, Xiaoxu; Yu, Hui

    2018-04-01

    Detecting salient objects in images has been a fundamental problem in computer vision. In recent years, deep learning has shown impressive performance on many kinds of vision tasks. In this paper, we propose a new method to detect salient objects by using a Conditional Generative Adversarial Network (GAN). This type of network not only learns the mapping from RGB images to salient regions, but also learns a loss function for training the mapping. To the best of our knowledge, this is the first time that a Conditional GAN has been used in salient object detection. We evaluate our saliency detection method on two large publicly available datasets with pixel-accurate annotations. The experimental results show significant and consistent improvements over the state-of-the-art method on a challenging dataset, and the testing speed is much faster.

  10. Salience Effects in the North-West of England

    Directory of Open Access Journals (Sweden)

    Sandra Jansen

    2014-06-01

    Full Text Available The questions of how we can define salience, what properties it includes, and how we can quantify it have been discussed widely over the past thirty years, but we still have more questions than answers about this phenomenon, e.g. not only how salience arises, but also how we can define it. However, despite the lack of a clear definition, salience is often invoked as an explanatory factor in language change. The scientific discourse on salience has in most cases revolved around phonetic features, while hardly any variables on other linguistic levels have been investigated in terms of their salience. Hence, one goal of this paper is to argue for an expanded view of salience in the sociolinguistic context. This article investigates the variation and change of two groups of variables in Carlisle, an urban speech community in the north-west of England. I analyse the variable (th), and in particular the replacement of /θ/ with [f], which is widely known as th-fronting. The use of three discourse markers is also examined. Both groups of features will then be discussed in the light of sociolinguistic salience.

  11. Finding the Secret of Image Saliency in the Frequency Domain.

    Science.gov (United States)

    Li, Jia; Duan, Ling-Yu; Chen, Xiaowu; Huang, Tiejun; Tian, Yonghong

    2015-12-01

    There are two sides to every story of visual saliency modeling in the frequency domain. On the one hand, image saliency can be effectively estimated by applying simple operations to the frequency spectrum. On the other hand, it is still unclear which part of the frequency spectrum contributes the most to popping-out targets and suppressing distractors. Toward this end, this paper tentatively explores the secret of image saliency in the frequency domain. From the results obtained in several qualitative and quantitative experiments, we find that the secret of visual saliency may mainly hide in the phases of intermediate frequencies. To explain this finding, we reinterpret the concept of discrete Fourier transform from the perspective of template-based contrast computation and thus develop several principles for designing the saliency detector in the frequency domain. Following these principles, we propose a novel approach to design the saliency detector under the assistance of prior knowledge obtained through both unsupervised and supervised learning processes. Experimental results on a public image benchmark show that the learned saliency detector outperforms 18 state-of-the-art approaches in predicting human fixations.
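
    A minimal sketch of the underlying idea that the phase spectrum carries the saliency signal: discard the amplitude spectrum, reconstruct from phase alone, and take the squared magnitude. This is a generic phase-only construction in the spirit of the finding described above, not the authors' learned detector, and it omits the intermediate-frequency selection and smoothing a practical detector would add.

```python
import numpy as np

def phase_saliency(image):
    """Saliency map from the phase spectrum alone (flat amplitude)."""
    f = np.fft.fft2(image)
    phase_only = np.exp(1j * np.angle(f))   # keep phase, discard amplitude
    recon = np.abs(np.fft.ifft2(phase_only)) ** 2
    return recon / recon.max()              # normalize to [0, 1]

# A single pop-out square on a uniform background:
img = np.zeros((32, 32))
img[12:20, 12:20] = 1.0
sal = phase_saliency(img)
```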

  12. Bridging the Gap between the Nanometer-Scale Bottom-Up and Micrometer-Scale Top-Down Approaches for Site-Defined InP/InAs Nanowires.

    Science.gov (United States)

    Zhang, Guoqiang; Rainville, Christophe; Salmon, Adrian; Takiguchi, Masato; Tateno, Kouta; Gotoh, Hideki

    2015-11-24

    This work presents a method that bridges the gap between the nanometer-scale bottom-up and micrometer-scale top-down approaches for site-defined nanostructures, which has long been a significant challenge for applications that require low-cost and high-throughput manufacturing processes. We realized the bridging by controlling the seed indium nanoparticle position through a self-assembly process. Site-defined InP nanowires were then grown from the indium-nanoparticle array in the vapor-liquid-solid mode through a "seed and grow" process. The nanometer-scale indium particles do not always occupy the same locations within the micrometer-scale open window of an InP exposed substrate due to the scale difference. We developed a technique for aligning the nanometer-scale indium particles on the same side of the micrometer-scale window by structuring the surface of a misoriented InP (111)B substrate. Finally, we demonstrated that the developed method can be used to grow a uniform InP/InAs axial-heterostructure nanowire array. The ability to form a heterostructure nanowire array with this method makes it possible to tune the emission wavelength over a wide range by employing the quantum confinement effect and thus expand the application of this technology to optoelectronic devices. Successfully pairing a controllable bottom-up growth technique with a top-down substrate preparation technique greatly improves the potential for the mass-production and widespread adoption of this technology.

  13. Properties of V1 neurons tuned to conjunctions of visual features: application of the V1 saliency hypothesis to visual search behavior.

    Directory of Open Access Journals (Sweden)

    Li Zhaoping

    Full Text Available From a computational theory of V1, we formulate an optimization problem to investigate neural properties in the primary visual cortex (V1 from human reaction times (RTs in visual search. The theory is the V1 saliency hypothesis that the bottom-up saliency of any visual location is represented by the highest V1 response to it relative to the background responses. The neural properties probed are those associated with the less known V1 neurons tuned simultaneously or conjunctively in two feature dimensions. The visual search is to find a target bar unique in color (C, orientation (O, motion direction (M, or redundantly in combinations of these features (e.g., CO, MO, or CM among uniform background bars. A feature singleton target is salient because its evoked V1 response largely escapes the iso-feature suppression on responses to the background bars. The responses of the conjunctively tuned cells are manifested in the shortening of the RT for a redundant feature target (e.g., a CO target from that predicted by a race between the RTs for the two corresponding single feature targets (e.g., C and O targets. Our investigation enables the following testable predictions. Contextual suppression on the response of a CO-tuned or MO-tuned conjunctive cell is weaker when the contextual inputs differ from the direct inputs in both feature dimensions, rather than just one. Additionally, CO-tuned cells and MO-tuned cells are often more active than the single feature tuned cells in response to the redundant feature targets, and this occurs more frequently for the MO-tuned cells such that the MO-tuned cells are no less likely than either the M-tuned or O-tuned neurons to be the most responsive neuron to dictate saliency for an MO target.
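
    The race-model baseline mentioned above can be simulated: if two independent processes (say, a color-driven and an orientation-driven response) race for detection, the predicted RT for a redundant CO target on each trial is the minimum of the two single-feature RTs. The gamma RT distributions below are illustrative placeholders, not fitted data; an observed redundant-target mean RT faster than this race prediction would indicate a contribution beyond the race, e.g. from conjunctively tuned cells.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical single-feature RT distributions (milliseconds):
rt_c = rng.gamma(shape=5.0, scale=60.0, size=n)   # color-only target
rt_o = rng.gamma(shape=5.0, scale=60.0, size=n)   # orientation-only target

# Race-model prediction for the redundant (CO) target: per-trial minimum.
rt_race = np.minimum(rt_c, rt_o)

mean_single = rt_c.mean()
mean_race = rt_race.mean()
```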

  15. Bottom-up Experiments and Concrete Utopias

    DEFF Research Database (Denmark)

    Andersson, Lasse

    2010-01-01

    The article examines how user-driven experiments can challenge the standardized, business-oriented version of the Experience City and, through experimentation, stimulate locally anchored and democratic versions of an experience- and knowledge-based city....

  16. Teaching Listening Comprehension: Bottom-Up Approach

    Science.gov (United States)

    Khuziakhmetov, Anvar N.; Porchesku, Galina V.

    2016-01-01

    Improving listening comprehension skills is one of the urgent contemporary educational problems in the field of second language acquisition. Understanding how L2 listening comprehension works can have a serious influence on language pedagogy. The aim of the paper is to discuss the practical and methodological value of the notion of the perception…

  17. Mobile Handsets from the Bottom Up

    DEFF Research Database (Denmark)

    Wallis, Cara; Linchuan Qiu, Jack; Ling, Richard

    2013-01-01

    The setting could be a hole-in-the-wall that serves as a shop in a narrow alley in Guangzhou, a cart on a dusty street on the outskirts of Accra, a bustling marketplace in Mexico City, or a tiny storefront near downtown Los Angeles’ garment district. At such locales, men and women hawk an array o......-income, largely immigrant communities in cities in the developed world....

  18. Policy intersections: Strengthening bottom up accountability amidst ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... presented both economic opportunities and threats to traditional livelihoods. This trend has reduced access to key resources, like water and grazing land, upon which ... approach of marginalized rural groups, limits any accountability strategy. ... women and youth in achieving land and associated resource tenure rights in ...

  19. Police reform from the bottom up

    African Journals Online (AJOL)

    and practitioners across the usual North-South divide. Few police ... colleagues in the North. ... The introduction sets out the ... descriptive overview of key reforms in American policing. ... departments to illustrate the scope for involving officers ...

  20. Milk bottom-up proteomics: method optimisation.

    Directory of Open Access Journals (Sweden)

    Delphine eVincent

    2016-01-01

    Full Text Available Milk is a complex fluid whose proteome displays a diverse set of high-abundance proteins such as caseins, and medium- to low-abundance whey proteins such as β-lactoglobulin, lactoferrin, immunoglobulins, glycoproteins, peptide hormones and enzymes. A sample preparation method that enables high reproducibility and throughput is key to reliably identifying proteins present or proteins responding to conditions such as diet, health or genetics. Using skim milk samples from Jersey and Holstein-Friesian cows, we compared three extraction procedures which have not previously been applied to samples of cows’ milk. Method A (urea) involved a simple dilution of the milk in a urea-based buffer, method B (TCA/acetone) involved a trichloroacetic acid (TCA)/acetone precipitation, and method C (methanol/chloroform) involved a tri-phasic partition method in chloroform/methanol solution. Protein assays, SDS-PAGE profiling, and trypsin digestion followed by nanoHPLC-electrospray ionisation-tandem mass spectrometry (nLC-ESI-MS/MS) analyses were performed to assess their efficiency. Replicates were used at each analytical step (extraction, digestion, injection) to assess reproducibility. Mass spectrometry (MS) data are available via ProteomeXchange with identifier PXD002529. Overall, 186 unique accessions, major and minor proteins, were identified with a combination of methods. Method C (methanol/chloroform) yielded the best resolved SDS patterns and highest protein recovery rates, method A (urea) yielded the greatest number of accessions, and, of the three procedures, method B (TCA/acetone) was the least compatible with a wide range of downstream analytical procedures. Our results also highlighted breed differences between the proteins in milk of Jersey and Holstein-Friesian cows.

  1. Research and Development from the bottom up

    DEFF Research Database (Denmark)

    Brem, Alexander; Wolfram, P.

    2014-01-01

    and ecological context or the growing interest of developed market firms in approaches from emerging markets. Hence, the presented framework supports further research in new paradigms for research and development (R&D) in developed market firms (DMFs), particularly in relation to emerging markets. This framework...... enables scholars to compare concepts from developed and emerging markets, to address studies specifically by using consistent terms, and to advance research into the concepts according their characterization....

  2. Top down vision or bottom up demand?

    African Journals Online (AJOL)

    Town and Regional Planning, Vol 56 (2010).

  3. Diversification of visual media retrieval results using saliency detection

    Science.gov (United States)

    Muratov, Oleg; Boato, Giulia; De Natale, Franesco G. B.

    2013-03-01

    Diversification of retrieval results allows for better and faster search. Recently, different methods have been proposed for diversifying image retrieval results, mainly utilizing text information and techniques imported from the natural language processing domain. However, images contain visual information that is impossible to describe in text, and the use of visual features is inevitable. Visual saliency is information about the main object of an image, implicitly included by humans while creating visual content. For this reason it is natural to exploit this information for the task of diversifying the content. In this work we study whether visual saliency can be used for the task of diversification and propose a method for re-ranking image retrieval results using saliency. The evaluation has shown that the use of saliency information results in higher diversity of retrieval results.
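
    Diversity-aware re-ranking of this kind is often done greedily. The sketch below is a generic MMR-style re-ranker, not the paper's algorithm: `feats` stands in for per-result descriptors (imagined here as pooled saliency-map features), and each step picks the item that best balances relevance against similarity to items already chosen.

```python
import numpy as np

def diversify(scores, feats, k, lam=0.5):
    """Greedy MMR-style re-ranking: return indices of k diverse results.

    scores: relevance score per result; feats: per-result feature vectors;
    lam trades off relevance (lam) against redundancy (1 - lam).
    """
    remaining = list(range(len(scores)))
    chosen = []
    while remaining and len(chosen) < k:
        best, best_val = None, -np.inf
        for i in remaining:
            # Similarity to the closest already-chosen item (dot product).
            sim = max((float(feats[i] @ feats[j]) for j in chosen), default=0.0)
            val = lam * scores[i] - (1 - lam) * sim
            if val > best_val:
                best, best_val = i, val
        chosen.append(best)
        remaining.remove(best)
    return chosen

# Items 0 and 1 are near-duplicates; item 2 is different but less relevant.
scores = np.array([0.9, 0.85, 0.5])
feats = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
order = diversify(scores, feats, k=2)
```

With these toy inputs the re-ranker skips the near-duplicate and returns the relevant item plus the dissimilar one.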

  4. Mortality salience increases personal relevance of the norm of reciprocity.

    Science.gov (United States)

    Schindler, Simon; Reinhard, Marc-André; Stahlberg, Dagmar

    2012-10-01

    Research on terror management theory found evidence that people under mortality salience strive to live up to salient cultural norms and values, like egalitarianism, pacifism, or helpfulness. A basic, strongly internalized norm in most human societies is the norm of reciprocity: people should support those who supported them (i.e., positive reciprocity), and people should injure those who injured them (i.e., negative reciprocity), respectively. In an experiment (N = 98; 47 women, 51 men), mortality salience overall significantly increased personal relevance of the norm of reciprocity (M = 4.45, SD = 0.65) compared to a control condition (M = 4.19, SD = 0.59). Specifically, under mortality salience there was higher motivation to punish those who treated them unfavourably (negative norm of reciprocity). Unexpectedly, relevance of the norm of positive reciprocity remained unaffected by mortality salience. Implications and limitations are discussed.

  5. Shallow and deep convolutional networks for saliency prediction

    OpenAIRE

    Pan, Junting; Sayrol Clols, Elisa; Giró Nieto, Xavier; McGuinness, Kevin; O'Connor, Noel

    2016-01-01

    The prediction of salient areas in images has been traditionally addressed with hand-crafted features based on neuroscience principles. This paper, however, addresses the problem with a completely data-driven approach by training a convolutional neural network (convnet). The learning process is formulated as a minimization of a loss function that measures the Euclidean distance of the predicted saliency map with the provided ground truth. The recent publication of large datasets of saliency p...

  6. Saliency predicts change detection in pictures of natural scenes.

    Science.gov (United States)

    Wright, Michael J

    2005-01-01

    It has been proposed that the visual system encodes the salience of objects in the visual field in an explicit two-dimensional map that guides visual selective attention. Experiments were conducted to determine whether salience measurements applied to regions of pictures of outdoor scenes could predict the detection of changes in those regions. To obtain a quantitative measure of change detection, observers located changes in pairs of colour pictures presented across an interstimulus interval (ISI). Salience measurements were then obtained from different observers for image change regions using three independent methods, and all were positively correlated with change detection. Factor analysis extracted a single saliency factor that accounted for 62% of the variance contained in the four measures. Finally, estimates of the magnitude of the image change in each picture pair were obtained, using nine separate visual filters representing low-level vision features (luminance, colour, spatial frequency, orientation, edge density). None of the feature outputs was significantly associated with change detection or saliency. On the other hand it was shown that high-level (structural) properties of the changed region were related to saliency and to change detection: objects were more salient than shadows and more detectable when changed.

  7. Gaze distribution analysis and saliency prediction across age groups.

    Science.gov (United States)

    Krishna, Onkar; Helo, Andrea; Rämä, Pia; Aizawa, Kiyoharu

    2018-01-01

    Knowledge of the human visual system helps to develop better computational models of visual attention. State-of-the-art models have been developed to mimic the visual attention system of young adults but largely ignore the variations that occur with age. In this paper, we investigated how visual scene processing changes with age and we propose an age-adapted framework that helps to develop a computational model that can predict saliency across different age groups. Our analysis uncovers how the explorativeness of an observer varies with age, how well saliency maps of an age group agree with fixation points of observers from the same or different age groups, and how age influences the center bias tendency. We analyzed the eye movement behavior of 82 observers belonging to four age groups while they explored visual scenes. Explorativeness was quantified in terms of the entropy of a saliency map, and the area under the curve (AUC) metric was used to quantify the agreement analysis and the center bias tendency. Analysis results were used to develop age-adapted saliency models. Our results suggest that the proposed age-adapted saliency model outperforms existing saliency models in predicting the regions of interest across age groups.
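
    The entropy measure of explorativeness is straightforward to compute: treat the saliency map as a probability distribution and take its Shannon entropy, so a tightly focused map scores low and a spread-out map scores high. A minimal sketch (the 4x4 maps are toy examples, not the study's stimuli):

```python
import numpy as np

def saliency_entropy(sal_map):
    """Shannon entropy (bits) of a saliency map treated as a distribution.

    Higher entropy = more spread-out predicted gaze = more explorative.
    """
    p = sal_map.astype(float).ravel()
    p = p / p.sum()
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))

focused = np.zeros((4, 4))
focused[0, 0] = 1.0                   # all mass in one cell
uniform = np.ones((4, 4))             # mass spread over all 16 cells

h_focused = saliency_entropy(focused)
h_uniform = saliency_entropy(uniform)
```

For the uniform 16-cell map the entropy is log2(16) = 4 bits; for the single-cell map it is 0.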

  8. A review of bottom-up vs. top-down control of sponges on Caribbean fore-reefs: what’s old, what’s new, and future directions

    Directory of Open Access Journals (Sweden)

    Joseph R. Pawlik

    2018-01-01

    Full Text Available Interest in the ecology of sponges on coral reefs has grown in recent years with mounting evidence that sponges are becoming dominant members of reef communities, particularly in the Caribbean. New estimates of water column processing by sponge pumping activities combined with discoveries related to carbon and nutrient cycling have led to novel hypotheses about the role of sponges in reef ecosystem function. Among these developments, a debate has emerged about the relative effects of bottom-up (food availability) and top-down (predation) control on the community of sponges on Caribbean fore-reefs. In this review, we evaluate the impact of the latest findings on the debate, as well as provide new insights based on older citations. Recent studies that employed different research methods have demonstrated that dissolved organic carbon (DOC) and detritus are the principal sources of food for a growing list of sponge species, challenging the idea that the relative availability of living picoplankton is the sole proxy for sponge growth or abundance. New reports have confirmed earlier findings that reef macroalgae release labile DOC available for sponge nutrition. Evidence for top-down control of sponge community structure by fish predation is further supported by gut content studies and historical population estimates of hawksbill turtles, which likely had a much greater impact on relative sponge abundances on Caribbean reefs of the past. Implicit to investigations designed to address the bottom-up vs. top-down debate are appropriate studies of Caribbean fore-reef environments, where benthic communities are relatively homogeneous and terrestrial influences and abiotic effects are minimized. One recent study designed to test both aspects of the debate did so using experiments conducted entirely in shallow lagoonal habitats dominated by mangroves and seagrass beds. The top-down results from this study are reinterpreted as supporting past research

  9. Design of the Bottom-up Innovation project--a participatory, primary preventive, organizational level intervention on work-related stress and well-being for workers in Dutch vocational education.

    Science.gov (United States)

    Schelvis, Roosmarijn M C; Oude Hengel, Karen M; Wiezer, Noortje M; Blatter, Birgitte M; van Genabeek, Joost A G M; Bohlmeijer, Ernst T; van der Beek, Allard J

    2013-08-15

    In the educational sector job demands have intensified, while job resources have remained the same. A prolonged imbalance between demands and resources contributes to lowered vitality and a heightened need for recovery, eventually resulting in burnout, sickness absence and retention problems. Until now, stress management interventions in education have focused mostly on strengthening the individual capacity to cope with stress, instead of altering the sources of stress at work at the organizational level. These interventions have been only partly effective in influencing burnout and well-being. Therefore, the "Bottom-up Innovation" project tests a two-phased participatory, primary preventive, organizational level intervention (i.e. a participatory action approach) that targets and engages all workers in the primary process of schools. It is hypothesized that participating in the project results in increased occupational self-efficacy and organizational efficacy. The central research question is: is an organization-focused stress management intervention based on participatory action effective in reducing the need for recovery and enhancing vitality in school employees, in comparison to business as usual? The study is designed as a controlled trial with mixed methods and three measurement moments: baseline (quantitative measures), six months and 18 months (quantitative and qualitative measures). At the first follow-up, the short-term effects of taking part in the needs assessment (phase 1) will be determined. At the second follow-up, the long-term effects of taking part in the needs assessment will be determined, as well as the effects of implemented tailored workplace solutions (phase 2). A process evaluation based on quantitative and qualitative data will shed light on whether, how and why the intervention (does not) work(s). "Bottom-up Innovation" is a combined effort of the educational sector, intervention providers and researchers. Results will provide insight into (1) the relation between

  10. Methylphenidate alters selective attention by amplifying salience.

    Science.gov (United States)

    ter Huurne, Niels; Fallon, Sean James; van Schouwenburg, Martine; van der Schaaf, Marieke; Buitelaar, Jan; Jensen, Ole; Cools, Roshan

    2015-12-01

    Methylphenidate, the most common treatment for attention deficit hyperactivity disorder (ADHD), is increasingly used by healthy individuals as a "smart drug" to enhance cognitive abilities such as attention. A key feature of (selective) attention is the ability to ignore irrelevant but salient information in the environment (distractors). Although crucial for cognitive performance, it is not yet known how methylphenidate affects resistance to attentional capture by distractors. The present study aims to clarify how methylphenidate affects distractor suppression in healthy individuals. The effect of methylphenidate (20 mg) on distractor suppression was assessed in healthy subjects (N = 20) in a within-subject, double-blind, placebo-controlled crossover design. We used a visuospatial attention task with target faces flanked by strong (faces) or weak (scrambled faces) distractors. Methylphenidate increased accuracy on trials that required gender identification of target face stimuli (methylphenidate 88.9 ± 1.4 % [mean ± SEM], placebo 86.0 ± 1.2 %; p = .003), suggesting increased processing of the faces. At the same time, however, methylphenidate increased reaction time when the target face was flanked by a face distractor relative to a scrambled face distractor (methylphenidate 34.9 ± 3.73, placebo 26.7 ± 2.84 ms; p = .027), suggesting enhanced attentional capture by distractors with task-relevant features. We conclude that methylphenidate amplifies the salience of task-relevant information at the level of the stimulus category. This leads to enhanced processing of the target (faces) but also increased attentional capture by distractors drawn from the same category as the target.

  11. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands

    Directory of Open Access Journals (Sweden)

    Rita S. W. Yam

    2016-02-01

    Full Text Available Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands, and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunction in these restored habitats. We conducted no-choice laboratory feeding experiments on P. canaliculata using five macrophyte species common in constructed wetlands: Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates and the highest reproductive output, whereas all individuals fed Phragmites showed the lowest feeding rates and little growth, with the poorest reproductive output. Plant N and P contents were important for enhancing palatability and supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had strongly deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of poorly-fed snails in constructed wetlands dominated by less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve a selective planting strategy using macrophytes with low nutrient content and high toughness, cellulose and phenolic contents.

  12. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands.

    Science.gov (United States)

    Yam, Rita S W; Fan, Yen-Tzu; Wang, Tzu-Ting

    2016-02-24

    Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands, and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunction in these restored habitats. We conducted no-choice laboratory feeding experiments on P. canaliculata using five macrophyte species common in constructed wetlands: Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates and the highest reproductive output, whereas all individuals fed Phragmites showed the lowest feeding rates and little growth, with the poorest reproductive output. Plant N and P contents were important for enhancing palatability and supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had strongly deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of poorly-fed snails in constructed wetlands dominated by less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve a selective planting strategy using macrophytes with low nutrient content and high toughness, cellulose and phenolic contents.

  13. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands

    Science.gov (United States)

    Yam, Rita S. W.; Fan, Yen-Tzu; Wang, Tzu-Ting

    2016-01-01

    Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands, and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunction in these restored habitats. We conducted no-choice laboratory feeding experiments on P. canaliculata using five macrophyte species common in constructed wetlands: Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates and the highest reproductive output, whereas all individuals fed Phragmites showed the lowest feeding rates and little growth, with the poorest reproductive output. Plant N and P contents were important for enhancing palatability and supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had strongly deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of poorly-fed snails in constructed wetlands dominated by less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve a selective planting strategy using macrophytes with low nutrient content and high toughness, cellulose and phenolic contents. PMID:26927135

  14. Correct primary structure assessment and extensive glyco-profiling of cetuximab by a combination of intact, middle-up, middle-down and bottom-up ESI and MALDI mass spectrometry techniques.

    Science.gov (United States)

    Ayoub, Daniel; Jabs, Wolfgang; Resemann, Anja; Evers, Waltraud; Evans, Catherine; Main, Laura; Baessmann, Carsten; Wagner-Rousset, Elsa; Suckau, Detlev; Beck, Alain

    2013-01-01

    The European Medicines Agency recently received the first marketing authorization application for a biosimilar monoclonal antibody (mAb) and adopted the final guidelines on biosimilar mAbs and Fc-fusion proteins. The agency requires high similarity between biosimilar and reference products for approval. Specifically, the amino acid sequences must be identical. The glycosylation pattern of the antibody is also often considered a very important quality attribute because of its strong effect on quality, safety, immunogenicity, pharmacokinetics and potency. Here, we describe a case study of cetuximab, which has been marketed since 2004. Biosimilar versions of the product are now in the pipelines of numerous therapeutic antibody biosimilar developers. We applied a combination of intact, middle-down, middle-up and bottom-up electrospray ionization and matrix-assisted laser desorption ionization mass spectrometry techniques to characterize the amino acid sequence and major post-translational modifications of the marketed cetuximab product, with special emphasis on glycosylation. Our results revealed an error in the reported sequence of the light chain in databases and publications, highlighting the power of mass spectrometry to establish correct antibody sequences. We were also able to achieve comprehensive identification of cetuximab's glycoforms and assessment of the glycosylation profile on both the Fab and Fc domains. Taken together, the reported approaches and data form a solid framework for comparability studies of antibodies and their biosimilar candidates that could be further applied to routine structural assessments of these and other antibody-based products.

  15. A top-down / bottom-up approach for multi-actors and multi-criteria assessment of mining projects for sustainable development. Application on Arlit Uranium mines (Niger)

    International Nuclear Information System (INIS)

    Chamaret, A.

    2007-06-01

    This thesis appraises the relevance of using a hybrid top-down / bottom-up approach to evaluate mining projects from the perspective of sustainable development. With the advent of the corporate social responsibility and sustainable development concepts, new social expectations have appeared towards companies that go beyond the sole requirement of profitability. If companies do not answer these expectations, they risk losing their social legitimacy. Traditionally associated with social, environmental, economic and political impacts and risks, mining activity is particularly concerned by these new issues. Whereas mineral resource needs have never been so high, mining companies are now expected to limit their negative effects and to take their different audiences' expectations into account in order to define, together, the terms of their social license to operate. Considering the diversity of issues, scales, actors and contexts, the challenge is real and requires tools to better understand the issues and to structure dialogue. Based on the case study of the uranium mines of Arlit (Niger), this work shows that combining participatory approaches with structuring tools and propositions from the literature is an efficient way to organize the diversity of issues and to build a structured dialogue between mining companies and their stakeholders. The first part presents the theoretical, institutional and sectoral contexts of the thesis. The second part presents the work and results of the evaluation carried out in Niger. The third part draws the conclusions of this work and proposes an evaluation framework potentially applicable to other mining sites. (author)

  16. Bottom-up electrochemical preparation of solid-state carbon nanodots directly from nitriles/ionic liquids using carbon-free electrodes and the applications in specific ferric ion detection and cell imaging.

    Science.gov (United States)

    Niu, Fushuang; Xu, Yuanhong; Liu, Mengli; Sun, Jing; Guo, Pengran; Liu, Jingquan

    2016-03-14

    Carbon nanodots (C-dots), a potential alternative to conventional semiconductor quantum dots, have attracted considerable attention for applications including bio-chemical sensing and cell imaging, owing to their chemical inertness, low toxicity and flexible functionalization. Various methods, including electrochemical (EC) ones, have been reported for the synthesis of C-dots, but complex procedures and/or carbon-source-containing electrodes are often required. Herein, solid-state C-dots were simply prepared by bottom-up EC carbonization of nitriles (e.g. acetonitrile) in the presence of an ionic liquid [e.g. 1-butyl-3-methylimidazolium hexafluorophosphate (BMIMPF6)], using carbon-free electrodes. Owing to the positive charge of BMIM(+) on the C-dots, the final products precipitated onto the cathode, and the unreacted nitriles and BMIMPF6 could easily be removed by simple vacuum filtration. The as-prepared solid-state C-dots disperse well in aqueous media and show excellent photoluminescence properties. The average size of the C-dots was 3.02 ± 0.12 nm, as evidenced by transmission electron microscopy. Other techniques, such as UV-vis spectroscopy, fluorescence spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy, were applied to characterize the C-dots and to analyze the possible generation mechanism. These C-dots have been successfully applied to efficient cell imaging and specific ferric ion detection.

  17. Visual saliency in MPEG-4 AVC video stream

    Science.gov (United States)

    Ammar, M.; Mitrea, M.; Hasnaoui, M.; Le Callet, P.

    2015-03-01

    Visual saliency maps have already proved their efficiency in a large variety of image/video communication applications, from selective compression and channel coding to watermarking. Such saliency maps are generally based on different visual characteristics (like color, intensity, orientation, motion, ...) computed from the pixel representation of the visual content. This paper summarizes and extends our previous work devoted to the definition of a saliency map extracted solely from the MPEG-4 AVC stream syntax elements. The MPEG-4 AVC saliency map thus defined is a fusion of static and dynamic maps. The static saliency map is in turn a combination of intensity, color and orientation feature maps. Beyond the particular way in which these elementary maps are computed, the fusion technique used to combine them plays a critical role in the final result and is the object of the present study. A total of 48 fusion formulas (6 for combining the static features and, for each of them, 8 for combining the static with the dynamic features) are investigated. The performance of the obtained maps is evaluated on a public database organized at IRCCyN, by computing two objective metrics: the Kullback-Leibler divergence and the area under the ROC curve.
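    The abstract names the two evaluation metrics but not the fusion formulas themselves. As a hedged illustration only, the sketch below shows one plausible fusion choice (a weighted linear combination of the three static feature maps, which is hypothetical here; the 48 formulas actually evaluated may differ) together with the Kullback-Leibler divergence between a fixation-density map and a predicted saliency map:

```python
import numpy as np

def fuse_linear(intensity, color, orientation, weights=(1/3, 1/3, 1/3)):
    """One plausible fusion formula (hypothetical): a weighted linear
    combination of the three static feature maps, rescaled to [0, 1]."""
    fused = (weights[0] * intensity
             + weights[1] * color
             + weights[2] * orientation)
    return fused / fused.max()

def kl_divergence(saliency, fixation_density, eps=1e-12):
    """Kullback-Leibler divergence between an empirical fixation-density
    map and a predicted saliency map, both normalized to sum to 1
    (lower is better)."""
    p = fixation_density / fixation_density.sum()
    q = saliency / saliency.sum()
    return float(np.sum(p * np.log((p + eps) / (q + eps))))
```

A perfect prediction gives a divergence of zero, and larger values indicate a worse match between the predicted map and where observers actually looked.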

  18. Landmark Detection in Orbital Images Using Salience Histograms

    Science.gov (United States)

    Wagstaff, Kiri L.; Panetta, Julian; Schorghofer, Norbert; Greeley, Ronald; Pendleton Hoffer, Mary; Bunte, Melissa

    2010-01-01

    NASA's planetary missions have collected, and continue to collect, massive volumes of orbital imagery. The volume is such that it is difficult to manually review all of the data and determine its significance. As a result, images are indexed and searchable by location and date but generally not by their content. A new automated method analyzes images and identifies "landmarks," or visually salient features such as gullies, craters, dust devil tracks, and the like. The technique uses a statistical measure of salience derived from information theory, so it is not tied to any specific landmark type. It identifies regions that are unusual or that stand out from their surroundings, so the resulting landmarks are context-sensitive areas that can be used to recognize the same area when it is encountered again. A machine learning classifier is used to identify the type of each discovered landmark. For a specified window size, an intensity histogram is computed for each window position as the window slides across the image. Next, a salience map is computed that specifies, for each pixel, the salience of the window centered at that pixel. The salience map is thresholded at the upper quartile of salience values to identify landmark contours (polygons). Descriptive attributes are then extracted for each landmark polygon: size, perimeter, mean intensity, standard deviation of intensity, and shape features derived from an ellipse fit.
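    The abstract specifies sliding-window intensity histograms, an information-theoretic salience measure, and an upper-quartile threshold, but not the exact statistic. A minimal sketch under those assumptions, scoring each window by the KL divergence of its histogram from the global image histogram (the authors' exact measure may differ):

```python
import numpy as np

def window_salience(image, win=16, bins=16, eps=1e-9):
    """Per-pixel salience: KL divergence of the sliding window's
    intensity histogram from the global histogram, so windows whose
    intensity statistics differ from the image as a whole score high.
    Assumed formulation; the authors' statistic may differ."""
    g_hist, _ = np.histogram(image, bins=bins, range=(0, 256))
    g = g_hist / g_hist.sum() + eps
    h, w = image.shape
    half = win // 2
    sal = np.zeros((h, w))
    for y in range(half, h - half):
        for x in range(half, w - half):
            patch = image[y - half:y + half, x - half:x + half]
            p_hist, _ = np.histogram(patch, bins=bins, range=(0, 256))
            p = p_hist / p_hist.sum() + eps
            sal[y, x] = np.sum(p * np.log(p / g))  # KL(local || global)
    return sal

def landmark_mask(sal):
    """Keep the upper quartile of salience values, as in the abstract,
    to obtain candidate landmark regions for contour extraction."""
    return sal >= np.quantile(sal, 0.75)
```

On an image that is uniform except for one bright patch, windows overlapping the patch diverge strongly from the global histogram and survive the quartile threshold, while uniform background does not.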

  19. Visual Saliency Prediction and Evaluation across Different Perceptual Tasks.

    Directory of Open Access Journals (Sweden)

    Shafin Rahman

    Full Text Available Saliency maps produced by different algorithms are often evaluated by comparing their output to the fixated image locations appearing in human eye tracking data. Such evaluation is challenging, however, because properties of eye movement patterns that are independent of image content, notably spatial bias in the fixation data, may limit the validity of the results. To address this problem, we present modeling and evaluation results for data derived from different perceptual tasks related to the concept of saliency. We also present a novel approach to benchmarking that deals with some of the challenges posed by spatial bias. The results establish the value of alternatives to fixation data for driving the improvement and development of models. We also demonstrate an approach to approximating the output of alternative perceptual tasks based on computational saliency and/or eye gaze data. As a whole, this work presents novel benchmarking results and methods, establishes a new performance baseline for perceptual tasks that provide an alternative window into visual saliency, and demonstrates the capacity for saliency to serve in approximating human behaviour for one visual task given data from another.
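    The abstract does not detail its benchmarking approach. A standard remedy for spatial (center) bias in fixation-based evaluation, shown purely for illustration rather than as the paper's method, is the shuffled AUC, which draws negative samples from fixations on other images so that any spatial bias shared across images cancels out:

```python
import numpy as np

def shuffled_auc(saliency, fixations, other_fixations):
    """Shuffled AUC: positives are saliency values at this image's
    fixated pixels; negatives are saliency values at fixation locations
    borrowed from *other* images. Spatial bias common to all images
    (e.g. center bias) then affects positives and negatives alike."""
    pos = saliency[tuple(np.array(fixations).T)]
    neg = saliency[tuple(np.array(other_fixations).T)]
    # AUC == probability that a random positive outranks a random negative
    greater = (pos[:, None] > neg[None, :]).sum()
    ties = (pos[:, None] == neg[None, :]).sum()
    return (greater + 0.5 * ties) / (pos.size * neg.size)
```

A score of 1.0 means every fixated location outranks every borrowed negative; 0.5 is chance, which is also what a pure center-bias map scores under this metric.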

  20. Robust online tracking via adaptive samples selection with saliency detection

    Science.gov (United States)

    Yan, Jia; Chen, Xi; Zhu, QiuPing

    2013-12-01

    Online tracking has been shown to be successful in tracking previously unknown objects. However, two factors lead to the drift problem in online tracking: one is how to select correctly labeled samples even when the target locations are inaccurate, and the other is how to handle confusors that have features similar to the target's. In this article, we propose a robust online tracking algorithm with adaptive sample selection based on saliency detection to overcome the drift problem. To avoid degrading the classifiers with misaligned samples, we introduce a saliency detection method into our tracking problem. Saliency maps and the strong classifiers are combined to extract the most reliable positive samples. Our approach employs a simple yet effective saliency detection algorithm based on image spectral residual analysis. Furthermore, instead of using random patches as the negative samples, we propose a reasonable selection criterion in which both saliency confidence and similarity are considered, with the benefit that confusors in the surrounding background are incorporated into the classifier update process before drift occurs. The tracking task is formulated as binary classification via an online boosting framework. Experiment results in several challen
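    The image spectral residual analysis the abstract relies on is the well-known saliency method of Hou & Zhang (CVPR 2007). A minimal grayscale sketch of that method (the paper's exact implementation details, such as smoothing kernel size and post-processing, are not given in the abstract):

```python
import numpy as np

def _box_filter(a, k=3):
    """Local mean with circular boundary handling, adequate for
    smoothing a log-amplitude spectrum (which is itself periodic)."""
    out = np.zeros_like(a)
    for dy in range(-(k // 2), k // 2 + 1):
        for dx in range(-(k // 2), k // 2 + 1):
            out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
    return out / (k * k)

def spectral_residual_saliency(image):
    """Saliency via spectral residual analysis (Hou & Zhang, 2007):
    the residual of the log-amplitude spectrum after subtracting its
    local average, recombined with the original phase. `image` is a
    2-D grayscale array; the result is normalized to [0, 1]."""
    f = np.fft.fft2(image.astype(float))
    log_amp = np.log(np.abs(f) + 1e-9)
    phase = np.angle(f)
    residual = log_amp - _box_filter(log_amp)       # spectral residual
    sal = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2
    return sal / sal.max()
```

Because the smooth, redundant part of the spectrum is subtracted away, only statistically "surprising" image structure survives the inverse transform, which is what makes the map useful for picking well-aligned positive samples.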