WorldWideScience

Sample records for bottom-up control factors

  1. Bottom-up and top-down controls on picoplankton in the East China Sea

    Science.gov (United States)

    Guo, C.; Liu, H.; Zheng, L.; Song, S.; Chen, B.; Huang, B.

    2013-05-01

    Dynamics of picoplankton population distribution in the East China Sea (ECS), a marginal sea in the western North Pacific Ocean, were studied during two "CHOICE-C" cruises in August 2009 (summer) and January 2010 (winter). Dilution experiments were conducted during the two cruises to investigate growth and grazing among picophytoplankton populations. Picoplankton accounted for an average of ~29% (2% to 88%) of community carbon biomass in the ECS, with lower percentages in the plume region than in the shelf and Kuroshio regions. Averaged growth rates (μ) for Prochlorococcus (Pro), Synechococcus (Syn) and picoeukaryotes (peuk) were 0.36, 0.89 and 0.90 d−1, respectively, in summer, and 0.46, 0.58 and 0.56 d−1, respectively, in winter. Seawater salinity and nutrient availability exerted significant controls on picoplankton growth rates. Averaged grazing mortalities (m) were 0.46, 0.63 and 0.68 d−1 in summer, and 0.22, 0.32 and 0.22 d−1 in winter for Pro, Syn and peuk, respectively. The three populations showed very different distribution patterns regionally and seasonally, affected by both bottom-up and top-down controls. In summer, Pro, Syn and peuk were dominant in the Kuroshio, transitional and plume regions, respectively. Protist grazing consumed 84%, 78% and 73% of production in summer and 45%, 47% and 57% in winter for Pro, Syn and peuk, respectively, suggesting more significant top-down control in summer. In winter, all three populations tended to be distributed in offshore regions, although the area of coverage differed (peuk > Syn > Pro). Bottom-up factors can explain as much as 91.5%, 82% and 81.2% of the variance in Pro, Syn and peuk abundance in winter, but only 59.1% and 43.7% for Pro and peuk in summer. Regionally, Yangtze River discharge plays a significant role in modulating the intensity of top-down control, as indicated by the significant negative association between salinity and the grazing mortality of all three populations and higher grazing mortality to growth rate ratio
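
    The growth (μ) and grazing mortality (m) rates quoted above come from dilution experiments. A minimal sketch of how such rates are typically extracted (the classic Landry-Hassett regression of apparent growth against dilution level; the numbers below are illustrative placeholders, not this study's data):

    ```python
    # Hypothetical illustration of the dilution-method regression used to
    # estimate phytoplankton growth (mu) and grazing mortality (m).
    # Apparent growth rate k(x) in a bottle containing a fraction x of whole
    # seawater is modelled as k(x) = mu - m * x, so a linear fit of k against
    # x gives m (negative of the slope) and mu (the intercept).
    import numpy as np

    dilution_fraction = np.array([0.2, 0.4, 0.6, 0.8, 1.0])            # fraction of whole seawater
    apparent_growth = np.array([0.78, 0.66, 0.52, 0.41, 0.28])         # d^-1, made-up values

    slope, intercept = np.polyfit(dilution_fraction, apparent_growth, 1)
    mu = intercept   # intrinsic growth rate (d^-1)
    m = -slope       # grazing mortality (d^-1)

    print(f"mu = {mu:.2f} d^-1, m = {m:.2f} d^-1")
    # Ratios such as m/mu (or production-based equivalents) are then used to
    # express the fraction of production removed by grazers.
    ```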

  2. Bottom-up and top-down controls on picoplankton in the East China Sea

    Directory of Open Access Journals (Sweden)

    C. Guo

    2013-05-01

    Full Text Available Dynamics of picoplankton population distribution in the East China Sea (ECS), a marginal sea in the western North Pacific Ocean, were studied during two "CHOICE-C" cruises in August 2009 (summer) and January 2010 (winter). Dilution experiments were conducted during the two cruises to investigate growth and grazing among picophytoplankton populations. Picoplankton accounted for an average of ~29% (2% to 88%) of community carbon biomass in the ECS, with lower percentages in the plume region than in the shelf and Kuroshio regions. Averaged growth rates (μ) for Prochlorococcus (Pro), Synechococcus (Syn) and picoeukaryotes (peuk) were 0.36, 0.89 and 0.90 d−1, respectively, in summer, and 0.46, 0.58 and 0.56 d−1, respectively, in winter. Seawater salinity and nutrient availability exerted significant controls on picoplankton growth rates. Averaged grazing mortalities (m) were 0.46, 0.63 and 0.68 d−1 in summer, and 0.22, 0.32 and 0.22 d−1 in winter for Pro, Syn and peuk, respectively. The three populations showed very different distribution patterns regionally and seasonally, affected by both bottom-up and top-down controls. In summer, Pro, Syn and peuk were dominant in the Kuroshio, transitional and plume regions, respectively. Protist grazing consumed 84%, 78% and 73% of production in summer and 45%, 47% and 57% in winter for Pro, Syn and peuk, respectively, suggesting more significant top-down control in summer. In winter, all three populations tended to be distributed in offshore regions, although the area of coverage differed (peuk > Syn > Pro). Bottom-up factors can explain as much as 91.5%, 82% and 81.2% of the variance in Pro, Syn and peuk abundance in winter, but only 59.1% and 43.7% for Pro and peuk in summer. Regionally, Yangtze River discharge plays a significant role in modulating the intensity of top-down control, as indicated by the significant negative association between salinity and the grazing mortality of all three populations and higher grazing mortality to

  3. Top-down vs. bottom-up control on vegetation composition in a tidal marsh depends on scale

    NARCIS (Netherlands)

    Elschot, Kelly; Vermeulen, Anke; Vandenbruwaene, Wouter; Bakker, Jan P.; Bouma, Tjeerd J.; Stahl, Julia; Castelijns, Henk; Temmerman, Stijn

    2017-01-01

    The relative impact of top-down control by herbivores and bottom-up control by environmental conditions on vegetation is a subject of debate in ecology. In this study, we hypothesize that top-down control by goose foraging and bottom-up control by sediment accretion on vegetation composition with

  4. Top-down vs. bottom-up control on vegetation composition in a tidal marsh depends on scale.

    Science.gov (United States)

    Elschot, Kelly; Vermeulen, Anke; Vandenbruwaene, Wouter; Bakker, Jan P; Bouma, Tjeerd J; Stahl, Julia; Castelijns, Henk; Temmerman, Stijn

    2017-01-01

    The relative impact of top-down control by herbivores and bottom-up control by environmental conditions on vegetation is a subject of debate in ecology. In this study, we hypothesize that top-down control by goose foraging and bottom-up control by sediment accretion on vegetation composition within an ecosystem can co-occur but operate at different spatial and temporal scales. We used a highly dynamic marsh system with a large population of the Greylag goose (Anser anser) to investigate the potential importance of spatial and temporal scales on these processes. At the local scale, Greylag geese grub for below-ground storage organs of the vegetation, thereby creating bare patches of a few square metres within the marsh vegetation. In our study, such activities by Greylag geese allowed them to exert top-down control by setting back vegetation succession. However, we found that the patches reverted back to the initial vegetation type within 12 years. At large spatial (i.e. several square kilometres) and temporal scales (i.e. decades), high rates of sediment accretion surpassing the rate of local sea-level rise were found to drive long-term vegetation succession and increased cover of several climax vegetation types. In summary, we conclude that the vegetation composition within this tidal marsh was primarily controlled by the bottom-up factor of sediment accretion, which operates at large spatial as well as temporal scales. Top-down control exerted by herbivores was found to be a secondary process and operated at much smaller spatial and temporal scales.

  5. Adaptive genetic variation mediates bottom-up and top-down control in an aquatic ecosystem

    Science.gov (United States)

    Rudman, Seth M.; Rodriguez-Cabal, Mariano A.; Stier, Adrian; Sato, Takuya; Heavyside, Julian; El-Sabaawi, Rana W.; Crutsinger, Gregory M.

    2015-01-01

    Research in eco-evolutionary dynamics and community genetics has demonstrated that variation within a species can have strong impacts on associated communities and ecosystem processes. Yet, these studies have centred around individual focal species and at single trophic levels, ignoring the role of phenotypic variation in multiple taxa within an ecosystem. Given the ubiquitous nature of local adaptation, and thus intraspecific variation, we sought to understand how combinations of intraspecific variation in multiple species within an ecosystem impacts its ecology. Using two species that co-occur and demonstrate adaptation to their natal environments, black cottonwood (Populus trichocarpa) and three-spined stickleback (Gasterosteus aculeatus), we investigated the effects of intraspecific phenotypic variation on both top-down and bottom-up forces using a large-scale aquatic mesocosm experiment. Black cottonwood genotypes exhibit genetic variation in their productivity and consequently their leaf litter subsidies to the aquatic system, which mediates the strength of top-down effects from stickleback on prey abundances. Abundances of four common invertebrate prey species and available phosphorous, the most critically limiting nutrient in freshwater systems, are dictated by the interaction between genetic variation in cottonwood productivity and stickleback morphology. These interactive effects fit with ecological theory on the relationship between productivity and top-down control and are comparable in strength to the effects of predator addition. Our results illustrate that intraspecific variation, which can evolve rapidly, is an under-appreciated driver of community structure and ecosystem function, demonstrating that a multi-trophic perspective is essential to understanding the role of evolution in structuring ecological patterns. PMID:26203004

  6. Sponge communities on Caribbean coral reefs are structured by factors that are top-down, not bottom-up.

    Directory of Open Access Journals (Sweden)

    Joseph R Pawlik

    Full Text Available Caribbean coral reefs have been transformed in the past few decades with the demise of reef-building corals, and sponges are now the dominant habitat-forming organisms on most reefs. Competing hypotheses propose that sponge communities are controlled primarily by predatory fishes (top-down or by the availability of picoplankton to suspension-feeding sponges (bottom-up. We tested these hypotheses on Conch Reef, off Key Largo, Florida, by placing sponges inside and outside predator-excluding cages at sites with less and more planktonic food availability (15 m vs. 30 m depth. There was no evidence of a bottom-up effect on the growth of any of 5 sponge species, and 2 of 5 species grew more when caged at the shallow site with lower food abundance. There was, however, a strong effect of predation by fishes on sponge species that lacked chemical defenses. Sponges with chemical defenses grew slower than undefended species, demonstrating a resource trade-off between growth and the production of secondary metabolites. Surveys of the benthic community on Conch Reef similarly did not support a bottom-up effect, with higher sponge cover at the shallower depth. We conclude that the structure of sponge communities on Caribbean coral reefs is primarily top-down, and predict that removal of sponge predators by overfishing will shift communities toward faster-growing, undefended species that better compete for space with threatened reef-building corals.

  7. Metatranscriptomic Evidence for Co-Occurring Top-Down and Bottom-Up Controls on Toxic Cyanobacterial Communities

    Science.gov (United States)

    Steffen, Morgan M.; Belisle, B. Shafer; Watson, Sue B.; Boyer, Gregory L.; Bourbonniere, Richard A.

    2015-01-01

    Little is known about the molecular and physiological function of co-occurring microbes within freshwater cyanobacterial harmful algal blooms (cHABs). To address this, community metatranscriptomes collected from the western basin of Lake Erie during August 2012 were examined. Using sequence data, we tested the hypothesis that the activity of the microbial community members is independent of community structure. Predicted metabolic and physiological functional profiles from spatially distinct metatranscriptomes were determined to be ≥90% similar between sites. Targeted analysis of Microcystis aeruginosa, the historical causative agent of cyanobacterial harmful algal blooms over the past ∼20 years, as well as analysis of Planktothrix agardhii and Anabaena cylindrica, revealed ongoing transcription of genes involved in microcystin toxin synthesis as well as the acquisition of both nitrogen and phosphorus, nutrients often implicated as independent bottom-up drivers of eutrophication in aquatic systems. Transcription of genes involved in carbon dioxide (CO2) concentration and metabolism also provided support for the alternate hypothesis that high-pH conditions and dense algal biomass result in CO2-limiting conditions that further favor cyanobacterial dominance. Additionally, the presence of Microcystis-specific cyanophage sequences provided preliminary evidence of possible top-down virus-mediated control of cHAB populations. Overall, these data provide insight into the complex series of constraints associated with Microcystis blooms that dominate the western basin of Lake Erie during summer months, demonstrating that multiple environmental factors work to shape the microbial community. PMID:25662977

  8. Bottom-up control of geomagnetic secular variation by the Earth's inner core.

    Science.gov (United States)

    Aubert, Julien; Finlay, Christopher C; Fournier, Alexandre

    2013-10-10

    Temporal changes in the Earth's magnetic field, known as geomagnetic secular variation, occur most prominently at low latitudes in the Atlantic hemisphere (that is, from -90 degrees east to 90 degrees east), whereas in the Pacific hemisphere there is comparatively little activity. This is a consequence of the geographical localization of intense, westward drifting, equatorial magnetic flux patches at the core surface. Despite successes in explaining the morphology of the geomagnetic field, numerical models of the geodynamo have so far failed to account systematically for this striking pattern of geomagnetic secular variation. Here we show that it can be reproduced provided that two mechanisms relying on the inner core are jointly considered. First, gravitational coupling aligns the inner core with the mantle, forcing the flow of liquid metal in the outer core into a giant, westward drifting, sheet-like gyre. The resulting shear concentrates azimuthal magnetic flux at low latitudes close to the core-mantle boundary, where it is expelled by core convection and subsequently transported westward. Second, differential inner-core growth, fastest below Indonesia, causes an asymmetric buoyancy release in the outer core which in turn distorts the gyre, forcing it to become eccentric, in agreement with recent core flow inversions. This bottom-up heterogeneous driving of core convection dominates top-down driving from mantle thermal heterogeneities, and localizes magnetic variations in a longitudinal sector centred beneath the Atlantic, where the eccentric gyre reaches the core surface. To match the observed pattern of geomagnetic secular variation, the solid material forming the inner core must now be in a state of differential growth rather than one of growth and melting induced by convective translation.

  9. Bottom-up factors influencing riparian willow recovery in Yellowstone National Park

    Science.gov (United States)

    Tercek, M.T.; Stottlemyer, R.; Renkin, R.

    2010-01-01

    After the elimination of wolves (Canis lupus L.) in the 1920s, woody riparian plant communities on the northern range of Yellowstone National Park (YNP) declined an estimated 50%. After the reintroduction of wolves in 1995-1996, riparian willows (Salix spp.) on YNP's northern range showed significant growth for the first time since the 1920s. However, the pace of willow recovery has not been uniform: some communities have exceeded 400 cm, while others are still at pre-1995 levels. We compared "tall" (250 cm max. height) willow sites where willows had escaped elk (Cervus elaphus L.) browsing with "short" willow sites that could still be browsed. Unlike studies that manipulated willow height with fences and artificial dams, we examined sites that had natural growth differences in height since the reintroduction of wolves. Tall willow sites had greater water availability, more-rapid net soil nitrogen mineralization, greater snow depth, lower soil respiration rates, and cooler summer soil temperatures than nearby short willow sites. Most of these differences were measured both in herbaceous areas adjacent to the willow patches and in the willow patches themselves, suggesting that they were not effects of varying willow height recovery but were instead preexisting site differences that may have contributed to increased plant productivity. Our results agree with earlier studies in experimental plots which suggest that the varying pace of willow recovery has been influenced by abiotic limiting factors that interact with top-down reductions in willow browsing by elk. © 2010 Western North American Naturalist.

  10. Optimal Environmental Conditions and Anomalous Ecosystem Responses: Constraining Bottom-up Controls of Phytoplankton Biomass in the California Current System

    Science.gov (United States)

    Jacox, Michael G.; Hazen, Elliott L.; Bograd, Steven J.

    2016-06-01

    In Eastern Boundary Current systems, wind-driven upwelling drives nutrient-rich water to the ocean surface, making these regions among the most productive on Earth. Regulation of productivity by changing wind and/or nutrient conditions can dramatically impact ecosystem functioning, though the mechanisms are not well understood beyond broad-scale relationships. Here, we explore bottom-up controls during the California Current System (CCS) upwelling season by quantifying the dependence of phytoplankton biomass (as indicated by satellite chlorophyll estimates) on two key environmental parameters: subsurface nitrate concentration and surface wind stress. In general, moderate winds and high nitrate concentrations yield maximal biomass near shore, while offshore biomass is positively correlated with subsurface nitrate concentration. However, due to nonlinear interactions between the influences of wind and nitrate, bottom-up control of phytoplankton cannot be described by either one alone, nor by a combined metric such as nitrate flux. We quantify optimal environmental conditions for phytoplankton, defined as the wind/nitrate space that maximizes chlorophyll concentration, and present a framework for evaluating ecosystem change relative to environmental drivers. The utility of this framework is demonstrated by (i) elucidating anomalous CCS responses in 1998-1999, 2002, and 2005, and (ii) providing a basis for assessing potential biological impacts of projected climate change.
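
    The "optimal environmental conditions" framework described here amounts to locating the region of wind-nitrate space where chlorophyll is maximized. A minimal sketch of that idea, binning chlorophyll by wind stress and subsurface nitrate and finding the bin with the highest mean (variable names and data are placeholders, not the authors' code or data):

    ```python
    # Hypothetical sketch: find the wind-stress / nitrate bin that maximizes
    # mean chlorophyll, as a stand-in for the "optimal conditions" framework.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 5000
    wind = rng.uniform(0.0, 0.3, n)        # surface wind stress (N m^-2), placeholder
    nitrate = rng.uniform(0.0, 30.0, n)    # subsurface nitrate (mmol m^-3), placeholder
    # Placeholder response: chlorophyll peaks at moderate wind and high nitrate
    chl = nitrate * np.exp(-((wind - 0.12) / 0.07) ** 2) + rng.normal(0, 1, n)

    wind_edges = np.linspace(0.0, 0.3, 11)
    nit_edges = np.linspace(0.0, 30.0, 11)
    sums, _, _ = np.histogram2d(wind, nitrate, bins=[wind_edges, nit_edges], weights=chl)
    counts, _, _ = np.histogram2d(wind, nitrate, bins=[wind_edges, nit_edges])
    mean_chl = np.where(counts > 0, sums / np.maximum(counts, 1), np.nan)

    i, j = np.unravel_index(np.nanargmax(mean_chl), mean_chl.shape)
    print("optimal wind stress bin:", wind_edges[i], "-", wind_edges[i + 1])
    print("optimal nitrate bin:", nit_edges[j], "-", nit_edges[j + 1])
    ```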

  11. Optimal Environmental Conditions and Anomalous Ecosystem Responses: Constraining Bottom-up Controls of Phytoplankton Biomass in the California Current System.

    Science.gov (United States)

    Jacox, Michael G; Hazen, Elliott L; Bograd, Steven J

    2016-06-09

    In Eastern Boundary Current systems, wind-driven upwelling drives nutrient-rich water to the ocean surface, making these regions among the most productive on Earth. Regulation of productivity by changing wind and/or nutrient conditions can dramatically impact ecosystem functioning, though the mechanisms are not well understood beyond broad-scale relationships. Here, we explore bottom-up controls during the California Current System (CCS) upwelling season by quantifying the dependence of phytoplankton biomass (as indicated by satellite chlorophyll estimates) on two key environmental parameters: subsurface nitrate concentration and surface wind stress. In general, moderate winds and high nitrate concentrations yield maximal biomass near shore, while offshore biomass is positively correlated with subsurface nitrate concentration. However, due to nonlinear interactions between the influences of wind and nitrate, bottom-up control of phytoplankton cannot be described by either one alone, nor by a combined metric such as nitrate flux. We quantify optimal environmental conditions for phytoplankton, defined as the wind/nitrate space that maximizes chlorophyll concentration, and present a framework for evaluating ecosystem change relative to environmental drivers. The utility of this framework is demonstrated by (i) elucidating anomalous CCS responses in 1998-1999, 2002, and 2005, and (ii) providing a basis for assessing potential biological impacts of projected climate change.

  12. Warming shifts top-down and bottom-up control of pond food web structure and function.

    Science.gov (United States)

    Shurin, Jonathan B; Clasen, Jessica L; Greig, Hamish S; Kratina, Pavel; Thompson, Patrick L

    2012-11-05

    The effects of global and local environmental changes are transmitted through networks of interacting organisms to shape the structure of communities and the dynamics of ecosystems. We tested the impact of elevated temperature on the top-down and bottom-up forces structuring experimental freshwater pond food webs in western Canada over 16 months. Experimental warming was crossed with treatments manipulating the presence of planktivorous fish and eutrophication through enhanced nutrient supply. We found that higher temperatures produced top-heavy food webs with lower biomass of benthic and pelagic producers, equivalent biomass of zooplankton, zoobenthos and pelagic bacteria, and more pelagic viruses. Eutrophication increased the biomass of all organisms studied, while fish had cascading positive effects on periphyton, phytoplankton and bacteria, and reduced biomass of invertebrates. Surprisingly, virus biomass was reduced in the presence of fish, suggesting the possibility for complex mechanisms of top-down control of the lytic cycle. Warming reduced the effects of eutrophication on periphyton, and magnified the already strong effects of fish on phytoplankton and bacteria. Warming, fish and nutrients all increased whole-system rates of net production despite their distinct impacts on the distribution of biomass between producers and consumers, plankton and benthos, and microbes and macrobes. Our results indicate that warming exerts a host of indirect effects on aquatic food webs mediated through shifts in the magnitudes of top-down and bottom-up forcing.

  13. Temporal shifts in top-down vs. bottom-up control of epiphytic algae in a seagrass ecosystem

    Science.gov (United States)

    Whalen, Matthew A.; Duffy, J. Emmett; Grace, James B.

    2013-01-01

    In coastal marine food webs, small invertebrate herbivores (mesograzers) have long been hypothesized to occupy an important position facilitating dominance of habitat-forming macrophytes by grazing competitively superior epiphytic algae. Because of the difficulty of manipulating mesograzers in the field, however, their impacts on community organization have rarely been rigorously documented. Understanding mesograzer impacts has taken on increased urgency in seagrass systems due to declines in seagrasses globally, caused in part by widespread eutrophication favoring seagrass overgrowth by faster-growing algae. Using cage-free field experiments in two seasons (fall and summer), we present experimental confirmation that mesograzer reduction and nutrients can promote blooms of epiphytic algae growing on eelgrass (Zostera marina). In this study, nutrient additions increased epiphytes only in the fall following natural decline of mesograzers. In the summer, experimental mesograzer reduction stimulated a 447% increase in epiphytes, appearing to exacerbate seasonal dieback of eelgrass. Using structural equation modeling, we illuminate the temporal dynamics of complex interactions between macrophytes, mesograzers, and epiphytes in the summer experiment. An unexpected result emerged from investigating the interaction network: drift macroalgae indirectly reduced epiphytes by providing structure for mesograzers, suggesting that the net effect of macroalgae on seagrass depends on macroalgal density. Our results show that mesograzers can control proliferation of epiphytic algae, that top-down and bottom-up forcing are temporally variable, and that the presence of macroalgae can strengthen top-down control of epiphytic algae, potentially contributing to eelgrass persistence.
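
    The structural-equation result described here (macroalgae indirectly reducing epiphytes by providing structure for mesograzers) boils down to an indirect effect computed as the product of two path coefficients. A toy numeric sketch of that logic using simple regressions on synthetic data (not the study's SEM or data):

    ```python
    # Hypothetical sketch of an SEM-style indirect effect:
    # macroalgae -> mesograzers -> epiphytes, indirect effect = a * b.
    import numpy as np

    rng = np.random.default_rng(3)
    n = 200
    macroalgae = rng.normal(size=n)                                     # synthetic predictor
    mesograzers = 0.6 * macroalgae + rng.normal(scale=0.8, size=n)      # path a (synthetic)
    epiphytes = -0.5 * mesograzers + rng.normal(scale=0.8, size=n)      # path b (synthetic)

    a = np.polyfit(macroalgae, mesograzers, 1)[0]   # slope: macroalgae -> mesograzers
    b = np.polyfit(mesograzers, epiphytes, 1)[0]    # slope: mesograzers -> epiphytes
    print(f"indirect effect of macroalgae on epiphytes ~ a*b = {a * b:.2f}")
    ```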

  14. Biocompatible PEGylated MoS2 nanosheets: controllable bottom-up synthesis and highly efficient photothermal regression of tumor.

    Science.gov (United States)

    Wang, Shige; Li, Kai; Chen, Yu; Chen, Hangrong; Ma, Ming; Feng, Jingwei; Zhao, Qinghua; Shi, Jianlin

    2015-01-01

    Two-dimensional transition metal dichalcogenides, particularly MoS2 nanosheets, have been deemed a novel category of NIR photothermal transducing agents. Herein, an efficient and versatile one-pot solvothermal synthesis based on a "bottom-up" strategy is proposed, for the first time, for the controlled synthesis of PEGylated MoS2 nanosheets using a novel "integrated" precursor containing both Mo and S elements. This facile but unique PEG-mediated solvothermal procedure endowed the MoS2 nanosheets with controlled size, increased crystallinity and excellent colloidal stability. The photothermal performance of the nanosheets was optimized by modulating the particulate size and surface PEGylation. PEGylated MoS2 nanosheets with the desired photothermal conversion performance and excellent colloidal and photothermal stability were further utilized for highly efficient photothermal therapy of cancer in a tumor-bearing mouse xenograft. Without showing observable in vitro and in vivo hemolysis, coagulation or toxicity, the optimized MoS2-PEG nanosheets showed promising in vitro and in vivo anti-cancer efficacy.

  15. Temporal shifts in top-down vs. bottom-up control of epiphytic algae in a seagrass ecosystem.

    Science.gov (United States)

    Whalen, Matthew A; Duffy, J Emmett; Grace, James B

    2013-02-01

    In coastal marine food webs, small invertebrate herbivores (mesograzers) have long been hypothesized to occupy an important position facilitating dominance of habitat-forming macrophytes by grazing competitively superior epiphytic algae. Because of the difficulty of manipulating mesograzers in the field, however, their impacts on community organization have rarely been rigorously documented. Understanding mesograzer impacts has taken on increased urgency in seagrass systems due to declines in seagrasses globally, caused in part by widespread eutrophication favoring seagrass overgrowth by faster-growing algae. Using cage-free field experiments in two seasons (fall and summer), we present experimental confirmation that mesograzer reduction and nutrients can promote blooms of epiphytic algae growing on eelgrass (Zostera marina). In this study, nutrient additions increased epiphytes only in the fall following natural decline of mesograzers. In the summer, experimental mesograzer reduction stimulated a 447% increase in epiphytes, appearing to exacerbate seasonal dieback of eelgrass. Using structural equation modeling, we illuminate the temporal dynamics of complex interactions between macrophytes, mesograzers, and epiphytes in the summer experiment. An unexpected result emerged from investigating the interaction network: drift macroalgae indirectly reduced epiphytes by providing structure for mesograzers, suggesting that the net effect of macroalgae on seagrass depends on macroalgal density. Our results show that mesograzers can control proliferation of epiphytic algae, that top-down and bottom-up forcing are temporally variable, and that the presence of macroalgae can strengthen top-down control of epiphytic algae, potentially contributing to eelgrass persistence.

  16. The benefits of China's efforts on gaseous pollutant control indicated by the bottom-up emissions and satellite observation

    Science.gov (United States)

    Xia, Y.; Zhao, Y.

    2015-12-01

    To evaluate the effectiveness of national policies of air pollution control, the emissions of SO2, NOX, CO and CO2 in China are estimated with a bottom-up method from 2000 to 2014, and vertical column densities (VCDs) from satellite observation are used to evaluate the inter-annual trends and spatial distribution of emissions and the temporal and spatial patterns of ambient levels of gaseous pollutants across the country. In particular, an additional emission case named the STD case, which incorporates the most recently issued emission standards for specific industrial sources, is developed for 2012-2014. The inter-annual trends in emissions and VCDs match well except for SO2, and the revised emissions in the STD case improve the comparison, implying the benefits of emission control in the most recent years. Satellite retrieval error, underestimation of emission reduction, and enhanced atmospheric oxidization caused the differences between the emission and VCD trends of SO2. Coal-fired power plants play key roles in SO2 and NOX emission reduction. As suggested by the VCDs and the emission inventory, the control of CO in the 11th Five-Year Plan (FYP) period was more effective than that in the 12th FYP period, while the opposite was true for SO2. As a new control target added in the 12th FYP, NOX emissions clearly decreased by 4.3 Mt from 2011 to 2014, in contrast to the fast growth before 2011. The inter-annual trend in NO2 VCDs has the poorest correlation with vehicle ownership (R=0.796), due to the staged emission standards for vehicles. In developed regions, transportation has become the main source of pollutant emissions, as shown by comparing the VCDs of NO2 with those of SO2. Moreover, air quality in megacities has been evaluated based on satellite observations and emissions; the results indicate that Beijing suffered heavily from emissions from Hebei and Tianjin, while local emissions tend to dominate in Shanghai.
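
    A bottom-up inventory of this kind is essentially activity data multiplied by emission factors and corrected for control-device removal, with the resulting annual series then compared against normalized satellite VCDs. A hedged sketch of that bookkeeping (all activities, factors, and numbers below are placeholders, not the study's inventory):

    ```python
    # Hypothetical bottom-up emission estimate and comparison with satellite
    # column densities.  Emission = activity * emission_factor * (1 - removal).
    import numpy as np

    years = np.arange(2010, 2015)
    coal_burned = np.array([1.60, 1.70, 1.75, 1.80, 1.78])   # placeholder activity (Gt coal)
    ef_so2 = 8.0                                             # placeholder factor (kg SO2 / t coal)
    fgd_removal = np.array([0.60, 0.65, 0.72, 0.78, 0.82])   # placeholder FGD penetration * efficiency

    so2_emissions = coal_burned * 1e9 * ef_so2 * (1.0 - fgd_removal) / 1e9   # Mt SO2

    vcd = np.array([0.52, 0.50, 0.44, 0.38, 0.33])           # placeholder SO2 VCD series (DU)

    def normalize(x):
        return (x - x.mean()) / x.std()

    r = np.corrcoef(normalize(so2_emissions), normalize(vcd))[0, 1]
    print("emissions (Mt):", np.round(so2_emissions, 2))
    print("correlation between emission and VCD trends: %.2f" % r)
    ```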

  17. Evidence of bottom-up control of marine productivity in the Mediterranean Sea during the last 50 years

    Science.gov (United States)

    Macías, Diego; Garcia-Gorriz, Elisa; Piroddi, Chiara; Stips, Adolf

    2014-05-01

    The temporal dynamics of biogeochemical variables derived from a coupled 3D hydrodynamic-biogeochemical model of the entire Mediterranean Sea are evaluated over the last 50 years (1960-2010). Realistic atmospheric forcing and river discharge are used to force the dynamics of the coupled model system. The time evolution of primary and secondary production in the entire basin is assessed against available independent data on fisheries yields and catches per unit effort for the same time period. Concordant patterns are found in the time series of all biological variables (from the model and from fisheries statistics), with low values at the beginning of the series, a later increase with maximum values reached at the end of the 1990s, and a subsequent stabilization or small decline. Spectral analysis of the annual biological time series reveals coincident low-frequency signals in all of them; the first, more energetic signal peaks around 2000, while the second (less energetic) presents maximum values around 1982. Almost identical low-frequency signals are found in the nutrient loads of the main rivers of the basin and in the integrated (0-100 m) mean nutrient concentrations in the marine ecosystem. Nitrate concentration shows an increasing trend up to 1998 with a later stabilization or a slight decline to present-day values. This nitrate evolution seems to be driving the first low-frequency signal found in the biological time series. Phosphate, on the other hand, shows maximum concentrations around 1982 and a subsequent sharp decline. This nutrient seems to be responsible for the second low-frequency signal observed in the biological time series. Our analysis shows that the control of marine productivity (from plankton to fish) in the Mediterranean basin seems to be principally mediated through bottom-up processes that can be traced back to the characteristics of riverine discharges. Other types of control could not be excluded from our analysis (e

  18. Bottom-Up Earley Deduction

    CERN Document Server

    Erbach, G

    1995-01-01

    We propose a bottom-up variant of Earley deduction. Bottom-up deduction is preferable to top-down deduction because it allows incremental processing (even for head-driven grammars), it is data-driven, no subsumption check is needed, and preference values attached to lexical items can be used to guide best-first search. We discuss the scanning step for bottom-up Earley deduction and indexing schemes that help avoid useless deduction steps.
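
    Bottom-up deduction is data-driven: known facts are combined with rules until no new facts can be derived, rather than expanding goals top-down. A toy illustration of that general flavour, naive forward chaining to a fixed point over ground facts (this shows only the data-driven idea, not the Earley-deduction algorithm or the indexing schemes described in the paper):

    ```python
    # Hypothetical toy example of bottom-up (data-driven) deduction:
    # derive the transitive closure of "edge" facts by forward chaining
    # until no new "path" facts can be added (a fixed point).
    def transitive_closure(edges):
        facts = {("path", a, b) for (a, b) in edges}   # base rule: path(X,Y) :- edge(X,Y)
        changed = True
        while changed:
            changed = False
            # recursive rule: path(X,Z) :- path(X,Y), path(Y,Z)
            new = {("path", a, d)
                   for (_, a, b) in facts
                   for (_, c, d) in facts
                   if b == c and ("path", a, d) not in facts}
            if new:
                facts |= new
                changed = True
        return facts

    edges = [("a", "b"), ("b", "c"), ("c", "d")]
    print(sorted(transitive_closure(edges)))
    ```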

  19. Toward improved prediction of the bedrock depth underneath hillslopes: Bayesian inference of the bottom-up control hypothesis using high-resolution topographic data

    Science.gov (United States)

    Gomes, Guilherme J. C.; Vrugt, Jasper A.; Vargas, Eurípedes A.

    2016-04-01

    The depth to bedrock controls a myriad of processes by influencing subsurface flow paths, erosion rates, soil moisture, and water uptake by plant roots. As hillslope interiors are very difficult and costly to illuminate and access, the topography of the bedrock surface is largely unknown. This essay is concerned with the prediction of spatial patterns in the depth to bedrock (DTB) using high-resolution topographic data, numerical modeling, and Bayesian analysis. Our DTB model builds on the bottom-up control on fresh-bedrock topography hypothesis of Rempe and Dietrich (2014) and includes a mass movement and bedrock-valley morphology term to extend the usefulness and general applicability of the model. We reconcile the DTB model with field observations using Bayesian analysis with the DREAM algorithm. We investigate explicitly the benefits of using spatially distributed parameter values to account implicitly, and in a relatively simple way, for rock mass heterogeneities that are very difficult, if not impossible, to characterize adequately in the field. We illustrate our method using an artificial data set of bedrock depth observations and then evaluate our DTB model with real-world data collected at the Papagaio river basin in Rio de Janeiro, Brazil. Our results demonstrate that the DTB model predicts accurately the observed bedrock depth data. The posterior mean DTB simulation is shown to be in good agreement with the measured data. The posterior prediction uncertainty of the DTB model can be propagated forward through hydromechanical models to derive probabilistic estimates of factors of safety.
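
    The Bayesian reconciliation step amounts to sampling the posterior of the DTB-model parameters given observed bedrock depths. A minimal sketch of that idea, using a toy one-parameter depth model and a plain Metropolis sampler standing in for the DREAM algorithm (model form, parameter, and data are placeholders, not the paper's model):

    ```python
    # Hypothetical sketch of Bayesian calibration of a depth-to-bedrock (DTB)
    # model: a toy model dtb = k * hillslope_length, a Gaussian likelihood,
    # and a simple Metropolis sampler standing in for DREAM.
    import numpy as np

    rng = np.random.default_rng(1)
    hillslope_length = np.linspace(10.0, 200.0, 25)                      # placeholder predictor (m)
    observed_dtb = 0.05 * hillslope_length + rng.normal(0, 0.5, 25)      # synthetic observations (m)
    sigma = 0.5                                                          # assumed observation error (m)

    def log_posterior(k):
        if not 0.0 < k < 1.0:                     # uniform prior on k
            return -np.inf
        residuals = observed_dtb - k * hillslope_length
        return -0.5 * np.sum((residuals / sigma) ** 2)

    samples, k = [], 0.5
    for _ in range(20000):
        k_new = k + rng.normal(0, 0.01)           # random-walk proposal
        if np.log(rng.uniform()) < log_posterior(k_new) - log_posterior(k):
            k = k_new
        samples.append(k)

    post = np.array(samples[5000:])               # discard burn-in
    print(f"posterior mean k = {post.mean():.3f}, 95% CI = "
          f"[{np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f}]")
    ```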

  20. Bottom-up versus top-down control of tree regeneration in the Bialowieza Primeval Forest, Poland

    NARCIS (Netherlands)

    Kuijper, D.P.J.; Cromsigt, J.P.G.M.; Jedrzejewska, B.A.; Miscicki, S.C.; Jedrzejewski, W.A.; Kweczlich, I.C.

    2010-01-01

    We tested the interactions between biotic and abiotic factors in structuring temperate forest communities by comparing tree recruitment after 7 years inside 30 pairs of exclosure (excluding ungulates: red deer, roe deer, bison, moose, wild boar) and control plots (7 × 7 m each) in one of the most na

  1. Age-Related Inter-region EEG Coupling Changes during the Control of Bottom-up and Top-down Attention

    Directory of Open Access Journals (Sweden)

    Ling Li

    2015-12-01

    Full Text Available We investigated age-related changes in electroencephalographic (EEG) coupling of theta-, alpha-, and beta-frequency bands during bottom-up and top-down attention. Arrays were presented with either automatic pop-out (bottom-up) or effortful search (top-down) behavior to younger and older participants. The phase-locking value (PLV) was used to estimate coupling strength between scalp recordings. Behavioral performance decreased with age, with a greater age-related decline in accuracy for the search than for the pop-out condition. Aging was associated with declined coupling strength in the theta and alpha frequency bands, with a greater age-related decline in whole-brain coupling values for the search than for the pop-out condition. Specifically, prefronto-frontal coupling in the theta- and alpha-bands and fronto-parietal and parieto-occipital couplings in the beta-band showed a right-hemispheric dominance in the younger group, which was reduced with aging to compensate for inhibitory dysfunction. Pop-out target detection, in contrast, was mainly associated with greater parieto-occipital beta-coupling strength than the search condition, regardless of age. Furthermore, prefronto-frontal coupling in the theta-, alpha- and beta-bands and parieto-occipital coupling in the beta-band functioned as predictors of behavior for both groups. Taken together, these findings provide evidence that prefronto-frontal coupling of theta-, alpha-, and beta-bands may serve as a possible basis of aging effects during visual attention, while parieto-occipital coupling in the beta-band could serve a bottom-up function and be vulnerable to top-down attention control in younger and older groups.
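
    The phase-locking value (PLV) used here measures the consistency of the instantaneous phase difference between two channels. A minimal sketch of computing PLV from band-limited signals via the Hilbert transform (synthetic signals, not the study's EEG pipeline; real EEG would be band-pass filtered first):

    ```python
    # Hypothetical sketch of the phase-locking value (PLV) between two signals:
    # PLV = | mean over time of exp(i * (phase1 - phase2)) |.
    import numpy as np
    from scipy.signal import hilbert

    fs = 250
    t = np.arange(0, 2, 1 / fs)                      # 2 s of data at 250 Hz
    rng = np.random.default_rng(2)
    theta = 2 * np.pi * 6 * t                        # a 6 Hz (theta-band) oscillation
    sig1 = np.cos(theta) + 0.3 * rng.normal(size=t.size)
    sig2 = np.cos(theta + 0.8) + 0.3 * rng.normal(size=t.size)   # phase-shifted copy

    phase1 = np.angle(hilbert(sig1))
    phase2 = np.angle(hilbert(sig2))
    plv = np.abs(np.mean(np.exp(1j * (phase1 - phase2))))
    print(f"PLV = {plv:.2f}")   # near 1 for a consistent phase lag, near 0 for none
    ```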

  2. Culture from the Bottom Up

    Science.gov (United States)

    Atkinson, Dwight; Sohn, Jija

    2013-01-01

    The culture concept has been severely criticized for its top-down nature in TESOL, leading arguably to its falling out of favor in the field. But what of the fact that people do "live culturally" (Ingold, 1994)? This article describes a case study of culture from the bottom up--culture as understood and enacted by its individual users.…

  3. Building from the Bottom Up

    Science.gov (United States)

    2003-05-01

    by Shuguang Zhang. Through billions of years of prebiotic and molecular selection and evolution, there are bio-organic... The author acknowledges support from the Du Pont-MIT Alliance and the Whitaker Foundation, and gratefully acknowledges the Intel Corporation Academic Program for its generous donation.

  4. "Bottom-up" transparent electrodes.

    Science.gov (United States)

    Morag, Ahiud; Jelinek, Raz

    2016-11-15

    Transparent electrodes (TEs) have attracted significant scientific, technological, and commercial interest in recent years due to the broad and growing use of such devices in electro-optics, consumer products (touch-screens, for example), solar cells, and others. Currently, almost all commercial TEs are fabricated through "top-down" approaches (primarily lithography-based techniques), with indium tin oxide (ITO) as the most common material employed. Several problems are encountered in this field, however, including the cost and complexity of TE production using top-down technologies, the limited structural flexibility, the high cost of indium, and the brittle nature and low transparency of ITO in the far-IR spectral region. Alternative routes based upon bottom-up processes have recently emerged as viable alternatives for production of TEs. Bottom-up technologies are based upon self-assembly of building blocks - atoms, molecules, or nanoparticles - generating thin patterned films that exhibit both electrical conductivity and optical transparency. In this Feature Article we discuss the recent progress in this active and exciting field, including bottom-up TE systems produced from carbon materials (carbon nanotubes, graphene, graphene oxide), silver, gold, and other metals. The current hurdles to broader use of bottom-up strategies, along with their significant potential, are analyzed.

  5. Latitudinal variation in top-down and bottom-up control of a salt marsh food web.

    Science.gov (United States)

    Marczak, L B; Ho, C K; Wieski, K; Vu, H; Denno, R F; Pennings, S C

    2011-02-01

    The shrub Iva frutescens, which occupies the terrestrial border of U.S. Atlantic Coast salt marshes, supports a food web that varies strongly across latitude. We tested whether latitudinal variation in plant quality (higher at high latitudes), consumption by omnivores (a crab, present only at low latitudes), consumption by mesopredators (ladybugs, present at all latitudes), or the life history stage of an herbivorous beetle could explain continental-scale field patterns of herbivore density. In a mesocosm experiment, crabs exerted strong top-down control on herbivorous beetles, ladybugs exerted strong top-down control on aphids, and both predators benefited plants through trophic cascades. Latitude of plant origin had no effect on consumers. Herbivorous beetle density was greater if mesocosms were stocked with beetle adults rather than larvae, and aphid densities were reduced in the "adult beetle" treatment. Treatment combinations representing high and low latitudes produced patterns of herbivore density similar to those in the field. We conclude that latitudinal variation in plant quality is less important than latitudinal variation in top consumers and competition in mediating food web structure. Climate may also play a strong role in structuring high-latitude salt marshes by limiting the number of herbivore generations per growing season and causing high overwintering mortality.

  6. Fabrication of electrodes for transport control and alignment at micro- and nanoscales using bottom-up and top-down techniques

    Directory of Open Access Journals (Sweden)

    Darwin Rodríguez

    2014-12-01

    Full Text Available The continuing advance of applications in self-assembly, positioning, sensors, and actuators, and of devices that allow controlled manipulation of micro- and nanostructures, has generated broad interest in developing methodologies to optimize the fabrication of devices for control and manipulation at micro- and nanoscales. This project explores electrode fabrication techniques in order to find an optimal and reproducible technique. The performance of each technique is compared, and cleaning and safety protocols are described. Three geometries are designed and implemented to mobilize and position iron micro- and nanoparticles in a natural-oil solution. Finally, electric fields are generated by electrophoresis in order to find the curve describing particle displacement as a function of the applied potential. These results have a strong impact on current bottom-up fabrication efforts (using fields to control placement and mobility in electronic devices). Fabricating planar geometries with electrodes opens the possibility of integrating particle movement into the integrated circuits manufactured today.

  7. Role of zinc interstitials and oxygen vacancies of ZnO in photocatalysis: a bottom-up approach to control defect density.

    Science.gov (United States)

    Kayaci, Fatma; Vempati, Sesha; Donmez, Inci; Biyikli, Necmi; Uyar, Tamer

    2014-09-07

    Oxygen vacancies (V(O)s) in ZnO are well known to enhance photocatalytic activity (PCA) despite various other intrinsic crystal defects. In this study, we aim to elucidate the effect of zinc interstitials (Zn(i)) and V(O)s on PCA, which is of applied as well as fundamental interest. To achieve this, the major hurdle of fabricating ZnO with controlled defect density must be overcome; it is acknowledged that controlling defect levels in ZnO is significantly difficult. In the present context, we fabricated nanostructures and thoroughly characterized their morphological (SEM, TEM), structural (XRD, TEM), chemical (XPS) and optical (photoluminescence, PL) properties. To fabricate the nanostructures, we adopted atomic layer deposition (ALD), which is a powerful bottom-up approach. To control defects, however, we chose polysulfone electrospun nanofibers as a substrate, on which non-uniform adsorption of ALD precursors is inevitable because of differences in the hydrophilicity of the functional groups. For the first 100 cycles, Zn(i)s were predominant in the ZnO quantum dots (QDs), while the presence of V(O)s was negligible. As the number of ALD cycles increased, V(O)s were introduced, whereas the density of Zn(i)s remained unchanged. We employed PL spectra to identify and quantify the density of each defect in all samples. PCA was performed on all the samples, and the percent change in the decay constant for each sample was juxtaposed with the relative densities of Zn(i)s and V(O)s. A logical comparison of the relative defect densities of Zn(i)s and V(O)s suggested that the former are less efficient than the latter because of differences in the intrinsic nature and the physical accessibility of the defects. Other reasons for the efficiency differences were elaborated.

  8. Benefits of China's efforts in gaseous pollutant control indicated by the bottom-up emissions and satellite observations 2000-2014

    Science.gov (United States)

    Xia, Yinmin; Zhao, Yu; Nielsen, Chris P.

    2016-07-01

    To evaluate the effectiveness of national air pollution control policies, the emissions of SO2, NOX, CO and CO2 in China are estimated using bottom-up methods for the most recent 15-year period (2000-2014). Vertical column densities (VCDs) from satellite observations are used to test the temporal and spatial patterns of emissions and to explore the ambient levels of gaseous pollutants across the country. The inter-annual trends in emissions and VCDs match well except for SO2. The comparison improves under the optimistic assumption that the emission standards for given industrial sources issued after 2010 have been fully enforced. Underestimation of emission abatement and enhanced atmospheric oxidization likely contribute to the discrepancy between SO2 emissions and VCDs. As suggested by the VCDs and by emissions estimated under the assumption of full implementation of emission standards, the control of SO2 in the 12th Five-Year Plan period (12th FYP, 2011-2015) is estimated to have been more effective than that in the 11th FYP period (2006-2010), attributed to improved use of flue gas desulfurization in the power sector and implementation of new emission standards for key industrial sources. The opposite was true for CO, as energy efficiency improved more significantly from 2005 to 2010 due to closures of small industrial plants. Iron and steel production is estimated to have had a particularly strong influence on the temporal and spatial patterns of CO. In contrast to the fast growth before 2011 driven by increased coal consumption and limited controls, NOX emissions decreased from 2011 to 2014 due to the penetration of selective catalytic/non-catalytic reduction systems in the power sector. This led to reduced NO2 VCDs, particularly in relatively highly polluted areas such as eastern China and the Pearl River Delta region. In developed areas, transportation is playing an increasingly important role in air pollution, as suggested by the increased ratio of NO2 to SO

  9. Bottom-up Initiatives for Photovoltaic: Incentives and Barriers

    Directory of Open Access Journals (Sweden)

    Kathrin Reinsberger

    2014-06-01

    Full Text Available When facing the challenge of restructuring the energy system, bottom-up initiatives can aid the diffusion of decentralized and clean energy technologies. We focused here on a bottom-up initiative of citizen-funded and citizen-operated photovoltaic power plants. The project follows a case study-based approach and examines two different community initiatives. The aim is to investigate the potential incentives and barriers relating to participation or non-participation in predefined community PV projects. Qualitative, as well as quantitative empirical research was used to examine the key factors in the further development of bottom-up initiatives as contributors to a general energy transition.

  10. Bottom-up organic integrated circuits

    NARCIS (Netherlands)

    Smits, Edsger C. P.; Mathijssen, Simon G. J.; van Hal, Paul A.; Setayesh, Sepas; Geuns, Thomas C. T.; Mutsaers, Kees A. H. A.; Cantatore, Eugenio; Wondergem, Harry J.; Werzer, Oliver; Resel, Roland; Kemerink, Martijn; Kirchmeyer, Stephan; Muzafarov, Aziz M.; Ponomarenko, Sergei A.; de Boer, Bert; Blom, Paul W. M.; de Leeuw, Dago M.

    2008-01-01

    Self-assembly - the autonomous organization of components into patterns and structures(1) - is a promising technology for the mass production of organic electronics. Making integrated circuits using a bottom-up approach involving self-assembling molecules was proposed(2) in the 1970s. The basic b

  11. Bottom-up holographic approach to QCD

    Energy Technology Data Exchange (ETDEWEB)

    Afonin, S. S. [V. A. Fock Department of Theoretical Physics, Saint Petersburg State University, 1 ul. Ulyanovskaya, 198504 (Russian Federation)]

    2016-01-22

    One of the best-known results of string theory is the idea that some strongly coupled gauge theories may have a dual description in terms of a higher-dimensional weakly coupled gravitational theory — the so-called AdS/CFT correspondence or gauge/gravity correspondence. Attempts to apply this idea to real QCD are often referred to as “holographic QCD” or the “AdS/QCD approach”. One direction in this field is to start from real QCD and guess a tentative dual higher-dimensional weakly coupled field model following the principles of gauge/gravity correspondence. The ensuing phenomenology can then be developed and compared with experimental data and with various theoretical results. Such a bottom-up holographic approach has turned out to be unexpectedly successful in many cases. In this short review, the technical aspects of the bottom-up holographic approach to QCD are explained, placing the main emphasis on the soft wall model.
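
    The soft wall model referred to above is conventionally defined by adding a quadratic dilaton background to a five-dimensional gauge action in AdS space. Schematically, in a standard textbook form stated here only for orientation (not a result derived in this review):

    $$ S = -\frac{1}{4 g_5^2} \int d^5x \,\sqrt{-g}\, e^{-\Phi(z)}\, F_{MN} F^{MN}, \qquad \Phi(z) = \lambda^2 z^2, $$

    which for vector mesons yields linear, Regge-like trajectories,

    $$ m_n^2 = 4\lambda^2 (n + 1), \qquad n = 0, 1, 2, \ldots $$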

  12. Bottom-up holographic approach to QCD

    Science.gov (United States)

    Afonin, S. S.

    2016-01-01

    One of the best-known results of string theory is the idea that some strongly coupled gauge theories may have a dual description in terms of a higher-dimensional weakly coupled gravitational theory — the so-called AdS/CFT correspondence or gauge/gravity correspondence. Attempts to apply this idea to real QCD are often referred to as "holographic QCD" or the "AdS/QCD approach". One direction in this field is to start from real QCD and guess a tentative dual higher-dimensional weakly coupled field model following the principles of gauge/gravity correspondence. The ensuing phenomenology can then be developed and compared with experimental data and with various theoretical results. Such a bottom-up holographic approach has turned out to be unexpectedly successful in many cases. In this short review, the technical aspects of the bottom-up holographic approach to QCD are explained, placing the main emphasis on the soft wall model.

  13. Bottom-up assembly of metallic germanium.

    Science.gov (United States)

    Scappucci, Giordano; Klesse, Wolfgang M; Yeoh, LaReine A; Carter, Damien J; Warschkow, Oliver; Marks, Nigel A; Jaeger, David L; Capellini, Giovanni; Simmons, Michelle Y; Hamilton, Alexander R

    2015-08-10

    Extending chip performance beyond current limits of miniaturisation requires new materials and functionalities that integrate well with the silicon platform. Germanium fits these requirements and has been proposed as a high-mobility channel material, a light emitting medium in silicon-integrated lasers, and a plasmonic conductor for bio-sensing. Common to these diverse applications is the need for homogeneous, high electron densities in three dimensions (3D). Here we use a bottom-up approach to demonstrate the 3D assembly of atomically sharp doping profiles in germanium by a repeated stacking of two-dimensional (2D) high-density phosphorus layers. This produces high-density (10^19 to 10^20 cm^-3) low-resistivity (10^-4 Ω·cm) metallic germanium of precisely defined thickness, beyond the capabilities of diffusion-based doping technologies. We demonstrate that free electrons from distinct 2D dopant layers coalesce into a homogeneous 3D conductor using anisotropic quantum interference measurements, atom probe tomography, and density functional theory.
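
    The quoted carrier density and resistivity are linked through rho = 1/(n e mu). A quick back-of-envelope check of the electron mobility those figures would imply (order-of-magnitude arithmetic only, not a value reported in the paper):

    ```python
    # Hypothetical back-of-envelope check: mobility implied by the quoted
    # carrier density and resistivity via rho = 1 / (n * e * mu).
    e = 1.602e-19            # elementary charge (C)
    n = 1e20                 # carrier density (cm^-3), upper end of the quoted range
    rho = 1e-4               # resistivity (ohm * cm)

    mu = 1.0 / (rho * n * e) # mobility in cm^2 / (V * s)
    print(f"implied electron mobility ~ {mu:.0f} cm^2/(V s)")
    ```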

  14. Top Down Chemistry Versus Bottom up Chemistry

    Science.gov (United States)

    Oka, Takeshi; Witt, Adolf N.

    2016-06-01

    The idea of interstellar top-down chemistry (TDC), in which molecules are produced from the decomposition of larger molecules and dust, in contrast to ordinary bottom-up chemistry (BUC), in which molecules are produced synthetically from smaller molecules and atoms in the ISM, has been proposed in the chemistry of PAHs and carbon chain molecules for both diffuse and dense clouds. A simple and natural idea, it must have occurred to many people and has been in the air for some time. The validity of this hypothesis is apparent for diffuse clouds in view of the observed low abundance of small molecules and its rapid decrease with molecular size on the one hand, and the high column densities of large carbon molecules demonstrated by the many intense diffuse interstellar bands (DIBs) on the other. Recent identification of C60^+ as the carrier of 5 near-infrared DIBs with a high column density of 2×10^13 cm^-2 by Maier and others confirms the TDC. This means that the large molecules and dust produced in the high-density, high-temperature environment of circumstellar envelopes are sufficiently stable to survive decomposition due to stellar UV radiation, cosmic rays, C-shocks, etc. over the long time (≥ 10^7 yr) of their migration to diffuse clouds, and seems to disagree with the consensus in the field of interstellar grains. The stability of molecules and aggregates in the diffuse interstellar medium will be discussed. Duley, W. W. 2006, Faraday Discuss. 133, 415; Zhen, J., Castellanos, P., Paardekooper, D. M., Linnartz, H., Tielens, A. G. G. M. 2014, ApJL, 797, L30; Huang, J., Oka, T. 2015, Mol. Phys. 113, 2159; Guzmán, V. V., Pety, J., Goicoechea, J. R., Gerin, M., Roueff, E., Gratier, P., Öberg, K. I. 2015, ApJL, 800, L33. L. Ziurys has sent us many papers, beginning with Ziurys, L. M. 2006, PNAS 103, 12274, indicating she had long been a proponent of the idea. Campbell, E. K., Holz, M., Maier, J. P., Gerlich, D., Walker, G. A. H., Bohlender, D. 2016, ApJ, in press; Draine, B. T. 2003

  15. Bottom-up fabrication of graphene nanostructures on Ru(1010).

    Science.gov (United States)

    Song, Junjie; Zhang, Han-jie; Cai, Yiliang; Zhang, Yuxi; Bao, Shining; He, Pimo

    2016-02-01

    Investigations on the bottom-up fabrication of graphene nanostructures with 10, 10'-dibromo-9, 9'-bianthryl (DBBA) as a precursor on Ru(1010) were carried out using scanning tunnelling microscopy (STM) and density functional theory (DFT) calculations. Upon annealing the sample at submonolayer DBBA coverage, N = 7 graphene nanoribbons (GNRs) aligned along the [1210] direction form. Higher DBBA coverage and higher annealing temperature lead to the merging of GNRs into ribbon-like graphene nanoflakes with multiple orientations. These nanoflakes show different Moiré patterns, and their structures were determined by DFT simulations. The results showed that GNRs possess growth preference on the Ru(1010) substrate with a rectangular unit cell, and GNRs with armchair and zigzag boundaries are obtainable. Further DFT calculations suggest that the interaction between graphene and the substrate controls the orientations of the graphene overlayer and the growth of graphene on Ru(1010).

  16. Nanoelectronics: Thermoelectric Phenomena in «Bottom-Up» Approach

    Directory of Open Access Journals (Sweden)

    Yu.A. Kruglyak

    2014-04-01

    Full Text Available The thermoelectric phenomena of Seebeck and Peltier, quality indicators and thermoelectric optimization, and ballistic and diffusive phonon heat currents are discussed within the «bottom-up» approach of modern nanoelectronics.
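
    The "quality indicators" referred to here are conventionally summarized by the dimensionless thermoelectric figure of merit (standard definition, not specific to this paper):

    $$ ZT = \frac{S^2 \sigma T}{\kappa}, \qquad \kappa = \kappa_{\mathrm{el}} + \kappa_{\mathrm{ph}}, $$

    where S is the Seebeck coefficient, σ the electrical conductivity, T the absolute temperature, and κ the total thermal conductivity (electronic plus phonon contributions).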

  17. On a Bottom-Up Approach to Scientific Discovery

    Science.gov (United States)

    Huang, Xiang

    2014-03-01

    Two popular models of scientific discovery, abduction and inference to the best explanation (IBE), presuppose that the reason for accepting a hypothetical explanation A comes from the epistemic and/or explanatory force manifested in the fact that observed fact C is an inferred consequence of A. However, not all discoveries take this top-down procedure from A to C, in which the result of discovery A implies the observed fact C. I contend that discovery can be modeled as a bottom-up procedure based on inductive and analogical rules that lead us to infer from C to A. I take the theory of Dignaga, a medieval Indian logician, as a model of this bottom-up approach. My argument has three parts: 1) this bottom-up approach applies to both commonsense and scientific discovery without the assumption that C has to be an inferred consequence of A; 2) this bottom-up approach helps us get around problems that crop up in applying abduction and/or IBE, which means that scientific discovery need not be modeled exclusively by top-down approaches; and 3) the existence of the bottom-up approach requires a pluralist attitude towards the modeling of scientific discovery.

  18. A plea for Global Health Action bottom-up

    Directory of Open Access Journals (Sweden)

    Ulrich Laaser

    2016-10-01

    Full Text Available This opinion piece focuses on global health action through hands-on, bottom-up practice. Initiating an organizational framework and securing financial efficiency are, however, essential, and both are clearly the domain of well-trained public health professionals. Examples of action are cited in the four main areas of global threats: planetary climate change, global divides and inequity, global insecurity and violent conflicts, and global instability and financial crises. In conclusion, a stable health systems policy framework would greatly enhance success. However, such an organisational framework dries out if it is not linked to public debates channelling fresh thoughts and controversial proposals: structural stabilisation is essential, but it has to serve, not dominate, bottom-up activities. In other words, a horizontal management is required, a balanced equilibrium between bottom-up initiative and top-down support. Last but not least, rewarding voluntary and charity work through public acknowledgement is essential.

  19. Bottom up approaches to defining future climate mitigation commitments

    NARCIS (Netherlands)

    Elzen MGJ den; Berk MM; KMD

    2005-01-01

    This report describes the results of a number of alternative, bottom-up approaches proposed in the literature for shaping commitments, i.e. technology and performance standards, technology research and development agreements, sectoral commitments, sectoral CDM (S-CDM) and sustainable development policies and measures (SD-PAMs).

  20. Bottom up approaches to defining future climate mitigation commitments

    NARCIS (Netherlands)

    Elzen MGJ den; Berk MM; KMD

    2005-01-01

    This report analyses a number of alternative, bottom-up approaches, i.e. technology and performance standards; technology research and development agreements; sectoral targets (national/transnational); sector-based CDM; and sustainable development policies and measures (SD-PAMs). Included are tech

  1. A bottom-up approach to MEDLINE indexing recommendations

    DEFF Research Database (Denmark)

    Jimeno-Yepes, Antonio; Wilkowski, Bartlomiej; Mork, James G

    2011-01-01

    MEDLINE indexing performed by the US National Library of Medicine staff describes the essence of a biomedical publication in about 14 Medical Subject Headings (MeSH). Since 2002, this task has been assisted by the Medical Text Indexer (MTI) program. We present a bottom-up approach to MEDLINE indexing...

  2. Bottom-Up Analysis of Single-Case Research Designs

    Science.gov (United States)

    Parker, Richard I.; Vannest, Kimberly J.

    2012-01-01

    This paper defines and promotes the qualities of a "bottom-up" approach to single-case research (SCR) data analysis. Although "top-down" models, for example, multi-level or hierarchical linear models, are gaining momentum and have much to offer, interventionists should be cautious about analyses that are not easily understood, are not governed by…

  3. Reading Nature from a "Bottom-Up" Perspective

    Science.gov (United States)

    Magntorn, Ola; Hellden, Gustav

    2007-01-01

    This paper reports on a study of ecology teaching and learning in a Swedish primary school class (age 10-11 yrs). A teaching sequence was designed to help students read nature in a river ecosystem. The teaching sequence had a "bottom up" approach, taking as its starting point a common key organism--the freshwater shrimp. From this species and its…

  4. Bottom-up effects on attention capture and choice

    DEFF Research Database (Denmark)

    Peschel, Anne; Orquin, Jacob Lund; Mueller Loose, Simone

    Attention processes and decision making are accepted to be closely linked, because only information that is attended to can be incorporated in the decision process. Little is known, however, about the extent to which bottom-up processes of attention affect stimulus selection and therefore… the information available to form a decision. Does changing one visual cue in the stimulus set affect attention towards this cue, and what does that mean for the choice outcome? To address this, we conducted a combined eye-tracking and choice experiment in a consumer choice setting with visual shelf simulations… of different product categories. Surface size and visual saliency of a product label were manipulated to determine bottom-up effects on attention and choice. Results show a strong and significant increase in attention, in terms of fixation likelihood, towards product labels which are larger and more visually…

  5. A Bottom-Up Approach to SUSY Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Horn, Claus; /SLAC

    2009-08-03

    This paper proposes a new way to perform event generation and analysis in searches for new physics at the LHC. An abstract notation is used to describe the new particles on a level which better corresponds to the detector resolution of LHC experiments. In this way the SUSY discovery space can be decomposed into a small number of eigenmodes, each with only a few parameters, which makes it possible to investigate the SUSY parameter space in a model-independent way. By focusing on the experimental observables for each process investigated, the Bottom-Up Approach allows one to systematically study the borders of the experimental efficiencies and thus to extend the sensitivity for new physics.

  6. A Bottom-Up Approach to SUSY Analyses

    Energy Technology Data Exchange (ETDEWEB)

    Horn, Claus; /SLAC

    2011-11-11

    This paper proposes a new way to do event generation and analysis in searches for new physics at the LHC. An abstract notation is used to describe the new particles on a level which better corresponds to the detector resolution of LHC experiments. In this way the SUSY discovery space can be decomposed into a small number of eigenmodes, each with only a few parameters, which makes it possible to investigate the SUSY parameter space in a model-independent way. By focusing on the experimental observables for each process investigated, the Bottom-Up Approach allows one to systematically study the borders of the experimental efficiencies and thus to extend the sensitivity for new physics.

  7. Recent progress in backreacted bottom-up holographic QCD

    Energy Technology Data Exchange (ETDEWEB)

    Järvinen, Matti [Laboratoire de Physique Théorique, École Normale Supérieure, 24 rue Lhomond, 75231 Paris Cedex 05 (France)

    2016-01-22

    Recent progress in constructing holographic models for QCD is discussed, concentrating on the bottom-up models which implement holographically the renormalization group flow of QCD. The dynamics of gluons can be modeled by using a string-inspired model termed improved holographic QCD, and flavor can be added by introducing space filling branes in this model. The flavor fully backreacts to the glue in the Veneziano limit, giving rise to a class of models which are called V-QCD. The phase diagrams and spectra of V-QCD are in good agreement with results for QCD obtained by other methods.

  8. Bottom-up capacity building for data providers in RITMARE

    Science.gov (United States)

    Pepe, Monica; Basoni, Anna; Bastianini, Mauro; Fugazza, Cristiano; Menegon, Stefano; Oggioni, Alessandro; Pavesi, Fabio; Sarretta, Alessandro; Carrara, Paola

    2014-05-01

    RITMARE is a Flagship Project of the Italian Ministry of Research, coordinated by the National Research Council (CNR). It aims at the interdisciplinary integration of Italian marine research. Sub-project 7 shall create an interoperable infrastructure for the project, capable of interconnecting the whole community of researchers involved. It will allow the coordination and sharing of data, processes, and information produced by the other sub-projects [1]. Spatial Data Infrastructures (SDIs) allow for interoperable sharing among heterogeneous, distributed spatial content providers. The INSPIRE Directive [2] regulates the development of a pan-European SDI despite the great variety of national approaches to managing spatial data. However, six years after its adoption, its growth is still hampered by technological, cultural, and methodological gaps. In particular, in the research sector, actors may not be prone to comply with INSPIRE (or may not feel compelled to) because they are too concentrated on domain-specific activities or are hindered by technological issues. Indeed, the available technologies and tools for enabling standard-based discovery and access services are far from user-friendly and require time-consuming activities, such as metadata creation. Moreover, the INSPIRE implementation guidelines do not accommodate an essential component in environmental research, that is, in situ observations. In order to overcome most of the aforementioned issues and to enable researchers to actively contribute to the creation of the project infrastructure, a bottom-up approach has been adopted: a software suite has been developed, called the Starter Kit, which is offered to research data production units so that they can become autonomous, independent nodes of data provision. The Starter Kit enables the provision of geospatial resources, either geodata (e.g., maps and layers) or observations pulled from sensors, which are made accessible according to the OGC standards

  9. Inverse Magnetic Catalysis in Bottom-Up Holographic QCD

    CERN Document Server

    Evans, Nick; Scott, Marc

    2016-01-01

    We explore the effect of magnetic field on chiral condensation in QCD via a simple bottom up holographic model which inputs QCD dynamics through the running of the anomalous dimension of the quark bilinear. Bottom up holography is a form of effective field theory and we use it to explore the dependence on the coefficients of the two lowest order terms linking the magnetic field and the quark condensate. In the massless theory, we identify a region of parameter space where magnetic catalysis occurs at zero temperature but inverse magnetic catalysis at temperatures of order the thermal phase transition. The model shows similar non-monotonic behaviour in the condensate with B at intermediate T as the lattice data. This behaviour is due to the separation of the meson melting and chiral transitions in the holographic framework. The introduction of quark mass raises the scale of B where inverse catalysis takes over from catalysis until the inverse catalysis lies outside the regime of validity of the effective descr...

  10. Mindfulness meditation associated with alterations in bottom-up processing: psychophysiological evidence for reduced reactivity.

    Science.gov (United States)

    van den Hurk, Paul A M; Janssen, Barbara H; Giommi, Fabio; Barendregt, Henk P; Gielen, Stan C

    2010-11-01

    Mental training by meditation has been related to changes in high-level cognitive functions that involve top-down processing. The aim of this study was to investigate whether the practice of meditation is also related to alterations in low-level, bottom-up processing. Therefore, intersensory facilitation (IF) effects in a group of mindfulness meditators (MM) were compared to IF effects in an age- and gender-matched control group. Smaller and even absent IF effects were found in the MM group, which suggests that changes in bottom-up processing are associated with MM. Furthermore, reduced interference of a visual warning stimulus with the IF effects was found, which suggests an improved allocation of attentional resources in mindfulness meditators, even across modalities.

  11. The generation of myricetin-nicotinamide nanococrystals by top down and bottom up technologies

    Science.gov (United States)

    Liu, Mingyu; Hong, Chao; Li, Guowen; Ma, Ping; Xie, Yan

    2016-09-01

    Myricetin-nicotinamide (MYR-NIC) nanococrystal preparation methods were developed and optimized using both top-down and bottom-up approaches. The grinding (top-down) method successfully achieved nanococrystals, but some particles remained in the micrometer range and aggregated. The key consideration for the grinding technology was to control the milling time so as to strike a balance between particle size and size distribution. In contrast, a modified bottom-up approach, based on a solution method in conjunction with sonochemistry, resulted in a uniform MYR-NIC nanococrystal, confirmed by powder X-ray diffraction, scanning electron microscopy, dynamic light scattering, and differential scanning calorimetry, and its particle dissolution rate and extent were significantly greater than those of the MYR-NIC cocrystal. Notably, this was a simple method without the addition of any non-solvent. We anticipate our findings will provide some guidance for future nanococrystal preparation as well as its application in both the chemical and pharmaceutical areas.

  12. Bottom-up laboratory testing of the DKIST Visible Broadband Imager (VBI)

    Science.gov (United States)

    Ferayorni, Andrew; Beard, Andrew; Cole, Wes; Gregory, Scott; Wöeger, Friedrich

    2016-08-01

    The Daniel K. Inouye Solar Telescope (DKIST) is a 4-meter solar observatory under construction at Haleakala, Hawaii [1]. The Visible Broadband Imager (VBI) is a first-light instrument that will record images at the highest possible spatial and temporal resolution of the DKIST at a number of scientifically important wavelengths [2]. The VBI is a pathfinder for DKIST instrumentation and a test bed for developing processes and procedures in the areas of unit, systems integration, and user acceptance testing. These test procedures have been developed and repeatedly executed during VBI construction in the lab as part of a "test early and test often" philosophy aimed at identifying and resolving issues early, thus saving cost during integration, test, and commissioning on the summit. The VBI team recently completed a bottom-up end-to-end system test of the instrument in the lab that allowed the instrument's functionality, performance, and usability to be validated against documented system requirements. The bottom-up testing approach includes four levels of testing, each introducing another layer in the control hierarchy that is tested before moving to the next level. First, the instrument mechanisms are tested for positioning accuracy and repeatability using a laboratory position-sensing detector (PSD). Second, the real-time motion controls are used to drive the mechanisms to verify that speed and timing synchronization requirements are being met. Next, the high-level software is introduced and the instrument is driven through a series of end-to-end tests that exercise the mechanisms, cameras, and simulated data processing. Finally, user acceptance testing is performed on operational and engineering use cases through the instrument engineering graphical user interface (GUI). In this paper we present the VBI bottom-up test plan, procedures, example test cases and tools used, as well as results from test execution in the laboratory. We will also discuss the benefits realized

  13. A bottom-up approach of stochastic demand allocation in water quality modelling

    Directory of Open Access Journals (Sweden)

    E. J. M. Blokker

    2010-01-01

    Full Text Available An "all pipes" hydraulic model of a DMA-sized drinking water distribution system was constructed with two types of demand allocations. One is constructed with the conventional top-down approach, i.e. a demand multiplier pattern from the booster station is allocated to all demand nodes with a correction factor to account for the average water demand on that node. The other is constructed with a bottom-up approach of demand allocation, i.e., each individual home is represented by one demand node with its own stochastic water demand pattern.

    The stochastic water demand patterns are constructed with an end-use model on a per second basis and per individual home. The flow entering the test area was measured and a tracer test with sodium chloride was performed to measure travel times. The two models were evaluated on the predicted sum of demands and travel times, compared with what was measured in the test area.

    The new bottom-up approach performs at least as well as the conventional top-down approach with respect to total demand and travel times, without the need for any flow measurements or calibration measurements. The bottom-up approach leads to a stochastic method of hydraulic modelling and gives insight into the variability of travel times as an added feature beyond the conventional way of modelling.
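
    As a purely illustrative sketch (not code from the cited study; all quantities below are assumed), the difference between the two allocation strategies can be expressed in a few lines of Python: the top-down variant scales one master pattern by each node's average demand, while the bottom-up variant sums independently generated stochastic use events per home.

        # Hypothetical illustration of top-down vs. bottom-up demand allocation.
        import numpy as np

        rng = np.random.default_rng(0)
        n_homes, n_steps = 1000, 24 * 60          # one day at 1-minute resolution

        # Top-down: one multiplier pattern (e.g. from the booster station),
        # scaled per node by a correction factor for its average demand.
        master = 1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, n_steps))
        avg_demand = rng.uniform(0.3, 1.2, n_homes)        # L/min per home (assumed)
        top_down = np.outer(avg_demand, master / master.mean())

        # Bottom-up: each home gets its own stochastic pattern built from
        # discrete water-use events (a stand-in for an end-use model like SIMDEUM).
        bottom_up = np.zeros((n_homes, n_steps))
        for h in range(n_homes):
            for _ in range(rng.poisson(8)):                # roughly 8 uses per day
                start = rng.integers(0, n_steps - 10)
                bottom_up[h, start:start + rng.integers(1, 10)] += rng.uniform(2, 8)

        # Either matrix can be attached to the demand nodes of a hydraulic model;
        # only the bottom-up one preserves the short-term variability that drives
        # the spread in predicted travel times.
        print(top_down.sum(axis=0).std(), bottom_up.sum(axis=0).std())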

  14. A Bottom-up Trend in Research of Management of Technology

    Directory of Open Access Journals (Sweden)

    Yoko Ishino

    2014-12-01

    Full Text Available Management of Technology (MOT) is defined as an academic discipline of management that enables organizations to manage their technological fundamentals to create competitive advantage. MOT covers a wide range of contents including administrative strategy, R&D management, manufacturing management, technology transfer, production control, marketing, accounting, finance, business ethics, and others. For each topic, researchers have conducted their MOT research at various levels. However, the practical and pragmatic side of MOT surely affects its research trends. Identifying changes in MOT research trends, or the chronological transitions of principal subjects, can help in understanding the key concepts of current MOT. This paper studied a bottom-up trend in MOT research fields by applying a text-mining method to the conference proceedings of IAMOT (International Association for Management of Technology). First, focusing only on nouns identified several keywords that emerge more frequently over time in the IAMOT proceedings. Then, expanding the scope to other parts of speech placed these keywords in a natural context. Finally, it was found that the use of an important keyword has extended qualitatively and quantitatively over time. In conclusion, a bottom-up trend in MOT research was detected, and the effects of the social situation on this trend were discussed. Keywords: Management of Technology; Text Mining; Research Trend; Bottom-up Trend; Patent

  15. Contextualised ICT4D: a Bottom-Up Approach

    DEFF Research Database (Denmark)

    Lund, Henrik Hautop; Sutinen, Erkki

    2010-01-01

    The term ICT4D refers to the opportunities of Information and Communication Technology (ICT) as an agent of development. Much of the research in the field is based on evaluating the feasibility of existing technologies, mostly of Western or Asian origin, in the context of developing countries. In a certain way, this agenda can be understood as a top-down approach which transfers technology in a hierarchical way to actual users. Complementary to the traditional approach, a bottom-up approach starts by identifying communities that are ready to participate in a process to use technology to transform their own strengths to new levels by designing appropriate technologies with experts of technology and design. The bottom-up approach requires a new kind of ICT education at the undergraduate level. An example of the development of a contextualized IT degree program at Tumaini University in Tanzania shows...

  16. A bottom-up approach to MEDLINE indexing recommendations.

    Science.gov (United States)

    Jimeno-Yepes, Antonio; Wilkowski, Bartłomiej; Mork, James G; Van Lenten, Elizabeth; Demner-Fushman, Dina; Aronson, Alan R

    2011-01-01

    MEDLINE indexing performed by the US National Library of Medicine staff describes the essence of a biomedical publication in about 14 Medical Subject Headings (MeSH). Since 2002, this task has been assisted by the Medical Text Indexer (MTI) program. We present a bottom-up approach to MEDLINE indexing in which the abstract is searched for indicators for a specific MeSH recommendation in a two-step process. Supervised machine learning combined with triage rules improves the sensitivity of recommendations while keeping the number of recommended terms relatively small. The improvement in recommendations observed in this work warrants further exploration of this approach to MTI recommendations on a larger set of MeSH headings.
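
    As a toy illustration only (this is not the MTI code; every heading, trigger term and threshold below is invented), combining a per-heading classifier score with a triage rule can be sketched in Python as:

        from dataclasses import dataclass

        @dataclass
        class HeadingModel:
            heading: str
            trigger_terms: set       # triage rule: at least one term must appear
            threshold: float         # minimum classifier score to recommend

        def classifier_score(abstract: str, heading: str) -> float:
            # Placeholder for a trained binary classifier; here just word overlap.
            words = heading.lower().split()
            return sum(w in abstract.lower() for w in words) / len(words)

        def recommend(abstract, models):
            hits = []
            for m in models:
                triage_ok = any(t in abstract.lower() for t in m.trigger_terms)
                if triage_ok and classifier_score(abstract, m.heading) >= m.threshold:
                    hits.append(m.heading)
            return hits

        models = [HeadingModel("Neoplasms", {"tumor", "cancer"}, 0.5),
                  HeadingModel("Diabetes Mellitus", {"diabetes", "insulin"}, 0.5)]
        print(recommend("We study insulin resistance in diabetes mellitus.", models))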

  17. BitCube: A Bottom-Up Cubing Engineering

    Science.gov (United States)

    Ferro, Alfredo; Giugno, Rosalba; Puglisi, Piera Laura; Pulvirenti, Alfredo

    Enhancing online analytical processing through efficient cube computation plays a key role in data warehouse management. Hashing, grouping and mining techniques are commonly used to improve cube pre-computation. BitCube, a fast cubing method which uses bitmaps as inverted indexes for grouping, is presented. It horizontally partitions data according to the values of one dimension and, for each resulting fragment, performs grouping following bottom-up criteria. BitCube also allows partial materialization based on iceberg conditions, to treat large datasets for which a full cube pre-computation is too expensive. The space requirement of the bitmaps is optimized by applying an adaptation of the WAH compression technique. Experimental analysis, on both synthetic and real datasets, shows that BitCube outperforms previous algorithms for full cube computation and is comparable for iceberg cubing.
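
    A minimal sketch of the underlying idea (not the BitCube implementation itself; the data and iceberg threshold are invented) uses one bitmap per (dimension, value) pair and computes group-bys bottom-up by intersecting bitmaps:

        from itertools import combinations

        rows = [("north", "2015", "tv"), ("north", "2016", "tv"),
                ("south", "2015", "radio"), ("south", "2016", "tv")]
        dims = ["region", "year", "product"]

        # One bitmap per (dimension, value); a Python int serves as the bit set.
        bitmaps = {}
        for i, row in enumerate(rows):
            for d, v in zip(dims, row):
                bitmaps[(d, v)] = bitmaps.get((d, v), 0) | (1 << i)

        min_support = 1                     # iceberg condition: prune small cells
        for k in range(1, len(dims) + 1):   # bottom-up: 1-D group-bys first
            for group in combinations(range(len(dims)), k):
                for key in {tuple((dims[g], r[g]) for g in group) for r in rows}:
                    cell = (1 << len(rows)) - 1
                    for dv in key:
                        cell &= bitmaps[dv]       # intersect inverted indexes
                    count = bin(cell).count("1")
                    if count >= min_support:
                        print(key, count)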

  18. Making the results of bottom-up energy savings comparable

    Directory of Open Access Journals (Sweden)

    Moser Simon

    2012-01-01

    Full Text Available The Energy Service Directive (ESD) has pushed forward the issue of energy savings calculations without clarifying the methodological basis. Savings achieved in the Member States are calculated with rather non-transparent and hardly comparable bottom-up (BU) methods. This paper develops the idea of parallel evaluation tracks, separating the Member States' task of ESD verification from comparable savings calculations. Comparability is ensured by developing a standardised BU calculation kernel for different energy efficiency improvement (EEI) actions, which simultaneously depicts the different calculation options in a structured way (e.g. baseline definition, system boundaries, double counting). Due to the heterogeneity of BU calculations, the approach requires a central database into which Member States feed input data on BU actions according to a predefined structure. The paper demonstrates the proposed approach, including a concrete example of application.
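
    A minimal sketch of such a standardised calculation kernel (illustrative only; the function, its arguments and the numbers are assumptions, not the paper's kernel) makes the baseline choice and double-counting correction explicit inputs so that results reported by different Member States stay comparable:

        def bu_savings(units, baseline_kwh, efficient_kwh, lifetime_years,
                       double_count_share=0.0):
            """Annual and lifetime savings of one EEI action, net of overlaps."""
            annual = units * (baseline_kwh - efficient_kwh) * (1.0 - double_count_share)
            return {"annual_kwh": annual, "lifetime_kwh": annual * lifetime_years}

        # Example: 10,000 refrigerators, market-average baseline vs. efficient model.
        print(bu_savings(units=10_000, baseline_kwh=300.0, efficient_kwh=220.0,
                         lifetime_years=15, double_count_share=0.1))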

  19. Bottom-Up Discrete Symmetries for Cabibbo Mixing

    CERN Document Server

    Varzielas, Ivo de Medeiros; Talbert, Jim

    2016-01-01

    We perform a bottom-up search for discrete non-Abelian symmetries capable of quantizing the Cabibbo angle that parameterizes CKM mixing. Given a particular Abelian symmetry structure in the up and down sectors, we construct representations of the associated residual generators which explicitly depend on the degrees of freedom present in our effective mixing matrix. We then discretize those degrees of freedom and utilize the Groups, Algorithms, Programming (GAP) package to close the associated finite groups. This short study is performed in the context of recent results indicating that, without resorting to special model-dependent corrections, no small-order finite group can simultaneously predict all four parameters of the three-generation CKM matrix and that only groups of $\\mathcal{O}(10^{2})$ can predict the analogous parameters of the leptonic PMNS matrix, regardless of whether neutrinos are Dirac or Majorana particles. Therefore a natural model of flavour might instead incorporate small(er) finite groups...
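
    For orientation (the standard textbook parameterization, not a result of the cited paper), the Cabibbo angle enters two-generation quark mixing as a single rotation:

        \[
        V_{\text{Cabibbo}} =
        \begin{pmatrix}
          \cos\theta_C & \sin\theta_C \\
         -\sin\theta_C & \cos\theta_C
        \end{pmatrix},
        \qquad \sin\theta_C \approx 0.225,
        \]

    so a discrete symmetry "quantizes" the Cabibbo angle when its residual generators fix \theta_C to a specific value rather than leaving it a free parameter.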

  20. BUEES: a bottom-up event extraction system

    Institute of Scientific and Technical Information of China (English)

    Xiao DING; Bing QIN; Ting LIU

    2015-01-01

    Traditional event extraction systems focus mainly on event type identification and event participant extraction based on pre-specified event type paradigms and manually annotated corpora. However, different domains have different event type paradigms. When transferring to a new domain, we have to build a new event type paradigm and annotate a new corpus from scratch. This kind of conventional event extraction system requires massive human effort, and hence prevents event extraction from being widely applicable. In this paper, we present BUEES, a bottom-up event extraction system, which extracts events from the web in a completely unsupervised way. The system automatically builds an event type paradigm in the input corpus, and then proceeds to extract a large number of instance patterns of these events. Subsequently, the system extracts event arguments according to these patterns. By conducting a series of experiments, we demonstrate the good performance of BUEES and compare it to a state-of-the-art Chinese event extraction system, i.e., a supervised event extraction system. Experimental results show that BUEES performs comparably (5% higher F-measure in event type identification and 3% higher F-measure in event argument extraction), but without any human effort.
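
    As a toy illustration of the final step only (not BUEES itself; the event type, patterns and sentence are invented), extracting event arguments once instance patterns have been learned can look like this:

        import re

        # Hypothetical instance patterns for an "Acquisition" event type,
        # with named groups marking the argument slots found in the corpus.
        patterns = [r"(?P<ARG1>[\w ]+?) acquired (?P<ARG2>[\w ]+)",
                    r"(?P<ARG1>[\w ]+?) bought (?P<ARG2>[\w ]+)"]

        def extract_arguments(sentence):
            for p in patterns:
                m = re.search(p, sentence)
                if m:
                    return {"event": "Acquisition", **m.groupdict()}
            return None

        print(extract_arguments("Last year FooCorp acquired BarSoft"))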

  1. Top down and bottom up engineering of bone.

    Science.gov (United States)

    Knothe Tate, Melissa L

    2011-01-11

    The goal of this retrospective article is to place the body of my lab's multiscale mechanobiology work in the context of top-down and bottom-up engineering of bone. We have used biosystems engineering, computational modeling and novel experimental approaches to understand bone physiology, in health and disease, and across time (in utero, postnatal growth, maturity, aging and death, as well as evolution) and length scales (a single bone like a femur, m; a sample of bone tissue, mm-cm; a cell and its local environment, μm; down to the length scale of the cell's own skeleton, the cytoskeleton, nm). First we introduce the concept of flow in bone and the three calibers of porosity through which fluid flows. Then we describe, in the context of organ-tissue, tissue-cell and cell-molecule length scales, both multiscale computational models and experimental methods to predict flow in bone and to understand the flow of fluid as a means to deliver chemical and mechanical cues in bone. Addressing a number of studies in the context of multiple length and time scales, the importance of appropriate boundary conditions, site-specific material parameters, permeability measures and even micro- and nano-anatomically correct geometries is discussed in the context of model predictions and their value for understanding the multiscale mechanobiology of bone. Insights from these multiscale computational modeling and experimental methods are providing us with a means to predict, engineer and manufacture bone tissue in the laboratory and in the human body.

  2. Bottom-Up Synthesis and Sensor Applications of Biomimetic Nanostructures

    Directory of Open Access Journals (Sweden)

    Li Wang

    2016-01-01

    Full Text Available The combination of nanotechnology, biology, and bioengineering has greatly improved the development of nanomaterials with unique functions and properties. Biomolecules, as nanoscale building blocks, play very important roles in the final formation of functional nanostructures. Many kinds of novel nanostructures have been created by using bioinspired self-assembly and subsequent binding with various nanoparticles. In this review, we summarize studies on the fabrication and sensor applications of biomimetic nanostructures. The strategies for creating different bottom-up nanostructures by using biomolecules such as DNA, proteins, peptides, and viruses, as well as microorganisms such as bacteria, and plant leaves, are introduced. In addition, the potential applications of the synthesized biomimetic nanostructures for colorimetry, fluorescence, surface plasmon resonance, surface-enhanced Raman scattering, electrical resistance, electrochemistry, and quartz crystal microbalance sensors are presented. This review will promote the understanding of the relationships between biomolecules/microorganisms and functional nanomaterials, and will guide the design and synthesis of biomimetic nanomaterials with unique properties in the future.

  3. Visual anticipation biases conscious decision making but not bottom-up visual processing.

    Science.gov (United States)

    Mathews, Zenon; Cetnarski, Ryszard; Verschure, Paul F M J

    2014-01-01

    Prediction plays a key role in the control of attention, but it is not clear which aspects of prediction are most prominent in conscious experience. An evolving view of the brain is that it can be seen as a prediction machine that optimizes its ability to predict states of the world and the self through the top-down propagation of predictions and the bottom-up presentation of prediction errors. There are competing views, though, on whether predictions or prediction errors dominate the formation of conscious experience. Yet the dynamic effects of prediction on perception, decision making and consciousness have been difficult to assess and to model. We propose a novel mathematical framework and a psychophysical paradigm that allow us to assess both the hierarchical structuring of perceptual consciousness and its content, and the impact of predictions and/or errors on conscious experience, attention and decision-making. Using a displacement detection task combined with reverse correlation, we reveal signatures of the use of prediction at three different levels of perceptual processing: bottom-up fast saccades, top-down driven slow saccades, and conscious decisions. Our results suggest that the brain employs multiple parallel mechanisms at different levels of perceptual processing in order to shape effective sensory consciousness within a predicted perceptual scene. We further observe that bottom-up sensory and top-down predictive processes can be dissociated through cognitive load. We propose a probabilistic data association model from dynamical systems theory to model the predictive multi-scale bias in perceptual processing that we observe, and its role in the formation of conscious experience. We propose that these results support the hypothesis that consciousness provides a time-delayed description of a task that is used to prospectively optimize real-time control structures, rather than being engaged in the real-time control of behavior itself.

  4. Building Models from the Bottom Up: The HOBBES Project

    Science.gov (United States)

    Medellin-Azuara, J.; Sandoval Solis, S.; Lund, J. R.; Chu, W.

    2013-12-01

    Water problems are often bigger than the technical and data challenges associated with representing a water system in a model. Controversy and complexity are inherent when water is to be allocated among different uses, making it difficult to maintain coherent and productive discussions on addressing water problems. Quantification of a water supply system through models has proven helpful for improving understanding and for exploring and developing adaptable solutions to water problems. However, models often become too large and complex, and become hostage to endless discussions of their assumptions, algorithms and limitations. Data management, organization and documentation keep models flexible and useful over time. The UC Davis HOBBES project is a new approach, building models from the bottom up. Reversing the traditional model development, where data are arranged around a model algorithm, in HOBBES the data structure, organization and documentation are established first, followed by the application of simulation or optimization modeling algorithms to a particular problem at hand. The HOBBES project establishes standards for storing, documenting and sharing datasets on the California water system. This allows models to be developed and modified more easily and transparently, with greater comparability. Elements in the database have a spatial definition and can aggregate several infrastructural elements into detailed to coarse representations of the water system. Elements in the database represent reservoirs, groundwater basins, pumping stations, hydropower and water treatment facilities, demand areas and conveyance infrastructure statewide. These elements also host time series, economic and other information from hydrologic, economic, climate and other models. This presentation provides an overview of the HOBBES project, its applications and prospects for California and elsewhere.

  5. Top-down (Prior Knowledge) and Bottom-up (Perceptual Modality) Influences on Spontaneous Interpersonal Synchronization.

    Science.gov (United States)

    Gipson, Christina L; Gorman, Jamie C; Hessler, Eric E

    2016-04-01

    Coordination with others is such a fundamental part of human activity that it can happen unintentionally. This unintentional coordination can manifest as synchronization and is observed in physical and human systems alike. We investigated the role of top-down influences (prior knowledge of the perceptual modality their partner is using) and bottom-up factors (perceptual modality combination) on spontaneous interpersonal synchronization. We examine this phenomenon with respect to two different theoretical perspectives that differently emphasize top-down and bottom-up factors in interpersonal synchronization: joint-action/shared-cognition theories and ecological-interactive theories. In an empirical study, twelve dyads performed a finger oscillation task while attending to each other's movements through either visual, auditory, or visual and auditory perceptual modalities. Half of the participants were given prior knowledge of their partner's perceptual capabilities for coordinating across these different perceptual modality combinations. We found that the effect of top-down influence depends on the perceptual modality combination between two individuals. When people used the same perceptual modalities, top-down influence resulted in less synchronization, and when people used different perceptual modalities, top-down influence resulted in more synchronization. Furthermore, persistence in the change in behavior as a result of having perceptual information about each other ('social memory') was stronger when this top-down influence was present.

  6. Top-down and bottom-up definitions of human failure events in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-10-01

    In the probabilistic risk assessments (PRAs) used in the nuclear industry, human failure events (HFEs) are determined as a subset of hardware failures, namely those hardware failures that could be triggered by human action or inaction. This approach is top-down, starting with hardware faults and deducing human contributions to those faults. Elsewhere, more traditional, human factors driven approaches would tend to look first at opportunities for human errors in a task analysis and then identify which of those errors is risk significant. The intersection of top-down and bottom-up approaches to defining HFEs has not been carefully studied. Ideally, both approaches should arrive at the same set of HFEs. This question is crucial, however, as human reliability analysis (HRA) methods are generalized to new domains like oil and gas. The HFEs used in nuclear PRAs tend to be top-down, defined as a subset of the PRA, whereas the HFEs used in petroleum quantitative risk assessments (QRAs) often tend to be bottom-up, derived from a task analysis conducted by human factors experts. The marriage of these approaches is necessary in order to ensure that HRA methods developed for top-down HFEs are also sufficient for bottom-up applications.

  7. Atomically Precise Bottom-up Fabrication of Graphene Nanoribbons

    Science.gov (United States)

    Cai, Jinming

    2011-03-01

    Graphene nanoribbons (GNRs) -- narrow stripes of graphene -- are predicted to exhibit remarkable properties making them suitable for future electronic applications. Contrary to their two-dimensional (2D) parent material graphene, which exhibits semimetallic behavior, GNRs with widths smaller than 10 nm are predicted to be semiconductors due to quantum confinement and edge effects. Despite significant advances in GNR fabrication using chemical, sonochemical and lithographic methods as well as recent reports on the successful unzipping of carbon nanotubes into GNRs, the production of sub-10 nm GNRs with chemical precision remains a major challenge. In this talk, we will present a simple GNR fabrication method that allows for the production of atomically precise GNRs of different topologies and widths. Our bottom-up approach consists in the surface-assisted coupling of suitably designed molecular precursors into linear polyphenylenes and their subsequent cyclodehydrogenation, and results in GNRs whose topology, width and edge periphery are defined by the precursor monomers. By means of STM and Raman characterization, we demonstrate that this fabrication process allows for the atomically precise fabrication of complex GNR topologies. Furthermore, we have developed a reliable procedure to transfer GNRs fabricated on metal surfaces onto other substrates. It will for example be shown that millimeter sized sheets of crosslinked GNRs can be transferred onto silicon wafers, making them available for further processing, e.g. by lithography, prototype device fabrication and characterization. Coauthors: Pascal Ruffieux, Rached Jaafar, Marco Bieri, Thomas Braun, and Stephan Blankenburg, Empa, Swiss Federal Laboratories for Materials Science and Technology, 3602 Thun and 8600 Dübendorf, Switzerland; Matthias Muoth, ETH Zurich, Department of Mechanical and Process Engineering, 8092 Zurich, Switzerland; Ari P. Seitsonen, University of Zurich, Physical Chemistry Institute, 8057

  8. A combined bottom-up/top-down approach to prepare a sterile injectable nanosuspension.

    Science.gov (United States)

    Hu, Xi; Chen, Xi; Zhang, Ling; Lin, Xia; Zhang, Yu; Tang, Xing; Wang, Yanjiao

    2014-09-10

    To prepare a uniform nanosuspension of strongly hydrophobic riboflavin laurate (RFL) allowing sterile filtration, physical modification (bottom-up) was combined with a high-pressure homogenization (top-down) method. Unlike other bottom-up approaches, physical modification with surfactants (TPGS and PL-100) by lyophilization controlled crystallization and compensated for the poor wettability of RFL. On the one hand, crystal growth and aggregation during freezing were restricted by a stabilizer layer adsorbed on the drug surface through hydrophobic interaction. On the other hand, subsequent crystallization of the drug during the sublimation process was limited to the interstitial spaces between solvent crystals. After lyophilization, a modified drug with a smaller particle size and better wettability was obtained. When surfactant solution was added, water molecules passed between the hydrophilic groups of the surface-active molecules and activated the polymer chains, allowing them to stretch into the water. The coarse suspension was crushed into a nanosuspension (MP=162 nm) by high-pressure homogenization. For long-term stability, lyophilization was applied again to solidify the nanosuspension (with sorbitol as cryoprotectant). A slight crystal growth to about 600 nm was obtained, allowing slow release for a sustained effect after intramuscular administration. Moreover, the absence of paw-licking responses and only very slight muscular inflammation demonstrated the excellent biocompatibility of this long-acting RFL injection.

  9. The ideological divide and climate change opinion: "top-down" and "bottom-up" approaches.

    Science.gov (United States)

    Jacquet, Jennifer; Dietrich, Monica; Jost, John T

    2014-01-01

    The United States wields disproportionate global influence in terms of carbon dioxide emissions and international climate policy. This makes it an especially important context in which to examine the interplay among social, psychological, and political factors in shaping attitudes and behaviors related to climate change. In this article, we review the emerging literature addressing the liberal-conservative divide in the U.S. with respect to thought, communication, and action concerning climate change. Because of its theoretical and practical significance, we focus on the motivational basis for skepticism and inaction on the part of some, including "top-down" institutional forces, such as corporate strategy, and "bottom-up" psychological factors, such as ego, group, and system justification. Although more research is needed to elucidate fully the social, cognitive, and motivational bases of environmental attitudes and behavior, a great deal has been learned in just a few years by focusing on specific ideological factors in addition to general psychological principles.

  10. Pre-stimulus activity predicts the winner of top-down vs. bottom-up attentional selection.

    Directory of Open Access Journals (Sweden)

    Ali Mazaheri

    Full Text Available Our ability to process visual information is fundamentally limited. This leads to competition between sensory information that is relevant for top-down goals and sensory information that is perceptually salient, but task-irrelevant. The aim of the present study was to identify, from EEG recordings, pre-stimulus and pre-saccadic neural activity that could predict whether top-down or bottom-up processes would win the competition for attention on a trial-by-trial basis. We employed a visual search paradigm in which a lateralized low contrast target appeared alone, or with a low (i.e., non-salient) or high contrast (i.e., salient) distractor. Trials with a salient distractor were of primary interest due to the strong competition between top-down knowledge and bottom-up attentional capture. Our results demonstrated that 1) in the 1-sec pre-stimulus interval, frontal alpha (8-12 Hz) activity was higher on trials where the salient distractor captured attention and the first saccade (bottom-up win); and 2) there was a transient pre-saccadic increase in posterior-parietal alpha (7-8 Hz) activity on trials where the first saccade went to the target (top-down win). We propose that the high frontal alpha reflects a disengagement of attentional control whereas the transient posterior alpha time-locked to the saccade indicates sensory inhibition of the salient distractor and suppression of bottom-up oculomotor capture.

  11. Pre-stimulus activity predicts the winner of top-down vs. bottom-up attentional selection.

    Science.gov (United States)

    Mazaheri, Ali; DiQuattro, Nicholas E; Bengson, Jesse; Geng, Joy J

    2011-02-28

    Our ability to process visual information is fundamentally limited. This leads to competition between sensory information that is relevant for top-down goals and sensory information that is perceptually salient, but task-irrelevant. The aim of the present study was to identify, from EEG recordings, pre-stimulus and pre-saccadic neural activity that could predict whether top-down or bottom-up processes would win the competition for attention on a trial-by-trial basis. We employed a visual search paradigm in which a lateralized low contrast target appeared alone, or with a low (i.e., non-salient) or high contrast (i.e., salient) distractor. Trials with a salient distractor were of primary interest due to the strong competition between top-down knowledge and bottom-up attentional capture. Our results demonstrated that 1) in the 1-sec pre-stimulus interval, frontal alpha (8-12 Hz) activity was higher on trials where the salient distractor captured attention and the first saccade (bottom-up win); and 2) there was a transient pre-saccadic increase in posterior-parietal alpha (7-8 Hz) activity on trials where the first saccade went to the target (top-down win). We propose that the high frontal alpha reflects a disengagement of attentional control whereas the transient posterior alpha time-locked to the saccade indicates sensory inhibition of the salient distractor and suppression of bottom-up oculomotor capture.

  12. Developing a comprehensive and comparative questionnaire for measuring personality in chimpanzees using a simultaneous top-down/bottom-up design.

    Science.gov (United States)

    Freeman, Hani D; Brosnan, Sarah F; Hopper, Lydia M; Lambeth, Susan P; Schapiro, Steven J; Gosling, Samuel D

    2013-10-01

    One effective method for measuring personality in primates is to use personality trait ratings to distill the experience of people familiar with the individual animals. Previous rating instruments were created using either top-down or bottom-up approaches. Top-down approaches, which essentially adapt instruments originally designed for use with another species, can unfortunately lead to the inclusion of traits irrelevant to chimpanzees or fail to include all relevant aspects of chimpanzee personality. Conversely, because bottom-up approaches derive traits specifically for chimpanzees, their unique items may impede comparisons with findings in other studies and other species. To address the limitations of each approach, we developed a new personality rating scale using a combined top-down/bottom-up design. Seventeen raters rated 99 chimpanzees on the new 41-item scale, with all but one item being rated reliably. Principal components analysis, using both varimax and direct oblimin rotations, identified six broad factors. Strong evidence was found for five of the factors (Reactivity/Undependability, Dominance, Openness, Extraversion, and Agreeableness). A sixth factor (Methodical) was offered provisionally until more data are collected. We validated the factors against behavioral data collected independently on the chimpanzees. The five factors demonstrated good evidence for convergent and predictive validity, thereby underscoring the robustness of the factors. Our combined top-down/bottom-up approach provides the most extensive data to date to support the universal existence of these five personality factors in chimpanzees. This framework, which facilitates cross-species comparisons, can also play a vital role in understanding the evolution of personality and can assist with husbandry and welfare efforts.
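
    As an illustrative sketch (simulated ratings, not the study's data or analysis code), the factor-extraction step can be reproduced with an ordinary principal components analysis; the varimax or oblimin rotation reported in the study would then be applied to the resulting loadings:

        import numpy as np
        from sklearn.decomposition import PCA

        rng = np.random.default_rng(42)
        ratings = rng.integers(1, 8, size=(99, 41)).astype(float)  # 99 chimpanzees x 41 items (simulated)

        # Standardise items, then extract six components as in the questionnaire study.
        z = (ratings - ratings.mean(axis=0)) / ratings.std(axis=0)
        pca = PCA(n_components=6)
        scores = pca.fit_transform(z)      # per-animal component scores
        loadings = pca.components_.T       # item loadings (rotate these afterwards)
        print(pca.explained_variance_ratio_.round(2))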

  13. Towards nano-organic chemistry: perspectives for a bottom-up approach to the synthesis of low-dimensional carbon nanostructures

    Science.gov (United States)

    Mercuri, Francesco; Baldoni, Matteo; Sgamellotti, Antonio

    2012-01-01

    Low-dimensional carbon nanostructures, such as nanotubes and graphenes, represent one of the most promising classes of materials in view of their potential use in nanotechnology. However, their exploitation in applications is often hindered by difficulties in their synthesis and purification. Despite the huge efforts of the research community, the production of nanostructured carbon materials with controlled properties is still beyond reach. Nonetheless, this step is now mandatory for significant progress in the realization of advanced applications and devices based on low-dimensional carbon nanostructures. Although promising alternative routes for the fabrication of nanostructured carbon materials have recently been proposed, a comprehensive understanding of the key factors governing the bottom-up assembly of simple precursors to form complex systems with tailored properties is still at an early stage. In this paper, following a survey of recent experimental efforts in the bottom-up synthesis of carbon nanostructures, we attempt to clarify generalized criteria for the design of suitable precursors that can be used as building blocks in the production of complex systems based on sp2 carbon atoms, and discuss potential synthetic strategies. In particular, the approaches presented in this feature article are based on the application of concepts borrowed from traditional organic chemistry, such as valence-bond theory and Clar sextet theory, and on their extension to the case of complex carbon nanomaterials. We also present and discuss a validation of these approaches through first-principles calculations on prototypical systems. Detailed studies on the processes involved in the bottom-up fabrication of low-dimensional carbon nanostructures are expected to pave the way for the design and optimization of precursors and efficient synthetic routes, thus allowing the development of novel materials with controlled morphology and properties that can be used in

  14. Charge transport in bottom-up inorganic-organic and quantum-coherent nanostructures

    NARCIS (Netherlands)

    Makarenko, Ksenia Sergeevna

    2015-01-01

    This thesis is based on results obtained from experiments designed for a consistent study of charge transport in bottom-up inorganic-organic and quantum-coherent nanostructures. New unconventional ways to build elements of electrical circuits (like dielectrophoresis, wedging transfer and bottom-up f

  15. Ion mobility tandem mass spectrometry enhances performance of bottom-up proteomics.

    Science.gov (United States)

    Helm, Dominic; Vissers, Johannes P C; Hughes, Christopher J; Hahne, Hannes; Ruprecht, Benjamin; Pachl, Fiona; Grzyb, Arkadiusz; Richardson, Keith; Wildgoose, Jason; Maier, Stefan K; Marx, Harald; Wilhelm, Mathias; Becher, Isabelle; Lemeer, Simone; Bantscheff, Marcus; Langridge, James I; Kuster, Bernhard

    2014-12-01

    One of the limiting factors in determining the sensitivity of tandem mass spectrometry using hybrid quadrupole orthogonal acceleration time-of-flight instruments is the duty cycle of the orthogonal ion injection system. As a consequence, only a fraction of the generated fragment ion beam is collected by the time-of-flight analyzer. Here we describe a method utilizing postfragmentation ion mobility spectrometry of peptide fragment ions in conjunction with mobility time synchronized orthogonal ion injection leading to a substantially improved duty cycle and a concomitant improvement in sensitivity of up to 10-fold for bottom-up proteomic experiments. This enabled the identification of 7500 human proteins within 1 day and 8600 phosphorylation sites within 5 h of LC-MS/MS time. The method also proved powerful for multiplexed quantification experiments using tandem mass tags exemplified by the chemoproteomic interaction analysis of histone deacetylases with Trichostatin A.

  16. Bottom-up effects of a no-take zone on endangered penguin demographics.

    Science.gov (United States)

    Sherley, Richard B; Winker, Henning; Altwegg, Res; van der Lingen, Carl D; Votier, Stephen C; Crawford, Robert J M

    2015-07-01

    Marine no-take zones can have positive impacts for target species and are increasingly important management tools. However, whether they indirectly benefit higher order predators remains unclear. The endangered African penguin (Spheniscus demersus) depends on commercially exploited forage fish. We examined how chick survival responded to an experimental 3-year fishery closure around Robben Island, South Africa, controlling for variation in prey biomass and fishery catches. Chick survival increased by 18% when the closure was initiated, which alone led to a predicted 27% higher population compared with continued fishing. However, the modelled population continued to decline, probably because of high adult mortality linked to poor prey availability over larger spatial scales. Our results illustrate that small no-take zones can have bottom-up benefits for highly mobile marine predators, but are only one component of holistic, ecosystem-based management regimes.

  17. Comparing top-down and bottom-up costing approaches for economic evaluation within social welfare.

    Science.gov (United States)

    Olsson, Tina M

    2011-10-01

    This study compares two approaches to the estimation of social welfare intervention costs: one "top-down" and the other "bottom-up" for a group of social welfare clients with severe problem behavior participating in a randomized trial. Intervention costs ranging over a two-year period were compared by intervention category (foster care placement, institutional placement, mentorship services, individual support services and structured support services), estimation method (price, micro costing, average cost) and treatment group (intervention, control). Analyses are based upon 2007 costs for 156 individuals receiving 404 interventions. Overall, both approaches were found to produce reliable estimates of intervention costs at the group level but not at the individual level. As choice of approach can greatly impact the estimate of mean difference, adjustment based on estimation approach should be incorporated into sensitivity analyses. Analysts must take care in assessing the purpose and perspective of the analysis when choosing a costing approach for use within economic evaluation.
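
    The contrast between the two costing approaches can be illustrated with a toy calculation (all figures invented): bottom-up micro-costing multiplies recorded quantities by unit prices per client, while top-down costing apportions a known programme budget across the clients served.

        # Bottom-up: unit price x measured quantity, summed for one client.
        interventions = [            # (category, units used, unit price)
            ("foster_care_days",        120, 150.0),
            ("mentorship_hours",         40,  60.0),
            ("individual_support_hours", 25,  80.0),
        ]
        bottom_up_cost = sum(units * price for _, units, price in interventions)

        # Top-down: total programme budget apportioned by clients served.
        programme_budget, clients_served = 3_000_000.0, 150
        top_down_cost = programme_budget / clients_served

        print(f"bottom-up: {bottom_up_cost:.0f}  top-down: {top_down_cost:.0f}")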

  18. Highly directional bottom-up 3D nanoantenna for visible light.

    Science.gov (United States)

    Tong, L; Pakizeh, T; Feuz, L; Dmitriev, A

    2013-01-01

    Controlling light at the nanoscale is of fundamental importance and is essential for applications ranging from optical sensing and metrology to information processing, communications, and quantum optics. Considerable efforts are currently directed towards optical nanoantennas that directionally convert light into strongly localized energy and vice versa. Here we present a highly directional 3D nanoantenna operating with visible light. We demonstrate a simple bottom-up approach to produce macroscopic arrays of such nanoantennas and present a way to address their functionality via interaction with quantum dots (QDs) properly embedded in the structure of the nanoantenna. The ease and accessibility of this structurally robust optical antenna device prompt its use as an affordable test bed for concepts in nano-optics and nanophotonics applications.

  19. A bottom up approach for engineering catchments through sustainable runoff management

    Science.gov (United States)

    Wilkinson, M.; Quinn, P. F.; Jonczyk, J.; Burke, S.

    2010-12-01

    developed that puts in place novel measures to tackle diffuse pollution and reduce flood risk whilst collecting the science needed to influence policy about these measures. This has been possible through four key practices: full stakeholder engagement, a problem-solving agenda set in place, a bottom-up approach to solving problems, and the collection of the appropriate science to support the benefits. Hands-on, multi-objective work is the most cost-effective way to manage catchments. Tackling water quality issues and controlling fast-pathway runoff at the source, in partnership with farmers and local landowners, has proved to be the key to success. Tackling issues in sub-catchments can lead to solving problems at the catchment scale.

  20. Bottom-up communication. Identifying opportunities and limitations through an exploratory field-based evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, C.; Irvine, K.N. [Institute of Energy and Sustainable Development, De Montfort University, Leicester, LE1 9BH (United Kingdom)

    2013-02-15

    Communication to promote behaviours like energy saving can use significant resources. What is less clear is the comparative value of different approaches available to communicators. While it is generally agreed that 'bottom-up' approaches, where individuals are actively involved rather than passive, are preferable to 'top-down' authority-led projects, there is a dearth of evidence that verifies why this should be. Additionally, while the literature has examined the mechanics of the different approaches, there has been less attention paid to the associated psychological implications. This paper reports on an exploratory comparative study that examined the effects of six distinct communication activities. The activities used different communication approaches, some participative and others more top-down informational. Two theories, from behavioural studies and communication, were used to identify key variables for consideration in this field-based evaluation. The evaluation aimed to assess not just which activity might be most successful, as this has limited generalisability, but to also gain insight into what psychological impacts might contribute to success. Analysis found support for the general hypothesis that bottom-up approaches have more impact on behaviour change than top-down. The study also identified that, in this instance, the difference in reported behaviour across the activities related partly to the extent to which intentions to change behaviour were implemented. One possible explanation for the difference in reported behaviour change across the activities is that a bottom-up approach may offer a supportive environment where participants can discuss progress with like-minded individuals. A further possible explanation is that despite controlling for intention at an individual level, the pre-existence of strong intentions may have an effect on group success. These suggestive findings point toward the critical need for additional and larger-scale studies

  1. Reconciling Top Down and Bottom Up Approaches to Understand Land Carbon Cycle Variability

    Science.gov (United States)

    Collatz, G. J.; Gurney, K. R.; Denning, A. S.; Randerson, J. T.; van der Werf, G. R.

    2004-12-01

    Two fundamentally different approaches for estimating global carbon sources and sinks have been used over the past 15 years. The so-called "top-down" approach involves analysis of atmospheric composition and often includes inversions of atmospheric transport. Bottom-up approaches, on the other hand, involve using carbon cycle process models driven by various observational data. Reconciling the results of these two approaches can provide powerful constraints on each, but is challenging because of the large uncertainties in atmospheric measurements and transport and in our understanding of the processes controlling the biogeochemical cycling of carbon. Recently, the Atmospheric Carbon Inversion Intercomparison (TransCom 3) completed mean seasonal cycle and interannual variability inversions using 12 transport models. Their results include predictions of biogeochemically driven net carbon fluxes, with associated uncertainties, for the globe divided into 22 regions, half of which are land regions. The cyclo-stationary inversions predicted the mean seasonal cycle as well as the mean sink/source of each region. The interannual inversions predicted the interannual variability in the sources and sinks for each region between 1980 and 2000. This study describes an analysis of the processes controlling biogeochemically driven net carbon fluxes over the seasonal cycle for each of the TransCom land regions. The processes considered are those included in the CASA biogeochemical model. The seasonally variable model inputs include NDVI, temperature, precipitation, solar radiation, and burned area. The contributions of NPP, heterotrophic respiration and fire season to the seasonal cycle are evaluated for each of the 11 TransCom 3 land regions. We prescribed plausible scenarios in the biogeochemical model to evaluate the mechanisms responsible for the size and seasonality of the mean annual carbon sinks reported by TransCom 3. Initial results will also be presented for

  2. Bottom-Up Energy Analysis System (BUENAS). An international appliance efficiency policy tool

    Energy Technology Data Exchange (ETDEWEB)

    McNeil, M.A.; Letschert, V.E.; De la Rue du Can, S.; Ke, Jing [Lawrence Berkeley National Laboratory LBNL, 1 Cyclotron Rd, Berkeley, CA (United States)

    2013-02-15

    The Bottom-Up Energy Analysis System (BUENAS) calculates potential energy and greenhouse gas emission impacts of efficiency policies for lighting, heating, ventilation, and air conditioning, appliances, and industrial equipment through 2030. The model includes 16 end use categories and covers 11 individual countries plus the European Union. BUENAS is a bottom-up stock accounting model that predicts energy consumption for each type of equipment in each country according to engineering-based estimates of annual unit energy consumption, scaled by projections of equipment stock. Energy demand in each scenario is determined by equipment stock, usage, intensity, and efficiency. When available, BUENAS uses sales forecasts taken from country studies to project equipment stock. Otherwise, BUENAS uses an econometric model of household appliance uptake developed by the authors. Once the business as usual scenario is established, a high-efficiency policy scenario is constructed that includes an improvement in the efficiency of equipment installed in 2015 or later. Policy case efficiency targets represent current 'best practice' and include standards already established in a major economy or well-defined levels known to enjoy a significant market share in a major economy. BUENAS calculates energy savings according to the difference in energy demand in the two scenarios. Greenhouse gas emission mitigation is then calculated using a forecast of electricity carbon factor. We find that mitigation of 1075 mt annual CO2 emissions is possible by 2030 from adopting current best practices of appliance efficiency policies. This represents a 17 % reduction in emissions in the business as usual case in that year.
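
    The stock-accounting arithmetic summarized above can be sketched in a few lines. All names and numbers below are illustrative placeholders rather than BUENAS inputs; the point is only the structure: demand = stock × unit energy consumption, savings = business-as-usual minus policy case, and mitigation = savings × carbon factor.

```python
def annual_energy_kwh(stock_units, unit_energy_kwh):
    """Energy demand of one equipment class = stock x unit energy consumption."""
    return stock_units * unit_energy_kwh

stock_2030 = 50e6        # hypothetical appliance stock in 2030 (units)
uec_bau = 400.0          # business-as-usual unit energy consumption (kWh/yr)
uec_policy = 300.0       # hypothetical 'best practice' efficiency target (kWh/yr)
carbon_factor = 0.0005   # assumed electricity carbon factor (tCO2 per kWh)

bau = annual_energy_kwh(stock_2030, uec_bau)
policy = annual_energy_kwh(stock_2030, uec_policy)

savings_kwh = bau - policy
mitigation_tco2 = savings_kwh * carbon_factor
print(f"savings: {savings_kwh / 1e9:.1f} TWh/yr, mitigation: {mitigation_tco2 / 1e6:.1f} Mt CO2/yr")
```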

  3. A bottom-up approach of stochastic demand allocation in water quality modelling

    Directory of Open Access Journals (Sweden)

    E. J. M. Blokker

    2010-04-01

    Full Text Available An "all pipes" hydraulic model of a drinking water distribution system was constructed with two types of demand allocations. One is constructed with the conventional top-down approach, i.e. a demand multiplier pattern from the booster station is allocated to all demand nodes with a correction factor to account for the average water demand on that node. The other is constructed with a bottom-up approach of demand allocation, i.e., each individual home is represented by one demand node with its own stochastic water demand pattern. This was done for a drinking water distribution system of approximately 10 km of mains and serving ca. 1000 homes. The system was tested in a real life situation.

    The stochastic water demand patterns were constructed with the end-use model SIMDEUM on a per second basis and per individual home. Before applying the demand patterns in a network model, some temporal aggregation was done. The flow entering the test area was measured and a tracer test with sodium chloride was performed to determine travel times. The two models were validated on the total sum of demands and on travel times.

    The study showed that the bottom-up approach leads to realistic water demand patterns and travel times, without the need for any flow measurements or calibration. In the periphery of the drinking water distribution system it is not possible to calibrate models on pressure, because head losses are too low. The study shows that in the periphery it is also difficult to calibrate on water quality (e.g. with tracer measurements), as a consequence of the high variability between days. The stochastic approach of hydraulic modelling gives insight into the variability of travel times as an added feature beyond the conventional way of modelling.
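
    The contrast between the two allocation strategies can be sketched as follows. The per-home demand patterns are random placeholders (the study generated them with SIMDEUM at one-second resolution) and the pipe network itself is omitted; the sketch only illustrates why the system totals agree while per-node variability differs.

```python
import numpy as np

rng = np.random.default_rng(1)
n_homes, n_steps = 1000, 24 * 60   # one day at 1-minute resolution (aggregated from 1 s)

# Bottom-up: every home gets its own stochastic demand pattern (L/min).
home_patterns = rng.gamma(shape=0.3, scale=2.0, size=(n_homes, n_steps))
bottom_up_total = home_patterns.sum(axis=0)

# Top-down: one multiplier pattern measured at the booster station, scaled to each
# node by a correction factor for that node's average demand.
multiplier = bottom_up_total / bottom_up_total.mean()
avg_demand_per_home = home_patterns.mean(axis=1)
top_down_node = np.outer(avg_demand_per_home, multiplier)   # same pattern shape at every node

# Both allocations reproduce the same total inflow, but only the bottom-up one
# preserves realistic variability at the individual demand nodes.
assert np.allclose(top_down_node.sum(axis=0), bottom_up_total)
```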

  4. The Interplay of Top-Down and Bottom-Up

    DEFF Research Database (Denmark)

    Winkler, Till; Brown, Carol V.; Ozturk, Pinar

    2014-01-01

    positions before the HITECH funding. Based on our analyses of interview data collected from 34 leaders at the state, HIO, and provider level, our objective is to develop a model of contextual and operational factors that influence the sustainability of HIOs. The implications of our findings for other...

  5. Mapping practices of project management – merging top-down and bottom-up perspectives

    DEFF Research Database (Denmark)

    Thuesen, Christian

    2015-01-01

    This paper presents a new methodology for studying different accounts of project management practices based on network mapping and analysis. Drawing upon network mapping and visualization as an analytical strategy top-down and bottom-up accounts of project management practice are analysed...... and compared. The analysis initially reveals a substantial difference between the top-down and bottom-up accounts of practice. Furthermore it identifies a soft side of project management that is central in the bottom-up account but absent from the top-down. Finally, the study shows that network mapping...... is a promising strategy for visualizing and analysing different accounts of project management practices....

  6. Bottom-up regulation of capelin, a keystone forage species.

    Directory of Open Access Journals (Sweden)

    Alejandro D Buren

    Full Text Available The Northwest Atlantic marine ecosystem off Newfoundland and Labrador, Canada, has been commercially exploited for centuries. Although periodic declines in various important commercial fish stocks have been observed in this ecosystem, the most drastic changes took place in the early 1990s when the ecosystem structure changed abruptly and has not returned to its previous configuration. In the Northwest Atlantic, food web dynamics are determined largely by capelin (Mallotus villosus), the focal forage species which links primary and secondary producers with the higher trophic levels. Notwithstanding the importance of capelin, the factors that influence its population dynamics have remained elusive. We found that a regime shift and ocean climate, acting via food availability, have discernible impacts on the regulation of this population. Capelin biomass and timing of spawning were well explained by a regime shift and seasonal sea ice dynamics, a key determinant of the pelagic spring bloom. Our findings are important for the development of ecosystem approaches to fisheries management and raise questions on the potential impacts of climate change on the structure and productivity of this marine ecosystem.

  7. Social and ethical checkpoints for bottom-up synthetic biology, or protocells.

    Science.gov (United States)

    Bedau, Mark A; Parke, Emily C; Tangen, Uwe; Hantsche-Tangen, Brigitte

    2009-12-01

    An alternative to creating novel organisms through the traditional "top-down" approach to synthetic biology involves creating them from the "bottom up" by assembling them from non-living components; the products of this approach are called "protocells." In this paper we describe how bottom-up and top-down synthetic biology differ, review the current state of protocell research and development, and examine the unique ethical, social, and regulatory issues raised by bottom-up synthetic biology. Protocells have not yet been developed, but many expect this to happen within the next five to ten years. Accordingly, we identify six key checkpoints in protocell development at which particular attention should be given to specific ethical, social and regulatory issues concerning bottom-up synthetic biology, and make ten recommendations for responsible protocell science that are tied to the achievement of these checkpoints.

  8. Painful faces-induced attentional blink modulated by top-down and bottom-up mechanisms

    OpenAIRE

    2015-01-01

    Pain-related stimuli can capture attention in an automatic (bottom-up) or intentional (top-down) fashion. Previous studies have examined attentional capture by pain-related information using spatial attention paradigms that involve mainly a bottom-up mechanism. In the current study, we investigated the pain information–induced attentional blink (AB) using a rapid serial visual presentation (RSVP) task, and compared the effects of task-irrelevant and task-relevant pain distractors. Relationshi...

  9. Quantifying the uncertainties of a bottom-up emission inventory of anthropogenic atmospheric pollutants in China

    Science.gov (United States)

    Zhao, Y.; Nielsen, C. P.; Lei, Y.; McElroy, M. B.; Hao, J.

    2011-03-01

    The uncertainties of a national, bottom-up inventory of Chinese emissions of anthropogenic SO2, NOx, and particulate matter (PM) of different size classes and carbonaceous species are comprehensively quantified, for the first time, using Monte Carlo simulation. The inventory is structured by seven dominant sectors: coal-fired electric power, cement, iron and steel, other industry (boiler combustion), other industry (non-combustion processes), transportation, and residential. For each parameter related to emission factors or activity-level calculations, the uncertainties, represented as probability distributions, are either statistically fitted using results of domestic field tests or, when these are lacking, estimated based on foreign or other domestic data. The uncertainties (i.e., 95% confidence intervals around the central estimates) of Chinese emissions of SO2, NOx, total PM, PM10, PM2.5, black carbon (BC), and organic carbon (OC) in 2005 are estimated to be -14%~13%, -13%~37%, -11%~38%, -14%~45%, -17%~54%, -25%~136%, and -40%~121%, respectively. Variations at activity levels (e.g., energy consumption or industrial production) are not the main source of emission uncertainties. Due to narrow classification of source types, large sample sizes, and relatively high data quality, the coal-fired power sector is estimated to have the smallest emission uncertainties for all species except BC and OC. Due to poorer source classifications and a wider range of estimated emission factors, considerable uncertainties of NOx and PM emissions from cement production and boiler combustion in other industries are found. The probability distributions of emission factors for biomass burning, the largest source of BC and OC, are fitted based on very limited domestic field measurements, and special caution should thus be taken interpreting these emission uncertainties. Although Monte Carlo simulation yields narrowed estimates of uncertainties compared to previous bottom-up emission
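
    A minimal sketch of the Monte Carlo procedure for a single source category is given below, assuming emissions = activity level × emission factor with placeholder distributions; the actual inventory uses distributions fitted to field-test data for every parameter and sector.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Placeholder distributions for one source category.
activity = rng.normal(loc=1.0e9, scale=0.05e9, size=n)                 # e.g. tonnes of fuel burned
emission_factor = rng.lognormal(mean=np.log(5.0), sigma=0.3, size=n)   # kg pollutant per tonne

emissions_kt = activity * emission_factor / 1e9

central = np.median(emissions_kt)
lo, hi = np.percentile(emissions_kt, [2.5, 97.5])
print(f"central {central:.0f} kt, 95% CI {100 * (lo / central - 1):+.0f}% ~ {100 * (hi / central - 1):+.0f}%")
```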

  10. Bottoms Up

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    China’s high-end liquor is becoming a luxury item and a favorite among collectors spring Festival, the most important festival for the Chinese, is a time for celebration—and what would a celebration be without bottles of holi-

  11. A 'bottom-up' approach to aetiological research in autism spectrum disorders

    Directory of Open Access Journals (Sweden)

    Lisa Marie Unwin

    2013-09-01

    Full Text Available Autism Spectrum Disorders (ASD) are currently diagnosed in the presence of impairments in social interaction and communication, and a restricted range of activities and interests. However, there is considerable variability in the behaviours of different individuals with an ASD diagnosis. The heterogeneity spans the entire range of IQ and language abilities, as well as other behavioural, communicative and social functions. While any psychiatric condition is likely to incorporate a degree of heterogeneity, the variability in the nature and severity of behaviours observed in ASD is thought to exceed that of other disorders. The current paper aims to provide a model for future research into ASD subgroups. In doing so, we examined whether two proposed risk factors – low birth weight (LBW), and in-utero exposure to selective serotonin reuptake inhibitors (SSRIs) – are associated with greater behavioural homogeneity. Using data from the Western Australian Autism Biological Registry, this study found that LBW and maternal SSRI use during pregnancy were associated with greater sleep disturbances and a greater number of gastrointestinal complaints in children with ASD, respectively. The findings from this ‘proof of principle’ paper provide support for this 'bottom-up' approach as a feasible method for creating homogenous groups.

  12. Rational design of modular circuits for gene transcription: A test of the bottom-up approach

    Directory of Open Access Journals (Sweden)

    Giordano Emanuele

    2010-11-01

    Full Text Available Background: Most of the synthetic circuits developed so far have been designed by an ad hoc approach, using a small number of components (i.e. LacI, TetR) and a trial and error strategy. We are at the point where an increasing number of modular, interchangeable and well-characterized components is needed to expand the construction of synthetic devices and to allow a rational approach to the design. Results: We used interchangeable modular biological parts to create a set of novel synthetic devices for controlling gene transcription, and we developed a mathematical model of the modular circuits. Model parameters were identified by experimental measurements from a subset of modular combinations. The model revealed an unexpected feature of the lactose repressor system, i.e. a residual binding affinity for the operator site by induced lactose repressor molecules. Once this residual affinity was taken into account, the model properly reproduced the experimental data from the training set. The parameters identified in the training set allowed the prediction of the behavior of networks not included in the identification procedure. Conclusions: This study provides new quantitative evidence that the use of independent and well-characterized biological parts and mathematical modeling, what is called a bottom-up approach to the construction of gene networks, can allow the design of new and different devices re-using the same modular parts.
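
    A minimal steady-state sketch of the effect highlighted above, a repressible promoter whose induced repressor retains residual affinity for the operator, is shown below. The Hill-type rate law and all parameter values are assumptions made for illustration, not the model or parameters identified in the paper.

```python
def transcription_rate(active_rep, induced_rep, k_max=100.0, K=10.0, n=2.0, residual=0.1):
    """Promoter output (arbitrary units). `residual` scales the leftover operator
    affinity of inducer-bound (induced) repressor molecules."""
    effective = active_rep + residual * induced_rep
    return k_max / (1.0 + (effective / K) ** n)

# Fully induced system: little free repressor, but a large induced pool still
# represses weakly because of its residual affinity for the operator.
print(round(transcription_rate(active_rep=1.0, induced_rep=50.0), 1))                 # ~73.5
print(round(transcription_rate(active_rep=1.0, induced_rep=50.0, residual=0.0), 1))   # ~99.0
```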

  13. Self-assembled nanostructured resistive switching memory devices fabricated by templated bottom-up growth.

    Science.gov (United States)

    Song, Ji-Min; Lee, Jang-Sik

    2016-01-07

    Metal-oxide-based resistive switching memory devices have been studied intensively due to their potential to satisfy the requirements of next-generation memory devices. Active research has been done on the materials and device structures of resistive switching memory devices that meet the requirements of high density, fast switching speed, and reliable data storage. In this study, resistive switching memory devices were fabricated with nano-template-assisted bottom-up growth. Electrochemical deposition was adopted to achieve the bottom-up growth of nickel nanodot electrodes. A nickel oxide layer was formed by oxygen plasma treatment of nickel nanodots at low temperature. The structures of the fabricated nanoscale memory devices were analyzed with scanning electron microscopy and atomic force microscopy (AFM). The electrical characteristics of the devices were directly measured using conductive AFM. This work demonstrates the fabrication of resistive switching memory devices using self-assembled nanoscale masks and nanomaterial growth by bottom-up electrochemical deposition.

  14. Social and ethical checkpoints for bottom-up synthetic biology, or protocells

    OpenAIRE

    2009-01-01

    An alternative to creating novel organisms through the traditional “top-down” approach to synthetic biology involves creating them from the “bottom up” by assembling them from non-living components; the products of this approach are called “protocells.” In this paper we describe how bottom-up and top-down synthetic biology differ, review the current state of protocell research and development, and examine the unique ethical, social, and regulatory issues raised by bottom-up synthetic biology....

  15. Quantifying the uncertainties of a bottom-up emission inventory of anthropogenic atmospheric pollutants in China

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2010-11-01

    Full Text Available The uncertainties of a national, bottom-up inventory of Chinese emissions of anthropogenic SO2, NOx, and particulate matter (PM) of different size classes and carbonaceous species are comprehensively quantified, for the first time, using Monte Carlo simulation. The inventory is structured by seven dominant sectors: coal-fired electric power, cement, iron and steel, other industry (boiler combustion), other industry (non-combustion processes), transportation, and residential. For each parameter related to emission factors or activity-level calculations, the uncertainties, represented as probability distributions, are either statistically fitted using results of domestic field tests or, when these are lacking, estimated based on foreign or other domestic data. The uncertainties (i.e., 95% confidence intervals around the central estimates) of Chinese emissions of SO2, NOx, total PM, PM10, PM2.5, black carbon (BC), and organic carbon (OC) in 2005 are estimated to be −14%~12%, −10%~36%, −10%~36%, −12%~42%, −16%~52%, −23%~130%, and −37%~117%, respectively. Variations at activity levels (e.g., energy consumption or industrial production) are not the main source of emission uncertainties. Due to narrow classification of source types, large sample sizes, and relatively high data quality, the coal-fired power sector is estimated to have the smallest emission uncertainties for all species except BC and OC. Due to poorer source classifications and a wider range of estimated emission factors, considerable uncertainties of NOx and PM emissions from cement production and boiler combustion in other industries are found. The probability distributions of emission factors for biomass burning, the largest source of BC and OC, are fitted based on very limited domestic field measurements, and special caution should thus be taken interpreting these emission uncertainties. Although Monte

  16. An integrated top-down and bottom-up strategy for characterization protein isoforms and modifications

    Energy Technology Data Exchange (ETDEWEB)

    Wu, Si; Tolic, Nikola; Tian, Zhixin; Robinson, Errol W.; Pasa-Tolic, Ljiljana

    2011-04-15

    Bottom-up and top-down strategies are two commonly used methods for mass spectrometry (MS) based protein identification; each method has its own advantages and disadvantages. In this chapter, we describe an integrated top-down and bottom-up approach facilitated by concurrent liquid chromatography-mass spectrometry (LC-MS) analysis and fraction collection for comprehensive high-throughput intact protein profiling. The approach employs a high resolution reversed phase (RP) LC separation coupled with LC eluent fraction collection and concurrent on-line MS with a high field (12 Tesla) Fourier-transform ion cyclotron resonance (FTICR) mass spectrometer. Protein elution profiles and tentative modified protein identifications are made using the detected intact protein mass in conjunction with bottom-up protein identifications from the enzymatic digestion and analysis of corresponding LC fractions. Specific proteins of biological interest are incorporated into a target ion list for subsequent off-line gas-phase fragmentation that uses an aliquot of the originally collected LC fraction; another aliquot was used for the bottom-up analysis.

  17. Oriented bottom-up growth of armchair graphene nanoribbons on germanium

    Science.gov (United States)

    Arnold, Michael Scott; Jacobberger, Robert Michael

    2016-03-15

    Graphene nanoribbon arrays, methods of growing graphene nanoribbon arrays and electronic and photonic devices incorporating the graphene nanoribbon arrays are provided. The graphene nanoribbons in the arrays are formed using a scalable, bottom-up, chemical vapor deposition (CVD) technique in which the (001) facet of the germanium is used to orient the graphene nanoribbon crystals along the [110] directions of the germanium.

  18. Bottom-Up Molecular Tunneling Junctions Formed by Self-Assembly

    NARCIS (Netherlands)

    Zhang, Yanxi; Zhao, Zhiyuan; Fracasso, Davide; Chiechi, Ryan C

    2014-01-01

    This Minireview focuses on bottom-up molecular tunneling junctions - a fundamental component of molecular electronics - that are formed by self-assembly. These junctions are part of devices that, in part, fabricate themselves, and therefore, are particularly dependent on the chemistry of the molecul

  19. Coupling 2D Finite Element Models and Circuit Equations Using a Bottom-Up Methodology

    Science.gov (United States)

    2002-11-01

    E. Gómez, J. Roger-Folch, A. Gabaldón and A. Molina. Dpto. de Ingeniería Eléctrica. Universidad Polit... de Ingeniería Eléctrica. ETSII. Universidad Politécnica de Valencia. PO Box 22012, 46071 Valencia, Spain. E-mail: iroger adie.upv.es ABSTRACT The

  20. A Bottom-Up Approach for Implementing Electronic Portfolios in a Teacher Education Program

    Science.gov (United States)

    An, Heejung; Wilder, Hilary

    2010-01-01

    In an effort to generate a bottom-up approach for the program-wide implementation of electronic portfolios, this article first reports on the ways in which teacher candidates perceived the benefits and setbacks of this experience, after an initial course. Second, this article reports on whether and how the teacher candidates continued to develop…

  1. Bottom-up GGM algorithm for constructing multiple layered hierarchical gene regulatory networks

    Science.gov (United States)

    Multilayered hierarchical gene regulatory networks (ML-hGRNs) are very important for understanding genetics regulation of biological pathways. However, there are currently no computational algorithms available for directly building ML-hGRNs that regulate biological pathways. A bottom-up graphic Gaus...

  2. Achieving Campus Sustainability: Top-Down, Bottom-Up, or Neither?

    Science.gov (United States)

    Brinkhurst, Marena; Rose, Peter; Maurice, Gillian; Ackerman, Josef Daniel

    2011-01-01

    Purpose: The dynamics of organizational change related to environmental sustainability on university campuses are examined in this article. Whereas case studies of campus sustainability efforts tend to classify leadership as either "top-down" or "bottom-up", this classification neglects consideration of the leadership roles of…

  3. Managing Bottom up Strategizing : Collective Sensemaking of Strategic Issues in a Dutch Bank

    NARCIS (Netherlands)

    van der Steen, Martijn

    2016-01-01

    This paper discusses a bottom-up approach to strategizing in two member banks of a Dutch cooperative bank. In both banks, through a collective process of sensemaking, organisational participants evaluated their day-to-day experiences in order to identify strategic issues. The potential benefits of s

  4. Co-financing of bottom-up approaches towards Broadband Infrastructure Development

    DEFF Research Database (Denmark)

    Williams, Idongesit

    2016-01-01

    Bottom-up broadband infrastructure development facilitated by civil societies and social enterprises is on the increase. However, the problem plaguing the development of these bottom-up approaches in developing countries is the financial capacity to expand their small networks into larger...

  5. Computational versus psychophysical bottom-up image saliency: A comparative evaluation study

    NARCIS (Netherlands)

    Toet, A.

    2011-01-01

    The predictions of 13 computational bottom-up saliency models and a newly introduced Multiscale Contrast Conspicuity (MCC) metric are compared with human visual conspicuity measurements. The agreement between human visual conspicuity estimates and model saliency predictions is quantified through the

  6. Bottom-up model of self-organized criticality on networks.

    Science.gov (United States)

    Noël, Pierre-André; Brummitt, Charles D; D'Souza, Raissa M

    2014-01-01

    The Bak-Tang-Wiesenfeld (BTW) sandpile process is an archetypal, stylized model of complex systems with a critical point as an attractor of their dynamics. This phenomenon, called self-organized criticality, appears to occur ubiquitously in both nature and technology. Initially introduced on the two-dimensional lattice, the BTW process has been studied on network structures with great analytical successes in the estimation of macroscopic quantities, such as the exponents of asymptotically power-law distributions. In this article, we take a microscopic perspective and study the inner workings of the process through both numerical and rigorous analysis. Our simulations reveal fundamental flaws in the assumptions of past phenomenological models, the same models that allowed accurate macroscopic predictions; we mathematically justify why universality may explain these past successes. Next, starting from scratch, we obtain microscopic understanding that enables mechanistic models; such models can, for example, distinguish a cascade's area from its size. In the special case of a 3-regular network, we use self-consistency arguments to obtain a zero-parameter mechanistic (bottom-up) approximation that reproduces nontrivial correlations observed in simulations and that allows the study of the BTW process on networks in regimes otherwise prohibitively costly to investigate. We then generalize some of these results to configuration model networks and explain how one could continue the generalization. The numerous tools and methods presented herein are known to enable studying the effects of controlling the BTW process and other self-organizing systems. More broadly, our use of multitype branching processes to capture information bouncing back and forth in a network could inspire analogous models of systems in which consequences spread in a bidirectional fashion.
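
    For concreteness, a minimal BTW-style sandpile on a network can be sketched as below. The seeding, threshold and dissipation rules are simplified assumptions (threshold equal to node degree, a small loss probability per transferred grain); the mechanistic approximations developed in the article are not reproduced here.

```python
import random
import networkx as nx

G = nx.random_regular_graph(3, 1000, seed=0)   # the 3-regular special case discussed above
load = {v: 0 for v in G}
threshold = {v: G.degree(v) for v in G}        # a node topples when its load reaches its degree

def drop_grain(dissipation=1e-3):
    """Add one grain at a random node, relax the network, and return the cascade size."""
    start = random.choice(list(G))
    load[start] += 1
    unstable, size = [start], 0
    while unstable:
        v = unstable.pop()
        if load[v] < threshold[v]:
            continue
        size += 1
        load[v] -= threshold[v]
        for u in G.neighbors(v):
            if random.random() > dissipation:   # the grain is occasionally lost (dissipation)
                load[u] += 1
                if load[u] >= threshold[u]:
                    unstable.append(u)
    return size

sizes = [drop_grain() for _ in range(20_000)]   # near criticality the size distribution is heavy-tailed
```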

  7. Smart city planning from a bottom-up approach: local communities' intervention for a smarter urban environment

    Science.gov (United States)

    Alverti, Maroula; Hadjimitsis, Diofantos; Kyriakidis, Phaedon; Serraos, Konstantinos

    2016-08-01

    The aim of this paper is to explore the concept of "smart" cities from the perspective of inclusive community participation and Geographical Information Systems (GIS). The concept of a smart city is critically analyzed, focusing on the power/knowledge implications of a "bottom-up" approach in planning and how GIS could encourage community participation in smart urban planning. The paper commences with a literature review of what it means for cities to be "smart". It draws supporting definitions and critical insights into smart cities with respect to the built environment and the human factor. The second part of the paper analyzes the "bottom-up" approach in urban planning, focusing on community participation and reviewing its forms and expressions through good practices from European cities. The third part of the paper includes a debate on how smart city policies and community participation interact and influence each other. Finally, the paper closes with a discussion of the insights that were found and offers recommendations on how this debate could be addressed by Information and Communication Technologies and GIS in particular.

  8. Climate-induced changes in bottom-up and top-down processes independently alter a marine ecosystem.

    Science.gov (United States)

    Jochum, Malte; Schneider, Florian D; Crowe, Tasman P; Brose, Ulrich; O'Gorman, Eoin J

    2012-11-05

    Climate change has complex structural impacts on coastal ecosystems. Global warming is linked to a widespread decline in body size, whereas increased flood frequency can amplify nutrient enrichment through enhanced run-off. Altered population body-size structure represents a disruption in top-down control, whereas eutrophication embodies a change in bottom-up forcing. These processes are typically studied in isolation and little is known about their potential interactive effects. Here, we present the results of an in situ experiment examining the combined effects of top-down and bottom-up forces on the structure of a coastal marine community. Reduced average body mass of the top predator (the shore crab, Carcinus maenas) and nutrient enrichment combined additively to alter mean community body mass. Nutrient enrichment increased species richness and overall density of organisms. Reduced top-predator body mass increased community biomass. Additionally, we found evidence for an allometrically induced trophic cascade. Here, the reduction in top-predator body mass enabled greater biomass of intermediate fish predators within the mesocosms. This, in turn, suppressed key micrograzers, which led to an overall increase in microalgal biomass. This response highlights the possibility for climate-induced trophic cascades, driven by altered size structure of populations, rather than species extinction.

  9. A Bottom up Initiative: Meditation & Mindfulness 'Eastern' Practices in the "Western" Academia

    DEFF Research Database (Denmark)

    Singla, Rashmi

    The process of globalisation has also influenced the curriculum of Psychology discipline in the Nordic countries to some extent. There are new sub disciplines and themes in the contemporary courses which have been brought about by both top down as well as bottom up initiative. This paper covers...... a case of bottom up initiative, where the students themselves have demanded inclusion of non- conventional psychosocial interventions illustrated by meditation and mindfulness as Eastern psychological practices, thus filling the gap related to the existential, spiritual approaches. The western...... psychological hegemony has made such transformations difficult and contentious in some universities in Denmark, whereas others are more open towards an integrated form of knowledge originating from different geographical contexts. The initiative taken by the psychology students in Århus University, the specific...

  10. Bottom-up mining of XML query patterns to improve XML querying

    Institute of Scientific and Technical Information of China (English)

    Yi-jun BEI; Gang CHEN; Jin-xiang DONG; Ke CHEN

    2008-01-01

    Querying XML data is a computationally expensive process due to the complex nature of both the XML data and the XML queries. In this paper we propose an approach to expedite XML query processing by caching the results of frequent queries. We discover frequent query patterns from user-issued queries using an efficient bottom-up mining approach called VBUXMiner. VBUXMiner consists of two main steps. First, all queries are merged into a summary structure named "compressed global tree guide" (CGTG). Second, a bottom-up traversal scheme based on the CGTG is employed to generate frequent query patterns. We use the frequent query patterns in a cache mechanism to improve the XML query performance. Experimental results show that our proposed mining approach outperforms the previous mining algorithms for XML queries, such as XQPMinerTID and FastXMiner, and that by caching the results of frequent query patterns, XML query performance can be dramatically improved.
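
    The caching idea can be sketched generically as follows. This does not reproduce VBUXMiner or the compressed global tree guide; query patterns are represented as plain strings and a pattern is simply treated as frequent once it has been issued a fixed number of times.

```python
from collections import Counter

class FrequentQueryCache:
    """Cache results only for query patterns that have proven to be frequent."""

    def __init__(self, min_support=3):
        self.counts = Counter()
        self.cache = {}
        self.min_support = min_support

    def query(self, pattern, execute):
        self.counts[pattern] += 1
        if pattern in self.cache:
            return self.cache[pattern]          # cache hit: no query evaluation needed
        result = execute(pattern)
        if self.counts[pattern] >= self.min_support:
            self.cache[pattern] = result        # pattern is now considered frequent
        return result

cache = FrequentQueryCache()
run_xpath = lambda q: f"<result of {q}>"        # stand-in for the real XML query engine
for q in ["//book/title", "//book/title", "//author", "//book/title", "//book/title"]:
    cache.query(q, run_xpath)
```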

  11. Scaled CMOS Reliability and Considerations for Spacecraft Systems: Bottom-Up and Top-Down Perspective

    Science.gov (United States)

    White, Mark

    2012-01-01

    New space missions will increasingly rely on more advanced technologies because of system requirements for higher performance, particularly in instruments and high-speed processing. Component-level reliability challenges with scaled CMOS in spacecraft systems from a bottom-up perspective have been presented. Fundamental front-end and back-end processing reliability issues with more aggressively scaled parts have been discussed. Effective thermal management from the system level to the component level (top-down) is a key element in the overall design of reliable systems. Thermal management in space systems must consider a wide range of issues, including thermal loading of many different components, and frequent temperature cycling of some systems. Both perspectives (top-down and bottom-up) play a large role in robust, reliable spacecraft system design.

  12. A balance of bottom-up and top-down in linking climate policies

    Science.gov (United States)

    Green, Jessica F.; Sterner, Thomas; Wagner, Gernot

    2014-12-01

    Top-down climate negotiations embodied by the Kyoto Protocol have all but stalled, chiefly because of disagreements over targets and objections to financial transfers. To avoid those problems, many have shifted their focus to linkage of bottom-up climate policies such as regional carbon markets. This approach is appealing, but we identify four obstacles to successful linkage: different levels of ambition; competing domestic policy objectives; objections to financial transfers; and the difficulty of close regulatory coordination. Even with a more decentralized approach, overcoming the 'global warming gridlock' of the intergovernmental negotiations will require close international coordination. We demonstrate how a balance of bottom-up and top-down elements can create a path toward an effective global climate architecture.

  13. A constraint-based bottom-up counterpart to definite clause grammars

    DEFF Research Database (Denmark)

    Christiansen, Henning

    2004-01-01

    A new grammar formalism, CHR Grammars (CHRG), is proposed that provides a constraint-solving approach to language analysis, built on top of the programming language of Constraint Handling Rules in the same way as Definite Clause Grammars (DCG) on Prolog. CHRG works bottom-up and adds the following......, integrity constraints, operators a la assumption grammars, and to incorporate other constraint solvers. (iv) Context-sensitive rules that apply for disambiguation, coordination in natural language and tagger-like rules....

  14. Bottom-Up Cost Analysis of a High Concentration PV Module

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, Kelsey A. W.; Woodhouse, Michael; Lee, Hohyun; Smestad, Greg P.

    2016-03-31

    We present a bottom-up model of III-V multi-junction cells, as well as a high concentration PV (HCPV) module. We calculate $0.59/W(DC) manufacturing costs for our model HCPV module design with today's capabilities, and find that reducing cell costs and increasing module efficiency offer the most promising paths for future cost reductions. Cell costs could be significantly reduced via substrate reuse and improved manufacturing yields.

  15. Identifying prognostic features by bottom-up approach and correlating to drug repositioning.

    Directory of Open Access Journals (Sweden)

    Wei Li

    Full Text Available Traditionally, a top-down method was used to identify prognostic features in cancer research: differentially expressed genes, usually in cancer versus normal tissue, were identified and then tested for survival-prediction power. The problem is that prognostic features identified from one set of patient samples can rarely be transferred to other datasets. We apply a bottom-up approach in this study: survival-correlated or clinical-stage-correlated genes are selected first and additionally prioritized by their network topology, so that a small set of features can be used as a prognostic signature. Gene expression profiles of a cohort of 221 hepatocellular carcinoma (HCC) patients were used as a training set, and the 'bottom-up' approach was applied to discover gene-expression signatures associated with survival in both tumor and adjacent non-tumor tissues, and compared with the 'top-down' approach. The results were validated in a second cohort of 82 patients, which was used as a testing set. Two sets of gene signatures, separately identified in tumor and adjacent non-tumor tissues by the bottom-up approach, were developed in the training cohort. These two signatures were associated with overall survival times of HCC patients, the robustness of each was validated in the testing set, and the predictive performance of each was better than gene expression signatures reported previously. Moreover, genes in these two prognostic signatures give some indication for drug repositioning on HCC; some approved drugs targeting these markers have alternative indications for hepatocellular carcinoma. Using the bottom-up approach, we have developed two prognostic gene signatures with a limited number of genes that are associated with overall survival times of patients with HCC. Furthermore, prognostic markers in these two signatures have the potential to be therapeutic targets.
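
    A schematic version of the two-step 'bottom-up' selection (survival-correlated genes first, then network-topology prioritization) is sketched below on synthetic data; the correlation test, the interaction network and the cut-offs are placeholders, not those used in the study.

```python
import numpy as np
import networkx as nx
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n_patients, n_genes = 221, 500
expression = rng.normal(size=(n_patients, n_genes))            # synthetic expression matrix
survival_months = rng.exponential(scale=30.0, size=n_patients)

# Step 1 (bottom-up): keep genes whose expression correlates with survival time.
candidates = [g for g in range(n_genes)
              if spearmanr(expression[:, g], survival_months).pvalue < 0.01]

# Step 2: prioritize the candidates by their connectivity in an interaction network.
net = nx.erdos_renyi_graph(n_genes, 0.01, seed=1)              # placeholder for a real gene network
signature = sorted(candidates, key=lambda g: net.degree(g), reverse=True)[:20]
print(len(candidates), "candidates ->", len(signature), "signature genes")
```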

  16. Facilitating a research culture in an academic library: top down and bottom up approaches

    OpenAIRE

    Pickton, Miggie

    2016-01-01

    Purpose:\\ud The purpose of this paper is to consider why and how a research culture might be established in an academic library and to describe and evaluate efforts to achieve this at the University of Northampton. \\ud Design/methodology/approach:\\ud Contextualised within current literature on this topic, the paper examines the top down and bottom up approaches taken to facilitate practitioner research in one academic library. \\ud Findings:\\ud The approaches taken have led to a significant in...

  17. On the interactions between top-down anticipation and bottom-up regression

    Directory of Open Access Journals (Sweden)

    Jun Tani

    2007-11-01

    Full Text Available This paper discusses the importance of anticipation and regression in modeling cognitive behavior. The meanings of these cognitive functions are explained by describing our proposed neural network model, which has been implemented in a set of cognitive robotics experiments. A review of these experiments suggests that the essence of embodied cognition may reside in the phenomenon of breakdown between top-down anticipation and bottom-up regression, and in its recovery process.

  18. The Application of Bottom-up and Top-down Processing in L2 Listening Comprehension

    Institute of Scientific and Technical Information of China (English)

    温颖茜

    2008-01-01

    Listening comprehension is one of the four basic skills of language learning and is also one of the most difficult tasks L2 learners ever experience. L2 listening comprehension is a cognitive process in which listeners use both bottom-up and top-down processing to comprehend the aural text. The paper focuses on the application of the two approaches in L2 listening comprehension.

  19. External Costs of Road, Rail and Air Transport - a Bottom-Up Approach

    OpenAIRE

    Weinreich, Sigurd; Rennings, Klaus; Schlomann, Barbara; Geßner, Christian; Engel, Thomas

    1998-01-01

    This paper aims to describe the calculation of environmental and health externalities caused by air pollutants, accidents and noise from different transport modes (road, rail, air) on the route Frankfurt-Milan. The investigation is part of the QUITS project (QUITS = Quality Indicators for Transport Systems), commissioned by the European Commission DG VII. The evaluation of the external costs is based on a bottom-up approach. The calculation involves four stages: emissions, dispersion, impacts...
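
    The staged bottom-up calculation can be sketched as a simple impact-pathway chain from emission through dispersion and physical impact to monetary value. Every coefficient below is a placeholder chosen for illustration, not a QUITS value.

```python
vehicle_km      = 1.0e6     # traffic volume on one route segment
emission_factor = 0.4       # g of pollutant per vehicle-km (assumed)
exposure_factor = 5.0e-4    # population exposure units per g emitted (assumed dispersion stage)
dose_response   = 1.0e-4    # health cases per exposure unit (assumed impact stage)
value_per_case  = 40_000.0  # EUR per case (assumed monetary valuation)

emissions_g = vehicle_km * emission_factor
exposure = emissions_g * exposure_factor
cases = exposure * dose_response
external_cost_eur = cases * value_per_case
print(f"external cost on this segment: {external_cost_eur:,.0f} EUR")
```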

  20. Catalyst-Free Bottom-Up Synthesis of Few-Layer Hexagonal Boron Nitride Nanosheets

    Directory of Open Access Journals (Sweden)

    Shena M. Stanley

    2015-01-01

    Full Text Available A novel catalyst-free methodology has been developed to prepare few-layer hexagonal boron nitride nanosheets using a bottom-up process. Scanning electron microscopy and transmission electron microscopy (both high and low resolution) exhibit evidence of fewer than ten layers of nanosheets with uniform dimensions. The X-ray diffraction pattern and other additional characterization techniques prove the crystallinity and purity of the product.

  1. Bottom-up graphene-nanoribbon fabrication reveals chiral edges and enantioselectivity.

    Science.gov (United States)

    Han, Patrick; Akagi, Kazuto; Federici Canova, Filippo; Mutoh, Hirotaka; Shiraki, Susumu; Iwaya, Katsuya; Weiss, Paul S; Asao, Naoki; Hitosugi, Taro

    2014-09-23

    We produce precise chiral-edge graphene nanoribbons on Cu{111} using self-assembly and surface-directed chemical reactions. We show that, using specific properties of the substrate, we can change the edge conformation of the nanoribbons, segregate their adsorption chiralities, and restrict their growth directions at low surface coverage. By elucidating the molecular-assembly mechanism, we demonstrate that our method constitutes an alternative bottom-up strategy toward synthesizing defect-free zigzag-edge graphene nanoribbons.

  2. Piezoresistive characterization of bottom-up, n-type silicon microwires undergoing bend deformation

    Energy Technology Data Exchange (ETDEWEB)

    McClarty, Megan M.; Oliver, Derek R., E-mail: Derek.Oliver@umanitoba.ca [Department of Electrical and Computer Engineering, University of Manitoba, Winnipeg R3T 5V6 (Canada)]; Bruce, Jared P.; Freund, Michael S., E-mail: Michael.Freund@umanitoba.ca [Department of Chemistry, University of Manitoba, Winnipeg R3T 2N2 (Canada)]

    2015-01-12

    The piezoresistance of silicon has been studied over the past few decades in order to characterize the material's unique electromechanical properties and investigate their wider applicability. While bulk and top-down (etched) micro- and nano-wires have been studied extensively, less work exists regarding bottom-up grown microwires. A facile method is presented for characterizing the piezoresistance of released, phosphorus-doped silicon microwires that have been grown, bottom-up, via a chemical vapour deposition, vapour-liquid-solid process. The method uses conductive tungsten probes to simultaneously make electrical measurements via direct ohmic contact and apply mechanical strain via bend deformation. These microwires display piezoresistive coefficients within an order of magnitude of those expected for bulk n-type silicon; however, they show an anomalous response at degenerate doping concentrations (∼10^20 cm^−3) when compared to lower doping concentrations (∼10^17 cm^−3), with a stronger piezoresistive coefficient exhibited for the more highly doped wires. This response is postulated to be due to the different growth mechanism of bottom-up microwires as compared to top-down.

  3. Atomic layer deposition-Sequential self-limiting surface reactions for advanced catalyst "bottom-up" synthesis

    Science.gov (United States)

    Lu, Junling; Elam, Jeffrey W.; Stair, Peter C.

    2016-06-01

    Catalyst synthesis with precise control over the structure of catalytic active sites at the atomic level is of essential importance for the scientific understanding of reaction mechanisms and for rational design of advanced catalysts with high performance. Such precise control is achievable using atomic layer deposition (ALD). ALD is similar to chemical vapor deposition (CVD), except that the deposition is split into a sequence of two self-limiting surface reactions between gaseous precursor molecules and a substrate. The unique self-limiting feature of ALD allows conformal deposition of catalytic materials on a high surface area catalyst support at the atomic level. The deposited catalytic materials can be precisely constructed on the support by varying the number and type of ALD cycles. As an alternative to the wet-chemistry based conventional methods, ALD provides a cycle-by-cycle "bottom-up" approach for nanostructuring supported catalysts with near atomic precision. In this review, we summarize recent attempts to synthesize supported catalysts with ALD. Nucleation and growth of metals by ALD on oxides and carbon materials for precise synthesis of supported monometallic catalyst are reviewed. The capability of achieving precise control over the particle size of monometallic nanoparticles by ALD is emphasized. The resulting metal catalysts with high dispersions and uniformity often show comparable or remarkably higher activity than those prepared by conventional methods. For supported bimetallic catalyst synthesis, we summarize the strategies for controlling the deposition of the secondary metal selectively on the primary metal nanoparticle but not on the support to exclude monometallic formation. As a review of the surface chemistry and growth behavior of metal ALD on metal surfaces, we demonstrate the ways to precisely tune size, composition and structure of bimetallic metal nanoparticles. The cycle-by-cycle "bottom up" construction of bimetallic (or multiple

  4. BoB: Best of Both in Compiler Construction Bottom-up Parsing with Top-down Semantic Evaluation

    Directory of Open Access Journals (Sweden)

    Wolfgang Dichler

    Full Text Available Compilers typically use either a top-down or a bottom-up strategy for parsing as well as semantic evaluation. Both strategies have advantages and disadvantages: bottom-up parsing supports LR(k) grammars but is limited to S- or LR-attribution while top-dow ...

  5. Bottom-Up or Top-Down: English as a Foreign Language Vocabulary Instruction for Chinese University Students

    Science.gov (United States)

    Moskovsky, Christo; Jiang, Guowu; Libert, Alan; Fagan, Seamus

    2015-01-01

    Whereas there has been some research on the role of bottom-up and top-down processing in the learning of a second or foreign language, very little attention has been given to bottom-up and top-down instructional approaches to language teaching. The research reported here used a quasi-experimental design to assess the relative effectiveness of two…

  6. Identifying Bottom-Up and Top-Down Components of Attentional Weight by Experimental Analysis and Computational Modeling

    DEFF Research Database (Denmark)

    Nordfang, Maria; Dyrholm, Mads; Bundesen, Claus

    2013-01-01

    The attentional weight of a visual object depends on the contrast of the features of the object to its local surroundings (feature contrast) and the relevance of the features to one’s goals (feature relevance). We investigated the dependency in partial report experiments with briefly presented....... Measured by use of Bundesen’s (1990) computational theory of visual attention, the attentional weight of a singleton object was nearly proportional to the weight of an otherwise similar nonsingleton object, with a factor of proportionality that increased with the strength of the feature contrast...... of the singleton. This result is explained by generalizing the weight equation of Bundesen’s (1990) theory of visual attention such that the attentional weight of an object becomes a product of a bottom-up (feature contrast) and a top-down (feature relevance) component....
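
    In symbols, and as a sketch rather than a transcription of the paper's equations: Bundesen's (1990) weight equation and the multiplicative generalization described above can be written as below, where η(x, j) is the sensory evidence that object x has feature j, π_j is the pertinence (relevance) of feature j, and κ_x is an assumed bottom-up feature-contrast component.

```latex
% Standard TVA attentional weight (top-down pertinence values only):
\[ w_x = \sum_{j \in R} \eta(x, j)\, \pi_j \]
% Generalization suggested above: a bottom-up feature-contrast factor scales the weight:
\[ w_x = \kappa_x \sum_{j \in R} \eta(x, j)\, \pi_j \]
```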

  7. Construction of membrane-bound artificial cells using microfluidics: a new frontier in bottom-up synthetic biology.

    Science.gov (United States)

    Elani, Yuval

    2016-06-15

    The quest to construct artificial cells from the bottom-up using simple building blocks has received much attention over recent decades and is one of the grand challenges in synthetic biology. Cell mimics that are encapsulated by lipid membranes are a particularly powerful class of artificial cells due to their biocompatibility and the ability to reconstitute biological machinery within them. One of the key obstacles in the field centres on the following: how can membrane-based artificial cells be generated in a controlled way and in high-throughput? In particular, how can they be constructed to have precisely defined parameters including size, biomolecular composition and spatial organization? Microfluidic generation strategies have proved instrumental in addressing these questions. This article will outline some of the major principles underpinning membrane-based artificial cells and their construction using microfluidics, and will detail some recent landmarks that have been achieved.

  8. Visual anticipation biases conscious perception but not bottom-up visual processing

    Directory of Open Access Journals (Sweden)

    Paul F.M.J. Verschure

    2015-01-01

    Full Text Available Theories of consciousness can be grouped with respect to their stance on embodiment, sensori-motor contingencies, prediction and integration. In this list prediction plays a key role and it is not clear which aspects of prediction are most prominent in the conscious scene. An evolving view on the brain is that it can be seen as a prediction machine that optimizes its ability to predict states of the world and the self through the top-down propagation of predictions and the bottom-up presentation of prediction errors. There are competing views though on whether prediction or prediction errors dominate the conscious scene. Yet, due to the lack of efficient indirect measures, the dynamic effects of prediction on perception, decision making and consciousness have been difficult to assess and to model. We propose a novel mathematical framework and psychophysical paradigm that allows us to assess both the hierarchical structuring of perceptual consciousness, its content and the impact of predictions and / or errors on the conscious scene. Using a displacement detection task combined with reverse correlation we reveal signatures of the usage of prediction at three different levels of perception: bottom-up early saccades, top-down driven late saccades and conscious decisions. Our results suggest that the brain employs multiple parallel mechanisms at different levels of information processing to restrict the sensory field using predictions. We observe that cognitive load has a quantifiable effect on this dissociation of the bottom-up sensory and top-down predictive processes. We propose a probabilistic data association model from dynamical systems theory to model this predictive bias in different information processing levels.

  9. Increased performance in a bottom-up designed robot by experimentally guided redesign

    DEFF Research Database (Denmark)

    Larsen, Jørgen Christian

    2013-01-01

    the bottom-up, model-free approach, the authors used the robotic construction kit, LocoKit. This construction kit allows researchers to construct legged robots without having a mathematical model beforehand. The authors used no specific mathematical model to design the robot, but instead used intuition...... and took inspiration from biology. The results were afterwards compared with results gained from biology, to see if the robot has some of the key elements the authors were looking for. Findings – With the use of LocoKit as the experimental platform, combined with known experimental measurement methods from

  10. Bottom-up metamaterials with an isotropic magnetic response in the visible

    Science.gov (United States)

    Mühlig, Stefan; Dintinger, José; Cunningham, Alastair; Scharf, Toralf; Bürgi, Thomas; Rockstuhl, Carsten; Lederer, Falk

    A theoretical framework to analyze the optical properties of amorphous metamaterials made from meta-atoms which are amenable for a fabrication with bottom-up technologies is introduced. The achievement of an isotropic magnetic resonance in the visible is investigated by suggesting suitable designs for the meta-atoms. Furthermore, two meta-atoms are discussed in detail that were fabricated by self-assembling plasmonic nanoparticles using techniques from the field of colloidal nanochemistry. The metamaterials are experimentally characterized by spectroscopic means and the excitation of the magnetic dipole moment is clearly revealed. Advantages and disadvantages of metamaterials made from such meta-atoms are discussed.

  11. First-principles study on bottom-up fabrication process of atomically precise graphene nanoribbons

    Science.gov (United States)

    Kaneko, Tomoaki; Tajima, Nobuo; Ohno, Takahisa

    2016-06-01

    We investigate the energetics of a polyanthracene formation in the bottom-up fabrication of atomically precise graphene nanoribbons on Au(111) using first-principles calculations based on the density functional theory. We show that the structure of precursor molecules plays a decisive role in the C-C coupling reaction. The reaction energy of the dimerization of anthracene dimers is a larger negative value than that of the dimerization of anthracene monomers, suggesting that the precursor molecule used in experiments has a favorable structure for graphene nanoribbon fabrication.

  12. Toward the atomic structure of the nuclear pore complex: when top down meets bottom up.

    Science.gov (United States)

    Hoelz, André; Glavy, Joseph S; Beck, Martin

    2016-07-01

    Elucidating the structure of the nuclear pore complex (NPC) is a prerequisite for understanding the molecular mechanism of nucleocytoplasmic transport. However, owing to its sheer size and flexibility, the NPC is unapproachable by classical structure determination techniques and requires a joint effort of complementary methods. Whereas bottom-up approaches rely on biochemical interaction studies and crystal-structure determination of NPC components, top-down approaches attempt to determine the structure of the intact NPC in situ. Recently, both approaches have converged, thereby bridging the resolution gap from the higher-order scaffold structure to near-atomic resolution and opening the door for structure-guided experimental interrogations of NPC function.

  13. Scaled CMOS Reliability and Considerations for Spacecraft Systems : Bottom-Up and Top-Down Perspectives

    Science.gov (United States)

    White, Mark

    2012-01-01

    The recently launched Mars Science Laboratory (MSL) flagship mission, named Curiosity, is the most complex rover ever built by NASA and is scheduled to touch down on the red planet in August, 2012 in Gale Crater. The rover and its instruments will have to endure the harsh environments of the surface of Mars to fulfill its main science objectives. Such complex systems require reliable microelectronic components coupled with adequate component and system-level design margins. Reliability aspects of these elements of the spacecraft system are presented from bottom- up and top-down perspectives.

  14. Unsupervised tattoo segmentation combining bottom-up and top-down cues

    Science.gov (United States)

    Allen, Josef D.; Zhao, Nan; Yuan, Jiangbo; Liu, Xiuwen

    2011-06-01

    Tattoo segmentation is challenging due to the complexity and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin, and then distinguish tattoo from the other skin via a top-down prior in the image itself. Tattoo segmentation with an unknown number of clusters is thus transformed into a figure-ground segmentation. We have applied our segmentation algorithm on a tattoo dataset, and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purposes.
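
    A rough sketch of the split-merge idea follows: over-segment the image by colour clustering (bottom-up), merge clusters whose mean colour satisfies a crude skin prior (top-down), and treat the remaining clusters as tattoo candidates. The clustering method, thresholds and skin rule are assumptions for illustration, not the published algorithm.

```python
import numpy as np
from sklearn.cluster import KMeans

def split_merge_tattoo(image_rgb, n_clusters=12):
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(float)

    # Split: bottom-up over-segmentation by colour clustering.
    labels = KMeans(n_clusters=n_clusters, n_init=4, random_state=0).fit_predict(pixels)

    # Merge: flag clusters that look like skin (crude RGB rule as a placeholder prior).
    skin = np.zeros(n_clusters, dtype=bool)
    for k in range(n_clusters):
        r, g, b = pixels[labels == k].mean(axis=0)
        skin[k] = (r > 95) and (g > 40) and (b > 20) and (r > g) and (r > b)

    # Figure-ground: pixels belonging to non-skin clusters are tattoo candidates.
    return (~skin[labels]).reshape(h, w)

mask = split_merge_tattoo(np.random.randint(0, 255, size=(64, 64, 3)))
print(mask.mean())   # fraction of pixels flagged as tattoo candidates on random test data
```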

  15. Bottom-up/top-down high resolution, high throughput lithography using vertically assembled block bottle brush polymers

    Science.gov (United States)

    Trefonas, Peter; Thackeray, James W.; Sun, Guorong; Cho, Sangho; Clark, Corrie; Verkhoturov, Stanislav V.; Eller, Michael J.; Li, Ang; Pavía-Jiménez, Adriana; Schweikert, Emile A.; Wooley, Karen L.

    2013-03-01

    We describe a novel deterministic bottom-up / top-down approach to sub-30 nm photolithography using a film composed of assembled block brush polymers of highly uniform composition and chain length. The polymer architecture consists of a rigid backbone of polymerized norbornene, each linked to flexible short side brush chains. The resultant `bottle brush' topology has a cylindrical shape with short brush chains arranged concentrically around the backbone, in which the cylinder radius is determined by the number of monomers within the brush fragment, while the cylinder length is determined by the degree of backbone polymerization. The modularity of the synthetic system allows a wide diversity of lithographically useful monomers, sequencing, dimension and property variation. Sequential grafting of pre-synthesized blocks allows for facile formation of either concentric or lengthwise block copolymers. Placement of brush chains of different compositions along different regions of the cylinder, along with variation of the relative concentric and lengthwise dimensions, provides mechanisms to align and control placement of the cylinders. These polymers are compatible with photoacid generators (PAGs) and crosslinker functionality. Our results are consistent with a model that the bottle brush polymers assemble (bottom-up) in the film to yield a `forest' of vertically arranged cylindrical block brush polymers, with the film thickness determined by the coherence lengths of the cylinders. Subsequent imaging via electron beam (EB or ebeam) or optical radiation yields a (top-down) mechanism for acid catalyzed crosslinking of adjacent cylinders. Uncrosslinked cylinders are removed in developer to yield negative photoresist patterns. Exposure doses are very low and throughputs are amenable to the requirements of Extreme Ultraviolet (EUV) lithography. The limiting resolution with ebeam exposure is potentially about two cylinder diameters width (< 8 nm), with the smallest observed

  16. Bottom-up/top-down, high-resolution, high-throughput lithography using vertically assembled block bottle brush polymers

    Science.gov (United States)

    Trefonas, Peter; Thackeray, James W.; Sun, Guorong; Cho, Sangho; Clark, Corrie; Verkhoturov, Stanislav V.; Eller, Michael J.; Li, Ang; Pavia-Sanders, Adriana; Schweikert, Emile A.; Wooley, Karen L.

    2013-10-01

    We describe a novel deterministic bottom-up/top-down approach to sub-30-nm photolithography using a film composed of assembled block brush polymers of highly uniform composition and chain length. The polymer architecture consists of a rigid backbone of polymerized norbornene, each linked to flexible short side brush chains. The resultant bottle brush topology has a cylindrical shape with short brush chains arranged concentrically around the backbone, in which the cylinder radius is determined by the number of monomers within the brush fragment, while the cylinder length is determined by the degree of backbone polymerization. The modularity of the synthetic system allows a wide diversity of lithographically useful monomers, sequencing, dimension, and property variation. Sequential grafting of presynthesized blocks allows for facile formation of either concentric or lengthwise block copolymers. Placement of brush chains of different compositions along different regions of the cylinder, along with variation of the relative concentric and lengthwise dimensions, provides mechanisms to align and control placement of the cylinders. These polymers are compatible with photoacid generators and crosslinker functionality. Our results are consistent with a model that the bottle brush polymers assemble (bottom-up) in the film to yield a forest of vertically arranged cylindrical block brush polymers, with the film thickness determined by the coherence lengths of the cylinders. Subsequent imaging via electron beam (e-beam) or optical radiation yields a (top-down) mechanism for acid catalyzed crosslinking of adjacent cylinders. Uncrosslinked cylinders are removed in developer to yield negative photoresist patterns. Exposure doses are very low and throughputs are amenable to the requirements of extreme ultraviolet lithography. The limiting resolution with e-beam exposure is potentially about two cylinder diameters width (<8 nm), with the smallest observed patterns approaching 10 nm.

  17. Linking top-down and bottom-up approaches for assessing the vulnerability of a 100 % renewable energy system in Northern-Italy

    Science.gov (United States)

    Borga, Marco; Francois, Baptiste; Hingray, Benoit; Zoccatelli, Davide; Creutin, Jean-Dominique; brown, Casey

    2016-04-01

    Due to their variable and uncontrollable nature, integration of Variable Renewable Energies (e.g. solar power, wind power and hydropower, denoted as VRE) into the electricity network implies higher production variability and an increased risk of not meeting demand. Two approaches are commonly used for assessing this risk, and especially its evolution in a global change context (i.e. climate and societal changes): top-down and bottom-up approaches. The general idea of a top-down approach is to analyse the effects of global change, or of some of its key aspects (e.g., the COP 21 agreement, the deployment of smart grids, or climate change), on the system of interest using chains of loosely linked simulation models within a predictive framework. The bottom-up approach aims to improve understanding of the dependencies between the vulnerability of regional systems and large-scale phenomena, drawing on knowledge gained through detailed exploration of the response to change of the system of interest, which may reveal vulnerability thresholds and tipping points as well as potential opportunities. Brown et al. (2012) defined an analytical framework to merge these two approaches. The objective is to build a set of Climate Response Functions (CRFs) that bring together 1) indicators of desired states ("success") and undesired states ("failure") of a system, as defined in collaboration with stakeholders; 2) exhaustive exploration of the effects of uncertain forcings and imperfect system understanding on the response of the system itself to a plausible set of possible changes, implemented with a multi-dimensionally consistent "stress test" algorithm; and 3) a set of "ex post" hydroclimatic and socioeconomic scenarios that provide insight into the differential effectiveness of alternative policies and serve as entry points for the provision of climate information to inform policy evaluation and choice. We adapted this approach for analyzing a 100 % renewable energy system within a region

  18. Automated Urban Travel Interpretation: A Bottom-up Approach for Trajectory Segmentation.

    Science.gov (United States)

    Das, Rahul Deb; Winter, Stephan

    2016-11-23

    Understanding travel behavior is critical for effective urban planning as well as for enabling various context-aware service provisions to support mobility as a service (MaaS). Both applications rely on the sensor traces generated by travellers' smartphones. These traces can be used to interpret travel modes, both for generating automated travel diaries and for real-time travel mode detection. Current approaches segment a trajectory by certain criteria, e.g., a drop in speed. However, these criteria are heuristic, and thus existing approaches are subjective and involve significant vagueness and uncertainty about activity transitions in space and time. In addition, segmentation approaches are not suited to real-time interpretation of open-ended segments and cannot cope with the frequent gaps in location traces. To address these challenges, a novel state-based bottom-up approach is proposed. This approach assumes a fixed atomic segment of a homogeneous state, instead of an event-based segment, and iterates progressively until a new state is found. The research investigates how an atomic state-based approach can be developed so that it works in real-time, near-real-time and offline modes and under different environmental conditions with their varying quality of sensor traces. The results show that the proposed bottom-up model outperforms existing event-based segmentation models in terms of adaptivity, flexibility, accuracy and richness of the information delivered for automated travel behavior interpretation.
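
    The atomic-segment idea can be illustrated with a minimal sketch (not the authors' model): chop a per-second GPS speed trace into fixed atomic windows, assign each window a coarse state from an assumed speed banding, and grow a segment until the state changes, which is what allows interpretation to proceed online over open-ended segments. The window length, state labels and speed thresholds are invented for the example.

```python
# Minimal sketch of bottom-up, state-based trajectory segmentation from speed samples.
from itertools import groupby

def speed_state(mean_speed_ms):
    """Map a window's mean speed (m/s) to a coarse, assumed state label."""
    if mean_speed_ms < 1.5:
        return "stationary/walk"
    if mean_speed_ms < 8.0:
        return "bike/slow vehicle"
    return "vehicle"

def bottom_up_segments(speeds_ms, window=6):
    """speeds_ms: per-second speed samples; window: atomic segment length (samples)."""
    atoms = [speeds_ms[i:i + window] for i in range(0, len(speeds_ms), window)]
    states = [speed_state(sum(a) / len(a)) for a in atoms if a]

    # Grow a segment while consecutive atomic windows share the same state.
    segments, i = [], 0
    for state, run in groupby(states):
        n = len(list(run))
        segments.append({"state": state, "start_s": i * window, "end_s": (i + n) * window})
        i += n
    return segments
```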

  19. Sex differences in mental rotation: top-down versus bottom-up processing.

    Science.gov (United States)

    Butler, Tracy; Imperato-McGinley, Julianne; Pan, Hong; Voyer, Daniel; Cordero, Juan; Zhu, Yuan-Shan; Stern, Emily; Silbersweig, David

    2006-08-01

    Functional MRI during performance of a validated mental rotation task was used to assess a neurobiological basis for sex differences in visuospatial processing. Between-sex group analysis demonstrated greater activity in women than in men in dorsomedial prefrontal and other high-order heteromodal association cortices, suggesting women performed mental rotation in an effortful, "top-down" fashion. In contrast, men activated primary sensory cortices as well as regions involved in implicit learning (basal ganglia) and mental imagery (precuneus), consistent with a more automatic, "bottom-up" strategy. Functional connectivity analysis in association with a measure of behavioral performance showed that, in men (but not women), accurate performance was associated with deactivation of parieto-insular vestibular cortex (PIVC) as part of a visual-vestibular network. The automatic evocation of this network during mental rotation, to a greater extent by men than by women, may represent an effective, unconscious, bottom-up neural strategy that could reasonably account for men's traditional visuospatial performance advantage.

  20. A bottom-up model to describe consumers’ preferences towards late season peaches

    Energy Technology Data Exchange (ETDEWEB)

    Groot, E.; Albisu, L.M.

    2015-07-01

    Peaches have been consumed in Mediterranean countries since ancient times. Nowadays there are few areas in Europe that produce peaches with a Protected Designation of Origin (PDO), and the Calanda area is one of them. The aim of this work is to describe consumers’ preferences towards late season PDO Calanda peaches in the city of Zaragoza, Spain, using a bottom-up model. The bottom-up model provides a greater amount of information than top-down models. In this approach, one utility function is estimated per consumer. Thus, it is not necessary to make assumptions about preference distributions and correlations across respondents. It was observed that preference distributions were neither normal nor independently distributed. If those preferences were estimated by top-down models, conclusions would be biased. This paper also explores a new way to describe preferences through individual utility functions. Results show that the largest behavioural group gathered origin-sensitive consumers. Their utility increased if the peaches were produced in the Calanda area and, especially, when peaches carried the PDO Calanda brand. The second most valuable attribute for consumers was the price. Peach size and packaging were less important in the purchase decision. Nevertheless, it is advisable to avoid trading the smallest peaches (weighing around 160 g/fruit). Traders also have to be careful when using active packaging: it was found that a group of consumers disliked this kind of product, probably because they perceived it as less natural. (Author)
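
    The core methodological point, one utility function per respondent rather than a pooled model, can be sketched as follows. The attribute coding, the data shapes and the least-squares estimator are illustrative assumptions, not the estimator used in the study.

```python
# Sketch of the "one utility function per consumer" idea: fit a separate linear
# part-worth model for each respondent instead of pooling across respondents.
import numpy as np

def individual_utilities(X_by_respondent, y_by_respondent):
    """Fit one linear part-worth (utility) vector per respondent.

    X_by_respondent: list of (n_choices, n_attributes) attribute matrices.
    y_by_respondent: list of matching preference scores or choice ratings.
    """
    utilities = []
    for X, y in zip(X_by_respondent, y_by_respondent):
        beta, *_ = np.linalg.lstsq(np.asarray(X, float), np.asarray(y, float), rcond=None)
        utilities.append(beta)
    return utilities
```

    Because each respondent receives an individual coefficient vector, the empirical distribution of part-worths can be inspected directly instead of being assumed normal or independent across respondents.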

  1. Top-down and bottom-up forces interact at thermal range extremes on American lobster.

    Science.gov (United States)

    Boudreau, Stephanie A; Anderson, Sean C; Worm, Boris

    2015-05-01

    Exploited marine populations are thought to be regulated by the effects of fishing, species interactions and climate. Yet, it is unclear how these forces interact and vary across a species' range. We conducted a meta-analysis of American lobster (Homarus americanus) abundance data throughout the entirety of the species' range, testing competing hypotheses about bottom-up (climate, temperature) vs. top-down (predation, fishing) regulation along a strong thermal gradient. Our results suggest an interaction between predation and thermal range - predation effects dominated at the cold and warm extremes, but not at the centre of the species' range. Similarly, there was consistent support for a positive climate effect on lobster recruitment at warm range extremes. In contrast, fishing effort followed, rather than led changes in lobster abundance over time. Our analysis suggests that the relative effects of top-down and bottom-up forcing in regulating marine populations may intensify at thermal range boundaries and weaken at the core of a species' range.

  2. Painful faces-induced attentional blink modulated by top-down and bottom-up mechanisms

    Directory of Open Access Journals (Sweden)

    Chun eZheng

    2015-06-01

    Full Text Available Pain-related stimuli can capture attention in an automatic (bottom-up) or intentional (top-down) fashion. Previous studies have examined attentional capture by pain-related information using spatial attention paradigms that involve mainly a bottom-up mechanism. In the current study, we investigated the pain information–induced attentional blink (AB) using a rapid serial visual presentation (RSVP) task, and compared the effects of task-irrelevant and task-relevant pain distractors. Relationships between accuracy of target identification and individual traits (i.e., empathy and catastrophizing thinking about pain) were also examined. The results demonstrated that task-relevant painful faces had a significant pain information–induced AB effect, whereas task-irrelevant faces showed a near-significant trend of this effect, supporting the notion that pain-related stimuli can influence the temporal dynamics of attention. Furthermore, we found a significant negative correlation between response accuracy and pain catastrophizing score in task-relevant trials. These findings suggest that active scanning of environmental information related to pain produces greater deficits in cognition than does unintentional attention toward pain, which may represent the different ways in which healthy individuals and patients with chronic pain process pain-relevant information. These results may provide insight into the understanding of maladaptive attentional processing in patients with chronic pain.

  3. Painful faces-induced attentional blink modulated by top-down and bottom-up mechanisms.

    Science.gov (United States)

    Zheng, Chun; Wang, Jin-Yan; Luo, Fei

    2015-01-01

    Pain-related stimuli can capture attention in an automatic (bottom-up) or intentional (top-down) fashion. Previous studies have examined attentional capture by pain-related information using spatial attention paradigms that involve mainly a bottom-up mechanism. In the current study, we investigated the pain information-induced attentional blink (AB) using a rapid serial visual presentation (RSVP) task, and compared the effects of task-irrelevant and task-relevant pain distractors. Relationships between accuracy of target identification and individual traits (i.e., empathy and catastrophizing thinking about pain) were also examined. The results demonstrated that task-relevant painful faces had a significant pain information-induced AB effect, whereas task-irrelevant faces showed a near-significant trend of this effect, supporting the notion that pain-related stimuli can influence the temporal dynamics of attention. Furthermore, we found a significant negative correlation between response accuracy and pain catastrophizing score in task-relevant trials. These findings suggest that active scanning of environmental information related to pain produces greater deficits in cognition than does unintentional attention toward pain, which may represent the different ways in which healthy individuals and patients with chronic pain process pain-relevant information. These results may provide insight into the understanding of maladaptive attentional processing in patients with chronic pain.

  4. Bottom-up processing and low temperature transport properties of polycrystalline SnSe

    Energy Technology Data Exchange (ETDEWEB)

    Ge, Zhen-Hua; Wei, Kaya; Lewis, Hutton [Department of Physics, University of South Florida, Tampa, FL 33620 (United States); Martin, Joshua [Materials Measurement Science Division, National Institute of Standards and Technology, Gaithersburg, MD 20899 (United States); Nolas, George S., E-mail: gnolas@usf.edu [Department of Physics, University of South Florida, Tampa, FL 33620 (United States)

    2015-05-15

    A hydrothermal approach was employed to efficiently synthesize SnSe nanorods. The nanorods were consolidated into polycrystalline SnSe by spark plasma sintering for low-temperature electrical and thermal property characterization. The low temperature transport properties indicate semiconducting behavior with a typical dielectric temperature dependence of the thermal conductivity. The transport properties are discussed in light of the recent interest in this material for thermoelectric applications. The nanorod growth mechanism is also discussed in detail. - Graphical abstract: SnSe nanorods were synthesized by a simple hydrothermal method through a bottom-up approach. Micron-sized flower-like crystals changed to nanorods with increasing hydrothermal temperature. Low temperature transport properties of polycrystalline SnSe, after SPS densification, were reported for the first time. This bottom-up synthetic approach can be used to produce phase-pure dense polycrystalline materials for thermoelectric applications. - Highlights: • SnSe nanorods were synthesized by a simple and efficient hydrothermal approach. • The role of temperature, time and NaOH content was investigated. • SPS densification allowed for low-temperature transport property measurements. • Transport measurements indicate semiconducting behavior.

  5. A bottom-up approach for the synthesis of highly ordered fullerene-intercalated graphene hybrids

    Directory of Open Access Journals (Sweden)

    Dimitrios eGournis

    2015-02-01

    Full Text Available Much of the research effort on graphene focuses on its use as a building block for the development of new hybrid nanostructures with well-defined dimensions and properties suitable for applications such as gas storage, heterogeneous catalysis, gas/liquid separations, nanosensing and biomedicine. Towards this aim, here we describe a new bottom-up approach, which combines self-assembly with the Langmuir-Schaefer deposition technique to synthesize graphene-based layered hybrid materials hosting fullerene molecules within the interlayer space. Our film preparation consists of a bottom-up layer-by-layer process that proceeds via the formation of a hybrid organo-graphene oxide Langmuir film. The structure and composition of these hybrid fullerene-containing thin multilayers deposited on hydrophobic substrates were characterized by a combination of X-ray diffraction, Raman and X-ray photoelectron spectroscopies, atomic force microscopy and conductivity measurements. The latter revealed that the presence of C60 within the interlayer spacing leads to an increase in electrical conductivity of the hybrid material as compared to the organo-graphene matrix alone.

  6. Top-down and bottom-up lipidomic analysis of rabbit lipoproteins under different metabolic conditions using flow field-flow fractionation, nanoflow liquid chromatography and mass spectrometry.

    Science.gov (United States)

    Byeon, Seul Kee; Kim, Jin Yong; Lee, Ju Yong; Chung, Bong Chul; Seo, Hong Seog; Moon, Myeong Hee

    2015-07-31

    This study demonstrated the performances of top-down and bottom-up approaches in lipidomic analysis of lipoproteins from rabbits raised under different metabolic conditions: healthy controls, carrageenan-induced inflammation, dehydration, high cholesterol (HC) diet, and highest cholesterol diet with inflammation (HCI). In the bottom-up approach, the high density lipoproteins (HDL) and the low density lipoproteins (LDL) were size-sorted and collected on a semi-preparative scale using a multiplexed hollow fiber flow field-flow fractionation (MxHF5), followed by nanoflow liquid chromatography-ESI-MS/MS (nLC-ESI-MS/MS) analysis of the lipids extracted from each lipoprotein fraction. In the top-down method, size-fractionated lipoproteins were directly infused to MS for quantitative analysis of targeted lipids using chip-type asymmetrical flow field-flow fractionation-electrospray ionization-tandem mass spectrometry (cAF4-ESI-MS/MS) in selected reaction monitoring (SRM) mode. The comprehensive bottom-up analysis yielded 122 and 104 lipids from HDL and LDL, respectively. Rabbits within the HC and HCI groups had lipid patterns that contrasted most substantially from those of controls, suggesting that HC diet significantly alters the lipid composition of lipoproteins. Among the identified lipids, 20 lipid species that exhibited large differences (>10-fold) were selected as targets for the top-down quantitative analysis in order to compare the results with those from the bottom-up method. Statistical comparison of the results from the two methods revealed that the results were not significantly different for most of the selected species, except for those species with only small differences in concentration between groups. The current study demonstrated that top-down lipid analysis using cAF4-ESI-MS/MS is a powerful high-speed analytical platform for targeted lipidomic analysis that does not require the extraction of lipids from blood samples.

  7. Humans strengthen bottom-up effects and weaken trophic cascades in a terrestrial food web.

    Directory of Open Access Journals (Sweden)

    Tyler B Muhly

    Full Text Available Ongoing debate about whether food webs are primarily regulated by predators or by primary plant productivity, cast as top-down and bottom-up effects, respectively, may be becoming superfluous. Given that most of the world's ecosystems are human dominated, we broadened this dichotomy by considering human effects in a terrestrial food web. We studied a multiple human-use landscape in southwest Alberta, Canada, as opposed to the protected areas where previous terrestrial food-web studies have been conducted. We used structural equation models (SEMs) to assess the strength and direction of relationships between the density and distribution of: (1) humans, measured using a density index; (2) wolves (Canis lupus), elk (Cervus elaphus) and domestic cattle (Bos taurus), measured using resource selection functions; and (3) forage quality, quantity and utilization (measured at vegetation sampling plots). Relationships were evaluated by taking advantage of temporal and spatial variation in human density, including day versus night, and two landscapes with the highest and lowest human density in the study area. Here we show that forage-mediated effects of humans had primacy over predator-mediated effects in the food web. In our parsimonious SEM, occurrence of humans was most correlated with occurrence of forage (β = 0.637, p<0.0001). Elk and cattle distribution were correlated with forage (elk day: β = 0.400, p<0.0001; elk night: β = 0.369, p<0.0001; cattle day: β = 0.403, p<0.0001; cattle night: β = 0.436, p<0.0001), and the distribution of elk or cattle and wolves were positively correlated during daytime (elk: β = 0.293, p<0.0001; cattle: β = 0.303, p<0.0001) and nighttime (elk: β = 0.460, p<0.0001; cattle: β = 0.482, p<0.0001). Our results contrast with research conducted in protected areas that suggested human effects in the food web are primarily predator-mediated. Instead, human influence on vegetation may strengthen

  8. Saccade generation by the frontal eye fields in rhesus monkeys is separable from visual detection and bottom-up attention shift.

    Directory of Open Access Journals (Sweden)

    Kyoung-Min Lee

    Full Text Available The frontal eye fields (FEF), originally identified as an oculomotor cortex, have also been implicated in perceptual functions, such as constructing a visual saliency map and shifting visual attention. Further dissecting the area's role in the transformation from visual input to oculomotor command has been difficult because of spatial confounding between stimuli and responses and consequently between intermediate cognitive processes, such as attention shift and saccade preparation. Here we developed two tasks in which the visual stimulus and the saccade response were dissociated in space (the extended memory-guided saccade task), and bottom-up attention shift and saccade target selection were independent (the four-alternative delayed saccade task). Reversible inactivation of the FEF in rhesus monkeys disrupted, as expected, contralateral memory-guided saccades, but visual detection was demonstrated to be intact at the same field. Moreover, saccade behavior was impaired when a bottom-up shift of attention was not a prerequisite for saccade target selection, indicating that the inactivation effect was independent of the previously reported dysfunctions in bottom-up attention control. These findings underscore the motor aspect of the area's functions, especially in situations where saccades are generated by internal cognitive processes, including visual short-term memory and long-term associative memory.

  9. Saccade generation by the frontal eye fields in rhesus monkeys is separable from visual detection and bottom-up attention shift.

    Science.gov (United States)

    Lee, Kyoung-Min; Ahn, Kyung-Ha; Keller, Edward L

    2012-01-01

    The frontal eye fields (FEF), originally identified as an oculomotor cortex, have also been implicated in perceptual functions, such as constructing a visual saliency map and shifting visual attention. Further dissecting the area's role in the transformation from visual input to oculomotor command has been difficult because of spatial confounding between stimuli and responses and consequently between intermediate cognitive processes, such as attention shift and saccade preparation. Here we developed two tasks in which the visual stimulus and the saccade response were dissociated in space (the extended memory-guided saccade task), and bottom-up attention shift and saccade target selection were independent (the four-alternative delayed saccade task). Reversible inactivation of the FEF in rhesus monkeys disrupted, as expected, contralateral memory-guided saccades, but visual detection was demonstrated to be intact at the same field. Moreover, saccade behavior was impaired when a bottom-up shift of attention was not a prerequisite for saccade target selection, indicating that the inactivation effect was independent of the previously reported dysfunctions in bottom-up attention control. These findings underscore the motor aspect of the area's functions, especially in situations where saccades are generated by internal cognitive processes, including visual short-term memory and long-term associative memory.

  10. Enhancing bottom-up and top-down proteomic measurements with ion mobility separations.

    Science.gov (United States)

    Baker, Erin Shammel; Burnum-Johnson, Kristin E; Ibrahim, Yehia M; Orton, Daniel J; Monroe, Matthew E; Kelly, Ryan T; Moore, Ronald J; Zhang, Xing; Théberge, Roger; Costello, Catherine E; Smith, Richard D

    2015-08-01

    Proteomic measurements with greater throughput, sensitivity, and structural information are essential for improving both in-depth characterization of complex mixtures and targeted studies. While LC separation coupled with MS (LC-MS) measurements have provided information on thousands of proteins in different sample types, the introduction of a separation stage that provides further component resolution and rapid structural information has many benefits in proteomic analyses. Technical advances in ion transmission and data acquisition have made ion mobility separations an opportune technology to be easily and effectively incorporated into LC-MS proteomic measurements for enhancing their information content. Herein, we report on applications illustrating increased sensitivity, throughput, and structural information by utilizing IMS-MS and LC-IMS-MS measurements for both bottom-up and top-down proteomics measurements.

  11. A bottom-up perspective on leadership of collaborative innovation in the public sector

    DEFF Research Database (Denmark)

    Hansen, Jesper Rohr

    The thesis investigates how new forms of public leadership can contribute to solving complex problems in today’s welfare societies through innovation. A bottom-up type of leadership for collaborative innovation addressing wicked problems is theorised, displaying a social constructive process approach to leadership; a theoretical model emphasises that leadership emerges through social processes of recognition. Leadership is recognised by utilising the uncertainty of a wicked problem and innovation to influence collaborators’ sensemaking processes. The empirical setting is the City of Copenhagen … A crucial condition for success is iterative leadership adaptation. In conclusion, the thesis finds that specialized professionals are indeed able to develop politically viable, innovative and collaborative solutions to wicked problems; and that such professionals are able to transform themselves …

  12. Bottom-up synthesis of chiral covalent organic frameworks and their bound capillaries for chiral separation.

    Science.gov (United States)

    Qian, Hai-Long; Yang, Cheng-Xiong; Yan, Xiu-Ping

    2016-07-12

    Covalent organic frameworks (COFs) are a novel class of porous materials, and offer great potential for various applications. However, the applications of COFs in chiral separation and chiral catalysis are largely underexplored due to the very limited chiral COFs available and their challenging synthesis. Here we show a bottom-up strategy to construct chiral COFs and an in situ growth approach to fabricate chiral COF-bound capillary columns for chiral gas chromatography. We incorporate the chiral centres into one of the organic ligands for the synthesis of the chiral COFs. We subsequently in situ prepare the COF-bound capillary columns. The prepared chiral COFs and their bound capillary columns give high resolution for the separation of enantiomers with excellent repeatability and reproducibility. The proposed strategy provides a promising platform for the synthesis of chiral COFs and their chiral separation application.

  13. Bottom-Up Meets Top-Down: Patchy Hybrid Nonwovens as an Efficient Catalysis Platform.

    Science.gov (United States)

    Schöbel, Judith; Burgard, Matthias; Hils, Christian; Dersch, Roland; Dulle, Martin; Volk, Kirsten; Karg, Matthias; Greiner, Andreas; Schmalz, Holger

    2017-01-02

    Heterogeneous catalysis with supported nanoparticles (NPs) is a highly active field of research. However, the efficient stabilization of NPs without deteriorating their catalytic activity is challenging. By combining top-down (coaxial electrospinning) and bottom-up (crystallization-driven self-assembly) approaches, we prepared patchy nonwovens with functional, nanometer-sized patches on the surface. These patches can selectively bind and efficiently stabilize gold nanoparticles (AuNPs). The use of these AuNP-loaded patchy nonwovens in the alcoholysis of dimethylphenylsilane led to full conversion under comparably mild conditions and in short reaction times. The absence of gold leaching or a slowing down of the reaction even after ten subsequent cycles manifests the excellent reusability of this catalyst system. The flexibility of the presented approach allows for easy transfer to other nonwoven supports and catalytically active NPs, which promises broad applicability.

  14. Bottom-up fabrication of zwitterionic polymer brushes on intraocular lens for improved biocompatibility

    Science.gov (United States)

    Han, Yuemei; Xu, Xu; Tang, Junmei; Shen, Chenghui; Lin, Quankui; Chen, Hao

    2017-01-01

    The intraocular lens (IOL) is an efficient implantable device commonly used for treating cataracts. However, bioadhesion of bacteria or residual lens epithelial cells on the IOL surface after surgery causes postoperative complications, such as endophthalmitis or posterior capsular opacification, and again leads to loss of sight. In the present study, zwitterionic polymer brushes were fabricated on the IOL surface via a bottom-up grafting procedure. Attenuated total reflection-Fourier transform infrared and contact angle measurements indicated successful surface modification, as well as excellent hydrophilicity. The coating of hydrophilic zwitterionic polymer effectively decreased the bioadhesion of lens epithelial cells and bacteria. In vivo intraocular implantation results showed good biocompatibility of the zwitterionic IOL and its effectiveness against postoperative complications. PMID:28053528

  15. Collective Inclusioning: A Grounded Theory of a Bottom-Up Approach to Innovation and Leading

    Directory of Open Access Journals (Sweden)

    Michal Lysek

    2016-06-01

    Full Text Available This paper is a grounded theory study of how leaders (e.g., entrepreneurs, managers, etc.) engage people in challenging undertakings (e.g., innovation) that require everyone’s commitment to such a degree that they would have to go beyond what could be reasonably expected in order to succeed. Company leaders sometimes wonder why their employees no longer show the same responsibility towards their work, and why they are more concerned with internal politics than solving customer problems. It is because company leaders no longer apply collective inclusioning to the same extent as they did in the past. Collective inclusioning can be applied in four ways: by convincing, afinitizing, goal congruencing, and engaging. It can lead to fostering strong units of people for taking on challenging undertakings. Collective inclusioning is a complementing theory to other strategic management and leading theories. It offers a new perspective on how to implement a bottom-up approach to innovation.

  16. Bottom-up synthesis of chiral covalent organic frameworks and their bound capillaries for chiral separation

    Science.gov (United States)

    Qian, Hai-Long; Yang, Cheng-Xiong; Yan, Xiu-Ping

    2016-07-01

    Covalent organic frameworks (COFs) are a novel class of porous materials, and offer great potential for various applications. However, the applications of COFs in chiral separation and chiral catalysis are largely underexplored due to the very limited chiral COFs available and their challenging synthesis. Here we show a bottom-up strategy to construct chiral COFs and an in situ growth approach to fabricate chiral COF-bound capillary columns for chiral gas chromatography. We incorporate the chiral centres into one of the organic ligands for the synthesis of the chiral COFs. We subsequently in situ prepare the COF-bound capillary columns. The prepared chiral COFs and their bound capillary columns give high resolution for the separation of enantiomers with excellent repeatability and reproducibility. The proposed strategy provides a promising platform for the synthesis of chiral COFs and their chiral separation application.

  17. Differential recolonization of Atlantic intertidal habitats after disturbance reveals potential bottom-up community regulation

    Science.gov (United States)

    Petzold, Willy; Scrosati, Ricardo A.

    2014-01-01

    In the spring of 2014, abundant sea ice that drifted out of the Gulf of St. Lawrence caused extensive disturbance in rocky intertidal habitats on the northern Atlantic coast of mainland Nova Scotia, Canada. To monitor recovery of intertidal communities, we surveyed two wave-exposed locations in the early summer of 2014. Barnacle recruitment and the abundance of predatory dogwhelks were low at one location (Tor Bay Provincial Park) but more than 20 times higher at the other location (Whitehead). Satellite data indicated that the abundance of coastal phytoplankton (the main food source for barnacle larvae) was consistently higher at Whitehead just before the barnacle recruitment season, when barnacle larvae were in the water column. These observations suggest bottom-up forcing of intertidal communities. The underlying mechanisms and their intensity along the NW Atlantic coast could be investigated through studies done at local and regional scales. PMID:26213609

  18. Single-molecule spectroscopy for plastic electronics: materials analysis from the bottom-up.

    Science.gov (United States)

    Lupton, John M

    2010-04-18

    pi-conjugated polymers find a range of applications in electronic devices. These materials are generally highly disordered in terms of chain length and chain conformation, besides being influenced by a variety of chemical and physical defects. Although this characteristic can be of benefit in certain device applications, disorder severely complicates materials analysis. Accurate analytical techniques are, however, crucial to optimising synthetic procedures and assessing overall material purity. Fortunately, single-molecule spectroscopic techniques have emerged as an unlikely but uniquely powerful approach to unraveling intrinsic material properties from the bottom up. Building on the success of such techniques in the life sciences, single-molecule spectroscopy is finding increasing applicability in materials science, effectively enabling the dissection of the bulk down to the level of the individual molecular constituent. This article reviews recent progress in single molecule spectroscopy of conjugated polymers as used in organic electronics.

  19. Strain Response of Hot-Mix Asphalt Overlays for Bottom-Up Reflective Cracking

    CERN Document Server

    Ghauch, Ziad G

    2011-01-01

    This paper examines the strain response of typical HMA overlays above jointed PCC slabs prone to bottom-up reflective cracking. The occurrence of reflective cracking under the combined effect of traffic and environmental loading significantly reduces the design life of HMA overlays and can lead to their premature failure. In this context, viscoelastic material properties combined with cyclic vehicle loadings and pavement temperature distribution were implemented in a series of FE models in order to study the evolution of horizontal tensile and shear strains at the bottom of the HMA overlay. The effect of several design parameters, such as subbase and subgrade moduli, vehicle speed, overlay thickness, and temperature condition, on the horizontal and shear strain response was investigated. The results obtained show that the rate of horizontal and shear strain increase at the bottom of the HMA overlay drops with higher vehicle speed, higher subgrade modulus, and higher subbase modulus. Moreover, the rate of horizon...

  20. Unsupervised Tattoo Segmentation Combining Bottom-Up and Top-Down Cues

    Energy Technology Data Exchange (ETDEWEB)

    Allen, Josef D [ORNL

    2011-01-01

    Tattoo segmentation is challenging due to the complexity and large variance in tattoo structures. We have developed a segmentation algorithm for finding tattoos in an image. Our basic idea is split-merge: split each tattoo image into clusters through a bottom-up process, learn to merge the clusters containing skin and then distinguish tattoo from the other skin via top-down prior in the image itself. Tattoo segmentation with unknown number of clusters is transferred to a figure-ground segmentation. We have applied our segmentation algorithm on a tattoo dataset and the results have shown that our tattoo segmentation system is efficient and suitable for further tattoo classification and retrieval purpose.

  1. Differential recolonization of Atlantic intertidal habitats after disturbance reveals potential bottom-up community regulation.

    Science.gov (United States)

    Petzold, Willy; Scrosati, Ricardo A

    2014-01-01

    In the spring of 2014, abundant sea ice that drifted out of the Gulf of St. Lawrence caused extensive disturbance in rocky intertidal habitats on the northern Atlantic coast of mainland Nova Scotia, Canada. To monitor recovery of intertidal communities, we surveyed two wave-exposed locations in the early summer of 2014. Barnacle recruitment and the abundance of predatory dogwhelks were low at one location (Tor Bay Provincial Park) but more than 20 times higher at the other location (Whitehead). Satellite data indicated that the abundance of coastal phytoplankton (the main food source for barnacle larvae) was consistently higher at Whitehead just before the barnacle recruitment season, when barnacle larvae were in the water column. These observations suggest bottom-up forcing of intertidal communities. The underlying mechanisms and their intensity along the NW Atlantic coast could be investigated through studies done at local and regional scales.

  2. Bottom-Up Cost Analysis of a High Concentration PV Module; NREL (National Renewable Energy Laboratory)

    Energy Technology Data Exchange (ETDEWEB)

    Horowitz, K.; Woodhouse, M.; Lee, H.; Smestad, G.

    2015-04-13

    We present a bottom-up model of III-V multi-junction cells, as well as a high concentration PV (HCPV) module. We calculate $0.65/Wp(DC) manufacturing costs for our model HCPV module design with today’s capabilities, and find that reducing cell costs and increasing module efficiency offer promising pathways for future cost reductions. Cell costs could be significantly reduced via an increase in manufacturing scale, substrate reuse, and improved manufacturing yields. We also identify several other significant drivers of HCPV module costs, including the Fresnel lens primary optic, module housing, thermal management, and the receiver board. These costs could potentially be lowered by employing innovative module designs.
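
    A bottom-up module cost roll-up reduces to summing component costs and dividing by the rated power. The sketch below shows only that arithmetic; the line items and numbers are placeholders, not the figures from the NREL model.

```python
# Toy bottom-up cost roll-up: placeholder line items and rating, illustrative only.
component_costs_usd = {
    "III-V multi-junction cells": 55.0,
    "Fresnel lens primary optic": 20.0,
    "module housing": 30.0,
    "thermal management": 15.0,
    "receiver board": 18.0,
    "assembly and overhead": 25.0,
}
module_rated_power_wp = 250.0  # assumed DC rating in Wp

cost_per_wp = sum(component_costs_usd.values()) / module_rated_power_wp
print(f"Module manufacturing cost: ${cost_per_wp:.2f}/Wp(DC)")
```

    Sensitivity to a single cost driver, such as cell cost or rated power, can then be explored by perturbing one entry at a time.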

  3. Integration of top-down and bottom-up information for audio organization and retrieval

    DEFF Research Database (Denmark)

    Jensen, Bjørn Sand

    The increasing availability of digital audio and music calls for methods and systems to analyse and organize these digital objects. This thesis investigates three elements related to such systems, focusing on the ability to represent and elicit the user's view on the multimedia object and the system … sources based on latent Dirichlet allocation (LDA). The model is used to integrate bottom-up features (reflecting timbre, loudness, tempo and chroma), meta-data aspects (lyrics) and top-down aspects, namely user-generated open-vocabulary tags. The model and representation are evaluated on the auxiliary task of genre and style classification. Eliciting the subjective representation and opinion of users is an important aspect in building personalized systems. The thesis contributes a setup for modelling and elicitation of preference and other cognitive aspects, with a focus on audio applications…

  4. Bottom-Up Engineering of Well-Defined 3D Microtissues Using Microplatforms and Biomedical Applications.

    Science.gov (United States)

    Lee, Geon Hui; Lee, Jae Seo; Wang, Xiaohong; Lee, Sang Hoon

    2016-01-07

    During the last decades, the engineering of well-defined 3D tissues has attracted great attention because it provides an in vivo-mimicking environment and can be a building block for the engineering of bioartificial organs. In this Review, diverse engineering methods for 3D tissues using microscale devices are introduced. Recent progress in microtechnologies has enabled the development of microplatforms for the bottom-up assembly of diversely shaped 3D tissues consisting of various cells. Micro hanging-drop plates, microfluidic chips, and arrayed microwells are typical examples. The encapsulation of cells in hydrogel microspheres and microfibers allows the engineering of 3D microtissues with diverse shapes. Applications of 3D microtissues in biomedical fields are described, and the future direction of microplatform-based engineering of 3D microtissues is discussed.

  5. Implementing collaborative improvement - top-down, bottom-up or both?

    DEFF Research Database (Denmark)

    Kaltoft, Rasmus; Boer, Harry; Caniato, Federico

    2007-01-01

    The research presented in this article was aimed at increasing the current understanding of the process of developing Collaborative Improvement (CoI) in Extended Manufacturing Enterprises (EME). Based on action research in three EMEs involving a total of 13 companies from five European countries, the study identifies three different implementation approaches. The bottom-up learning-by-doing approach starts at a practical level, with simple improvement activities, and aims at gradually developing a wide range of CoI knowledge, skills and initiatives. The top-down directive approach starts with aligning the partners' CoI objectives and an assessment of their collaboration and CoI maturity in order to provide a common platform before actually starting improvement activities. The laissez-faire approach builds on shared goals/vision, meetings on equal terms and joint work, in a non-directive and non…

  6. Achieving social-ecological fit through bottom-up collaborative governance: an empirical investigation

    Directory of Open Access Journals (Sweden)

    Angela M. Guerrero

    2015-12-01

    Full Text Available Significant benefits can arise from collaborative forms of governance that foster self-organization and flexibility. Likewise, governance systems that fit with the extent and complexity of the system under management are considered essential to our ability to solve environmental problems. However, from an empirical perspective the fundamental question of whether self-organized (bottom-up) collaborative forms of governance are able to accomplish adequate fit is unresolved. We used new theory and methodological approaches underpinned by interdisciplinary network analysis to address this gap by investigating three governance challenges that relate to the problem of fit: shared management of ecological resources, management of interconnected ecological resources, and cross-scale management. We first identified a set of social-ecological network configurations that represent the hypothesized ways in which collaborative arrangements can contribute to addressing these challenges. Using social and ecological data from a large-scale biodiversity conservation initiative in Australia, we empirically determined how well the observed patterns of stakeholder interactions reflect these network configurations. We found that stakeholders collaborate to manage individual parcels of native vegetation, but not for the management of interconnected parcels. In addition, our data show that the collaborative arrangements enable management across different scales (local, regional, supraregional). Our study provides empirical support for the ability of collaborative forms of governance to address the problem of fit, but also suggests that in some cases the establishment of bottom-up collaborative arrangements would likely benefit from specific guidance to facilitate the establishment of collaborations that better align with the ways ecological resources are interconnected across the landscape. In our case study region, this would improve the capacity of stakeholders to

  7. Preferential effect of isoflurane on top-down versus bottom-up pathways in sensory cortex

    Directory of Open Access Journals (Sweden)

    Aeyal eRaz

    2014-10-01

    Full Text Available The mechanism of loss of consciousness (LOC) under anesthesia is unknown. Because consciousness depends on activity in the cortico-thalamic network, anesthetic actions on this network are likely critical for LOC. Competing theories stress the importance of anesthetic actions on bottom-up ‘core’ thalamo-cortical (TC) versus top-down cortico-cortical (CC) and matrix TC connections. We tested these models using laminar recordings in rat auditory cortex in vivo and murine brain slices. We selectively activated bottom-up vs. top-down afferent pathways using sensory stimuli in vivo and electrical stimulation in brain slices, and compared effects of isoflurane on responses evoked via the two pathways. Auditory stimuli in vivo and core TC afferent stimulation in brain slices evoked short latency current sinks in middle layers, consistent with activation of core TC afferents. By contrast, visual stimuli in vivo and stimulation of CC and matrix TC afferents in brain slices evoked responses mainly in superficial and deep layers, consistent with projection patterns of top-down afferents that carry visual information to auditory cortex. Responses to auditory stimuli in vivo and core TC afferents in brain slices were significantly less affected by isoflurane compared to responses triggered by visual stimuli in vivo and CC/matrix TC afferents in slices. At a just-hypnotic dose in vivo, auditory responses were enhanced by isoflurane, whereas visual responses were dramatically reduced. At a comparable concentration in slices, isoflurane suppressed both core TC and CC/matrix TC responses, but the effect on the latter responses was far greater than on core TC responses, indicating that at least part of the differential effects observed in vivo were due to local actions of isoflurane in auditory cortex. These data support a model in which disruption of top-down connectivity contributes to anesthesia-induced LOC, and have implications for understanding the neural

  8. Estimation of Emissions from Sugarcane Field Burning in Thailand Using Bottom-Up Country-Specific Activity Data

    Directory of Open Access Journals (Sweden)

    Wilaiwan Sornpoon

    2014-09-01

    Full Text Available Open burning in sugarcane fields is recognized as a major source of air pollution. However, the assessment of its emission intensity in many regions of the world still lacks information, especially regarding country-specific activity data including biomass fuel load and combustion factor. A site survey was conducted covering 13 sugarcane plantations subject to different farm management practices and climatic conditions. The results showed that pre-harvest and post-harvest burning are the two main practices followed in Thailand. In 2012, the total production of sugarcane biomass fuel, i.e., dead, dry and fresh leaves, amounted to 10.15 million tonnes, which is equivalent to a fuel density of 0.79 kg∙m−2. The average combustion factor for the pre-harvest and post-harvest burning systems was determined to be 0.64 and 0.83, respectively. Emissions from sugarcane field burning were estimated using the bottom-up country-specific values from the site survey of this study, and the results were compared with those obtained using default values from the 2006 IPCC Guidelines. The comparison showed that the use of default values leads to underestimating the overall emissions by up to 30%, as emissions from post-harvest burning, the second most common practice followed in Thailand, are not accounted for.
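
    The underlying calculation follows the generic activity-data form: burned area × fuel load × combustion factor × pollutant emission factor. The sketch below reuses the fuel density and pre-harvest combustion factor quoted above but assumes a hypothetical emission factor, so it is illustrative only.

```python
def burning_emissions_t(area_m2, fuel_load_kg_m2, combustion_factor, ef_g_per_kg):
    """Emissions (tonnes) of one pollutant from open burning of crop residue."""
    dry_matter_burned_kg = area_m2 * fuel_load_kg_m2 * combustion_factor
    return dry_matter_burned_kg * ef_g_per_kg / 1e6  # grams -> tonnes

# Example: a 1 km2 field burned pre-harvest, using the fuel density (0.79 kg/m2)
# and pre-harvest combustion factor (0.64) quoted in the abstract, with a
# hypothetical emission factor of 8 g of pollutant per kg of dry matter burned.
print(burning_emissions_t(area_m2=1e6, fuel_load_kg_m2=0.79,
                          combustion_factor=0.64, ef_g_per_kg=8.0))
```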

  9. A Family of Highly Efficient CuI-Based Lighting Phosphors Prepared by a Systematic, Bottom-up Synthetic Approach.

    Science.gov (United States)

    Liu, Wei; Fang, Yang; Wei, George Z; Teat, Simon J; Xiong, Kecai; Hu, Zhichao; Lustig, William P; Li, Jing

    2015-07-29

    Copper(I) iodide (CuI)-based inorganic-organic hybrid materials with the general chemical formula CuI(L) are well known for their structural diversity and strong photoluminescence and are therefore considered promising candidates for a number of optical applications. In this work, we demonstrate a systematic, bottom-up precursor approach to developing a series of CuI(L) network structures built on CuI rhomboid dimers. These compounds combine strong luminescence due to the CuI inorganic modules and significantly enhanced thermal stability as a result of connecting individual building units into robust, extended networks. Examination of their optical properties reveals that these materials not only exhibit exceptionally high photoluminescence performance (with internal quantum yield up to 95%) but also that their emission energy and color are systematically tunable through modification of the organic component. Results from density functional theory calculations provide convincing correlations between these materials' crystal structures and chemical compositions and their optophysical properties. The advantages of cost-effective, solution-processable, easily scalable and fully controllable synthesis, as well as high quantum efficiency with improved thermal stability, make this phosphor family a promising candidate for alternative, rare-earth-free (RE-free) phosphors in general lighting and illumination. This solution-based precursor approach creates a new blueprint for the rational design and controlled synthesis of inorganic-organic hybrid materials.

  10. Exploring the Life Expectancy Increase in Poland in the Context of CVD Mortality Fall: The Risk Assessment Bottom-Up Approach, From Health Outcome to Policies.

    Science.gov (United States)

    Kobza, Joanna; Geremek, Mariusz

    2015-01-01

    Life expectancy at birth is considered the best mortality-based summary indicator of the health status of the population and is useful for measuring long-term health changes. The objective of this article was to present the concept of the bottom-up policy risk assessment approach, developed to identify challenges involved in analyzing risk factor reduction policies and in assessing how the related health indicators have changed over time. This article focuses on the reasons for the significant life expectancy prolongation in Poland over the past 2 decades and thus includes the policy context. The methodology details a bottom-up risk assessment approach, a chain of relations between the health outcome, risk factors, and health policy, based on Risk Assessment From Policy to Impact Dimension project guidance. A decline in cardiovascular disease mortality was a key factor behind the life expectancy prolongation. Among basic factors, tobacco and alcohol consumption, diet, physical activity, and new treatment technologies were identified. Poor health outcomes of the Polish population at the beginning of the 1990s highlighted the need for the implementation of various health promotion programs, legal acts, and more effective public health policies. Evidence-based public health policy requires translating scientific research into policy and practice. The bottom-up case study template can be one of the focal tools in this process. Accountability for the health impact of policies and programs and legitimization of the decisions of policy makers have become key questions nowadays in European countries' decision-making processes and in EU public health strategy.

  11. A bottom-up approach to estimating cost elements of REDD+ pilot projects in Tanzania

    Directory of Open Access Journals (Sweden)

    Merger Eduard

    2012-08-01

    Full Text Available Background: Several previous global REDD+ cost studies have been conducted, demonstrating that payments for maintaining forest carbon stocks have significant potential to be a cost-effective mechanism for climate change mitigation. These studies have mostly followed highly aggregated top-down approaches without estimating the full range of REDD+ cost elements, thus underestimating the actual costs of REDD+. Based on three REDD+ pilot projects in Tanzania, representing an area of 327,825 ha, this study explicitly adopts a bottom-up approach to data assessment. By estimating opportunity, implementation, transaction and institutional costs of REDD+, we develop a practical and replicable methodological framework to consistently assess REDD+ cost elements. Results: Based on historical land use change patterns, current region-specific economic conditions and carbon stocks, project-specific opportunity costs ranged between US$ -7.8 and 28.8 per tCO2 for deforestation and forest degradation drivers such as agriculture, fuel wood production, unsustainable timber extraction and pasture expansion. The mean opportunity costs for the three projects ranged between US$ 10.1 – 12.5 per tCO2. Implementation costs comprised between 89% and 95% of total project costs (excluding opportunity costs), ranging between US$ 4.5 – 12.2 per tCO2 for a period of 30 years. Transaction costs for measurement, reporting and verification (MRV), and other carbon market related compliance costs comprised a minor share, between US$ 0.21 – 1.46 per tCO2. Similarly, the institutional costs comprised around 1% of total REDD+ costs, in a range of US$ 0.06 – 0.11 per tCO2. Conclusions: The use of bottom-up approaches to estimate REDD+ economics by considering regional variations in economic conditions and carbon stocks has been shown to be an appropriate way to provide policy and decision-makers with robust economic information on REDD+. The assessment of opportunity costs is a crucial first step to
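
    A minimal sketch of the per-hectare opportunity-cost element that such a bottom-up assessment aggregates: the forgone net present value of the best alternative land use relative to keeping the forest, divided by the avoided CO2 emissions. The discount rate, time horizon, returns and carbon stock below are placeholder assumptions, not values from the Tanzanian projects.

```python
def npv(annual_net_return_usd_ha, years=30, discount_rate=0.10):
    """Net present value of a constant annual per-hectare return (placeholder terms)."""
    return sum(annual_net_return_usd_ha / (1.0 + discount_rate) ** t
               for t in range(1, years + 1))

def opportunity_cost_per_tco2(alt_use_return_usd_ha, forest_return_usd_ha,
                              carbon_stock_tc_ha):
    avoided_tco2_ha = carbon_stock_tc_ha * 44.0 / 12.0   # tonnes C -> tonnes CO2
    forgone_npv = npv(alt_use_return_usd_ha) - npv(forest_return_usd_ha)
    return forgone_npv / avoided_tco2_ha

# Example with assumed values: smallholder agriculture vs. keeping the forest.
print(opportunity_cost_per_tco2(alt_use_return_usd_ha=120.0,
                                forest_return_usd_ha=20.0,
                                carbon_stock_tc_ha=60.0))
```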

  12. Conservative and dissipative force field for simulation of coarse-grained alkane molecules: A bottom-up approach

    Energy Technology Data Exchange (ETDEWEB)

    Trément, Sébastien; Rousseau, Bernard, E-mail: bernard.rousseau@u-psud.fr [Laboratoire de Chimie-Physique, UMR 8000 CNRS, Université Paris-Sud, Orsay (France); Schnell, Benoît; Petitjean, Laurent; Couty, Marc [Manufacture Française des Pneumatiques MICHELIN, Centre de Ladoux, 23 place des Carmes, 63000 Clermont-Ferrand (France)

    2014-04-07

    We apply operational procedures available in the literature to the construction of coarse-grained conservative and friction forces for use in dissipative particle dynamics (DPD) simulations. The full procedure relies on a bottom-up approach: large molecular dynamics trajectories of n-pentane and n-decane modeled with an anisotropic united atom model serve as input for the force field generation. As a consequence, the coarse-grained model is expected to reproduce at least semi-quantitatively structural and dynamical properties of the underlying atomistic model. Two different coarse-graining levels are studied, corresponding to five and ten carbon atoms per DPD bead. The influence of the coarse-graining level on the generated force field contributions, namely the conservative and the friction parts, is discussed. It is shown that the coarse-grained model of n-pentane correctly reproduces self-diffusion and viscosity coefficients of real n-pentane, while the fully coarse-grained model for n-decane at ambient temperature over-predicts diffusion by a factor of 2. However, when the n-pentane coarse-grained model is used as a building block for a larger molecule (e.g., n-decane as a two-blob model), a much better agreement with experimental data is obtained, suggesting that the force field constructed is transferable to large macromolecular systems.

  13. A Nonminimal SO(10) x U(1)-F SUSY GUT model obtained from a bottom up approach

    Energy Technology Data Exchange (ETDEWEB)

    Albright, Carl H.

    1996-08-01

    Many of the ingredients are explored which are needed to develop a supersymmetric SO(10) x U(1)_F grand unified model based on the Yukawa structure of a model previously constructed in collaboration with S. Nandi to explain the quark and lepton masses and mixings in a particular neutrino scenario. The U(1)_F family symmetry can be made anomaly-free with the introduction of one conjugate pair of SO(10)-singlet neutrinos with the same U(1)_F charge. Due to a plethora of conjugate pairs of supermultiplets, the model develops a Landau singularity within a factor of 1.5 above the GUT scale. With the imposition of a Z_2 discrete symmetry and under certain conditions, all higgsino triplets can be made superheavy while just one pair of higgsino doublets remains light and results in mass matrix textures previously obtained from the bottom-up approach. Diametrically opposite splitting of the first and third family scalar quark and lepton masses away from the second family ones results from the nonuniversal D-term contributions.

  14. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion.

    Science.gov (United States)

    Xiu, Daiming; Geiger, Maximilian J; Klaver, Peter

    2015-01-01

    This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive ("happy"), neutral and negative ("angry" or "fearful") faces. Dynamic Causal Modeling (DCM) was applied to the functional magnetic resonance imaging (fMRI) data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala, and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex, the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  15. Pressurized Pepsin Digestion in Proteomics: An Automatable Alternative to Trypsin for Integrated Top-down Bottom-up Proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Lopez-Ferrer, Daniel; Petritis, Konstantinos; Robinson, Errol W.; Hixson, Kim K.; Tian, Zhixin; Lee, Jung Hwa; Lee, Sang-Won; Tolic, Nikola; Weitz, Karl K.; Belov, Mikhail E.; Smith, Richard D.; Pasa-Tolic, Ljiljana

    2011-02-01

    Integrated top-down bottom-up proteomics combined with online digestion has great potential to improve the characterization of protein isoforms in biological systems and is amenable to high-throughput proteomics experiments. Bottom-up proteomics ultimately provides the peptide sequences derived from the tandem MS analyses of peptides after the proteome has been digested. Top-down proteomics conversely entails the MS analyses of intact proteins for more effective characterization of genetic variations and/or post-translational modifications (PTMs). Herein, we describe recent efforts towards efficient integration of bottom-up and top-down LC-MS based proteomic strategies. Since most proteomic platforms (i.e. LC systems) operate in acidic environments, we exploited the compatibility of pepsin (i.e., the enzyme's natural activity in acidic environments) for the integration of bottom-up and top-down proteomics. Pressure enhanced pepsin digestions were successfully performed and characterized with several standard proteins in either an offline mode using a Barocycler or an online mode using a modified high pressure LC system referred to as a fast online digestion system (FOLDS). FOLDS was tested using pepsin and a whole microbial proteome, and the results compared against traditional trypsin digestions on the same platform. Additionally, FOLDS was integrated with a RePlay configuration to demonstrate an ultra-rapid integrated bottom-up top-down proteomic strategy employing a standard mixture of proteins and a monkeypox virus proteome.

  16. Emotional face expression modulates occipital-frontal effective connectivity during memory formation in a bottom-up fashion

    Directory of Open Access Journals (Sweden)

    Daiming eXiu

    2015-04-01

    Full Text Available This study investigated the role of bottom-up and top-down neural mechanisms in the processing of emotional face expression during memory formation. Functional brain imaging data was acquired during incidental learning of positive (‘happy’), neutral and negative (‘angry’ or ‘fearful’) faces. Dynamic Causal Modeling (DCM) was applied to the fMRI data to characterize effective connectivity within a brain network involving face perception (inferior occipital gyrus and fusiform gyrus) and successful memory formation related areas (hippocampus, superior parietal lobule, amygdala and orbitofrontal cortex). The bottom-up models assumed processing of emotional face expression along feed forward pathways to the orbitofrontal cortex. The top-down models assumed that the orbitofrontal cortex processed emotional valence and mediated connections to the hippocampus. A subsequent recognition memory test showed an effect of negative emotion on the response bias, but not on memory performance. Our DCM findings showed that the bottom-up model family of effective connectivity best explained the data across all subjects and specified that emotion affected most bottom-up connections to the orbitofrontal cortex, especially from the occipital visual cortex and superior parietal lobule. Of those pathways to the orbitofrontal cortex, the connection from the inferior occipital gyrus correlated with memory performance independently of valence. We suggest that bottom-up neural mechanisms support effects of emotional face expression and memory formation in a parallel and partially overlapping fashion.

  17. A bottom-up approach to identifying the maximum operational adaptive capacity of water resource systems to a changing climate

    Science.gov (United States)

    Culley, S.; Noble, S.; Yates, A.; Timbs, M.; Westra, S.; Maier, H. R.; Giuliani, M.; Castelletti, A.

    2016-09-01

    Many water resource systems have been designed assuming that the statistical characteristics of future inflows are similar to those of the historical record. This assumption is no longer valid due to large-scale changes in the global climate, potentially causing declines in water resource system performance, or even complete system failure. Upgrading system infrastructure to cope with climate change can require substantial financial outlay, so it might be preferable to optimize existing system performance when possible. This paper builds on decision scaling theory by proposing a bottom-up approach to designing optimal feedback control policies for a water system exposed to a changing climate. This approach not only describes optimal operational policies for a range of potential climatic changes but also enables an assessment of the upper limit of a system's operational adaptive capacity, beyond which upgrades to infrastructure become unavoidable. The approach is illustrated using the Lake Como system in Northern Italy—a regulated system with a complex relationship between climate and system performance. By optimizing system operation under different hydrometeorological states, it is shown that the system can continue to meet its minimum performance requirements for more than three times as many states as it can under current operations. Importantly, a single management policy, no matter how robust, cannot fully utilize existing infrastructure as effectively as an ensemble of flexible management policies that are updated as the climate changes.
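
    The decision-scaling logic described here can be illustrated with a toy stress test: sweep a grid of perturbed climate states, check whether a single fixed operating policy still meets a minimum performance threshold in each state, and compare that against policies re-optimized for each state. The functions and numbers below are hypothetical placeholders, not the Lake Como model.

    ```python
    import itertools

    # Conceptual stress-test sketch; simulate/optimize_policy are user-supplied stand-ins.
    def adaptive_capacity(simulate, optimize_policy, fixed_policy,
                          dT_range, dP_range, threshold):
        """Count climate states in which performance requirements can still be met."""
        ok_fixed, ok_adaptive = 0, 0
        for dT, dP in itertools.product(dT_range, dP_range):
            climate = {"delta_T": dT, "delta_P": dP}
            if simulate(fixed_policy, climate) >= threshold:
                ok_fixed += 1                    # current operations still suffice
            best = optimize_policy(climate)      # policy re-optimized for this state
            if simulate(best, climate) >= threshold:
                ok_adaptive += 1                 # within operational adaptive capacity
        return ok_fixed, ok_adaptive

    # Toy example: performance falls with warming/drying; adaptation adds a small bonus.
    toy_sim = lambda policy, c: 0.9 - 0.1 * c["delta_T"] + 0.5 * c["delta_P"] + policy
    toy_opt = lambda c: 0.05
    print(adaptive_capacity(toy_sim, toy_opt, fixed_policy=0.0,
                            dT_range=[0, 1, 2, 3], dP_range=[-0.2, 0.0, 0.2],
                            threshold=0.8))
    ```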

  18. Bottom-up low molecular weight heparin analysis using liquid chromatography-Fourier transform mass spectrometry for extensive characterization.

    Science.gov (United States)

    Li, Guoyun; Steppich, Julia; Wang, Zhenyu; Sun, Yi; Xue, Changhu; Linhardt, Robert J; Li, Lingyun

    2014-07-01

    Low molecular weight heparins (LMWHs) are heterogeneous, polydisperse, and highly negatively charged mixtures of glycosaminoglycan chains prescribed as anticoagulants. The detailed characterization of LMWH is important for drug quality assurance and for new drug research and development. In this study, online hydrophilic interaction chromatography (HILIC) Fourier transform mass spectrometry (FTMS) was applied to analyze the oligosaccharide fragments of LMWHs generated by heparin lyase II digestion. More than 40 oligosaccharide fragments of LMWH were quantified and used to compare LMWHs prepared by three different manufacturers. The quantified fragment structures included unsaturated disaccharides/oligosaccharides arising from the prominent repeating units of these LMWHs, 3-O-sulfo containing tetrasaccharides arising from their antithrombin III binding sites, 1,6-anhydro ring-containing oligosaccharides formed during their manufacture, saturated uronic acid oligosaccharides coming from some chain nonreducing ends, and oxidized linkage region oligosaccharides coming from some chain reducing ends. This bottom-up approach provides rich detailed structural analysis and quantitative information with high accuracy and reproducibility. When combined with the top-down approach, HILIC LC-FTMS based analysis should be suitable for advanced quality control and quality assurance in LMWH production.

  19. Bottom-Up Abstract Modelling of Optical Networks-on-Chip: From Physical to Architectural Layer

    Directory of Open Access Journals (Sweden)

    Alberto Parini

    2012-01-01

    Full Text Available This work presents a bottom-up abstraction procedure based on the design-flow FDTD + SystemC suitable for the modelling of optical Networks-on-Chip. In this procedure, a complex network is decomposed into elementary switching elements whose input-output behavior is described by means of scattering-parameter models. The parameters of each elementary block are then determined through 2D-FDTD simulation, and the resulting analytical models are exported within functional blocks in the SystemC environment. The inherent modularity and scalability of the S-matrix formalism are preserved inside SystemC, thus allowing the incremental composition and successive characterization of complex topologies typically out of reach for full-vectorial electromagnetic simulators. The consistency of the outlined approach is verified, in the first instance, by performing a SystemC analysis of a four-input, four-output port switch and making a comparison with the results of 2D-FDTD simulations of the same device. Finally, a further complex network encompassing 160 microrings is investigated, the losses over each routing path are calculated, and the minimum amount of power needed to guarantee an assigned BER is determined. This work is a basic step in the direction of an automatic technology-aware network-level simulation framework capable of assembling complex optical switching fabrics, while at the same time assessing the practical feasibility and effectiveness at the physical/technological level.
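
    Once each elementary switching element has been characterized, estimating the loss over a routing path and the minimum injected power for a target BER reduces to simple bookkeeping in decibels. The sketch below illustrates that bookkeeping with placeholder loss figures and receiver sensitivity; it is not the S-matrix composition used in the paper.

    ```python
    # Sketch of a path-loss / power-budget estimate for an optical NoC route.
    # Loss figures, receiver sensitivity and margin are illustrative placeholders.

    def path_loss_db(element_losses):
        """Total insertion loss of a route as the sum of per-element losses (dB)."""
        return sum(element_losses)

    def min_injected_power_dbm(path_db, sensitivity_dbm, margin_db=3.0):
        """Minimum laser power so the detector still meets the target BER."""
        return sensitivity_dbm + path_db + margin_db

    route = [0.5, 1.2, 0.5, 1.2, 0.5]   # e.g. waveguide segments and ring crossings, dB
    loss = path_loss_db(route)
    print(f"route loss = {loss:.1f} dB, "
          f"min input power = {min_injected_power_dbm(loss, -20.0):.1f} dBm")
    ```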

  20. Bottom-Up, Wet Chemical Technique for the Continuous Synthesis of Inorganic Nanoparticles

    Directory of Open Access Journals (Sweden)

    Annika Betke

    2014-01-01

    Full Text Available Continuous wet chemical approaches for the production of inorganic nanoparticles are important for large scale production of nanoparticles. Here we describe a bottom-up, wet chemical method applying a microjet reactor. This technique allows the separation between nucleation and growth in a continuous reactor environment. Zinc oxide (ZnO), magnetite (Fe3O4), as well as brushite (CaHPO4·2H2O) particles with a small particle size distribution can be obtained continuously by using the rapid mixing of two precursor solutions and the fast removal of the nuclei from the reaction environment. The final particles were characterized by FT-IR, TGA, DLS, XRD and SEM techniques. Systematic studies on the influence of the different process parameters, such as flow rate and process temperature, show that the particle size can be influenced. Zinc oxide was obtained with particle sizes between 44 nm and 102 nm. The obtained magnetite particles have particle sizes in the range of 46 nm to 132 nm. Brushite behaves differently; the obtained particles were shaped like small plates with edge lengths between 100 nm and 500 nm.

  1. Two Paths to Transforming Markets through Public Sector Energy Efficiency: Bottom Up versus Top Down

    Energy Technology Data Exchange (ETDEWEB)

    Van Wie McGrory, Laura; Coleman, Philip; Fridley, David; Harris, Jeffrey; Villasenor Franco, Edgar

    2006-05-10

    The evolution of government purchasing initiatives in Mexico and China, part of the PEPS (Promoting an Energy-efficient Public Sector) program, demonstrates the need for flexibility in designing energy-efficiency strategies in the public sector. Several years of pursuing a top-down (federally led) strategy in Mexico produced few results, and it was not until the program was restructured in 2004 to focus on municipal-level purchasing that the program gained momentum. Today, a new partnership with the Mexican federal government is leading to an intergovernmental initiative with strong support at the federal level. By contrast, the PEPS purchasing initiative in China was successfully initiated and led at the central government level with strategic support from international experts. The very different success trajectories in these two countries provide valuable lessons for designing country-specific public sector energy-efficiency initiatives. Enabling conditions for any successful public sector purchasing initiative include the existence of mandatory energy-efficiency performance standards, an effective energy-efficiency endorsement labeling program, an immediate need for energy conservation, a simple pilot phase (focusing on a limited number of strategically chosen products), and specialized technical assistance. Top-down purchasing programs are likely to be more successful where there is high-level political endorsement and a national procurement law in place, supported by a network of trained purchasers. Bottom-up (municipally led) purchasing programs require that municipalities have the authority to set their own purchasing policies, and also benefit from existing networks of cities, supported by motivated municipal leaders and trained purchasing officials.

  2. Spinodal nanotechnology as a new class of bottom-up one and applications

    Science.gov (United States)

    Katayama-Yoshida, Hiroshi; Fukushima, Tetsuya; Kizaki, Hidetoshi; Oshitani, Masamune; Sato, Kazunori

    2010-03-01

    We discuss the nano-materials design of spinodal nano-decomposition as a new class of bottom-up nanotechnology by combining ab initio calculations and kinetic Monte Carlo simulations. We include all the complexity in the fabrication process of spinodal nano-decomposition (Konbu- and Dairiseki-phase) into advanced materials design with inhomogeneous materials. We compare the theoretical predictions with available experiments, such as (i) semiconductor nano-spintronics in dilute magnetic semiconductors, (ii) colossal thermoelectric-power responses of spincaloritronics, (iii) self-repaired nano-catalysis in La(Fe,Pd)O3, (iv) high-efficiency solar cells, and (v) high-efficiency light-emitting diodes and lasers. (1) K. Sato, et al., Reviews of Modern Physics, in printing (2009). (2) H. Katayama-Yoshida, et al., Handbook of Spintronic Semiconductors, (Pan Stanford Pub.), p.1-79, (2009). (3) H. Katayama-Yoshida, et al., Semiconductors and Semimetals, 82, 433 (2008). (4) H. Katayama-Yoshida, et al., Jpn. J. Appl. Phys. 46, L777 (2007). (5) H. Kizaki, et al., Applied Physics Express 1, 104001, (2008).

  3. The Early Anthropogenic Hypothesis: Top-Down and Bottom-up Evidence

    Science.gov (United States)

    Ruddiman, W. F.

    2014-12-01

    Two complementary lines of evidence support the early anthropogenic hypothesis. Top-down evidence comes from comparing Holocene greenhouse-gas trends with those during equivalent intervals of previous interglaciations. The increases in CO2 and CH4 during the late Holocene are anomalous compared to the decreasing trends in a stacked average of previous interglaciations, thereby supporting an anthropogenic origin. During interglacial stage 19, the closest Holocene insolation analog, CO2 fell to 245 ppm by the time equivalent to the present, in contrast to the observed pre-industrial rise to 280-285 ppm. The 245-ppm level measured in stage 19 falls at the top of the natural range predicted by the original anthropogenic hypothesis of Ruddiman (2003). Bottom-up evidence comes from a growing list of archeological and other compilations showing major early anthropogenic transformations of Earth's surface. Key examples include: efforts by Dorian Fuller and colleagues mapping the spread of irrigated rice agriculture across southern Asia and its effects on CH4 emissions prior to the industrial era; an additional effort by Fuller showing the spread of methane-emitting domesticated livestock across Asia and Africa (coincident with the spread of fertile crescent livestock across Europe); historical compilations by Jed Kaplan and colleagues documenting very high early per-capita forest clearance in Europe, thus underpinning simulations of extensive pre-industrial clearance and large CO2 emissions; and wide-ranging studies by Erle Ellis and colleagues of early anthropogenic land transformations in China and elsewhere.

  4. A "bottom up" governance framework for developing Australia's marine Spatial Data Infrastructure (SDI

    Directory of Open Access Journals (Sweden)

    K T Finney

    2007-07-01

    Full Text Available Spatial Data Infrastructures (SDIs) have been developing in some countries for over 10 years but still suffer from having a relatively small installed base. Most SDIs will soon converge around a service-oriented architecture (SOA) using IT standards promulgated primarily by the Open Geospatial Consortium (OGC) and ISO Technical Committee 211. There are very few examples of these types of architected SDIs in action, and as a result little detailed information exists on suitable governance models. This paper discusses the governance issues that are posed by SOA-based SDIs, particularly those issues surrounding standards and services management, with reference to an Australian marine case study and the general literature. A generalised governance framework is then postulated using an idealised use case model which is applicable for "bottom-up," community-based initiatives. This model incorporates guiding principles and motivational and self-regulation instruments that are characteristically found in successful open source development activities. It is argued that harnessing an open development model, using a voluntary workforce, could rapidly increase the size of the SDI installed base and importantly defray infrastructure build costs.

  5. A Bottom-Up Approach for Automatically Grouping Sensor Data Layers by their Observed Property

    Directory of Open Access Journals (Sweden)

    Steve H.L. Liang

    2013-01-01

    Full Text Available The Sensor Web is a growing phenomenon where an increasing number of sensors are collecting data in the physical world, to be made available over the Internet. To help realize the Sensor Web, the Open Geospatial Consortium (OGC) has developed open standards to standardize the communication protocols for sharing sensor data. Spatial Data Infrastructures (SDIs) are systems that have been developed to access, process, and visualize geospatial data from heterogeneous sources, and SDIs can be designed specifically for the Sensor Web. However, there are problems with interoperability associated with a lack of standardized naming, even with data collected using the same open standard. The objective of this research is to automatically group similar sensor data layers based on the phenomenon they measure. Our methodology is based on a unique bottom-up approach that uses text processing, approximate string matching, and semantic string matching of data layers. We use WordNet as a lexical database to compute word pair similarities and derive a set-based dissimilarity function using those scores. Two approaches are taken to group data layers: a mapping is defined between all the data layers, and clustering is performed to group similar data layers. We evaluate the results of our methodology.
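
    A minimal sketch of the word-pair similarity idea is shown below, using NLTK's WordNet interface and Wu-Palmer similarity; the set-based aggregation rule here is an assumption for illustration and not necessarily the authors' exact dissimilarity function.

    ```python
    from itertools import product
    from nltk.corpus import wordnet as wn   # requires the NLTK WordNet corpus to be installed

    def word_similarity(w1, w2):
        """Best Wu-Palmer similarity over all synset pairs of two words (0..1)."""
        scores = [s1.wup_similarity(s2) or 0.0
                  for s1, s2 in product(wn.synsets(w1), wn.synsets(w2))]
        return max(scores, default=0.0)

    def layer_dissimilarity(tokens_a, tokens_b):
        """Set-based dissimilarity between two layer names (assumed aggregation:
        1 minus the mean of each token's best match in the other token set)."""
        best_a = [max(word_similarity(a, b) for b in tokens_b) for a in tokens_a]
        best_b = [max(word_similarity(b, a) for a in tokens_a) for b in tokens_b]
        return 1.0 - (sum(best_a) + sum(best_b)) / (len(best_a) + len(best_b))

    print(layer_dissimilarity(["air", "temperature"], ["atmospheric", "temperature"]))
    ```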

  6. Preparation of hydrocortisone nanosuspension through a bottom-up nanoprecipitation technique using microfluidic reactors.

    Science.gov (United States)

    Ali, Hany S M; York, Peter; Blagden, Nicholas

    2009-06-22

    In this work, the possibility of bottom-up creation of a relatively stable aqueous hydrocortisone nanosuspension using microfluidic reactors was examined. The first part of the work involved a study of the parameters of the microfluidic precipitation process that affect the size of generated drug particles. These parameters included flow rates of drug solution and antisolvent, microfluidic channel diameters, microreactor inlet angles and drug concentrations. The experimental results revealed that hydrocortisone nano-sized dispersions in the range of 80-450 nm were obtained and the mean particle size could be changed by modifying the experimental parameters and design of the microreactors. The second part of the work studied the possibility of preparing a hydrocortisone nanosuspension using microfluidic reactors. The nano-sized particles generated from a microreactor were rapidly introduced into an aqueous solution of stabilizers stirred at high speed with a propeller mixer. A tangential flow filtration system was then used to concentrate the prepared nanosuspension. The nanosuspension produced was then characterized using photon correlation spectroscopy (PCS), zeta potential measurement, transmission electron microscopy (TEM), differential scanning calorimetry (DSC) and X-ray analysis. Results showed that a narrow sized nanosuspension composed of amorphous spherical particles with a mean particle size of 500+/-64 nm, a polydispersity index of 0.21+/-0.026 and a zeta potential of -18+/-2.84 mV was obtained. Physical stability studies showed that the hydrocortisone nanosuspension remained homogeneous with a slight increase in mean particle size and polydispersity index over a 3-month period.

  7. Programmable chemical reaction networks: emulating regulatory functions in living cells using a bottom-up approach.

    Science.gov (United States)

    van Roekel, Hendrik W H; Rosier, Bas J H M; Meijer, Lenny H H; Hilbers, Peter A J; Markvoort, Albert J; Huck, Wilhelm T S; de Greef, Tom F A

    2015-11-07

    Living cells are able to produce a wide variety of biological responses when subjected to biochemical stimuli. It has become apparent that these biological responses are regulated by complex chemical reaction networks (CRNs). Unravelling the function of these circuits is a key topic of both systems biology and synthetic biology. Recent progress at the interface of chemistry and biology together with the realisation that current experimental tools are insufficient to quantitatively understand the molecular logic of pathways inside living cells has triggered renewed interest in the bottom-up development of CRNs. This builds upon earlier work of physical chemists who extensively studied inorganic CRNs and showed how a system of chemical reactions can give rise to complex spatiotemporal responses such as oscillations and pattern formation. Using purified biochemical components, in vitro synthetic biologists have started to engineer simplified model systems with the goal of mimicking biological responses of intracellular circuits. Emulation and reconstruction of system-level properties of intracellular networks using simplified circuits are able to reveal key design principles and molecular programs that underlie the biological function of interest. In this Tutorial Review, we present an accessible overview of this emerging field starting with key studies on inorganic CRNs followed by a discussion of recent work involving purified biochemical components. Finally, we review recent work showing the versatility of programmable biochemical reaction networks (BRNs) in analytical and diagnostic applications.

  8. Ursgal, Universal Python Module Combining Common Bottom-Up Proteomics Tools for Large-Scale Analysis.

    Science.gov (United States)

    Kremer, Lukas P M; Leufken, Johannes; Oyunchimeg, Purevdulam; Schulze, Stefan; Fufezan, Christian

    2016-03-04

    Proteomics data integration has become a broad field with a variety of programs offering innovative algorithms to analyze increasing amounts of data. Unfortunately, this software diversity leads to many problems as soon as the data is analyzed using more than one algorithm for the same task. Although it was shown that the combination of multiple peptide identification algorithms yields more robust results, it is only recently that unified approaches are emerging; however, workflows that, for example, aim to optimize search parameters or that employ cascaded style searches can only be made accessible if data analysis becomes not only unified but also, most importantly, scriptable. Here we introduce Ursgal, a Python interface to many commonly used bottom-up proteomics tools and to additional auxiliary programs. Complex workflows can thus be composed in a few lines of Python code. Ursgal is easily extensible, and we have made several database search engines (X!Tandem, OMSSA, MS-GF+, Myrimatch, MS Amanda), statistical postprocessing algorithms (qvality, Percolator), and one algorithm that combines statistically postprocessed outputs from multiple search engines ("combined FDR") accessible as an interface in Python. Furthermore, we have implemented a new algorithm ("combined PEP") that combines multiple search engines employing elements of "combined FDR", PeptideShaker, and Bayes' theorem.

  9. Sustainability and Uncertainty: Bottom-Up and Top-Down Approaches

    Directory of Open Access Journals (Sweden)

    K. Klint Jensen

    2010-04-01

    Full Text Available The widely used concept of sustainability is seldom precisely defined, and its clarification involves making up one’s mind about a range of difficult questions. One line of research (bottom-up) takes sustaining a system over time as its starting point and then infers prescriptions from this requirement. Another line (top-down) takes an economic interpretation of the Brundtland Commission’s suggestion that the present generation’s need-satisfaction should not compromise the need-satisfaction of future generations as its starting point. It then measures sustainability at the level of society and infers prescriptions from this requirement. These two approaches may conflict, and in this conflict the top-down approach has the upper hand, ethically speaking. However, the implicit goal in the top-down approach of justice between generations needs to be refined in several dimensions. But even given a clarified ethical goal, disagreements can arise. At present we do not know what substitutions will be possible in the future. This uncertainty clearly affects the prescriptions that follow from the measure of sustainability. Consequently, decisions about how to make future agriculture sustainable are decisions under uncertainty. There might be different judgments on likelihoods; but even given some set of probabilities, there might be disagreement on the right level of precaution in face of the uncertainty.

  10. Top-down and bottom-up analysis of commercial enoxaparins.

    Science.gov (United States)

    Liu, Xinyue; St Ange, Kalib; Lin, Lei; Zhang, Fuming; Chi, Lianli; Linhardt, Robert J

    2017-01-13

    A strategy for the comprehensive analysis of low molecular weight (LMW) heparins is described that relies on using an integrated top-down and bottom-up approach. Liquid chromatography-mass spectrometry, an essential component of this approach, is rapid, robust, and amenable to automated processing and interpretation. Nuclear magnetic resonance spectroscopy provides complementary top-down information on the chirality of the uronic acid residues comprising a low molecular weight heparin. Using our integrated approach, four different low molecular weight heparins prepared from porcine heparin through chemical β-eliminative cleavage were comprehensively analyzed. Lovenox™ and Clexane™, the innovator versions of enoxaparin marketed in the US and Europe, respectively, and two generic enoxaparins, from Sandoz and Teva, were analyzed. The results, which were supported by analysis of variance (ANOVA), while showing remarkable similarities between different versions of the product and good lot-to-lot consistency of each product, also detect subtle differences that may result from differences in their manufacturing processes or differences in the source (or parent) porcine heparin from which each product is prepared.

  11. A bottom-up innovation model: Museu de Favela (MUF)

    Directory of Open Access Journals (Sweden)

    Natália Nakano

    2013-12-01

    Full Text Available This article aims to present, describe and discuss the innovation model of the first open-air territorial museum conceived in a favela in Rio de Janeiro, the Museu de Favela (MUF). It introduces the concept of the favela and distinguishes the traditional museum from ecomuseums in order to contextualize the universe of the MUF. The article discusses the concept of the collection of an open-air territorial museum and how curatorial work is carried out in this context, as well as the types of interaction possible with the diversity of individuals served by a museum such as the MUF. It further discusses the role of this new museological typology in society, based on the entities created by the bottom-up innovation carried out by the MUF initiative within the new museology of action. It concludes with considerations regarding the shift in focus of the role played by the MUF as an agent of social and cultural development.

  12. Visionmaker NYC: A bottom-up approach to finding shared socioeconomic pathways in New York City

    Science.gov (United States)

    Sanderson, E. W.; Fisher, K.; Giampieri, M.; Barr, J.; Meixler, M.; Allred, S. B.; Bunting-Howarth, K. E.; DuBois, B.; Parris, A. S.

    2015-12-01

    Visionmaker NYC is a free, public participatory, bottom-up web application to develop and share climate mitigation and adaptation strategies for New York City neighborhoods. The goal is to develop shared socioeconomic pathways by allowing a broad swath of community members - from schoolchildren to architects and developers to the general public - to input their concepts for a desired future. Visions are composed of climate scenarios, lifestyle choices, and ecosystem arrangements, where ecosystems are broadly defined to include built ecosystems (e.g. apartment buildings, single family homes, etc.), transportation infrastructure (e.g. highways, connector roads, sidewalks), and natural land cover types (e.g. wetlands, forests, estuary). Metrics of water flows, carbon cycling, biodiversity patterns, and population are estimated for the user's vision, for the same neighborhood today, and for that neighborhood as it existed in the pre-development state, based on the Welikia Project (welikia.org). Users can keep visions private, share them with self-defined groups of other users, or distribute them publicly. Users can also propose "challenges" - specific desired states of metrics for specific parts of the city - and others can post visions in response. Visionmaker contributes by combining scenario planning, scientific modelling, and social media to create new, wide-open possibilities for discussion, collaboration, and imagination regarding future, shared socioeconomic pathways.

  13. A Computational Strategy to Analyze Label-Free Temporal Bottom-up Proteomics Data

    Energy Technology Data Exchange (ETDEWEB)

    Du, Xiuxia; Callister, Stephen J.; Manes, Nathan P.; Adkins, Joshua N.; Alexandridis, Roxana A.; Zeng, Xiaohua; Roh, Jung Hyeob; Smith, William E.; Donohue, Timothy J.; Kaplan, Samuel; Smith, Richard D.; Lipton, Mary S.

    2008-07-01

    Motivation: Biological systems are in a continual state of flux, which necessitates an understanding of the dynamic nature of protein abundances. The study of protein abundance dynamics has become feasible with recent improvements in mass spectrometry-based quantitative proteomics. However, a number of challenges still remain related to how best to extract biological information from dynamic proteomics data; for example, challenges related to extraneous variability, missing abundance values, and the identification of significant temporal patterns. Results: This article describes a strategy that addresses the aforementioned issues for the analysis of temporal bottom-up proteomics data. The core strategy for the data analysis algorithms and subsequent data interpretation was formulated to take advantage of the temporal properties of the data. The analysis procedure presented herein was applied to data from a Rhodobacter sphaeroides 2.4.1 time-course study. The results were in close agreement with existing knowledge about R. sphaeroides, therefore demonstrating the utility of this analytical strategy.
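
    The kinds of preprocessing steps the abstract alludes to (removing extraneous variability, handling missing abundance values, extracting temporal patterns) can be sketched on a synthetic peptide-by-timepoint matrix as below; the thresholds and the correlation-based grouping are illustrative assumptions, not the published algorithm.

    ```python
    import numpy as np

    # Illustrative handling of a peptide-by-timepoint abundance matrix (log2 scale).
    rng = np.random.default_rng(0)
    X = rng.normal(20, 2, size=(100, 8))
    X[rng.random(X.shape) < 0.15] = np.nan             # simulate missing abundance values

    # 1) median-center each timepoint to remove run-to-run (extraneous) variability
    X = X - np.nanmedian(X, axis=0)

    # 2) drop peptides observed in fewer than half of the timepoints
    keep = np.sum(~np.isnan(X), axis=1) >= X.shape[1] // 2
    X = X[keep]

    # 3) z-score each peptide profile and inspect pairwise correlations of temporal patterns
    Z = (X - np.nanmean(X, axis=1, keepdims=True)) / np.nanstd(X, axis=1, keepdims=True)
    corr = np.corrcoef(np.nan_to_num(Z))
    print(f"{Z.shape[0]} peptides retained; mean pairwise correlation {corr.mean():.2f}")
    ```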

  14. Bottom-up control of geomagnetic secular variation by the Earth's inner core

    DEFF Research Database (Denmark)

    Aubert, Julien; Finlay, Chris; Fournier, Alexandre

    2013-01-01

    Temporal changes in the Earth’s magnetic field, known as geomagnetic secular variation, occur most prominently at low latitudes in the Atlantic hemisphere [1, 2] (that is, from −90 degrees east to 90 degrees east), whereas in the Pacific hemisphere there is comparatively little activity. This is a consequence of the geographical localization of intense, westward drifting, equatorial magnetic flux patches at the core surface [3]. Despite successes in explaining the morphology of the geomagnetic field [4], numerical models of the geodynamo have so far failed to account systematically for this striking pattern of geomagnetic secular variation. Here we show that it can be reproduced provided that two mechanisms relying on the inner core are jointly considered. First, gravitational coupling [5] aligns the inner core with the mantle, forcing the flow of liquid metal in the outer core into a giant, westward drifting, sheet...

  15. Bottom-up simulations of methane and ethane emissions from global oil and gas systems 1980 to 2012

    Science.gov (United States)

    Höglund-Isaksson, Lena

    2017-02-01

    Existing bottom-up emission inventories of methane from global oil and gas systems do not satisfactorily explain year-on-year variation in atmospheric methane estimated by top-down models. Using a novel bottom-up approach, this study quantifies and attributes methane and ethane emissions from global oil and gas production from 1980 to 2012. Country-specific information on associated gas flows from published sources is combined with inter-annual variations in observed flaring of associated gas from satellite images from 1994 to 2010, to arrive at country-specific annual estimates of methane and ethane emissions from flows of associated gas. Results confirm trends from top-down models and indicate considerably higher methane and ethane emissions from oil production than previously shown in bottom-up inventories for this time period.
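
    A toy version of this accounting splits each country-year's associated gas into recovered, flared and vented fractions, with flares emitting only the unburned share of the gas. All parameter values below are illustrative assumptions, not figures from the study.

    ```python
    # Toy country-year accounting of CH4 from associated gas (all values illustrative).
    def ch4_from_associated_gas(assoc_gas_bcm, frac_recovered, frac_flared,
                                ch4_content=0.8, flare_efficiency=0.98,
                                kg_ch4_per_m3=0.68):
        """Methane emissions (kt) from venting plus unburned gas passing through flares."""
        vented = assoc_gas_bcm * (1.0 - frac_recovered - frac_flared)
        flared_unburned = assoc_gas_bcm * frac_flared * (1.0 - flare_efficiency)
        ch4_bcm = (vented + flared_unburned) * ch4_content
        return ch4_bcm * 1e9 * kg_ch4_per_m3 / 1e6   # bcm -> m3 -> kg -> kt

    print(ch4_from_associated_gas(10.0, frac_recovered=0.6, frac_flared=0.3))
    ```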

  16. Bottom-up synthesis of ordered metal/oxide/metal nanodots on substrates for nanoscale resistive switching memory

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-01-01

    The bottom-up approach using self-assembled materials/processes is thought to be a promising solution for next-generation device fabrication, but it is often found to be not feasible for use in real device fabrication. Here, we report a feasible and versatile way to fabricate high-density, nanoscale memory devices by direct bottom-up filling of memory elements. An ordered array of metal/oxide/metal (copper/copper oxide/copper) nanodots was synthesized with a uniform size and thickness defined by self-organized nanotemplate mask by sequential electrochemical deposition (ECD) of each layer. The fabricated memory devices showed bipolar resistive switching behaviors confirmed by conductive atomic force microscopy. This study demonstrates that ECD with bottom-up growth has great potential to fabricate high-density nanoelectronic devices beyond the scaling limit of top-down device fabrication processes. PMID:27157385

  17. The Comparative Effect of Top-down Processing and Bottom-up Processing through TBLT on Extrovert and Introvert EFL

    Directory of Open Access Journals (Sweden)

    Pezhman Nourzad Haradasht

    2013-09-01

    Full Text Available This research seeks to examine the effect of two models of reading comprehension, namely top-down and bottom-up processing, on the reading comprehension of extrovert and introvert EFL learners. To do this, 120 learners out of a total number of 170 intermediate learners being educated at Iran Mehr English Language School were selected, all taking a PET (Preliminary English Test) first for homogenization prior to the study. They also answered the Eysenck Personality Inventory (EPI), which in turn categorized them into two subgroups within each reading model, consisting of introverts and extroverts. All in all, there were four subgroups: 30 introverts and 30 extroverts undergoing the top-down processing treatment, and 30 introverts and 30 extroverts experiencing the bottom-up processing treatment. The aforementioned PET was administered as the post-test of the study after each group was exposed to the treatment for 18 sessions in six weeks. After the instruction finished, the mean scores of all four groups on this post-test were computed and a two-way ANOVA was run to test all four hypotheses raised in this study. The results showed that while learners generally benefitted more from the bottom-up processing setting compared to the top-down processing one, the extrovert group was better off receiving top-down instruction. Furthermore, introverts outperformed extroverts in the bottom-up group; yet between the two personality subgroups in the top-down setting no difference was seen. A predictable pattern of benefitting from teaching procedures could not be drawn for introverts as in both top-down and bottom-up settings, they benefitted more than extroverts. Keywords: Reading comprehension, top-down processing, bottom-up processing, extrovert, introvert

  18. Mechanisms underlying the basal forebrain enhancement of top-down and bottom-up attention.

    Science.gov (United States)

    Avery, Michael C; Dutt, Nikil; Krichmar, Jeffrey L

    2014-03-01

    Both attentional signals from frontal cortex and neuromodulatory signals from basal forebrain (BF) have been shown to influence information processing in the primary visual cortex (V1). These two systems exert complementary effects on their targets, including increasing firing rates and decreasing interneuronal correlations. Interestingly, experimental research suggests that the cholinergic system is important for increasing V1's sensitivity to both sensory and attentional information. To see how the BF and top-down attention act together to modulate sensory input, we developed a spiking neural network model of V1 and thalamus that incorporated cholinergic neuromodulation and top-down attention. In our model, activation of the BF had a broad effect that decreased the efficacy of top-down projections and increased the reliance on bottom-up sensory input. In contrast, we demonstrated how local release of acetylcholine in the visual cortex, which was triggered through top-down glutamatergic projections, could enhance top-down attention with high spatial specificity. Our model matched experimental data showing that the BF and top-down attention decrease interneuronal correlations and increase between-trial reliability. We found that decreases in correlations were primarily between excitatory-inhibitory pairs rather than excitatory-excitatory pairs and suggest that excitatory-inhibitory decorrelation is necessary for maintaining low levels of excitatory-excitatory correlations. Increased inhibitory drive via release of acetylcholine in V1 may then act as a buffer, absorbing increases in excitatory-excitatory correlations that occur with attention and BF stimulation. These findings will lead to a better understanding of the mechanisms underlying the BF's interactions with attention signals and influences on correlations.

  19. Do top-down or bottom-up forces determine Stephanitis pyrioides abundance in urban landscapes?

    Science.gov (United States)

    Shrewsbury, Paula M; Raupp, Michael J

    2006-02-01

    This study examined the influence of habitat structural complexity on the collective effects of top-down and bottom-up forces on herbivore abundance in urban landscapes. The persistence and varying complexity of urban landscapes set them apart from ephemeral agroecosystems and natural habitats where the majority of studies have been conducted. Using surveys and manipulative experiments, we explicitly tested the effect of natural enemies (enemies hypothesis), host plant quality, and herbivore movement on the abundance of the specialist insect herbivore, Stephanitis pyrioides, in landscapes of varying structural complexity. This herbivore was extremely abundant in simple landscapes and rare in complex ones. Natural enemies were the major force influencing abundance of S. pyrioides across habitat types. Generalist predators, particularly the spider Anyphaena celer, were more abundant in complex landscapes. Predator abundance was related to greater abundance of alternative prey in those landscapes. Stephanitis pyrioides survival was lower in complex habitats when exposed to endemic natural enemy populations. Laboratory feeding trials confirmed the more abundant predators consumed S. pyrioides. Host plant quality was not a strong force influencing patterns of S. pyrioides abundance. When predators were excluded, adult S. pyrioides survival was greater on azaleas grown in complex habitats, in opposition to the observed pattern of abundance. Similarly, complexity did not affect S. pyrioides immigration and emigration rates. The complexity of urban landscapes affects the strength of top-down forces on herbivorous insect populations by influencing alternative prey and generalist predator abundance. It is possible that habitats can be manipulated to promote the suppressive effects of generalist predators.

  20. Engineered Micro-Objects as Scaffolding Elements in Cellular Building Blocks for Bottom-Up Tissue Engineering Approaches

    NARCIS (Netherlands)

    Leferink, A.M.; Schipper, D.; Arts, E.; Vrij, E.J.; Rivron, N.C.; Karperien, H.B.J.; Mittmann, K.; Blitterswijk, van C.A.; Moroni, L.; Truckenmuller, R.K.

    2014-01-01

    A material-based bottom-up approach is proposed towards an assembly of cells and engineered micro-objects at the macroscale. We show how shape, size and wettability of engineered micro-objects play an important role in the behavior of cells on these objects. This approach can, among other applicatio

  1. Evaluating the Resilience of the Bottom-up Method used to Detect and Benchmark the Smartness of University Campuses

    DEFF Research Database (Denmark)

    Giovannella, Carlo; Andone, Diana; Dascalu, Mihai

    2016-01-01

    A new method to perform a bottom-up extraction and benchmark of the perceived multilevel smartness of complex ecosystems has been recently described and applied to territories and learning ecosystems like university campuses and schools. In this paper we study the resilience of our method...

  2. Assessing the Gap Between Top-down and Bottom-up Measured Methane Emissions in Indianapolis, IN.

    Science.gov (United States)

    Prasad, K.; Lamb, B. K.; Cambaliza, M. O. L.; Shepson, P. B.; Stirm, B. H.; Salmon, O. E.; Lavoie, T. N.; Lauvaux, T.; Ferrara, T.; Howard, T.; Edburg, S. L.; Whetstone, J. R.

    2014-12-01

    Releases of methane (CH4) from the natural gas supply chain in the United States account for approximately 30% of the total US CH4 emissions. However, large questions remain regarding the accuracy of current emission inventories for methane emissions from natural gas usage. In this paper, we describe results from top-down and bottom-up measurements of methane emissions from the large isolated city of Indianapolis. The top-down results are based on aircraft mass balance and tower based inverse modeling methods, while the bottom-up results are based on direct component sampling at metering and regulating stations, surface enclosure measurements of surveyed pipeline leaks, and tracer/modeling methods for other urban sources. Mobile mapping of methane urban concentrations was also used to identify significant sources and to show an urban-wide low level enhancement of methane levels. The residual difference between top-down and bottom-up measured emissions is large and cannot be fully explained in terms of the uncertainties in top-down and bottom-up emission measurements and estimates. Thus, the residual appears to be, at least partly, attributable to a significant, widespread diffusive source. Analyses are included to estimate the size and nature of this diffusive source.
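
    The aircraft mass-balance part of the top-down estimate amounts to integrating the downwind methane enhancement across a transect and multiplying by the wind component perpendicular to the transect and the mixed-layer depth. The sketch below shows that calculation with made-up numbers; it is not data from the Indianapolis flights.

    ```python
    import numpy as np

    # Simplified aircraft mass-balance estimate for one downwind transect.
    # All numbers are illustrative, not measurements from the campaign.
    M_CH4 = 16.04e-3                  # kg/mol
    n_air = 41.6                      # mol/m^3, approximate molar density of air near the surface

    dx_ppb = np.array([5, 12, 30, 45, 28, 10, 4], float)   # CH4 enhancement above background
    segment_width_m = 2000.0          # horizontal spacing of the transect samples
    mixing_depth_m = 1200.0           # boundary-layer depth assumed well mixed
    wind_perp_ms = 4.0                # wind component perpendicular to the transect

    flux_mol_s = (dx_ppb * 1e-9) * n_air * wind_perp_ms * segment_width_m * mixing_depth_m
    emission_kg_hr = flux_mol_s.sum() * M_CH4 * 3600.0
    print(f"city-wide CH4 emission ~ {emission_kg_hr:.0f} kg/hr")
    ```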

  3. Using classic methods in a networked manner: seeing volunteered spatial information in a bottom-up fashion

    NARCIS (Netherlands)

    Carton, L.J.; Ache, P.M.

    2014-01-01

    Using new social media and ICT infrastructures for self-organization, more and more citizen networks and business sectors organize themselves voluntarily around sustainability themes. The paper traces and evaluates one emerging innovation in such bottom-up, networked form of sustainable governance

  4. Evaluating the Resilience of the Bottom-up Method used to Detect and Benchmark the Smartness of University Campuses

    NARCIS (Netherlands)

    Giovannella, Carlo; Andone, Diana; Dascalu, Mihai; Popescu, Elvira; Rehm, Matthias; Mealha, Oscar

    2017-01-01

    A new method to perform a bottom-up extraction and benchmark of the perceived multilevel smartness of complex ecosystems has been recently described and applied to territories and learning ecosystems like university campuses and schools. In this paper we study the resilience of our method by co

  5. Citizenship Policy from the Bottom-Up: The Linguistic and Semiotic Landscape of a Naturalization Field Office

    Science.gov (United States)

    Loring, Ariel

    2015-01-01

    This article follows a bottom-up approach to language policy (Ramanathan, 2005; Wodak, 2006) in an analysis of citizenship in policy and practice. It compares representations of citizenship in and around a regional branch of the United States Citizenship and Immigration Services (USCIS), with a focus on citizenship swearing-in ceremonies for…

  6. A Facile Bottom-Up Approach to Construct Hybrid Flexible Cathode Scaffold for High-Performance Lithium-Sulfur Batteries.

    Science.gov (United States)

    Ghosh, Arnab; Manjunatha, Revanasiddappa; Kumar, Rajat; Mitra, Sagar

    2016-12-14

    Lithium-sulfur batteries mostly suffer from the low utilization of sulfur, poor cycle life, and low rate performance. The prime factors that affect the performance are the enormous volume change of the electrode, soluble intermediate product formation, and the poor electronic and ionic conductivity of S and of the end discharge products (i.e., Li2S2 and Li2S). An attractive way to mitigate these challenges lies in the fabrication of a sulfur nanocomposite electrode consisting of different nanoparticles with distinct properties of lithium storage capability, mechanical reinforcement, and ionic as well as electronic conductivity, leading to a mechanically robust and mixed-conductive (ionic and electronic) sulfur electrode. Herein, we report a novel bottom-up approach to synthesize a unique freestanding, flexible cathode scaffold made of porous reduced graphene oxide, nanosized sulfur, and Mn3O4 nanoparticles, all three-dimensionally interconnected by a hybrid polyaniline/sodium alginate (PANI-SA) matrix so that each component serves its individual purpose. A capacity of 1098 mAh g(-1) is achieved against lithium after 200 cycles at a current rate of 2 A g(-1), retaining 97.6% of the initial capacity at the same current rate, suggesting the extreme stability and cycling performance of such an electrode. Interestingly, at the higher current density of 5 A g(-1), the composite electrode exhibited an initial capacity of 1015 mAh g(-1) and retained 71% of the original capacity after 500 cycles. The in situ Raman study confirms the polysulfide absorption capability of Mn3O4. This work provides a new strategy to design a mechanically robust, mixed-conductive nanocomposite electrode for high-performance lithium-sulfur batteries, a strategy that can be used to develop flexible large power storage devices.

  7. Motivation and drives in bottom-up developments in natural hazards management: multiple-use of adaptation strategies in Austria

    Science.gov (United States)

    Thaler, Thomas; Fuchs, Sven

    2015-04-01

    Losses from extreme hydrological events, such as those recently experienced in Europe, have focused the attention of policymakers as well as researchers on vulnerability to natural hazards. In parallel, the context of changing flood risks under climate and societal change is driving transformation in the role of the state in responsibility sharing and individual responsibilities for risk management and precaution. The new policy agenda enhances the responsibilities of local authorities and private citizens in hazard management and reduces the role of central governments. Within this agenda, the objective is to place added responsibility on local organisations and citizens to determine locally-based strategies for risk reduction. A major challenge of modelling adaptation is to represent the complexity of coupled human-environmental systems and particularly the feedback loops between environmental dynamics and human decision-making processes on different scales. This paper focuses on bottom-up initiatives in flood risk management, which are, by definition, different from the mainstream. These initiatives are clearly influenced (positively or negatively) by a number of factors, where the combination of these interdependences can create specific conditions that alter the opportunity for effective governance arrangements in a local scheme approach. In total, this study identified six general drivers which encourage the implementation of flood storages, such as a direct relation to recent major flood frequency and history, the initiative of individual stakeholders (promoters), political pressures from outside (e.g. business companies, private households) and a strong solidarity attitude of municipalities and the stakeholders involved. Although a partnership approach may be seen as an 'optimal' solution for flood risk management, in practice there are many limitations and barriers in establishing these collaborations and making them effective (especially in the long term) with the consequences

  8. Reconciling Top-Down and Bottom-Up Estimates of Oil and Gas Methane Emissions in the Barnett Shale

    Science.gov (United States)

    Hamburg, S.

    2015-12-01

    Top-down approaches that use aircraft, tower, or satellite-based measurements of well-mixed air to quantify regional methane emissions have typically estimated higher emissions from the natural gas supply chain when compared to bottom-up inventories. A coordinated research campaign in October 2013 used simultaneous top-down and bottom-up approaches to quantify total and fossil methane emissions in the Barnett Shale region of Texas. Research teams have published individual results including aircraft mass-balance estimates of regional emissions and a bottom-up, 25-county region spatially-resolved inventory. This work synthesizes data from the campaign to directly compare top-down and bottom-up estimates. A new analytical approach uses statistical estimators to integrate facility emission rate distributions from unbiased and targeted high emission site datasets, which more rigorously incorporates the fat-tail of skewed distributions to estimate regional emissions of well pads, compressor stations, and processing plants. The updated spatially-resolved inventory was used to estimate total and fossil methane emissions from spatial domains that match seven individual aircraft mass balance flights. Source apportionment of top-down emissions between fossil and biogenic methane was corroborated with two independent analyses of methane and ethane ratios. Reconciling top-down and bottom-up estimates of fossil methane emissions leads to more accurate assessment of natural gas supply chain emission rates and the relative contribution of high emission sites. These results increase our confidence in our understanding of the climate impacts of natural gas relative to more carbon-intensive fossil fuels and the potential effectiveness of mitigation strategies.
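
    The statistical-estimator idea, in which a targeted sample of high-emitting sites informs the fat tail while an unbiased sample anchors the bulk of the distribution, can be caricatured as a simple two-component mixture. The fractions, sample sizes and facility count below are assumptions for illustration only, not campaign results.

    ```python
    import numpy as np

    # Toy two-component estimator; all emission values and fractions are assumed.
    rng = np.random.default_rng(1)
    unbiased = rng.lognormal(mean=0.0, sigma=1.2, size=300)    # kg/hr, randomly sampled sites
    targeted = rng.lognormal(mean=3.5, sigma=0.8, size=40)     # kg/hr, flagged high-emission sites

    p_high = 0.02            # assumed population fraction of high-emitting facilities
    n_facilities = 30000     # assumed number of facilities in the region

    mean_site = (1 - p_high) * unbiased.mean() + p_high * targeted.mean()
    regional_total = mean_site * n_facilities
    print(f"estimated regional emissions ~ {regional_total / 1000:.1f} t/hr")
    ```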

  9. A bottom-up approach to urban metabolism: the perspective of BRIDGE

    Science.gov (United States)

    Chrysoulakis, N.; Borrego, C.; San Josè, R.; Grimmond, S. B.; Jones, M. B.; Magliulo, V.; Klostermann, J.; Santamouris, M.

    2011-12-01

    Urban metabolism considers a city as a system and usually distinguishes between energy and material flows as its components. "Metabolic" studies are usually top-down approaches that assess the inputs and outputs of food, water, energy, and pollutants from a city, or that compare the changing metabolic process of several cities. In contrast, bottom-up approaches are based on quantitative estimates of urban metabolism components at local to regional scales. Such approaches consider the urban metabolism as the 3D exchange and transformation of energy and matter between a city and its environment. The city is considered as a system and the physical flows between this system and its environment are quantitatively estimated. The transformation of landscapes from primarily agricultural and forest uses to urbanized landscapes can greatly modify energy and material exchanges and it is, therefore, an important aspect of an urban area. Here we focus on the exchanges and transformation of energy, water, carbon and pollutants. Recent advances in bio-physical sciences have led to new methods and models to estimate local scale energy, water, carbon and pollutant fluxes. However, there is often poor communication of new knowledge and its implications to end-users, such as planners, architects and engineers. The FP7 Project BRIDGE (SustainaBle uRban plannIng Decision support accountinG for urban mEtabolism) aims at bridging this gap and at illustrating the advantages of considering environmental issues in urban planning. BRIDGE does not perform a complete life cycle analysis or calculate whole system urban metabolism, but rather focuses on specific metabolism components (energy, water, carbon and pollutants). Its main goal is the development of a Decision Support System (DSS) with the potential to select planning actions which better fit the goal of changing the metabolism of urban systems towards sustainability. BRIDGE evaluates how planning alternatives can modify the physical

  10. Using the Hestia bottom-up FFCO2 emissions estimation to identify drivers and hotspots in urban areas

    Science.gov (United States)

    Rao, P.; Patarasuk, R.; Gurney, K. R.; o'Keefe, D.; Song, Y.; Huang, J.; Buchert, M.; Lin, J. C.; Mendoza, D. L.; Ehleringer, J. R.; Eldering, A.; Miller, C. E.; Duren, R. M.

    2015-12-01

    Urban areas occupy 3% of the earth's land surface and generate 75% of the fossil fuel carbon dioxide (FFCO2) emissions. We report on the application of the Hestia Project to the Salt Lake County (SLC) and Los Angeles (LA) domains. Hestia quantifies FFCO2 in fine space-time detail across urban domains using a scientific "bottom-up" approach. We explore the utility of Hestia to inform both urbanization science and greenhouse gas (GHG) mitigation policy. We focus on the residential sector in SLC and the onroad sector in LA as these sectors are large emissions contributors in each locale, and local governments have some authority and policy levers to mitigate these emissions. Multiple regression using sociodemographic data across SLC census block-groups shows that per capita income exhibits a positive relationship with FFCO2 emissions while household size exhibits a negative relationship, after controlling for total population. Housing units per area (i.e., compact development) has little effect on FFCO2 emissions. Rising income in the high-income group has twice as much impact on the emissions as the low-income group. Household size for the low-income group has four times the impact on the emissions as the high-income group. In LA, onroad FFCO2 emissions account for 49% of total emissions, of which 41% is from arterials (intermediate road class). Arterials also have the largest carbon emissions intensity - FFCO2/vehicle distance travelled (VKT) - possibly from high traffic congestion and fleet composition. Non-interstate hotspot emissions (> 419 tC ln-km-1) are equally dominated by particular arterials and collectors (lowest road class) though collectors have a higher VKT. These hotspots occur largely in LA (67%) and Orange (18%) counties and provide targeted information for onroad emissions reduction. Using Hestia to identify FFCO2 emissions drivers and hotspots can aid state and local policy makers in planning the most effective GHG reductions.
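
    The kind of multiple regression described for the SLC residential sector can be sketched as follows (synthetic data; the variable names, units, and coefficients are assumptions for illustration, not the Hestia results): block-group FFCO2 is regressed on per capita income and household size while controlling for total population.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500  # hypothetical number of census block-groups

income = rng.normal(60, 20, n)         # per capita income (thousand $)
hh_size = rng.normal(2.8, 0.6, n)      # persons per household
population = rng.normal(1500, 300, n)  # total block-group population

# Synthetic residential FFCO2 (tC/yr): rises with income, falls with household size
ffco2 = 5.0 * income - 300.0 * hh_size + 2.0 * population + rng.normal(0, 500, n)

# Ordinary least squares with an intercept ("controlling for total population")
X = np.column_stack([np.ones(n), income, hh_size, population])
coef, *_ = np.linalg.lstsq(X, ffco2, rcond=None)
print(dict(zip(["intercept", "income", "household_size", "population"], coef.round(2))))
```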

  11. A bottom-up approach to derive the closure relation for modelling hydrological fluxes at the watershed scale

    Science.gov (United States)

    Vannametee, Ekkamol; Karssenberg, Derek; Hendriks, Martin; Bierkens, Marc

    2014-05-01

    , potentially avoiding calibration. The Hortonian runoff closure relation is evaluated using field discharge observations from 16 km2 catchments in the French Alps. The catchments are disaggregated to 60 REWs. Scaling parameters for each REW are derived from the parameter library. Discharge is simulated from individual REWs, routed over the stream network, and summed at the catchment outlets to obtain the catchment-scale responses. The results show that our closure relation is capable of reproducing the observed hydrograph and discharge volume without calibration, with a Nash-Sutcliffe index up to 0.8 and errors in discharge volume of about 10%. Our closure relation outperforms a simple lumped rainfall-runoff model that does not have scaling components. However, a brute-force calibration of a local-scale REW observable (saturated hydraulic conductivity, Ks), using a constant pre-factor for all REWs, significantly improves the prediction. The calibrated Ks values are comparable to the local-scale observations in the study catchment, implying that calibration may be unnecessary if the local-scale observable REW properties can be correctly estimated. The bottom-up approach for deriving the closure relation, including the parameter estimation scheme, is robust in this study and shows promising applicability for REW-based models.
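
    The catchment-scale evaluation described above relies on the Nash-Sutcliffe efficiency and on summing routed REW discharges at the outlet; a minimal sketch of both pieces (hypothetical discharge values, not the study's data) is:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum((observed - observed.mean()) ** 2)

# Hypothetical routed discharge contributions from three REWs at three time steps (m3/s)
rew_discharge = np.array([
    [0.4, 0.6, 0.9],
    [0.2, 0.3, 0.5],
    [0.1, 0.2, 0.2],
])
simulated_outlet = rew_discharge.sum(axis=0)   # catchment-scale response at the outlet
observed_outlet = np.array([0.8, 1.0, 1.7])    # hypothetical observations
print(f"NSE = {nash_sutcliffe(observed_outlet, simulated_outlet):.2f}")
```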

  12. Chitosan microspheres with an extracellular matrix-mimicking nanofibrous structure as cell-carrier building blocks for bottom-up cartilage tissue engineering.

    Science.gov (United States)

    Zhou, Yong; Gao, Huai-Ling; Shen, Li-Li; Pan, Zhao; Mao, Li-Bo; Wu, Tao; He, Jia-Cai; Zou, Duo-Hong; Zhang, Zhi-Yuan; Yu, Shu-Hong

    2016-01-07

    Scaffolds for tissue engineering (TE) which closely mimic the physicochemical properties of the natural extracellular matrix (ECM) have been proven to advantageously favor cell attachment, proliferation, migration and new tissue formation. Recently, as a valuable alternative, a bottom-up TE approach utilizing cell-loaded micrometer-scale modular components as building blocks to reconstruct a new tissue in vitro or in vivo has been shown to offer a number of desirable advantages over the traditional bulk-scaffold-based top-down TE approach. Nevertheless, micro-components with an ECM-mimicking nanofibrous structure are still very scarce and highly desirable. Chitosan (CS), an accessible natural polymer, has demonstrated appealing intrinsic properties and promising application potential for TE, especially for cartilage tissue regeneration. Against this background, we report here the fabrication of chitosan microspheres with an ECM-mimicking nanofibrous structure for the first time, based on a physical gelation process. By combining this physical fabrication procedure with microfluidic technology, uniform CS microspheres (CMS) with controlled nanofibrous microstructure and tunable sizes can be readily obtained. Importantly, no potentially toxic or denaturing chemical crosslinking agent was introduced into the products. Notably, in vitro chondrocyte culture tests revealed that enhanced cell attachment and proliferation were realized, and a macroscopic 3D geometrically shaped cartilage-like composite can be easily constructed with the nanofibrous CMS (NCMS) and chondrocytes, demonstrating the significant application potential of NCMS as bottom-up cell-carrier components for cartilage tissue engineering.

  13. Benchmarking energy scenarios for China: perspectives from top-down, economic and bottom-up, technical modelling

    DEFF Research Database (Denmark)

    This study uses a soft-linking methodology to harmonise two complex global top-down and bottom-up models with a regional China focus. The baseline follows the GDP and demographic trends of the Shared Socio-economic Pathways (SSP2) scenario, down-scaled for China, while the carbon tax scenario......-specific modelling results further. These new sub-regional China features can now be used for a more detailed analysis of China's regional developments in a global context....

  14. A comprehensive estimate of recent carbon sinks in China using both top-down and bottom-up approaches

    Science.gov (United States)

    Jiang, Fei; Chen, Jing; Zhou, Linxi; Ju, Weimin; Zhang, Huifang; Machida, Toshinobu; Ciais, Philippe; Peters, Wouter; Wang, Hengmao; Chen, Baozhang; Liu, Linxin; Zhang, Chunhua; Matsueda, Hidekazu; Sawa, Yousuke

    2016-04-01

    Atmospheric inversions use measurements of atmospheric CO2 gradients to constrain regional surface fluxes. Current inversions indicate a net terrestrial CO2 sink in China between 0.16 and 0.35 PgC/yr. The uncertainty of these estimates is as large as the mean because the atmospheric network historically contained only one high-altitude station in China. Here, we revisit the calculation of the terrestrial CO2 flux in China, excluding emissions from fossil fuel burning and cement production, by using two inversions with three new CO2 monitoring stations in China as well as aircraft observations over Asia. We estimate a net terrestrial CO2 uptake of 0.39-0.51 PgC/yr with a mean of 0.45 PgC/yr in 2006-2009. After considering the lateral transport of carbon in air and water and international trade, the annual mean carbon sink is adjusted to 0.35 PgC/yr. To evaluate this top-down estimate, we constructed an independent bottom-up estimate based on ecosystem data, which gives a net land sink of 0.33 PgC/yr. This demonstrates closure between the top-down and bottom-up estimates. Both top-down and bottom-up estimates give a higher carbon sink than previous estimates made for the 1980s and 1990s, suggesting a trend towards increased uptake by land ecosystems in China.
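
    The adjustment from the inversion-based sink to the final estimate is simple accounting; a sketch with an assumed split of the roughly 0.10 PgC/yr correction (the individual terms below are illustrative assumptions, only the 0.45 and ~0.35 PgC/yr figures come from the record):

```python
# Top-down net terrestrial uptake from the two inversions, 2006-2009 mean (PgC/yr)
inversion_land_sink = 0.45

# Assumed split of the corrections for lateral transport and international trade
lateral_river_and_air_export = 0.06  # carbon carried out of the domain by water and air (assumed)
net_trade_export = 0.04              # carbon embodied in traded food and wood products (assumed)

adjusted_sink = inversion_land_sink - lateral_river_and_air_export - net_trade_export
print(f"Adjusted annual mean carbon sink: {adjusted_sink:.2f} PgC/yr")  # ~0.35
```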

  15. Bottom-up effects of nutrient availability on flower production, pollinator visitation, and seed output in a high-Andean shrub.

    Science.gov (United States)

    Muñoz, Alejandro A; Celedon-Neghme, Constanza; Cavieres, Lohengrin A; Arroyo, Mary T K

    2005-03-01

    Soil nutrient availability directly enhances vegetative growth, flowering, and fruiting in alpine ecosystems. However, the impacts of nutrient addition on pollinator visitation, which could affect seed output indirectly, are unknown. In a nutrient addition experiment, we tested the hypothesis that seed output in the insect-pollinated, self-incompatible shrub, Chuquiraga oppositifolia (Asteraceae) of the Andes of central Chile, is enhanced by soil nitrogen (N) availability. We aimed to monitor total shrub floral display, size of flower heads (capitula), pollinator visitation patterns, and seed output during three growing seasons on control and N addition shrubs. N addition did not augment floral display, size of capitula, pollinator visitation, or seed output during the first growing season. Seed mass and viability were 25-40% lower in fertilised shrubs. During the second growing season only 33% of the N addition shrubs flowered compared to 71% of controls, and a significant (50%) enhancement in vegetative growth occurred in fertilised shrubs. During the third growing season, floral display in N addition shrubs was more than double that of controls, received more than twice the number of insect pollinator visits, and seed output was three- to four-fold higher compared to controls. A significant (50%) enhancement in vegetative growth again occurred in N addition shrubs. Results of this study strongly suggest that soil N availability produces strong positive bottom-up effects on the reproductive output of the alpine shrub C. oppositifolia. Despite taking considerably longer to be manifest in comparison to the previously reported top-down indirect negative effects of lizard predators in the same study system, our results suggest that both bottom-up and top-down forces are important in controlling the reproductive output of an alpine shrub.

  16. Bottom-up processing of thermoelectric nanocomposites from colloidal nanocrystal building blocks: the case of Ag2Te-PbTe

    Energy Technology Data Exchange (ETDEWEB)

    Cadavid, Doris [Catalonia Institute for Energy Research, IREC (Spain); Ibanez, Maria [Universitat de Barcelona, Departament d' Electronica (Spain); Gorsse, Stephane [Universite de Bordeaux, ICMCB, CNRS (France); Lopez, Antonio M. [Universitat Politecnica de Catalunya, Departament d' Enginyeria Electronica (Spain); Cirera, Albert [Universitat de Barcelona, Departament d' Electronica (Spain); Morante, Joan Ramon; Cabot, Andreu, E-mail: acabot@irec.cat [Catalonia Institute for Energy Research, IREC (Spain)

    2012-12-15

    Nanocomposites are highly promising materials to enhance the efficiency of current thermoelectric devices. A straightforward and at the same time highly versatile and controllable approach to produce nanocomposites is the assembly of solution-processed nanocrystal building blocks. The convenience of this bottom-up approach to produce nanocomposites with homogeneous phase distributions and adjustable composition is demonstrated here by blending Ag2Te and PbTe colloidal nanocrystals to form Ag2Te-PbTe bulk nanocomposites. The thermoelectric properties of these nanocomposites are analyzed in the temperature range from 300 to 700 K. The evolution of their electrical conductivity and Seebeck coefficient is discussed in terms of the blend composition and the characteristics of the constituent materials.

  17. Parallel- and serial-contact electrochemical metallization of monolayer nanopatterns: A versatile synthetic tool en route to bottom-up assembly of electric nanocircuits

    Directory of Open Access Journals (Sweden)

    Jonathan Berson

    2012-02-01

    Full Text Available Contact electrochemical transfer of silver from a metal-film stamp (parallel process or a metal-coated scanning probe (serial process is demonstrated to allow site-selective metallization of monolayer template patterns of any desired shape and size created by constructive nanolithography. The precise nanoscale control of metal delivery to predefined surface sites, achieved as a result of the selective affinity of the monolayer template for electrochemically generated metal ions, provides a versatile synthetic tool en route to the bottom-up assembly of electric nanocircuits. These findings offer direct experimental support to the view that, in electrochemical metal deposition, charge is carried across the electrode–solution interface by ion migration to the electrode rather than by electron transfer to hydrated ions in solution.

  18. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    Energy Technology Data Exchange (ETDEWEB)

    Dunn, Nicholas J. H.; Noid, W. G., E-mail: wnoid@chem.psu.edu [Department of Chemistry, The Pennsylvania State University, University Park, Pennsylvania 16802 (United States)

    2015-12-28

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed “pressure-matching” variational principle to determine a volume-dependent contribution to the potential, U_V(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing U_V, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that U_V accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the “simplicity” of the model.

  19. Bottom-up coarse-grained models that accurately describe the structure, pressure, and compressibility of molecular liquids

    Science.gov (United States)

    Dunn, Nicholas J. H.; Noid, W. G.

    2015-12-01

    The present work investigates the capability of bottom-up coarse-graining (CG) methods for accurately modeling both structural and thermodynamic properties of all-atom (AA) models for molecular liquids. In particular, we consider 1, 2, and 3-site CG models for heptane, as well as 1 and 3-site CG models for toluene. For each model, we employ the multiscale coarse-graining method to determine interaction potentials that optimally approximate the configuration dependence of the many-body potential of mean force (PMF). We employ a previously developed "pressure-matching" variational principle to determine a volume-dependent contribution to the potential, U_V(V), that approximates the volume-dependence of the PMF. We demonstrate that the resulting CG models describe AA density fluctuations with qualitative, but not quantitative, accuracy. Accordingly, we develop a self-consistent approach for further optimizing U_V, such that the CG models accurately reproduce the equilibrium density, compressibility, and average pressure of the AA models, although the CG models still significantly underestimate the atomic pressure fluctuations. Additionally, by comparing this array of models that accurately describe the structure and thermodynamic pressure of heptane and toluene at a range of different resolutions, we investigate the impact of bottom-up coarse-graining upon thermodynamic properties. In particular, we demonstrate that U_V accounts for the reduced cohesion in the CG models. Finally, we observe that bottom-up coarse-graining introduces subtle correlations between the resolution, the cohesive energy density, and the "simplicity" of the model.
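
    Schematically, the volume-dependent term described in these two records augments the configuration-dependent CG potential; a hedged sketch of the general form (not the authors' exact equations) is:

```latex
% Coarse-grained potential with a volume-dependent correction U_V(V):
% the configurational part approximates the many-body PMF, while U_V is tuned
% (written here as a pressure-matching condition) so the CG ensemble reproduces
% the all-atom average pressure and density.
U_{\mathrm{CG}}(\mathbf{R}, V) = U_{\mathrm{conf}}(\mathbf{R}) + U_V(V),
\qquad
\langle P_{\mathrm{CG}} \rangle \approx \langle P_{\mathrm{AA}} \rangle .
```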

  20. A regression approach for estimation of anthropogenic heat flux based on a bottom-up air pollutant emission database

    Science.gov (United States)

    Lee, Sang-Hyun; McKeen, Stuart A.; Sailor, David J.

    2014-10-01

    A statistical regression method is presented for estimating hourly anthropogenic heat flux (AHF) using an anthropogenic pollutant emission inventory for use in mesoscale meteorological and air-quality modeling. Based on bottom-up AHF estimated from detailed energy consumption data and anthropogenic pollutant emissions of carbon monoxide (CO) and nitrogen oxides (NOx) in the US National Emission Inventory year 2005 (NEI-2005), a robust regression relation between the AHF and the pollutant emissions is obtained for Houston. This relation is a combination of two power functions (Y = aX^b) relating CO and NOx emissions to AHF, giving a coefficient of determination (R2) of 0.72. The AHF for Houston derived from the regression relation has high temporal (R = 0.91) and spatial (R = 0.83) correlations with the bottom-up AHF. Hourly AHF for the whole US in summer is estimated by applying the regression relation to the NEI-2005 summer pollutant emissions with a high spatial resolution of 4-km. The summer daily mean AHF ranges from 10 to 40 W m-2 on a 4 × 4 km2 grid scale with maximum heat fluxes of 50-140 W m-2 for major US cities. The AHFs derived from the regression relations between the bottom-up AHF and either CO or NOx emissions show a small difference of less than 5% (4.7 W m-2) in city-scale daily mean AHF, and similar R2 statistics, compared to results from their combination. Thus, emissions of either species can be used to estimate AHF in US cities. An hourly AHF inventory at 4 × 4 km2 resolution over the entire US based on the combined regression is derived and made publicly available for use in mesoscale numerical modeling.
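
    The power-function regression described above is typically fitted as a straight line in log-log space; a minimal sketch with synthetic data (the emission values, coefficients, and R^2 below are illustrative, not the NEI-2005 results):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic gridded CO emissions (arbitrary units) and bottom-up AHF (W m-2)
co_emissions = rng.lognormal(mean=2.0, sigma=0.8, size=300)
ahf = 3.0 * co_emissions ** 0.7 * rng.lognormal(0.0, 0.2, size=300)

# Fit Y = a * X^b by linear regression in log-log space: ln Y = ln a + b * ln X
b, ln_a = np.polyfit(np.log(co_emissions), np.log(ahf), deg=1)
a = np.exp(ln_a)

# Coefficient of determination of the log-log fit
residuals = np.log(ahf) - (ln_a + b * np.log(co_emissions))
r2 = 1.0 - residuals.var() / np.log(ahf).var()
print(f"AHF ~= {a:.2f} * CO^{b:.2f},  R^2 = {r2:.2f}")
```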

  1. Bottom-Up Nano-heteroepitaxy of Wafer-Scale Semipolar GaN on (001) Si

    KAUST Repository

    Hus, Jui Wei

    2015-07-15

    Semipolar {101¯1} InGaN quantum wells are grown on (001) Si substrates with an Al-free buffer and wafer-scale uniformity. The novel structure is achieved by a bottom-up nano-heteroepitaxy employing self-organized ZnO nanorods as the strain-relieving layer. This ZnO nanostructure avoids the problems encountered with the conventional AlN-based buffer, which grows slowly and contaminates the growth chamber. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. FROM A COMPARISON OF "TOP-DOWN" AND "BOTTOM-UP" APPROACHES TO THE APPLICATION OF THE "INTERACTIVE" APPROACH

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    This paper introduces three models of reading. Then it analyzes the data gathered from an experiment on the comparison of the "top-down" and the "bottom-up" approaches and accordingly draws the conclusion that the former approach is helpful in improving students’ reading comprehension while the latter is useful in developing their writing skills as well as their knowledge of vocabulary and sentence structure. Finally this paper presents a procedure of the application of the "interactive approach", which proves to be productive in teaching college English intensive reading.

  3. [Diversity in thalamic relay neurons: evidence for "bottom-up" and "top-down" information flow in thalamocortical pathways].

    Science.gov (United States)

    Clascá, Francisco; Rubio-Garrido, Pablo; Galazo, María J; Porrero, César

    2009-01-01

    Thalamocortical (TC) pathways are still mainly understood as the gateway for ascending sensory-motor information into the cortex. However, it is now clear that a great many TC cells are involved in interactions between cortical areas via the thalamus. We review recent data, including our own, which demonstrate the generalized presence in rodent thalamus of two major TC cell types characterized, among other features, by their axon development, arborization and laminar targeting in the cortex. Such duality may allow inputs from thalamus to access cortical circuits via "bottom-up"-wired axon arbors or via "top-down"-wired axon arbors.

  4. Una implementación computacional de un modelo de atención visual Bottom-up aplicado a escenas naturales/A Computational Implementation of a Bottom-up Visual Attention Model Applied to Natural Scenes

    Directory of Open Access Journals (Sweden)

    Juan F. Ramírez Villegas

    2011-12-01

    Full Text Available The bottom-up visual attention model proposed by Itti et al., 2000 [1], has been a popular model in that it exhibits certain neurobiological evidence of vision in primates. This work complements the computational model of this phenomenon with the realistic dynamics of a neural network. The approach is based on the existence of topographical maps that represent the saliency of the objects in the visual field and that are combined into a general representation (saliency map); this representation is the input to a dynamic neural network whose local and global collaborative and competitive interactions converge on the main particularities (objects) of the scene.

  5. Chitosan microspheres with an extracellular matrix-mimicking nanofibrous structure as cell-carrier building blocks for bottom-up cartilage tissue engineering

    Science.gov (United States)

    Zhou, Yong; Gao, Huai-Ling; Shen, Li-Li; Pan, Zhao; Mao, Li-Bo; Wu, Tao; He, Jia-Cai; Zou, Duo-Hong; Zhang, Zhi-Yuan; Yu, Shu-Hong

    2015-12-01

    Scaffolds for tissue engineering (TE) which closely mimic the physicochemical properties of the natural extracellular matrix (ECM) have been proven to advantageously favor cell attachment, proliferation, migration and new tissue formation. Recently, as a valuable alternative, a bottom-up TE approach utilizing cell-loaded micrometer-scale modular components as building blocks to reconstruct a new tissue in vitro or in vivo has been shown to offer a number of desirable advantages over the traditional bulk-scaffold-based top-down TE approach. Nevertheless, micro-components with an ECM-mimicking nanofibrous structure are still very scarce and highly desirable. Chitosan (CS), an accessible natural polymer, has demonstrated appealing intrinsic properties and promising application potential for TE, especially for cartilage tissue regeneration. Against this background, we report here the fabrication of chitosan microspheres with an ECM-mimicking nanofibrous structure for the first time, based on a physical gelation process. By combining this physical fabrication procedure with microfluidic technology, uniform CS microspheres (CMS) with controlled nanofibrous microstructure and tunable sizes can be readily obtained. Importantly, no potentially toxic or denaturing chemical crosslinking agent was introduced into the products. Notably, in vitro chondrocyte culture tests revealed that enhanced cell attachment and proliferation were realized, and a macroscopic 3D geometrically shaped cartilage-like composite can be easily constructed with the nanofibrous CMS (NCMS) and chondrocytes, demonstrating the significant application potential of NCMS as bottom-up cell-carrier components for cartilage tissue engineering.

  6. Source attribution of methane emissions from global oil and gas production: results of bottom-up simulations over three decades

    Science.gov (United States)

    Höglund-Isaksson, Lena

    2016-04-01

    Existing bottom-up emission inventories of historical methane and ethane emissions from global oil and gas systems do not adequately explain year-on-year variations estimated by top-down models from atmospheric measurements. This paper develops a bottom-up methodology which allows for country- and year-specific source attribution of methane and ethane emissions from global oil and natural gas production for the period 1980 to 2012. The analysis rests on country-specific simulations of associated gas flows which are converted into methane and ethane emissions. The associated gas flows are constructed from country-specific information on oil and gas production and associated gas generation and recovery, and coupled with generic assumptions to bridge regional information gaps on the fractions of unrecovered associated gas that are vented instead of flared. Summing up emissions from associated gas flows with global estimates of emissions from unintended leakage and natural gas transmission and distribution, the resulting global emissions of methane and ethane from oil and gas systems are reasonably consistent with corresponding estimates from top-down models. The analysis also reveals that the fall of the Soviet Union in 1990 had a significant impact on methane and ethane emissions from global oil and gas systems.
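
    The associated-gas accounting described above can be sketched with a few lines of arithmetic (every number below is a hypothetical placeholder, not a value from the paper); the sketch counts only vented gas as methane emissions and ignores the small unburned fraction that survives flaring:

```python
# Hypothetical annual figures for one country
oil_production_bbl = 100e6           # crude oil production (barrels/yr)
gas_oil_ratio_m3_per_bbl = 50.0      # associated gas generated per barrel (m3/bbl)
recovered_fraction = 0.80            # share of associated gas recovered or utilized
flared_share_of_unrecovered = 0.60   # share of unrecovered gas that is flared (rest vented)
ch4_volume_fraction = 0.80           # methane content of associated gas
ch4_density_kg_m3 = 0.68             # approximate methane density at standard conditions

associated_gas_m3 = oil_production_bbl * gas_oil_ratio_m3_per_bbl
unrecovered_m3 = associated_gas_m3 * (1.0 - recovered_fraction)
vented_m3 = unrecovered_m3 * (1.0 - flared_share_of_unrecovered)

ch4_emissions_kt = vented_m3 * ch4_volume_fraction * ch4_density_kg_m3 / 1e6
print(f"Vented CH4 from associated gas: {ch4_emissions_kt:.0f} kt/yr")
```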

  7. The value of using top-down and bottom-up approaches for building trust and transparency in biobanking.

    Science.gov (United States)

    Meslin, Eric M

    2010-01-01

    With the domestic and international proliferation of biobanks and their associated connections to health information databases, scholarly attention has been turning from the ethical issues arising from the construction of biobanks to the ethical issues that emerge in their operation and management. Calls for greater transparency in governance structures, coupled with stern reminders of the value of maintaining public trust, are seen as critical components in the success of these resources. Two different approaches have been adopted for addressing these types of ethical issues: the first is a 'top-down' approach which focuses on developing policy, procedures, regulations and guidelines to aid decision-makers. The second is a 'bottom-up' approach, which begins with those who are most affected by the issues and attempts to inductively develop consensus recommendations and policy. While both approaches have merit, I argue that more work needs to be done on 'bottom-up' strategies if trust and transparency are to be more than mere slogans. Using two case examples from Indiana, the paper summarizes data from a set of surveys we recently conducted on issues arising from biobanks, providing some insight into questions of trust and transparency.

  8. Community context mediates the top-down vs. bottom-up effects of grazers on rocky shores.

    Science.gov (United States)

    Bracken, Matthew E S; Dolecal, Renee E; Long, Jeremy D

    2014-06-01

    Interactions between grazers and autotrophs are complex, including both top-down consumptive and bottom-up facilitative effects of grazers. Thus, in addition to consuming autotrophs, herbivores can also enhance autotroph biomass by recycling limiting nutrients, thereby increasing nutrient availability. Here, we evaluated these consumptive and facilitative interactions between snails (Littorina littorea) and seaweeds (Fucus vesiculosus and Ulva lactuca) on a rocky shore. We partitioned herbivores' total effects on seaweeds into their consumptive and facilitative effects and evaluated how community context (the presence of another seaweed species) modified the effects of Littorina on a focal seaweed species. Ulva, the more palatable species, enhanced the facilitative effects of Littorina on Fucus. Ulva did not modify the consumptive effect of Littorina on Fucus. Taken together, the consumptive and facilitative effects of snails on Fucus in the presence of Ulva balanced each other, resulting in no net effect of Littorina on Fucus. In contrast, the only effect of Fucus on Ulva was to enhance consumptive effects of Littorina on Ulva. Our results highlight the necessity of considering both consumptive and facilitative effects of herbivores on multiple autotroph species in order to gain a mechanistic understanding of grazers' top-down and bottom-up roles in structuring communities.

  9. The drastic outcomes from voting alliances in three-party bottom-up democratic voting (1990 $\\rightarrow$ 2013)

    CERN Document Server

    Galam, Serge

    2013-01-01

    The drastic effect of local alliances in three-party competition is investigated in democratic hierarchical bottom-up voting. The results are obtained analytically using a model which extends a sociophysics framework introduced in 1986 \cite{psy} and 1990 \cite{lebo} to study two-party systems and the spontaneous formation of democratic dictatorship. It is worth stressing that the 1990 paper was published in the Journal of Statistical Physics, the first paper of its kind in the journal. It was shown how a minority in power can preserve its leadership using bottom-up democratic elections. However, such a bias holds only down to some critical value of minimum support. The results were used later to explain the sudden collapse of European communist parties in the nineties. The extension to three-party competition reveals the mechanisms by which a very small minority party can get a substantial representation at higher levels of the hierarchy when the other two competing parties are big. Additional surprising results...

  10. Organizing and financing interstellar space projects - A bottom-up approach

    CERN Document Server

    Ceyssens, Frederik; Wouters, Kristof; Ceyssens, Pieter-Jan; Wen, Lianggong

    2011-01-01

    The development and deployment of interstellar missions will without doubt require orders of magnitude more resources than needed for current or past megaprojects (Apollo, Iter, LHC,...). The question is how enough resources for such gigaprojects can be found. In this contribution, different scenarios are explored assuming limited, moderate economic growth throughout the next centuries, i.e. without human population and productivity continuing to grow exponentially, and without extreme events such as economic collapse or singularity. In such a world, which is not unlike the current situation, gigascale space projects face a combination of inhibiting factors: the enormous cost threshold, the need for risky and costly development of often quite application-specific technology, the relatively small benefit relative to the costs for the sponsors, the time span of at least a few generations and the absence of a sense of urgency. It will be argued that the best chance of getting an interstellar project started ...

  11. Nanoparticle bioconjugates as "bottom-up" assemblies of artificial multienzyme complexes

    Science.gov (United States)

    Keighron, Jacqueline D.

    2010-11-01

    The sequential enzymes of several metabolic pathways have been shown to exist in close proximity to each other in the living cell. Although not proven in all cases, proximity between the sequential enzymes of a metabolic pathway has been proposed to have several benefits for the overall rate of metabolite formation. These include reduced diffusion distance for intermediates and sequestering of intermediates from competing pathways and the cytoplasm. Restricted diffusion in the vicinity of an enzyme can also cause the pooling of metabolites, which can alter reaction equilibria to control the rate of reaction through inhibition. Associations of metabolic enzymes are difficult to isolate ex vivo due to the weak interactions believed to colocalize sequential enzymes within the cell. Therefore, model systems in which the proximity of enzymes and the diffusion of intermediates are controlled are attractive alternatives for exploring the effects of colocalization of sequential enzymes. To this end, three model systems for multienzyme complexes have been constructed. Direct-adsorption enzyme:gold nanoparticle bioconjugates functionalized with malate dehydrogenase (MDH) and citrate synthase (CS) allow the proximity between the enzymes to be controlled from the nanometer to the micron range. Results show that, while the enzymes in the colocalized and non-colocalized systems compared here behaved differently, the overall sequential activity of the pathway was improved by (1) decreasing the diffusion distance between active sites, (2) decreasing the diffusion coefficient of the reaction intermediate to prevent escape into the bulk solution, and (3) decreasing the overall amount of bioconjugate in the solution to prevent the pathway from being inhibited by the buildup of metabolite over time. Layer-by-layer (LBL) assemblies of MDH and CS were used to examine the layering effect of

  12. Mineralization of Synthetic Polymer Scaffolds: A Bottom-up Approach for the Development of Artificial Bone

    Energy Technology Data Exchange (ETDEWEB)

    Song, Jie; Viengkham, Malathong; Bertozzi, Carolyn R.

    2004-09-27

    The controlled integration of organic and inorganic components confers natural bone with superior mechanical properties. Bone biogenesis is thought to occur by templated mineralization of hard apatite crystals by an elastic protein scaffold, a process we sought to emulate with synthetic biomimetic hydrogel polymers. Crosslinked polymethacrylamide and polymethacrylate hydrogels were functionalized with mineral-binding ligands and used to template the formation of hydroxyapatite. Strong adhesion between the organic and inorganic materials was achieved for hydrogels functionalized with either carboxylate or hydroxy ligands. The mineral-nucleating potential of hydroxyl groups identified here broadens the design parameters for synthetic bone-like composites and suggests a potential role for hydroxylated collagen proteins in bone mineralization.

  13. Design of the Bottom-up Innovation project - a participatory, primary preventive, organizational level intervention on work-related stress and well-being for workers in Dutch vocational education

    Science.gov (United States)

    2013-01-01

    Background In the educational sector, job demands have intensified while job resources have remained the same. A prolonged imbalance between demands and resources contributes to lowered vitality and heightened need for recovery, eventually resulting in burnout, sickness absence and retention problems. Until now, stress management interventions in education have focused mostly on strengthening the individual capacity to cope with stress, instead of altering the sources of stress at work at the organizational level. These interventions have been only partly effective in influencing burnout and well-being. Therefore, the “Bottom-up Innovation” project tests a two-phased participatory, primary preventive organizational level intervention (i.e. a participatory action approach) that targets and engages all workers in the primary process of schools. It is hypothesized that participating in the project results in increased occupational self-efficacy and organizational efficacy. The central research question is: is an organization-focused stress management intervention based on participatory action effective in reducing the need for recovery and enhancing vitality in school employees in comparison to business as usual? Methods/Design The study is designed as a controlled trial with mixed methods and three measurement moments: baseline (quantitative measures), six months and 18 months (quantitative and qualitative measures). At the first follow-up, short-term effects of taking part in the needs assessment (phase 1) will be determined. At the second follow-up, the long-term effects of taking part in the needs assessment will be determined as well as the effects of implemented tailored workplace solutions (phase 2). A process evaluation based on quantitative and qualitative data will shed light on whether, how and why the intervention (does not) work(s). Discussion “Bottom-up Innovation” is a combined effort of the educational sector, intervention providers and researchers. Results will

  14. Growth in NOx emissions from power plants in China: bottom-up estimates and satellite observations

    Directory of Open Access Journals (Sweden)

    Y. Lei

    2012-05-01

    Full Text Available Using OMI (Ozone Monitoring Instrument tropospheric NO2 columns and a nested-grid 3-D global chemical transport model (GEOS-Chem, we investigated the growth in NOx emissions from coal-fired power plants and their contributions to the growth in NO2 columns in 2005–2007 in China. We first developed a unit-based power plant NOx emission inventory for 2005–2007 to support this investigation. The total capacities of coal-fired power generation have increased by 48.8% in 2005–2007, with 92.2% of the total capacity additions coming from generator units with size ≥300 MW. The annual NOx emissions from coal-fired power plants were estimated to be 8.11 Tg NO2 for 2005 and 9.58 Tg NO2 for 2007, respectively. The modeled summer average tropospheric NO2 columns were highly correlated (R2 = 0.79–0.82 with OMI measurements over grids dominated by power plant emissions, with only 7–14% low bias, lending support to the high accuracy of the unit-based power plant NOx emission inventory. The ratios of OMI-derived annual and summer average tropospheric NO2 columns between 2007 and 2005 indicated that most of the grids with significant NO2 increases were related to power plant construction activities. OMI had the capability to trace the changes of NOx emissions from individual large power plants in cases where there is less interference from other NOx sources. Scenario runs from GEOS-Chem model suggested that the new power plants contributed 18.5% and 10% to the annual average NO2 columns in 2007 in Inner Mongolia and North China, respectively. The massive new power plant NOx emissions significantly changed the local NO2 profiles, especially in less polluted areas. A sensitivity study found that changes of NO2 shape factors due to including new power plant emissions increased the summer average OMI tropospheric NO2 columns by 3.8–17.2% for six selected locations, indicating that the updated emission information could help to improve the satellite

  15. Growth in NOx emissions from power plants in China: bottom-up estimates and satellite observations

    Directory of Open Access Journals (Sweden)

    Y. Lei

    2012-01-01

    Full Text Available Using OMI (Ozone Monitoring Instrument tropospheric NO2 columns and a nested-grid 3-D global chemical transport model (GEOS-Chem, we investigated the growth in NOx emissions from coal-fired power plants and their contributions to the growth in NO2 columns in 2005–2007 in China. We first developed a unit-based power plant NOx emission inventory for 2005–2007 to support this investigation. The total capacities of coal-fired power generation have increased by 48.8% in 2005–2007, with 92.2% of the total capacity additions coming from generator units with size ≥300 MW. The annual NOx emissions from coal-fired power plants were estimated to be 8.11 Tg NO2 for 2005 and 9.58 Tg NO2 for 2007, respectively. The modeled summer average tropospheric NO2 columns were highly correlated (R2 = 0.79–0.82 with OMI measurements over grids dominated by power plant emissions, with only 7–14% low bias, lending support to the high accuracy of the unit-based power plant NOx emission inventory. The ratios of OMI-derived annual and summer average tropospheric NO2 columns between 2007 and 2005 indicated that most of the grids with significant NO2 increases were related to power plant construction activities. OMI had the capability to trace the changes of NOx emissions from individual large power plants in cases where there is less interference from other NOx sources. Scenario runs from GEOS-Chem model suggested that the new power plants contributed 18.5% and 10% to the annual average NO2 columns in 2007 in Inner Mongolia and North China, respectively. The massive new power plant NOx emissions significantly changed the local NO2 profiles, especially in less polluted areas. A sensitivity study found that changes of NO2 shape factors due to including new power plant emissions increased the summer average OMI tropospheric NO2 columns by 3.8–17.2% for six selected locations, indicating that the updated emission information could help to improve the satellite

  16. Novel bottom-up SERS substrates for quantitative and parallelized analytics.

    Science.gov (United States)

    Strelau, Katharina K; Schüler, Thomas; Möller, Robert; Fritzsche, Wolfgang; Popp, Jürgen

    2010-02-01

    Surface-enhanced Raman spectroscopy (SERS) is an emerging technology in the field of analytics. Due to its high sensitivity in connection with specific Raman molecular fingerprint information, SERS can be used in a variety of analytical, bioanalytical, and biosensing applications. However, the SERS effect requires substrates with metal nanostructures. The broad application of this technology is greatly hampered by the lack of reliable and reproducible substrates. Usually the activity of a given substrate has to be determined by time-consuming experiments such as calibration or ultramicroscopic studies. To use SERS as a standard analytical tool, cheap and reproducible substrates are required, preferably with a characterization technique that does not interfere with the subsequent measurements. Herein we introduce an innovative approach to produce low-cost and large-scale reproducible substrates for SERS applications, which allows easy and economical production of micropatterned SERS active surfaces on a large scale. This approach is based on an enzyme-induced growth of silver nanostructures. The special structural feature of the enzymatically deposited silver nanoparticles prevents the breakdown of SERS activity even at high particle densities (particle density >60%) that lead to a conductive layer. In contrast to other approaches, this substrate exhibits a relationship between electrical conductivity and the resulting SERS activity of a given spot. This enables the prediction of the SERS activity of the nanostructure ensemble and thereby the controllable and reproducible production of SERS substrates of enzymatic silver nanoparticles on a large scale, utilizing a simple measurement of the electrical conductivity. Furthermore, through a correlation between the conductivity and the SERS activity of the substrates it is possible to quantify SERS measurements with these substrates.

  17. A two-step combination of top-down and bottom-up fire emission estimates at regional and global scales: strengths and main uncertainties

    Science.gov (United States)

    Sofiev, Mikhail; Soares, Joana; Kouznetsov, Rostislav; Vira, Julius; Prank, Marje

    2016-04-01

    Top-down emission estimation via inverse dispersion modelling is used for various problems where bottom-up approaches are difficult or highly uncertain. One such area is the estimation of emissions from wild-land fires. In combination with dispersion modelling, satellite and/or in-situ observations can, in principle, be used to efficiently constrain the emission values. This is the main strength of the approach: the a priori values of the emission factors (based on laboratory studies) are refined for real-life situations using the inverse-modelling technique. However, the approach also has major uncertainties, which are illustrated here with a few examples of the Integrated System for wild-land Fires (IS4FIRES). IS4FIRES generates the smoke emission and injection profile from MODIS and SEVIRI active-fire radiative energy observations. The emission calculation includes two steps: (i) initial top-down calibration of emission factors via an inverse dispersion problem solution, performed once using a training dataset from the past, (ii) application of the obtained emission coefficients to individual-fire radiative energy observations, thus leading to bottom-up emission compilation. For such a procedure, the major classes of uncertainties include: (i) imperfect information on fires, (ii) simplifications in the fire description, (iii) inaccuracies in the smoke observations and modelling, (iv) inaccuracies of the inverse problem solution. Using examples from the fire seasons of 2010 in Russia, 2012 in Eurasia, and 2007 in Australia, among others, it is pointed out that the top-down system calibration performed for a limited number of comparatively moderate cases (often the best-observed ones) may lead to errors in application to extreme events. For instance, the total emission of the 2010 Russian fires is likely to be over-estimated by up to 50% if the calibration is based on the 2006 season and the fire description is simplified. A longer calibration period and more sophisticated parameterization

  18. Methodology to characterize a residential building stock using a bottom-up approach: a case study applied to Belgium

    Directory of Open Access Journals (Sweden)

    Samuel Gendebien

    2014-06-01

    Full Text Available In the last ten years, the development and implementation of measures to mitigate climate change have become of major importance. In Europe, the residential sector accounts for 27% of the final energy consumption [1], and therefore contributes significantly to CO2 emissions. Roadmaps towards energy-efficient buildings have been proposed [2]. In such a context, the detailed characterization of residential building stocks in terms of age, type of construction, insulation level, energy vector, and evolution prospects appears to be a useful contribution to assessing the impact of implementing energy policies. In this work, a methodology to develop a tree-structure characterizing a residential building stock is presented within the framework of a bottom-up approach that aims to model and simulate domestic energy use. The methodology is applied to the Belgian case for the current situation and up to the 2030 horizon. The potential applications of the developed tool are outlined.
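
    The tree-structure idea can be illustrated with a minimal data-structure sketch (the dwelling categories and counts below are invented placeholders, not the Belgian figures): dwellings are grouped by construction period, building type, and insulation level, and the tree can then be queried or aggregated level by level.

```python
from collections import defaultdict

# Hypothetical dwelling records: (construction period, building type, insulation level, count)
dwellings = [
    ("pre-1945",  "detached",  "none",       120_000),
    ("pre-1945",  "terraced",  "none",       300_000),
    ("1946-1970", "detached",  "roof only",  250_000),
    ("1971-1990", "apartment", "roof+walls", 180_000),
    ("1991-2010", "apartment", "full",        90_000),
]

# Build the tree: period -> building type -> insulation level -> number of dwellings
tree = defaultdict(lambda: defaultdict(dict))
for period, btype, insulation, count in dwellings:
    tree[period][btype][insulation] = count

# Example aggregation: dwellings per construction period (first level of the tree)
per_period = {period: sum(sum(levels.values()) for levels in types.values())
              for period, types in tree.items()}
print(per_period)
```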

  19. Bottom-Up Fabrication of Nanopatterned Polymers on DNA Origami by In Situ Atom-Transfer Radical Polymerization.

    Science.gov (United States)

    Tokura, Yu; Jiang, Yanyan; Welle, Alexander; Stenzel, Martina H; Krzemien, Katarzyna M; Michaelis, Jens; Berger, Rüdiger; Barner-Kowollik, Christopher; Wu, Yuzhou; Weil, Tanja

    2016-05-04

    Bottom-up strategies to fabricate patterned polymers at the nanoscale represent an emerging field in the development of advanced nanodevices, such as biosensors, nanofluidics, and nanophotonics. DNA origami techniques provide access to distinct architectures of various sizes and shapes and present manifold opportunities for functionalization at the nanoscale with the highest precision. Herein, we conduct in situ atom-transfer radical polymerization (ATRP) on DNA origami, yielding differently nanopatterned polymers of various heights. After cross-linking, the grafted polymeric nanostructures can even stably exist in solution without the DNA origami template. This straightforward approach allows for the fabrication of patterned polymers with low nanometer resolution, which provides access to unique DNA-based functional hybrid materials.

  20. Cyclization of the N-Terminal X-Asn-Gly Motif during Sample Preparation for Bottom-Up Proteomics

    DEFF Research Database (Denmark)

    Zhang, Xumin; Højrup, Peter

    2010-01-01

    We, herein, report a novel -17 Da peptide modification corresponding to an N-terminal cyclization of peptides possessing the N-terminal motif of X-Asn-Gly. The cyclization occurs spontaneously during sample preparation for bottom-up proteomics studies. Distinct from the two well-known N-terminal cyclizations, cyclization of N-terminal glutamine and S-carbamoylmethylcysteine, it is dependent on pH instead of [NH(4)(+)]. The data set from our recent study on large-scale N(α)-modified peptides revealed a sequence requirement for the cyclization event similar to the well-known deamidation of Asn to isoAsp and Asp. Detailed analysis using synthetic peptides confirmed that the cyclization forms between the N-terminus and its neighboring Asn residue, and the reaction shares the same succinimide intermediate with the Asn deamidation event. As a result, we, here, propose a molecular mechanism for this specific...

  1. Bottom-up fabrication of paper-based microchips by blade coating of cellulose microfibers on a patterned surface.

    Science.gov (United States)

    Gao, Bingbing; Liu, Hong; Gu, Zhongze

    2014-12-23

    We report a method for the bottom-up fabrication of paper-based capillary microchips by the blade coating of cellulose microfibers on a patterned surface. The fabrication process is similar to the paper-making process in which an aqueous suspension of cellulose microfibers is used as the starting material and is blade-coated onto a polypropylene substrate patterned using an inkjet printer. After water evaporation, the cellulose microfibers form a porous, hydrophilic, paperlike pattern that wicks aqueous solution by capillary action. This method enables simple, fast, inexpensive fabrication of paper-based capillary channels with both width and height down to about 10 μm. When this method is used, the capillary microfluidic chip for the colorimetric detection of glucose and total protein is fabricated, and the assay requires only 0.30 μL of sample, which is 240 times smaller than for paper devices fabricated using photolithography.

  2. Mass Spectrometry Applied to Bottom-Up Proteomics: Entering the High-Throughput Era for Hypothesis Testing

    Science.gov (United States)

    Gillet, Ludovic C.; Leitner, Alexander; Aebersold, Ruedi

    2016-06-01

    Proteins constitute a key class of molecular components that perform essential biochemical reactions in living cells. Whether the aim is to extensively characterize a given protein or to perform high-throughput qualitative and quantitative analysis of the proteome content of a sample, liquid chromatography coupled to tandem mass spectrometry has become the technology of choice. In this review, we summarize the current state of mass spectrometry applied to bottom-up proteomics, the approach that focuses on analyzing peptides obtained from proteolytic digestion of proteins. With the recent advances in instrumentation and methodology, we show that the field is moving away from providing qualitative identification of long lists of proteins to delivering highly consistent and accurate quantification values for large numbers of proteins across large numbers of samples. We believe that this shift will have a profound impact for the field of proteomics and life science research in general.

  3. The faith of a physicist reflections of a bottom-up thinker : the Gifford lectures for 1993-4

    CERN Document Server

    Polkinghorne, John C

    1994-01-01

    Is it possible to think like a scientist and yet have the faith of a Christian? Although many Westerners might say no, there are also many critically minded individuals who entertain what John Polkinghorne calls a "wistful wariness" toward religion--they feel unable to accept religion on rational grounds yet cannot dismiss it completely. Polkinghorne, both a particle physicist and Anglican priest, here explores just what rational grounds there could be for Christian beliefs, maintaining that the quest for motivated understanding is a concern shared by scientists and religious thinkers alike. Anyone who assumes that religion is based on unquestioning certainties, or that it need not take into account empirical knowledge, will be challenged by Polkinghorne's bottom-up examination of Christian beliefs about events ranging from creation to the resurrection. The author organizes his inquiry around the Nicene Creed, an early statement that continues to summarize Christian beliefs. He applies to each of its tenets ...

  4. Radiographic Evaluation of Children with Febrile Urinary Tract Infection: Bottom-Up, Top-Down, or None of the Above?

    Directory of Open Access Journals (Sweden)

    Michaella M. Prasad

    2012-01-01

    Full Text Available The proper algorithm for the radiographic evaluation of children with febrile urinary tract infection (FUTI is hotly debated. Three studies are commonly administered: renal-bladder ultrasound (RUS, voiding cystourethrogram (VCUG, and dimercapto-succinic acid (DMSA scan. However, the order in which these tests are obtained depends on the methodology followed: bottom-up or top-down. Each strategy carries advantages and disadvantages, and some groups now advocate even less of a workup (none of the above due to the current controversies about treatment when abnormalities are diagnosed. New technology is available and still under investigation, but it may help to clarify the interplay between vesicoureteral reflux, renal scarring, and dysfunctional elimination in the future.

  5. Identifying robust clusters and multi-community nodes by combining top-down and bottom-up approaches to clustering

    CERN Document Server

    Gaiteri, Chris; Szymanski, Boleslaw; Kuzmin, Konstantin; Xie, Jierui; Lee, Changkyu; Blanche, Timothy; Neto, Elias Chaibub; Huang, Su-Chun; Grabowski, Thomas; Madhyastha, Tara; Komashko, Vitalina

    2015-01-01

    Biological functions are often realized by groups of interacting molecules or cells. Membership in these groups may overlap when molecules or cells are reused in multiple functions. Traditional clustering methods assign each component to one group. Noisy measurements are common in high-throughput biological datasets. These two limitations reduce our ability to accurately define clusters in biological datasets and to interpret their biological functions. To address these limitations, we designed an algorithm called SpeakEasy, which detects overlapping or non-overlapping communities in biological networks. Input to SpeakEasy can be physical networks, such as molecular interactions, or inferred networks, such as gene coexpression networks. The networks can be directed or undirected, and may contain negative links. SpeakEasy combines traditional bottom-up and top-down approaches to clustering, by creating competition between clusters. Nodes that oscillate between multiple clusters in this competition are classifi...

  6. Biochemistry-directed hollow porous microspheres: bottom-up self-assembled polyanion-based cathodes for sodium ion batteries.

    Science.gov (United States)

    Lin, Bo; Li, Qiufeng; Liu, Baodong; Zhang, Sen; Deng, Chao

    2016-04-21

    Biochemistry-directed synthesis of functional nanomaterials has attracted great interest in energy storage, catalysis and other applications. The unique ability of biological systems to guide molecular self-assembly facilitates the construction of distinctive architectures with desirable physicochemical characteristics. Herein, we report a biochemistry-directed "bottom-up" approach to construct hollow porous microspheres of polyanion materials for sodium ion batteries. Two kinds of polyanions, i.e. Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2, are employed as cases in this study. The microalgae cell realizes the formation of a spherical "bottom" bio-precursor. Its tiny core is subjected to destruction and its tough shell tends to carbonize upon calcination, resulting in hollow porous microspheres as the "top" product. The nanoscale crystals of the polyanion materials are tightly enwrapped by the highly-conductive framework in the hollow microsphere, resulting in the hierarchical nano-microstructure. The whole formation process is disclosed as a "bottom-up" mechanism. Moreover, the biochemistry-directed self-assembly process is confirmed to play a crucial role in the construction of the final architecture. Taking advantage of the well-defined hollow-microsphere architecture, the abundant interior voids and the highly-conductive framework, polyanion materials show favourable sodium-intercalation kinetics. Both materials are capable of high-rate long-term cycling. After five hundred cycles at 20 C and 10 C, Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2 retain 96.2% and 93.1% of the initial capacity, respectively. Therefore, the biochemistry-directed technique provides a low-cost, highly-efficient and widely applicable strategy to produce high-performance polyanion-based cathodes for sodium ion batteries.

  7. A comparison of top-down and bottom-up approaches to benthic habitat mapping to inform offshore wind energy development

    Science.gov (United States)

    LaFrance, Monique; King, John W.; Oakley, Bryan A.; Pratt, Sheldon

    2014-07-01

    heterogeneity over various spatial scales. The approaches were also able to integrate various data at differing spatial resolutions. The classification outputs exhibited similar results, including the number of habitat classes generated, the number of species defining the classes, the level of distinction of the biological communities, and dominance by tube-building amphipods. These results indicate that both approaches are able to discern a comparable degree of habitat variability and produce cohesive macrofaunal assemblages. The mapping approaches identify broadly similar benthic habitats at the two study sites and methods were able to distinguish the differing levels of heterogeneity between them. The top-down approach to habitat classification was faster and simpler to accomplish with the data available in this study when compared to the bottom-up approach. Additionally, the top-down approach generated full-coverage habitat classes that are clearly delineated and can easily be interpreted by the map user, which is desirable from a management perspective for providing a more complete assessment of the areas of interest. However, a higher level of biological variability was noted in some of the habitat classes created, indicating that the biological communities present in this area are influenced by factors not captured in the broad-scale geological habitat units used in this approach. The bottom-up approach was valuable in its ability to more clearly define macrofaunal assemblages among habitats, discern finer-scale habitat characteristics, and directly assess the degree of macrofaunal assemblage variability captured by the environmental parameters. From a user perspective, the map is more complex, which may be perceived as a limitation, though likely reflects natural gradations in habitat structure and likely presents a more ecologically realistic portrayal of the study areas. Though more comprehensive, the bottom-up approach in this study was limited by the reliance on

  8. Bottom-up preparation of MgH2 nanoparticles with enhanced cycle life stability during electrochemical conversion in Li-ion batteries

    Science.gov (United States)

    Oumellal, Yassine; Zlotea, Claudia; Bastide, Stéphane; Cachet-Vivier, Christine; Léonel, Eric; Sengmany, Stéphane; Leroy, Eric; Aymard, Luc; Bonnet, Jean-Pierre; Latroche, Michel

    2014-11-01

    A promising anode material for Li-ion batteries based on MgH2 with around 5 nm average particles size was synthesized by a bottom-up method. A series of several composites containing MgH2 nanoparticles well dispersed into a porous carbon host has been prepared with different metal content up to 70 wt%. A narrow particle size distribution (1-10 nm) of the MgH2 nanospecies with around 5.5 nm average size can be controlled up to 50 wt% Mg. After a ball milling treatment under Ar, the composite containing 50 wt% Mg shows an impressive cycle life stability with a good electrochemical capacity of around 500 mA h g-1. Moreover, the nanoparticles' size distribution is stable during cycling.

  9. Evaluating vehicle re-entrained road dust and its potential to deposit to Lake Tahoe: a bottom-up inventory approach.

    Science.gov (United States)

    Zhu, Dongzi; Kuhns, Hampden D; Gillies, John A; Gertler, Alan W

    2014-01-01

    Identifying hotspot areas impacted by emissions of dust from roadways is an essential step for mitigation. This paper develops a detailed road dust PM₁₀ emission inventory using a bottom-up approach and evaluates the potential for the dust to deposit to Lake Tahoe where it can affect water clarity. Previous studies of estimates of quantities of atmospheric deposition of fine sediment particles ("FSP") [...]. Dust emission factors, five years of meteorological data, a traffic demand model and GIS analysis were used to estimate the near-field atmospheric deposition of airborne particulate matter to the lake. Approximately 20 Mg year(-1) of PM₁₀ and ~36 Mg year(-1) Total Suspended Particulate (TSP) from roadway emissions of dust are estimated to reach the lake. We estimate that the atmospheric dry deposition of particles to the lake attributable to vehicle travel on paved roads is approximately 0.6% of the Total Maximum Daily Loadings (TMDL) of FSP that the lake can receive and still meet water quality standards.
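
    The bottom-up inventory logic summarized above (per-road emission factors combined with traffic activity, summed over road segments and attenuated by the fraction that reaches the lake) can be sketched as follows. All segment names, emission factors, traffic volumes and deposition fractions in the sketch are hypothetical placeholders, not values from the study.

        # Minimal bottom-up road-dust deposition sketch; every number below is
        # an illustrative placeholder, not a value taken from the study.
        road_segments = [
            # (segment, PM10 emission factor [g/vehicle-km], annual traffic [vehicle-km], fraction deposited to lake)
            ("shoreline highway", 0.8, 5.0e7, 0.10),
            ("upland road",       0.5, 2.0e7, 0.02),
        ]

        def annual_pm10_deposition_Mg(segments):
            """Sum EF x VKT x deposition fraction over segments; convert g to Mg."""
            grams = sum(ef * vkt * f_dep for _, ef, vkt, f_dep in segments)
            return grams / 1.0e6

        print(f"PM10 deposited to the lake: {annual_pm10_deposition_Mg(road_segments):.1f} Mg/year")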

  10. When top-down becomes bottom up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Science.gov (United States)

    Orihuela, Gabriela; Terborgh, John; Ceballos, Natalia; Glander, Kenneth

    2014-01-01

    Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto) where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana) offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviors are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment.

  11. Tailoring the morphology and luminescence of GaN/InGaN core-shell nanowires using bottom-up selective-area epitaxy

    Science.gov (United States)

    Nami, Mohsen; Eller, Rhett F.; Okur, Serdal; Rishinaramangalam, Ashwin K.; Liu, Sheng; Brener, Igal; Feezell, Daniel F.

    2017-01-01

    Controlled bottom-up selective-area epitaxy (SAE) is used to tailor the morphology and photoluminescence properties of GaN/InGaN core-shell nanowire arrays. The nanowires are grown on c-plane sapphire substrates using pulsed-mode metal organic chemical vapor deposition. By varying the dielectric mask configuration and growth conditions, we achieve GaN nanowire cores with diameters ranging from 80 to 700 nm that exhibit various degrees of polar, semipolar, and nonpolar faceting. A single InGaN quantum well (QW) and GaN barrier shell is also grown on the GaN nanowire cores and micro-photoluminescence is obtained and analyzed for a variety of nanowire dimensions, array pitch spacings, and aperture diameters. By increasing the nanowire pitch spacing on the same growth wafer, the emission wavelength redshifts from 440 to 520 nm, while increasing the aperture diameter results in a ˜35 nm blueshift. The thickness of one QW/barrier period as a function of pitch and aperture diameter is inferred using scanning electron microscopy, with larger pitches showing significantly thicker QWs. Significant increases in indium composition were predicted for larger pitches and smaller aperture diameters. The results are interpreted in terms of local growth conditions and adatom capture radius around the nanowires. This work provides significant insight into the effects of mask configuration and growth conditions on the nanowire properties and is applicable to the engineering of monolithic multi-color nanowire LEDs on a single chip.

  12. Climate change, pink salmon, and the nexus between bottom-up and top-down forcing in the subarctic Pacific Ocean and Bering Sea.

    Science.gov (United States)

    Springer, Alan M; van Vliet, Gus B

    2014-05-06

    Climate change in the last century was associated with spectacular growth of many wild Pacific salmon stocks in the North Pacific Ocean and Bering Sea, apparently through bottom-up forcing linking meteorology to ocean physics, water temperature, and plankton production. One species in particular, pink salmon, became so numerous by the 1990s that they began to dominate other species of salmon for prey resources and to exert top-down control in the open ocean ecosystem. Information from long-term monitoring of seabirds in the Aleutian Islands and Bering Sea reveals that the sphere of influence of pink salmon is much larger than previously known. Seabirds, pink salmon, other species of salmon, and by extension other higher-order predators, are tightly linked ecologically and must be included in international management and conservation policies for sustaining all species that compete for common, finite resource pools. These data further emphasize that the unique 2-y cycle in abundance of pink salmon drives interannual shifts between two alternate states of a complex marine ecosystem.

  13. When top-down becomes bottom up: behaviour of hyperdense howler monkeys (Alouatta seniculus) trapped on a 0.6 ha island.

    Directory of Open Access Journals (Sweden)

    Gabriela Orihuela

    Full Text Available Predators are a ubiquitous presence in most natural environments. Opportunities to contrast the behaviour of a species in the presence and absence of predators are thus rare. Here we report on the behaviour of howler monkey groups living under radically different conditions on two land-bridge islands in Lago Guri, Venezuela. One group of 6 adults inhabited a 190-ha island (Danto) where they were exposed to multiple potential predators. This group, the control, occupied a home range of 23 ha and contested access to food resources with neighbouring groups in typical fashion. The second group, containing 6 adults, was isolated on a remote, predator-free 0.6 ha islet (Iguana) offering limited food resources. Howlers living on the large island moved, fed and rested in a coherent group, frequently engaged in affiliative activities, rarely displayed agonistic behaviour and maintained intergroup spacing through howling. In contrast, the howlers on Iguana showed repulsion, as individuals spent most of their time spaced widely around the perimeter of the island. Iguana howlers rarely engaged in affiliative behaviour, often chased or fought with one another and were not observed to howl. These behaviors are interpreted as adjustments to the unrelenting deprivation associated with bottom-up limitation in a predator-free environment.

  14. Identifying robust communities and multi-community nodes by combining top-down and bottom-up approaches to clustering.

    Science.gov (United States)

    Gaiteri, Chris; Chen, Mingming; Szymanski, Boleslaw; Kuzmin, Konstantin; Xie, Jierui; Lee, Changkyu; Blanche, Timothy; Chaibub Neto, Elias; Huang, Su-Chun; Grabowski, Thomas; Madhyastha, Tara; Komashko, Vitalina

    2015-11-09

    Biological functions are carried out by groups of interacting molecules, cells or tissues, known as communities. Membership in these communities may overlap when biological components are involved in multiple functions. However, traditional clustering methods detect non-overlapping communities. These detected communities may also be unstable and difficult to replicate, because traditional methods are sensitive to noise and parameter settings. These aspects of traditional clustering methods limit our ability to detect biological communities, and therefore our ability to understand biological functions. To address these limitations and detect robust overlapping biological communities, we propose an unorthodox clustering method called SpeakEasy which identifies communities using top-down and bottom-up approaches simultaneously. Specifically, nodes join communities based on their local connections, as well as global information about the network structure. This method can quantify the stability of each community, automatically identify the number of communities, and quickly cluster networks with hundreds of thousands of nodes. SpeakEasy shows top performance on synthetic clustering benchmarks and accurately identifies meaningful biological communities in a range of datasets, including: gene microarrays, protein interactions, sorted cell populations, electrophysiology and fMRI brain imaging.
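
    The combination of local connections with global label statistics described above can be illustrated with a toy label-update rule: each node repeatedly adopts the neighbour label whose local count most exceeds the count expected from that label's global frequency. This sketch is only a rough paraphrase of the idea, not the published SpeakEasy implementation, and the graph is an invented example.

        # Toy "local count minus globally expected count" label updates; not the
        # published SpeakEasy algorithm. Graph and sweep count are arbitrary.
        from collections import Counter

        adjacency = {
            "a": ["b", "c"], "b": ["a", "c"], "c": ["a", "b", "d"],
            "d": ["c", "e", "f"], "e": ["d", "f"], "f": ["d", "e"],
        }
        labels = {node: node for node in adjacency}      # start from unique labels

        for _ in range(10):                              # fixed number of sweeps
            global_freq = Counter(labels.values())
            n = len(labels)
            for node, neighbours in adjacency.items():
                local = Counter(labels[nb] for nb in neighbours)
                # specificity: observed neighbour count minus expected count under
                # the current global label frequencies
                labels[node] = max(local, key=lambda lab: local[lab] - len(neighbours) * global_freq[lab] / n)

        print(labels)   # inspect the resulting label assignment for the toy graph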

  15. Bioenergy decision-making of farms in Northern Finland: Combining the bottom-up and top-down perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Snaekin, Juha-Pekka, E-mail: juhapekkasnakin@luukku.co [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland); Muilu, Toivo; Pesola, Tuomo [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland)

    2010-10-15

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues favoring large-scale energy visions, additional investment support for agriculture will stay modest. To utilize fully the energy potential in farms, we analyze the farmers' decision-making environment. First, we present an overview of the Finnish energy policy and economy and their effect on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided in three different groups - investors, entrepreneurs and hobbyists - that have different levels of commitment to their energy businesses. This further stresses the importance of getting quality business services from numerous service providers.

  16. Bioenergy decision-making of farms in Northern Finland. Combining the bottom-up and top-down perspectives

    Energy Technology Data Exchange (ETDEWEB)

    Snaekin, Juha-Pekka; Muilu, Toivo; Pesola, Tuomo [University of Oulu, Department of Geography, P.O. Box 3000, FIN-90014 Oulu (Finland)

    2010-10-15

    Finnish farmers' role as energy producers is small compared to their role as energy resource owners. Since climate and energy policy in Finland continues favoring large-scale energy visions, additional investment support for agriculture will stay modest. To utilize fully the energy potential in farms, we analyze the farmers' decision-making environment. First, we present an overview of the Finnish energy policy and economy and their effect on farms (the top-down perspective). Then we analyze the drivers behind the bioenergy decisions of farms in general and in the Oulu region, located in Northern Finland (the bottom-up perspective). There is weak policy coherence between national and regional energy efforts. Strong pressure is placed on farmers to improve their business and marketing knowledge, innovation and financial abilities, education level, and networking skills. In the Oulu region, bioenergy forerunners can be divided in three different groups - investors, entrepreneurs and hobbyists - that have different levels of commitment to their energy businesses. This further stresses the importance of getting quality business services from numerous service providers. (author)

  17. A bottom-up, scientist-based initiative for the communication of climate sciences with the general public

    Science.gov (United States)

    Bourqui, Michel; Bolduc, Cassandra; Paul, Charbonneau; Marie, Charrière; Daniel, Hill; Angelica, Lopez; Enrique, Loubet; Philippe, Roy; Barbara, Winter

    2015-04-01

    This talk introduces a new, scientist-initiated online platform whose aim is to contribute to making climate sciences public knowledge. It takes a unique bottom-up approach, strictly founded on individual-based participation, high scientific standards and independence. The main purpose is to build an open-access, multilingual and peer-reviewed journal publishing short climate articles in non-scientific language. The targeted public includes journalists, teachers, students, local politicians, economists, members of the agriculture sector, and any other citizens from around the world with an interest in climate sciences. This journal is meant to offer a simple and direct channel for scientists wishing to disseminate their research to the general public. A high standard of climate articles is ensured through: a) requiring that the main author is an active climate scientist, and b) an innovative peer-review process involving scientific and non-scientific referees with distinct roles. The platform fosters the direct participation of non-scientists through co-authoring, peer-reviewing and language translation. It furthermore engages the general public in the scientific inquiry by allowing non-scientists to invite manuscripts to be written on topics of their concern. The platform is currently being developed by a community of scientists and non-scientists. In this talk, I will present the basic ideas behind this new online platform, its current state and plans for the near future. The beta version of the platform is available at: http://www.climateonline.bourquiconsulting.ch

  18. Encouraging the pursuit of advanced degrees in science and engineering: Top-down and bottom-up methodologies

    Science.gov (United States)

    Maddox, Anthony B.; Smith-Maddox, Renee P.; Penick, Benson E.

    1989-01-01

    The MassPEP/NASA Graduate Research Development Program (GRDP), whose objective is to encourage Black Americans, Mexican Americans, American Indians, Puerto Ricans, and Pacific Islanders to pursue graduate degrees in science and engineering, is described. The GRDP employs a top-down or goal-driven methodology through five modules which focus on research, graduate school climate, technical writing, standardized examinations, and electronic networking. These modules are designed to develop and reinforce some of the skills necessary to seriously consider the goal of completing a graduate education. The GRDP is a community-based program which seeks to recruit twenty participants from a pool of Boston-area undergraduates enrolled in engineering and science curricula and recent graduates with engineering and science degrees. The program emphasizes that with sufficient information, its participants can overcome most of the barriers perceived as preventing them from obtaining graduate science and engineering degrees. Experience has shown that the top-down modules may be complemented by a more bottom-up or event-driven methodology. This approach considers events in the academic and professional experiences of participants in order to develop the personal and leadership skills necessary for graduate school and similar endeavors.

  19. A bottom-up method for module-based product platform development through mapping, clustering and matching analysis

    Institute of Scientific and Technical Information of China (English)

    ZHANG Meng; LI Guo-xi; CAO Jian-ping; GONG Jing-zhong; WU Bao-zhong

    2016-01-01

    Designing a product platform can be an effective and efficient solution for manufacturing firms. Product platforms enable firms to provide increased product variety for the marketplace with as little variety between products as possible. Consumer products and modules already developed within a firm can be investigated to assess the potential for product platform creation. A bottom-up method is proposed for module-based product platform development through mapping, clustering and matching analysis. The framework and the parametric model of the method are presented, which consist of three steps: (1) mapping parameters from existing product families to functional modules, (2) clustering the modules within existing module families based on their parameters so as to generate module clusters, and selecting satisfactory module clusters based on commonality, and (3) matching the parameters of the module clusters to the functional modules in order to capture platform elements. In addition, a parameter matching criterion and a mismatching treatment are put forward to ensure the effectiveness of the platform process, and standardization and serialization of the platform elements are presented. A design case of a belt conveyor is studied to demonstrate the feasibility of the proposed method.

  20. Estimation of the measurement uncertainty by the bottom-up approach for the determination of methamphetamine and amphetamine in urine.

    Science.gov (United States)

    Lee, Sooyeun; Choi, Hyeyoung; Kim, Eunmi; Choi, Hwakyung; Chung, Heesun; Chung, Kyu Hyuck

    2010-05-01

    The measurement uncertainty (MU) of methamphetamine (MA) and amphetamine (AP) was estimated in an authentic urine sample with a relatively low concentration of MA and AP using the bottom-up approach. A cause-and-effect diagram was constructed; the amount of MA or AP in the sample, the sample volume, method precision, and sample effect were considered as uncertainty sources. The concentrations of MA and AP in the urine sample with their expanded uncertainties were 340.5 +/- 33.2 ng/mL and 113.4 +/- 15.4 ng/mL, respectively; that is, the expanded uncertainties corresponded to 9.7% and 13.6% of the respective concentrations. The largest uncertainty originated from the sample effect for MA and from method precision for AP, while the uncertainty contribution of the sample volume was minimal in both. The MU needs to be determined during the method validation process to assess test reliability. Moreover, the identification of the largest and/or smallest uncertainty source can help improve experimental protocols.
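
    The bottom-up budget described above can be sketched in GUM style: relative standard uncertainties of the individual sources are combined in quadrature and multiplied by a coverage factor (k = 2 is assumed here for an approximately 95% level). The component values below are invented for illustration and are not the budget reported in the study.

        # Bottom-up combination of uncertainty components (GUM-style sketch);
        # component values are invented, not the study's actual budget.
        from math import sqrt

        concentration = 340.5                      # ng/mL (MA value quoted in the abstract)
        components = {                             # relative standard uncertainties (hypothetical)
            "amount of analyte": 0.030,
            "sample volume":     0.005,
            "method precision":  0.025,
            "sample effect":     0.028,
        }

        u_rel = sqrt(sum(u ** 2 for u in components.values()))   # combined relative uncertainty
        U_expanded = 2 * u_rel * concentration                   # coverage factor k = 2

        print(f"combined relative uncertainty u_c = {u_rel:.3f}")
        print(f"result: {concentration:.1f} +/- {U_expanded:.1f} ng/mL (k = 2)")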

  1. D-Branes at Singularities: A Bottom-Up Approach to the String Embedding of the Standard Model

    CERN Document Server

    Aldazabal, G; Quevedo, Fernando; Uranga, Angel M

    2000-01-01

    We propose a bottom-up approach to the building of particle physics models from string theory. Our building blocks are Type II D-branes which we combine appropriately to reproduce desirable features of a particle theory model: 1) chirality; 2) the Standard Model group; 3) N=1 or N=0 supersymmetry; 4) three quark-lepton generations. We start such a program by studying configurations of D=10, Type IIB D3-branes located at singularities. We study in detail the case of Z_N, N=1,0 supersymmetric orbifold singularities leading to the SM group or some left-right symmetric extension. In general, tadpole cancellation conditions require the presence of additional branes, e.g. D7-branes. For the N=1 supersymmetric case the unique twist leading to three quark-lepton generations is Z_3, predicting $\sin^2\theta_W = 3/14 \simeq 0.21$. The models obtained are the simplest semirealistic string models ever built. In the non-supersymmetric case there is a three-generation model for each Z_N, N>4, but the Weinberg angle is in general too ...

  2. Biochemistry-directed hollow porous microspheres: bottom-up self-assembled polyanion-based cathodes for sodium ion batteries

    Science.gov (United States)

    Lin, Bo; Li, Qiufeng; Liu, Baodong; Zhang, Sen; Deng, Chao

    2016-04-01

    Biochemistry-directed synthesis of functional nanomaterials has attracted great interest in energy storage, catalysis and other applications. The unique ability of biological systems to guide molecule self-assembling facilitates the construction of distinctive architectures with desirable physicochemical characteristics. Herein, we report a biochemistry-directed ``bottom-up'' approach to construct hollow porous microspheres of polyanion materials for sodium ion batteries. Two kinds of polyanions, i.e. Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2, are employed as cases in this study. The microalgae cell realizes the formation of a spherical ``bottom'' bio-precursor. Its tiny core is subjected to destruction and its tough shell tends to carbonize upon calcination, resulting in the hollow porous microspheres for the ``top'' product. The nanoscale crystals of the polyanion materials are tightly enwrapped by the highly-conductive framework in the hollow microsphere, resulting in the hierarchical nano-microstructure. The whole formation process is disclosed as a ``bottom-up'' mechanism. Moreover, the biochemistry-directed self-assembly process is confirmed to play a crucial role in the construction of the final architecture. Taking advantage of the well-defined hollow-microsphere architecture, the abundant interior voids and the highly-conductive framework, polyanion materials show favourable sodium-intercalation kinetics. Both materials are capable of high-rate long-term cycling. After five hundred cycles at 20 C and 10 C, Na3V2(PO4)3 and Na3.12Fe2.44(P2O7)2 retain 96.2% and 93.1% of the initial capacity, respectively. Therefore, the biochemistry-directed technique provides a low-cost, highly-efficient and widely applicable strategy to produce high-performance polyanion-based cathodes for sodium ion batteries.

  3. A novel bottom-up process to produce nanoparticles containing protein and peptide for suspension in hydrofluoroalkane propellants.

    Science.gov (United States)

    Tan, Yinhe; Yang, Zhiwen; Peng, Xinsheng; Xin, Feng; Xu, Yuehong; Feng, Min; Zhao, Chunshun; Hu, Haiyan; Wu, Chuanbin

    2011-07-15

    To overcome the disadvantages of microemulsion and nanoprecipitation methods for producing protein-containing nanoparticles, a novel bottom-up process was developed to produce nanoparticles containing the model protein lysozyme. The nanoparticles were generated by freeze-drying a solution of lysozyme, lecithin and lactose in a tert-butyl alcohol (TBA)/water co-solvent system and washing off excess lecithin in the lyophilizate by centrifugation. Formulation parameters such as the lecithin concentration in the organic phase, the water content in the TBA/water co-solvent, and the lactose concentration in water were optimized so as to obtain the desired nanoparticles with retention of the bioactivity of lysozyme. Based on the results, 24.0% (w/v) lecithin, 37.5% (v/v) water content, and 0.56% (w/v) lactose concentration were selected to generate spherical nanoparticles with approximately 200 nm mean size, 0.1 polydispersity index (PI), and 99% retained bioactivity of lysozyme. These nanoparticles, rinsed with ethanol containing dipalmitoylphosphatidylcholine (DPPC), Span 85 or oleic acid (3%, w/v), could readily be dispersed in HFA 134a to form a stable suspension with good redispersibility and 98% retained bioactivity of lysozyme. The study indicates there is potential to produce pressurized metered-dose inhaler (pMDI) formulations containing therapeutic protein and peptide nanoparticles.

  4. A comparative 'bottom up' proteomics strategy for the site-specific identification and quantification of protein modifications by electrophilic lipids.

    Science.gov (United States)

    Han, Bingnan; Hare, Michael; Wickramasekara, Samanthi; Fang, Yi; Maier, Claudia S

    2012-10-22

    We report a mass spectrometry-based comparative "bottom up" proteomics approach that combines d(0)/d(4)-succinic anhydride labeling with commercially available hydrazine (Hz)-functionalized beads (Affi-gel Hz beads) for detection, identification and relative quantification of site-specific oxylipid modifications in biological matrices. We evaluated and applied this robust and simple method for the quantitative analysis of oxylipid protein conjugates in cardiac mitochondrial proteome samples isolated from 3- and 24-month-old rat hearts. The use of d(0)/d(4)-succinic anhydride labeling, Hz-bead based affinity enrichment, nanoLC fractionation and MALDI-ToF/ToF tandem mass spectrometry yielded relative quantification of oxylipid conjugates with residue-specific modification information. Conjugation of acrolein (ACR), 4-hydroxy-2-hexenal (HHE), 4-hydroxy-2-nonenal (HNE) and 4-oxo-2-nonenal (ONE) to cysteine, histidine and lysine residues was identified. HHE conjugates were the predominant subset of Michael-type adducts detected in this study. The HHE conjugates showed higher levels in mitochondrial preparations from young hearts, congruent with previous findings by others that the n-3/n-6 PUFA ratio is higher in young heart mitochondrial membranes. Although this study focuses on protein adducts of reactive oxylipids, the method might be equally applicable to protein carbonyl modifications caused by metal-catalyzed oxidation reactions.

  5. Beyond Defining the Smart City. Meeting Top-Down and Bottom-Up Approaches in the Middle

    Directory of Open Access Journals (Sweden)

    Jonas Breuer

    2014-05-01

    Full Text Available This paper aims to better frame the discussion and the various, divergent operationalisations and interpretations of the Smart City concept. We start by explicating top-down approaches to the Smart City, followed by what purely bottom-up initiatives can look like. We provide a clear overview of stakeholders’ different viewpoints on the city of tomorrow. Particularly the consequences and potential impacts of these differing interpretations and approaches should be of specific interest to researchers, policy makers, city administrations, private actors and anyone involved and concerned with life in cities. Therefore the goal of this article is not so much answering the question of what the Smart City is, but rather what the concept can mean for different stakeholders as well as the consequences of their interpretation. We do this by assembling an eclectic overview, bringing together definitions, examples and operationalisations from academia, policy and industry as well as identifying major trends and approaches to realizing the Smart City. We add to the debate by proposing a different approach that starts from the collective, collaboration and context when researching Smart City initiatives.

  6. A bottom-up valence bond derivation of excitation energies in 1D-like delocalized systems.

    Science.gov (United States)

    Kepenekian, Mikaël; Robert, Vincent; Boilleau, Corentin; Malrieu, Jean-Paul

    2012-01-28

    Using the chemically relevant parameters hopping integral t(0) and on-site repulsion energy U, the charge gap (lowest dipolarly allowed transition energy) in 1D systems is examined through a bottom-up strategy. The method is based on the locally ionized states, the energies of which are corrected using short-range delocalization effects. In a valence bond framework, these states interact to produce an excitonic matrix which accounts for the delocalized character of excited states. The treatment, which gives access to the correlated spectrum of ionization potentials, is entirely analytical and valid whatever the U/|t(0)| ratio for such systems ruled by Peierls-Hubbard Hamiltonians. This second-order analytical derivation is finally compared with numerical results of a renormalized excitonic treatment using larger blocks as functions of the U/|t(0)| ratio. The method is applied to dimerized chains and to fused polybenzenic 1D lattices. Such approaches complement the traditional Bloch-function based picture and deliver a conceptual understanding of the charge-gap opening process based on a chemically intuitive picture.
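
    How a charge gap grows with the U/|t(0)| ratio can be illustrated, in a much simpler setting than the 1D lattices treated above, with a half-filled two-site Hubbard dimer, whose gap E(N+1) + E(N-1) - 2E(N) has the closed form sqrt(U^2 + 16 t^2) - 2|t|. The sketch below merely evaluates that textbook toy expression; it is not the excitonic valence bond treatment of the paper.

        # Charge gap of a half-filled two-site Hubbard dimer: a toy illustration of
        # gap opening with increasing U/|t|, not the paper's valence bond method.
        from math import sqrt

        def dimer_charge_gap(U, t):
            """E(N+1) + E(N-1) - 2*E(N) for the two-site Hubbard model at half filling."""
            return sqrt(U ** 2 + 16 * t ** 2) - 2 * abs(t)

        t = 1.0
        for U in (0.0, 1.0, 4.0, 10.0):
            print(f"U/|t| = {U / t:4.1f}  ->  charge gap = {dimer_charge_gap(U, t):.3f} |t|")
        # U = 0 gives the non-interacting gap 2|t|; for large U the gap tends to U - 2|t|.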

  7. Bottom-up estimation of joint moments during manual lifting using orientation sensors instead of position sensors.

    Science.gov (United States)

    Faber, Gert S; Kingma, Idsart; van Dieën, Jaap H

    2010-05-01

    L5/S1, hip and knee moments during manual lifting tasks are, in a laboratory environment, frequently established by bottom-up inverse dynamics, using force plates to measure ground reaction forces (GRFs) and an optoelectronic system to measure segment positions and orientations. For field measurements, alternative measurement systems are being developed. One alternative is the use of small body-mounted inertial/magnetic sensors (IMSs) and instrumented force shoes to measure segment orientation and GRFs, respectively. However, because IMSs measure segment orientations only, the positions of segments relative to each other and relative to the GRFs have to be determined by linking them, assuming fixed segment lengths and zero joint translation. This will affect the estimated joint positions and joint moments. This study investigated the effect of using segment orientations only (orientation-based method) instead of using orientations and positions (reference method) on three-dimensional joint moments. To compare analysis methods (and not measurement methods), GRFs were measured with a force plate and segment positions and/or orientations were measured using optoelectronic marker clusters for both analysis methods. Eleven male subjects lifted a box from floor level using three lifting techniques: a stoop, a semi-squat and a squat technique. The difference between the two analysis methods remained small for the knee moments; joint moments could thus be estimated at the knee joint, and with reasonable accuracy at the hip and L5/S1 joints, using segment orientation and GRF data only.
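
    A much-simplified, quasi-static 2D version of the bottom-up moment computation referred to above can be sketched as follows: the moment about a joint is taken as the moment of the ground reaction force about that joint plus the moments of the weights of the segments between the ground and that joint, with segment accelerations neglected. All coordinates, masses and forces are invented numbers, and this is not the study's full 3D linked-segment model.

        # Quasi-static 2D bottom-up joint moment sketch (sagittal plane); segment
        # inertial terms are neglected and all numbers are illustrative only.
        import numpy as np

        g = 9.81
        cop = np.array([0.05, 0.00])        # centre of pressure [m] (hypothetical)
        grf = np.array([20.0, 800.0])       # ground reaction force [N] (hypothetical)
        knee = np.array([0.00, 0.45])       # knee joint centre [m] (hypothetical)

        segments_below_knee = [              # (mass [kg], centre of mass [m]) of foot and shank
            (1.2, np.array([0.07, 0.05])),
            (3.5, np.array([0.02, 0.25])),
        ]

        def cross2d(r, f):
            return r[0] * f[1] - r[1] * f[0]

        def quasi_static_moment(joint, cop, grf, segments):
            m = cross2d(cop - joint, grf)                                  # GRF contribution
            for mass, com in segments:
                m += cross2d(com - joint, np.array([0.0, -mass * g]))      # segment weight
            return m

        print(f"quasi-static knee moment: {quasi_static_moment(knee, cop, grf, segments_below_knee):.1f} N m")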

  8. A Bottom-up Energy Efficiency Improvement Roadmap for China’s Iron and Steel Industry up to 2050

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Qi [Northeastern Univ., Shenyang (China); Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Hasanbeigi, Ali [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Price, Lynn [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Lu, Hongyou [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Arens, Marlene [Fraunhofer Inst. for Systems and Innovation Research (ISI), Karlsruhe (Germany)

    2016-09-01

    Iron and steel manufacturing is energy-intensive both in China and worldwide. China is the world's largest steel producer, accounting for around half of world steel production. In this study, we use a bottom-up energy consumption model to analyze four steel-production and energy-efficiency scenarios and evaluate the potential for energy savings from energy-efficient technologies in China’s iron and steel industry between 2010 and 2050. The results show that China’s steel production will rise and peak in the year 2020 at 860 million tons (Mt) per year for the base-case scenario and 680 Mt for the advanced energy-efficiency scenario. From 2020 on, production will gradually decrease to about 510 Mt and 400 Mt in 2050, for the base-case and advanced scenarios, respectively. Energy intensity will decrease from 21.2 gigajoules per ton (GJ/t) in 2010 to 12.2 GJ/t and 9.9 GJ/t in 2050 for the base-case and advanced scenarios, respectively. In the near term, decreases in iron and steel industry energy intensity will come from adoption of energy-efficient technologies. In the long term, a shift in the production structure of China’s iron and steel industry, reducing the share of blast furnace/basic oxygen furnace production and increasing the share of electric-arc furnace production while reducing the use of pig iron as a feedstock to electric-arc furnaces, will continue to reduce the sector’s energy consumption. We discuss barriers to achieving these energy-efficiency gains and make policy recommendations to support improved energy efficiency and a shift in the nature of iron and steel production in China.
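
    The bottom-up accounting underlying such scenarios multiplies projected output per production route by route-specific energy intensities and sums over routes. The sketch below uses production levels loosely echoing the 2050 figures quoted above, but all shares and intensities are illustrative assumptions rather than the study's model inputs.

        # Bottom-up sector energy sketch: energy = sum over routes of
        # (route share x total output x route intensity); all values are assumptions.
        def sector_energy_pj(total_output_mt, routes):
            """routes: {name: (share of output, intensity in GJ/t)}; Mt x GJ/t = PJ."""
            return sum(share * total_output_mt * intensity
                       for share, intensity in routes.values())

        base_2050     = {"BF-BOF": (0.70, 18.0), "scrap-EAF": (0.30, 6.0)}   # hypothetical
        advanced_2050 = {"BF-BOF": (0.45, 15.0), "scrap-EAF": (0.55, 5.0)}   # hypothetical

        for label, output_mt, routes in (("base case", 510, base_2050),
                                         ("advanced", 400, advanced_2050)):
            e = sector_energy_pj(output_mt, routes)
            print(f"{label}: {e:.0f} PJ, average intensity {e / output_mt:.1f} GJ/t")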

  9. Comparing bottom-up and top-down parameterisations of a process-based runoff generation model tailored on floods

    Science.gov (United States)

    Antonetti, Manuel; Scherrer, Simon; Margreth, Michael; Zappa, Massimiliano

    2016-04-01

    Information about the spatial distribution of dominant runoff processes (DRPs) can improve flood predictions in ungauged basins, where conceptual rainfall-runoff models usually appear to be limited due to the need for calibration. For example, hydrological classifications based on DRPs can be used as regionalisation tools assuming that, once a model structure and its parameters have been identified for each DRP, they can be transferred to other areas where the same DRP occurs. Here we present a process-based runoff generation model as an event-based spin-off of the conceptual hydrological model PREVAH. The model is grid-based and consists of a specific storage system for each DRP. To decouple the parameter values from catchment-related characteristics, the runoff concentration and the flood routing are uncoupled from the runoff generation routine and simulated separately. For the model parameterisation, two contrasting approaches are applied. First, in a bottom-up approach, the parameters of the runoff generation routine are determined a priori based on the results of sprinkling experiments on 60-100 m2 hillslope plots at several grassland locations in Switzerland. The model is then applied to a small catchment (0.5 km2) on the Swiss Plateau, and the parameters linked to the runoff concentration are calibrated on a single heavy rainfall-runoff event. The whole system is finally verified on several nearby catchments of larger sizes (up to 430 km2) affected by different heavy rainfall events. In a second attempt, following a top-down approach, all the parameters are calibrated on the largest catchment under investigation and successively verified on three sub-catchments. Simulation results from both parameterisation techniques are finally compared with results obtained with the traditional PREVAH.

  10. Bottom-up engineering of biological systems through standard bricks: a modularity study on basic parts and devices.

    Directory of Open Access Journals (Sweden)

    Lorenzo Pasotti

    Full Text Available BACKGROUND: Modularity is a crucial issue in the engineering world, as it enables engineers to achieve predictable outcomes when different components are interconnected. Synthetic Biology aims to apply key concepts of engineering to design and construct new biological systems that exhibit a predictable behaviour. Even though physical and measurement standards have recently been proposed to facilitate the assembly and characterization of biological components, real modularity is still a major research issue. The success of the bottom-up approach strictly depends on the clear definition of the limits in which biological functions can be predictable. RESULTS: The modularity of transcription-based biological components has been investigated in several conditions. First, the activity of a set of promoters was quantified in Escherichia coli via different measurement systems (i.e., different plasmids, reporter genes, ribosome binding sites) relative to an in vivo reference promoter. Second, promoter activity variation was measured when two independent gene expression cassettes were assembled in the same system. Third, the interchangeability of input modules (a set of constitutive promoters and two regulated promoters) connected to a fixed output device (a logic inverter expressing GFP) was evaluated. The three input modules provide tunable transcriptional signals that drive the output device. If modularity persists, identical transcriptional signals trigger identical GFP outputs. To verify this, all the input devices were individually characterized and then the input-output characteristic of the logic inverter was derived in the different configurations. CONCLUSIONS: Promoter activities (relative to a standard promoter) can vary when they are measured via different reporter devices (up to 22%), when they are used within a two-expression-cassette system (up to 35%) and when they drive another device in a functionally interconnected circuit (up to 44%). This paper

  11. Diffusion-driven multiscale analysis on manifolds and graphs: top-down and bottom-up constructions

    Science.gov (United States)

    Szlam, Arthur D.; Maggioni, Mauro; Coifman, Ronald R.; Bremer, James C., Jr.

    2005-08-01

    Classically, analysis on manifolds and graphs has been based on the study of the eigenfunctions of the Laplacian and its generalizations. These objects from differential geometry and analysis on manifolds have proven useful in applications to partial differential equations, and their discrete counterparts have been applied to optimization problems, learning, clustering, routing and many other algorithms.1-7 The eigenfunctions of the Laplacian are in general global: their support often coincides with the whole manifold, and they are affected by global properties of the manifold (for example certain global topological invariants). Recently a framework for building natural multiresolution structures on manifolds and graphs was introduced, that greatly generalizes, among other things, the construction of wavelets and wavelet packets in Euclidean spaces.8,9 This allows the study of the manifold and of functions on it at different scales, which are naturally induced by the geometry of the manifold. This construction proceeds bottom-up, from the finest scale to the coarsest scale, using powers of a diffusion operator as dilations and a numerical rank constraint to critically sample the multiresolution subspaces. In this paper we introduce a novel multiscale construction, based on a top-down recursive partitioning induced by the eigenfunctions of the Laplacian. This yields associated local cosine packets on manifolds, generalizing local cosines in Euclidean spaces.10 We discuss some of the connections with the construction of diffusion wavelets. These constructions have direct applications to the approximation, denoising, compression and learning of functions on a manifold and are promising in view of applications to problems in manifold approximation, learning, dimensionality reduction.

  12. Regime shift from phytoplankton to macrophyte dominance in a large river: Top-down versus bottom-up effects

    Energy Technology Data Exchange (ETDEWEB)

    Ibanez, Carles, E-mail: carles.ibanez@irta.cat [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alcaraz, Carles; Caiola, Nuno; Rovira, Albert; Trobajo, Rosa [IRTA Aquatic Ecosystems, Carretera Poble Nou, Km 5.5, 43540 St. Carles de la Rapita, Catalonia (Spain); Alonso, Miguel [United Research Services S.L., Urgell 143, 08036 Barcelona, Catalonia (Spain); Duran, Concha [Confederacion Hidrografica del Ebro, Sagasta 24-26, 50071 Zaragoza, Aragon (Spain); Jimenez, Pere J. [Grup Natura Freixe, Major 56, 43750 Flix, Catalonia (Spain); Munne, Antoni [Agencia Catalana de l' Aigua, Provenca 204-208, 08036 Barcelona, Catalonia (Spain); Prat, Narcis [Departament d' Ecologia, Universitat de Barcelona, Diagonal 645, 08028 Barcelona Catalonia (Spain)

    2012-02-01

    The lower Ebro River (Catalonia, Spain) has recently undergone a regime shift from a phytoplankton-dominated to a macrophyte-dominated system. This shift is well known in shallow lakes but apparently it has never been documented in rivers. Two initial hypotheses to explain the collapse of the phytoplankton were considered: a) the diminution of nutrients (bottom-up); b) the filtering effect due to the colonization of the zebra mussel (top-down). Data on water quality, hydrology and biological communities (phytoplankton, macrophytes and zebra mussel) were obtained both from existing data sets and new surveys. Results clearly indicate that the decrease in phosphorus is the main cause of a dramatic decrease in chlorophyll and large increase in water transparency, triggering the subsequent colonization of macrophytes in the river bed. A Generalized Linear Model analysis showed that the decrease in dissolved phosphorus had a relative importance 14 times higher than the increase in zebra mussel density to explain the variation of total chlorophyll. We suggest that the described changes in the lower Ebro River can be considered a novel ecosystem shift. This shift is triggering remarkable changes in the biological communities beyond the decrease of phytoplankton and the proliferation of macrophytes, such as massive colonization of Simuliidae (black fly) and other changes in the benthic invertebrate communities that are currently being investigated. - Highlights: We show a regime shift in a large river from phytoplankton to macrophyte dominance. Two main hypotheses are considered: nutrient decrease and zebra mussel grazing. Phosphorus depletion is found to be the main cause of the phytoplankton decline. We conclude that oligotrophication triggered the colonization of macrophytes. This new regime shift in a river is similar to that described

  13. Assessment of Historic Trend in Mobility and Energy Use in India Transportation Sector Using Bottom-up Approach

    Energy Technology Data Exchange (ETDEWEB)

    Zhou, Nan; McNeil, Michael A.

    2009-05-01

    Transportation mobility in India has increased significantly in the past decades. From 1970 to 2000, motorized mobility (passenger-km) rose by 888%, compared with 88% population growth (Singh, 2006). This has contributed to many energy and environmental issues, and an energy strategy incorporating efficiency improvements and other measures needs to be designed. Unfortunately, existing energy data do not provide information on the driving forces behind energy use and sometimes show large inconsistencies. Many previous studies address only a single transportation mode, such as passenger road travel, do not include comprehensive data collection or analysis, or lack detail on energy demand by mode and fuel mix. The current study fills a considerable gap in these efforts by developing a database covering all transport modes, including passenger air and water travel and freight, in order to facilitate the development of energy scenarios and to assess the significance of technology potential in a global climate change model. An extensive literature review and data collection was carried out to establish the database, with a breakdown of mobility, intensity, distance, and fuel mix for all transportation modes. Energy consumption was estimated and compared with the aggregated transport consumption reported in IEA India transportation energy data. Different scenarios were estimated based on different assumptions about road freight mobility. Based on the bottom-up analysis, we estimate that energy consumption from 1990 to 2000 increased at an annual growth rate of 7% for the mid-range road freight growth case and 12% for the high road freight growth case, while the IEA data show only a 1.7% growth rate over those years.
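
    The bottom-up estimate described above amounts to an activity-times-intensity calculation per mode: energy use is the sum over modes of mobility (passenger-km or tonne-km) multiplied by modal energy intensity. The sketch below shows the arithmetic with invented mobility and intensity figures, not the study's Indian transport database.

        # Bottom-up transport energy sketch: sum over modes of mobility x intensity.
        # All mobility and intensity values are illustrative placeholders.
        modes = {
            # mode: (annual mobility [billion pkm or tkm], intensity [MJ per pkm or tkm])
            "passenger road": (2000.0, 1.0),
            "passenger rail": ( 500.0, 0.2),
            "freight road":   ( 800.0, 2.0),
            "freight rail":   ( 400.0, 0.3),
        }

        def transport_energy_pj(modes):
            # billion (p/t)km x MJ per (p/t)km = 1e9 MJ = 1 PJ
            return sum(mobility * intensity for mobility, intensity in modes.values())

        print(f"bottom-up transport energy estimate: {transport_energy_pj(modes):.0f} PJ/year")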

  14. Trophic cascades of bottom-up and top-down forcing on nutrients and plankton in the Kattegat, evaluated by modelling

    DEFF Research Database (Denmark)

    Petersen, Marcell Elo; Maar, Marie; Larsen, Janus;

    2017-01-01

    The aim of the study was to investigate the relative importance of bottom-up and top-down forcing on trophic cascades in the pelagic food-web and the implications for water quality indicators (summer phytoplankton biomass and winter nutrients) in relation to management. The 3D ecological model ER...

  15. Applying bottom-up material flow analysis to identify the system boundaries of non-energy use data in international energy statistics

    NARCIS (Netherlands)

    Weiss, M.; Neelis, M.L.; Zuidberg, M.C.; Patel, M.K.

    2008-01-01

    Data on the non-energy use of fossil fuels in energy statistics are subject to major uncertainties. We apply a simple bottom-up methodology to recalculate non-energy use for the entire world and for the 50 countries with the highest consumption of fossil fuels for non-energy purposes. We quantify wo

  16. Research on ethics in two large Human Biomonitoring projects ECNIS and NewGeneris: a bottom up approach

    Directory of Open Access Journals (Sweden)

    Casteleyn Ludwine

    2008-01-01

    Full Text Available Assessment of ethical aspects and authorization by ethics committees have become a major constraint for health research involving human subjects. Ethical reference values are often extrapolated from clinical settings, where the emphasis lies on decisional autonomy and the protection of individual privacy. The question arises whether this set of values used in clinical research can be considered a relevant reference for HBM research, which is at the basis of public health surveillance. Current and future research activities using human biomarkers face new challenges and expectations on sensitive socio-ethical issues. Reflection is needed on the necessity to balance individual rights against the public interest. In addition, many HBM research programs require international collaboration. Domestic legislation is not always easily applicable in international projects. Also, there seem to be considerable inconsistencies in the ethical assessment of similar research activities between different countries and even within one country. All this causes delay and puts researchers in situations in which it is unclear how to act in accordance with the necessary legal requirements. Therefore, an analysis of ethical practices and their consequences for HBM research is needed. This analysis will be performed using a bottom-up approach, based on a methodology for comparative analysis of the determinants of ethical reasoning, allowing different social, cultural, political and historical traditions to be taken into account in view of safeguarding common EU values. Based on information collected in real-life complexity, paradigm cases and virtual case scenarios will be developed and discussed with relevant stakeholders to openly identify possible obstacles and options for improving regulation. The material collected will allow the development of an ethical framework which may constitute the basis for a more harmonized and consistent socio-ethical and legal approach

  17. Assisted editing of SensorML with EDI. A bottom-up scenario towards the definition of sensor profiles.

    Science.gov (United States)

    Oggioni, Alessandro; Tagliolato, Paolo; Fugazza, Cristiano; Bastianini, Mauro; Pavesi, Fabio; Pepe, Monica; Menegon, Stefano; Basoni, Anna; Carrara, Paola

    2015-04-01

    A by-product of this ongoing work is a growing archive of predefined sensor descriptions. This information is being collected in order to further ease metadata creation in the next phase of the project. Users will be able to choose among a number of sensor and sensor platform prototypes: these will be specific instances on which it will be possible to define, in a bottom-up approach, "sensor profiles". We report on the outcome of this activity.

  18. Sources of Error in Remote Sensing-Based Bottom-Up Emission Estimates of Carbon and Air Quality Emissions from Crop Residue Burning in the Contiguous United States and the Russian Federation

    Science.gov (United States)

    McCarty, J. L.; Romanenkov, V.

    2010-12-01

    Since its publication in 1980, the Seiler and Crutzen bottom-up method of estimating biomass burning emissions has become an accepted and standard approach cited in nearly 500 peer-reviewed scientific publications. As the science of biomass burning emissions advances, the need to quantify error in variable inputs has also grown. This research focuses on bottom-up emission estimates of black carbon (BC), CO2, CO, CH4, PM10, PM2.5, NO2, and SO2 from crop residue burning in the contiguous U.S. (CONUS) and the Russian Federation. Crop residue burning emissions for the CONUS were estimated for a five-year period, 2003 through 2007, using multispectral remote sensing-derived products, specifically multi-year crop type maps, an 8-day difference Normalized Burn Ratio product, and calibrated area estimates of cropland burning from 1 km MODIS Active Fire Points. An emission factor database was assembled from eleven published sources while fuel loads and combustion completeness were derived from expert knowledge and governmental reports. With the aim of transferring technique and knowledge to in-country collaborators, crop residue burning emissions in Russia were calculated from burned area estimates derived from the 1 km MODIS Active Fire Points. A second analysis of burned area estimates from both a regionally tuned 8-day difference Normalized Burn Ratio product and the standard MODIS Burned Area product focused on the European region of Russia. For these analyses, BC emission factors were estimated by multiplying published BC to PM2.5 ratios to PM2.5 emission factors for similar crops in the CONUS. Errors and uncertainties were quantified for emission factors, fuel loads, combustion completeness, and accuracies of remote sensing products for both burned area and land cover type for the analyses in the CONUS and Russia. The uncertainty for the non-remote sensing variables was difficult to quantify given the lack of observations available. The results from this uncertainty
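
    The Seiler and Crutzen bottom-up formulation referred to above estimates the emission of a species as burned area x fuel load x combustion completeness x emission factor. The sketch below applies that relation to two invented crop classes; none of the numbers are inputs of the inventory described in this record.

        # Seiler & Crutzen style bottom-up emission estimate:
        #   E = burned area [ha] x fuel load [t/ha] x combustion completeness x EF [g/kg]
        # Crop classes and all values are illustrative, not data from the study.
        crops = {
            "winter wheat residue": (50_000, 3.0, 0.85, 5.0),
            "rice straw":           (20_000, 4.5, 0.80, 8.0),
        }

        def pm25_emissions_Mg(crops):
            total_g = 0.0
            for area_ha, fuel_t_per_ha, completeness, ef_g_per_kg in crops.values():
                burned_dry_matter_kg = area_ha * fuel_t_per_ha * 1000.0 * completeness
                total_g += burned_dry_matter_kg * ef_g_per_kg
            return total_g / 1.0e6     # g -> Mg

        print(f"PM2.5 from crop residue burning: {pm25_emissions_Mg(crops):.0f} Mg")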

  19. Top-Down and Bottom-Up Approaches in Engineering 1 T Phase Molybdenum Disulfide (MoS2 ): Towards Highly Catalytically Active Materials.

    Science.gov (United States)

    Chua, Chun Kiang; Loo, Adeline Huiling; Pumera, Martin

    2016-09-26

    The metallic 1 T phase of MoS2 has been widely identified to be responsible for the improved performances of MoS2 in applications including hydrogen evolution reactions and electrochemical supercapacitors. To this aim, various synthetic methods have been reported to obtain 1 T phase-rich MoS2 . Here, the aim is to evaluate the efficiencies of the bottom-up (hydrothermal reaction) and top-down (chemical exfoliation) approaches in producing 1 T phase MoS2 . It is established in this study that the 1 T phase MoS2 produced through the bottom-up approach contains a high proportion of 1 T phase and demonstrates excellent electrochemical and electrical properties. Its performance in the hydrogen evolution reaction and electrochemical supercapacitors also surpassed that of 1 T phase MoS2 produced through a top-down approach.

  20. Modeling Technical Change in Energy System Analysis: Analyzing the Introduction of Learning-by-Doing in Bottom-up Energy Models

    Energy Technology Data Exchange (ETDEWEB)

    Berglund, Christer; Soederholm, Patrik [Luleaa Univ. of Technology (Sweden). Div. of Economics

    2005-02-01

    The main objective of this paper is to provide an overview and a critical analysis of the recent literature on incorporating induced technical change in energy systems models. Special emphasis is put on surveying recent studies aiming at integrating learning-by-doing into bottom-up energy systems models through so-called learning curves, and on analyzing the relevance of learning curve analysis for understanding the process of innovation and technology diffusion in the energy sector. The survey indicates that this model work represents a major advance in energy research, and embeds important policy implications, not the least concerning the cost and the timing of environmental policies (including carbon emission constraints). However, bottom-up energy models with endogenous learning are also limited in their characterization of technology diffusion and innovation. While they provide a detailed account of technical options - which is absent in many top-down models - they also lack important aspects of diffusion behavior that are captured in top-down representations. For instance, they fail in capturing strategic technology diffusion behavior in the energy sector, and they neglect important general equilibrium impacts (such as the opportunity cost of redirecting R and D support to the energy sector). For these reasons bottom-up and top-down models with induced technical change should not be viewed as substitutes but rather as complements.
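
    The learning-by-doing relation surveyed above is usually written as a one-factor learning curve, C(x) = C0 (x/x0)^(-b), where the progress ratio PR = 2^(-b) is the cost multiplier per doubling of cumulative capacity. The sketch below evaluates it for an assumed 85% progress ratio; the numbers are illustrative, not parameters from the surveyed models.

        # One-factor learning curve as used to endogenise technical change in
        # bottom-up energy system models; progress ratio and costs are assumptions.
        from math import log2

        def learning_curve_cost(cum_capacity, c0, x0, progress_ratio):
            """Unit cost at cumulative capacity cum_capacity, given cost c0 at x0."""
            b = -log2(progress_ratio)            # PR = 2 ** (-b)
            return c0 * (cum_capacity / x0) ** (-b)

        c0, x0, pr = 1000.0, 1.0, 0.85           # cost units per kW at 1 GW cumulative, 85% PR
        for x in (1, 2, 4, 8, 16):               # each step doubles cumulative capacity
            print(f"{x:2d} GW -> unit cost {learning_curve_cost(x, c0, x0, pr):6.1f}")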

  1. Factors associated with asthma control.

    NARCIS (Netherlands)

    Vries, M.P. de; Bemt, E.A.J.M. van den; Lince, S.; Muris, J.W.M.; Thoonen, B.P.A.; Schayck, C.P. van

    2005-01-01

    The aim of this study was to evaluate which factors are associated with asthma control experienced by asthma patients. In a cross-sectional study patients aged 16-60 years with mild to moderate asthma were selected. The influence of the following factors on asthma control was studied in a multivaria

  2. Closing the gap? Top-down versus bottom-up projections of China's regional energy use and CO2 emissions

    DEFF Research Database (Denmark)

    Dai, Hancheng; Mischke, Peggy; Xie, Xuxuan;

    2016-01-01

    As the world's largest CO2 emitter, China is a prominent case study for scenario analysis. This study uses two newly developed global top-down and bottom-up models with a regional China focus to compare China's future energy and CO2 emission pathways toward 2050. By harmonizing the economic...... and demographic trends as well as a carbon tax pathway, we explore how both models respond to these identical exogenous inputs. Then a soft-linking methodology is applied to "narrow the gap" between the results computed by these models. We find for example that without soft-linking, China's baseline CO2 emissions...
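
    The soft-linking mentioned above is, in essence, an iterative exchange of selected results between the two models until a shared indicator agrees within a tolerance. The sketch below illustrates that loop with two toy functions standing in for the top-down and bottom-up models; all names, responses and numbers are hypothetical, not the study's actual models.

```python
# Generic soft-linking loop between a top-down (TD) and a bottom-up (BU) model.
# Both toy "models" below are placeholders for illustration only.

def run_td(inputs):
    # toy top-down response: CO2 falls with the carbon tax, nudged by BU feedback
    return 12.0 - 0.02 * inputs["carbon_tax"] + inputs.get("bu_feedback", 0.0)

def run_bu(inputs):
    # toy bottom-up response: richer technology detail gives a stronger tax response
    return 11.0 - 0.03 * inputs["carbon_tax"]

inputs = {"carbon_tax": 50.0}
for it in range(10):
    td_co2, bu_co2 = run_td(inputs), run_bu(inputs)
    gap = abs(td_co2 - bu_co2) / td_co2
    print(f"iteration {it}: TD={td_co2:.2f}, BU={bu_co2:.2f}, gap={gap:.1%}")
    if gap < 0.02:               # stop once the shared indicator agrees within 2 %
        break
    # feed part of the bottom-up result back into the top-down model
    inputs["bu_feedback"] = inputs.get("bu_feedback", 0.0) + 0.5 * (bu_co2 - td_co2)
```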

  3. Aplicación y comparación de la metodología de diseño Top Down y Bottom Up

    OpenAIRE

    Restrepo Muñóz, Verónica Pauline

    2009-01-01

    This project studies and compares the Bottom Up and Top Down methodologies used in product development within a manufacturing department in a collaborative environment -- A product was developed using both methodologies, and its impact on the behaviour of management indicators that measure an organization's performance was then analysed -- Also highlighted are the benefits of Top Down in manufactu...

  4. Comparing translational population-PBPK modelling of brain microdialysis with bottom-up prediction of brain-to-plasma distribution in rat and human.

    Science.gov (United States)

    Ball, Kathryn; Bouzom, François; Scherrmann, Jean-Michel; Walther, Bernard; Declèves, Xavier

    2014-11-01

    The prediction of brain extracellular fluid (ECF) concentrations in human is a potentially valuable asset during drug development as it can provide the pharmacokinetic input for pharmacokinetic-pharmacodynamic models. This study aimed to compare two translational modelling approaches that can be applied at the preclinical stage of development in order to simulate human brain ECF concentrations. A population-PBPK model of the central nervous system was developed based on brain microdialysis data, and the model parameters were translated to their corresponding human values to simulate ECF and brain tissue concentration profiles. In parallel, the PBPK modelling software Simcyp was used to simulate human brain tissue concentrations, via the bottom-up prediction of brain tissue distribution using two different sets of mechanistic tissue composition-based equations. The population-PBPK and bottom-up approaches gave similar predictions of total brain concentrations in both rat and human, while only the population-PBPK model was capable of accurately simulating the rat ECF concentrations. The choice of PBPK model must therefore depend on the purpose of the modelling exercise, the in vitro and in vivo data available and knowledge of the mechanisms governing the membrane permeability and distribution of the drug.

  5. A Bottom-Up Building Stock Model for Tracking Regional Energy Targets—A Case Study of Kočevje

    Directory of Open Access Journals (Sweden)

    Marjana Šijanec Zavrl

    2016-10-01

    Full Text Available The paper addresses the development of a bottom-up building stock energy model (BuilS) for identification of the building stock renovation potential by considering the energy performance of individual buildings through cross-linked data from various publicly available databases. The model enables integration of various EE and RES measures on the building stock to demonstrate long-term economic and environmental effects of different building stock refurbishment strategies. In the presented case study, the BuilS model was applied in the Kočevje city area and validated using the measured energy consumption of the buildings connected to the city district heating system. Three strategies for improving the building stock in Kočevje towards a more sustainable one are presented with their impact on energy use and CO2 emission projections up to 2030. It is demonstrated that the BuilS bottom-up model enables the setting of a correct baseline regarding energy use of the existing building stock and that such a model is a powerful tool for design and validation of building stock renovation strategies. It is also shown that the accuracy of the model depends on available information on local resources and local needs; therefore, acceleration of building stock monitoring at the level of each building and continual upgrading of databases with building renovation information is of the utmost importance.
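
    A bottom-up building stock model of this kind essentially estimates energy use building by building from cross-linked registry data and then aggregates to the stock level, so that renovation measures can be applied per building. The sketch below illustrates only that aggregation step, with hypothetical fields and a flat renovation saving factor; it is not the BuilS implementation.

```python
# Minimal bottom-up stock aggregation: per-building demand summed to the stock level.
# Field names and factors are illustrative placeholders.

buildings = [
    # heated floor area [m2], pre-renovation specific heat demand [kWh/m2a], renovated flag
    {"area_m2": 420.0,  "kwh_per_m2": 180.0, "renovated": False},
    {"area_m2": 1250.0, "kwh_per_m2": 95.0,  "renovated": True},
]

def stock_heat_demand(stock, renovation_saving=0.45):
    """Total annual heat demand [MWh]; renovated buildings get a flat saving factor."""
    total_kwh = 0.0
    for b in stock:
        demand = b["area_m2"] * b["kwh_per_m2"]
        if b["renovated"]:
            demand *= (1.0 - renovation_saving)
        total_kwh += demand
    return total_kwh / 1000.0

print(f"Stock heat demand: {stock_heat_demand(buildings):.1f} MWh/a")
```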

  6. Diagnostic study, design and implementation of an integrated model of care in France: a bottom-up process with continuous leadership

    Directory of Open Access Journals (Sweden)

    Matthieu de Stampa

    2010-02-01

    Full Text Available Background: Sustaining integrated care is difficult, in large part because of problems encountered securing the participation of health care and social service professionals and, in particular, general practitioners (GPs). Purpose: To present an innovative bottom-up and pragmatic strategy used to implement a new integrated care model in France for community-dwelling elderly people with complex needs. Results: In the first step, a diagnostic study was conducted with face-to-face interviews to gather data on current practices from a sample of health and social stakeholders working with elderly people. In the second step, an integrated care model called Coordination Personnes Agées (COPA) was designed by the same major stakeholders in order to define its detailed characteristics based on the local context. In the third step, the model was implemented in two phases: adoption and maintenance. This strategy was carried out by a continuous and flexible leadership throughout the process, initially with a mixed leadership (clinician and researcher), followed by a double one (clinician and managers of services) in the implementation phase. Conclusion: The implementation of this bottom-up and pragmatic strategy relied on establishing a collaborative dynamic among health and social stakeholders. This enhanced their involvement throughout the implementation phase, particularly among the GPs, and allowed them to support the changes in practices and service arrangements.

  8. A layered abduction model of perception: Integrating bottom-up and top-down processing in a multi-sense agent

    Science.gov (United States)

    Josephson, John R.

    1989-01-01

    A layered-abduction model of perception is presented which unifies bottom-up and top-down processing in a single logical and information-processing framework. The process of interpreting the input from each sense is broken down into discrete layers of interpretation, where at each layer a best explanation hypothesis is formed of the data presented by the layer or layers below, with the help of information available laterally and from above. The formation of this hypothesis is treated as a problem of abductive inference, similar to diagnosis and theory formation. Thus this model brings a knowledge-based problem-solving approach to the analysis of perception, treating perception as a kind of compiled cognition. The bottom-up passing of information from layer to layer defines channels of information flow, which separate and converge in a specific way for any specific sense modality. Multi-modal perception occurs where channels converge from more than one sense. This model has not yet been implemented, though it is based on systems which have been successful in medical and mechanical diagnosis and medical test interpretation.

  9. Onshore wind energy in Baden-Württemberg: a bottom-up economic assessment of the socio-technical potential

    OpenAIRE

    2015-01-01

    Detailed information about the potentials and costs of renewable energies is an important input factor for energy system models, as well as commercial and political decision-making processes. With its increasing locally installed capacity and hub height, wind energy plays an important role when it comes to meeting climate targets and optimizing electricity networks. Recently however, wind energy has faced more and more social barriers and land use constraints which can negatively impact both ...

  10. BOOK REVIEW: Tribology on the Small Scale: A Bottom Up Approach to Friction, Lubrication, and Wear Tribology on the Small Scale: A Bottom Up Approach to Friction, Lubrication, and Wear

    Science.gov (United States)

    Hainsworth, S.

    2008-11-01

    Friction, lubrication and wear interactions between materials make considerable differences to how efficient our engines are, whether or not we ski downhill faster than others, or whether the shoes that we are wearing give us sufficient grip to successfully navigate the marble floors of buildings. Traditionally, tribologists have focussed on the macroscopic issues of tribological problems, looking at the design of components, the viscosity of oils and the mechanical properties of surfaces to understand how components interact to give the desired friction and wear properties. However, in the last twenty years there has been an increasing realization that the processes controlling these macroscopic interactions are determined by what happens on the atomic and microscopic scale. Further, with the advent of nano- and micro-electromechanical systems (NEMS and MEMS), macroscopic scale tribological interactions do not influence the tribology of these devices in the same way, and capillary forces and van der Waals forces play an increased role in determining whether these devices function successfully. This book aims to fill a gap in the area of tribology textbooks by addressing the important advances that have been made in our understanding of the science of nano- and micro-scale tribological interactions. The book is aimed at advanced undergraduate and graduate level students on engineering programmes, academics and scientists interested in atomic and microscopic scale tribological interactions, and engineers and scientists who are not tribologists per se but work in technologies (such as NEMS/MEMS) where tribology is of importance. Whilst the target audience appears to be largely engineers, the book should have wider appeal to physicists, chemists and modellers with interests in tribological interactions. The book consists of twelve chapters with an introduction to the general significance of tribology and a brief history of modern tribology, followed by more

  11. Bottoms up: How subnational elections predict parties’ decisions to run in presidential elections in Europe and Latin America

    Directory of Open Access Journals (Sweden)

    Jae-Jae Spoon

    2015-08-01

    Full Text Available Do parties’ experiences in subnational elections predict when parties enter national competition and compete for the presidency? Building upon the party nationalization literature, we argue that a party’s presence in elections across subnational units and its subsequent performance in these elections are determining factors for whether it enters the presidential race. To conduct our analysis, we have assembled an original dataset on parties’ presence and performance in subnational elections and presidential entry in 17 countries in Europe and Latin America from 1990 to 2013. We find that a party’s presence and performance in subnational elections are significant predictors of its decision to run for president, even when the party ran in the previous election and when the elections are concurrent. These findings have important implications for understanding how subnational elections relate to national party systems and democratic representation, more generally.

  12. Preventing violence and reinforcing human security: a rights-based framework for top-down and bottom-up action.

    Science.gov (United States)

    Kjaerulf, Finn; Barahona, Rodrigo

    2010-05-01

    This article explores the violence reduction potential in the intersection between health, criminal justice, and development. It emphasizes public health, rule of law, and equality-driven socioeconomic development as principal concerns in preventing violence. In parts of Latin America, violence has become a serious public health and security problem. Prior studies have explored the risk factors associated with violence as well as experiences in its prevention. These studies and existing approaches to violence prevention provide evidence on where to direct attention and build prevention efforts. This article argues for integrated community-driven and national interventions to create cooperative national-local linkages and embed international human rights law at the national and local levels. Nations struggling with violence should be encouraged to apply an integrated framework to prevent violence and reinforce human security.

  13. Computational Nano-materials Design for Spinodal Nanotechnology as a New Class of Bottom-up Nanotechnology

    Science.gov (United States)

    Katayama-Yoshida, Hiroshi; Fukushima, Tetsuya; Sato, Kazunori

    Based on the spinodal nano-decomposition (SND) of dilute magnetic semiconductors (DMS), we generalized SND to applications in catalysis and photovoltaic solar cells, where nano-scale particle formation in catalysis and nano-scale separation of electrons and holes are essential in order to enhance the efficiency. First, we summarize the shape control (Konbu- & Dairiseki-Phases) and the dimensionality dependence of the crystal growth conditions on SND in DMS. Second, we discuss the application of SND for the formation of nano-particles and the self-regeneration in three-way catalysis for automotive emission control by Perovskite La(Fe,Pd or Rh)O3. Third, we propose (i) a self-regeneration mechanism and (ii) self-organized nano-structures by SND in chalcopyrite Cu(In,Ga)Se2, Kesterite Cu2ZnSnSe4, and Perovskite CsSnI3 for low-cost, environment-friendly and high-efficiency photovoltaic solar cells using first-principles calculations.

  14. Modeling eye movements in visual agnosia with a saliency map approach: bottom-up guidance or top-down strategy?

    Science.gov (United States)

    Foulsham, Tom; Barton, Jason J S; Kingstone, Alan; Dewhurst, Richard; Underwood, Geoffrey

    2011-08-01

    Two recent papers (Foulsham, Barton, Kingstone, Dewhurst, & Underwood, 2009; Mannan, Kennard, & Husain, 2009) report that neuropsychological patients with a profound object recognition problem (visual agnosic subjects) show differences from healthy observers in the way their eye movements are controlled when looking at images. The interpretation of these papers is that eye movements can be modeled as the selection of points on a saliency map, and that agnosic subjects show an increased reliance on visual saliency, i.e., brightness and contrast in low-level stimulus features. Here we review this approach and present new data from our own experiments with an agnosic patient that quantifies the relationship between saliency and fixation location. In addition, we consider whether the perceptual difficulties of individual patients might be modeled by selectively weighting the different features involved in a saliency map. Our data indicate that saliency is not always a good predictor of fixation in agnosia: even for our agnosic subject, as for normal observers, the saliency-fixation relationship varied as a function of the task. This means that top-down processes still have a significant effect on the earliest stages of scanning in the setting of visual agnosia, indicating severe limitations for the saliency map model. Top-down, active strategies, which are the hallmark of our human visual system, play a vital role in eye movement control, whether we know what we are looking at or not.
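
    The core idea being tested above is that a saliency map is a weighted combination of normalized low-level feature maps, and that the weights can be adjusted (e.g., to increase reliance on brightness and contrast) before comparing the map against observed fixations. The toy sketch below illustrates that construction with random data; it is not the patients' data or any specific published implementation.

```python
# Toy saliency map: weighted sum of normalized feature maps, compared with fixations.
import numpy as np

rng = np.random.default_rng(0)
h, w = 60, 80
feature_maps = {
    "intensity": rng.random((h, w)),
    "colour": rng.random((h, w)),
    "orientation": rng.random((h, w)),
}

def saliency(maps, weights):
    """Weighted sum of min-max-normalized feature maps."""
    out = np.zeros((h, w))
    for name, m in maps.items():
        m_norm = (m - m.min()) / (m.max() - m.min() + 1e-12)
        out += weights.get(name, 1.0) * m_norm
    return out / sum(weights.values())

# Hypothetical re-weighting to mimic increased reliance on brightness/contrast:
s_normal  = saliency(feature_maps, {"intensity": 1.0, "colour": 1.0, "orientation": 1.0})
s_agnosic = saliency(feature_maps, {"intensity": 2.5, "colour": 1.0, "orientation": 0.5})

fixations = rng.integers(0, [h, w], size=(20, 2))     # toy fixation coordinates (row, col)
mean_sal_at_fix = s_normal[fixations[:, 0], fixations[:, 1]].mean()
print(f"Mean saliency at fixated points: {mean_sal_at_fix:.3f} vs map mean {s_normal.mean():.3f}")
```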

  15. A bottom-up art event gave birth to a process of community empowerment in an Italian village.

    Science.gov (United States)

    Sardu, Claudia; Mereu, Alessandra; Sotgiu, Alessandra; Contu, Paolo

    2012-03-01

    Although community participation is a component of community empowerment, it often remains a theoretical exhortation. Reporting experiences which enable people to take control of their lives can be useful to suggest practical elements for promoting empowerment. This article describes the experience of a Sardinian village (Ulassai) that developed into a community empowerment process. Laverack's operational domains were used to measure the community empowerment process. The process started in 1979 'almost by chance' with an art performance that was the entry point for community participation. This experience has been the foundation for the community empowerment. Citizens acquired the 'ability of thinking and planning as a community and not mere individuals'. In the following 30 years citizens gave birth to several outcomes rooted in that event. The intermediate outcomes highlight the 'ability of action by a group to mobilize existing resources, and act collectively against opposing forces'. The long-term outcomes demonstrate the 'ability to integrate the cultural experiences that strengthened the community's identification into a sustainable community asset', and the 'ability to cope with global environmental challenges and to collaborate on an equal basis with other stakeholders'. The pathways to community empowerment shown by the community of Ulassai overlap with the 'operational domains'. The Ulassai experience shows that the empowerment process can start from an event apparently unrelated to health promotion. This community experience illustrates the positive role arts can play in community development. Hence, the call for health promoters to look carefully into those situations that occur naturally in communities.

  16. An extended model for the evolution of prebiotic homochirality: a bottom-up approach to the origin of life.

    Science.gov (United States)

    Gleiser, Marcelo; Walker, Sara Imari

    2008-08-01

    A generalized autocatalytic model for chiral polymerization is investigated in detail. Apart from enantiomeric cross-inhibition, the model allows for the autogenic (non-catalytic) formation of left and right-handed monomers from a substrate with reaction rates ε_L and ε_R, respectively. The spatiotemporal evolution of the net chiral asymmetry is studied for models with several values of the maximum polymer length, N. For N = 2, we study the validity of the adiabatic approximation often cited in the literature. We show that the approximation obtains the correct equilibrium values of the net chirality, but fails to reproduce the short time behavior. We show also that the autogenic term in the full N = 2 model behaves as a control parameter in a chiral symmetry-breaking phase transition leading to full homochirality from racemic initial conditions. We study the dynamics of the N → ∞ model with symmetric (ε_L = ε_R) autogenic formation, showing that it only achieves homochirality for ε > ε_c, where ε_c is an N-dependent critical value. For ε

  17. An Extended Model for the Evolution of Prebiotic Homochirality: A Bottom-Up Approach to the Origin of Life

    CERN Document Server

    Gleiser, Marcelo

    2008-01-01

    A generalized autocatalytic model for chiral polymerization is investigated in detail. Apart from enantiomeric cross-inhibition, the model allows for the autogenic (non-catalytic) formation of left and right-handed monomers from a substrate with reaction rates $\\epsilon_L$ and $\\epsilon_R$, respectively. The spatiotemporal evolution of the net chiral asymmetry is studied for models with several values of the maximum polymer length, N. For N=2, we study the validity of the adiabatic approximation often cited in the literature. We show that the approximation obtains the correct equilibrium values of the net chirality, but fails to reproduce the short time behavior. We show also that the autogenic term in the full N=2 model behaves as a control parameter in a chiral symmetry- breaking phase transition leading to full homochirality from racemic initial conditions. We study the dynamics of the N -> infinity model with symmetric ($\\epsilon_L = \\epsilon_R$) autogenic formation, showing that it only achieves homochir...

  18. Tunneling in low-power device-design: A bottom-up view of issues, challenges, and opportunities

    Science.gov (United States)

    Ganapathi, Kartik

    Simulation of electronic transport in nanoscale devices plays a pivotal role in shedding light on underlying physics, and in guiding device-design and optimization. The length scale of the problem and the physical mechanism of device operation guide the choice of formalism. In the sub-20 nanometer regime, semi-classical approaches start breaking down, thus necessitating a quantum-mechanical treatment of the electronic transport problem. Non-equilibrium Green's function (NEGF) is a theoretical framework for investigating quantum-mechanical systems---interacting with surroundings through exchange of quasiparticles---far from equilibrium. Although hugely computation-intensive with a realistic device-representation, it provides a rigorous way to include particle-particle interactions and to model phenomena that are inherently quantum-mechanical. We build the Berkeley Quantum Transport Simulator (BQTS)---a massively parallel, generic, NEGF-based numerical simulator---to explore low-power device-design opportunities. Demonstrating scalability and benchmarking results with experimental tunnel diode data, we set out to understand tunneling in devices and to leverage it for both digital and analog applications. Investigating InAs short-channel band-to-band tunneling transistors (TFETs), we show that direct source-to-drain tunneling sets the leakage-floor in such devices, thereby limiting the minimum subthreshold swing (SS) in spite of excellent electrostatics. A heterojunction TFET with a halo doping in the source-channel overlap region is proposed and is shown to achieve steep SS as well as large ON current. We discover that by band-offset engineering, the steepness therein could be controlled primarily by the modulation of heterojunction-barrier. Subsequently, exploring layered materials for analog applications, we demonstrate that doping the drain underlap region in graphene FETs prolongs the onset of tunneling in their output characteristics, and hence significantly

  19. Deciphering the components of regional net ecosystem fluxes following a bottom-up approach for the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    N. Carvalhais

    2010-11-01

    Full Text Available Quantification of ecosystem carbon pools is a fundamental requirement for estimating carbon fluxes and for addressing the dynamics and responses of the terrestrial carbon cycle to environmental drivers. The initial estimates of carbon pools in terrestrial carbon cycle models often rely on the ecosystem steady state assumption, leading to initial equilibrium conditions. In this study, we investigate how trends and inter-annual variability of net ecosystem fluxes are affected by initial non-steady state conditions. Further, we examine how modeled ecosystem responses induced exclusively by the model drivers can be separated from the initial conditions. For this, the Carnegie-Ames-Stanford Approach (CASA) model is optimized at a set of European eddy covariance sites, which support the parameterization of regional simulations of ecosystem fluxes for the Iberian Peninsula, between 1982 and 2006.

    The presented analysis stands on a credible model performance for a set of sites that represent generally well the plant functional types and selected descriptors of climate and phenology present in the Iberian region – except for a limited northwestern area. The effects of initial conditions on inter-annual variability and on trends result mostly from the recovery of pools to equilibrium conditions, which controls most of the inter-annual variability (IAV) and both the magnitude and sign of most of the trends. However, by removing the time series of pure model recovery from the time series of the overall fluxes, we are able to retrieve estimates of inter-annual variability and trends in net ecosystem fluxes that are quasi-independent from the initial conditions. This approach reduced the sensitivity of the net fluxes to initial conditions from 47% and 174% to −3% and 7%, for strong initial sink and source conditions, respectively.
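
    The decomposition described above amounts to running the model twice, once with the full drivers and once tracking only the drift of the pools toward equilibrium, and differencing the two flux time series before computing trends and inter-annual variability. The sketch below illustrates that bookkeeping on synthetic annual NEE series; it is not the CASA output.

```python
# Subtract the pure recovery-to-equilibrium flux from the full simulation, then
# compute trends on the residual, which is largely independent of initial pools.
import numpy as np

years = np.arange(1982, 2007)
rng = np.random.default_rng(1)

# Hypothetical annual NEE time series [gC m-2 yr-1]
nee_full = -50.0 + 40.0 * np.exp(-(years - 1982) / 8.0) + rng.normal(0, 10, years.size)
nee_recovery_only = 40.0 * np.exp(-(years - 1982) / 8.0)   # drift of pools toward equilibrium

nee_driver_induced = nee_full - nee_recovery_only

def linear_trend(t, y):
    """Slope of an ordinary least-squares linear fit."""
    return np.polyfit(t, y, 1)[0]

print("trend incl. recovery   :", round(linear_trend(years, nee_full), 2), "gC m-2 yr-2")
print("trend, recovery removed:", round(linear_trend(years, nee_driver_induced), 2), "gC m-2 yr-2")
```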

    With the aim to identify and improve understanding of the component fluxes that drive the observed trends, the

  20. Deciphering the components of regional net ecosystem fluxes following a bottom-up approach for the Iberian Peninsula

    Directory of Open Access Journals (Sweden)

    N. Carvalhais

    2010-06-01

    Full Text Available Quantification of ecosystem carbon pools is a fundamental requirement for estimating carbon fluxes and for addressing the dynamics and responses of the terrestrial carbon cycle to environmental drivers. The initial estimates of carbon pools in terrestrial carbon cycle models often rely on the ecosystem steady state assumption, leading to initial equilibrium conditions. In this study, we investigate how trends and inter-annual variability of net ecosystem fluxes are affected by initial non-steady state conditions. Further, we examine how modeled ecosystem responses induced exclusively by the model drivers can be separated from the initial conditions. For this, the Carnegie-Ames-Stanford Approach (CASA) model is optimized at a set of European eddy covariance sites, which support the parameterization of regional simulations of ecosystem fluxes for the Iberian Peninsula, between 1982 and 2006.

    The presented analysis stands on a credible model performance for a set of sites that well represent the plant functional types and selected descriptors of climate and phenology present in the Iberian region – except for a limited northwestern area. The effects of initial conditions on inter-annual variability and on trends result mostly from the recovery of pools to equilibrium conditions, which controls most of the inter-annual variability (IAV) and both the magnitude and sign of most of the trends. However, by removing the time series of pure model recovery from the time series of the overall fluxes, we are able to retrieve estimates of inter-annual variability and trends in net ecosystem fluxes that are quasi-independent from the initial conditions. This approach reduced the sensitivity of the net fluxes to initial conditions from 47% and 174% to −3% and 7%, for strong initial sink and source conditions, respectively.

    With the aim to identify and improve understanding of the component fluxes that drive the observed trends, the net

  1. 3-Substituted-7-(diethylamino)coumarins as molecular scaffolds for the bottom-up self-assembly of solids with extensive π-stacking

    Science.gov (United States)

    Arcos-Ramos, Rafael; Maldonado-Domínguez, Mauricio; Ordóñez-Hernández, Javier; Romero-Ávila, Margarita; Farfán, Norberto; Carreón-Castro, María del Pilar

    2017-02-01

    In this study, a set of molecular crystals derived from 3-substituted-7-(diethylamino)-2H-chromen-2-ones 1-8 were studied to sample the aggregation of coumarins into ordered solids. Crystals of parent compound 1a and its brominated derivative 2 were obtained and solved in the P-1 and C2/c space groups, respectively. All the crystalline coumarins studied display extensive π-stacking in the solid state. Theoretical valence-conduction band gaps for derivatives 3b and 5 are close to that of crystalline rubrene, highlighting the importance of cooperativity and periodicity of π-stacking in organic semiconductors; given their synthetic accessibility, electronic tunability and self-assembly via stacking, dipolar and H-bonding interactions, these systems arise as candidates for the bottom-up construction of organic crystals with extensive π-stacking and high polarizability.

  2. Integration scheme of nanoscale resistive switching memory using bottom-up processes at room temperature for high-density memory applications

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-07-01

    A facile and versatile scheme is demonstrated to fabricate nanoscale resistive switching memory devices that exhibit reliable bipolar switching behavior. A solution process is used to synthesize the copper oxide layer into 250-nm via-holes that had been patterned in Si wafers. Direct bottom-up filling of copper oxide can facilitate fabrication of nanoscale memory devices without using vacuum deposition and etching processes. In addition, all materials and processes are CMOS compatible, and especially, the devices can be fabricated at room temperature. Nanoscale memory devices synthesized on wafers having 250-nm via-holes showed reproducible resistive switching programmable memory characteristics with reasonable endurance and data retention properties. This integration strategy provides a solution to overcome the scaling limit of current memory device fabrication methods.

  3. Integration scheme of nanoscale resistive switching memory using bottom-up processes at room temperature for high-density memory applications

    Science.gov (United States)

    Han, Un-Bin; Lee, Jang-Sik

    2016-01-01

    A facile and versatile scheme is demonstrated to fabricate nanoscale resistive switching memory devices that exhibit reliable bipolar switching behavior. A solution process is used to synthesize the copper oxide layer into 250-nm via-holes that had been patterned in Si wafers. Direct bottom-up filling of copper oxide can facilitate fabrication of nanoscale memory devices without using vacuum deposition and etching processes. In addition, all materials and processes are CMOS compatible, and especially, the devices can be fabricated at room temperature. Nanoscale memory devices synthesized on wafers having 250-nm via-holes showed reproducible resistive switching programmable memory characteristics with reasonable endurance and data retention properties. This integration strategy provides a solution to overcome the scaling limit of current memory device fabrication methods. PMID:27364856

  4. Visual scanning and recognition of Chinese, Caucasian, and racially ambiguous faces: contributions from bottom-up facial physiognomic information and top-down knowledge of racial categories.

    Science.gov (United States)

    Wang, Qiandong; Xiao, Naiqi G; Quinn, Paul C; Hu, Chao S; Qian, Miao; Fu, Genyue; Lee, Kang

    2015-02-01

    Recent studies have shown that participants use different eye movement strategies when scanning own- and other-race faces. However, it is unclear (1) whether this effect is related to face recognition performance, and (2) to what extent this effect is influenced by top-down or bottom-up facial information. In the present study, Chinese participants performed a face recognition task with Chinese, Caucasian, and racially ambiguous faces. For the racially ambiguous faces, we led participants to believe that they were viewing either own-race Chinese faces or other-race Caucasian faces. Results showed that (1) Chinese participants scanned the nose of the true Chinese faces more than that of the true Caucasian faces, whereas they scanned the eyes of the Caucasian faces more than those of the Chinese faces; (2) they scanned the eyes, nose, and mouth equally for the ambiguous faces in the Chinese condition compared with those in the Caucasian condition; (3) when recognizing the true Chinese target faces, but not the true target Caucasian faces, the greater the fixation proportion on the nose, the faster the participants correctly recognized these faces. The same was true when racially ambiguous face stimuli were thought to be Chinese faces. These results provide the first evidence to show that (1) visual scanning patterns of faces are related to own-race face recognition response time, and (2) it is bottom-up facial physiognomic information that mainly contributes to face scanning. However, top-down knowledge of racial categories can influence the relationship between face scanning patterns and recognition response time.

  5. Top-Down and Bottom-Up Identification of Proteins by Liquid Extraction Surface Analysis Mass Spectrometry of Healthy and Diseased Human Liver Tissue

    Science.gov (United States)

    Sarsby, Joscelyn; Martin, Nicholas J.; Lalor, Patricia F.; Bunch, Josephine; Cooper, Helen J.

    2014-09-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies.

  6. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Directory of Open Access Journals (Sweden)

    Sebastian McBride

    Full Text Available Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.
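
    Requirements 4 to 6 above describe converging bottom-up saliency and top-down task relevance (an excitation/inhibition ratio per feature class) onto a single priority map, then thresholding it to trigger a saccade. The toy sketch below illustrates only that combination step; feature names, values and the threshold rule are illustrative, not the robot implementation.

```python
# Toy priority map: bottom-up saliency modulated by top-down excitation/inhibition
# ratios, with a simple threshold to decide whether a saccade is emitted.
import numpy as np

rng = np.random.default_rng(2)
saliency = rng.random((40, 40))                       # bottom-up map

# Top-down task relevance per feature class, expressed as excitation / inhibition
task_relevance = {"target_colour": 3.0 / 1.0, "distractor_edge": 1.0 / 2.0}
feature_masks = {k: (rng.random((40, 40)) > 0.8) for k in task_relevance}  # toy feature locations

priority = saliency.copy()
for feat, ratio in task_relevance.items():
    priority[feature_masks[feat]] *= ratio            # boost or suppress those locations

threshold = priority.mean() + 2 * priority.std()      # simple saccade-release criterion
if priority.max() > threshold:
    target = np.unravel_index(np.argmax(priority), priority.shape)
    print("saccade to", target)
else:
    print("no saccade: priority below threshold")
```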

  7. Top-down and bottom-up identification of proteins by liquid extraction surface analysis mass spectrometry of healthy and diseased human liver tissue.

    Science.gov (United States)

    Sarsby, Joscelyn; Martin, Nicholas J; Lalor, Patricia F; Bunch, Josephine; Cooper, Helen J

    2014-11-01

    Liquid extraction surface analysis mass spectrometry (LESA MS) has the potential to become a useful tool in the spatially-resolved profiling of proteins in substrates. Here, the approach has been applied to the analysis of thin tissue sections from human liver. The aim was to determine whether LESA MS was a suitable approach for the detection of protein biomarkers of nonalcoholic liver disease (nonalcoholic steatohepatitis, NASH), with a view to the eventual development of LESA MS for imaging NASH pathology. Two approaches were considered. In the first, endogenous proteins were extracted from liver tissue sections by LESA, subjected to automated trypsin digestion, and the resulting peptide mixture was analyzed by liquid chromatography tandem mass spectrometry (LC-MS/MS) (bottom-up approach). In the second (top-down approach), endogenous proteins were extracted by LESA, and analyzed intact. Selected protein ions were subjected to collision-induced dissociation (CID) and/or electron transfer dissociation (ETD) mass spectrometry. The bottom-up approach resulted in the identification of over 500 proteins; however identification of key protein biomarkers, liver fatty acid binding protein (FABP1), and its variant (Thr→Ala, position 94), was unreliable and irreproducible. Top-down LESA MS analysis of healthy and diseased liver tissue revealed peaks corresponding to multiple (~15-25) proteins. MS/MS of four of these proteins identified them as FABP1, its variant, α-hemoglobin, and 10 kDa heat shock protein. The reliable identification of FABP1 and its variant by top-down LESA MS suggests that the approach may be suitable for imaging NASH pathology in sections from liver biopsies.

  8. Identifying the computational requirements of an integrated top-down-bottom-up model for overt visual attention within an active vision system.

    Science.gov (United States)

    McBride, Sebastian; Huelse, Martin; Lee, Mark

    2013-01-01

    Computational visual attention systems have been constructed in order for robots and other devices to detect and locate regions of interest in their visual world. Such systems often attempt to take account of what is known of the human visual system and employ concepts, such as 'active vision', to gain various perceived advantages. However, despite the potential for gaining insights from such experiments, the computational requirements for visual attention processing are often not clearly presented from a biological perspective. This was the primary objective of this study, attained through two specific phases of investigation: 1) conceptual modeling of a top-down-bottom-up framework through critical analysis of the psychophysical and neurophysiological literature, 2) implementation and validation of the model into robotic hardware (as a representative of an active vision system). Seven computational requirements were identified: 1) transformation of retinotopic to egocentric mappings, 2) spatial memory for the purposes of medium-term inhibition of return, 3) synchronization of 'where' and 'what' information from the two visual streams, 4) convergence of top-down and bottom-up information to a centralized point of information processing, 5) a threshold function to elicit saccade action, 6) a function to represent task relevance as a ratio of excitation and inhibition, and 7) derivation of excitation and inhibition values from object-associated feature classes. The model provides further insight into the nature of data representation and transfer between brain regions associated with the vertebrate 'active' visual attention system. In particular, the model lends strong support to the functional role of the lateral intraparietal region of the brain as a primary area of information consolidation that directs putative action through the use of a 'priority map'.

  9. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Directory of Open Access Journals (Sweden)

    W. J. F. Acton

    2015-10-01

    Full Text Available This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton transfer reaction-mass spectrometer (PTR-MS) and a proton transfer reaction-time of flight-mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (PTR-MS) and eddy covariance (PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean day-time flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28 day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight overestimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the MEGAN isoprene emissions algorithms (Guenther et al., 2006). A detailed tree species distribution map for the site enabled the leaf-level emissions of isoprene and monoterpenes recorded using GC-MS to be scaled up to produce a "bottom-up" canopy-scale flux. This was compared with the "top-down" canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.
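
    The "bottom-up" scaling referred to above amounts to weighting each species' leaf-level emission rate by its foliar biomass in the flux footprint and summing to a canopy-scale flux that can be compared with the eddy-covariance ("top-down") flux. The sketch below shows that weighting with placeholder species values, not the published rates.

```python
# Minimal leaf-to-canopy scaling: species leaf-level rates weighted by foliar biomass.
# Values are illustrative placeholders only.

species = {
    # leaf-level emission rate [ug gDW-1 h-1], foliar biomass density [gDW m-2]
    "Quercus (oak)":       {"leaf_rate": 10.0, "foliar_biomass": 180.0},
    "Carpinus (hornbeam)": {"leaf_rate": 0.5,  "foliar_biomass": 220.0},
}

canopy_flux_ug_m2_h = sum(s["leaf_rate"] * s["foliar_biomass"] for s in species.values())
print(f"Bottom-up canopy flux: {canopy_flux_ug_m2_h / 1000.0:.2f} mg m-2 h-1")
```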

  10. Top-down model estimates, bottom-up inventories, and future projections of global natural and anthropogenic emissions of nitrous oxide

    Science.gov (United States)

    Davidson, E. A.; Kanter, D.

    2013-12-01

    Nitrous oxide (N2O) is the third most abundantly emitted greenhouse gas and the largest remaining emitted ozone-depleting substance. It is a product of nitrifying and denitrifying bacteria in soils, sediments and water bodies. Humans began to disrupt the N cycle in the preindustrial era as they expanded agricultural land, used fire for land clearing and management, and cultivated leguminous crops that carry out biological N fixation. This disruption accelerated after the industrial revolution, especially as the use of synthetic N fertilizers became common after 1950. Here we present findings from a new United Nations Environment Programme report, in which we constrain estimates of the anthropogenic and natural emissions of N2O and consider scenarios for future emissions. Inventory-based estimates of natural emissions from terrestrial, marine and atmospheric sources range from 10 to 12 Tg N2O-N/yr. Similar values can be derived for global N2O emissions that were predominantly natural before the industrial revolution. While there was inter-decadal variability, there was little or no consistent trend in atmospheric N2O concentrations between 1730 and 1850, allowing us to assume near steady state. Assuming an atmospheric lifetime of 120 years, the 'top-down' estimate of pre-industrial emissions of 11 Tg N2O-N/yr is consistent with the bottom-up inventories for natural emissions, although the former includes some modest pre-industrial anthropogenic effects. Despite large inherent uncertainties in both approaches, it is encouraging that the bottom-up (6.0) and top-down (5.3) estimates are within 12% of each other and their uncertainty ranges overlap. N2O is inescapably linked to food production and food security. Future agricultural emissions will be determined by population, dietary habits, and agricultural N use efficiency. Without deliberate and effective mitigation policies, anthropogenic N2O emissions will likely double by 2050 and continue to increase thereafter.
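
    The arithmetic behind the quoted top-down steady-state estimate is simply emissions ≈ atmospheric burden / lifetime. The sketch below reproduces that calculation using a pre-industrial mixing ratio of roughly 270 ppb and a conversion of roughly 4.8 Tg N per ppb of N2O, which are commonly used approximations assumed here rather than taken from the report itself.

```python
# Steady-state "top-down" check: source ~= burden / lifetime.
# Mixing ratio and ppb-to-Tg conversion are approximate assumptions.

preindustrial_ppb = 270.0     # approximate pre-industrial N2O mixing ratio
tg_n_per_ppb = 4.8            # approximate atmospheric burden per ppb, in Tg N
lifetime_years = 120.0        # atmospheric lifetime assumed in the abstract

burden_tg_n = preindustrial_ppb * tg_n_per_ppb
steady_state_emissions = burden_tg_n / lifetime_years
print(f"Top-down pre-industrial source: ~{steady_state_emissions:.0f} Tg N2O-N per yr")
```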

  11. Canopy-scale flux measurements and bottom-up emission estimates of volatile organic compounds from a mixed oak and hornbeam forest in northern Italy

    Science.gov (United States)

    Acton, W. Joe F.; Schallhart, Simon; Langford, Ben; Valach, Amy; Rantala, Pekka; Fares, Silvano; Carriero, Giulia; Tillmann, Ralf; Tomlinson, Sam J.; Dragosits, Ulrike; Gianelle, Damiano; Hewitt, C. Nicholas; Nemitz, Eiko

    2016-06-01

    This paper reports the fluxes and mixing ratios of biogenically emitted volatile organic compounds (BVOCs) 4 m above a mixed oak and hornbeam forest in northern Italy. Fluxes of methanol, acetaldehyde, isoprene, methyl vinyl ketone + methacrolein, methyl ethyl ketone and monoterpenes were obtained using both a proton-transfer-reaction mass spectrometer (PTR-MS) and a proton-transfer-reaction time-of-flight mass spectrometer (PTR-ToF-MS) together with the methods of virtual disjunct eddy covariance (using PTR-MS) and eddy covariance (using PTR-ToF-MS). Isoprene was the dominant emitted compound with a mean daytime flux of 1.9 mg m-2 h-1. Mixing ratios, recorded 4 m above the canopy, were dominated by methanol with a mean value of 6.2 ppbv over the 28-day measurement period. Comparison of isoprene fluxes calculated using the PTR-MS and PTR-ToF-MS showed very good agreement while comparison of the monoterpene fluxes suggested a slight over estimation of the flux by the PTR-MS. A basal isoprene emission rate for the forest of 1.7 mg m-2 h-1 was calculated using the Model of Emissions of Gases and Aerosols from Nature (MEGAN) isoprene emission algorithms (Guenther et al., 2006). A detailed tree-species distribution map for the site enabled the leaf-level emission of isoprene and monoterpenes recorded using gas-chromatography mass spectrometry (GC-MS) to be scaled up to produce a bottom-up canopy-scale flux. This was compared with the top-down canopy-scale flux obtained by measurements. For monoterpenes, the two estimates were closely correlated and this correlation improved when the plant-species composition in the individual flux footprint was taken into account. However, the bottom-up approach significantly underestimated the isoprene flux, compared with the top-down measurements, suggesting that the leaf-level measurements were not representative of actual emission rates.

  12. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands

    Directory of Open Access Journals (Sweden)

    Rita S. W. Yam

    2016-02-01

    Full Text Available Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunctioning of these restored habitats. We conducted non-choice laboratory feeding experiments of P. canaliculata using five common macrophyte species in constructed wetlands including Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates with highest reproductive output, but all individuals fed with Phragmites showed lowest feeding rates and little growth with poorest reproductive output. Plant N and P contents were important for enhancing palatability, supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had critically deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of the poorly-fed snails in constructed wetlands dominated by the less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve selective planting strategy using macrophytes with low nutrient and high toughness, cellulose and phenolic contents.

  13. Importance of Macrophyte Quality in Determining Life-History Traits of the Apple Snails Pomacea canaliculata: Implications for Bottom-Up Management of an Invasive Herbivorous Pest in Constructed Wetlands.

    Science.gov (United States)

    Yam, Rita S W; Fan, Yen-Tzu; Wang, Tzu-Ting

    2016-02-24

    Pomacea canaliculata (Ampullariidae) has extensively invaded most Asian constructed wetlands and its massive herbivory of macrophytes has become a major cause of ecosystem dysfunctioning of these restored habitats. We conducted non-choice laboratory feeding experiments of P. canaliculata using five common macrophyte species in constructed wetlands including Ipomoea aquatica, Commelina communis, Nymphoides coreana, Acorus calamus and Phragmites australis. Effects of macrophytes on snail feeding, growth and fecundity responses were evaluated. Results indicated that P. canaliculata reared on Ipomoea had the highest feeding and growth rates with highest reproductive output, but all individuals fed with Phragmites showed lowest feeding rates and little growth with poorest reproductive output. Plant N and P contents were important for enhancing palatability, supporting growth and offspring quantity of P. canaliculata, whilst toughness, cellulose and phenolics had critically deterrent effects on various life-history traits. Although snail offspring quality was generally consistent regardless of maternal feeding conditions, the reduced growth and offspring quantity of the poorly-fed snails in constructed wetlands dominated by the less-palatable macrophytes could limit the invasive success of P. canaliculata. Effective bottom-up control of P. canaliculata in constructed wetlands should involve selective planting strategy using macrophytes with low nutrient and high toughness, cellulose and phenolic contents.

  14. On the advantages of spring magnets compared to pure FePt: Strategy for rare-earth free permanent magnets following a bottom-up approach

    Science.gov (United States)

    Pousthomis, M.; Garnero, C.; Marcelot, C. G.; Blon, T.; Cayez, S.; Cassignol, C.; Du, V. A.; Krispin, M.; Arenal, R.; Soulantica, K.; Viau, G.; Lacroix, L.-M.

    2017-02-01

    Nanostructured magnets benefiting from efficient exchange-coupling between hard and soft grains represent an appealing approach for integrated miniaturized magnetic power sources. Using a bottom-up approach, nanostructured materials were prepared from binary assemblies of bcc FeCo and fcc FePt nanoparticles and compared with pure L10-FePt materials. The use of a bifunctional mercapto benzoic acid yields homogeneous assemblies of the two types of particles while reducing the amount of organic matter. The 650 °C thermal annealing, mandatory to allow the L10-FePt phase transition, led to significant interdiffusion and thus drastically decreased the amount of soft phase present in the final composites. The analysis of recoil curves, however, evidenced the presence of efficient interphase exchange coupling, which allows better magnetic performance to be obtained than with pure L10-FePt materials, with an energy product above 100 kJ m-3 estimated for a Pt content of only 33%. These results clearly evidence the value of chemically grown nanoparticles for the preparation of performant spring magnets, opening promising perspectives for integrated subcentimetric magnets with optimized properties.

  15. Middle-Out Approaches to Reform of University Teaching and Learning: Champions striding between the top-down and bottom-up approaches

    Directory of Open Access Journals (Sweden)

    Rick Cummings

    2005-03-01

    Full Text Available In recent years, Australian universities have been driven by a diversity of external forces, including funding cuts, massification of higher education, and changing student demographics, to reform their relationship with students and improve teaching and learning, particularly for those studying off-campus or part-time. Many universities have responded to these forces either through formal strategic plans developed top-down by executive staff or through organic developments arising from staff in a bottom-up approach. By contrast, much of Murdoch University’s response has been led by a small number of staff who have middle management responsibilities and who have championed the reform of key university functions, largely in spite of current policy or accepted practice. This paper argues that the ‘middle-out’ strategy has both a basis in change management theory and practice, and a number of strengths, including low risk, low cost, and high sustainability. Three linked examples of middle-out change management in teaching and learning at Murdoch University are described and the outcomes analyzed to demonstrate the benefits and pitfalls of this approach.

  16. Duemmler, Kerstin; Nagel, Alexander-Kenneth: governing religious diversity: top-down and bottom-up initiatives in Germany and Switzerland.

    Science.gov (United States)

    Duemmler, Kerstin; Nagel, Alexander-Kenneth

    2013-06-01

    In recent years religious pluralization has become a significant policy issue in Western societies as a result of a new awareness of religion and of religious minorities articulating themselves and becoming more visible. The article explores the variety of social and political reactions to religious diversity in urban areas and in doing so it brings together theoretical concepts of political and cultural sociology. The notion of diversity governance as a joint endeavour of state and societal actors managing societies is linked to the notion of boundary work as the interplay of state and/or societal actors maintaining or modifying boundaries between religious traditions. Based on two case studies, the article illustrates two ideal-typical settings of diversity governance: the first case from the German Ruhr Area stands for a bottom-up approach which is based on civic self-organization of interreligious activities, whereas the second case from the Swiss canton of Lucerne exhibits a model of top-down governance based on state interventions in religious instruction at schools. Drawing on semi-structured interviews and participant observation, the authors show how different governance settings shape the construction and blurring of boundaries in the religious field. Both approaches operate differently when incorporating religious diversity and rendering formerly homogeneous notions of we-groups more heterogeneous. Despite the approaches' initial aim of inclusion, patterns of exclusion are equally reproduced, since the idea of 'legitimate religion' rooted in Christian majority culture is present.

  17. High-Throughput Top-Down and Bottom-Up Processes for Forming Single-Nanotube Based Architectures for 3D Electronics

    Science.gov (United States)

    Kaul, Anupama B.; Megerian, Krikor G.; von Allmen, Paul; Kowalczyk, Robert; Baron, Richard

    2009-01-01

    We have developed manufacturable approaches to form single, vertically aligned carbon nanotubes, where the tubes are centered precisely and placed within a few hundred nm of 1-1.5 micron deep trenches. These wafer-scale approaches were enabled by chemically amplified resists and inductively coupled cryo-etchers for forming the 3D nanoscale architectures. The tube growth was performed using dc plasma-enhanced chemical vapor deposition (PECVD), and the materials used for the pre-fabricated 3D architectures were chemically and structurally compatible with the high-temperature (700 °C) PECVD synthesis of our tubes in an ammonia and acetylene ambient. Tube characteristics were also engineered to some extent by adjusting growth parameters such as Ni catalyst thickness, pressure and plasma power during growth. Such scalable, high-throughput top-down fabrication techniques, combined with bottom-up tube synthesis, should accelerate the development of PECVD tubes for applications such as interconnects, nano-electromechanical systems (NEMS), sensors, and 3D electronics in general.

  18. The bottom-up approach to defining life : deciphering the functional organization of biological cells via multi-objective representation of biological complexity from molecules to cells

    Directory of Open Access Journals (Sweden)

    Sathish ePeriyasamy

    2013-12-01

    Full Text Available In silico representation of cellular systems needs to represent the adaptive dynamics of biological cells, recognizing a cell’s multi-objective topology formed by spatially and temporally cohesive intracellular structures. The design of these models needs to address the hierarchical and concurrent nature of cellular functions and incorporate the ability to self-organise in response to transitions between healthy and pathological phases, and adapt accordingly. The functions of biological systems are constantly evolving, due to the ever changing demands of their environment. Biological systems meet these demands by pursuing objectives, aided by their constituents, giving rise to biological functions. A biological cell is organised into an objective/task hierarchy. This objective hierarchy corresponds to the nested nature of temporally cohesive structures, and representing it will facilitate the study of pleiotropy and polygeny by modeling causalities propagating across multiple interconnected intracellular processes. Although biological adaptations occur on physiological, developmental and reproductive timescales, the paper is focused on adaptations that occur within physiological timescales, where the biomolecular activities contributing to functional organisation play a key role in cellular physiology. The paper proposes a multi-scale and multi-objective modelling approach from the bottom up, by representing temporally cohesive structures for multi-tasking of intracellular processes. Further, the paper characterises the properties and constraints that are consequential to the organisational and adaptive dynamics in biological cells.

  19. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems, Using a Bottom-Up Approach and Installer Survey - Second Edition

    Energy Technology Data Exchange (ETDEWEB)

    Friedman, B.; Ardani, K.; Feldman, D.; Citron, R.; Margolis, R.; Zuboy, J.

    2013-10-01

    This report presents results from the second U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs -- often referred to as 'business process' or 'soft' costs -- for U.S. residential and commercial photovoltaic (PV) systems. In service to DOE's SunShot Initiative, annual expenditure and labor-hour-productivity data are analyzed to benchmark 2012 soft costs related to (1) customer acquisition and system design and (2) permitting, inspection, and interconnection (PII). We also include an in-depth analysis of costs related to financing, overhead, and profit. Soft costs are both a major challenge and a major opportunity for reducing PV system prices and stimulating SunShot-level PV deployment in the United States. The data and analysis in this series of benchmarking reports are a step toward the more detailed understanding of PV soft costs required to track and accelerate these price reductions.

  20. Synthesis of a Cementitious Material Nanocement Using Bottom-Up Nanotechnology Concept: An Alternative Approach to Avoid CO2 Emission during Production of Cement

    Directory of Open Access Journals (Sweden)

    Byung Wan Jo

    2014-01-01

    Full Text Available There is an increasing worldwide need to develop smart and sustainable construction materials that generate minimal climate-changing gases during their production. Bottom-up nanotechnology has established itself as a promising alternative technique for the production of cementitious materials. The present investigation deals with the chemical synthesis of a cementitious material using nanosilica, sodium aluminate, sodium hydroxide, and calcium nitrate as reacting phases. The characteristic properties of the chemically synthesized nanocement were verified by chemical composition analysis, setting time measurement, particle size distribution, fineness analysis, and SEM and XRD analyses. Finally, the performance of the nanocement was confirmed by the fabrication and characterization of a nanocement-based mortar. Comparing the results with commercially available cement products, it is demonstrated that the chemically synthesized nanocement not only shows better physical and mechanical performance, but also brings several encouraging benefits to society, including the reduction of CO2 emission and the development of sustainable construction materials. A plausible reaction scheme has been proposed to explain the synthesis and the overall performance of the nanocement.

  1. Bottom-up derivation of conservative and dissipative interactions for coarse-grained molecular liquids with the conditional reversible work method

    Energy Technology Data Exchange (ETDEWEB)

    Deichmann, Gregor; Marcon, Valentina; Vegt, Nico F. A. van der, E-mail: vandervegt@csi.tu-darmstadt.de [Center of Smart Interfaces, Technische Universität Darmstadt, Alarich-Weiss-Straße 10, 64287 Darmstadt (Germany)

    2014-12-14

    Molecular simulations of soft matter systems have been performed in recent years using a variety of systematically coarse-grained models. With these models, structural or thermodynamic properties can be quite accurately represented while the prediction of dynamic properties remains difficult, especially for multi-component systems. In this work, we use constraint molecular dynamics simulations for calculating dissipative pair forces which are used together with conditional reversible work (CRW) conservative forces in dissipative particle dynamics (DPD) simulations. The combined CRW-DPD approach aims to extend the representability of CRW models to dynamic properties and uses a bottom-up approach. Dissipative pair forces are derived from fluctuations of the direct atomistic forces between mapped groups. The conservative CRW potential is obtained from a similar series of constraint dynamics simulations and represents the reversible work performed to couple the direct atomistic interactions between the mapped atom groups. Neopentane, tetrachloromethane, cyclohexane, and n-hexane have been considered as model systems. These molecular liquids are simulated with atomistic molecular dynamics, coarse-grained molecular dynamics, and DPD. We find that the CRW-DPD models reproduce the liquid structure and diffusive dynamics of the liquid systems in reasonable agreement with the atomistic models when using single-site mapping schemes with beads containing five or six heavy atoms. For a two-site representation of n-hexane (3 carbons per bead), time scale separation can no longer be assumed and the DPD approach consequently fails to reproduce the atomistic dynamics.
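
    For context, the sketch below (Python) shows the standard DPD pairwise force decomposition that the CRW-DPD scheme builds on: a conservative term (supplied in the paper by the CRW potential), a dissipative term (whose friction coefficient is derived there from atomistic force fluctuations), and the matching random term. All parameter names and values (a_ij, gamma, r_c, kT) are generic illustrations, not values from the study.

        import numpy as np

        def dpd_pair_force(r_ij, v_ij, a_ij=25.0, gamma=4.5, kT=1.0, r_c=1.0, dt=0.01):
            """Force on bead i from bead j; r_ij = r_i - r_j, v_ij = v_i - v_j."""
            r = np.linalg.norm(r_ij)
            if r >= r_c:
                return np.zeros(3)
            e = r_ij / r                       # unit vector from j to i
            w = 1.0 - r / r_c                  # DPD weight function w(r)
            f_c = a_ij * w * e                 # conservative force (CRW-derived in the paper)
            f_d = -gamma * w**2 * np.dot(e, v_ij) * e                # dissipative (friction) force
            sigma = np.sqrt(2.0 * gamma * kT)                        # fluctuation-dissipation relation
            f_r = sigma * w * np.random.randn() / np.sqrt(dt) * e    # random force
            return f_c + f_d + f_r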

  2. The top-down, middle-down, and bottom-up mass spectrometry approaches for characterization of histone variants and their post-translational modifications.

    Science.gov (United States)

    Moradian, Annie; Kalli, Anastasia; Sweredoski, Michael J; Hess, Sonja

    2014-03-01

    Epigenetic regulation of gene expression is, at least in part, mediated by histone modifications. PTMs of histones change chromatin structure and regulate gene transcription, DNA damage repair, and DNA replication. Thus, studying histone variants and their modifications not only elucidates their functional mechanisms in chromatin regulation, but also provides insights into phenotypes and diseases. A challenge in this field is to determine the best approach(es) to identify histone variants and their PTMs using a robust high-throughput analysis. The large number of histone variants and the enormous diversity that can be generated through combinatorial modifications, also known as histone code, makes identification of histone PTMs a laborious task. MS has been proven to be a powerful tool in this regard. Here, we focus on bottom-up, middle-down, and top-down MS approaches, including CID and electron-capture dissociation/electron-transfer dissociation based techniques for characterization of histones and their PTMs. In addition, we discuss advances in chromatographic separation that take advantage of the chemical properties of the specific histone modifications. This review is also unique in its discussion of current bioinformatic strategies for comprehensive histone code analysis.

  3. Emission trading and the role of learning-by-doing spillovers in the 'bottom-up' energy-system ERIS model

    Energy Technology Data Exchange (ETDEWEB)

    Barreto, L.; Klaassen, G. [International Institute for Applied Systems Analysis, Laxenburg (Austria). Environmentally Compatible Energy Strategies

    2004-07-01

    In this paper, using the 'bottom-up' energy-system optimisation model ERIS, we examine the effects of emission trading on technology deployment, emphasising the role of technology learning spillovers, that is, the possibility that the learning accumulated in a particular technology in a given region may spill over to other regions as well, leading to cost reductions there also. The effects of different configurations of interregional spillovers of learning in ERIS and the impact of the emission trading mechanism under those different circumstances are analysed. Including spatial spillovers of learning allows capturing the possibility that the imposition of greenhouse gas emission constraints in a given region may induce technological change in other regions, such as developing countries, even if the latter regions do not face emission constraints. Our stylised results point out the potential benefits of sound international cooperation between industrialised and developing regions on research, development, demonstration and deployment (RD3) of clean energy technologies and on the implementation of emission trading schemes. (author)
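
    The 'learning-by-doing' spillovers discussed above rest on the usual one-factor learning curve, in which specific investment cost falls by a fixed learning rate with every doubling of cumulative installed capacity. A minimal sketch, with illustrative numbers rather than ERIS parameters:

        import math

        def unit_cost(cum_capacity, c0=1000.0, cap0=1.0, learning_rate=0.15):
            """One-factor learning curve: cost falls by `learning_rate` per doubling of capacity."""
            b = -math.log2(1.0 - learning_rate)       # learning index
            return c0 * (cum_capacity / cap0) ** (-b)

        print(unit_cost(2.0))   # one doubling: 1000 -> 850 for a 15% learning rate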

  4. Evolutionary Steps in the Emergence of Life Deduced from the Bottom-Up Approach and GADV Hypothesis (Top-Down Approach).

    Science.gov (United States)

    Ikehara, Kenji

    2016-01-26

    It is no doubt quite difficult to solve the riddle of the origin of life. So, firstly, I would like to point out the kinds of obstacles there are in solving this riddle and how we should tackle these difficult problems, reviewing the studies that have been conducted so far. After that, I will propose that the consecutive evolutionary steps in a timeline can be rationally deduced by using a common event as a juncture, which is obtained by two counter-directional approaches: one is the bottom-up approach through which many researchers have studied the origin of life, and the other is the top-down approach, through which I established the [GADV]-protein world hypothesis or GADV hypothesis on the origin of life starting from a study on the formation of entirely new genes in extant microorganisms. Last, I will describe the probable evolutionary process from the formation of Earth to the emergence of life, which was deduced by using a common event-the establishment of the first genetic code encoding [GADV]-amino acids-as a juncture for the results obtained from the two approaches.

  5. Structural and optical nanoscale analysis of GaN core-shell microrod arrays fabricated by combined top-down and bottom-up process on Si(111)

    Science.gov (United States)

    Müller, Marcus; Schmidt, Gordon; Metzner, Sebastian; Veit, Peter; Bertram, Frank; Krylyuk, Sergiy; Debnath, Ratan; Ha, Jong-Yoon; Wen, Baomei; Blanchard, Paul; Motayed, Abhishek; King, Matthew R.; Davydov, Albert V.; Christen, Jürgen

    2016-05-01

    Large arrays of GaN core-shell microrods were fabricated on Si(111) substrates applying a combined bottom-up and top-down approach which includes inductively coupled plasma (ICP) etching of patterned GaN films grown by metal-organic vapor phase epitaxy (MOVPE) and selective overgrowth of the obtained GaN/Si pillars using hydride vapor phase epitaxy (HVPE). The structural and optical properties of individual core-shell microrods have been studied with nanometer-scale spatial resolution using low-temperature cathodoluminescence spectroscopy (CL) performed directly in a scanning electron microscope (SEM) and in a scanning transmission electron microscope (STEM). SEM, TEM, and CL measurements reveal the formation of distinct growth domains during the HVPE overgrowth. A high free-carrier concentration observed in the non-polar $\{1\bar{1}00\}$ HVPE shells is assigned to in-diffusion of silicon atoms from the substrate. In contrast, the HVPE shells directly grown on top of the c-plane of the GaN pillars reveal a lower free-carrier concentration.

  6. Evolutionary Steps in the Emergence of Life Deduced from the Bottom-Up Approach and GADV Hypothesis (Top-Down Approach

    Directory of Open Access Journals (Sweden)

    Kenji Ikehara

    2016-01-01

    Full Text Available It is no doubt quite difficult to solve the riddle of the origin of life. So, firstly, I would like to point out the kinds of obstacles there are in solving this riddle and how we should tackle these difficult problems, reviewing the studies that have been conducted so far. After that, I will propose that the consecutive evolutionary steps in a timeline can be rationally deduced by using a common event as a juncture, which is obtained by two counter-directional approaches: one is the bottom-up approach through which many researchers have studied the origin of life, and the other is the top-down approach, through which I established the [GADV]-protein world hypothesis or GADV hypothesis on the origin of life starting from a study on the formation of entirely new genes in extant microorganisms. Last, I will describe the probable evolutionary process from the formation of Earth to the emergence of life, which was deduced by using a common event—the establishment of the first genetic code encoding [GADV]-amino acids—as a juncture for the results obtained from the two approaches.

  7. Perceived Effects of Pornography on the Couple Relationship: Initial Findings of Open-Ended, Participant-Informed, "Bottom-Up" Research.

    Science.gov (United States)

    Kohut, Taylor; Fisher, William A; Campbell, Lorne

    2017-02-01

    The current study adopted a participant-informed, "bottom-up," qualitative approach to identifying perceived effects of pornography on the couple relationship. A large sample (N = 430) of men and women in heterosexual relationships in which pornography was used by at least one partner was recruited through online (e.g., Facebook, Twitter, etc.) and offline (e.g., newspapers, radio, etc.) sources. Participants responded to open-ended questions regarding perceived consequences of pornography use for each couple member and for their relationship in the context of an online survey. In the current sample of respondents, "no negative effects" was the most commonly reported impact of pornography use. Among remaining responses, positive perceived effects of pornography use on couple members and their relationship (e.g., improved sexual communication, more sexual experimentation, enhanced sexual comfort) were reported frequently; negative perceived effects of pornography (e.g., unrealistic expectations, decreased sexual interest in partner, increased insecurity) were also reported, albeit with considerably less frequency. The results of this work suggest new research directions that require more systematic attention.

  8. Heart Disease Risk Factors You Can Control

    Science.gov (United States)

    ... can control the following risk factors by making lifestyle changes. Your doctor might also suggest medicine to help control some risk factors, such as high blood pressure or high cholesterol. Poor blood cholesterol (koh-LESS-tur-ol) and triglyceride ( ...

  9. Contribution of Oil and Gas Production to Atmospheric CH4 in the South-Central United States: Reconciling Bottom-up and Top-down Estimates

    Science.gov (United States)

    Liu, Z.; Pinto, J. P.; Turner, A. J.; Bruhwiler, L.; Henze, D. K.; Brioude, J. F.; Bousserez, N.; Sargsyan, K.; Safta, C.; Najm, H. N.; LaFranchi, B. W.; Bambha, R.; Michelsen, H. A.

    2014-12-01

    Estimates of anthropogenic CH4 emissions in the United States have been largely inconsistent, particularly for oil and gas production (OGP) in the South-Central United States. We have quantified the contribution of OGP to the South-Central US (TX/OK/KS) CH4 budget through atmospheric regional transport modeling with the Community Multi-scale Air Quality (CMAQ). This model is driven by a new process-based, spatially resolved OGP CH4 emissions inventory. We employed Bayesian inference to calibrate CMAQ emissions inputs using continuous CH4 measurements at the DOE Southern Great Plains (SGP) central facility and evaluated model predictions against a subset of aircraft and surface flask measurements that are assimilated by NOAA's CarbonTracker-CH4. Our results suggest that OGP emissions are the largest source of CH4 observed at the DOE SGP site and the largest source of CH4 in TX/OK/KS, constituting ~45% of total CH4 emission in the region. The next largest source in the region is livestock, with other sources being relatively less important. We estimate OGP emissions in TX/OK/KS contribute about one half of national total OGP emissions. Using continuous CH4 measurements, we found evidence of rapid nocturnal transport by the Great Plains low-level jet (LLJ) and sporadic oil and gas emissions. Our study demonstrates the importance of improved knowledge of the spatial and temporal features of oil and gas emissions in reconciling CH4 budgets derived using bottom-up and top-down approaches at regional and national scales.
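
    The Bayesian calibration step mentioned above amounts to updating prior (bottom-up) emissions against observed concentrations. As a hedged illustration only, the sketch below gives the closed-form Gaussian posterior for a single multiplicative scaling factor applied to one source category; the actual study used a full inventory and CMAQ transport rather than this toy linear relationship, and all parameter values here are invented.

        import numpy as np

        def posterior_scaling(y_obs, y_model, prior_mean=1.0, prior_sd=0.5, obs_sd=20.0):
            """Posterior of scaling factor s, assuming y_obs ~ N(s * y_model, obs_sd^2) and a normal prior on s."""
            precision = 1.0 / prior_sd**2 + np.dot(y_model, y_model) / obs_sd**2
            mean = (prior_mean / prior_sd**2 + np.dot(y_model, y_obs) / obs_sd**2) / precision
            return mean, np.sqrt(1.0 / precision)     # posterior mean and standard deviation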

  10. Top-down/bottom-up description of electricity sector for Switzerland using the GEM-E3 computable general equilibrium model

    Energy Technology Data Exchange (ETDEWEB)

    Krakowski, R. A

    2006-06-15

    Participation of the Paul Scherrer Institute (PSI) in the advancement and extension of the multi-region Computable General Equilibrium (CGE) model GEM-E3 (CES/KUL, 2002) focused primarily on two top-level facets: a) extension of the model database and model calibration, particularly as related to the second component of this study, which is b) advancement of the dynamics of innovation and investment, primarily through the incorporation of Exogenous Technical Learning (ETL) into the Bottom-Up (BU, technology-based) part of the dynamic upgrade; this latter activity also included the completion of the dynamic coupling of the BU description of the electricity sector with the 'Top-Down' (TD, econometric) description of the economy inherent to the GEM-E3 CGE model. The results of this two-component study are described in two parts that have been combined in this single summary report: Part I describes the methodology and gives illustrative results from the BU-TD integration, as well as describing the approach to, and giving preliminary results from, incorporating an ETL description into the BU component of the overall model; Part II reports on the calibration component of the task in terms of: a) formulating a BU technology database for Switzerland based on previous work; b) incorporating that database into the GEM-E3 model; and c) calibrating the BU database with the TD database embodied in the (Swiss) Social Accounting Matrix (SAM). The BU-TD coupling, along with the ETL incorporation described in Part I, represents the major effort embodied in this investigation, but this effort could not be completed without the calibration preamble reported herein as Part II. A brief summary of the scope of each of these key study components is given. (author)

  11. Energetic Bottomup in the Low Countries. Energy transition from the bottom-up. On Happy energetic civilians, Solar and wind cooperatives, New utility companies; Energieke BottomUp in Lage Landen. De Energietransitie van Onderaf. Over Vrolijke energieke burgers, Zon- en windcooperaties, Nieuwe nuts

    Energy Technology Data Exchange (ETDEWEB)

    Schwencke, A.M.

    2012-08-15

    This essay is an outline of the 'energy transition from the bottom up'. Its leading questions are: (1) what do these initiatives actually involve; (2) who is involved; (3) how do they work (organization, business models); (4) why are people active in this field; (5) does it make a real difference; (6) where is it heading? The essay is based on public information sources (websites, blogs, publications) and interviews with people involved in the field.

  12. Enhancing the Wettability of High Aspect-Ratio Through-Silicon Vias Lined with LPCVD Silicon Nitride or PE-ALD Titanium Nitride for Void-Free Bottom-Up Copper Electroplating

    NARCIS (Netherlands)

    Saadaoui, M.; van Zeijl, H.; Wien, W. H. A.; Pham, H. T. M.; Kwakernaak, C.; Knoops, H. C. M.; Kessels, W. M. M.; R. van de Sanden,; Voogt, F. C.; Roozeboom, F.; Sarro, P. M.

    2011-01-01

    One of the critical steps toward producing void-free and uniform bottom-up copper electroplating in high aspect-ratio (AR) through-silicon vias (TSVs) is the ability of the copper electrolyte to spontaneously flow through the entire depth of the via. This can be accomplished by reducing the concentr

  13. A Statistical Method for Estimating Missing GHG Emissions in Bottom-Up Inventories: The Case of Fossil Fuel Combustion in Industry in the Bogota Region, Colombia

    Science.gov (United States)

    Jimenez-Pizarro, R.; Rojas, A. M.; Pulido-Guio, A. D.

    2012-12-01

    The development of environmentally, socially and financially suitable greenhouse gas (GHG) mitigation portfolios requires detailed disaggregation of emissions by activity sector, preferably at the regional level. Bottom-up (BU) emission inventories are intrinsically disaggregated, but although detailed, they are frequently incomplete. Missing and erroneous activity data are rather common in emission inventories of GHGs, criteria pollutants and toxic pollutants, even in developed countries. The fraction of missing and erroneous data can be rather large in developing-country inventories. In addition, the cost and time for obtaining or correcting this information can be prohibitive or can delay inventory development. This is particularly true for regional BU inventories in the developing world. Moreover, a rather common practice is to disregard missing data or to arbitrarily impute low default activity or emission values to them, which typically leads to significant underestimation of total emissions. Our investigation focuses on GHG emissions from fossil fuel combustion in industry in the Bogota Region, composed of Bogota and its adjacent, semi-rural area of influence, the Province of Cundinamarca. We found that the BU inventories for this sub-category substantially underestimate emissions when compared to top-down (TD) estimations based on sub-sector-specific national fuel consumption data and regional energy intensities. Although both BU inventories have a substantial number of missing and evidently erroneous entries, i.e. information on fuel consumption per combustion unit per company, the validated energy use and emission data display clear and smooth frequency distributions, which can be adequately fitted to bimodal log-normal distributions. This is not unexpected, as industrial plant sizes are typically log-normally distributed. Moreover, our statistical tests suggest that industrial sub-sectors, as classified by the International Standard Industrial Classification (ISIC
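
    The distribution-fitting idea described above can be illustrated with a small sketch: fit a two-component mixture to the logarithm of the validated fuel-consumption entries (i.e. a bimodal log-normal) and then draw the missing entries from the fitted distribution. The function names and the use of scikit-learn are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def fit_bimodal_lognormal(consumption):
            """Fit a 2-component Gaussian mixture to log-transformed consumption data."""
            log_x = np.log(np.asarray(consumption)).reshape(-1, 1)
            return GaussianMixture(n_components=2, random_state=0).fit(log_x)

        def impute_missing(model, n_missing):
            """Draw plausible values for missing entries from the fitted mixture."""
            samples, _ = model.sample(n_missing)    # samples are in log space
            return np.exp(samples.ravel())          # back-transform to consumption units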

  14. Bottom-up workflow modeling approach for business changes

    Institute of Scientific and Technical Information of China (English)

    严志民; 徐玮

    2011-01-01

    To meet the adaptability requirements of workflows in a complicated and rapidly changing business environment, a new declarative, data-centric business-process definition and modeling method named Declarative ARTifact-centric workflow (DART) is proposed. The business process is analyzed and decomposed in a bottom-up manner so that its building blocks, such as atomic work orders (artifacts), activities and business policies, are extracted, and the descriptions of business elements and business changes are separated into different layers. In terms of execution semantics, DART uses Finite State Automata (FSA) to describe the lifecycle of a single artifact and Labeled Transition Systems (LTS) to describe the workflow and the interactions among multiple artifacts. The path from a DART model to a deployable workflow implementation is also discussed, and the practical application of the method is illustrated with the actual workflows of the Hangzhou real estate administration bureau.
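
    As a rough illustration of the execution semantics summarized above, the sketch below encodes a single artifact's lifecycle as a finite state automaton with labelled transitions; the states and events are invented for illustration and do not reproduce the DART models or the Hangzhou workflows.

        # Hypothetical artifact lifecycle: (state, event) -> next state
        LIFECYCLE = {
            ("created",      "submit"):  "under_review",
            ("under_review", "approve"): "approved",
            ("under_review", "reject"):  "rejected",
            ("approved",     "archive"): "archived",
        }

        def step(state, event):
            # events not enabled in the current state leave the artifact unchanged
            return LIFECYCLE.get((state, event), state)

        state = "created"
        for event in ("submit", "approve", "archive"):
            state = step(state, event)
        print(state)   # -> archived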

  15. Benchmarking Non-Hardware Balance-of-System (Soft) Costs for U.S. Photovoltaic Systems Using a Bottom-Up Approach and Installer Survey

    Energy Technology Data Exchange (ETDEWEB)

    Ardani, Kristen [National Renewable Energy Lab. (NREL), Golden, CO (United States); Margolis, Robert [National Renewable Energy Lab. (NREL), Golden, CO (United States); Feldman, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Ong, Sean [National Renewable Energy Lab. (NREL), Golden, CO (United States); Barbose, Galen [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Wiser, Ryan [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-11-01

    This report presents results from the first U.S. Department of Energy (DOE) sponsored, bottom-up data-collection and analysis of non-hardware balance-of-system costs—often referred to as “business process” or “soft” costs—for residential and commercial photovoltaic (PV) systems. Annual expenditure and labor-hour-productivity data are analyzed to benchmark 2010 soft costs related to the DOE priority areas of (1) customer acquisition; (2) permitting, inspection, and interconnection; (3) installation labor; and (4) installer labor for arranging third-party financing. Annual expenditure and labor-hour data were collected from 87 PV installers. After eliminating outliers, the survey sample consists of 75 installers, representing approximately 13% of all residential PV installations and 4% of all commercial installations added in 2010. Including assumed permitting fees, in 2010 the average soft costs benchmarked in this analysis total $1.50/W for residential systems (ranging from $0.66/W to $1.66/W between the 20th and 80th percentiles). For commercial systems, the median 2010 benchmarked soft costs (including assumed permitting fees) are $0.99/W for systems smaller than 250 kW (ranging from $0.51/W to $1.45/W between the 20th and 80th percentiles) and $0.25/W for systems larger than 250 kW (ranging from $0.17/W to $0.78/W between the 20th and 80th percentiles). Additional soft costs not benchmarked in the present analysis (e.g., installer profit, overhead, financing, and contracting) are significant and would add to these figures. The survey results provide a benchmark for measuring—and helping to accelerate—progress over the next decade toward achieving the DOE SunShot Initiative’s soft-cost-reduction targets. We conclude that the selected non-hardware business processes add considerable cost to U.S. PV systems, constituting 23% of residential PV system price, 17% of small commercial system price, and 5% of large commercial system price (in 2010

  16. Controlling Structure from the Bottom-Up: Structural and Optical Properties of Layer-by-Layer Assembled Palladium Coordination-Based Multilayers

    Energy Technology Data Exchange (ETDEWEB)

    Altman,M.; Shukla, A.; Zubkov, T.; Evmenenko, G.; Dutta, P.; van der Boom, M.

    2006-01-01

    Layer-by-layer assembly of two palladium coordination-based multilayers on silicon and glass substrates is presented. The new assemblies consist of rigid-rod chromophores connected by terminal pyridine moieties to palladium centers. Both colloidal palladium and PdCl2(PhCN)2 were used in order to determine the effect of the metal complex precursor on multilayer structure and optical properties. The multilayers were formed by an iterative wet-chemical deposition process at room temperature in air on a siloxane-based template layer. Twelve consecutive deposition steps have been demonstrated, resulting in structurally regular assemblies with an equal amount of chromophore and palladium added in each molecular bilayer. The optical intensity characteristics of the metal-organic films are clearly a function of the palladium precursor employed. The colloid-based system has a UV-vis absorption maximum an order of magnitude stronger than that of the PdCl2-based multilayer. The absorption maximum of the PdCl2-based film exhibits a significant red shift of 23 nm with the addition of 12 layers. Remarkably, the structure and physiochemical properties of the submicron-scale PdCl2-based structures are determined by the configuration of the ~15 Å thick template layer. The refractive index of the PdCl2-based film was determined by spectroscopic ellipsometry. Well-defined three-dimensional structures, with a dimension of 5 µm, were obtained using photopatterned template monolayers. The properties and microstructure of the films were studied by UV-vis spectroscopy, spectroscopic ellipsometry, atomic force microscopy (AFM), X-ray reflectivity (XRR), scanning electron microscopy (SEM), and aqueous contact angle measurements (CA).

  17. Causal Factors in Genome Control

    NARCIS (Netherlands)

    O'Duibhir, E.

    2015-01-01

    The aim of this thesis is to study how genes are switched on and off in a coordinated way across an entire genome. In order to do this yeast is used as a model organism. The mechanisms that control gene expression in yeast are very similar to those of human cells. Chapter 1 provides a general introd

  18. Toward systematic integration between self-determination theory and motivational interviewing as examples of top-down and bottom-up intervention development: autonomy or volition as a fundamental theoretical principle.

    Science.gov (United States)

    Vansteenkiste, Maarten; Williams, Geoffrey C; Resnicow, Ken

    2012-03-02

    Clinical interventions can be developed through two distinct pathways. In the first, which we call top-down, a well-articulated theory drives the development of the intervention, whereas in the case of a bottom-up approach, clinical experience, more so than a dedicated theoretical perspective, drives the intervention. Using this dialectic, this paper discusses Self-Determination Theory (SDT) [1,2] and Motivational Interviewing (MI) [3] as prototypical examples of top-down and bottom-up approaches, respectively. We sketch the different starting points, foci and developmental processes of SDT and MI, but equally note the complementary character of, and the potential for systematic integration between, both approaches. Nevertheless, for a deeper integration to take place, we contend that MI researchers might want to embrace autonomy as a fundamental basic process underlying therapeutic change, and we discuss the advantages of doing so.

  19. The effects of recent control policies on trends in emissions of anthropogenic atmospheric pollutants and CO2 in China

    OpenAIRE

    Zhao, Y.; Zhang, Junying; Nielsen, Chris

    2013-01-01

    To examine the effects of China's national policies of energy conservation and emission control during 2005–2010, inter-annual emission trends of gaseous pollutants, primary aerosols, and CO2 are estimated with a bottom-up framework. The control measures led to improved energy efficiency and/or increased penetration of emission control devices at power plants and other important industrial sources, yielding reduced emission factors for all evaluated species except NOx. The national emissions ...

  20. Top-Down Versus Bottom-Up Estimative of CO2 and CO Vehicular Emission Contribution from the Megacity of SãO Paulo, Brazil

    Science.gov (United States)

    Andrade, M.; Nogueira, T.; Martínez, P. J.; Fornaro, A.; Miranda, R. M.; Ynoue, R.

    2013-12-01

    data presented here compared tunnel measurements performed in 2004 and 2011. The official data estimate an emission of 15327 million tons per year of CO2eq (60% by LDV, 38% HDV and 2% motorcycles) and 128 million tons per year of CO. The top-down estimate based on tunnel measurements resulted in values approximately 5 times higher, with the difference mainly attributable to the estimate of the diesel emission factor. The uncertainties are related to the deterioration of the emission factor with time and to the driving pattern. The diurnal variation of the CO2 atmospheric concentration is characterized by the mobile-source emission pattern. CETESB. Relatório Anual de Qualidade do Ar no Estado de São Paulo 2012. Companhia de Tecnologia de Saneamento Ambiental, São Paulo, Brazil, 2013a. CETESB. Plano de Controle de Poluição Veicular do Estado de São Paulo 2011 /2013. Companhia de Tecnologia de Saneamento Ambiental, São Paulo, Brazil, 2013b.

  1. Thyristor Controlled Reactor for Power Factor Improvement

    Directory of Open Access Journals (Sweden)

    Sheila Mahapatra

    2014-04-01

    Full Text Available Power factor improvement is essential for the reliable operation of any power system. This paper presents a Thyristor Controlled Reactor regulated by a programmed microcontroller, which aids in improving the power factor and keeping it close to unity under various loading conditions. The implementation is done on an 8051 microcontroller programmed using the Keil software. PSpice software is used to determine the time lag between current and voltage, and Proteus software is used to display the power factor as the load varies. Whenever a capacitive load is connected to the transmission line, a shunt reactor is connected which injects lagging reactive VARs into the power system. As a result the power factor is improved. The results given in this paper provide a suitable microcontroller-based reactive power compensation and power factor improvement technique using a Thyristor Controlled Reactor module.
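
    The displacement power factor implied by the measured time lag between the voltage and current waveforms is cos(2*pi*f*dt). The sketch below illustrates just this calculation; it is not the 8051 firmware described in the paper, and the mains frequency and lag values are illustrative.

        import math

        def power_factor(time_lag_s, mains_freq_hz=50.0):
            """Displacement power factor from the voltage-current time lag."""
            phase_rad = 2.0 * math.pi * mains_freq_hz * time_lag_s
            return math.cos(phase_rad)

        print(round(power_factor(2e-3), 2))   # a 2 ms lag on a 50 Hz supply -> ~0.81 lagging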

  2. Development Of A Web Service And Android 'APP' For The Distribution Of Rainfall Data. A Bottom-Up Remote Sensing Data Mining And Redistribution Project In The Age Of The 'Web 2.0'

    Science.gov (United States)

    Mantas, Vasco M.; Pereira, A. J. S. C.; Liu, Zhong

    2013-12-01

    A project was devised to develop a set of freely available applications and web services that can (1) simplify access from mobile devices to TOVAS data and (2) support the development of new datasets through data repackaging and mash-up. The bottom-up approach enables the multiplication of new services, often of limited direct interest to the organizations that produce the original, global datasets, but significant to small, local users. Through this multiplication of services, the development cost is transferred to the intermediate or end users and the entire process is made more efficient, even allowing new players to use the data in innovative ways.

  3. What drives farmers to make top-down or bottom-up adaptation to climate change and fluctuations? A comparative study on 3 cases of apple farming in Japan and South Africa.

    Science.gov (United States)

    Fujisawa, Mariko; Kobayashi, Kazuhiko; Johnston, Peter; New, Mark

    2015-01-01

    Agriculture is one of the most vulnerable sectors to climate change. Farmers have been exposed to multiple stressors including climate change, and they have managed to adapt to those risks. The adaptation actions undertaken by farmers and their decision making are, however, only poorly understood. By studying adaptation practices undertaken by apple farmers in three regions: Nagano and Kazuno in Japan and Elgin in South Africa, we categorize the adaptation actions into two types: farmer-initiated bottom-up adaptation and institution-led top-down adaptation. We found that the driver which differentiates the type of adaptation likely adopted was strongly related to the farmers' characteristics, particularly their dependence on institutions, e.g. the farmers' cooperative, in selling their products. The farmers who rely on the farmers' cooperative for their sales are likely to adopt institution-led adaptation, whereas the farmers who have established their own sales channels tend to initiate innovative actions from the bottom up. We further argue that even though the two types have contrasting features, combinations of both types of adaptation could lead to more successful adaptation, particularly in agriculture. This study also emphasizes that more farm-level studies for various crops and regions are warranted to provide substantial feedback to adaptation policy.

  4. What drives farmers to make top-down or bottom-up adaptation to climate change and fluctuations? A comparative study on 3 cases of apple farming in Japan and South Africa.

    Directory of Open Access Journals (Sweden)

    Mariko Fujisawa

    Full Text Available Agriculture is one of the most vulnerable sectors to climate change. Farmers have been exposed to multiple stressors including climate change, and they have managed to adapt to those risks. The adaptation actions undertaken by farmers and their decision making are, however, only poorly understood. By studying adaptation practices undertaken by apple farmers in three regions: Nagano and Kazuno in Japan and Elgin in South Africa, we categorize the adaptation actions into two types: farmer-initiated bottom-up adaptation and institution-led top-down adaptation. We found that the driver which differentiates the type of adaptation likely adopted was strongly related to the farmers' characteristics, particularly their dependence on institutions, e.g. the farmers' cooperative, in selling their products. The farmers who rely on the farmers' cooperative for their sales are likely to adopt institution-led adaptation, whereas the farmers who have established their own sales channels tend to initiate innovative actions from the bottom up. We further argue that even though the two types have contrasting features, combinations of both types of adaptation could lead to more successful adaptation, particularly in agriculture. This study also emphasizes that more farm-level studies for various crops and regions are warranted to provide substantial feedback to adaptation policy.

  5. A framework for assessing inter-individual variability in pharmacokinetics using virtual human populations and integrating general knowledge of physical chemistry, biology, anatomy, physiology and genetics: A tale of 'bottom-up' vs 'top-down' recognition of covariates.

    Science.gov (United States)

    Jamei, Masoud; Dickinson, Gemma L; Rostami-Hodjegan, Amin

    2009-01-01

    An increasing number of failures in clinical stages of drug development have been related to the effects of candidate drugs in a sub-group of patients rather than the 'average' person. Expectation of extreme effects or lack of therapeutic effects in some subgroups following administration of similar doses requires a full understanding of the issue of variability and the importance of identifying covariates that determine the exposure to the drug candidates in each individual. In any drug development program the earlier these covariates are known the better. An important component of the drive to decrease this failure rate in drug development involves attempts to use physiologically-based pharmacokinetics 'bottom-up' modeling and simulation to optimize molecular features with respect to the absorption, distribution, metabolism and elimination (ADME) processes. The key element of this approach is the separation of information on the system (i.e. human body) from that of the drug (e.g. physicochemical characteristics determining permeability through membranes, partitioning to tissues, binding to plasma proteins or affinities toward certain enzymes and transporter proteins) and the study design (e.g. dose, route and frequency of administration, concomitant drugs and food). In this review, the classical 'top-down' approach in covariate recognition is compared with the 'bottom-up' paradigm. The determinants and sources of inter-individual variability in different stages of drug absorption, distribution, metabolism and excretion are discussed in detail. Further, the commonly known tools for simulating ADME properties are introduced.
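
    A minimal sketch of the 'bottom-up' separation of system and drug information discussed above, using a one-compartment model with a well-stirred liver: enzyme abundance and hepatic blood flow are system parameters, while intrinsic clearance per unit of enzyme and volume of distribution are drug parameters. All names and values are illustrative assumptions (plasma protein binding and other covariates are ignored).

        import numpy as np

        def plasma_conc(t_h, dose_mg, v_d_l, enzyme_pmol, clint_ul_min_per_pmol, q_h_l_per_h=90.0):
            """Plasma concentration (mg/L) at time t_h (hours) after an IV bolus dose."""
            clint_l_per_h = enzyme_pmol * clint_ul_min_per_pmol * 60.0 / 1e6    # whole-liver CLint scaled to L/h
            cl_h = q_h_l_per_h * clint_l_per_h / (q_h_l_per_h + clint_l_per_h)  # well-stirred liver model
            k_el = cl_h / v_d_l
            return (dose_mg / v_d_l) * np.exp(-k_el * t_h)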

  6. Preparation of Au-Pt nanostructures by combining top-down with bottom-up strategies and application in label-free electrochemical immunosensor for detection of NMP22.

    Science.gov (United States)

    Jia, Hongying; Gao, Picheng; Ma, Hongmin; Wu, Dan; Du, Bin; Wei, Qin

    2015-02-01

    A novel label-free amperometric immunosensor for sensitive detection of nuclear matrix protein 22 (NMP22) was developed based on Au-Pt bimetallic nanostructures, which were prepared by combining top-down with bottom-up strategies. Nanoporous gold (NPG) was prepared by "top-down" dealloying of commercial Au/Ag alloy film. After deposition of NPG on an electrode, Pt nanoparticles (PtNPs) were further decorated on NPG by "bottom-up" electrodeposition. The prepared bimetallic nanostructures combine the merits of both NPG and PtNPs, and show a high electrocatalytic activity towards the reduction of H2O2. The label-free immunosensor was constructed by directly immobilizing the antibody of NMP22 (anti-NMP22) on the surface of the bimetallic nanostructures. The immunoreaction-induced amperometric response could be detected and was negatively correlated to the concentration of NMP22. Bimetallic nanostructure morphologies and detection conditions were investigated to obtain the best sensing performance. Under the optimal conditions, a linear range from 0.01 ng/mL to 10 ng/mL and a detection limit of 3.33 pg/mL were obtained. The proposed immunosensor showed high sensitivity, good selectivity, stability, reproducibility, and regenerability for the detection of NMP22, and it was evaluated in urine samples with satisfactory results.

  7. Automated Linear Function Submission-Based Double Auction as Bottom-up Real-Time Pricing in a Regional Prosumers’ Electricity Network

    Directory of Open Access Journals (Sweden)

    Tadahiro Taniguchi

    2015-07-01

    Full Text Available A linear function submission-based double auction (LFS-DA mechanism for a regional electricity network is proposed in this paper. Each agent in the network is equipped with a battery and a generator. Each agent simultaneously becomes a producer and consumer of electricity, i.e., a prosumer, and trades electricity in the regional market at a variable price. In the LFS-DA, each agent uses linear demand and supply functions when they submit bids and asks to an auctioneer in the regional market. The LFS-DA can achieve an exact balance between electricity demand and supply for each time slot throughout the learning phase and was shown capable of solving the primal problem of maximizing the social welfare of the network without any central price setter, e.g., a utility or a large electricity company, in contrast with conventional real-time pricing (RTP. This paper presents a clarification of the relationship between the RTP algorithm derived on the basis of a dual decomposition framework and LFS-DA. Specifically, we proved that the changes in the price profile of the LFS-DA mechanism are equal to those achieved by the RTP mechanism derived from the dual decomposition framework, except for a constant factor.
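
    To make the 'linear function submission' idea concrete, the sketch below clears a single time slot in which each agent submits a linear net-demand function d_i(p) = a_i - b_i*p (positive values buy, negative values sell); the auctioneer chooses the price at which net demand sums to zero. This is a simplified illustration of market clearing with linear bids, not the LFS-DA price-update rule itself, and the numbers are invented.

        def clearing_price(bids):
            """bids: list of (a_i, b_i) with b_i > 0; solve sum_i (a_i - b_i * p) = 0 for p."""
            a_total = sum(a for a, _ in bids)
            b_total = sum(b for _, b in bids)
            return a_total / b_total

        bids = [(5.0, 1.0), (-3.0, 0.5), (1.0, 2.0)]      # two buyers and one seller (illustrative)
        p = clearing_price(bids)
        allocations = [a - b * p for a, b in bids]        # net traded quantities; they sum to zero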

  8. Efficient Research Design: Using Value-of-Information Analysis to Estimate the Optimal Mix of Top-down and Bottom-up Costing Approaches in an Economic Evaluation alongside a Clinical Trial.

    Science.gov (United States)

    Wilson, Edward C F; Mugford, Miranda; Barton, Garry; Shepstone, Lee

    2016-04-01

    In designing economic evaluations alongside clinical trials, analysts are frequently faced with alternative methods of collecting the same data, the extremes being top-down ("gross costing") and bottom-up ("micro-costing") approaches. A priori, bottom-up approaches may be considered superior to top-down approaches but are also more expensive to collect and analyze. In this article, we use value-of-information analysis to estimate the efficient mix of observations on each method in a proposed clinical trial. By assigning a prior bivariate distribution to the 2 data collection processes, the predicted posterior (i.e., preposterior) mean and variance of the superior process can be calculated from proposed samples using either process. This is then used to calculate the preposterior mean and variance of incremental net benefit and hence the expected net gain of sampling. We apply this method to a previously collected data set to estimate the value of conducting a further trial and identifying the optimal mix of observations on drug costs at 2 levels: by individual item (process A) and by drug class (process B). We find that substituting a number of observations on process A for process B leads to a modest £ 35,000 increase in expected net gain of sampling. Drivers of the results are the correlation between the 2 processes and their relative cost. This method has potential use following a pilot study to inform efficient data collection approaches for a subsequent full-scale trial. It provides a formal quantitative approach to inform trialists whether it is efficient to collect resource use data on all patients in a trial or on a subset of patients only or to collect limited data on most and detailed data on a subset.
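
    For a normally distributed incremental net benefit (INB), the expected net gain of sampling used above has a closed form involving the unit normal loss function: the value of further data is the chance that it reverses the current adopt/reject decision, weighted by how much is at stake. The sketch below is a generic textbook version with illustrative parameters (including an assumed beneficiary population), not the trial-specific model of the study.

        import numpy as np
        from scipy.stats import norm

        def expected_net_gain(prior_mean, prior_sd, obs_sd, n, sampling_cost, population=10000):
            """ENGS = population * EVSI - cost of collecting n further observations."""
            post_var = 1.0 / (1.0 / prior_sd**2 + n / obs_sd**2)   # posterior variance of mean INB
            sd_prepost = np.sqrt(prior_sd**2 - post_var)           # sd of the preposterior mean
            z = abs(prior_mean) / sd_prepost
            evsi = sd_prepost * (norm.pdf(z) - z * norm.cdf(-z))   # unit normal loss function
            return population * evsi - sampling_cost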

  9. Safety factor profile control in a tokamak

    CERN Document Server

    Bribiesca Argomedo, Federico; Prieur, Christophe

    2014-01-01

    Control of the Safety Factor Profile in a Tokamak uses Lyapunov techniques to address a challenging problem for which even the simplest physically relevant models are represented by nonlinear, time-dependent, partial differential equations (PDEs). This is because of the  spatiotemporal dynamics of transport phenomena (magnetic flux, heat, densities, etc.) in the anisotropic plasma medium. Robustness considerations are ubiquitous in the analysis and control design since direct measurements on the magnetic flux are impossible (its estimation relies on virtual sensors) and large uncertainties remain in the coupling between the plasma particles and the radio-frequency waves (distributed inputs). The Brief begins with a presentation of the reference dynamical model and continues by developing a Lyapunov function for the discretized system (in a polytopic linear-parameter-varying formulation). The limitations of this finite-dimensional approach motivate new developments in the infinite-dimensional framework. The t...

  10. Research and Development from the bottom up

    DEFF Research Database (Denmark)

    Brem, Alexander; Wolfram, P.

    2014-01-01

    is introduced consisting of three core dimensions: sophistication, sustainability, and emerging market orientation. On the basis of these dimensions, analogies and distinctions between the terms are identified and general tendencies are explored, such as the increasing importance of sustainability in social and ecological contexts or the growing interest of developed market firms in approaches from emerging markets. Hence, the presented framework supports further research into new paradigms for research and development (R&D) in developed market firms (DMFs), particularly in relation to emerging markets. This framework enables scholars to compare concepts from developed and emerging markets, to address studies specifically by using consistent terms, and to advance research into the concepts according to their characterization.

  11. Horizontal Symmetry: Bottom Up and Top Down

    CERN Document Server

    Lam, C S

    2011-01-01

    A group-theoretical connection between horizontal symmetry $G$ and fermion mixing is established, and applied to neutrino mixing. The group-theoretical approach is consistent with a dynamical theory based on $U(1)\times G$, but the dynamical theory can be used to pick out the most stable mixing that purely group-theoretical considerations cannot. A symmetry common to leptons and quarks is also discussed. This higher symmetry picks $A_4$ over $S_4$ to be the preferred symmetry for leptons.

  12. Bottom Up Project Cost and Risk Modeling

    Data.gov (United States)

    National Aeronautics and Space Administration — Microcosm along with its partners HRP Systems, End-to-End Analytics, and ARES Corporation (unfunded in Phase I), propose to develop a new solution for detailed data...

  13. Milk bottom-up proteomics: method optimisation.

    Directory of Open Access Journals (Sweden)

    Delphine eVincent

    2016-01-01

    Full Text Available Milk is a complex fluid whose proteome displays a diverse set of proteins of high abundance such as caseins and medium to low abundance whey proteins such as ß-lactoglobulin, lactoferrin, immunoglobulins, glycoproteins, peptide hormones and enzymes. A sample preparation method that enables high reproducibility and throughput is key in reliably identifying proteins present or proteins responding to conditions such as a diet, health or genetics. Using skim milk samples from Jersey and Holstein-Friesian cows, we compared three extraction procedures which have not previously been applied to samples of cows’ milk. Method A (urea involved a simple dilution of the milk in a urea-based buffer, method B (TCA/acetone involved a trichloroacetic acid (TCA/acetone precipitation and method C (methanol/chloroform involved a tri-phasic partition method in chloroform/methanol solution. Protein assays, SDS-PAGE profiling, and trypsin digestion followed by nanoHPLC-electrospray ionisation-tandem mass spectrometry (nLC-ESI-MS/MS analyses were performed to assess their efficiency. Replicates were used at each analytical step (extraction, digestion, injection to assess reproducibility. Mass spectrometry (MS data are available via ProteomeXchange with identifier PXD002529. Overall 186 unique accessions, major and minor proteins, were identified with a combination of methods. Method C (methanol/chloroform yielded the best resolved SDS-patterns and highest protein recovery rates, method A (urea yielded the greatest number of accessions, and, of the three procedures, method B (TCA/acetone was the least compatible of all with a wide range of downstream analytical procedures. Our results also highlighted breed differences between the proteins in milk of Jersey and Holstein-Friesian cows.

  14. Mobile Handsets from the Bottom Up

    DEFF Research Database (Denmark)

    Wallis, Cara; Linchuan Qiu, Jack; Ling, Richard

    2013-01-01

    The setting could be a hole-in-the-wall that serves as a shop in a narrow alley in Guangzhou, a cart on a dusty street on the outskirts of Accra, a bustling marketplace in Mexico City, or a tiny storefront near downtown Los Angeles’ garment district. At such locales, men and women hawk an array o...

  15. Bottom Up Succession Planning Works Better.

    Science.gov (United States)

    Stevens, Paul

    Most succession planning practices are based on the premise that ambitious people have and want only one career direction--upwardly mobile. However, employees have 10 career direction options at any stage of their working lives. A minority want the career action requiring promotion. Employers with a comprehensive career planning support program…

  16. Teaching Listening Comprehension: Bottom-Up Approach

    Science.gov (United States)

    Khuziakhmetov, Anvar N.; Porchesku, Galina V.

    2016-01-01

    Improving listening comprehension skills is one of the urgent contemporary educational problems in the field of second language acquisition. Understanding how L2 listening comprehension works can have a serious influence on language pedagogy. The aim of the paper is to discuss the practical and methodological value of the notion of the perception…

  17. Bottom-up Experiments and Concrete Utopias

    DEFF Research Database (Denmark)

    Andersson, Lasse

    2010-01-01

    The article examines how user-driven experiments can challenge the standardised, business-oriented version of the Experience City and, through experimentation, stimulate locally anchored and democratic versions of an experience- and knowledge-based city.

  18. Convergência brasileira aos padrões internacionais de contabilidade pública vis-à-vis as estratégias top-down e bottom-up

    Directory of Open Access Journals (Sweden)

    Janyluce Rezende Gama

    2014-02-01

    Full Text Available Brazil is in the process of converging its public-sector accounting to the international standards developed by the International Federation of Accountants (IFAC). The implementation of accounting information systems is generally carried out through top-down or bottom-up approaches. This study therefore aims to: (1) identify the approach adopted by the Brazilian federal government; (2) describe the implementation model of the public accounting information system in Brazil; and (3) map the information flows and the actors involved in the convergence process. A qualitative approach was adopted, using documentary research and content analysis of publicly available documents. It was found that Brazil uses a middle-up-down approach, which favors interaction among multiple actors in the process, unlike the top-down approach that follows the published international model.

  19. Bottom-up electrochemical preparation of solid-state carbon nanodots directly from nitriles/ionic liquids using carbon-free electrodes and the applications in specific ferric ion detection and cell imaging

    Science.gov (United States)

    Niu, Fushuang; Xu, Yuanhong; Liu, Mengli; Sun, Jing; Guo, Pengran; Liu, Jingquan

    2016-03-01

    Carbon nanodots (C-dots), a promising alternative to conventional semiconductor quantum dots, have attracted considerable attention in various applications including bio-chemical sensing and cell imaging, owing to their chemical inertness, low toxicity and flexible functionalization. Various methods, including electrochemical (EC) methods, have been reported for the synthesis of C-dots; however, complex procedures and/or carbon-source-containing electrodes are often required. Herein, solid-state C-dots were simply prepared by bottom-up EC carbonization of nitriles (e.g. acetonitrile) in the presence of an ionic liquid [e.g. 1-butyl-3-methylimidazolium hexafluorophosphate (BMIMPF6)], using carbon-free electrodes. Owing to the positive charges of BMIM+ on the C-dots, the final products deposited as a precipitate on the cathode, and the unreacted nitriles and BMIMPF6 could be easily removed by simple vacuum filtration. The as-prepared solid-state C-dots can be well dispersed in aqueous media and exhibit excellent photoluminescence properties. The average size of the C-dots was 3.02 +/- 0.12 nm, as evidenced by transmission electron microscopy. Other techniques, such as UV-vis spectroscopy, fluorescence spectroscopy, X-ray photoelectron spectroscopy and atomic force microscopy, were applied to characterize the C-dots and to analyze their possible formation mechanism. These C-dots have been successfully applied in efficient cell imaging and specific ferric ion detection.

  20. A novel bottom-up process to produce thymopentin nanoparticles and their formulation optimization%胸腺五肽纳米粒的制备及其处方优化

    Institute of Scientific and Technical Information of China (English)

    单紫筠; 谭银合; 杨志文; 余思琴; 陈宝; 吴传斌

    2012-01-01

    Objective: To develop a bottom-up process for preparing thymopentin (TP-5) nanoparticles and to optimize the formulation, as a basis for a pressurized metered dose inhaler (pMDI) of TP-5. Methods: A solution of TP-5, lecithin and lactose in a tert-butyl alcohol (TBA)/water co-solvent system was freeze-dried to generate nanoparticles, and residual lecithin was washed off the lyophilizate by suspension in isopropanol and centrifugation. Formulation parameters, namely the water content of the TBA/water co-solvent, the lecithin content in the organic phase and the TP-5 content in water, were optimized with the central composite design-response surface methodology. Because the retained TP-5 content in the nanoparticles did not vary significantly with these parameters, only the particle size and size distribution of the TP-5 nanoparticles were taken as response parameters. Results: The optimal ratios of water to TBA, lecithin to TBA and TP-5 to water were 0.5 (mL:mL), 213.5 (mg:mL) and 17.0 (mg:mL), respectively, corresponding to 1.5 mL of water, 640.57 mg of lecithin, 25.57 mg of TP-5 and 3.0 mL of TBA. Nanoparticles prepared with this formulation had a particle size of about 150 nm, a polydispersity index below 0.1 and a drug content above 98%. Conclusion: The method produces TP-5 nanoparticles of good quality with good reproducibility, is simple to perform, and shows good prospects for application.
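
    The reported optimal amounts and ratios are internally consistent and can be cross-checked with a few lines of arithmetic; a minimal sketch using only the quantities quoted in the abstract:

        # Cross-check of the reported optimal formulation (values from the abstract).
        water_ml, tba_ml = 1.5, 3.0          # water and tert-butyl alcohol (TBA) volumes
        lecithin_mg, tp5_mg = 640.57, 25.57  # lecithin and thymopentin (TP-5) masses

        print(round(water_ml / tba_ml, 2))     # 0.5   (water : TBA, mL:mL)
        print(round(lecithin_mg / tba_ml, 1))  # 213.5 (lecithin : TBA, mg:mL)
        print(round(tp5_mg / water_ml, 1))     # 17.0  (TP-5 : water, mg:mL)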

  1. Integrating top-down and bottom-up approaches to design a cost-effective and equitable programme of measures for adaptation of a river basin to global change

    Science.gov (United States)

    Girard, Corentin; Rinaudo, Jean-Daniel; Pulido-Velazquez, Manuel

    2016-04-01

    Adaptation to the multiple facets of global change challenges the conventional means of sustainably planning and managing water resources at the river basin scale. Numerous demand or supply management options are available, from which adaptation measures need to be selected in a context of high uncertainty of future conditions. Given the interdependency of water users, agreements need to be found at the local level to implement the most effective adaptation measures. Therefore, this work develops an approach combining economics and water resources engineering to select a cost-effective programme of adaptation measures in the context of climate change uncertainty, and to define an equitable allocation of the cost of the adaptation plan between the stakeholders involved. A framework is developed to integrate inputs from the two main approaches commonly used to plan for adaptation. The first, referred to as "top-down", consists of a modelling chain going from global greenhouse gases emission scenarios to local hydrological models used to assess the impact of climate change on water resources. Conversely, the second approach, called "bottom-up", starts from assessing vulnerability at the local level to then identify adaptation measures used to face an uncertain future. The methodological framework presented in this contribution relies on a combination of these two approaches to support the selection of adaptation measures at the local level. Outcomes from these two approaches are integrated to select a cost-effective combination of adaptation measures through a least-cost optimization model developed at the river basin scale. The performances of a programme of measures are assessed under different climate projections to identify cost-effective and least-regret adaptation measures. The issue of allocating the cost of the adaptation plan is considered through two complementary perspectives. The outcome of a negotiation process between the stakeholders is modelled through
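
    As an illustration of the least-cost selection step described above, the sketch below picks the cheapest combination of measures meeting a water-saving target under one climate projection. All measure names, costs, savings and the target are invented for illustration and do not come from the study:

        from itertools import combinations

        # Hypothetical adaptation measures: (name, annualised cost, water saved).
        measures = [("drip irrigation", 4.0, 10.0),
                    ("leak repair",     2.5,  6.0),
                    ("reuse plant",     6.0, 14.0),
                    ("new reservoir",   9.0, 20.0)]
        target_saving = 24.0  # required saving under a given climate projection

        best = None
        for r in range(1, len(measures) + 1):
            for combo in combinations(measures, r):
                cost = sum(c for _, c, _ in combo)
                saving = sum(s for _, _, s in combo)
                if saving >= target_saving and (best is None or cost < best[0]):
                    best = (cost, [name for name, _, _ in combo])

        print(best)  # (10.0, ['drip irrigation', 'reuse plant']) for these numbers

    Repeating the selection under several climate projections and keeping the measures that appear in most solutions is one simple way to identify the "least-regret" measures mentioned in the abstract.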

  2. The Formation of Sustainable Urban Communities: A Bottom-up Perspective%可持续城市社区的形成:一个自下而上的视角

    Institute of Scientific and Technical Information of China (English)

    张慧; 莫嘉诗; 王斯福

    2015-01-01

    Based on fieldwork in four communities in Kunming and taking a bottom-up perspective, the paper discusses residents' social relationships, public trust, sense of belonging, dispute resolution and social integration in the formation of sustainable urban communities. By examining how policy is implemented at the grassroots level, it considers the issue of sustainability in both community management and urban planning. The research forms part of the EU-funded project "UrbaChina: Sustainable Urbanization in China", whose aim is to reflect, from a holistic perspective, on the impact rapid urbanisation has on residents' way of life.

  3. Power factor control system for ac induction motors

    Science.gov (United States)

    Nola, F. J. (Inventor)

    1981-01-01

    A power control circuit for an induction motor is disclosed in which a servo loop is used to control power input by controlling the power factor of motor operation. The power factor is measured by summing the voltage and current derived square wave signals.
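
    As a rough illustration of that measurement principle (not the patented circuit itself), the sketch below turns sampled voltage and current waveforms into square waves and uses the fraction of time they disagree to estimate the phase angle, and hence the power factor; the waveforms, frequency and phase lag are all assumed values:

        import numpy as np

        f, fs, phi = 50.0, 100_000.0, np.deg2rad(30)  # line frequency, sample rate, assumed lag
        t = np.arange(0, 0.2, 1 / fs)                 # 0.2 s of samples
        v_sq = np.sign(np.sin(2 * np.pi * f * t))        # voltage-derived square wave
        i_sq = np.sign(np.sin(2 * np.pi * f * t - phi))  # current-derived square wave

        # The two square waves disagree for a fraction phi/pi of each cycle.
        phase_est = np.mean(v_sq != i_sq) * np.pi
        print(np.cos(phase_est))                      # ~0.87, the estimated power factor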

  4. Underlying Factors for Practicality of the Production Control Systems

    DEFF Research Database (Denmark)

    Arica, Emrah; Strandhagen, Jan Ola; Hvolby, Hans-Henrik

    2012-01-01

    This paper gives indications to important factors that must be considered for effectiveness of the production control systems under uncertainty. Five key factors have been identified by the literature study. Production schedule generation and execution approach under uncertainty, information...

  5. A top-down / bottom-up approach for multi-actors and multi-criteria assessment of mining projects for sustainable development. Application on Arlit Uranium mines (Niger); Une demarche Top-Down / Bottom-Up pour l'evaluation en termes multicriteres et multi-acteurs des projets miniers dans l'optique du developpement durable. Application sur les mines d'Uranium d'Arlit (Niger)

    Energy Technology Data Exchange (ETDEWEB)

    Chamaret, A

    2007-06-15

    This thesis aims to appraise the relevance of using a hybrid top-down / bottom-up approach to evaluate mining projects from the perspective of sustainable development. With the advent of the concepts of corporate social responsibility and sustainable development, new social expectations have emerged towards companies that go beyond the sole requirement of profitability. If companies do not respond to these expectations, they risk losing their social legitimacy. Traditionally associated with social, environmental, economic and political impacts and risks, mining activity is particularly concerned by these new issues. Whereas mineral resource needs have never been so high, mining companies are now expected to limit their negative effects and to take into account the expectations of their different audiences in order to define, together, the terms of their social license to operate. Considering the diversity of issues, scales, actors and contexts, the challenge is real and requires tools to better understand the issues and to structure dialogue. Based on the case study of the Arlit uranium mines (Niger), this work shows that combining participatory approaches with structuring tools and propositions from the literature is an efficient way to organize the diversity of issues and to build a structured dialogue between mining companies and their stakeholders. The first part presents the theoretical, institutional and sectoral contexts of the thesis. The second part presents the work and results of the evaluation carried out in Niger. The third part draws the conclusions from this work and presents a proposal for an evaluation framework, potentially applicable to other mining sites. (author)

  6. Human Factors in Air Traffic Control

    Science.gov (United States)

    1982-04-01

    leaves him in a central role but provides some automated facilities to help him to do his job. The concepts therefore have different emotional...which engender favourable attitudes towards them. Considering the amount of effort which goes into marketing commercial products, astonishingly...in air traffic control environments. Safety standards are clear and generally enforced, and the equipment currently marketed does not have emissions

  7. Temperature Dependence of Factors Controlling Isoprene Emissions

    Science.gov (United States)

    Duncan, Bryan N.; Yoshida, Yasuko; Damon, Megan R.; Douglass, Anne R.; Witte, Jacquelyn C.

    2009-01-01

    We investigated the relationship of variability in the formaldehyde (HCHO) columns measured by the Aura Ozone Monitoring Instrument (OMI) to isoprene emissions in the southeastern United States for 2005-2007. The data show that the inferred, regional-average isoprene emissions varied by about 22% during summer and are well correlated with temperature, which is known to influence emissions. Part of the correlation with temperature is likely associated with other causal factors that are temperature-dependent. We show that the variations in HCHO are convolved with the temperature dependence of surface ozone, which influences isoprene emissions, and the dependence of the HCHO column to mixed layer height as OMI's sensitivity to HCHO increases with altitude. Furthermore, we show that while there is an association of drought with the variation in HCHO, drought in the southeastern U.S. is convolved with temperature.

  8. Risk factors for caries - control and prevention

    Directory of Open Access Journals (Sweden)

    Melida Hasanagić

    2008-08-01

    Full Text Available Objectives. To investigate a prevalence of caries, filled permanent and extracted permanent teeth, as well as caries risk factors in school children aged 7, 9 and 11. Methods. The survey included 800 children (296 children aged 7; 254 children aged 9 and 250 children aged 11) from the Mostar Municipality, 400 of them living in both rural and urban areas. A dental mirror and standard light of dental chair were used for examination. The DMF index (Dental Caries, Missing Teeth and Filled Teeth) was determined, as well as failure in keeping teeth hygiene, sugar intake with food, and incidence of oral cavity infection. Results. The dental state of permanent teeth in children aged 7 and 9 has shown significant difference between the children from rural and urban areas (p < 0.001). Out of 2,698 and 2,790 permanent teeth in children aged 11 from rural and urban areas, 1,086 (40.25%) and 884 (31.68%) had caries, respectively (p < 0.01). The difference between these groups of children has been found in relation to the index of oral hygiene too (p < 0.05). Conclusion. An identification of risk groups for getting caries was very important and could help health and social structures to maintain their programs in order to improve oral health.

  9. Chalcopyrite leaching: The rate controlling factors

    Science.gov (United States)

    Li, J.; Kawashima, N.; Kaplun, K.; Absolon, V. J.; Gerson, A. R.

    2010-05-01

    The processes that determine the rate of chalcopyrite leaching are central to understanding how chalcopyrite (CuFeS2) behaves under the environmentally adverse conditions of acid rock drainage. To this end the effect of the acid anion on chalcopyrite leach rates using a variety of acidic media (H2SO4, HClO4, HCl and H2SO4 with 0.25 M NaCl) under carefully controlled solution conditions (pH 1 and 2, Eh 750 mV (SHE) and 75 °C) has been examined. These conditions have been chosen to enable sufficient leach rates for accurate experimental determination and to compare to the previous mechanistic analysis carried out by Harmer et al. (2006). Extensive surface analysis of leach residues demonstrated that variations in the surface speciation could not be responsible for the observed variations in leach rate. The rate of Cu release, however, was found to be first order with respect to Fe3+ activity and inversely proportional to H+ activity to the power of 0.7: (1/S)(dC/dt) = (2.0 ± 0.2) a_Fe3+ / a_H+^0.7, where S is the relative surface area, C is the concentration of Cu in the solution (M), t is the time (h), 2.0 is the rate constant (M^0.7 h^-1), and a_Fe3+ and a_H+ are the Fe3+ and H+ activities, respectively (M). The rate model was further validated against additional leaches carried out in H2SO4 media with the initial addition of Fe3+ (8 mM as Fe2(SO4)3) at 75 °C under various pH and Eh regimes. The only condition under which this rate model was found not to hold was at simultaneously low a_Fe3+ and high a_H+, that is at pH 1 and a_Fe3+ < 5×10^-5 M, where the concentration of dissolved O2 may be leach rate determining.
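
    As a quick illustration of how the fitted rate law behaves, the sketch below evaluates it for a few hypothetical activity values; the rate constant (2.0 M^0.7 h^-1) is taken from the abstract, while the activities are made-up inputs rather than data from the study:

        def cu_release_rate(a_fe3, a_h, rel_surface_area=1.0, k=2.0):
            """dC/dt (M/h) from the fitted law (1/S) dC/dt = k * a_Fe3+ / a_H+**0.7,
            with k in M**0.7 h**-1 and S the relative surface area."""
            return rel_surface_area * k * a_fe3 / a_h ** 0.7

        # Hypothetical activities (M): pH 1 -> a_H+ ~ 0.1, pH 2 -> a_H+ ~ 0.01.
        for a_fe3, a_h in [(1e-3, 0.1), (1e-3, 0.01), (1e-4, 0.1)]:
            print(a_fe3, a_h, cu_release_rate(a_fe3, a_h))

    Halving the H+ activity raises the rate by a factor of about 2^0.7 (roughly 1.6), while the rate scales linearly with the Fe3+ activity, matching the reported reaction orders.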

  10. Revisiting factors controlling methane emissions from high-Arctic tundra

    DEFF Research Database (Denmark)

    Mastepanov, M.; Sigsgaard, Charlotte; Tagesson, Håkan Torbern;

    2013-01-01

    controlling methane emission, i.e. temperature and water table position. Late in the growing season CH4 emissions were found to be very similar between the study years (except the extremely dry 2010) despite large differences in climatic factors (temperature and water table). Late-season bursts of CH4...... short-term control factors (temperature and water table). Our findings suggest the importance of multiyear studies with a continued focus on shoulder seasons in Arctic ecosystems....

  11. COMPLIANCE AS FACTORING BUSINESS RISK MANAGEMENT: CONTROL ASPECTS

    Directory of Open Access Journals (Sweden)

    V.K. Makarovych

    2016-03-01

    Full Text Available The uncertainty of current economic conditions and the lack of theoretical knowledge among domestic scholars about risk in the factoring business make research on the methodology and techniques of risk management in factoring companies timely. The article examines compliance, a technology that is innovative for the Ukrainian market of factoring risk management. Compliance is defined as the risk management process directed at voluntary conformity with national and international legislation, with the ethical standards accepted in the regulated field of legal relations, and with the customs of business practice, in order to sustain the necessary norms and standards of market behaviour and to strengthen the image of a factoring company. Compliance risks should be understood as the risks of lost profit or losses caused by conflicts of interest and by the non-conformity of employees' actions with internal and external regulatory documents. Attention is paid to control over compliance. The author singles out three kinds of compliance control, institutional, operational, and control over the observance of the professional ethics of business conduct, which are necessary for efficient management of factoring business risks. The paper describes how compliance control in the factoring business can be organized (through the development of internal regulatory documents and a compliance programme, the establishment of a compliance control subdivision, and the monitoring of risks) so that the management of a factoring company can choose appropriate methods for managing the risks of its business. The development of new, and the improvement of existing, forms of organizing compliance control helps satisfy the information needs of users and the requests of the factoring company's risk management department. The suggestions proposed create the grounds for the transformation and improvement of factoring

  12. PMBLDC motor drive with power factor correction controller

    DEFF Research Database (Denmark)

    George, G.J.; Ramachandran, Rakesh; Arun, N.

    2012-01-01

    This paper presents a boost converter configuration, control scheme and design of single phase power factor controller for permanent magnet brushless DC motor (PMBLDCM) drive. PMBLDC motors are the latest choice of researchers, due to the high efficiency, silent operation, compact size, high reli....... Simulations are done using MATLAB/ SIMULINK software. © 2012 IEEE....

  13. Confirmatory Factor Analysis of the Work Locus of Control Scale

    Science.gov (United States)

    Oliver, Joseph E.; Jose, Paul E.; Brough, Paula

    2006-01-01

    Original formulations of the Work Locus of Control Scale (WLCS) proposed a unidimensional structure of this measure; however, more recently, evidence for a two-dimensional structure has been reported, with separate subscales for internal and external loci of control. The current study evaluates the one- and two-factor models with confirmatory…

  14. Human factors survey of advanced instrumentation and controls

    Energy Technology Data Exchange (ETDEWEB)

    Carter, R.J.

    1989-01-01

    A survey oriented towards identifying the human factors issues in regard to the use of advanced instrumentation and controls (I&C) in the nuclear industry was conducted. A number of United States (US) and Canadian nuclear vendors and utilities were participants in the survey. Human factors items, subsumed under the categories of computer-generated displays (CGD), controls, organizational support, training, and related topics, were discussed. The survey found the industry to be concerned about the human factors issues related to the implementation of advanced I&C. Fifteen potential human factors problems were identified. They include: the need for an advanced I&C guideline equivalent to NUREG-0700; a role change in the control room from operator to supervisor; information overload; adequacy of existing training technology for advanced I&C; and operator acceptance and trust. 11 refs., 1 tab.

  15. Soft Controls: Technical Basis and Human Factors Review Guidance

    Science.gov (United States)

    2000-03-01

    NUREG/CR-6635, BNL-NUREG-52565: Soft Controls: Technical Basis and Human Factors Review Guidance (U.S. Nuclear Regulatory Commission, Washington, DC).

  16. Aircraft Loss of Control Causal Factors and Mitigation Challenges

    Science.gov (United States)

    Jacobson, Steven R.

    2010-01-01

    Loss of control is the leading cause of jet fatalities worldwide. Aside from their frequency of occurrence, accidents resulting from loss of aircraft control seize the public s attention by yielding a large number of fatalities in a single event. In response to the rising threat to aviation safety, the NASA Aviation Safety Program has conducted a study of the loss of control problem. This study gathered four types of information pertaining to loss of control accidents: (1) statistical data; (2) individual accident reports that cite loss of control as a contributing factor; (3) previous meta-analyses of loss of control accidents; and (4) inputs solicited from aircraft manufacturers, air carriers, researchers, and other industry stakeholders. Using these information resources, the study team identified the causal factors that were cited in the greatest number of loss of control accidents, and which were emphasized most by industry stakeholders. This report describes the study approach, the key causal factors for aircraft loss of control, and recommended mitigation strategies to make near-term impacts, mid-term impacts, and Next Generation Air Transportation System impacts on the loss of control accident statistics

  17. Transcription Factor Zbtb20 Controls Regional Specification of Mammalian Archicortex

    DEFF Research Database (Denmark)

    Rosenthal, Eva Helga

    2010-01-01

    Combinatorial expression of sets of transcription factors (TFs) along the mammalian cortex controls its subdivision into functional areas. Unlike neocortex, only few recent data suggest genetic mechanisms controlling the regionalization of the archicortex. TF Emx2 plays a crucial role in patterning...... later on becoming restricted exclusively to postmitotic neurons of hippocampus (Hi) proper, dentate gyrus (DG), and two transitory zones, subiculum (S) and retrosplenial cortex (Rsp). Analysis of Zbtb20-/- mice revealed altered cortical patterning at the border between neocortex and archicortex...

  18. Teleoperator hand controllers: A contextual human factors assessment

    Energy Technology Data Exchange (ETDEWEB)

    Draper, J.V.

    1994-05-01

    This document provides a human factors assessment of controllers for use with remotely controlled manipulators deployed to remove hazardous waste from underground storage tanks. The analysis concentrates on controller technique (i.e., the broad class of hand controller) and not on details of controller ergonomics. Examples of controller techniques include, for example, direct rate control, resolved unilateral position control, and direct bilateral position control. Using an existing concept, the Tank Waste Retrieval Manipulator System, as a reference, two basic types of manipulators may be identified for this application. A long reach, gross-positioning manipulator (LRM) may be used to position a smaller manipulator or an end-effector within a work site. For a Long Reach Manipulator, which will have an enormous motion range and be capable of high end-effector velocity, it will be safest and most efficient to use a resolved rate control system. A smaller, dexterous manipulator may be used to perform handling work within a relatively small work site, (i.e., to complete tasks requiring near-human dexterity). For a Dexterous Manipulator, which will have a smaller motion range than the LRM and be required to perform more difficult tasks, a resolved bilateral position control system will be safest and most efficient. However, during some waste recovery tasks it may be important to support the users by restricting movements to a single plane or axis. This can be done with a resolved bilateral position control system by (1) using the master controller force output to restrict controller inputs or (2) switching the controller to a multiaxis rate control mode and using the force output to provide a spring return to center functionality.

  19. Potential risk factors for diabetic neuropathy: a case control study

    Directory of Open Access Journals (Sweden)

    Nooraei Mahdi

    2005-12-01

    Full Text Available Abstract Background Diabetes mellitus type II afflicts at least 2 million people in Iran. Neuropathy is one of the most common complications of diabetes and lowers the patient's quality of life. Since neuropathy often leads to ulceration and amputation, we have tried to elucidate the factors that can affect its progression. Methods In this case-control study, 110 diabetic patients were selected from the Shariati Hospital diabetes clinic. Michigan Neuropathic Diabetic Scoring (MNDS) was used to differentiate cases from controls. The diagnosis of neuropathy was confirmed by nerve conduction studies (nerve conduction velocity and electromyography). The multiple factors compared between the two groups included consumption of angiotensin converting enzyme inhibitors (ACEI), blood pressure, serum lipid level, sex, smoking, method of diabetes control and its quality. Results Statistically significant relationships were found between neuropathy and age, gender, quality of diabetes control and duration of disease (P values in the order: 0.04, 0.04, Conclusion In this study, hyperglycemia was the only modifiable risk factor for diabetic neuropathy. Glycemic control reduces the incidence of neuropathy, slows its progression and improves the diabetic patient's quality of life. More attention must be paid to elderly male diabetic patients with poor diabetes control with regard to regular foot examinations and more practical education.

  20. Dominant factors in controlling marine gas pools in South China

    Institute of Scientific and Technical Information of China (English)

    XU Sihuang; W.Lynn Watney

    2007-01-01

    In marine strata from Sinian to Middle Triassic in South China, there develop four sets of regional and six sets of local source rocks, and ten sets of reservoir rocks. The occurrence of four main formation periods in association with five main reconstruction periods, results in a secondary origin for the most marine gas pools in South China. To improve the understanding of marine gas pools in South China with severely deformed geological background, the dominant control factors are discussed in this paper. The fluid sources, including the gas cracked from crude oil, the gas dissolved in water, the gas of inorganic origin, hydrocarbons generated during the second phase, and the mixed pool fluid source, were the most significant control factors of the types and the development stage of pools. The period of the pool formation and the reconstruction controlled the pool evolution and the distribution on a regional scale. Owing to the multiple periods of the pool formation and the reconstruction, the distribution of marine gas pools was complex both in space and in time, and the gas in the pools is heterogeneous. Pool elements, such as preservation conditions, traps and migration paths, and reservoir rocks and facies, also served as important control factors to marine gas pools in South China. Especially, the preservation conditions played a key role in maintaining marine oil and gas accumulations on a regional or local scale. According to several dominant control factors of a pool, the pool-controlling model can be constructed. As an example, the pool-controlling model of Sinian gas pool in Weiyuan gas field in Sichuan basin was summed up.

  1. Dominant factors in controlling marine gas pools in South China

    Science.gov (United States)

    Xu, S.; Watney, W.L.

    2007-01-01

    In marine strata from Sinian to Middle Triassic in South China, there develop four sets of regional and six sets of local source rocks, and ten sets of reservoir rocks. The occurrence of four main formation periods in association with five main reconstruction periods, results in a secondary origin for the most marine gas pools in South China. To improve the understanding of marine gas pools in South China with severely deformed geological background, the dominant control factors are discussed in this paper. The fluid sources, including the gas cracked from crude oil, the gas dissolved in water, the gas of inorganic origin, hydrocarbons generated during the second phase, and the mixed pool fluid source, were the most significant control factors of the types and the development stage of pools. The period of the pool formation and the reconstruction controlled the pool evolution and the distribution on a regional scale. Owing to the multiple periods of the pool formation and the reconstruction, the distribution of marine gas pools was complex both in space and in time, and the gas in the pools is heterogeneous. Pool elements, such as preservation conditions, traps and migration paths, and reservoir rocks and facies, also served as important control factors to marine gas pools in South China. Especially, the preservation conditions played a key role in maintaining marine oil and gas accumulations on a regional or local scale. According to several dominant control factors of a pool, the pool-controlling model can be constructed. As an example, the pool-controlling model of Sinian gas pool in Weiyuan gas field in Sichuan basin was summed up. ?? Higher Education Press and Springer-Verlag 2007.

  2. Behavioural factors related to metabolic control in patients with phenylketonuria

    NARCIS (Netherlands)

    Crone, MR; van Spronsen, FJ; Oudshoorn, K; Bekhof, J; van Rijn, G; Verkerk, PH

    2005-01-01

    Background. The objective of this study was to determine the importance of parental factors possibly related to dietary control in early and continuously treated patients with phenylketonuria (PKU). Methods. A questionnaire was disseminated among parents of 238 patients with PKU born after the natio

  3. Physiology and Endocrinology Symposium. Factors controlling puberty in beef heifers

    Science.gov (United States)

    The Physiology and Endocrinology Symposium on “Factors controlling puberty in beef heifers” was held at the joint annual meeting of the American Dairy Science Association and the American Society of Animal Science in New Orleans, Louisiana, USA, July 10 to 14, 2011. The objective of the symposium w...

  4. Responses of Lens esculenta Moench to controlled environmental factors

    NARCIS (Netherlands)

    Saint-Clair, P.M.

    1972-01-01

    Many experiments were undertaken to study the responses of the lentil cultivars 'Large blonde' and 'Anicia' to controlled environmental factors. They covered different aspects of the physiology and the ecology of the crop. The orientation experiments (2) involved germination and depth of sowing. The

  5. Simple Expressions for Safety Factors in Inventory Control

    NARCIS (Netherlands)

    Strijbosch, L.W.G.; Moors, J.J.A.

    1999-01-01

    The literature on inventory control discusses many methods to establish the level of decision parameters -like reorder levels or safety factors-, necessary to attain a prescribed service level. In general, however, these methods are not easy applicable: they often use time-consuming iterations, requ

  6. Designing Simulation Experiments with Controllable and Uncontrollable Factors

    DEFF Research Database (Denmark)

    Dehlendorff, Christian; Kulahci, Murat; Andersen, Klaus Kaae

    In this study we propose a new method for designing computer experiments inspired by the split plot designs used in physical experimentation. The basic layout is that each set of controllable factor settings corresponds to a whole plot for which a number of subplots, each corresponding to one...... combination of settings of the uncontrollable factors, is employed. The caveat is a desire that the subplots within each whole plot cover the design space uniformly. A further desire is that in the combined design, where all experimental runs are considered at once, the uniformity of the design space coverage...... should be guaranteed. Our proposed method allows for a large number of uncontrollable and controllable settings to be run in a limited number of runs while uniformly covering the design space for the uncontrollable factors....

  7. Simplifying the audit of risk factor recording and control

    DEFF Research Database (Denmark)

    Zhao, Min; Cooney, Marie Therese; Klipstein-Grobusch, Kerstin

    2016-01-01

    BACKGROUND: To simplify the assessment of the recording and control of coronary heart disease risk factors in different countries and regions. DESIGN: The SUrvey of Risk Factors (SURF) is an international clinical audit. METHODS: Data on consecutive patients with established coronary heart disease...... from countries in Europe, Asia and the Middle East were collected on a one-page collection sheet or electronically during routine clinic visits. Information on demographics, diagnostic category, risk factors, physical and laboratory measurements, and medications were included and key variables...... summarized in a Cardiovascular Health Index Score. RESULTS: Coronary heart disease patients (N = 10,186; 29% women) were enrolled from 79 centres in 11 countries. Recording of risk factors varied considerably: smoking was recorded in over 98% of subjects, while about 20% lacked data on laboratory...

  8. Risk factors of Cancer Prostate A case control study.

    Science.gov (United States)

    Kamel, Nahed M; Tayel, Eiman S; El Abbady, Ahmed A; Khashab, Sahar S

    2006-01-01

    The purpose of this study is to reveal the different risk factors related to this cancer, particularly as there is no agreement about which factors affect the risk. A hospital-based, exploratory ("fishing expedition") case-control study was carried out. Cases and controls were identified from the Urology Department of Alexandria Main University Hospital, 2004. All cases diagnosed as having the tumor were included in the case series. For each case, the second subject proven to have a negative pathological examination was included in the control group (50). Data collection was carried out blindly using a structured interview schedule. Analysis used the Chi-square test, Fisher's exact test and Student's t-test. Odds ratios and 95% confidence intervals were calculated. Results indicated that regular consumption of sausages was greater among cases than controls (X2 = 10.19, p = 0.001), with an odds ratio of 5.92 (CI: 1.69-25.99). More cases than controls also reported regular consumption of butter and natural ghee (X2 = 5.47, p = 0.019); the estimated risk was as high as 2.79 (CI: 1.07-7.33). Regular consumption of vegetables, however, was more common among controls than cases (X2 = 5.005, p = 0.025), where the odds ratio was 0.19 (CI: 0.02-1.01). Moreover, the multiple regression analysis confirmed the results obtained from the univariate analysis. The consistency of the current results regarding sausages and butter with several other research works supports the identification of these specific possible risk factors; other researchers have also pointed out the protective effect of vegetables. However, further research is needed to address other risk factors.
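
    For readers wanting to reproduce this kind of estimate, the sketch below computes an odds ratio and its approximate 95% confidence interval (Woolf's method) from a 2x2 exposure table; the counts are invented for illustration, since the study's raw tables are not given in the abstract:

        import math

        # Hypothetical 2x2 table: exposure (e.g. regular sausage consumption) vs. status.
        a, b = 20, 30   # exposed cases, unexposed cases
        c, d = 5, 45    # exposed controls, unexposed controls

        odds_ratio = (a * d) / (b * c)
        se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
        lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
        hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
        print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")  # OR = 6.00 for these counts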

  9. Factores de necesidad asociados al uso adecuado del control prenatal

    Directory of Open Access Journals (Sweden)

    Clarybel Miranda Mellado

    2016-07-01

    Full Text Available Introduction: Given the social implications of maternal mortality and morbidity, it is important to determine the need factors that influence adequate use of prenatal care among pregnant women in Sincelejo. Materials and Methods: Analytical cross-sectional study of 730 pregnant women, selected by random cluster sampling from the city's communes. Information was collected with a sociodemographic survey, a prenatal care use record and a questionnaire assessing the need factors proposed by Nola Pender's Health Promotion Model. The pregnant women were contacted at home and completed the instruments. Data were analysed with descriptive and inferential statistics to determine associations between variables. Results: 97.7% (713) of the women attended prenatal care, with a median of 4 prenatal visits. 2.3% (17) had not started care at the time of the survey and 24.4% (178) made inadequate use of it. 80.7% (589) of the women rated their health as good or very good, and 94.8% (692) perceived benefits of prenatal care. A significant association was found between perceived benefits and adequate use of prenatal care [OR = 5.5 (95% CI: 2.8 - 10.8)]. Discussion and Conclusions: Women's perception of the good outcomes that attending prenatal care brings is the main factor that can explain adherence to care and regular attendance at visits. How to cite this article: Miranda C, Castillo IY. Factores de necesidad asociados al uso adecuado del control prenatal. Rev Cuid. 2016; 7(2): 1345-51. http://dx.doi.org/10.15649/cuidarte.v7i2.340

  10. Controlling for gene expression changes in transcription factor protein networks.

    Science.gov (United States)

    Banks, Charles A S; Lee, Zachary T; Boanca, Gina; Lakshminarasimhan, Mahadevan; Groppe, Brad D; Wen, Zhihui; Hattem, Gaye L; Seidel, Chris W; Florens, Laurence; Washburn, Michael P

    2014-06-01

    The development of affinity purification technologies combined with mass spectrometric analysis of purified protein mixtures has been used both to identify new protein-protein interactions and to define the subunit composition of protein complexes. Transcription factor protein interactions, however, have not been systematically analyzed using these approaches. Here, we investigated whether ectopic expression of an affinity tagged transcription factor as bait in affinity purification mass spectrometry experiments perturbs gene expression in cells, resulting in the false positive identification of bait-associated proteins when typical experimental controls are used. Using quantitative proteomics and RNA sequencing, we determined that the increase in the abundance of a set of proteins caused by overexpression of the transcription factor RelA is not sufficient for these proteins to then co-purify non-specifically and be misidentified as bait-associated proteins. Therefore, typical controls should be sufficient, and a number of different baits can be compared with a common set of controls. This is of practical interest when identifying bait interactors from a large number of different baits. As expected, we found several known RelA interactors enriched in our RelA purifications (NFκB1, NFκB2, Rel, RelB, IκBα, IκBβ, and IκBε). We also found several proteins not previously described in association with RelA, including the small mitochondrial chaperone Tim13. Using a variety of biochemical approaches, we further investigated the nature of the association between Tim13 and NFκB family transcription factors. This work therefore provides a conceptual and experimental framework for analyzing transcription factor protein interactions.

  11. Risk factors for episiotomy: a case-control study

    Directory of Open Access Journals (Sweden)

    Giordana Campos Braga

    2014-10-01

    Full Text Available Objective: obtaining information on the factors associated with episiotomy will be useful in sensitizing professionals to the need to minimize its incidence. Therefore, the objective of this study was to evaluate risk factors for episiotomy in pregnant women who had undergone vaginal delivery at a university maternity hospital in northeastern Brazil. Methods: a case-control study was conducted with pregnant women submitted to episiotomy (cases) and pregnant women not submitted to episiotomy (controls) between March 2009 and July 2010 at the Professor Fernando Figueira Integral Medicine Institute (IMIP) in Recife, Brazil, in a ratio of 1 case to 2 controls. The study variables consisted of: whether episiotomy was performed, demographic, obstetric and fetal characteristics (primiparity, analgesia, instrumental delivery, fetal distress, etc.), external factors (day and time of delivery, professional attending delivery) and factors directly related to delivery. Odds ratios (OR) and 95% confidence intervals (95%CI) were calculated. Multivariate analysis was performed to determine the adjusted risk of episiotomy. Results: a total of 522 women (173 cases and 349 controls) were included. It was found that deliveries with episiotomy were more likely to have been attended by staff physicians (OR = 1.88; 95%CI: 1.01 - 3.48), to have required forceps (OR = 12.31; 95%CI: 4.9 - 30.1) and to have occurred in primiparas (OR = 4.24; 95%CI: 2.61 - 6.89). The likelihood of a nurse having attended the delivery with episiotomy was significantly lower (OR = 0.29; 95%CI: 0.16 - 0.55). Conclusion: episiotomy was found to be strongly associated with deliveries attended by staff physicians, with primiparity, and with instrumental delivery, and was less common in deliveries attended by nurses.

  12. Factors Controlling the Distribution of Trace Metals in Macroalgae

    Institute of Scientific and Technical Information of China (English)

    王宝利; 刘丛强

    2004-01-01

    This paper presents the concentrations of trace metals (Cr, Mn, Fe, Co, Ni, Cu, Zn, Cd, Pb) in macroalgae from five areas. Significant differences were noticed in trace metal concentration in macroalgae, and a large range of variations between the minimum and maximum concentrations of trace metals was found. Trace metals detected in macroalgae generally occur in adsorbed and absorbed forms. Environmental and biological factors jointly control the trace metal compositions and concentrations in macroalgae. The complexity and variation of these factors cause significant differences in trace metal concentrations in macroalgae. Environmental factors play a more important role in controlling trace metal compositions and concentrations when external available trace metals are beyond requirement for algal metabolism and growth, especially for non-essential trace metals; however, when the external available trace metals just satisfy the needs of algal metabolism and growth, biological factors would play a more important role, especially for essential trace metals. Interactions among the trace metals can also influence their compositions and concentrations in macroalgae. It is also discussed how to make macroalgae as an excellent biomonitor for trace metals.

  13. Understanding disease control: influence of epidemiological and economic factors

    CERN Document Server

    Oles, Katarzyna; Kleczkowski, Adam

    2011-01-01

    We present a local spread model of disease transmission on a regular network and compare different control options, ranging from treating the whole population to local control in a well-defined neighborhood of an infectious individual. The comparison is based on the total cost of the epidemic, including the cost of palliative treatment of ill individuals and the preventive cost of vaccination or culling of susceptible individuals. The disease is characterized by a pre-symptomatic phase which makes detection and control difficult. Three general strategies emerge: global preventive treatment, local treatment within a neighborhood of a certain size, and palliative treatment only with no prevention. The choice between the strategies depends on the relative costs of palliative and preventive treatment. The details of the local strategy, and in particular the size of the optimal treatment neighborhood, depend weakly on disease infectivity but strongly on other epidemiological factors. The required extent of prevention is proportiona...
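
    The strategy comparison described above reduces to a cost trade-off; a toy sketch of that bookkeeping, with entirely hypothetical case counts and unit costs:

        # Toy comparison of control strategies by total epidemic cost (hypothetical numbers).
        cost_palliative, cost_preventive = 10.0, 1.0  # per-individual costs

        strategies = {
            # name: (expected number of ill individuals, number preventively treated)
            "no prevention":    (500, 0),
            "local treatment":  (80, 300),     # treat a neighbourhood around each case
            "global treatment": (20, 10_000),  # vaccinate/cull the whole population
        }

        for name, (n_ill, n_treated) in strategies.items():
            total = n_ill * cost_palliative + n_treated * cost_preventive
            print(name, total)

    With these numbers the local strategy is cheapest; increasing the preventive cost shifts the optimum towards no prevention, mirroring the dependence on relative costs noted in the abstract.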

  14. Cooperative research for human factors review of advanced control rooms

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Jung Woon; Park, Jae Chang; Lee, Yong Hee; Oh, In Seok; Lee, Hyun Chul

    2000-12-01

    This project has been performed as cooperative research between KAERI and USNRC. Human factors issues related to soft controls, which is one of key features of advanced HSI, are identified in this project. The issues are analyzed for the evaluation approaches in either experimental or analytical ways. Also, issues requiring additional researches for the evaluation of advanced HSI are identified in the areas of advanced information systems design, computer-based procedure systems, soft controls, human systems interface and plant modernization process, and maintainability of digital systems. The issues are analyzed to discriminate the urgency of researches on it to high, medium, and low levels in consideration of advanced HSI development status in Korea, and some of the issues that can be handled by experimental researches are identified. Additionally, an experimental study is performed to compare operator's performance on human error detection in advanced control rooms vs. in conventional control rooms. It is found that advanced control rooms have several design characteristics hindering operator's error detection performance compared to conventional control rooms.

  15. Gelatin methacrylate microspheres for controlled growth factor release.

    Science.gov (United States)

    Nguyen, Anh H; McKinney, Jay; Miller, Tobias; Bongiorno, Tom; McDevitt, Todd C

    2015-02-01

    Gelatin has been commonly used as a delivery vehicle for various biomolecules for tissue engineering and regenerative medicine applications due to its simple fabrication methods, inherent electrostatic binding properties, and proteolytic degradability. Compared to traditional chemical cross-linking methods, such as the use of glutaraldehyde (GA), methacrylate modification of gelatin offers an alternative method to better control the extent of hydrogel cross-linking. Here we examined the physical properties and growth factor delivery of gelatin methacrylate (GMA) microparticles (MPs) formulated with a wide range of different cross-linking densities (15-90%). Less methacrylated MPs had decreased elastic moduli and larger mesh sizes compared to GA MPs, with increasing methacrylation correlating to greater moduli and smaller mesh sizes. As expected, an inverse correlation between microparticle cross-linking density and degradation was observed, with the lowest cross-linked GMA MPs degrading at the fastest rate, comparable to GA MPs. Interestingly, GMA MPs at lower cross-linking densities could be loaded with up to a 10-fold higher relative amount of growth factor than conventional GA cross-linked MPs, despite the GA MPs having an order of magnitude greater gelatin content. Moreover, a reduced GMA cross-linking density resulted in more complete release of bone morphogenic protein 4 and basic fibroblast growth factor and accelerated release rate with collagenase treatment. These studies demonstrate that GMA MPs provide a more flexible platform for growth factor delivery by enhancing the relative binding capacity and permitting proteolytic degradation tunability, thereby offering a more potent controlled release system for growth factor delivery.

  16. Biodiversity mediates top-down control in eelgrass ecosystems: a global comparative-experimental approach.

    Science.gov (United States)

    Duffy, J Emmett; Reynolds, Pamela L; Boström, Christoffer; Coyer, James A; Cusson, Mathieu; Donadi, Serena; Douglass, James G; Eklöf, Johan S; Engelen, Aschwin H; Eriksson, Britas Klemens; Fredriksen, Stein; Gamfeldt, Lars; Gustafsson, Camilla; Hoarau, Galice; Hori, Masakazu; Hovel, Kevin; Iken, Katrin; Lefcheck, Jonathan S; Moksnes, Per-Olav; Nakaoka, Masahiro; O'Connor, Mary I; Olsen, Jeanine L; Richardson, J Paul; Ruesink, Jennifer L; Sotka, Erik E; Thormar, Jonas; Whalen, Matthew A; Stachowicz, John J

    2015-07-01

    Nutrient pollution and reduced grazing each can stimulate algal blooms as shown by numerous experiments. But because experiments rarely incorporate natural variation in environmental factors and biodiversity, conditions determining the relative strength of bottom-up and top-down forcing remain unresolved. We factorially added nutrients and reduced grazing at 15 sites across the range of the marine foundation species eelgrass (Zostera marina) to quantify how top-down and bottom-up control interact with natural gradients in biodiversity and environmental forcing. Experiments confirmed modest top-down control of algae, whereas fertilisation had no general effect. Unexpectedly, grazer and algal biomass were better predicted by cross-site variation in grazer and eelgrass diversity than by global environmental gradients. Moreover, these large-scale patterns corresponded strikingly with prior small-scale experiments. Our results link global and local evidence that biodiversity and top-down control strongly influence functioning of threatened seagrass ecosystems, and suggest that biodiversity is comparably important to global change stressors.

  17. Based on Multi-Factors Grey Prediction Control for Elevator Velocity Modulation

    OpenAIRE

    2012-01-01

    This paper uses double-factor grey prediction together with a fuzzy controller for elevator car speed control. Double-factor grey control is introduced to predict car vibration for the elevator speed during operation. Simulation results show that the multi-factor grey prediction fuzzy PI control of the elevator velocity modulation system follows the actual operation more closely than a simple grey fuzzy PI elevator speed control system. The control effect of double-factor grey fuzzy PI contro...
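
    For context, the grey prediction element typically refers to a GM(1,1)-type model fitted to a short data series; a minimal single-factor sketch under that assumption is given below (the vibration series is made up, and the paper's double-factor variant layers a second input series on top of this basic idea):

        import numpy as np

        def gm11_predict(x0, steps=1):
            """Fit a GM(1,1) grey model to a short positive series x0 and forecast ahead."""
            x0 = np.asarray(x0, dtype=float)
            x1 = np.cumsum(x0)                                 # accumulated series
            z1 = 0.5 * (x1[1:] + x1[:-1])                      # mean sequence of x1
            B = np.column_stack((-z1, np.ones_like(z1)))
            a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]   # developing coefficient, grey input
            k = np.arange(len(x0) + steps)
            x1_hat = (x0[0] - b / a) * np.exp(-a * k) + b / a  # time-response of x1
            x0_hat = np.diff(x1_hat, prepend=x1_hat[0])        # back to the original scale
            x0_hat[0] = x0[0]
            return x0_hat[len(x0):]                            # forecast values only

        # Hypothetical short car-vibration series; forecast the next value.
        print(gm11_predict([2.1, 2.3, 2.6, 3.0, 3.4], steps=1))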

  18. Risk Factors For Ectopic Pregnancy : A Case Control Study

    Directory of Open Access Journals (Sweden)

    Deshmukh J.S

    1999-01-01

    Full Text Available Research question: What are the risk factors for ectopic pregnancy? Objective: To study the strength of association between hypothesised risk factors and ectopic pregnancy. Study design: Unmatched case-control study. Setting: Government Medical College, Hospital, Nagpur. Participants: 133 cases of ectopic pregnancy and an equal number of controls (non-pregnant women admitted to the study hospital). Study variables: Pelvic inflammatory disease, sexually transmitted diseases, IUD use at conception, past use of IUD, prior ectopic pregnancy, OC pill use at the time of conception, past use of OC pills, induced abortion, spontaneous abortion, infertility, and pelvic and abdominal surgery. Statistical analysis: Odds ratios and their 95% CI, Pearson's chi-square test, unconditional logistic regression analysis and population attributable risk proportion. Results: Use of IUD at conception, prior ectopic pregnancy, pelvic inflammatory disease, sexually transmitted diseases, infertility, OC pill use at the time of conception, past use of IUD and induced abortion were found to be significantly associated with ectopic pregnancy. Conclusion: Identification of these risk factors for ectopic pregnancy will help in early detection and appropriate management in individual cases, and may help in devising a comprehensive preventive strategy for ectopic pregnancy

  19. Social and cultural factors in the successful control of tuberculosis.

    Science.gov (United States)

    Rubel, A J; Garro, L C

    1992-01-01

    The burden of tuberculosis on the public health is staggering. Worldwide, annual incidence of new cases is estimated to be about 8 million. Almost 3 million deaths occur yearly. Early case identification and adherence to treatment regimens are the remaining barriers to successful control. In many nations, however, fewer than half those with active disease receive a diagnosis, and fewer than half those beginning treatment complete it. The twin problems of delay in seeking treatment and abandonment of a prescribed regimen derive from complex factors. People's confusion as to the implications of the tuberculosis symptoms, costs of transportation to clinic services, the social stigma that attaches to tuberculosis, the high cost of medication, organizational problems in providing adequate followup services, and patients' perception of clinic facilities as inhospitable all contribute to the complexity. Sociocultural factors are emphasized in this report because hitherto they have not been adequately explored. Salient among those sociocultural factors is the health culture of the patients. That is, the understanding and information people have from family, friends, and neighbors as to the nature of a health problem, its cause, and its implications. A knowledge of the health culture of their patients has become a critical tool if tuberculosis control programs are to be successful. Several anthropological procedures are recommended to help uncover the health culture of people served by tuberculosis clinics.

  20. Shoreline as a controlling factor in commercial shrimp production

    Science.gov (United States)

    Faller, K. H. (Principal Investigator)

    1979-01-01

    The author has identified the following significant results. An ecological model was developed that relates marsh detritus export and shrimp production. It was based on the hypothesis that the shoreline is a controlling factor in the production of shrimp through regulation of detritus export from the marsh. LANDSAT data were used to develop measurement of shoreline length and areas of marsh having more than 5.0 kilometers of shoreline per square kilometer of area for the Louisiana coast, demonstrating the capability of remote sensing to provide important geographic information. These factors were combined with published tidal ranges and salinities to develop a mathematical model that predicted shrimp production for nine geographic units of the Louisiana coast, as indicated by the long term average commercial shrimp yield.

  1. Revisiting factors controlling methane emissions from high-Arctic tundra

    DEFF Research Database (Denmark)

    Mastepanov, M.; Sigsgaard, C.; Tagesson, T.;

    2013-01-01

    with measurements made outside the growing season, are underrepresented in the literature. Here we present results of 5 yr (2006-2010) of automatic chamber measurements at a high-Arctic location in Zackenberg, NE Greenland, covering both the growing seasons and two months of the following freeze-in periods...... explained by high seasonality of both variables, and weakly correlated with the water table. The greatest variability in fluxes between the study years was observed during the first part of the growing season. Somewhat surprisingly, this variability could not be explained by commonly known factors...... controlling methane emission, i.e. temperature and water table position. Late in the growing season CH4 emissions were found to be very similar between the study years (except the extremely dry 2010) despite large differences in climatic factors (temperature and water table). Late-season bursts of CH4...

  2. Frequency control system based on power factor control of asynchronous motor

    Institute of Scientific and Technical Information of China (English)

    MENG Qing-chun; YANG Fei-xia; REN Zhi-ling

    2005-01-01

    The relationship between the power factor (PF) and the angular frequency is deduced from the simplified equivalent circuit of the asynchronous motor, forming a power factor auto-control system. An anti-interference circuit is also introduced in the intermediate voltage link of the inverter to avoid the shift of the optimum PF point caused by load changes, ensuring reliable operation of the control system. The experimental results show good self-adaptation over the whole speed-adjustment range and an obvious saving of energy when the motor runs under load.

  3. Biological Stability of Drinking Water: Controlling Factors, Methods, and Challenges

    KAUST Repository

    Prest, Emmanuelle I.

    2016-02-01

    Biological stability of drinking water refers to the concept of providing consumers with drinking water of same microbial quality at the tap as produced at the water treatment facility. However, uncontrolled growth of bacteria can occur during distribution in water mains and premise plumbing, and can lead to hygienic (e.g., development of opportunistic pathogens), aesthetic (e.g., deterioration of taste, odor, color) or operational (e.g., fouling or biocorrosion of pipes) problems. Drinking water contains diverse microorganisms competing for limited available nutrients for growth. Bacterial growth and interactions are regulated by factors, such as (i) type and concentration of available organic and inorganic nutrients, (ii) type and concentration of residual disinfectant, (iii) presence of predators, such as protozoa and invertebrates, (iv) environmental conditions, such as water temperature, and (v) spatial location of microorganisms (bulk water, sediment, or biofilm). Water treatment and distribution conditions in water mains and premise plumbing affect each of these factors and shape bacterial community characteristics (abundance, composition, viability) in distribution systems. Improved understanding of bacterial interactions in distribution systems and of environmental conditions impact is needed for better control of bacterial communities during drinking water production and distribution. This article reviews (i) existing knowledge on biological stability controlling factors and (ii) how these factors are affected by drinking water production and distribution conditions. In addition, (iii) the concept of biological stability is discussed in light of experience with well-established and new analytical methods, enabling high throughput analysis and in-depth characterization of bacterial communities in drinking water. We discussed, how knowledge gained from novel techniques will improve design and monitoring of water treatment and distribution systems in order

  4. Biological stability of drinking water: controlling factors, methods and challenges

    Directory of Open Access Journals (Sweden)

    Emmanuelle ePrest

    2016-02-01

    Full Text Available Biological stability of drinking water refers to the concept of providing consumers with drinking water of same microbial quality at the tap as produced at the water treatment facility. However, uncontrolled growth of bacteria can occur during distribution in water mains and premise plumbing, and can lead to hygienic (e.g. development of opportunistic pathogens), aesthetic (e.g. deterioration of taste, odour, colour) or operational (e.g. fouling or biocorrosion of pipes) problems. Drinking water contains diverse microorganisms competing for limited available nutrients for growth. Bacterial growth and interactions are regulated by factors such as (i) type and concentration of available organic and inorganic nutrients, (ii) type and concentration of residual disinfectant, (iii) presence of predators such as protozoa and invertebrates, (iv) environmental conditions such as water temperature, and (v) spatial location of microorganisms (bulk water, sediment or biofilm). Water treatment and distribution conditions in water mains and premise plumbing affect each of these factors and shape bacterial community characteristics (abundance, composition, viability) in distribution systems. Improved understanding of bacterial interactions in distribution systems and of environmental conditions impact is needed for better control of bacterial communities during drinking water production and distribution. This article reviews (i) existing knowledge on biological stability controlling factors and (ii) how these factors are affected by drinking water production and distribution conditions. In addition, (iii) the concept of biological stability is discussed in light of experience with well-established and new analytical methods, enabling high throughput analysis and in-depth characterization of bacterial communities in drinking water. We discuss how knowledge gained from novel techniques will improve design and monitoring of water treatment and distribution systems in order to

  5. Biological Stability of Drinking Water: Controlling Factors, Methods, and Challenges

    Science.gov (United States)

    Prest, Emmanuelle I.; Hammes, Frederik; van Loosdrecht, Mark C. M.; Vrouwenvelder, Johannes S.

    2016-01-01

    Biological stability of drinking water refers to the concept of providing consumers with drinking water of the same microbial quality at the tap as produced at the water treatment facility. However, uncontrolled growth of bacteria can occur during distribution in water mains and premise plumbing, and can lead to hygienic (e.g., development of opportunistic pathogens), aesthetic (e.g., deterioration of taste, odor, color) or operational (e.g., fouling or biocorrosion of pipes) problems. Drinking water contains diverse microorganisms competing for limited available nutrients for growth. Bacterial growth and interactions are regulated by factors, such as (i) type and concentration of available organic and inorganic nutrients, (ii) type and concentration of residual disinfectant, (iii) presence of predators, such as protozoa and invertebrates, (iv) environmental conditions, such as water temperature, and (v) spatial location of microorganisms (bulk water, sediment, or biofilm). Water treatment and distribution conditions in water mains and premise plumbing affect each of these factors and shape bacterial community characteristics (abundance, composition, viability) in distribution systems. Improved understanding of bacterial interactions in distribution systems and of the impact of environmental conditions is needed for better control of bacterial communities during drinking water production and distribution. This article reviews (i) existing knowledge on the factors controlling biological stability and (ii) how these factors are affected by drinking water production and distribution conditions. In addition, (iii) the concept of biological stability is discussed in light of experience with well-established and new analytical methods, enabling high-throughput analysis and in-depth characterization of bacterial communities in drinking water. We discuss how knowledge gained from novel techniques will improve the design and monitoring of water treatment and distribution systems in order

  6. Biological Stability of Drinking Water: Controlling Factors, Methods, and Challenges.

    Science.gov (United States)

    Prest, Emmanuelle I; Hammes, Frederik; van Loosdrecht, Mark C M; Vrouwenvelder, Johannes S

    2016-01-01

    Biological stability of drinking water refers to the concept of providing consumers with drinking water of the same microbial quality at the tap as produced at the water treatment facility. However, uncontrolled growth of bacteria can occur during distribution in water mains and premise plumbing, and can lead to hygienic (e.g., development of opportunistic pathogens), aesthetic (e.g., deterioration of taste, odor, color) or operational (e.g., fouling or biocorrosion of pipes) problems. Drinking water contains diverse microorganisms competing for limited available nutrients for growth. Bacterial growth and interactions are regulated by factors, such as (i) type and concentration of available organic and inorganic nutrients, (ii) type and concentration of residual disinfectant, (iii) presence of predators, such as protozoa and invertebrates, (iv) environmental conditions, such as water temperature, and (v) spatial location of microorganisms (bulk water, sediment, or biofilm). Water treatment and distribution conditions in water mains and premise plumbing affect each of these factors and shape bacterial community characteristics (abundance, composition, viability) in distribution systems. Improved understanding of bacterial interactions in distribution systems and of the impact of environmental conditions is needed for better control of bacterial communities during drinking water production and distribution. This article reviews (i) existing knowledge on the factors controlling biological stability and (ii) how these factors are affected by drinking water production and distribution conditions. In addition, (iii) the concept of biological stability is discussed in light of experience with well-established and new analytical methods, enabling high-throughput analysis and in-depth characterization of bacterial communities in drinking water. We discuss how knowledge gained from novel techniques will improve the design and monitoring of water treatment and distribution systems in order

  7. Multiple Sclerosis Associated Risk Factors: A Case-Control Study

    Directory of Open Access Journals (Sweden)

    Jalal POOROLAJAL

    2015-11-01

    Full Text Available Background: Hamadan Province is one of the high-risk regions in Iran for Multiple sclerosis (MS). A majority of the epidemiological studies conducted in Iran addressing MS are descriptive. This study was conducted to assess MS and its associated risk factors in Hamadan Province, in the west of Iran. Methods: This case-control study compared 100 patients with MS (case group) and 100 patients with acute infectious diseases (control group) from September 2013 to March 2014. A checklist was used to assess the demographic, medical, and family history of the patients. The Friedman-Rosenman questionnaire was also used to assess personality type. Statistical analysis was performed using a logistic regression model with the Stata 11 software program. Results: The adjusted odds ratio (OR) estimate of MS was 4.37 (95% CI: 2.33, 8.20) for females compared to males; 0.15 (95% CI: 0.06, 0.43) for people aged above 50 years compared to those aged 14 to 29 years; and 0.44 (95% CI: 0.21, 0.91) for overweight or obese people compared to people of normal weight. Crude ORs indicated a significant association between the occurrence of MS and exclusive breast feeding, season of birth, and smoking. However, the association was not statistically significant after adjustment for other covariates. Conclusion: The risk of MS is significantly lower in males, overweight/obese people, and older people. Furthermore, non-smoking, non-exclusive breast-feeding, and being born in autumn may increase the risk of MS but need further investigation. However, long-term large prospective cohort studies are needed to investigate the true effect of the potential risk factors on MS. Keywords: Multiple sclerosis, Risk factors, Case-control study, Iran
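    The adjusted odds ratios above come from a multivariable logistic regression (fitted in Stata 11 in the original study). A minimal sketch of the same kind of estimation, assuming hypothetical variable names and an input file rather than the study's actual data, could look like this in Python:

        # Adjusted odds ratios for a case-control design via logistic regression.
        # Column names and the CSV file are hypothetical; the original analysis used Stata 11.
        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        df = pd.read_csv("ms_case_control.csv")   # one row per participant; 'case' = 1 for MS, 0 for controls
        model = smf.logit("case ~ C(sex) + C(age_group) + C(bmi_group) + smoking", data=df).fit()

        odds_ratios = np.exp(model.params)      # adjusted ORs
        conf_int = np.exp(model.conf_int())     # 95% confidence intervals on the OR scale
        print(pd.concat([odds_ratios, conf_int], axis=1))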

  8. Three controllable factors of steady operation of EGSB reactor

    Institute of Scientific and Technical Information of China (English)

    LI Hui-li; LU Bing-nan; LI Fang

    2008-01-01

    The bench-scale EGSB (expanded granular sludge bed) reactor was operated to study the effects of sludge loading rate, pH value and nutrient elements on the operation of the EGSB reactor and the control rules for these factors. Continuous flow was used to treat synthetic wastewater containing dextrose and beer, and the temperature of the reactor was controlled at a mesophilic temperature (33 ℃). The experimental results demonstrated that the pH could be controlled by adding sodium bicarbonate; the proper additive quantity was 1000-1200 mg/L for wastewater with a COD concentration of 400-5000 mg/L. The COD removal efficiency was over 85%. The operation of the EGSB reactor was steady and the EGSB reactor had strong anti-shock load ability.

  9. Factors Controlling Nanoparticle Pharmacokinetics: An Integrated Analysis and Perspective

    DEFF Research Database (Denmark)

    Moghimi, Seyed Moien; Hunter, A.C.; Andresen, T.L.

    2012-01-01

    Intravenously injected nanoparticulate drug carriers provide a wide range of unique opportunities for site-specific targeting of therapeutic agents to many areas within the vasculature and beyond. Pharmacokinetics and biodistribution of these carriers are controlled by a complex array...... of interrelated core and interfacial physicochemical and biological factors. Pertinent to realizing therapeutic goals, definitive maps that establish the interdependency of nanoparticle size, shape, and surface characteristics in relation to interfacial forces, biodistribution, controlled drug release, excretion......, and adverse effects must be outlined. These concepts are critically evaluated and an integrated perspective is provided on the basis of the recent application of nanoscience approaches to nanocarrier design and engineering. The future of this exciting field is bright; some regulatory-approved products...

  10. Patient factors and glycaemic control--associations and explanatory power

    DEFF Research Database (Denmark)

    Rogvi, S; Tapager, I; Almdal, T P

    2012-01-01

    AIMS: To investigate the association between glycaemic control and patient socio-demographics, activation level, diabetes-related distress, assessment of care, knowledge of target HbA(1c), and self-management behaviours, and to determine to what extent these factors explain the variance in HbA(1c......) in a large Danish population of patients with Type 2 diabetes. METHODS: Cross-sectional survey and record review of 2045 patients from a specialist diabetes clinic. Validated scales measured patient activation, self-management behaviours, diabetes-related emotional distress, and perceived care...... and behaviour, specific treatment modalities and glycaemic control. Knowledge of treatment goals, achieving patient activation in coping with diabetes, and lowering disease-related emotional stress are important patient education goals. However, the large unexplained component of HbA(1c) variance highlights...

  11. Understanding disease control: influence of epidemiological and economic factors.

    Directory of Open Access Journals (Sweden)

    Katarzyna Oleś

    Full Text Available We present a model of disease transmission on a regular and a small-world network and compare different control options. The comparison is based on the total cost of the epidemic, including the cost of palliative treatment of ill individuals and the preventive cost aimed at vaccination or culling of susceptible individuals. The disease is characterized by a pre-symptomatic phase, which makes detection and control difficult. Three general strategies emerge: global preventive treatment, local treatment within a neighborhood of a certain size, and only palliative treatment with no prevention. While the choice between the strategies depends on the relative cost of palliative and preventive treatment, the details of the local strategy and, in particular, the size of the optimal treatment neighborhood depend on the epidemiological factors. The required extent of prevention is proportional to the size of the infection neighborhood, but depends on the time till detection and the time till treatment in a nonlinear (power-law) fashion. The optimal size of the control neighborhood is also highly sensitive to the relative cost, particularly for inefficient detection and control application. These results have important consequences for the design of prevention strategies aimed at emerging diseases for which parameters are not necessarily known in advance.
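    The strategy comparison rests on a simple total-cost criterion: the cost of palliative care for those who fall ill plus the cost of preventive treatment (vaccination or culling) of susceptibles. A toy sketch of that criterion, with purely illustrative numbers rather than the paper's parameters:

        # Toy comparison of epidemic-control strategies by total cost
        # (palliative treatment of infected + preventive treatment of susceptibles),
        # mirroring the cost structure described above; all figures are illustrative.
        def total_cost(n_infected, n_treated_preventively, cost_palliative, cost_preventive):
            return n_infected * cost_palliative + n_treated_preventively * cost_preventive

        # Hypothetical outcomes of three strategies for the same outbreak:
        strategies = {
            "no prevention":    total_cost(800, 0,    cost_palliative=10, cost_preventive=1),
            "local treatment":  total_cost(150, 2000, cost_palliative=10, cost_preventive=1),
            "global treatment": total_cost(20,  9800, cost_palliative=10, cost_preventive=1),
        }
        print(min(strategies, key=strategies.get), strategies)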

  12. Factors Controlling Sediment Denitrification Rates in Grassland and Forest Streams

    Directory of Open Access Journals (Sweden)

    Haryun Kim

    2014-01-01

    Full Text Available Sediment denitrification is an important nitrate (NO3-) removal process from agricultural streams. The direct and indirect factors that control denitrification rates in tributary sediments can vary depending on the types of agricultural activities and vegetation. Our research investigated (1) tributary sediment denitrification rates in a grassland stream affected by pasture ecosystems and a forest stream affected by N fertilization; and (2) the environmental factors that determine denitrification rates in tributary sediments. The denitrification enzyme activity (DEA) in grassland stream sediments is positively correlated with precipitation, likely due to the increased nutrient exchange rates between stream water and sediments, and was higher than in forest stream sediments, leading to a decrease in NO3- concentration ([NO3-]) in stream sediments. The DEA in riparian sediments was regulated by carbon concentrations and did not contribute to NO3- removal from the riparian sediment in grassland and forest streams. Thus, environmental factors affected by different types of agricultural activities and vegetation might regulate denitrification rates and [NO3-] in agricultural stream ecosystems.

  13. Controllability analysis of transcriptional regulatory networks reveals circular control patterns among transcription factors

    DEFF Research Database (Denmark)

    Österlund, Tobias; Bordel, Sergio; Nielsen, Jens

    2015-01-01

    Transcriptional regulation is the most committed type of regulation in living cells where transcription factors (TFs) control the expression of their target genes and TF expression is controlled by other TFs forming complex transcriptional regulatory networks that can be highly interconnected. Here...... we analyze the topology and organization of nine transcriptional regulatory networks for E. coli, yeast, mouse and human, and we evaluate how the structure of these networks influences two of their key properties, namely controllability and stability. We calculate the controllability for each network...... as a measure of the organization and interconnectivity of the network. We find that the number of driver nodes n(D) needed to control the whole network is 64% of the TFs in the E. coli transcriptional regulatory network in contrast to only 17% for the yeast network, 4% for the mouse network and 8...

  14. Ex-post evaluation of local energy efficiency and demand-side management operations - State of the art, bottom-up methods, applied examples and approach for the development of an evaluation practical culture; L'evaluation ex-post des operations locales de maitrise de la demande en energie - Etat de l'art, methodes bottom-up, exemples appliques et approche du developpement d'une culture pratique de l'evaluation

    Energy Technology Data Exchange (ETDEWEB)

    Broc, J.S

    2006-12-15

    Energy end-use Efficiency (EE) is a priority for energy policies to address resource exhaustion and to reduce pollutant emissions. At the same time, in France, the local level is increasingly involved in the implementation of EE activities, whose framework is changing (energy market liberalization, new policy instruments). The need for ex-post evaluation of local EE activities is thus increasing, both for regulatory requirements and to support a necessary change of scale. Our thesis focuses on the original issue of the ex-post evaluation of local EE operations in France. The state of the art, through the analysis of the American and European experiences and of the reference guidebooks, provides substantial methodological material and emphasises the key evaluation issues. Concurrently, local EE operations in France are characterized by an analysis of their environment and work on their segmentation criteria. The combination of these criteria with the key evaluation issues provides an analysis framework used as the basis for the composition of evaluation methods. This also highlights the specific evaluation needs for local operations. A methodology is then developed to complete and adapt the existing material to design evaluation methods for local operations, so that stakeholders can easily appropriate them. Evaluation results thus feed a know-how building process with experience feedback. These methods are to meet two main goals: to determine the operation results, and to detect the success/failure factors. The methodology was validated on concrete cases, where these objectives were reached. (author)
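    Bottom-up evaluation methods of this kind typically build an operation's result from engineering estimates per installed measure. A generic deemed-savings sketch, with illustrative figures and not the thesis's own method, is:

        # Generic bottom-up ("deemed savings") estimate of an energy-efficiency operation:
        # sum over measures of (units installed) x (baseline - efficient unit consumption),
        # discounted by a free-rider share. All numbers below are illustrative.
        measures = [
            # (measure, units installed, baseline kWh/yr per unit, efficient kWh/yr per unit, free-rider share)
            ("CFL lamps",         5000,    60,    15, 0.20),
            ("efficient boilers",  120, 24000, 19000, 0.10),
        ]
        total_kwh = 0.0
        for name, units, base, eff, free_riders in measures:
            gross = units * (base - eff)
            net = gross * (1.0 - free_riders)
            total_kwh += net
            print(f"{name}: gross {gross:.0f} kWh/yr, net {net:.0f} kWh/yr")
        print(f"Total net savings: {total_kwh:.0f} kWh/yr")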

  15. Risk factor control is key in diabetic nephropathy.

    Science.gov (United States)

    Lewis, Gareth; Maxwell, Alexander P

    2014-02-01

    Prolonged duration of diabetes, poor glycaemic control and hypertension are major risk factors for both diabetic nephropathy and cardiovascular disease. Optimising blood sugar control together with excellent control of blood pressure can reduce the risk of developing diabetic nephropathy. Diabetic nephropathy should be considered in any patient with diabetes when persistent albuminuria develops. Microalbuminuria is the earliest clinically detectable indicator of diabetic nephropathy risk. The majority of patients with diabetic nephropathy are appropriately diagnosed based on elevated urinary albumin excretion and/or reduced renal function. Patients with type 2 diabetes should have annual urinary ACR measurements from the time of diabetes diagnosis, while those with type 1 diabetes should commence five years after diagnosis. Blood pressure lowering to 130/80 mmHg and reduction of proteinuria slow the progression of diabetic nephropathy and reduce the number of cardiovascular events. Drugs that block the renin-angiotensin-aldosterone system (RAAS) are effective in reducing proteinuria, managing hypertension and reducing cardiovascular risk. Unless there are clear contraindications or intolerance, all patients with diabetic nephropathy should be prescribed an ACEI or ARB. Stopping an ACEI or ARB during intercurrent illness or times of volume depletion is critically important. Patients with diabetic nephropathy should have at least yearly measurements of blood pressure, renal function and urinary ACR.

  16. Spatial and temporal variations and controlling factors of sediment accumulation in the Yangtze River estuary and its adjacent sea area in the Holocene, especially in the Early Holocene

    Science.gov (United States)

    Feng, Zhibing; Liu, Baohua; Zhao, Yuexia; Li, Xishuang; Jiang, Li; Si, Shaokun

    2016-08-01

    The sub-bottom and collected borehole data provide insight into the transport and accumulation processes of the Yangtze-derived sediment in the study area since ~11 kyr BP. Five seismic units were identified according to six major acoustic surfaces. The sedimentary strata consist of fluvial, estuarine and deltaic systems from the bottom up, characterized by two different trends in sediment accumulation rates, i.e., low-high-low, and high-low-high. On the inner shelf of the East China Sea, the terrain with trough and ridge was formed by the Early Holocene transgression strata (formed in ~10 to 12 kyr BP) scoured by the later rectilinear tidal current due to postglacial sea-level transgression, and the sharply protruding seismic units are interpreted to be bedrocks outcropping on the seafloor. An analysis of the sedimentary characteristics in the boreholes and such factors as difference in accumulation rates, and tectonic subsidence led us to conclude that the paleo-coastline was located not far away from and to the east of Core ZK09 at ~9 kyr BP, and the southern bank of the Yangtze River estuary was located to the south of Core ZK09. At ~9 kyr BP, the Yangtze-derived sediments were transported eastwards along the southern bank of the Yangtze River and the barrier due to the influence of the paleo-coastal current from the north, the direction of the Yangtze-derived sediment transport was split on the northeast of the Zhoushan archipelago, and the sediments covered the terrain with trough and ridge. During the high sea level period (7 kyr BP-present), the eastward migration of paleo-coastline had resulted in the increase in accumulation rate. We also conclude that the sharp increase in accumulation rate near the Yangtze River estuary after ~2 kyr BP was not primarily caused by human activities. The position shifts of the estuary caused by the paleo-coastline migration and sea level oscillations since the Holocene is the main cause controlling the Yangtze

  17. “眼光向下”:科举民俗研究的价值、方法与目标%The Value,Approach and Objectives of Research on Folk Customs of the Imperial Examination:A Bottom-up Perspective

    Institute of Scientific and Technical Information of China (English)

    杜春燕

    2015-01-01

    As an important area in the studies of the imperial examination, research on folk customs of the Imperial Examination, featuring a bottom-up perspective, deals with the imperial examination system, its activities, customs and social influence, in order to reach a better understanding of the cultural characteristics and the value of the Imperial Examination. Its interdisciplinary nature and bottom-up approach entail that it has to draw on historical anthropology, sociology, folklore, education science, and linguistics. The study of folk customs of the Imperial Examination may broaden the academic vision, better exploit folk historical materials and enrich research results.%As an important direction of imperial examination studies, research on the folk customs of the imperial examination takes a bottom-up perspective to explore the examination system, activities, customs and social influence, so as to deepen understanding of the cultural characteristics and value of the imperial examination. Such research is interdisciplinary in nature and requires cross-disciplinary work. By drawing on the theories and methods of anthropology, sociology, folklore studies, education and linguistics, research on the folk customs of the imperial examination can broaden the academic horizon, unearth folk historical materials, and deepen and enrich the content of imperial examination studies.

  18. Factors Influencing Glycemic Control in Children with Type 1 Diabetes

    Directory of Open Access Journals (Sweden)

    Seher Çakır

    2010-05-01

    Full Text Available Introduction: There are plenty of factors influencing glycemic control in children with type 1 diabetes mellitus (DM). The aim of this study was to determine the factors influencing metabolic control in children with type 1 DM. Materials and Method: The study was performed in 200 children with type 1 DM between the ages of 6 months and 18 years. This study was conducted by individually interviewing the children and their families and completing the questionnaires related to their demographic features and data associated with their illness. The laboratory findings and medical information of the patients from the charts were also retrospectively recorded. Results: There were a total of 200 patients, including 104 (52%) girls and 96 (48%) boys. The mean age of the patients was 11.7 (±4.26) years. The mean duration of diabetes was 3.8 years (6 months to 14 years). Eighty-nine percent of all patients and all of the patients between 12 and 18 years of age were on intensive insulin therapy. Mean insulin dose was 0.84±0.19 units/kg/day. The mean HbA1c value was 8.8%. Body mass index (BMI) mean z-score was -0.06±1.19. There were no correlations between HbA1c and the duration of diabetes or age, although a positive correlation was found with insulin dose (r=0.27, p<0.01). It was found that intensive therapy did not lower HbA1c values or the risk of severe hypoglycemia. Nevertheless, there was a decrease in the HbA1c values of the 72 (36%) patients whose therapy was converted from conventional therapy to intensive therapy (p<0.05). HbA1c values were found to be higher in patients who lived with more than 4 persons in the house, who were non-compliant with follow-up or diet, who had more than 3 symptomatic hypoglycemic episodes in the last 6 months, who had episodes of diabetic ketoacidosis (DKA), who were adolescent at the time of diagnosis, and who were admitted with diabetic ketoacidosis at the time of diagnosis (p<0.05). Although there was a correlation between insulin doses and

  19. Risk factors for gastroenteritis: a nested case-control study.

    Science.gov (United States)

    Rodrigo, S; Sinclair, M; Wolfe, R; Leder, K

    2011-04-01

    This nested case-control study investigated the risk factors for gastroenteritis in a cohort using rainwater as their primary domestic water source. Consumption of beef [odds ratio (OR) 2·74, 95% confidence interval (CI) 1·56-4·80], handling of raw fresh chicken in the household (OR 1·52, 95% CI 1·02-2·29) and animal contact (OR 1·83, 95% CI 1·20-2·83) were found to be significant risk factors (P<0·05). Significant protective effects were observed with raw salad prepared at home (OR 0·33, 95% CI 0·18-0·58), consumption of salami (OR 0·60, 95% CI 0·36-0·98), and shellfish (OR 0·31, 95% CI 0·14-0·67). This study provides novel insight into community-based endemic gastroenteritis showing that consumption of beef was associated with increased odds of illness and with a population attributable fraction (PAF) of 57·6%. Detecting such a high PAF for beef in a non-outbreak setting was unexpected.
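    The population attributable fraction quoted for beef can be related to the odds ratio through Miettinen's case-based formula, PAF = p_c(OR - 1)/OR, where p_c is the exposure prevalence among cases. A small sketch, using the reported OR and a hypothetical p_c that is not taken from the study:

        # Miettinen's case-based formula for the population attributable fraction (PAF):
        # PAF = p_c * (OR - 1) / OR, with p_c the exposure prevalence among cases.
        # The OR below is from the abstract (beef, OR = 2.74); p_c = 0.91 is a hypothetical
        # value chosen only to show how a PAF near the reported 57.6% could arise.
        def paf(odds_ratio, exposed_fraction_among_cases):
            return exposed_fraction_among_cases * (odds_ratio - 1.0) / odds_ratio

        print(f"PAF = {paf(2.74, 0.91):.1%}")   # about 58% with these illustrative inputs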

  20. Transcription factor PIF4 controls the thermosensory activation of flowering

    KAUST Repository

    Kumar, S. Vinod

    2012-03-21

    Plant growth and development are strongly affected by small differences in temperature. Current climate change has already altered global plant phenology and distribution, and projected increases in temperature pose a significant challenge to agriculture. Despite the important role of temperature on plant development, the underlying pathways are unknown. It has previously been shown that thermal acceleration of flowering is dependent on the florigen, FLOWERING LOCUS T (FT). How this occurs is, however, not understood, because the major pathway known to upregulate FT, the photoperiod pathway, is not required for thermal acceleration of flowering. Here we demonstrate a direct mechanism by which increasing temperature causes the bHLH transcription factor PHYTOCHROME INTERACTING FACTOR4 (PIF4) to activate FT. Our findings provide a new understanding of how plants control their timing of reproduction in response to temperature. Flowering time is an important trait in crops as well as affecting the life cycles of pollinator species. A molecular understanding of how temperature affects flowering will be important for mitigating the effects of climate change. © 2012 Macmillan Publishers Limited. All rights reserved.

  1. Hacking Health: Bottom-up Innovation for Healthcare

    Directory of Open Access Journals (Sweden)

    Jeeshan Chowdhury

    2012-07-01

    Full Text Available Healthcare is not sustainable and still functions with outdated technology (e.g., pagers, paper records). Top-down approaches by governments and corporations have failed to deliver digital technologies to modernize healthcare. Disruptive innovation must come from the ground up by bridging the gap between front-line health experts and innovators in the latest web and mobile technology. Hacking Health is a hackathon that is focused on social innovation more than technical innovation. Our approach to improve healthcare is to pair technological innovators with healthcare experts to build realistic, human-centric solutions to front-line healthcare problems.

  2. Bottom-up Assembly of Engineered Protein Fibers

    Science.gov (United States)

    2015-02-15

    magnetite templating peptide, CMms6, was attached. Alkyne-functionalized CMms6 was attached to the AHA-bearing proteins through a copper-catalyzed click chemistry reaction and monitored molecular weight

  3. The nano revolution: bottom-up manufacturing with biomolecules

    Science.gov (United States)

    Li, Yi-Fen; Li, Jing; Paavola, Chad; Kagawa, Hiromi; Chan, Suzanne L.; Trent, Jonathan D.

    2007-05-01

    As the nano-scale becomes a focus for engineering electronic, photonic, medical, and other important devices, an unprecedented role for biomolecules is emerging to address one of the most formidable problems in nano-manufacturing: precise manipulation and organization of matter on the nano-scale. Biomolecules are a solution to this problem because they themselves are nanoscale particles with intrinsic properties that allow them to precisely self-assemble and self-organize into the amazing diversity of structures observed in nature. Indeed, there is ample evidence that the combination of molecular recognition and self-assembly combined with mutation, selection, and replication have the potential to create structures that could truly revolutionize manufacturing processes in many sectors of industry. Genetically engineered biomolecules are already being used to make the next generation of nano-scale templates, nano-detailed masks, and molecular scaffolds for the future manufacturing of electronic devices, medical diagnostic tools, and chemical engineering interfaces. Here we present an example of this type of technology by showing how a protein can be genetically modified to form a new structure and coated with metal to lead the way to producing "nano-wires," which may ultimately become the basis for self-assembled circuitry.

  4. QUALITY FUNCTION DEPLOYMENT IN BOTTOM UP PROCESS FOR DESIGN REUSE

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    To deal with a bottomup process model for design reuses a specific extended house of quality(EHOQ)is proposedTwo kinds of suppo rted functions,basic supported functions and new supported functions,are defined Two processes to determine two kinds of functions are presentedA kind of EHO Q matrix for a company is given and its management steps are studied

  5. Bottom up design of nanoparticles for anti-cancer diapeutics

    DEFF Research Database (Denmark)

    Needham, David; Arslanagic, Amina; Glud, Kasper

    2016-01-01

     nm particle would dissolve in less than a second! And so the nanoparticle design requires a highly water-insoluble drug, and a tight, encapsulating, impermeable lipid:cholesterol monolayer. While the "Y" junction can be used to mix an ethanolic solution with anti-solvent, we find that a "no...

  6. The Heating of the Solar Atmosphere: from the Bottom Up?

    Science.gov (United States)

    Winebarger, Amy

    2014-01-01

    The heating of the solar atmosphere remains a mystery. Over the past several decades, scientists have examined the observational properties of structures in the solar atmosphere, notably their temperature, density, lifetime, and geometry, to determine the location, frequency, and duration of heating. In this talk, I will review these observational results, focusing on the wealth of information stored in the light curves of structures in different spectral lines or channels available in the Solar Dynamics Observatory's Atmospheric Imaging Assembly, Hinode's X-ray Telescope and Extreme-ultraviolet Imaging Spectrometer, and the Interface Region Imaging Spectrograph. I will discuss some recent results from combined data sets that support the idea that the heating of the solar atmosphere may be dominated by low, near-constant heating events.

  7. Towards Bottom-Up Analysis of Social Food

    OpenAIRE

    Rich, Jaclyn; Haddadi, Hamed; Hospedales, Timothy M.

    2016-01-01

    Social media provide a wealth of information for research into public health by providing a rich mix of personal data, location, hashtags, and social network information. Among these, Instagram has been recently the subject of many computational social science studies. However despite Instagram's focus on image sharing, most studies have exclusively focused on the hashtag and social network structure. In this paper we perform the first large scale content analysis of Instagram posts, addressi...

  8. Political will for better health, a bottom-up process.

    Science.gov (United States)

    De Ceukelaire, Wim; De Vos, Pol; Criel, Bart

    2011-09-01

    Lately, different voices in the global public health community have drawn attention to the interaction between the State and civil society in the context of reducing health inequities. A rights-based approach empowers people not only to claim their rights but also to demand accountability from the State. Lessons from history show that economic growth does not automatically have positive implications for population health. It may even be disruptive in the absence of strong stewardship and regulation by national and local public health authorities. The field research in which we have been involved over the past 20 years in the Philippines, Palestine, Cuba, and Europe confirms that organized communities and people's organizations can effectively pressure the state into action towards realizing the right to health. Class analysis, influencing power relations, and giving the State a central role have been identified as three key strategies of relevant social movements and NGOs. More interaction between academia and civil society organizations could help to enhance and safeguard the societal relevance of public health research. Our own experience has made us discover that social movements and public health researchers have a lot to learn from one another.

  9. Glycan Node Analysis: A Bottom-up Approach to Glycomics.

    Science.gov (United States)

    Zaare, Sahba; Aguilar, Jesús S; Hu, Yueming; Ferdosi, Shadi; Borges, Chad R

    2016-01-01

    Synthesized in a non-template-driven process by enzymes called glycosyltransferases, glycans are key players in various significant intra- and extracellular events. Many pathological conditions, notably cancer, affect gene expression, which can in turn deregulate the relative abundance and activity levels of glycoside hydrolase and glycosyltransferase enzymes. Unique aberrant whole glycans resulting from deregulated glycosyltransferase(s) are often present in trace quantities within complex biofluids, making their detection difficult and sometimes stochastic. However, with proper sample preparation, one of the oldest forms of mass spectrometry (gas chromatography-mass spectrometry, GC-MS) can routinely detect the collection of branch-point and linkage-specific monosaccharides ("glycan nodes") present in complex biofluids. Complementary to traditional top-down glycomics techniques, the approach discussed herein involves the collection and condensation of each constituent glycan node in a sample into a single independent analytical signal, which provides detailed structural and quantitative information about changes to the glycome as a whole and reveals potentially deregulated glycosyltransferases. Improvements to the permethylation and subsequent liquid/liquid extraction stages provided herein enhance reproducibility and overall yield by facilitating minimal exposure of permethylated glycans to alkaline aqueous conditions. Modifications to the acetylation stage further increase the extent of reaction and overall yield. Despite their reproducibility, the overall yields of N-acetylhexosamine (HexNAc) partially permethylated alditol acetates (PMAAs) are shown to be inherently lower than their expected theoretical value relative to hexose PMAAs. Calculating the ratio of the area under the extracted ion chromatogram (XIC) for each individual hexose PMAA (or HexNAc PMAA) to the sum of such XIC areas for all hexoses (or HexNAcs) provides a new normalization method that facilitates relative quantification of individual glycan nodes in a sample. Although presently constrained in terms of its absolute limits of detection, this method expedites the analysis of clinical biofluids and shows considerable promise as a complementary approach to traditional top-down glycomics.
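    The normalization step described above reduces each glycan node to a ratio-to-sum value within its monosaccharide class (hexose areas divided by the summed hexose areas, HexNAc areas by the summed HexNAc areas). A minimal sketch, with illustrative node names and XIC areas rather than measured values:

        # Ratio-to-sum normalization of glycan-node signals: each hexose (or HexNAc) PMAA
        # XIC area is divided by the summed XIC areas of all hexoses (or all HexNAcs).
        # Node names and areas below are illustrative only.
        hexose_areas = {"2-linked Man": 1.8e6, "3-linked Gal": 9.5e5, "terminal Glc": 4.1e5}
        hexnac_areas = {"4-linked GlcNAc": 2.2e6, "terminal GlcNAc": 6.0e5}

        def normalize(areas):
            total = sum(areas.values())
            return {node: area / total for node, area in areas.items()}

        print(normalize(hexose_areas))
        print(normalize(hexnac_areas))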

  10. Quantitative bottom-up proteomics depends on digestion conditions.

    Science.gov (United States)

    Lowenthal, Mark S; Liang, Yuxue; Phinney, Karen W; Stein, Stephen E

    2014-01-07

    Accurate quantification is a fundamental requirement in the fields of proteomics and biomarker discovery, and for clinical diagnostic assays. To demonstrate the extent of quantitative variability in measurable peptide concentrations due to differences among "typical" protein digestion protocols, the model protein, human serum albumin (HSA), was subjected to enzymatic digestion using 12 different sample preparation methods, and separately, was examined through a comprehensive timecourse of trypsinolysis. A variety of digestion conditions were explored including differences in digestion time, denaturant, source of enzyme, sample cleanup, and denaturation temperature, among others. Timecourse experiments compared differences in relative peptide concentrations for tryptic digestions ranging from 15 min to 48 h. A predigested stable isotope-labeled ((15)N) form of the full-length (HSA) protein, expressed in yeast was spiked into all samples prior to LC-MS analysis to compare yields of numerous varieties of tryptic peptides. Relative quantification was achieved by normalization of integrated extracted ion chromatograms (XICs) using liquid chromatography-tandem mass spectrometry (LC-MS/MS) by multiple-reaction monitoring (MRM) on a triple quadrupole (QQQ) MS. Related peptide fragmentation transitions, and multiple peptide charge states, were monitored for validation of quantitative results. Results demonstrate that protein concentration was shown to be unequal to tryptic peptide concentrations for most peptides, including so-called "proteotypic" peptides. Peptide release during digestion displayed complex kinetics dependent on digestion conditions and, by inference, from denatured protein structure. Hydrolysis rates at tryptic cleavage sites were also shown to be affected by differences in nearest and next-nearest amino acid residues. The data suggesting nonstoichiometry of enzymatic protein digestions emphasizes the often overlooked difficulties for routine absolute protein quantification, and highlights the need for use of suitable internal standards and isotope dilution techniques.
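    The relative quantification described above normalizes each monitored peptide's unlabelled signal against the co-analysed (15)N-labelled internal standard. A minimal sketch, with illustrative peptide entries and XIC areas that are not taken from the study:

        # Relative quantification against a spiked, pre-digested 15N-labelled protein standard:
        # for each monitored peptide, the unlabelled ("light") XIC area is normalized by the
        # corresponding 15N ("heavy") area. Peptides and areas are illustrative only.
        peptides = {
            # peptide: (light XIC area, heavy 15N XIC area)
            "LVNEVTEFAK": (3.1e7, 2.9e7),
            "AEFAEVSK":   (1.2e7, 2.8e7),
            "QTALVELVK":  (2.4e7, 2.7e7),
        }
        for seq, (light, heavy) in peptides.items():
            print(f"{seq}: light/heavy = {light / heavy:.2f}")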

  11. Spintronics in the «Bottom-up» Approach

    Directory of Open Access Journals (Sweden)

    Yu.A. Kruglyak

    2014-11-01

    Full Text Available Basic topics of spintronics such as spin valve, interface resistance due to the mismatch of conduction modes, spin potentials, non-local spin voltage, spin moment and its transport, Landau-Lifshitz-Gilbert equation, and explanation on its basis why a magnet has an “easy axis”, nanomagnet dynamics by spin current, polarizers and analyzers of spin current, diffusion equation for ballistic transport and current in terms of non-equilibrium potentials are discussed in the frame of the “bottom-up” approach of modern nanoelectronics.
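    The abstract above invokes the Landau-Lifshitz-Gilbert equation for nanomagnet dynamics. For reference, its standard form (not specific to the cited lecture course) is

        \frac{d\hat{m}}{dt} = -\gamma\, \hat{m}\times\mathbf{H}_{\mathrm{eff}} + \alpha\, \hat{m}\times\frac{d\hat{m}}{dt}

    where \hat{m} is the unit magnetization vector, \gamma the gyromagnetic ratio, \mathbf{H}_{\mathrm{eff}} the effective field, and \alpha the Gilbert damping constant; driving a nanomagnet with spin current adds a spin-transfer torque term to the right-hand side.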

  12. Pivots - A Bottom-Up Approach to Enhance Resilience

    Science.gov (United States)

    2015-12-01

    PROPOSED MODEL: WRAP-AROUND SERVICES BUSINESS INCUBATOR. Numerous illustrations demonstrate how small business owners

  13. Bottom-Up Energy Analysis System - Methodology and Results

    Energy Technology Data Exchange (ETDEWEB)

    McNeil, Michael A. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Letschert, Virginie E. [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Stephane, de la Rue du Can [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Ke, Jing [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2012-06-15

    The main objective of the development of BUENAS is to provide a global model with sufficient detail and accuracy for technical assessment of policy measures such as energy efficiency standards and labeling (EES&L) programs. In most countries where energy efficiency policies exist, the initial emphasis is on household appliances and lighting. Often, equipment used in commercial buildings, particularly heating, air conditioning and ventilation (HVAC) is also covered by EES&L programs. In the industrial sector, standards and labeling generally covers electric motors and distribution transformers, although a few more types of industrial equipment are covered by some programs, and there is a trend toward including more of them. In order to make a comprehensive estimate of the total potential impacts, development of the model prioritized coverage of as many end uses commonly targeted by EES&L programs as possible, for as many countries as possible.

  14. Revisiting factors controlling methane emissions from high-Arctic tundra

    Directory of Open Access Journals (Sweden)

    M. Mastepanov

    2013-07-01

    Full Text Available The northern latitudes are experiencing disproportionate warming relative to the mid-latitudes, and there is growing concern about feedbacks between this warming and methane production and release from high-latitude soils. Studies of methane emissions carried out in the Arctic, particularly those with measurements made outside the growing season, are underrepresented in the literature. Here we present results of 5 yr (2006–2010) of automatic chamber measurements at a high-Arctic location in Zackenberg, NE Greenland, covering both the growing seasons and two months of the following freeze-in periods. The measurements show clear seasonal dynamics in methane emission. The start of the growing season and the increase in CH4 fluxes were strongly related to the date of snowmelt. Within each particular growing season, CH4 fluxes were highly correlated with the soil temperature (R2 > 0.75), which is probably explained by high seasonality of both variables, and weakly correlated with the water table. The greatest variability in fluxes between the study years was observed during the first part of the growing season. Somewhat surprisingly, this variability could not be explained by commonly known factors controlling methane emission, i.e. temperature and water table position. Late in the growing season CH4 emissions were found to be very similar between the study years (except the extremely dry 2010) despite large differences in climatic factors (temperature and water table). Late-season bursts of CH4 coinciding with soil freezing in the autumn were observed during at least three years. The cumulative emission during the freeze-in CH4 bursts was comparable in size with the growing season emission for the year 2007, and about one third of the growing season emissions for the years 2009 and 2010. In all three cases the CH4 burst was accompanied by a corresponding episodic increase in CO2 emission, which can compose a significant contribution to the annual CO2

  15. Risk factors in pediatric asthmatic patients. Cases and control studies.

    Directory of Open Access Journals (Sweden)

    Rafael Alejandro Gómez Baute

    2003-04-01

    Full Text Available Background: Asthma ranks first among chronic diseases in children. Morbidity and mortality remain high in spite of the new therapies. For this reason it is a high-priority disease for research in the pediatric age group. Method: A case-control study was carried out. The sample was composed of 72 asthmatic children from three General Comprehensive doctor offices in the Palmira health area, located in Cienfuegos Province, Cuba, and a control group of 72 apparently healthy children from the same population. A questionnaire covering the different risk variables was developed. The odds ratio technique was used to estimate risk. Results: Low birth weight, family history of asthma, a history of bronchiolitis and excessive use of antibiotics in children under 1 year of age were the main risk factors found. Conclusions: Exposure to household allergens combined with a favorable genetic factor, prematurity, and bronchiolitis were the most outstanding elements associated with asthma in the studied population.

  16. 75 FR 56972 - Pipeline Safety: Control Room Management/Human Factors

    Science.gov (United States)

    2010-09-17

    ... Safety: Control Room Management/Human Factors AGENCY: Pipeline and Hazardous Materials Safety... address human factors and other aspects of control room management for certain pipelines where controllers... rooms and controllers covered by the control room management rule are critical to the safe operation...

  17. Hydrologic factors controlling groundwater salinity in northwestern coastal zone, Egypt

    Indian Academy of Sciences (India)

    Nahla A Morad; M H Masoud; S M Abdel Moghith

    2014-10-01

    The aim of this article is to assess the main factors influencing salinity of groundwater in the coastal area between El Dabaa and Sidi Barani, Egypt. The types and ages of the main aquifers in this area are the fractured limestone of Middle Miocene, the calcareous sandstone of Pliocene and the Oolitic Limestone of Pleistocene age. The aquifers in the area are recharged by seasonal rainfall of the order of 150 mm/year. The relationship of groundwater salinity against the absolute water level, the well drilling depth, and the ability of aquifer to recharge has been discussed in the present work. The ability of aquifer to locally recharge by direct rainfall is a measure of the vertical permeability due to lithological and structural factors that control groundwater salinity in the investigated aquifers. On the other hand, the fracturing system as well as the attitude of the surface water divide has a prime role in changing both the mode of occurrence and the salinity of groundwater in the area. Directly to the west of Matrouh, where the coastal plain is the narrowest, and east of Barrani, where the coastal plain is the widest, are good examples of this concept, where the water salinity attains its maximum and minimum limits respectively. Accordingly, well drilling in the Miocene aquifer, in the area between El Negila and Barrani to get groundwater of salinities less than 5000 mg/l is recommended in this area, at flow rate less than 10m3/hr/well. In other words, one can expect that the brackish water is probably found where the surface water divide is far from the shore line, where the Wadi fill deposits dominate (Quaternary aquifer), acting as a possible water salinity by direct rainfall and runoff.

  18. 75 FR 69912 - Pipeline Safety: Control Room Management/Human Factors

    Science.gov (United States)

    2010-11-16

    ... Safety: Control Room Management/Human Factors AGENCY: Pipeline and Hazardous Materials Safety..., 2010, PHMSA published a Control Room Management/Human Factors notice of proposed rulemaking (NPRM... to expedite the program implementation deadlines of the Control Room Management/Human Factors rule...

  19. 75 FR 5536 - Pipeline Safety: Control Room Management/Human Factors, Correction

    Science.gov (United States)

    2010-02-03

    ... Safety: Control Room Management/Human Factors, Correction AGENCY: Pipeline and Hazardous Materials Safety... Regulations to address human factors and other aspects of control room management for pipelines where... 63310) entitled ``Pipeline Safety: Control Room Management/Human Factors.'' This final rule...

  20. Controllability analysis of transcriptional regulatory networks reveals circular control patterns among transcription factors.

    Science.gov (United States)

    Österlund, Tobias; Bordel, Sergio; Nielsen, Jens

    2015-05-01

    Transcriptional regulation is the most committed type of regulation in living cells, where transcription factors (TFs) control the expression of their target genes and TF expression is controlled by other TFs, forming complex transcriptional regulatory networks that can be highly interconnected. Here we analyze the topology and organization of nine transcriptional regulatory networks for E. coli, yeast, mouse and human, and we evaluate how the structure of these networks influences two of their key properties, namely controllability and stability. We calculate the controllability for each network as a measure of the organization and interconnectivity of the network. We find that the number of driver nodes nD needed to control the whole network is 64% of the TFs in the E. coli transcriptional regulatory network, in contrast to only 17% for the yeast network, 4% for the mouse network and 8% for the human network. The high controllability (low number of drivers needed to control the system) in yeast, mouse and human is due to the presence of internal loops in their regulatory networks where the TFs regulate each other in a circular fashion. We refer to these internal loops as circular control motifs (CCM). The E. coli transcriptional regulatory network, which does not have any CCMs, shows a hierarchical structure of the transcriptional regulatory network in contrast to the eukaryal networks. The presence of CCMs also influences the stability of these networks, as cycles can be associated with potentially unstable steady-states where even small changes in binding affinities can cause dramatic rearrangements of the state of the network.
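    Driver-node counts of the kind quoted above are commonly obtained from structural controllability theory, where the minimum number of driver nodes equals the number of network nodes left unmatched by a maximum matching of the directed graph (with at least one driver). A sketch of that computation, assuming the networkx library and not reproducing the study's own code:

        # Minimum number of driver nodes n_D of a directed network via maximum matching
        # (structural controllability, Liu et al. 2011): n_D = max(N - |M*|, 1), where M*
        # is a maximum matching of the bipartite graph linking out-copies to in-copies of nodes.
        import networkx as nx

        def n_driver_nodes(directed_graph):
            B = nx.Graph()
            out_nodes = [("out", u) for u in directed_graph.nodes]
            in_nodes = [("in", v) for v in directed_graph.nodes]
            B.add_nodes_from(out_nodes, bipartite=0)
            B.add_nodes_from(in_nodes, bipartite=1)
            B.add_edges_from((("out", u), ("in", v)) for u, v in directed_graph.edges)
            matching = nx.bipartite.hopcroft_karp_matching(B, top_nodes=out_nodes)
            matched_edges = len(matching) // 2   # the matching dict lists both directions
            return max(directed_graph.number_of_nodes() - matched_edges, 1)

        # Tiny toy network: TF1 -> TF2 -> TF3 -> TF1, i.e. a circular control motif.
        g = nx.DiGraph([("TF1", "TF2"), ("TF2", "TF3"), ("TF3", "TF1")])
        print(n_driver_nodes(g))   # 1: the cycle makes the whole motif controllable from one node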

  1. Resource Considerations during Parallel Scheduling of Large Control Flow Dominated Applications

    DEFF Research Database (Denmark)

    Grode, Jesper Nicolai Riis; Madsen, Jan

    1995-01-01

    This paper presents a technique to determine the possible parallelism between different control-structures in large hierarchical Control- and Data-Flow Graphs (CDFGs). The technique is based on a hierarchical bottom-up heuristic, which after resolving data- and control-dependencies between control...

  2. Factors controlling large-wood transport in a mountain river

    Science.gov (United States)

    Ruiz-Villanueva, Virginia; Wyżga, Bartłomiej; Zawiejska, Joanna; Hajdukiewicz, Maciej; Stoffel, Markus

    2016-11-01

    As with bedload transport, wood transport in rivers is governed by several factors such as flow regime, geomorphic configuration of the channel and floodplain, or wood size and shape. Because large-wood tends to be transported during floods, safety and logistical constraints make field measurements difficult. As a result, direct observation and measurements of the conditions of wood transport are scarce. This lack of direct observations and the complexity of the processes involved in wood transport may result in an incomplete understanding of wood transport processes. Numerical modelling provides an alternative approach to addressing some of the unknowns in the dynamics of large-wood in rivers. The aim of this study is to improve the understanding of controls governing wood transport in mountain rivers, combining numerical modelling and direct field observations. By defining different scenarios, we illustrate relationships between the rate of wood transport and discharge, wood size, and river morphology. We test these relationships for a wide, multithread reach and a narrower, partially channelized single-thread reach of the Czarny Dunajec River in the Polish Carpathians. Results indicate that a wide range of quantitative information about wood transport can be obtained from a combination of numerical modelling and field observations, and can document contrasting patterns of wood transport in single- and multithread river reaches. On the one hand, log diameter seems to have a greater importance for wood transport in the multithread channel because of shallower flow, lower flow velocity, and lower stream power. Hydrodynamic conditions in the single-thread channel allow transport of large-wood pieces, whereas in the multithread reach, logs with diameters similar to water depth are not being moved. On the other hand, log length also exerts strong control on wood transport, more so in the single-thread than in the multithread reach. In any case, wood transport strongly

  3. Control factors of partial nitritation for landfill leachate treatment

    Institute of Scientific and Technical Information of China (English)

    LIANG Zhu; LIU Jun-xin

    2007-01-01

    Anaerobic ammonium oxidation (ANAMMOX) technology has potential technical superiority and economical efficiency for the nitrogen removal from landfill leachate, which contains high-strength ammonium nitrogen (NH4+-N) and refractory organics. To complete the ANAMMOX process, a preceding partial nitritation step to produce the appropriate ratio of nitrite/ammonium is a key stage. The objective of this study was to determine the optimal conditions to acquire constant partial nitritation for landfill leachate treatment, and a bench scale fixed bed bio-film reactor was used in this study to investigate the effects of the running factors on the partial nitritation. The results showed that both the dissolved oxygen (DO) concentration and the ammonium volumetric loading rate (Nv) had effects on the partial nitritation. In the controlling conditions with a temperature of 30±1℃, Nv of 0.2-1.0 kg NH4+-N/(m3·d), and DO concentration of 0.8-2.3 mg/L, the steady partial nitritation was achieved as follows: more than 94% partial nitritation efficiency (nitrite as the main product), 60%-74% NH4+-N removal efficiency, and NO2--N/NH4+-N ratio (concentration ratio) of 1.0-1.4 in the effluent. The impact of temperature was related to Nv at certain DO concentration, and the temperature range of 25-30℃ was suitable for treating high strength ammonium leachate. Ammonium-oxidizing bacteria (AOB) could be acclimated to higher FA (free ammonia) in the range of 122-224 mg/L. According to the denaturing gradient gel electrophoresis analysis result of the bio-film in the reactor, there were 25 kinds of 16S rRNA gene fragments, which indicated that abundant microbial communities existed in the bio-film, although high concentrations of ammonium and FA may inhibit the growth of the nitrite-oxidizing bacteria (NOB) and other microorganisms in the reactor.
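    The reported performance indicators follow directly from influent and effluent nitrogen concentrations. A small sketch with illustrative values chosen to fall inside the reported ranges (not measurements from the study):

        # Performance indicators for partial nitritation, computed from influent and effluent
        # concentrations (mg N/L). Values are illustrative, chosen to lie inside the ranges
        # reported above (60-74% NH4+-N removal, effluent NO2--N/NH4+-N ratio of 1.0-1.4).
        nh4_in, nh4_out, no2_out = 500.0, 170.0, 210.0

        removal_efficiency = (nh4_in - nh4_out) / nh4_in   # fraction of NH4+-N removed
        effluent_ratio = no2_out / nh4_out                  # NO2--N : NH4+-N in the effluent

        print(f"NH4+-N removal: {removal_efficiency:.0%}, NO2-/NH4+ ratio: {effluent_ratio:.2f}")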

  4. Exploring Factors of Successful Tendering Practices using Qualitative Comparative Analysis (QCA): The Study of Organizational Repetitions

    DEFF Research Database (Denmark)

    Bekdik, Baris; Thuesen, Christian

    2015-01-01

    The purpose of this paper is to introduce and evaluate Qualitative Comparative Analysis (QCA) as a method for exploring the complexity of practices of project organizing and management combining the benefits of top-down and bottom-up research strategies. The QCA method is used in order to describe...... combinations of factors leading to particular results of tendering practices. Empirical material collected through data mining in previously completed project records (quantitative data) is supported by data obtained from project managers of a general contractor company (qualitative data) in order...... between topdown and bottom-up research strategies....

  5. Diabetes and age-related demographic differences in risk factor control.

    Science.gov (United States)

    Egan, Brent M; Li, Jiexiang; Wolfman, Tamara E; Sinopoli, Angelo

    2014-06-01

    Disparate vascular outcomes in diabetes by race and/or ethnicity may reflect differential risk factor control, especially pre-Medicare. Assess concurrent target attainment for glycohemoglobin 2, P factor awareness and treatment were lower in Hispanics than whites. When treated, diabetes and hypertension control were greater in whites than blacks or Hispanics. Concurrent risk factor control is low in all diabetics and could improve with greater statin use. Insuring younger adults, especially Hispanic, could raise risk factor awareness and treatment. Improving treatment effectiveness in younger black and Hispanic diabetics could promote equitable risk factor control.

  6. Viral infections as controlling factors for the deep biosphere? (Invited)

    Science.gov (United States)

    Engelen, B.; Engelhardt, T.; Sahlberg, M.; Cypionka, H.

    2009-12-01

    The marine deep biosphere represents the largest biotope on Earth. Throughout the last years, we have obtained interesting insights into its microbial community composition. However, one component that has been completely overlooked so far is the viral inventory of deep-subsurface sediments. While viral infections were identified to have a major impact on the benthic microflora of deep-sea surface sediments (Danavaro et al. 2008), no studies have so far been performed on deep-biosphere samples. As grazers probably play only a minor role in anoxic and highly compressed deep sediments, viruses might be the main “predators” of indigenous microorganisms. Furthermore, the release of cell components, called “the viral shunt”, could have a major impact on the deep biosphere by providing labile organic compounds to non-infected microorganisms in these generally nutrient-depleted sediments. However, direct counting of viruses in sediments is highly challenging due to the small size of viruses and the high background of small particles. Even molecular surveys using “universal” PCR primers that target phage-specific genes fail due to the vast phage diversity. One solution to this problem is the lysogenic viral life cycle, as many bacteriophages integrate their DNA into the host genome. It is estimated that up to 70% of cultivated bacteria contain prophages within their genome. Therefore, culture collections (Batzke et al. 2007) represent an archive of the viral composition within the respective habitat. These prophages can be induced to become free phage particles in stimulation experiments in which the host cells are placed under stress conditions, such as UV exposure or treatment with DNA-damaging antibiotics. The study of the viral component within the deep biosphere offers to answer the following questions: To which extent are deep-biosphere populations controlled by viral infections? What is the inter- and intra-specific diversity and the host-specific viral

  7. Factors controlling the initiation of Snowball Earth events

    Science.gov (United States)

    Voigt, A.

    2012-12-01

    During the Neoproterozoic glaciations tropical continents were covered by active glaciers that extended down to sea level. To explain these glaciers, the Snowball Earth hypothesis assumes that oceans were completely sea-ice covered during these glaciations, but there is an ongoing debate about whether or not some regions of the tropical oceans remained open. In this talk, I will describe past and ongoing climate modelling activities with the comprehensive coupled climate model ECHAM5/MPI-OM that identify and compare factors that control the initiation of Snowball Earth events. I first show that shifting the continents from their present-day location to their Marinoan (635 My BP) low-latitude location increases the planetary albedo, cools the climate, and thereby allows Snowball Earth initiation at higher levels of total solar irradiance and atmospheric CO2. I then present simulations with successively lowered bare sea-ice albedo, disabled sea-ice dynamics, and switched-off ocean heat transport. These simulations show that both lowering the bare sea-ice albedo and disabling sea-ice dynamics increase the critical sea-ice cover in ECHAM5/MPI-OM, but sea-ice dynamics due to strong equatorward sea-ice transport have a much larger influence on the critical CO2. Disabling sea-ice transport allows a state with sea-ice margin at 10 deg latitude by virtue of the Jormungand mechanism. The accumulation of snow on land, in combination with tropical land temperatures below or close to freezing, suggests that tropical land glaciers could easily form in such a state. However, in contrast to aquaplanet simulations without ocean heat transport, there is no sign of a Jormungand hysteresis in the coupled simulations. Ocean heat transport is not responsible for the lack of a Jormungand hysteresis in the coupled simulations. By relating the above findings to previous studies, I will outline promising future avenues of research on the initiation of Snowball Earth events. In particular, an

  8. Molecular factors controlling photosynthetic light harvesting by carotenoids.

    Science.gov (United States)

    Polívka, Tomás; Frank, Harry A

    2010-08-17

    Carotenoids are naturally occurring pigments that absorb light in the spectral region in which the sun irradiates maximally. These molecules transfer this energy to chlorophylls, initiating the primary photochemical events of photosynthesis. Carotenoids also regulate the flow of energy within the photosynthetic apparatus and protect it from photoinduced damage caused by excess light absorption. To carry out these functions in nature, carotenoids are bound in discrete pigment-protein complexes in the proximity of chlorophylls. A few three-dimensional structures of these carotenoid complexes have been determined by X-ray crystallography. Thus, the stage is set for attempting to correlate the structural information with the spectroscopic properties of carotenoids to understand the molecular mechanism(s) of their function in photosynthetic systems. In this Account, we summarize current spectroscopic data describing the excited state energies and ultrafast dynamics of purified carotenoids in solution and bound in light-harvesting complexes from purple bacteria, marine algae, and green plants. Many of these complexes can be modified using mutagenesis or pigment exchange which facilitates the elucidation of correlations between structure and function. We describe the structural and electronic factors controlling the function of carotenoids as energy donors. We also discuss unresolved issues related to the nature of spectroscopically dark excited states, which could play a role in light harvesting. To illustrate the interplay between structural determinations and spectroscopic investigations that exemplifies work in the field, we describe the spectroscopic properties of four light-harvesting complexes whose structures have been determined to atomic resolution. The first, the LH2 complex from the purple bacterium Rhodopseudomonas acidophila, contains the carotenoid rhodopin glucoside. The second is the LHCII trimeric complex from higher plants which uses the carotenoids

  9. Investigating the effective factors on management internal controls applying

    Directory of Open Access Journals (Sweden)

    Ahmad Ahmadkhani

    2012-08-01

    Full Text Available Information technology plays an important role in increasing internal control in many organizations. In this paper, we present an empirical study to measure the impact of information technology, hiring a highly skilled management team, using high-quality standards and increasing employees' awareness on managing internal control. The survey uses a Likert-scale questionnaire distributed among people who work in either the administrative or financial sectors of governmental agencies in the province of Zanjan, Iran. The results of the study indicate that implementing information technology helps the management team control their systems more effectively, and that employing more skilled and specialized managers, adopting suitable standards, and increasing employees' awareness each positively influence management internal control.

  10. Other Factors That Affect Heart Disease: Birth Control Pills

    Science.gov (United States)

    ... that women who use high-dose birth control pills (oral contraceptives) are more likely to have a heart attack or stroke because blood clots are more likely to form in the blood vessels. These risks are lessened once the birth control pill is stopped. Using the pill also may worsen ...

  11. Satellite observations indicate substantial spatiotemporal variability in biomass burning NOx emission factors for South America

    NARCIS (Netherlands)

    Castellanos, P.; Boersma, K.F.; Werf, van de G.R.

    2014-01-01

    Biomass burning is an important contributor to global total emissions of NOx (NO+NO2). Generally bottom-up fire emissions models calculate NOx emissions by multiplying fuel consumption estimates with static biome-specific emission factors, defined in units of grams of NO per kilogram of dry matter c

  12. Impacts on Power Factor of AC Voltage Controllers Under Non-Sinusoidal Conditions

    Directory of Open Access Journals (Sweden)

    Mukhtiar Ahmed Mahar

    2012-04-01

    Full Text Available AC-AC conversion is obtained with the help of cyclo-converters, DC-link converters and AC voltage controllers. AC voltage controllers are also referred to as voltage regulators. The main issue with these converters is that they generate harmonics because they are periodic variable-structure systems. The generated harmonics create disturbances and degrade converter performance, and the supply-side power factor suffers as a result. This paper focuses on the source-side power factor of AC voltage controllers under non-sinusoidal conditions. In order to observe the power factor, a power factor measurement tool and a simulation model of the AC voltage controller are also developed in MATLAB.
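
    As a quick illustration of the source-side effect described above, the true power factor under a sinusoidal supply voltage and a distorted current is the displacement factor multiplied by the distortion factor derived from the current THD. The sketch below shows only this standard textbook relation; it is not the paper's MATLAB measurement tool, and the example numbers are hypothetical.

        import numpy as np

        def true_power_factor(cos_phi1, thd_current):
            """True power factor with a sinusoidal supply voltage and a distorted
            current: displacement factor times distortion factor."""
            distortion_factor = 1.0 / np.sqrt(1.0 + thd_current ** 2)
            return cos_phi1 * distortion_factor

        # Hypothetical example: displacement factor 0.9 and current THD of 35 %
        # give a source-side power factor of roughly 0.85.
        print(true_power_factor(0.9, 0.35))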

  13. CubeSat Form Factor Thermal Control Louvers Project

    Data.gov (United States)

    National Aeronautics and Space Administration — Thermal control of small spacecraft, including CubeSats, is a challenge for the next era of NASA spaceflight. Science objectives and components will still require...

  14. Investigation and control of factors influencing resistance upset butt welding.

    NARCIS (Netherlands)

    Kerstens, N.F.H.

    2010-01-01

    The purpose of this work is to investigate the factors influencing the resistance upset butt welding process to obtain an understanding of the metal behaviour and welding process characteristics, so that new automotive steels can be welded with reduced development time and fewer failures in producti

  15. Controlled growth factor release from synthetic extracellular matrices

    Science.gov (United States)

    Lee, Kuen Yong; Peters, Martin C.; Anderson, Kenneth W.; Mooney, David J.

    2000-12-01

    Polymeric matrices can be used to grow new tissues and organs, and the delivery of growth factors from these matrices is one method to regenerate tissues. A problem with engineering tissues that exist in a mechanically dynamic environment, such as bone, muscle and blood vessels, is that most drug delivery systems have been designed to operate under static conditions. We thought that polymeric matrices, which release growth factors in response to mechanical signals, might provide a new approach to guide tissue formation in mechanically stressed environments. Critical design features for this type of system include the ability to undergo repeated deformation, and a reversible binding of the protein growth factors to polymeric matrices to allow for responses to repeated stimuli. Here we report a model delivery system that can respond to mechanical signalling and upregulate the release of a growth factor to promote blood vessel formation. This approach may find a number of applications, including regeneration and engineering of new tissues and more general drug-delivery applications.

  16. Poor stroke-related risk factor control even after stroke: an opportunity for rehabilitation professionals.

    Science.gov (United States)

    Ellis, Charles; Breland, Hazel L

    2014-01-01

    The burden of chronic disease worldwide is substantial. Unfortunately, risk factor control for most chronic diseases remains poor even after diagnoses. This is a major concern because poor risk factor control often leads to secondary consequences of the disease and the development of co-existing diseases. Stroke is a chronic condition that frequently requires the services of rehabilitation professionals who can also play an important role in risk factor management to reduce recurrent stroke. Approaches to the management of stroke risk factors in stroke survivors vary greatly and consequently outcomes vary in a similar fashion. The current literature suggests that uniform offering of structured risk factor control programs over time to individuals with chronic disease can improve knowledge of stroke risk factors, knowledge of action to control risk factors and in turn facilitate self-management practices that reduce the negative consequences of chronic diseases. Rehabilitation professionals can play a vital role in the management and secondary prevention of chronic diseases during the rehabilitation process via patient education and training. Implications for Rehabilitation Evidence suggests that risk factor control remains poor in many individuals with chronic conditions such as stroke. Rehabilitation professionals can play a key role in programs designed to improve risk factor control in chronic conditions. Future risk factor control programs can be structured and implemented over time to include rehabilitation professionals.

  17. Enhanced imaging of DNA via active quality factor control

    Science.gov (United States)

    Humphris, A. D. L.; Round, A. N.; Miles, M. J.

    2001-10-01

    Adsorption processes at the single-molecule level are of fundamental importance for the understanding and development of biomaterials. Atomic force microscopy (AFM) has played a critical role in this field due to its high resolution and ability to image in a liquid environment. We present a method that improves the dynamic force sensitivity and the resolution of a conventional AFM. This is achieved via a positive feedback loop that enhances the effective quality factor of the cantilever in a liquid environment to values in excess of 300, compared to a nominal value of ˜1. This active quality factor enhancement has been used to image DNA, and an increase in the measured height of the molecule was observed.
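
    A minimal sketch of the idea behind active Q control: a positive feedback force proportional to the cantilever velocity partially cancels the intrinsic viscous damping, raising the effective quality factor. The formula is an idealized textbook model and the cantilever parameters below are illustrative, not the authors' instrument settings.

        import math

        def effective_q(q0, f0_hz, mass_kg, feedback_gain):
            """Effective quality factor when a positive feedback force proportional to
            the cantilever velocity (gain in N*s/m) partially cancels the intrinsic
            viscous damping gamma0 = m * omega0 / Q0 (idealized, noise-free model)."""
            omega0 = 2 * math.pi * f0_hz
            gamma0 = mass_kg * omega0 / q0
            return mass_kg * omega0 / (gamma0 - feedback_gain)

        # Illustrative numbers: Q0 ~ 1 in liquid; a feedback gain close to gamma0
        # pushes the effective Q past 300, the regime used for the DNA imaging above.
        gamma0 = 5e-11 * 2 * math.pi * 30e3 / 1.0
        print(effective_q(1.0, 30e3, 5e-11, 0.9967 * gamma0))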

  18. Environmental factors controlling methane emissions from peatlands in northern Minnesota

    Science.gov (United States)

    Dise, Nancy B.; Gorham, Eville; Verry, Elon S.

    1993-01-01

    The environmental factors affecting the emission of methane from peatlands were investigated by correlating CH4 emission data for two years, obtained from five different peatland ecosystems in northern Minnesota, with peat temperature, water table position, and degree of peat humification. The relationship obtained between the CH4 flux and these factors was compared to results from a field manipulation experiment in which the water table was artificially raised in three experimental plots within the driest peatland. It was found that peat temperature, water table position, and degree of peat humification explained 91 percent of the variance in log CH4 flux, successfully predicted annual CH4 emission from individual wetlands, and predicted the change in flux due to the water table manipulation. Raising the water table in the bog corrals by an average of 6 cm in autumn 1989 and 10 cm in summer 1990 increased CH4 emission by 2.5 and 2.2 times, respectively.
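
    The regression described above lends itself to an ordinary least-squares sketch. The snippet below fits log CH4 flux against peat temperature, water table position and degree of humification using synthetic placeholder data; the coefficients and R² it prints are illustrative only, not the study's values.

        import numpy as np

        # Illustrative multiple regression: log10(CH4 flux) modeled from peat
        # temperature (deg C), water table position (cm, negative below surface)
        # and degree of humification. Synthetic data, not the Minnesota data set.
        rng = np.random.default_rng(0)
        temp = rng.uniform(5, 20, 50)
        water_table = rng.uniform(-30, 5, 50)
        humification = rng.uniform(2, 8, 50)
        log_flux = 0.05 * temp + 0.02 * water_table - 0.1 * humification + rng.normal(0, 0.1, 50)

        X = np.column_stack([np.ones_like(temp), temp, water_table, humification])
        coef, *_ = np.linalg.lstsq(X, log_flux, rcond=None)
        pred = X @ coef
        r2 = 1 - np.sum((log_flux - pred) ** 2) / np.sum((log_flux - log_flux.mean()) ** 2)
        print(coef, r2)   # fitted coefficients and variance explained (R^2)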

  19. Addressing the human factors issues associated with control room modifications

    Energy Technology Data Exchange (ETDEWEB)

    O'Hara, J.; Stubler, W. [Brookhaven National Lab., Upton, NY (United States). Dept. of Advanced Technology; Kramer, J. [Nuclear Regulatory Commission, Washington, DC (United States). Office of Nuclear Regulatory Research

    1998-03-01

    Advanced human-system interface (HSI) technology is being integrated into existing nuclear plants as part of plant modifications and upgrades. The result of this trend is that hybrid HSIs are created, i.e., HSIs containing a mixture of conventional (analog) and advanced (digital) technology. The purpose of the present research is to define the potential effects of hybrid HSIs on personnel performance and plant safety and to develop human factors guidance for safety reviews of them where necessary. In support of this objective, human factors issues associated with hybrid HSIs were identified. The issues were evaluated for their potential significance to plant safety, i.e., their human performance concerns have the potential to compromise plant safety. The issues were then prioritized and a subset was selected for design review guidance development.

  20. Polyketide chain length control by chain length factor.

    Science.gov (United States)

    Tang, Yi; Tsai, Shiou-Chuan; Khosla, Chaitan

    2003-10-22

    Bacterial aromatic polyketides are pharmacologically important natural products. A critical parameter that dictates product structure is the carbon chain length of the polyketide backbone. Systematic manipulation of polyketide chain length represents a major unmet challenge in natural product biosynthesis. Polyketide chain elongation is catalyzed by a heterodimeric ketosynthase. In contrast to homodimeric ketosynthases found in fatty acid synthases, the active site cysteine is absent from one subunit of this heterodimer. The precise role of this catalytically silent subunit has been debated over the past decade. We demonstrate here that this subunit is the primary determinant of polyketide chain length, thereby validating its designation as chain length factor. Using structure-based mutagenesis, we identified key residues in the chain length factor that could be manipulated to convert an octaketide synthase into a decaketide synthase and vice versa. These results should lead to novel strategies for the engineered biosynthesis of hitherto unidentified polyketide scaffolds.

  1. Optimal replicator factor control in wireless sensor networks

    Institute of Scientific and Technical Information of China (English)

    2011-01-01

    For TDMA MAC protocols in wireless sensor networks (WSNs), redundancy and retransmission are two important methods to provide high end-to-end transmission reliability. Since reliable transmissions will lead to more energy consumption, there exists an intrinsic tradeoff between transmission reliability and energy efficiency. For each link, we refer to the number of its reserved time slots in each MAC superframe as its replicator factor. In this paper, we propose a reliability-lifetime tradeoff framework (...
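
    The reliability-energy tradeoff behind the replicator factor can be sketched with a simple independence assumption: a link with r reserved slots and per-slot success probability p delivers with probability 1-(1-p)^r, while energy grows with the total number of reserved slots. This is a toy model for illustration, not the framework proposed in the paper.

        from math import prod

        def path_reliability(p_slot, replicator_factors):
            """End-to-end delivery probability along a multi-hop path when each link i
            reserves r_i slots per superframe and each slot succeeds independently
            with probability p_slot (a simplifying assumption for illustration)."""
            return prod(1 - (1 - p_slot) ** r for r in replicator_factors)

        def energy_cost(replicator_factors, e_slot=1.0):
            """Energy grows with the total number of reserved slots."""
            return e_slot * sum(replicator_factors)

        # Raising the replicator factors improves reliability but costs energy.
        for factors in ([1, 1, 1], [2, 2, 2], [3, 2, 2]):
            print(factors, round(path_reliability(0.8, factors), 3), energy_cost(factors))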

  2. Well productivity controlling factors in crystalline terrains of southeastern Brazil

    Science.gov (United States)

    Neves, Mirna A.; Morales, Norberto

    2007-05-01

    Over the last decades, increasing water demands have fostered research to obtain high well yields in crystalline terrains where, besides the intrinsic properties of rocks, the groundwater flow depends on several factors. The depth of the wells, the lithotypes, the presence and thickness of sedimentary coverings and weathered layers, the landforms, the geological structures, and the effects of tectonic stresses are among the most investigated factors considered determinants of well productivity. The influence of these factors on the productivity of wells that exploit the Crystalline Aquifer System in the Jundiaí River Catchment, southeastern Brazil, is investigated in this work. Most of the studied area is located on the Precambrian Basement, partially covered by sedimentary deposits. The results show that the sedimentary deposits and the weathered layer are important for high well yield, but maintaining high productivity also depends on the existence of a network of open fractures. The sites where such structures are most likely to occur are the regional shear and fault zones and other minor structures with NW-SE and E-W directions, which characterize areas subjected to transtensional stress related to the neotectonics.

  3. Resource Form Factor and Installation of GFA Controllers

    Energy Technology Data Exchange (ETDEWEB)

    DeSteese, John G.; Hammerstrom, Donald J.

    2009-11-15

    The focus of this task is to optimize the form and placement of a controller comprising the Grid Friendly™ appliance (GFA) controller, power supply and power relay (and/or a solid-state power electronic switch) that would command a domestic water heater to shed its load in response to stress on the electric power grid. The GFA controller would disconnect the water heater from its supply circuit whenever it senses a low voltage signal or other indicators of system stress communicated via the electric power distribution system. Power would be reconnected to the appliance when the GFA controller senses the absence of these signals. This project has also considered more frequent cycling of this controller’s relay switch to perform demand-side frequency regulation. The principal criteria considered in this optimization are reliability, cost and life expectancy of the GFA components. The alternative embodiments of the GFA equipment under consideration are: Option 1 - installation inside the insulation space of the water heater between the tank and jacket; Option 2 - containment in a separate nearby electrical enclosure; Option 3 - a modification or adjunct to the distribution panel housing and/or the breaker that protects the water heater supply circuit.

  4. Factors Relating to Staff Attributions of Control over Challenging Behaviour

    Science.gov (United States)

    Dilworth, Jennifer A.; Phillips, Neil; Rose, John

    2011-01-01

    Background: Previous research has suggested that severity of intellectual disability (ID) and topography of behaviour may influence staff causal attributions regarding challenging behaviour. Subsequently, these causal attributions may influence helping behaviours. This study investigated the relationship between attributions of control over…

  5. Transcription factor control of growth rate dependent genes in Saccharomyces cerevisiae: A three factor design

    DEFF Research Database (Denmark)

    Fazio, Alessandro; Jewett, Michael Christopher; Daran-Lapujade, Pascale;

    2008-01-01

    Background: Characterization of cellular growth is central to understanding living systems. Here, we applied a three-factor design to study the relationship between specific growth rate and genome-wide gene expression in 36 steady-state chemostat cultures of Saccharomyces cerevisiae. The three factors we considered were specific growth rate, nutrient limitation, and oxygen availability. Results: We identified 268 growth rate dependent genes, independent of nutrient limitation and oxygen availability. The transcriptional response was used to identify key areas in metabolism around which m...... transcription factor target sets, transcription factors that coordinate balanced growth were also identified. Our analysis shows that Fhl1, Rap1, and Sfp1, regulating protein biosynthesis, have significantly enriched target sets for genes up-regulated with increasing growth rate. Cell cycle regulators...

  6. Risk factors for Creutzfeldt-Jakob disease: a reanalysis of case-control studies.

    NARCIS (Netherlands)

    D.P.W.M. Wientjens (Dorothee); Z. Davanipour; K. Kondo; W.B. Matthews; R.G. Will (Robert); C.M. van Duijn (Cock); A. Hofman (Albert)

    1996-01-01

    textabstractTo review the evidence for risk factors of Creutzfeldt-Jakob disease (CJD), we pooled and reanalyzed the raw data of three case-control studies. The pooled data set comprised 178 patients and 333 control subjects. The strength of association between CJD and putative risk factors was asse

  7. Risk factors for basal cell carcinoma: Case-control study in Córdoba

    Directory of Open Access Journals (Sweden)

    Alejandro Ruiz Lascano

    2005-12-01

    Full Text Available Basal cell carcinoma is undoubtedly a complex disease. Its etiology is still unclear and despite its frequency, there is a paucity of data on its risk factors. We assessed potential risk factors for basal cell carcinoma in a population from Córdoba (Argentina). This case-control study involved 88 newly diagnosed cases and 88 controls, matched by age and sex. The following risk factors were significant in the multivariate analysis: skin type I-II-III, high recreational sun exposure after 20 years of age, high sun exposure for beach holidays, and actinic keratosis.

  8. Social and cultural factors in the successful control of tuberculosis.

    OpenAIRE

    Rubel, A J; Garro, L C

    1992-01-01

    The burden of tuberculosis on the public health is staggering. Worldwide, annual incidence of new cases is estimated to be about 8 million. Almost 3 million deaths occur yearly. Early case identification and adherence to treatment regimens are the remaining barriers to successful control. In many nations, however, fewer than half those with active disease receive a diagnosis, and fewer than half those beginning treatment complete it. The twin problems of delay in seeking treatment and abandon...

  9. Viral control of bacterial biodiversity - Evidence from a nutrient enriched mesocosm experiment

    DEFF Research Database (Denmark)

    Sandaa, R.-A.; Gómez-Consarnau, L.; Pinhassi, J.;

    2009-01-01

    We demonstrate here results showing that bottom-up and top-down control mechanisms can operate simultaneously and in concert in marine microbial food webs, controlling prokaryote diversity by a combination of viral lysis and substrate limitation. Models in microbial ecology predict that a shift i...

  10. Tunable signal processing through modular control of transcription factor translocation

    Science.gov (United States)

    Hao, Nan; Budnik, Bogdan A.; Gunawardena, Jeremy; O’Shea, Erin K.

    2013-01-01

    Signaling pathways can induce different dynamics of transcription factor (TF) activation. We explored how TFs process signaling inputs to generate diverse dynamic responses. The budding yeast general stress responsive TF Msn2 acted as a tunable signal processor that could track, filter, or integrate signals in an input dependent manner. This tunable signal processing appears to originate from dual regulation of both nuclear import and export by phosphorylation, as mutants with one form of regulation sustained only one signal processing function. Versatile signal processing by Msn2 is crucial for generating distinct dynamic responses to different natural stresses. Our findings reveal how complex signal processing functions are integrated into a single molecule and provide a guide for the design of TFs with “programmable” signal processing functions. PMID:23349292

  11. Tunable signal processing through modular control of transcription factor translocation.

    Science.gov (United States)

    Hao, Nan; Budnik, Bogdan A; Gunawardena, Jeremy; O'Shea, Erin K

    2013-01-25

    Signaling pathways can induce different dynamics of transcription factor (TF) activation. We explored how TFs process signaling inputs to generate diverse dynamic responses. The budding yeast general stress-responsive TF Msn2 acted as a tunable signal processor that could track, filter, or integrate signals in an input-dependent manner. This tunable signal processing appears to originate from dual regulation of both nuclear import and export by phosphorylation, as mutants with one form of regulation sustained only one signal-processing function. Versatile signal processing by Msn2 is crucial for generating distinct dynamic responses to different natural stresses. Our findings reveal how complex signal-processing functions are integrated into a single molecule and provide a guide for the design of TFs with "programmable" signal-processing functions.

  12. J-Inner-Outer Factorization, J-Spectral Factorization, and Robust Control for Nonlinear Systems

    NARCIS (Netherlands)

    Ball, Joseph A.; Schaft, Arjan J. van der

    1996-01-01

    The problem of expressing a given nonlinear state-space system as the cascade connection of a lossless system and a stable, minimum-phase system (inner-outer factorization) is solved for the case of a stable system having state-space equations affine in the inputs. The solution is given in terms of

  13. Research on Open-Closed-Loop Iterative Learning Control with Variable Forgetting Factor of Mobile Robots

    Directory of Open Access Journals (Sweden)

    Hongbin Wang

    2016-01-01

    Full Text Available We propose an iterative learning control (ILC) algorithm that uses a variable forgetting factor to control a mobile robot. The proposed algorithm can be categorized as an open-closed-loop iterative learning control, which produces control instructions by using both previous and current data. Introducing a variable forgetting factor weakens the influence of earlier control outputs and their variance in the control law while strengthening the robustness of the iterative learning control. Applied to the mobile robot, this effectively reduces position errors in trajectory tracking control. In this work, we show that the proposed algorithm guarantees convergence of the tracking error bound to a small neighborhood of the origin under state disturbances, output measurement noise, and fluctuations of the system dynamics. Simulations demonstrate that the controller is effective in realizing perfect tracking.
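
    A hedged sketch of what an open-closed-loop P-type ILC update with a variable forgetting factor can look like: the previous-iteration input is partially forgotten, and corrections use both the stored previous-iteration error (open-loop part) and the current-iteration error (closed-loop part). The gains and the forgetting-factor schedule below are placeholders, not the paper's design.

        import numpy as np

        def ilc_update(u_prev, e_prev, e_curr, k, lam0=0.3, l_open=0.5, l_closed=0.5):
            """One iteration of an open-closed-loop P-type ILC update with a variable
            forgetting factor. lam(k) decays with the iteration index k so that early
            iterations forget more of the previous input than later ones; the exact
            schedule and gains here are illustrative."""
            lam = lam0 / (1 + k)                      # variable forgetting factor
            return (1 - lam) * u_prev + l_open * e_prev + l_closed * e_curr

        # u_prev, e_prev: input and tracking error stored from iteration k;
        # e_curr: tracking error measured on-line during iteration k+1.
        u_next = ilc_update(np.zeros(100), np.ones(100) * 0.1, np.ones(100) * 0.05, k=1)
        print(u_next[:5])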

  14. [Transforming growth factor-beta controls pathogenesis of Crohn disease].

    Science.gov (United States)

    Friess, H; di Mola, F F; Egger, B; Scheuren, A; Kleeff, J; Zimmermann, A; Büchler, M W

    1998-01-01

    The pathogenetic mechanisms which contribute to the progression of Crohn's disease are still not known. Transforming growth factor-beta (TGF-beta) and its subtypes are multifunctional polypeptides which regulate immunological processes as well as the synthesis of the extracellular matrix and fibrogenesis. In the present study, Crohn's disease tissue samples of 18 patients undergoing intestinal resection were analyzed by Northern blot analysis, in situ hybridization and immunostaining for TGF-beta 1-3 and the TGF-beta receptors type I-III (T beta R-I, T beta R-II, T beta R-III). There was a marked overexpression of TGF-beta 1, TGF-beta 3 and T beta R-II in 94% of the Crohn's disease tissue samples. TGF-beta 2 and T beta R-I ALK5 and T beta R-III were enhanced in 72%, 72% and 82% of the Crohn tissue samples, respectively. In situ hybridization and immunostaining revealed that there was frequent coexpression of TGF-beta with its signaling receptors. Our data indicate that TGF-beta and their receptors seem to be involved in the pathogenesis of Crohn's disease. Their enhanced expression might contribute to the increase in extracellular matrix resulting in fibrosis and subsequently in intestinal obstruction.

  15. Strain Specific Factors Control Effector Gene Silencing in Phytophthora sojae.

    Directory of Open Access Journals (Sweden)

    Sirjana Devi Shrestha

    Full Text Available The Phytophthora sojae avirulence gene Avr3a encodes an effector that is capable of triggering immunity on soybean plants carrying the resistance gene Rps3a. P. sojae strains that express Avr3a are avirulent to Rps3a plants, while strains that do not are virulent. To study the inheritance of Avr3a expression and virulence towards Rps3a, genetic crosses and self-fertilizations were performed. A cross between P. sojae strains ACR10 X P7076 causes transgenerational gene silencing of Avr3a allele, and this effect is meiotically stable up to the F5 generation. However, test-crosses of F1 progeny (ACR10 X P7076 with strain P6497 result in the release of silencing of Avr3a. Expression of Avr3a in the progeny is variable and correlates with the phenotypic penetrance of the avirulence trait. The F1 progeny from a direct cross of P6497 X ACR10 segregate for inheritance for Avr3a expression, a result that could not be explained by parental imprinting or heterozygosity. Analysis of small RNA arising from the Avr3a gene sequence in the parental strains and hybrid progeny suggests that the presence of small RNA is necessary but not sufficient for gene silencing. Overall, we conclude that inheritance of the Avr3a gene silenced phenotype relies on factors that are variable among P. sojae strains.

  16. Strain Specific Factors Control Effector Gene Silencing in Phytophthora sojae.

    Science.gov (United States)

    Shrestha, Sirjana Devi; Chapman, Patrick; Zhang, Yun; Gijzen, Mark

    2016-01-01

    The Phytophthora sojae avirulence gene Avr3a encodes an effector that is capable of triggering immunity on soybean plants carrying the resistance gene Rps3a. P. sojae strains that express Avr3a are avirulent to Rps3a plants, while strains that do not are virulent. To study the inheritance of Avr3a expression and virulence towards Rps3a, genetic crosses and self-fertilizations were performed. A cross between P. sojae strains ACR10 X P7076 causes transgenerational gene silencing of Avr3a allele, and this effect is meiotically stable up to the F5 generation. However, test-crosses of F1 progeny (ACR10 X P7076) with strain P6497 result in the release of silencing of Avr3a. Expression of Avr3a in the progeny is variable and correlates with the phenotypic penetrance of the avirulence trait. The F1 progeny from a direct cross of P6497 X ACR10 segregate for inheritance for Avr3a expression, a result that could not be explained by parental imprinting or heterozygosity. Analysis of small RNA arising from the Avr3a gene sequence in the parental strains and hybrid progeny suggests that the presence of small RNA is necessary but not sufficient for gene silencing. Overall, we conclude that inheritance of the Avr3a gene silenced phenotype relies on factors that are variable among P. sojae strains.

  17. Negative elongation factor controls energy homeostasis in cardiomyocytes.

    Science.gov (United States)

    Pan, Haihui; Qin, Kunhua; Guo, Zhanyong; Ma, Yonggang; April, Craig; Gao, Xiaoli; Andrews, Thomas G; Bokov, Alex; Zhang, Jianhua; Chen, Yidong; Weintraub, Susan T; Fan, Jian-Bing; Wang, Degeng; Hu, Yanfen; Aune, Gregory J; Lindsey, Merry L; Li, Rong

    2014-04-10

    Negative elongation factor (NELF) is known to enforce promoter-proximal pausing of RNA polymerase II (Pol II), a pervasive phenomenon observed across multicellular genomes. However, the physiological impact of NELF on tissue homeostasis remains unclear. Here, we show that whole-body conditional deletion of the B subunit of NELF (NELF-B) in adult mice results in cardiomyopathy and impaired response to cardiac stress. Tissue-specific knockout of NELF-B confirms its cell-autonomous function in cardiomyocytes. NELF directly supports transcription of those genes encoding rate-limiting enzymes in fatty acid oxidation (FAO) and the tricarboxylic acid (TCA) cycle. NELF also extensively shares transcriptional target genes with peroxisome proliferator-activated receptor α (PPARα), a master regulator of energy metabolism in the myocardium. Mechanistically, NELF helps stabilize the transcription initiation complex at the metabolism-related genes. Our findings strongly indicate that NELF is part of the PPARα-mediated transcription regulatory network that maintains metabolic homeostasis in cardiomyocytes.

  18. Factors controlling black carbon distribution in the Arctic

    Science.gov (United States)

    Qi, Ling; Li, Qinbin; Li, Yinrui; He, Cenlin

    2017-01-01

    We investigate the sensitivity of black carbon (BC) in the Arctic, including BC concentration in snow (BCsnow, ng g-1) and surface air (BCair, ng m-3), as well as emissions, dry deposition, and wet scavenging using the global three-dimensional (3-D) chemical transport model (CTM) GEOS-Chem. We find that the model underestimates BCsnow in the Arctic by 40 % on average (median = 11.8 ng g-1). Natural gas flaring substantially increases total BC emissions in the Arctic (by ˜ 70 %). The flaring emissions lead to up to 49 % increases (0.1-8.5 ng g-1) in Arctic BCsnow, dramatically improving model comparison with observations (50 % reduction in discrepancy) near flaring source regions (the western side of the extreme north of Russia). Ample observations suggest that BC dry deposition velocities over snow and ice in current CTMs (0.03 cm s-1 in the GEOS-Chem) are too small. We apply the resistance-in-series method to compute a dry deposition velocity (vd) that varies with local meteorological and surface conditions. The resulting velocity is significantly larger and varies by a factor of 8 in the Arctic (0.03-0.24 cm s-1), which increases the fraction of dry to total BC deposition (16 to 25 %) yet leaves the total BC deposition and BCsnow in the Arctic unchanged. This is largely explained by the offsetting higher dry and lower wet deposition fluxes. Additionally, we account for the effect of the Wegener-Bergeron-Findeisen (WBF) process in mixed-phase clouds, which releases BC particles from condensed phases (water drops and ice crystals) back to the interstitial air and thereby substantially reduces the scavenging efficiency of clouds for BC (by 43-76 % in the Arctic). The resulting BCsnow is up to 80 % higher, BC loading is considerably larger (from 0.25 to 0.43 mg m-2), and BC lifetime is markedly prolonged (from 9 to 16 days) in the Arctic. Overall, flaring emissions increase BCair in the Arctic (by ˜ 20 ng m-3), the updated vd more than halves BCair (by ˜ 20 ng m-3
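
    The resistance-in-series dry deposition velocity mentioned above is commonly written as the gravitational settling velocity plus the inverse of the combined aerodynamic and quasi-laminar resistances. The sketch below uses this standard textbook form with illustrative Arctic-like values; the model's actual parameterizations of the individual resistances over snow and ice are more detailed.

        def dry_deposition_velocity(r_a, r_b, v_s):
            """Resistance-in-series particle dry deposition velocity (m/s):
            aerodynamic resistance r_a and quasi-laminar layer resistance r_b (s/m),
            gravitational settling velocity v_s (m/s)."""
            return v_s + 1.0 / (r_a + r_b + r_a * r_b * v_s)

        # Illustrative values only: stable Arctic boundary layer over snow.
        print(dry_deposition_velocity(r_a=300.0, r_b=800.0, v_s=1e-5))  # ~9e-4 m/s, i.e. ~0.09 cm/s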

  19. Factors controlling seasonal variations in Arctic black carbon

    Science.gov (United States)

    Shen, Z.; Ming, Y.; Horowitz, L. W.

    2015-12-01

    Arctic haze has a distinct seasonality with higher concentrations in winter and spring. This study evaluates how different processes of large-scale circulation and removal control seasonal variations in Arctic black carbon (BC) using the Geophysical Fluid Dynamics Laboratory (GFDL) atmospheric general circulation model (AM3). We find that transport and wet deposition play unequal roles in determining Arctic BC seasonal cycle. Despite seasonal differences in general circulation patterns, the eddy-driven BC transport changes little throughout the year, and the seasonal cycle of Arctic BC is attributed to wet removal. BC hydrophilic fraction affected by the aging process and hydrophilic BC (BCpi) wet deposition rate affected by cloud microphysics determine wet deposition. Both low hydrophilic fraction and low wet deposition rate account for the peak of BC in winter. The transition to low BC in summer results from an increase in wet deposition rate, while the return of BC in late autumn is mainly caused by a sharp decrease in hydrophilic fraction. The results suggest that the concentrations of Arctic aerosols as well as their climate impacts may be susceptible to modification in a future climate.

  20. Carbohydrate as a factor controlling leaf development in cocoa

    Energy Technology Data Exchange (ETDEWEB)

    Machado, R.C.R.

    1986-01-01

    Cocoa shows growth periodicity of the shoot apex where periods of active new leaf development (flushing) alternate with periods of dormancy (interflush). This thesis presents the results of an investigation into the characteristics of leaf growth, and the production and translocation of photosynthate/carbohydrate between source and sink leaves, aimed at investigating the possible role of plant carbohydrate status in the control of the intermittent leaf production. The photosynthetic capacity of mature leaves did not increase during the phase of major increase in carbohydrate consumption by developing leaves but rather decreased slightly. Translocation of assimilated 14C from mature leaves was however significantly increased during the phase of rapid expansion of the new leaves. Compensatory changes in the 14C export from a single remaining source leaf after defoliation showed that mature leaves normally operate much below both their maximum photosynthate loading capacity and export potential. Partial removal of developing leaves within one flush resulted in increased 14C-photosynthate import into the remaining leaf, showing that a developing leaf has a greater import and unloading potential than that utilized during its development in one normal flush.

  1. Factors controlling navigation-channel Shoaling in Laguna Madre, Texas

    Science.gov (United States)

    Morton, R.A.; Nava, R.C.; Arhelger, M.

    2001-01-01

    Shoaling in the Gulf Intracoastal Waterway of Laguna Madre, Tex., is caused primarily by recycling of dredged sediments. Sediment recycling, which is controlled by water depth and location with respect to the predominant wind-driven currents, is minimal where dredged material is placed on tidal flats that are either flooded infrequently or where the water is extremely shallow. In contrast, nearly all of the dredged material placed in open water >1.5 m deep is reworked and either transported back into the channel or dispersed into the surrounding lagoon. A sediment flux analysis incorporating geotechnical properties demonstrated that erosion and not postemplacement compaction caused most sediment losses from the placement areas. Comparing sediment properties in the placement areas and natural lagoon indicated that the remaining dredged material is mostly a residual of initial channel construction. Experimental containment designs (shallow subaqueous mound, submerged levee, and emergent levee) constructed in high-maintenance areas to reduce reworking did not retain large volumes of dredged material. The emergent levee provided the greatest retention potential approximately 2 years after construction.

  2. CLOSTRIDIUM DIFFICILE INFECTION: RISK FACTORS, DIAGNOSIS AND CONTROL

    Directory of Open Access Journals (Sweden)

    Xhelil Koleci

    2012-03-01

    Full Text Available The epidemiology of Clostridium difficile infections (CDI) has changed over the past decade. In addition to dramatic worldwide increases in incidence, new CDI populations are emerging. These populations include patients with community-acquired infections with no previous antibiotic exposure, children, pregnant women and patients with IBD. Diagnosis of CDI requires the identification of C. difficile toxin A or B in diarrheal stool. Current diagnostic tests, however, remain inadequate and an optimal diagnostic testing algorithm has not yet been defined. Metronidazole and vancomycin are currently first-line agents for CDI treatment. Vancomycin, however, has demonstrated superior efficacy and therefore is the preferred agent in patients with severe infections. As with many antibiotics, the incidence of treatment failure with metronidazole is increasing, thereby emphasizing the need to find alternative treatments. Disease recurrence continues to occur in 20-40% of patients and its treatment remains challenging. In patients who develop fulminant colitis from a CDI, early surgical consultation is essential. Intravenous immunoglobulin and tigecycline have been used in patients with severe refractory disease; however, delaying surgery may be associated with worse outcomes. Due to the risk of horizontal transmission of C. difficile, infection control measures are necessary. Animals may serve as reservoirs for humans. Ongoing research by human and veterinary scientists into epidemiology, diagnosis, effective treatment protocols and prevention is essential.

  3. Digital power factor control and reactive power regulation for grid-connected photovoltaic inverter

    Energy Technology Data Exchange (ETDEWEB)

    Hassaine, L. [Power Electronics Systems Group, Universidad Carlos III de Madrid, Avda. de la Universidad 30, 28911 Leganes, Madrid (Spain); Ecole Nationale Polytechnique, Hassen Badi, El Harrach, Alger (Algeria); Olias, E.; Quintero, J. [Power Electronics Systems Group, Universidad Carlos III de Madrid, Avda. de la Universidad 30, 28911 Leganes, Madrid (Spain); Haddadi, M. [Ecole Nationale Polytechnique, Hassen Badi, El Harrach, Alger (Algeria)

    2009-01-15

    The overall efficiency of photovoltaic (PV) systems connected to the grid depends on the efficiency of the direct current (DC) to alternating current (AC) inverter conversion of the solar modules' output. The requirements for inverter connection include: maximum power point tracking, high efficiency, control of the power injected into the grid, high power factor and low total harmonic distortion of the currents injected into the grid. An approach to power factor control and reactive power regulation for PV systems connected to the grid using a field programmable gate array (FPGA) is proposed. According to the grid demands, both the injected active and reactive powers are controlled. In this paper, a new digital control strategy for a single-phase inverter is carried out. This control strategy is based on the phase shift between the inverter output voltage and the grid voltage, and on digital sinusoidal pulse width modulation (DSPWM) patterns, in order to control the power factor over a wide range of inverter output current; consequently, control and regulation of the reactive power are achieved. The advantage of the proposed control strategy is its implementation with simple digital circuits. In this work, a simulation study of this strategy has been carried out using Matlab/Simulink and PSIM. In order to validate its performance, this control has been implemented in an FPGA. Experimental tests have demonstrated the viability of this control for regulating the power factor and the power injected into the grid. (author)
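
    The phase-shift idea can be illustrated with the classic steady-state power-flow relations for an inverter coupled to the grid through a link reactance: the required phase shift and inverter voltage magnitude follow from the active and reactive power targets. The sketch below is only this simplified calculation, not the FPGA/DSPWM implementation, and the link reactance and power targets are assumed values.

        import math

        def inverter_setpoint(p_target, q_target, v_grid, x_link):
            """Phase shift (rad) and magnitude (V) of the inverter fundamental voltage
            needed to inject active power p_target and reactive power q_target into a
            grid of RMS voltage v_grid through a link reactance x_link (ohm),
            using the standard relations for a purely inductive link."""
            a = p_target * x_link / v_grid            # Vi * sin(delta)
            b = q_target * x_link / v_grid + v_grid   # Vi * cos(delta)
            return math.atan2(a, b), math.hypot(a, b)

        # Unity power factor example (Q = 0), assumed 230 V grid and 2 ohm link.
        delta, v_inv = inverter_setpoint(p_target=2000.0, q_target=0.0, v_grid=230.0, x_link=2.0)
        print(math.degrees(delta), v_inv)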

  4. Optimal Fuzzy PID Controller with Adjustable Factors and Its Application to Intelligent Artificial Legs

    Institute of Scientific and Technical Information of China (English)

    Tan Guanzheng(谭冠政); Xiao Hongfeng; Wang Yuechao

    2004-01-01

    A new kind of optimal fuzzy PID controller is proposed, which contains two parts. One is an on-line fuzzy inference mechanism and another is a conventional PID controller. In the fuzzy inference mechanism, three adjustable factors xp, xi, and xd are introduced. Their function is to further modify and optimize the result of the fuzzy inference to make the controller have the optimal control effect on a given object. The optimal values of these factors are determined based on the ITAE criterion and the flexible polyhedron search algorithm of Nelder and Mead. This PID controller has been used to control a D.C. motor of the intelligent artificial leg designed by the authors. The result of computer simulation indicates that the design of this controller is very effective and can be widely used to control different kinds of objects and processes.

  5. Optimal fuzzy PID controller with adjustable factors based on flexible polyhedron search algorithm

    Institute of Scientific and Technical Information of China (English)

    谭冠政; 肖宏峰; 王越超

    2002-01-01

    A new kind of optimal fuzzy PID controller is proposed, which contains two parts. One is an on-line fuzzy inference system, and the other is a conventional PID controller. In the fuzzy inference system, three adjustable factors xp, xi, and xd are introduced. Their functions are to further modify and optimize the result of the fuzzy inference so as to make the controller have the optimal control effect on a given object. The optimal values of these adjustable factors are determined based on the ITAE criterion and the Nelder-Mead flexible polyhedron search algorithm. This optimal fuzzy PID controller has been used to control the executive motor of the intelligent artificial leg designed by the authors. Computer simulation results indicate that this controller is very effective and can be widely used to control different kinds of objects and processes.
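
    The tuning step shared by the two records above (the ITAE criterion plus the Nelder-Mead flexible polyhedron search) can be sketched as a numerical optimization of the adjustable factors over a simulated step response. The plant, base gains and cost below are toy stand-ins for the fuzzy PID controller and the artificial-leg motor, included only to show the optimization idea.

        import numpy as np
        from scipy.optimize import minimize

        def itae(factors, dt=0.01, t_end=5.0):
            """ITAE cost of a unit-step response for a simple first-order plant under a
            PID controller whose base gains are scaled by the adjustable factors
            (xp, xi, xd); illustrative stand-in for the fuzzy-PID scheme."""
            xp, xi, xd = factors
            kp, ki, kd = 2.0 * xp, 1.0 * xi, 0.1 * xd     # base PID gains, scaled
            y, integ, e_prev, cost = 0.0, 0.0, 1.0, 0.0
            for k in range(int(t_end / dt)):
                e = 1.0 - y                               # unit step reference
                integ += e * dt
                u = kp * e + ki * integ + kd * (e - e_prev) / dt
                e_prev = e
                y += dt * (-y + u)                        # plant: dy/dt = -y + u
                cost += (k * dt) * abs(e) * dt            # ITAE integral
            return cost

        res = minimize(itae, x0=[1.0, 1.0, 1.0], method="Nelder-Mead")
        print(res.x, res.fun)   # optimized adjustable factors and ITAE value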

  6. Biodiversity mediates top-down control in eelgrass ecosystems : A global comparative-experimental approach

    NARCIS (Netherlands)

    Duffy, J Emmett; Reynolds, Pamela L; Boström, Christoffer; Coyer, James A; Cusson, Mathieu; Donadi, Serena; Douglass, James G; Eklöf, Johan S; Engelen, Aschwin H; Eriksson, Britas Klemens; Fredriksen, Stein; Gamfeldt, Lars; Gustafsson, Camilla; Hoarau, Galice; Hori, Masakazu; Hovel, Kevin; Iken, Katrin; Lefcheck, Jonathan S; Moksnes, Per-Olav; Nakaoka, Masahiro; O'Connor, Mary I; Olsen, Jeanine L; Richardson, J Paul; Ruesink, Jennifer L; Sotka, Erik E; Thormar, Jonas; Whalen, Matthew A; Stachowicz, John J

    2015-01-01

    Nutrient pollution and reduced grazing each can stimulate algal blooms as shown by numerous experiments. But because experiments rarely incorporate natural variation in environmental factors and biodiversity, conditions determining the relative strength of bottom-up and top-down forcing remain unres

  7. Geological factors controlling radon hazardous concentration in groundwater

    Science.gov (United States)

    Przylibski, T. A.

    2009-04-01

    Radon waters are classified as waters containing more than 100 Bq/L of Rn-222. In many regions radon groundwaters are commonly used as tap water. Exploitation of radon groundwater without removing radon from the water at the intake may be hazardous for the consumers. Radon removal is relatively simple and cheap, and may be achieved through degassing of the tapped water. The following factors are crucial for the genesis of radon (Rn-222) and changes in its concentration in groundwaters: the content of parent Ra-226 in the reservoir rock, the emanation coefficient of the reservoir rock, mixing of various groundwater components. Simplifying the geochemical characteristics of Ra-226, one can say that the highest radium contents outside uranium deposits could be expected above all in crystalline rocks such as granites, ryolites and gneisses, and among sedimentary rocks - in fine-grained rocks - mudstones and clay rocks. Therefore the highest content of Rn-222 is characteristic of groundwaters flowing through the abovementioned rocks. What is very important for the genesis of groundwater dissolved Rn-222 is not only the total content of Ra-226 in the aquifer, but also the distribution of this isotope's atoms in relation to the surface of mineral grains (crystals) and crack surfaces. Only if Ra-226 atoms lie in the outer zone of grains (crystals) can they be the source of Rn-222 atoms released directly or indirectly into pores and fissures. If the pores and fissures are filled with free groundwater, then the radon dissolved in this water can migrate with it. Therefore particularly high Rn-222 concentration values can be expected in groundwaters circulating in zones of strongly cracked reservoir rocks, i.e. in the weathering zone, reaching the depth of several dozen meters below ground surface, as well as in zones of brittle tectonic deformations. The number of Rn-222 atoms formed in groundwater as a result of the decay of Ra-226 ion (Ra2+) dissolved in this water
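
    For orientation, the interplay of Ra-226 content, emanation coefficient and porosity described above can be condensed into a textbook upper bound on pore-water radon at secular equilibrium. The sketch below is a generic estimate with illustrative granite-like values; the formula and the numbers are not taken from the abstract itself.

        def max_porewater_radon(a_ra, emanation, grain_density, porosity):
            """Upper bound on Rn-222 activity concentration in pore water (Bq/L) at
            secular equilibrium, for a water-saturated rock with Ra-226 activity a_ra
            (Bq/kg of solid), emanation coefficient (0-1), grain density (kg/m3) and
            porosity (0-1), assuming all emanated radon stays dissolved in the water."""
            bq_per_m3_water = a_ra * emanation * grain_density * (1 - porosity) / porosity
            return bq_per_m3_water / 1000.0           # Bq/m3 -> Bq/L

        # Illustrative fractured-granite values: well above the 100 Bq/L threshold.
        print(max_porewater_radon(a_ra=100.0, emanation=0.2, grain_density=2650.0, porosity=0.05))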

  8. Correction factor based double model fuzzy logic control strategy of arc voltage in pulsed MIG welding

    Institute of Scientific and Technical Information of China (English)

    Wu Kaiyuan; Huang Shisheng; Meng Yongmin

    2005-01-01

    According to the characteristics of arc voltage control when welding steel with pulsed MIG welding, a correction factor based double-model fuzzy logic controller (FLC) was developed to realize arc voltage control by means of arc voltage feedback. When the error of the peak arc voltage was large, a coarse-adjusting fuzzy logic control rule set with a correction factor was designed; in this controller, the peak arc voltage was controlled by the wire feeding speed by means of arc voltage feedback. When the error of the peak arc voltage was small, a fine-adjusting fuzzy logic control rule set with a correction factor was designed; in this controller, the peak arc voltage was controlled by the background time by means of arc voltage feedback. The FLC was realized with a look-up table (LUT) method. Experiments were carried out to apply the control strategy to control arc length changes during the welding process. Experimental results show that the proposed controller achieves consistency of arc length and stability of arc voltage and the welding process in pulsed MIG welding.
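
    The coarse/fine switching logic described above can be sketched as two lookup tables selected by the magnitude of the peak arc-voltage error, one acting on wire feeding speed and the other on pulse background time. The threshold, quantization steps and table entries below are placeholders, not the paper's fuzzy rule bases.

        def quantize(error, step):
            """Map a continuous error onto the discrete key used by a lookup table."""
            return max(-3, min(3, int(round(error / step))))

        # Placeholder lookup tables standing in for the fuzzy rule bases
        # (keys: quantized error level, values: control increments).
        COARSE_LUT = {k: -0.5 * k for k in range(-3, 4)}   # wire feed speed change (m/min)
        FINE_LUT = {k: -0.2 * k for k in range(-3, 4)}     # background time change (ms)

        def arc_voltage_correction(error_v, threshold=2.0):
            """Dual-model action: large peak arc-voltage errors are handled by the
            coarse table acting on wire feeding speed, small errors by the fine table
            acting on pulse background time; values are illustrative only."""
            if abs(error_v) >= threshold:
                return "wire_feed_speed", COARSE_LUT[quantize(error_v, step=1.0)]
            return "background_time", FINE_LUT[quantize(error_v, step=0.5)]

        print(arc_voltage_correction(3.2))   # coarse correction
        print(arc_voltage_correction(0.8))   # fine correction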

  9. Putative paternal factors controlling chilling tolerance in Korean market-type cucumber (Cucumis sativus L.)

    Science.gov (United States)

    Chilling temperatures damage cucumber (Cucumis sativus L.) plants during winter and early spring growing seasons. Inheritance of chilling tolerance in U.S. processing cucumber is controlled by cytoplasmic (maternally inherited) and nuclear factors. To understand inherit...

  10. Verification and validation of human factors issues in control room design and upgrades

    Energy Technology Data Exchange (ETDEWEB)

    Green, M.; Collier, S. [Inst. for Energiteknikk, Halden (Norway). OECD Halden Reactor Project

    1999-12-01

    Systems, facilities and equipment are periodically updated during a power plant's lifetime. This has human factors implications, especially if the central control room is involved. Human factors work may therefore be required. There is an extensive literature on human factors itself, but not so much on how it is verified and validated. Therefore, HRP and the Swedish Nuclear Power Inspectorate commissioned a study. The objective was to review the literature and establish a knowledge base on verification and validation (V and V) of human factors issues. The report first discusses verification and validation as applied to human factors work. It describes a design process and the typical human factors topics involved. It then presents a generic method for V and V of human factors. This is built on a review of standards, guidelines and other references given in an annotated bibliography. The method is illustrated by application to some human factors topics.

  11. Modeling and Control of DC/DC Boost Converter using K-Factor Control for MPPT of Solar PV System

    DEFF Research Database (Denmark)

    Vangari, Adithya; Haribabu, Divyanagalakshmi; Sakamuri, Jayachandra N.

    2015-01-01

    This paper is focused on the design of a controller for the DC/DC boost converter using K-factor control, which is based on a modified PI control method, for maximum power point tracking (MPPT) of a solar PV system. A mathematical model for the boost converter based on a small-signal averaging approach is presented. Design of the passive elements of the boost converter as per the system specifications is also illustrated. The performance of the proposed K-factor control method is verified with simulations for MPPT on a solar PV system at different atmospheric conditions. A new circuit-based model for a solar PV array, which includes the effect of solar insolation and temperature on PV array output, for application in power system transient simulations, is also presented. The performance of the PV array model is verified with simulations at different atmospheric conditions. A 160W PV module from BP......
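
    The K-factor method referred to in the title is commonly used to place a compensator's zero and pole symmetrically around the desired crossover frequency so as to deliver a required phase boost. The sketch below shows only the standard Type-II placement; the gain sizing and the converter-specific numbers from the paper are omitted.

        import math

        def type2_k_factor(f_c, phase_boost_deg):
            """Venable K-factor placement for a Type-II compensator: given the desired
            crossover frequency f_c (Hz) and the phase boost needed there, return the
            compensator zero and pole frequencies (Hz)."""
            k = math.tan(math.radians(45.0 + phase_boost_deg / 2.0))
            return f_c / k, f_c * k                    # (f_zero, f_pole)

        # Example: 1 kHz crossover and 60 deg of phase boost from the compensator.
        f_z, f_p = type2_k_factor(1000.0, 60.0)
        print(round(f_z, 1), round(f_p, 1))            # ~268 Hz zero, ~3732 Hz pole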

  12. Neonatal risk factors for cerebral palsy in very preterm babies: case-control study.

    OpenAIRE

    Murphy, D. J.; Hope, P. L.; Johnson, A.

    1997-01-01

    OBJECTIVE: To identify neonatal risk factors for cerebral palsy among very preterm babies and in particular the associations independent of the coexistence of antenatal and intrapartum factors. DESIGN: Case-control study. SETTING: Oxford health region. SUBJECTS: Singleton babies born between 1984 and 1990 at less than 32 weeks' gestation who survived to discharge from hospital: 59 with cerebral palsy and 234 randomly selected controls without cerebral palsy. MAIN OUTCOME MEASURES: Adverse neo...

  13. Voltage-Sensitive Load Controllers for Voltage Regulation and Increased Load Factor in Distribution Systems

    DEFF Research Database (Denmark)

    Douglass, Philip James; Garcia-Valle, Rodrigo; Østergaard, Jacob

    2014-01-01

    This paper presents a novel controller design for controlling appliances based on local measurements of voltage. The controller finds the normalized voltage deviation accounting for the sensitivity of voltage measurements to appliance state. The controller produces a signal indicating desired power consumption which can be mapped to temperature setpoint offsets of thermostat controlled loads. In networks where a lower voltage level corresponds to high system load (and vice versa), this controller acts to regulate voltage and increase the load factor. Simulations are conducted on low- and medium-voltage distribution systems with residential loads including voltage-sensitive water heaters. In low-voltage systems, the results of the simulations show the controller to be effective at reducing the extremes of voltage and increasing the load factor while respecting end-use temperature constraints. In medium-voltage......
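
    A plausible reading of the control law described above, as a sketch: the measured voltage is corrected for the appliance's own influence, normalized against an expected voltage band, and mapped to a thermostat setpoint offset so that low voltage (high system load) defers heating. The signs, band and scaling below are assumptions for illustration, not the paper's exact design.

        def setpoint_offset(v_meas, v_nominal, v_band, offset_max, v_self=0.0):
            """Map a locally measured voltage to a thermostat setpoint offset (deg C).
            v_self compensates for the voltage change caused by the appliance's own
            state (its 'sensitivity'), v_band is the expected voltage excursion and
            offset_max the largest allowed setpoint shift."""
            deviation = (v_meas - v_self - v_nominal) / v_band      # normalized deviation
            deviation = max(-1.0, min(1.0, deviation))
            # Low voltage (high system load) -> lower the setpoint -> defer heating.
            return offset_max * deviation

        print(setpoint_offset(v_meas=228.0, v_nominal=230.0, v_band=10.0, offset_max=4.0))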

  14. Dynamic increase and decrease of photonic crystal nanocavity Q factors for optical pulse control.

    Science.gov (United States)

    Upham, Jeremy; Tanaka, Yoshinori; Asano, Takashi; Noda, Susumu

    2008-12-22

    We introduce recent advances in dynamic control over the Q factor of a photonic crystal nanocavity system. By carefully timing a rapid increase of the Q factor from 3800 to 22,000, we succeed in capturing a 4 ps signal pulse within the nanocavity with a photon lifetime of 18 ps. By performing an additional transition of the Q factor within the photon lifetime, the held light is once again ejected from the system on demand.
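
    The reported photon lifetime follows directly from the Q factor via tau = Q/omega0. The short check below assumes a 1550 nm resonance wavelength, which is typical for such silicon photonic crystal nanocavities but not stated in the abstract.

        import math

        def photon_lifetime(q_factor, wavelength_m=1.55e-6):
            """Cavity photon lifetime tau = Q / omega0 for a resonance at the
            given (assumed) wavelength."""
            omega0 = 2 * math.pi * 299792458.0 / wavelength_m
            return q_factor / omega0

        print(photon_lifetime(22000) * 1e12)   # ~18 ps, consistent with the abstract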

  15. Interactions in a tritrophic acarine predator-prey metapopulation system V: Within-plant dynamics of Phytoseiulus persimilis and Tetranychus urticae (Acari: Phytoseiidae, Tetranychidae)

    DEFF Research Database (Denmark)

    Nachman, Gösta; Zemek, Rostislav

    2003-01-01

    Biological control, Bottom-up factor, Phytoseiulus persimilis, Plant condition, Predacious mites, Simulation model, Tetranychus urticae, Top-down factor, Two-spotted spider mites

  16. Community-Based School Finance and Accountability: A New Era for Local Control in Education Policy?

    Science.gov (United States)

    Vasquez Heilig, Julian; Ward, Derrick R.; Weisman, Eric; Cole, Heather

    2014-01-01

    Top-down accountability policies have arguably had very limited impact over the past 20 years. Education stakeholders are now contemplating new forms of bottom-up accountability. In 2013, policymakers in California enacted a community-based approach that creates the Local Control Funding Formula (LCFF) process for school finance to increase…

  17. Main controlling factors of distribution and genetics of marine reservoirs in China

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    Marine reservoirs are mainly made up of clastic and carbonate reservoirs, which are widely distributed in the central Tarim, Sichuan and Ordos basins from the Pre-Cambrian to the Cenozoic, mainly in the Palaeozoic. Marine clastic reservoirs developed in foreshore and nearshore, tidal flat and delta environments. The sedimentary facies are important controlling factors for reservoir quality. Compaction, pressure solution and cementation are factors that decrease porosity, while a low palaeo-temperature gradient, early emplacement of oil and gas, and dissolution are favorable for the preservation of pores. Carbonate reservoirs are divided into reef and bank, karst, dolomite and fracture reservoirs. Dolomitization, dissolution, TSR and fracturing are important factors controlling carbonate reservoir quality.

  18. A novel arc welding inverter with unit power factor based on DSP control

    Institute of Scientific and Technical Information of China (English)

    Chen Shujun; Zeng Hua; Du Li; Yin Shuyan; Chen Yonggang

    2006-01-01

    A novel inverter power source is developed, characterized by constant output current and unit power factor input. A digital signal processor (DSP) is used to realize power factor correction and control of the back-stage inverter bridge of the arc welding inverter. The fore-stage adopts a double closed-loop proportional-integral (PI) rectifier technique and the back-stage adopts a digital pulse width modulation (PWM) technique. Simulated waveforms are obtained in Matlab/Simulink and validated by experiments. Experiments on the prototype showed that the total harmonic distortion (THD) can be controlled within 10% and the power factor is close to unity.
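
    As a hedged sketch of the digital control loop this record describes only in outline, the snippet below implements a generic discrete-time PI regulator of the kind a DSP might run for the current loop of a power-factor-correction front end. The class name, gains, limits and sample period are placeholders, not values from the paper.

```python
# Sketch: discrete PI regulator such as a DSP might run in a PFC current loop.
# Gains, limits and sample period are illustrative placeholders.
class DiscretePI:
    def __init__(self, kp, ki, ts, out_min=0.0, out_max=1.0):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0

    def update(self, reference, measurement):
        error = reference - measurement
        self.integral += self.ki * error * self.ts          # integrate the error
        # simple anti-windup: clamp the integrator to the output range
        self.integral = max(self.out_min, min(self.out_max, self.integral))
        output = self.kp * error + self.integral             # PI control law
        return max(self.out_min, min(self.out_max, output))  # duty cycle 0..1

# Example: one control step at a 20 kHz sampling rate (placeholder values)
pi = DiscretePI(kp=0.2, ki=150.0, ts=1 / 20_000)
duty = pi.update(reference=10.0, measurement=9.2)   # amps: reference vs sensed current
print(f"duty cycle command: {duty:.3f}")
```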

  19. Research on the sudden changes and the controlling factors of deep coal mining conditions

    Institute of Scientific and Technical Information of China (English)

    HU Wei-yue; DONG Shu-ning

    2008-01-01

    It is illustrated that the mining conditions inducing disasters change with depth, both gradually and through sudden changes. The sudden-change depths for different disaster conditions differ and are controlled by different factors. High temperature and its change with depth are mainly controlled by strata structures and the heat conductivity of the rock; high rock stress, dynamic engineering disasters and their change with depth are mainly controlled by tectonic conditions, the rock properties of the roof strata and the mechanical properties of deep rock; coal mine water disasters and their change with depth are mainly controlled by the rock mechanical properties of the coal seam floor and regional groundwater circulation conditions; and gas disaster conditions and their change with depth are mainly controlled by the burial conditions of the coal seam and the opening conditions of geological structures. It is noted that the key to controlling deep coal mining disasters is to clearly understand the sudden-change depth of each disaster-causing factor.

  20. Factors contributing to poor glycaemic control in diabetic patients at Mopani District

    Directory of Open Access Journals (Sweden)

    N.H. Shiluban

    2009-09-01

    Diabetes mellitus is not only a major burden in the developed world, it is also an increasing health problem in less developed countries. Although health education could be a tool to achieve better glycaemic control, it is important to understand that health education should be adjusted to patients' literacy, cultural environment and economic status. Among other factors, lack of money has an influence on the outcome of diabetes mellitus. Thus the purpose of the study was to identify factors contributing to poor glycaemic control in diabetic patients. Data were collected using a self-report questionnaire on a convenience sample of 32 diabetic patients and unstructured, open-ended interviews with eight patients, in order to allow them freedom to express themselves with regard to factors that contribute to poor glycaemic control. Data were then analysed using the Statistical Package for the Social Sciences. Socioeconomic factors appeared to have a significant influence on glycaemic control among participants; for instance, 75% of the total subjects (32) indicated that they experienced problems accessing health care services due to lack of money. Ignorance of where to seek support systems, such as educational programmes and nutrition counselling, was also identified as contributing to diabetic patients' poor glycaemic control. Permission to conduct the study was obtained from the Provincial Department of Health and the managers of the institutions where the study was conducted. Recommendations for dealing with the identified factors have been formulated.