WorldWideScience

Sample records for large scale production

  1. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  2. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a break-even price level. Norway possesses vast energy resources and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  3. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so large scale hydrogen production plants will need to be installed. In this context, the development of low-cost large scale electrolysers that could run on 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large scale electrolysers for future hydrogen production. The different electrolysis technologies were compared, and a state of the art of the currently available electrolysis modules was compiled. The large scale electrolysis plants installed around the world were reviewed, and the main projects related to large scale electrolysis were listed. The economics of large scale electrolysers were discussed, and the influence of energy prices on the hydrogen production cost by large scale electrolysis was evaluated. (authors)

  4. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850 °C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100 °C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  5. Wind and Photovoltaic Large-Scale Regional Models for hourly production evaluation

    DEFF Research Database (Denmark)

    Marinelli, Mattia; Maule, Petr; Hahmann, Andrea N.

    2015-01-01

    This work presents two large-scale regional models used for the evaluation of normalized power output from wind turbines and photovoltaic power plants on a European regional scale. The models give an estimate of renewable production on a regional scale with 1 h resolution, starting from a mesoscale ... of the transmission system, especially regarding the cross-border power flows. The tuning of these regional models is done using historical meteorological data acquired on a per-country basis and using publicly available data of installed capacity.

  6. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10⁶ m³/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet), including liquid fuels with no net greenhouse gas emissions, and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen; and (3) create the industrial partnerships to commercialize such technologies. (author)

  7. Some effects of integrated production planning in large-scale kitchens

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Jacobsen, Peter

    2005-01-01

    Integrated production planning in large-scale kitchens proves advantageous for increasing the overall quality of the food produced and the flexibility in terms of a diverse food supply. The aim is to increase the flexibility and the variability in the production as well as the focus on freshness ...

  8. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

    This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in a large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in the daily production. In the kitchen ... not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results show that Lean can be applied and used to manage the production of meals in the kitchen.

  9. Reproducible, large-scale production of thallium-based high-temperature superconductors

    International Nuclear Information System (INIS)

    Gay, R.L.; Stelman, D.; Newcomb, J.C.; Grantham, L.F.; Schnittgrund, G.D.

    1990-01-01

    This paper reports on the development of a large scale spray-calcination technique generic to the preparation of ceramic high-temperature superconductor (HTSC) powders. Among the advantages of the technique is the production of uniformly mixed metal oxides on a fine scale. Production of both yttrium- and thallium-based HTSCs has been demonstrated using this technique. In the spray calciner, solutions of the desired composition are atomized as a fine mist into a hot gas. Evaporation and calcination are instantaneous, yielding an extremely fine, uniform oxide powder. The calciner is 76 cm in diameter and can produce metal oxide powder at relatively large rates (approximately 100 g/h) without contamination.

  10. Process optimization of large-scale production of recombinant adeno-associated vectors using dielectric spectroscopy.

    Science.gov (United States)

    Negrete, Alejandro; Esteban, Geoffrey; Kotin, Robert M

    2007-09-01

    A well-characterized manufacturing process for the large-scale production of recombinant adeno-associated vectors (rAAV) for gene therapy applications is required to meet current and future demands for pre-clinical and clinical studies and potential commercialization. Economic considerations argue in favor of suspension culture-based production. Currently, the only feasible method for large-scale rAAV production utilizes baculovirus expression vectors and insect cells in suspension cultures. To maximize yields and achieve reproducibility between batches, online monitoring of various metabolic and physical parameters is useful for characterizing early stages of baculovirus-infected insect cells. In this study, rAAVs were produced at 40-l scale, yielding ~1 × 10^15 particles. During the process, dielectric spectroscopy was performed by real-time scanning at radio frequencies between 300 kHz and 10 MHz. The corresponding permittivity values were correlated with rAAV production. Both infected and uninfected cell cultures reached a maximum permittivity value; however, only the permittivity profile of infected cultures reached a second maximum. This effect was correlated with the optimal harvest time for rAAV production. Analysis of rAAV indicated that harvesting at around 48 h post-infection (hpi) and at 72 hpi produced similar quantities of biologically active rAAV. Thus, if operated continuously, the 24-h reduction in the rAAV production process gives sufficient time for an additional 18 runs a year, corresponding to an extra production of ~2 × 10^16 particles. As part of large-scale optimization studies, this new finding will facilitate the bioprocessing scale-up of rAAV and other bioproducts.
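    The throughput claim above can be sanity-checked with a little arithmetic. The sketch below assumes a total batch turnaround of about 120 h (a hypothetical figure; the abstract does not state the full batch duration), under which shaving 24 h off each run yields roughly the reported 18 extra runs per year:

```python
# Back-of-envelope check of the reported throughput gain.
# ASSUMPTION: total batch turnaround (seed, infect, harvest, turnaround)
# is ~120 h; this figure is hypothetical, not stated in the abstract.
HOURS_PER_YEAR = 365 * 24      # 8760 h
PARTICLES_PER_RUN = 1e15       # ~1 x 10^15 particles at 40-l scale (abstract)

def runs_per_year(batch_hours: float) -> float:
    """Back-to-back runs achievable in one year of continuous operation."""
    return HOURS_PER_YEAR / batch_hours

# Harvesting at 48 hpi instead of 72 hpi shortens each run by 24 h.
extra_runs = runs_per_year(120 - 24) - runs_per_year(120)
extra_particles = extra_runs * PARTICLES_PER_RUN

print(f"extra runs/year: {extra_runs:.1f}")        # ~18, matching the abstract
print(f"extra particles: {extra_particles:.1e}")   # ~1.8e16, i.e. ~2 x 10^16
```

    With these assumptions the numbers reproduce the abstract's "18 runs a year" and "~2 × 10^16 particles" figures.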

  11. LARGE-SCALE HYDROGEN PRODUCTION FROM NUCLEAR ENERGY USING HIGH TEMPERATURE ELECTROLYSIS

    International Nuclear Information System (INIS)

    O'Brien, James E.

    2010-01-01

    Hydrogen can be produced from water splitting with relatively high efficiency using high-temperature electrolysis. This technology makes use of solid-oxide cells, running in the electrolysis mode to produce hydrogen from steam while consuming electricity and high-temperature process heat. When coupled to an advanced high-temperature nuclear reactor, the overall thermal-to-hydrogen efficiency for high-temperature electrolysis can be as high as 50%, which is about double the overall efficiency of conventional low-temperature electrolysis. Current large-scale hydrogen production is based almost exclusively on steam reforming of methane, a method that consumes a precious fossil fuel while emitting carbon dioxide to the atmosphere. Demand for hydrogen is increasing rapidly for refining of increasingly low-grade petroleum resources, such as the Athabasca oil sands, and for ammonia-based fertilizer production. Large quantities of hydrogen are also required for carbon-efficient conversion of biomass to liquid fuels. With supplemental nuclear hydrogen, almost all of the carbon in the biomass can be converted to liquid fuels in a nearly carbon-neutral fashion. Ultimately, hydrogen may be employed as a direct transportation fuel in a 'hydrogen economy.' The large quantity of hydrogen that would be required for this concept should be produced without consuming fossil fuels or emitting greenhouse gases. An overview of the high-temperature electrolysis technology will be presented, including basic theory, modeling, and experimental activities. Modeling activities include both computational fluid dynamics and large-scale systems analysis. We have also demonstrated high-temperature electrolysis in our laboratory at the 15 kW scale, achieving a hydrogen production rate in excess of 5500 L/hr.
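    The 15 kW / 5500 L/hr demonstration figures imply a specific electricity consumption that is easy to check. The sketch assumes the flow is quoted in normal litres (the abstract does not specify reference conditions) and takes the higher heating value of hydrogen as ~3.54 kWh/Nm³, a standard literature figure:

```python
# Rough energy check on the 15 kW / 5500 L/h demonstration figures.
# ASSUMPTION: flow is in normal litres; HHV of H2 ~= 3.54 kWh/Nm^3.
POWER_KW = 15.0
H2_RATE_L_PER_H = 5500.0
HHV_KWH_PER_NM3 = 3.54

# Specific electrical energy per normal cubic metre of hydrogen.
specific_electricity = POWER_KW / (H2_RATE_L_PER_H / 1000.0)  # kWh/Nm^3
print(f"electricity use: {specific_electricity:.2f} kWh/Nm^3")  # ~2.73

# The electrical input alone is below the hydrogen's heating value
# (ratio > 1) because high-temperature process heat supplies part of the
# splitting energy -- the hallmark of high-temperature electrolysis.
ratio = HHV_KWH_PER_NM3 / specific_electricity
print(f"HHV / electrical input: {ratio:.2f}")
```

    An electricity demand of ~2.7 kWh/Nm³, below the ~3 kWh/Nm³ typical of low-temperature electrolysis, is consistent with the efficiency advantage the abstract describes.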

  12. The use of production management techniques in the construction of large scale physics detectors

    CERN Document Server

    Bazan, A; Estrella, F; Kovács, Z; Le Flour, T; Le Goff, J M; Lieunard, S; McClatchey, R; Murray, S; Varga, L Z; Vialle, J P; Zsenei, M

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs and managers employ so- called Workflow Management software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an information System for Tracking Assembly Lifecycles) in use in CMS which successfully integrates PDM and WfMS techniques in managing large scale physics detector ...

  13. The use of production management techniques in the construction of large scale physics detectors

    International Nuclear Information System (INIS)

    Bazan, A.; Chevenier, G.; Estrella, F.

    1999-01-01

    The construction process of detectors for the Large Hadron Collider (LHC) experiments is large scale, heavily constrained by resource availability and evolves with time. As a consequence, changes in detector component design need to be tracked and quickly reflected in the construction process. With similar problems in industry, engineers employ so-called Product Data Management (PDM) systems to control access to documented versions of designs and managers employ so-called Workflow Management Software (WfMS) to coordinate production work processes. However, PDM and WfMS software are not generally integrated in industry. The scale of LHC experiments, like CMS, demands that industrial production techniques be applied in detector construction. This paper outlines the major functions and applications of the CRISTAL system (Cooperating Repositories and an Information System for Tracking Assembly Lifecycles) in use in CMS which successfully integrates PDM and WfMS techniques in managing large scale physics detector construction. This is the first time industrial production techniques have been deployed to this extent in detector construction

  14. Large-Scale Production of Nanographite by Tube-Shear Exfoliation in Water.

    Directory of Open Access Journals (Sweden)

    Nicklas Blomquist

    Full Text Available The number of applications based on graphene, few-layer graphene, and nanographite is rapidly increasing. A large-scale process for production of these materials is critically needed to achieve cost-effective commercial products. Here, we present a novel process to mechanically exfoliate industrial quantities of nanographite from graphite in an aqueous environment with low energy consumption and at controlled shear conditions. This process, based on hydrodynamic tube shearing, produced nanometer-thick and micrometer-wide flakes of nanographite at a production rate exceeding 500 g h⁻¹ with an energy consumption of about 10 Wh g⁻¹. In addition, to facilitate large-area coating, we show that the nanographite can be mixed with nanofibrillated cellulose in the process to form highly conductive, robust and environmentally friendly composites. This composite has a sheet resistance below 1.75 Ω/sq and an electrical resistivity of 1.39 × 10⁻⁴ Ω m and may find use in several applications, from supercapacitors and batteries to printed electronics and solar cells. A batch of 100 litres was processed in less than 4 hours. The design of the process allows scaling to even larger volumes, and the low energy consumption indicates a low-cost process.
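    Two of the figures quoted above can be cross-checked. Sheet resistance and bulk resistivity together imply a film thickness (Rs = ρ/t), and the production rate and specific energy imply a process power; both derived values are our inference, not numbers stated in the abstract:

```python
# Cross-check of the reported electrical figures.
# Derived thickness and power are inferences, not stated in the abstract.
RESISTIVITY_OHM_M = 1.39e-4   # bulk resistivity of the composite
SHEET_RES_OHM_SQ = 1.75       # sheet resistance

# Sheet resistance is Rs = rho / t, so the implied film thickness is
# t = rho / Rs.
thickness_m = RESISTIVITY_OHM_M / SHEET_RES_OHM_SQ
print(f"implied film thickness: {thickness_m * 1e6:.0f} um")  # ~79 um

# Process power implied by 500 g/h at ~10 Wh/g.
power_kw = 500 * 10 / 1000
print(f"implied exfoliation power: {power_kw:.1f} kW")  # 5.0 kW
```

    A ~79 µm film and a ~5 kW process are both plausible magnitudes for a 100-litre tube-shear batch, which suggests the quoted numbers are internally consistent.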

  15. Very-large-scale production of antibodies in plants: The biologization of manufacturing.

    Science.gov (United States)

    Buyel, J F; Twyman, R M; Fischer, R

    2017-07-01

    Gene technology has facilitated the biologization of manufacturing, i.e. the use and production of complex biological molecules and systems at an industrial scale. Monoclonal antibodies (mAbs) are currently the major class of biopharmaceutical products, but they are typically used to treat specific diseases which individually have comparably low incidences. The therapeutic potential of mAbs could also be used for more prevalent diseases, but this would require a massive increase in production capacity that could not be met by traditional fermenter systems. Here we outline the potential of plants to be used for the very-large-scale (VLS) production of biopharmaceutical proteins such as mAbs. We discuss the potential market sizes and their corresponding production capacities. We then consider available process technologies and scale-down models and how these can be used to develop VLS processes. Finally, we discuss which adaptations will likely be required for VLS production, lessons learned from existing cell culture-based processes and the food industry, and practical requirements for the implementation of a VLS process. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  16. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  17. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy, and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither produce innovative goods with regard to profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it be judged successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the form of organization, and (3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  18. Large scale production and downstream processing of a recombinant porcine parvovirus vaccine

    NARCIS (Netherlands)

    Maranga, L.; Rueda, P.; Antonis, A.F.G.; Vela, C.; Langeveld, J.P.M.; Casal, J.I.; Carrondo, M.J.T.

    2002-01-01

    Porcine parvovirus (PPV) virus-like particles (VLPs) constitute a potential vaccine for prevention of parvovirus-induced reproductive failure in gilts. Here we report the development of a large scale (25 l) production process for PPV-VLPs with baculovirus-infected insect cells. A low multiplicity of

  19. A test trial irradiation of natural rubber latex on large scale for the production of examination gloves in a production scale

    International Nuclear Information System (INIS)

    Devendra, R.; Kulatunge, S.; Chandralal, H.N.K.K.; Kalyani, N.M.V.; Seneviratne, J.; Wellage, S.

    1996-01-01

    Radiation vulcanization of natural rubber latex has been developed extensively through various research and development programmes. During these investigations much data was collected, which proved that radiation vulcanized natural rubber latex (RVNRL) can be used as a new material for industry (RVNRL symposium 1989; Makuuchi IAEA report). This material has been extensively tested in the making of dipped goods and extruded products. However, these investigations were confined to laboratory experiments, which mainly reflected the material properties of RVNRL; only a little was observed about its behavior in actual production scale operation. The present exercise was carried out mainly to study the behavior of the material in production scale by irradiating latex on a large scale and producing gloves in a production scale plant. It was found that RVNRL can be used in conventional glove plants without major alterations to the plant, and that the quality of the gloves produced using RVNRL is acceptable. It was also found that small deviations in the vulcanization dose affect the crosslinking density of the films, which drastically reduces the tensile strength of the film. Crosslinking density, or pre-vulcanized relax modulus (PRM) at 100%, is a reliable property for controlling the pre-vulcanization of latex by radiation

  20. A 3D Sphere Culture System Containing Functional Polymers for Large-Scale Human Pluripotent Stem Cell Production

    Directory of Open Access Journals (Sweden)

    Tomomi G. Otsuji

    2014-05-01

    Full Text Available Utilizing human pluripotent stem cells (hPSCs in cell-based therapy and drug discovery requires large-scale cell production. However, scaling up conventional adherent cultures presents challenges of maintaining a uniform high quality at low cost. In this regard, suspension cultures are a viable alternative, because they are scalable and do not require adhesion surfaces. 3D culture systems such as bioreactors can be exploited for large-scale production. However, the limitations of current suspension culture methods include spontaneous fusion between cell aggregates and suboptimal passaging methods by dissociation and reaggregation. 3D culture systems that dynamically stir carrier beads or cell aggregates should be refined to reduce shearing forces that damage hPSCs. Here, we report a simple 3D sphere culture system that incorporates mechanical passaging and functional polymers. This setup resolves major problems associated with suspension culture methods and dynamic stirring systems and may be optimal for applications involving large-scale hPSC production.

  1. LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY

    International Nuclear Information System (INIS)

    SCHULTZ, K.R.; BROWN, L.C.; BESENBRUCH, G.E.; HAMILTON, C.J.

    2003-01-01

    OAK B202 LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY. The ''Hydrogen Economy'' will reduce petroleum imports and greenhouse gas emissions. However, current commercial hydrogen production processes use fossil fuels and release carbon dioxide. Hydrogen produced from nuclear energy could avoid these concerns. The authors have recently completed a three-year project for the US Department of Energy whose objective was to ''define an economically feasible concept for production of hydrogen, by nuclear means, using an advanced high-temperature nuclear reactor as the energy source''. Thermochemical water-splitting, a chemical process that accomplishes the decomposition of water into hydrogen and oxygen, met this objective. The goal of the first phase of this study was to evaluate thermochemical processes which offer the potential for efficient, cost-effective, large-scale production of hydrogen and to select one for further detailed consideration. The authors selected the Sulfur-Iodine cycle. In the second phase, they reviewed all the basic reactor types for suitability to provide the high temperature heat needed by the selected thermochemical water-splitting cycle and chose the helium gas-cooled reactor. In the third phase they designed the chemical flowsheet for the thermochemical process and estimated the efficiency and cost of the process and the projected cost of producing hydrogen. These results are summarized in this paper

  2. Large-scale production of lentiviral vector in a closed system hollow fiber bioreactor

    Directory of Open Access Journals (Sweden)

    Jonathan Sheu

    Full Text Available Lentiviral vectors are widely used in the field of gene therapy as an effective method for permanent gene delivery. While current methods of producing small scale vector batches for research purposes depend largely on culture flasks, the emergence and popularity of lentiviral vectors in translational, preclinical and clinical research have demanded their production on a much larger scale, a task that can be difficult to manage with the number of producer cell culture flasks required for large volumes of vector. To generate a large scale, partially closed system method for the manufacturing of clinical grade lentiviral vector suitable for the generation of induced pluripotent stem cells (iPSCs), we developed a method employing a hollow fiber bioreactor traditionally used for cell expansion. We have demonstrated the growth, transfection, and vector-producing capability of 293T producer cells in this system. Vector particle RNA titers after subsequent vector concentration yielded values comparable to lentiviral iPSC induction vector batches produced using traditional culture methods in 225 cm² flasks (T225s) and in 10-layer cell factories (CF10s), while yielding a volume nearly 145 times larger than the yield from a T225 flask and nearly three times larger than the yield from a CF10. Employing a closed system hollow fiber bioreactor for vector production offers the possibility of manufacturing large quantities of gene therapy vector while minimizing reagent usage, equipment footprint, and open system manipulation.

  3. Environmental degradation, global food production, and risk for large-scale migrations

    International Nuclear Information System (INIS)

    Doeoes, B.R.

    1994-01-01

    This paper attempts to estimate to what extent global food production is affected by ongoing environmental degradation through processes such as soil erosion, salinization, chemical contamination, ultraviolet radiation, and biotic stress. Estimates have also been made of available opportunities to improve food production efficiency by, e.g., increased use of fertilizers, irrigation, and biotechnology, as well as improved management. Expected losses and gains of agricultural land in competition with urbanization, industrial development, and forests have been taken into account. Although estimated gains in food production have deliberately been overestimated and losses underestimated, calculations indicate that during the next 30-35 years the annual net gain in food production will be significantly lower than the rate of world population growth. An attempt has also been made to identify possible scenarios for large-scale migrations, caused mainly by rapid population growth in combination with insufficient local food production and poverty. 18 refs, 7 figs, 6 tabs

  4. An economical device for carbon supplement in large-scale micro-algae production.

    Science.gov (United States)

    Su, Zhenfeng; Kang, Ruijuan; Shi, Shaoyuan; Cong, Wei; Cai, Zhaoling

    2008-10-01

    A simple but efficient carbon-supplying device was designed and developed, and the corresponding carbon-supplying technology is described. The absorption characteristics of this device were studied. The carbon-supplying system proved economical for large-scale cultivation of Spirulina sp. in an outdoor raceway pond, and the gaseous carbon dioxide absorptivity was enhanced to above 78%, which could greatly reduce the production cost.

  5. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination, and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong

  6. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    Science.gov (United States)

    Ritsch, E.; Atlas Collaboration

    2014-06-01

The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during Run 1 relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently consumes the largest share of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large-scale Monte Carlo production in the ATLAS experiment for Run 2 and beyond. A number of fast detector simulation, digitization and reconstruction techniques are discussed, based upon a new flexible detector simulation framework. To benefit optimally from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  7. The Ecological Impacts of Large-Scale Agrofuel Monoculture Production Systems in the Americas

    Science.gov (United States)

    Altieri, Miguel A.

    2009-01-01

    This article examines the expansion of agrofuels in the Americas and the ecological impacts associated with the technologies used in the production of large-scale monocultures of corn and soybeans. In addition to deforestation and displacement of lands devoted to food crops due to expansion of agrofuels, the massive use of transgenic crops and…

  8. Large-scale enzymatic production of natural flavour esters in organic solvent with continuous water removal.

    Science.gov (United States)

    Gubicza, L; Kabiri-Badr, A; Keoves, E; Belafi-Bako, K

    2001-11-30

    A new, large-scale process was developed for the enzymatic production of low molecular weight flavour esters in organic solvent. Solutions for the elimination of substrate and product inhibitions are presented. The excess water produced during the process was continuously removed by hetero-azeotropic distillation and esters were produced at yields of over 90%.

  9. Microbial advanced biofuels production: overcoming emulsification challenges for large-scale operation.

    Science.gov (United States)

    Heeres, Arjan S; Picone, Carolina S F; van der Wielen, Luuk A M; Cunha, Rosiane L; Cuellar, Maria C

    2014-04-01

Isoprenoids and alkanes produced and secreted by microorganisms are emerging as alternative biofuels to replace diesel and jet fuel. As in other bioprocesses comprising an organic liquid phase, the presence of microorganisms, medium composition, and process conditions may result in emulsion formation during fermentation, hindering product recovery. At the same time, a low-cost production process overcoming this challenge is required to make these advanced biofuels a feasible alternative. We review the main mechanisms and causes of emulsion formation during fermentation, because a better understanding at the microscale can give insights into how to improve large-scale processes and the process technology options that can address these challenges. Copyright © 2014 Elsevier Ltd. All rights reserved.

  10. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  11. Concepts and Plans towards fast large scale Monte Carlo production for the ATLAS Experiment

    CERN Document Server

    Chapman, J; Duehrssen, M; Elsing, M; Froidevaux, D; Harrington, R; Jansky, R; Langenberg, R; Mandrysch, R; Marshall, Z; Ritsch, E; Salzburger, A

    2014-01-01

The huge success of the physics program of the ATLAS experiment at the Large Hadron Collider (LHC) during run I relies upon a great number of simulated Monte Carlo events. This Monte Carlo production currently consumes the largest share of the computing resources in use by ATLAS. In this document we describe the plans to overcome the computing resource limitations for large scale Monte Carlo production in the ATLAS Experiment for run II, and beyond. A number of fast detector simulation, digitization and reconstruction techniques are being discussed, based upon a new flexible detector simulation framework. To optimally benefit from these developments, a redesigned ATLAS MC production chain is presented at the end of this document.

  12. Polymerase-endonuclease amplification reaction (PEAR) for large-scale enzymatic production of antisense oligonucleotides.

    Directory of Open Access Journals (Sweden)

    Xiaolong Wang

    Full Text Available Antisense oligonucleotides targeting microRNAs or their mRNA targets prove to be powerful tools for molecular biology research and may eventually emerge as new therapeutic agents. Synthetic oligonucleotides are often contaminated with highly homologous failure sequences. Synthesis of a certain oligonucleotide is difficult to scale up because it requires expensive equipment, hazardous chemicals and a tedious purification process. Here we report a novel thermocyclic reaction, polymerase-endonuclease amplification reaction (PEAR), for the amplification of oligonucleotides. A target oligonucleotide and a tandem repeated antisense probe are subjected to repeated cycles of denaturing, annealing, elongation and cleaving, in which thermostable DNA polymerase elongation and strand slipping generate duplex tandem repeats, and thermostable endonuclease (PspGI) cleavage releases monomeric duplex oligonucleotides. Each round of PEAR achieves over 100-fold amplification. The product can be used in one more round of PEAR directly, and the process can be further repeated. In addition to avoiding hazardous materials and improving product purity, this reaction is easy to scale up and amenable to full automation. PEAR has the potential to be a useful tool for large-scale production of antisense oligonucleotide drugs.

  13. Large-scale production of diesel-like biofuels - process design as an inherent part of microorganism development.

    Science.gov (United States)

    Cuellar, Maria C; Heijnen, Joseph J; van der Wielen, Luuk A M

    2013-06-01

    Industrial biotechnology is playing an important role in the transition to a bio-based economy. Currently, however, industrial implementation is still modest, despite the advances made in microorganism development. Given that the fuels and commodity chemicals sectors are characterized by tight economic margins, we propose to address overall process design and efficiency at the start of bioprocess development. While current microorganism development is targeted at product formation and product yield, addressing process design at the start of bioprocess development means that microorganism selection can also be extended to other critical targets for process technology and process scale implementation, such as enhancing cell separation or increasing cell robustness at operating conditions that favor the overall process. In this paper we follow this approach for the microbial production of diesel-like biofuels. We review current microbial routes with both oleaginous and engineered microorganisms. For the routes leading to extracellular production, we identify the process conditions for large scale operation. The process conditions identified are finally translated to microorganism development targets. We show that microorganism development should be directed at anaerobic production, increasing robustness at extreme process conditions and tailoring cell surface properties. At the same time, novel process configurations integrating fermentation and product recovery, cell reuse and low-cost technologies for product separation are mandatory. This review provides a state-of-the-art summary of the latest challenges in large-scale production of diesel-like biofuels. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R and S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules on the basis of multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) program of the Dutch ministry of Economic Affairs and the Dutch ministry of Education, Culture and Sciences. The aim of the EET-project is to reduce the costs of a solar module by 50% by increasing the conversion efficiency as well as developing cheap processes for large-scale production.

  15. Revising the potential of large-scale Jatropha oil production in Tanzania: An economic land evaluation assessment

    International Nuclear Information System (INIS)

    Segerstedt, Anna; Bobert, Jans

    2013-01-01

    Following up the rather sobering results of the biofuels boom in Tanzania, we analyze the preconditions that would make large-scale oil production from the feedstock Jatropha curcas viable. We do this by employing an economic land evaluation approach; first, we estimate the physical land suitability and the necessary inputs to reach certain amounts of yields. Subsequently, we estimate costs and benefits for different input-output levels. Finally, to incorporate the increased awareness of sustainability in the export sector, we introduce also certification criteria. Using data from an experimental farm in Kilosa, we find that high yields are crucial for the economic feasibility and that they can only be obtained on good soils at high input rates. Costs of compliance with certification criteria depend on site specific characteristics such as land suitability and precipitation. In general, both domestic production and (certified) exports are too expensive to be able to compete with conventional diesel/rapeseed oil from the EU. Even though the crop may have potential for large scale production as a niche product, there is still a lot of risk involved and more experimental research is needed. - Highlights: ► We use an economic land evaluation analysis to reassess the potential of large-scale Jatropha oil. ► High yields are possible only at high input rates and for good soil qualities. ► Production costs are still too high to break even on the domestic and export market. ► More research is needed to stabilize yields and improve the oil content. ► Focus should be on broadening our knowledge-base rather than promoting new Jatropha investments

  16. Electrolytic production of light lanthanides from molten chloride alloys on a large laboratory scale

    International Nuclear Information System (INIS)

    Szklarski, W.; Bogacz, A.; Strzyzewska, M.

    1979-01-01

    Literature data relating to the electrolytic production of rare earth metals are presented. Conditions and results are given of the authors' own investigations into the electrolysis of light lanthanide chlorides (La-Nd) in molten potassium and sodium chlorides, conducted on a large laboratory scale using molybdenum, iron, cobalt and zinc cathodes. Design schemes of the electrolysers employed are enclosed. (author)

  17. The use of soil moisture - remote sensing products for large-scale groundwater modeling and assessment

    NARCIS (Netherlands)

    Sutanudjaja, E.H.

    2012-01-01

    In this thesis, the possibilities of using spaceborne remote sensing for large-scale groundwater modeling are explored. We focus on a soil moisture product called the European Remote Sensing Soil Water Index (ERS SWI, Wagner et al., 1999), which represents upper-profile soil moisture. As a test-bed, we

  18. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw materials powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, the large-scale 9Cr-ODS martensitic steel mother tube, which is made with a large-scale hollow capsule, and long length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacturing the large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) The long length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, the process of manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  19. Comparing centralised and decentralised anaerobic digestion of stillage from a large-scale bioethanol plant to animal feed production.

    Science.gov (United States)

    Drosg, B; Wirthensohn, T; Konrad, G; Hornbachner, D; Resch, C; Wäger, F; Loderer, C; Waltenberger, R; Kirchmayr, R; Braun, R

    2008-01-01

    A comparison of stillage treatment options for large-scale bioethanol plants was based on the data of an existing plant producing approximately 200,000 t/yr of bioethanol and 1,400,000 t/yr of stillage. Animal feed production, the state-of-the-art technology at the plant, was compared to anaerobic digestion. The latter was simulated in two different scenarios: digestion in small-scale biogas plants in the surrounding area versus digestion in a large-scale biogas plant at the bioethanol production site. Emphasis was placed on a holistic simulation balancing chemical parameters and calculating logistic algorithms to compare the efficiency of the stillage treatment solutions. For central anaerobic digestion, different digestate handling solutions were considered because of the large amount of digestate. For land application, a minimum of 36,000 ha of available agricultural area and 600,000 m³ of storage volume would be needed. Secondly, membrane purification of the digestate was investigated, consisting of a decanter, microfiltration, and reverse osmosis. As a third option, aerobic wastewater treatment of the digestate was discussed. The final outcome was an economic evaluation of the three mentioned stillage treatment options, as a guide to stillage management for operators of large-scale bioethanol plants. Copyright IWA Publishing 2008.

  20. Large-scale production of Fischer-Tropsch diesel from biomass. Optimal gasification and gas cleaning systems

    International Nuclear Information System (INIS)

    Boerrigter, H.; Van der Drift, A.

    2004-12-01

    The paper is presented in the form of copies of overhead sheets. The contents concern definitions, an overview of integrated biomass gasification and Fischer-Tropsch (FT) systems (state-of-the-art, gas cleaning and biosyngas production, experimental demonstration and conclusions), some aspects of large-scale systems (motivation, biomass import) and an outlook.

  1. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  2. An integrated model for assessing both crop productivity and agricultural water resources at a large scale

    Science.gov (United States)

    Okada, M.; Sakurai, G.; Iizumi, T.; Yokozawa, M.

    2012-12-01

    Agricultural production utilizes regional resources (e.g. river water and ground water) as well as local resources (e.g. temperature, rainfall, solar energy). Future climate changes and increasing demand due to population increases and economic developments would strongly affect the availability of water resources for agricultural production. While many studies have assessed the impacts of climate change on agriculture, few studies dynamically account for changes in both water resources and crop production. This study proposes an integrated model for assessing both crop productivity and agricultural water resources at a large scale. Moreover, irrigation management in response to subseasonal variability in weather and crop growth differs among regions and crops. To deal with such variations, we used the Markov Chain Monte Carlo technique to quantify region-specific parameters associated with crop growth and irrigation water estimations. We coupled a large-scale crop model (Sakurai et al. 2012) with a global water resources model, H08 (Hanasaki et al. 2008). The integrated model consists of five sub-models for the following processes: land surface, crop growth, river routing, reservoir operation, and anthropogenic water withdrawal. The land surface sub-model was based on a watershed hydrology model, SWAT (Neitsch et al. 2009). Surface and subsurface runoffs simulated by the land surface sub-model were input to the river routing sub-model of the H08 model. A part of the regional water resources available for agriculture, simulated by the H08 model, was input as irrigation water to the land surface sub-model. The timing and amount of irrigation water were simulated at a daily time step. The integrated model reproduced the observed streamflow in an individual watershed. Additionally, the model accurately reproduced the trends and interannual variations of crop yields.
To demonstrate the usefulness of the integrated model, we compared two types of impact assessment of
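The region-specific parameter estimation mentioned in the record above can be illustrated with a minimal random-walk Metropolis-Hastings sketch. The single crop-growth parameter, the linear yield model and the synthetic observations below are all invented for illustration; this is only the general MCMC technique the authors name, not their implementation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "observations": yield = a * seasonal temperature + noise,
# with true a = 0.8. Model form, data and noise level are assumptions.
temps = rng.uniform(15, 25, size=30)
obs = 0.8 * temps + rng.normal(0.0, 0.5, size=30)

def log_likelihood(a):
    resid = obs - a * temps
    return -0.5 * np.sum(resid ** 2) / 0.5 ** 2

# Random-walk Metropolis-Hastings over the single parameter a.
a = 0.5                      # deliberately poor starting guess
ll = log_likelihood(a)
samples = []
for _ in range(5000):
    prop = a + rng.normal(0.0, 0.05)
    ll_prop = log_likelihood(prop)
    if np.log(rng.uniform()) < ll_prop - ll:   # accept/reject step
        a, ll = prop, ll_prop
    samples.append(a)

posterior = np.array(samples[1000:])  # discard burn-in
print(round(float(posterior.mean()), 2))  # recovers a value near 0.8
```

The same accept/reject loop extends to several parameters at once, which is how region-specific calibration against observed yields would proceed in practice.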

  3. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
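The amplitude-modulation diagnostic described above, correlating a large-scale motion with the envelope of the small-scale fluctuations, can be sketched on a synthetic signal. The constructed waveform, the 5 Hz spectral cutoff and the Hilbert-envelope step are illustrative assumptions, not the authors' hot-wire processing:

```python
import numpy as np
from scipy.signal import hilbert

# Synthetic velocity signal: a slow large-scale wave whose amplitude
# modulates fast small-scale fluctuations (constructed data only).
t = np.linspace(0.0, 10.0, 5000)
large = np.sin(2 * np.pi * 0.5 * t)                       # 0.5 Hz motion
small = (1.0 + 0.5 * large) * np.sin(2 * np.pi * 40 * t)  # modulated 40 Hz
u = large + small

# Scale decomposition via a sharp spectral cutoff.
U = np.fft.rfft(u)
freqs = np.fft.rfftfreq(len(u), d=t[1] - t[0])
u_large = np.fft.irfft(np.where(freqs < 5.0, U, 0), n=len(u))
u_small = u - u_large

# Envelope of the small scales via the Hilbert transform; a positive
# correlation with the large-scale signal indicates amplitude modulation.
envelope = np.abs(hilbert(u_small))
r = float(np.corrcoef(u_large, envelope)[0, 1])
print(r > 0.5)  # True: the envelope tracks the large-scale motion
```

With real turbulence data the correlation is weaker than in this clean construction, but the sign and trend are what the decomposition is designed to expose.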

  4. Large-scale production and properties of human plasma-derived activated Factor VII concentrate.

    Science.gov (United States)

    Tomokiyo, K; Yano, H; Imamura, M; Nakano, Y; Nakagaki, T; Ogata, Y; Terano, T; Miyamoto, S; Funatsu, A

    2003-01-01

    An activated Factor VII (FVIIa) concentrate, prepared from human plasma on a large scale, has to date not been available for clinical use for haemophiliacs with antibodies against FVIII and FIX. In the present study, we attempted to establish a large-scale manufacturing process to obtain plasma-derived FVIIa concentrate with high recovery and safety, and to characterize its biochemical and biological properties. FVII was purified from human cryoprecipitate-poor plasma, by a combination of anion exchange and immunoaffinity chromatography, using Ca2+-dependent anti-FVII monoclonal antibody. To activate FVII, a FVII preparation that was nanofiltered using a Bemberg Microporous Membrane-15 nm was partially converted to FVIIa by autoactivation on an anion-exchange resin. The residual FVII in the FVII and FVIIa mixture was completely activated by further incubating the mixture in the presence of Ca2+ for 18 h at 10 degrees C, without any additional activators. For preparation of the FVIIa concentrate, after dialysis of FVIIa against 20 mM citrate, pH 6.9, containing 13 mM glycine and 240 mM NaCl, the FVIIa preparation was supplemented with 2.5% human albumin (which was first pasteurized at 60 degrees C for 10 h) and lyophilized in vials. To inactivate viruses contaminating the FVIIa concentrate, the lyophilized product was further heated at 65 degrees C for 96 h in a water bath. Total recovery of FVII from 15 000 l of plasma was approximately 40%, and the FVII preparation was fully converted to FVIIa with trace amounts of degraded products (FVIIaβ and FVIIaγ). The specific activity of the FVIIa was approximately 40 U/µg. Furthermore, virus-spiking tests demonstrated that immunoaffinity chromatography, nanofiltration and dry-heating effectively removed and inactivated the spiked viruses in the FVIIa. These results indicated that the FVIIa concentrate had both high specific activity and safety. 
We established a large-scale manufacturing process of human plasma

  5. Fission product release from nuclear fuel II. Validation of ASTEC/ELSA on analytical and large scale experiments

    International Nuclear Information System (INIS)

    Brillant, G.; Marchetto, C.; Plumecocq, W.

    2013-01-01

    Highlights: • A wide range of experiments is presented for the ASTEC/ELSA code validation. • Analytical tests such as AECL, ORNL and VERCORS are considered. • A large-scale experiment, PHEBUS FPT1, is considered. • The good agreement with measurements shows the efficiency of the ASTEC modelling. • Improvements concern the FP release modelling from MOX and high burn-up UO₂ fuels. - Abstract: This article is the second of two articles dedicated to the mechanisms of fission product release from a degraded core. The models of fission product release from nuclear fuel in the ASTEC code have been described in detail in the first part of this work (Brillant et al., this issue). In this contribution, the validation of ELSA, the module of ASTEC that deals with fission product and structural material release from a degraded core, is presented. A large range of experimental tests, with various temperatures and conditions of the atmosphere surrounding the fuel (oxidising and reducing), is thus simulated with the ASTEC code. The validation database includes several analytical experiments with both bare fuel (e.g. MCE1 experiments) and cladded fuel (e.g. HCE3, VERCORS). Furthermore, the PHEBUS large-scale experiments are used for the validation of ASTEC. The rather satisfactory comparison between ELSA calculations and experimental measurements demonstrates the efficiency of the analytical models to describe fission product release in severe accident conditions.

  6. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    Science.gov (United States)

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been successful to date. To investigate the elusive biochemistry and molecular mechanisms of olfaction, we have developed a mammalian expression system for the large-scale production and purification of a functional OR protein in milligram quantities. Here, we report the study of human OR17-4 (hOR17-4) purified from a HEK293S tetracycline-inducible system. Scale-up of production yield was achieved through suspension culture in a bioreactor, which enabled the preparation of >10 mg of monomeric hOR17-4 receptor after immunoaffinity and size exclusion chromatography, with expression yields reaching 3 mg/L of culture medium. Several key post-translational modifications were identified using MS, and CD spectroscopy showed the receptor to be ≈50% α-helix, similar to other recently determined G protein-coupled receptor structures. Detergent-solubilized hOR17-4 specifically bound its known activating odorants lilial and floralozone in vitro, as measured by surface plasmon resonance. The hOR17-4 also recognized specific odorants in heterologous cells as determined by calcium ion mobilization. Our system is feasible for the production of large quantities of OR necessary for structural and functional analyses and research into OR biosensor devices. PMID:19581598

  7. Large-scale Modeling of Nitrous Oxide Production: Issues of Representing Spatial Heterogeneity

    Science.gov (United States)

    Morris, C. K.; Knighton, J.

    2017-12-01

    Nitrous oxide is produced from the biological processes of nitrification and denitrification in terrestrial environments and contributes to the greenhouse effect that warms Earth's climate. Large-scale modeling can be used to determine how global rates of nitrous oxide production and consumption will shift under future climates. However, accurate modeling of nitrification and denitrification is made difficult by highly parameterized, nonlinear equations. Here we show that the representation of spatial heterogeneity in inputs, specifically soil moisture, causes inaccuracies in estimating the average nitrous oxide production in soils. We demonstrate that when soil moisture is averaged over a spatially heterogeneous surface, net nitrous oxide production is underpredicted. We apply this general result in a test of a widely used global land surface model, the Community Land Model v4.5. The challenges presented by nonlinear controls on nitrous oxide are highlighted here to provide a wider context to the problem of extraordinary denitrification losses in CLM. We hope that these findings will inform future researchers on the possibilities for model improvement of the global nitrogen cycle.
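The underprediction mechanism this record describes is an instance of Jensen's inequality: a convex response function evaluated at the mean soil moisture gives less than the mean of the responses over a heterogeneous field. A toy sketch, in which the exponential response curve is an illustrative assumption and not the CLM4.5 parameterization:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy convex response: N2O flux rising steeply with soil moisture.
# This exponential form is invented for illustration only.
def n2o_flux(moisture):
    return np.exp(4.0 * moisture)

# Spatially heterogeneous soil moisture within one coarse grid cell.
moisture_field = rng.uniform(0.1, 0.9, size=10_000)

flux_of_mean = n2o_flux(moisture_field.mean())   # coarse, averaged input
mean_of_flux = n2o_flux(moisture_field).mean()   # sub-grid-resolving

# Jensen's inequality: feeding the averaged input to a convex function
# underpredicts the averaged output, the bias described in the record.
print(flux_of_mean < mean_of_flux)  # True
```

The gap between the two estimates grows with the curvature of the response and the spread of the sub-grid moisture distribution, which is why heterogeneity representation matters at coarse resolution.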

  8. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of each precipitation product is usually validated in isolation. Comparisons between gridded precipitation products, together with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, results provide an assessment of possible applications for various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
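Verification at multiple temporal scales, as described in the record above, amounts to aggregating both series before scoring. A minimal sketch with a synthetic gauge series and a synthetic gridded product (all numbers invented; real verification would also span spatial scales and more criteria than bias and RMSE):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic daily series: "gauge" truth and a "gridded product" carrying
# random error plus a small wet bias (values invented for illustration).
days = 365
gauge = rng.gamma(0.5, 4.0, size=days)
product = gauge + rng.normal(0.3, 2.0, size=days)

def bias_rmse(obs, est, window):
    # Aggregate both series to a coarser temporal scale before scoring.
    n = len(obs) // window * window
    o = obs[:n].reshape(-1, window).mean(axis=1)
    e = est[:n].reshape(-1, window).mean(axis=1)
    return float(e.mean() - o.mean()), float(np.sqrt(np.mean((e - o) ** 2)))

scores = {w: bias_rmse(gauge, product, w) for w in (1, 7, 30)}
for w, (b, r) in scores.items():
    print(f"window={w:2d} days  bias={b:+.2f}  rmse={r:.2f}")
```

Random error averages out under temporal aggregation while systematic bias does not, which is why scoring at several windows separates the two error sources.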

  9. Constructing Model of Relationship among Behaviors and Injuries to Products Based on Large Scale Text Data on Injuries

    Science.gov (United States)

    Nomori, Koji; Kitamura, Koji; Motomura, Yoichi; Nishida, Yoshifumi; Yamanaka, Tatsuhiro; Komatsubara, Akinori

    In Japan, childhood injury prevention is an urgent issue. Safety measures based on knowledge created from injury data are essential for preventing childhood injuries. The injury prevention approach of product modification is especially important. Risk assessment is one of the most fundamental methods for designing safe products. Conventional risk assessment has been carried out subjectively because product makers have little data on injuries. This paper deals with evidence-based risk assessment, for which artificial intelligence technologies are strongly needed. This paper describes a new method of foreseeing the usage of products, which is the first step of evidence-based risk assessment, and presents a retrieval system of injury data. The system enables a product designer to foresee how children use a product and which types of injuries occur due to the product in everyday environments. The developed system consists of large-scale injury data, text mining technology and probabilistic modeling technology. Large-scale text data on childhood injuries were collected from medical institutions by an injury surveillance system. Types of behavior toward a product were derived from the injury text data using text mining technology. The relationship among products, types of behavior, types of injuries and characteristics of children was modeled with a Bayesian network. The fundamental functions of the developed system and examples of new findings obtained by the system are reported in this paper.
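The Bayesian network described above links products, behaviors and injuries; its basic query, inferring an injury distribution for a product by marginalizing over behavior, can be sketched with invented conditional probabilities. The product, the behaviors and all numbers below are hypothetical, not taken from the surveillance data:

```python
# P(behavior | product = "stroller"), hypothetical numbers.
p_behavior = {"climbing": 0.3, "riding": 0.7}

# P(injury | behavior), hypothetical numbers.
p_injury = {
    "climbing": {"fall": 0.8, "pinch": 0.2},
    "riding": {"fall": 0.4, "pinch": 0.6},
}

def injury_distribution():
    """Marginalize over behavior: P(injury) = sum_b P(injury | b) * P(b)."""
    dist = {}
    for b, pb in p_behavior.items():
        for injury, pi in p_injury[b].items():
            dist[injury] = dist.get(injury, 0.0) + pb * pi
    return {k: round(v, 2) for k, v in dist.items()}

print(injury_distribution())  # {'fall': 0.52, 'pinch': 0.48}
```

In the full system the conditional tables are learned from the injury text corpus and conditioned further on child characteristics, but the query shape stays this simple sum-product.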

  10. Pilot study of large-scale production of mutant pigs by ENU mutagenesis.

    Science.gov (United States)

    Hai, Tang; Cao, Chunwei; Shang, Haitao; Guo, Weiwei; Mu, Yanshuang; Yang, Shulin; Zhang, Ying; Zheng, Qiantao; Zhang, Tao; Wang, Xianlong; Liu, Yu; Kong, Qingran; Li, Kui; Wang, Dayu; Qi, Meng; Hong, Qianlong; Zhang, Rui; Wang, Xiupeng; Jia, Qitao; Wang, Xiao; Qin, Guosong; Li, Yongshun; Luo, Ailing; Jin, Weiwu; Yao, Jing; Huang, Jiaojiao; Zhang, Hongyong; Li, Menghua; Xie, Xiangmo; Zheng, Xuejuan; Guo, Kenan; Wang, Qinghua; Zhang, Shibin; Li, Liang; Xie, Fei; Zhang, Yu; Weng, Xiaogang; Yin, Zhi; Hu, Kui; Cong, Yimei; Zheng, Peng; Zou, Hailong; Xin, Leilei; Xia, Jihan; Ruan, Jinxue; Li, Hegang; Zhao, Weiming; Yuan, Jing; Liu, Zizhan; Gu, Weiwang; Li, Ming; Wang, Yong; Wang, Hongmei; Yang, Shiming; Liu, Zhonghua; Wei, Hong; Zhao, Jianguo; Zhou, Qi; Meng, Anming

    2017-06-22

    N-ethyl-N-nitrosourea (ENU) mutagenesis is a powerful tool for efficiently generating mutants on a large scale and for discovering genes with novel functions at the whole-genome level in Caenorhabditis elegans, flies, zebrafish and mice, but it has never been tried in large model animals. We describe a successful systematic three-generation ENU mutagenesis screen in pigs, with the establishment of the Chinese Swine Mutagenesis Consortium. A total of 6,770 G1 and 6,800 G3 pigs were screened, and 36 dominant and 91 recessive novel pig families with various phenotypes were established. The causative mutations in 10 mutant families were further mapped. As examples, the SOX10 (R109W) mutation in pig causes inner ear malfunctions and mimics human Mondini dysplasia, and upregulated expression of FBXO32 is associated with congenital splay legs. This study demonstrates the feasibility of artificial random mutagenesis in pigs and opens an avenue for generating a reservoir of mutants for agricultural production and biomedical research.

  11. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  12. Performance of mushroom fruiting for large scale commercial production

    International Nuclear Information System (INIS)

    Mat Rosol Awang; Rosnani Abdul Rashid; Hassan Hamdani Mutaat; Mohd Meswan Maskom

    2012-01-01

    The paper describes the determination of mushroom fruiting yield, which is vital to the economics of mushroom production. Consistent mushroom yields enable revenues to be estimated and hence profitability to be predicted. Many growers have reported large variations in mushroom yield across different production runs. To assess such claims we ran four batches of mushroom fruiting; the fruiting-body production performance of each batch is presented. (author)
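
    The yield-consistency question above is commonly quantified with the coefficient of variation across batches, which then feeds a revenue projection. A sketch with hypothetical batch yields and an assumed farm-gate price (none of the figures come from the record):

```python
import statistics

# Hypothetical fresh-weight yields (kg) from four fruiting batches
batch_yields = [21.5, 24.0, 19.8, 22.7]

mean_yield = statistics.mean(batch_yields)
# Coefficient of variation (%): batch-to-batch consistency measure
cv = statistics.stdev(batch_yields) / mean_yield * 100

# Simple revenue projection at an assumed farm-gate price
price_per_kg = 5.0
projected_revenue = mean_yield * price_per_kg
print(f"mean {mean_yield:.1f} kg, CV {cv:.1f}%, revenue {projected_revenue:.2f}")
```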

  13. Large-Scale Production of Fuel and Feed from Marine Microalgae

    Energy Technology Data Exchange (ETDEWEB)

    Huntley, Mark [Cornell Univ., Ithaca, NY (United States)

    2015-09-30

    In summary, this Consortium has demonstrated a fully integrated process for the production of biofuels and high-value nutritional bioproducts at pre-commercial scale. We have achieved unprecedented yields of algal oil, and converted the oil to viable fuels. We have demonstrated the potential value of the residual product as a viable feed ingredient for many important animals in the global food supply.

  14. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters with scales of 1000s of processors, used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine which existing tools can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  15. Low-Cost and Scaled-Up Production of Fluorine-Free, Substrate-Independent, Large-Area Superhydrophobic Coatings Based on Hydroxyapatite Nanowire Bundles.

    Science.gov (United States)

    Chen, Fei-Fei; Yang, Zi-Yue; Zhu, Ying-Jie; Xiong, Zhi-Chao; Dong, Li-Ying; Lu, Bing-Qiang; Wu, Jin; Yang, Ri-Long

    2018-01-09

    To date, the scaled-up production and large-area applications of superhydrophobic coatings are limited because of complicated procedures, environmentally harmful fluorinated compounds, restrictive substrates, expensive equipment, and raw materials usually involved in the fabrication process. Herein, the facile, low-cost, and green production of superhydrophobic coatings based on hydroxyapatite nanowire bundles (HNBs) is reported. Hydrophobic HNBs are synthesised by using a one-step solvothermal method with oleic acid as the structure-directing and hydrophobic agent. During the reaction process, highly hydrophobic C-H groups of oleic acid molecules can be attached in situ to the surface of HNBs through the chelate interaction between Ca2+ ions and carboxylic groups. This facile synthetic method allows the scaled-up production of HNBs up to about 8 L, which is the largest production scale of superhydrophobic paint based on HNBs ever reported. In addition, the design of the 100 L reaction system is also shown. The HNBs can be coated on any substrate with an arbitrary shape by the spray-coating technique. The self-cleaning ability in air and oil, high-temperature stability, and excellent mechanical durability of the as-prepared superhydrophobic coatings are demonstrated. More importantly, the HNBs are coated on large-sized practical objects to form large-area superhydrophobic coatings. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Technical data summary: Uranium(IV) production using a large scale electrochemical cell

    International Nuclear Information System (INIS)

    Hsu, T.C.

    1984-05-01

    This Technical Data Summary outlines an electrochemical process to produce U(IV), in the form of uranous nitrate, from U(VI), as uranyl nitrate. U(IV) with hydrazine could then be used as an alternative plutonium reductant to substantially reduce the waste volume from the Purex solvent extraction process. This TDS is divided into three parts. The first part (Chapters I to IV) generally describes the electrochemical production of U(IV). The second part (Chapters V to VII) describes a pilot-scale U(IV) production facility that was constructed and operated at an engineering semiworks area of SRP, referred to as TNX. The last part (Chapter VIII) describes a preliminary design for a full-scale facility that would meet the projected need for U(IV) as a reductant in SRP's separations processes. The preliminary design was described in a Basic Data Summary for the U(IV) production facility, and a Venture Guidance Appraisal (VGA) was prepared from the Basic Data Summary. The VGA for the U(IV) process showed that because of the large capital investment required, this approach to waste reduction was not economically competitive with another alternative that required only modifying the ongoing Purex process at no additional capital cost. However, implementing the U(IV) process as part of an overall canyon renovation, presently scheduled for the 1990s, may be economically attractive. The purpose of this TDS is therefore to bring together the information and experience obtained thus far in the U(IV) program so that a useful body of information will be available to support any future development of this process

  17. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.
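
    The energy-efficiency goal described above is often expressed as an exergy (second-law) efficiency relative to the thermodynamic minimum liquefaction work for hydrogen, commonly cited as roughly 3.9 kWh per kg of LH2. A sketch in which both the plant figures and the targets are assumptions, not values from the record:

```python
IDEAL_WORK = 3.9  # kWh per kg LH2, commonly cited thermodynamic minimum

def exergy_efficiency(specific_energy_kwh_per_kg):
    """Exergy (second-law) efficiency of a liquefier relative to the
    ideal minimum liquefaction work."""
    return IDEAL_WORK / specific_energy_kwh_per_kg

# Assumed figures: ~12 kWh/kg for today's small plants vs. a ~6 kWh/kg
# target for large-scale concepts
for sec in (12.0, 6.0):
    print(f"{sec:>5.1f} kWh/kg -> {exergy_efficiency(sec):.1%}")
```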

  18. Operational experience with large-scale biogas production at the Promest manure processing plant in Helmond, the Netherlands

    International Nuclear Information System (INIS)

    Schomaker, A.H.H.M.

    1992-01-01

    In The Netherlands a surplus of 15 million tons of liquid pig manure is produced yearly on intensive pig-breeding farms. The Dutch government has set a three-way policy to reduce this manure excess: 1. conversion of animal fodder into a product with fewer and better-digestible nutrients; 2. distribution of the surplus to regions with a shortage of animal manure; 3. processing of the remainder of the surplus in large-scale processing plants. The first large-scale plant for processing liquid pig manure was put into operation in 1988 as a demonstration plant at Promest in Helmond. The design capacity of this plant is 100,000 tons of pig manure per year. The plant was initiated by the Manure Steering Committee of the province of Noord-Brabant in order to prove at short notice whether large-scale manure processing might contribute to solving the manure surplus problem in The Netherlands. This steering committee is a partnership of the national and provincial governments and the agricultural industry. (au)

  19. LARGE SCALE METHOD FOR THE PRODUCTION AND PURIFICATION OF CURIUM

    Science.gov (United States)

    Higgins, G.H.; Crane, W.W.T.

    1959-05-19

    A large-scale process for production and purification of Cm-242 is described. Aluminum slugs containing Am are irradiated and declad in a NaOH--NaNO3 solution at 85 to 100 deg C. The resulting slurry is filtered and washed with NaOH, NH4OH, and H2O. Recovery of Cm from the filtrate and washings is effected by an Fe(OH)3 precipitation. The precipitates are then combined and dissolved in HCl, and refractory oxides are centrifuged out. These oxides are then fused with Na2CO3 and dissolved in HCl. The solution is evaporated and LiCl solution added. The Cm, rare earths, and anionic impurities are adsorbed on a strong-base anion exchange resin. Impurities are eluted with LiCl--HCl solution; rare earths and Cm are eluted with HCl. Other ion exchange steps further purify the Cm. The Cm is then precipitated as the fluoride and used in this form or further purified and processed. (T.R.H.)

  20. Facile Large-scale synthesis of stable CuO nanoparticles

    Science.gov (United States)

    Nazari, P.; Abdollahi-Nejand, B.; Eskandari, M.; Kohnehpoushi, S.

    2018-04-01

    In this work, a novel approach to synthesizing CuO nanoparticles is introduced. Sequential corrosion and detaching is proposed for the growth and dispersion of CuO nanoparticles at the optimum pH value of eight. The produced CuO nanoparticles were six nm (±2 nm) in diameter and spherical in shape, with high crystallinity and uniformity in size. With this method, large-scale production of CuO nanoparticles (120 grams in an experimental batch) from Cu microparticles was achieved, which may meet the market criteria for large-scale production of CuO nanoparticles.

  1. Large-Scale Selection and Breeding To Generate Industrial Yeasts with Superior Aroma Production

    Science.gov (United States)

    Steensels, Jan; Meersman, Esther; Snoek, Tim; Saels, Veerle

    2014-01-01

    The concentrations and relative ratios of various aroma compounds produced by fermenting yeast cells are essential for the sensory quality of many fermented foods, including beer, bread, wine, and sake. Since the production of these aroma-active compounds varies highly among different yeast strains, careful selection of variants with optimal aromatic profiles is of crucial importance for a high-quality end product. This study evaluates the production of different aroma-active compounds in 301 different Saccharomyces cerevisiae, Saccharomyces paradoxus, and Saccharomyces pastorianus yeast strains. Our results show that the production of key aroma compounds like isoamyl acetate and ethyl acetate varies by an order of magnitude between natural yeasts, with the concentrations of some compounds showing significant positive correlation, whereas others vary independently. Targeted hybridization of some of the best aroma-producing strains yielded 46 intraspecific hybrids, of which some show a distinct heterosis (hybrid vigor) effect and produce up to 45% more isoamyl acetate than the best parental strains while retaining their overall fermentation performance. Together, our results demonstrate the potential of large-scale outbreeding to obtain superior industrial yeasts that are directly applicable for commercial use. PMID:25192996
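
    The heterosis (hybrid vigor) effect reported above has a simple quantitative definition: the hybrid's gain over its best parent. A sketch with hypothetical isoamyl acetate titres chosen to reproduce a 45% gain; the numbers are illustrative, not measurements from the study:

```python
def heterosis(hybrid_value, parent_values):
    """Best-parent heterosis (%): how much a hybrid exceeds its best parent."""
    best_parent = max(parent_values)
    return (hybrid_value - best_parent) / best_parent * 100

# Hypothetical isoamyl acetate titres (mg/L) for two parents and a hybrid
parents = [2.0, 2.4]
hybrid = 3.48
print(f"{heterosis(hybrid, parents):.0f}% above the best parent")
```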

  2. Optimization of Large-Scale Culture Conditions for the Production of Cordycepin with Cordyceps militaris by Liquid Static Culture

    Directory of Open Access Journals (Sweden)

    Chao Kang

    2014-01-01

    Cordycepin is one of the most important bioactive compounds produced by species of Cordyceps sensu lato, but it is difficult to produce large amounts of this substance industrially. In this work, single-factor design, Plackett-Burman design, and central composite design were employed to establish the key factors and identify the optimal culture conditions for improving cordycepin production. Using these culture conditions, a maximum cordycepin production of 2008.48 mg/L was achieved for a 700 mL working volume in 1000 mL glass jars, and the total cordycepin content reached 1405.94 mg/bottle. This method provides an effective way to increase cordycepin production at large scale. The strategies used in this study could find wide application in other fermentation processes.
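
    The screening designs named above (Plackett-Burman and factorial designs) estimate each factor's main effect as the difference between the mean response at its high level and at its low level. A sketch using a two-level full factorial with three hypothetical factors and invented titres (the real study's factors and values are not reproduced here):

```python
from itertools import product

# Coded two-level full factorial over three hypothetical factors
levels = {"carbon": (-1, 1), "nitrogen": (-1, 1), "days": (-1, 1)}
runs = list(product(*levels.values()))

# Hypothetical cordycepin titres (mg/L) for the 8 coded runs
response = [820, 990, 870, 1180, 900, 1110, 980, 1350]

def main_effect(factor_index):
    """Mean response at the high level minus mean response at the low level."""
    high = [y for run, y in zip(runs, response) if run[factor_index] == 1]
    low = [y for run, y in zip(runs, response) if run[factor_index] == -1]
    return sum(high) / len(high) - sum(low) / len(low)

for i, name in enumerate(levels):
    print(name, main_effect(i))
```

    Plackett-Burman designs apply the same estimator with far fewer runs by confounding interactions, which is why they are used for initial screening.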

  3. Higher-Twist Dynamics in Large Transverse Momentum Hadron Production

    International Nuclear Information System (INIS)

    Francois, Alero

    2009-01-01

    A scaling law analysis of the world data on inclusive large-p⊥ hadron production in hadronic collisions is carried out. A significant deviation from leading-twist perturbative QCD predictions at next-to-leading order is reported. The observed discrepancy is largest at high values of x⊥ = 2p⊥/√s. In contrast, the production of prompt photons and jets exhibits a scaling behavior close to the conformal limit, in agreement with the leading-twist expectation. These results bring evidence for a non-negligible contribution of higher-twist processes to large-p⊥ hadron production in hadronic collisions, in which the hadron is produced directly in the hard subprocess rather than by gluon or quark jet fragmentation. Predictions for the scaling exponents at RHIC and LHC are given, and it is suggested to trigger on isolated large-p⊥ hadron production to enhance higher-twist processes.
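
    The scaling-law analysis above extracts the exponent n in E d³σ/d³p = F(x⊥)/(√s)^n by comparing invariant cross sections measured at the same x⊥ but two different collision energies; n = 4 corresponds to the conformal (leading-twist) limit, and larger values signal higher-twist contributions. A sketch with invented cross-section values, not data from the analysis:

```python
import math

def scaling_exponent(sigma1, sqrt_s1, sigma2, sqrt_s2):
    """Exponent n in the scaling law
        E d^3sigma/d^3p = F(x_T) / (sqrt(s))^n
    extracted from cross sections measured at fixed x_T = 2*pT/sqrt(s)
    at two collision energies."""
    return -math.log(sigma1 / sigma2) / math.log(sqrt_s1 / sqrt_s2)

# Hypothetical invariant cross sections at the same x_T
n = scaling_exponent(sigma1=1.0e-3, sqrt_s1=200.0, sigma2=1.6e-2, sqrt_s2=100.0)
print(round(n, 2))
```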

  4. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement (''the Task'') and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  5. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-digital converter recording to a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.
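
    The safety evaluation described above compares measured peak particle velocities against a permissible value. A sketch in which both the threshold and the measurements are assumed for illustration (real limits depend on the building class and national standards, and the record does not give them):

```python
def seismically_safe(peak_velocity_cm_s, permissible_cm_s=3.0):
    """Compare a measured peak particle velocity against a permissible
    value. The 3 cm/s default is an assumed threshold, not a value
    from the study."""
    return peak_velocity_cm_s <= permissible_cm_s

# Hypothetical peak particle velocities (cm/s) recorded at a settlement
measurements = [0.8, 1.5, 3.4, 0.6]
violations = [v for v in measurements if not seismically_safe(v)]
print(violations)
```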

  6. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
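
    The design-matrix-free idea mentioned above rests on a Kronecker-product identity: (X2 ⊗ X1) vec(B) = vec(X1 B X2ᵀ), so the full tensor-product design matrix never has to be formed or stored. A minimal pure-Python sketch with tiny invented matrices:

```python
def matmul(A, B):
    """Plain nested-list matrix product."""
    return [[sum(a * b for a, b in zip(row, col)) for col in zip(*B)]
            for row in A]

def transpose(A):
    return [list(r) for r in zip(*A)]

def glam_multiply(X1, X2, B):
    """Apply (X2 kron X1) to vec(B) without forming the Kronecker
    product, via the identity (X2 kron X1) vec(B) = vec(X1 B X2^T)
    that design-matrix-free GLAM algorithms exploit."""
    return matmul(matmul(X1, B), transpose(X2))

X1 = [[1, 2], [3, 4]]
X2 = [[0, 1], [1, 1]]
B = [[1, 0], [0, 1]]
print(glam_multiply(X1, X2, B))
```

    For an n x n grid in d dimensions the full design matrix has n^(2d) entries, while this factored form only ever touches the d marginal matrices, which is the memory saving the abstract alludes to.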

  7. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  8. Production of baryons with large transverse momentum

    International Nuclear Information System (INIS)

    Landshoff, P.V.; Polkinghorne, J.C.; Scott, D.M.

    1975-01-01

    The multiple scattering of constituent quarks provides a natural mechanism for fairly copious production of large-transverse-momentum baryons in nucleon--nucleon collisions. The predicted scaling law agrees well with available data, and the mechanism provides a qualitative explanation of nuclear-target effects. In comparison with previous parton models, correlations are predicted to be qualitatively different, and large-p/sub T/ baryon production by meson beams is relatively suppressed

  9. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  10. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generated by kerosene pool fires, probably because the fire module of the code is a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)
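
    The "within 10%" code-versus-experiment comparison above is a relative-error check. A sketch with hypothetical prediction/measurement pairs (not values from the FIRAC validation):

```python
def within_tolerance(predicted, measured, tol=0.10):
    """Check a code prediction against an experimental value:
    relative error no larger than tol (10% by default)."""
    return abs(predicted - measured) / abs(measured) <= tol

# Hypothetical duct outlet temperatures (K): prediction vs. measurement
pairs = [(410.0, 400.0), (395.0, 400.0), (460.0, 400.0)]
print([within_tolerance(p, m) for p, m in pairs])
```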

  11. Luminescence property and large-scale production of ZnO nanowires by current heating deposition

    International Nuclear Information System (INIS)

    Singjai, P.; Jintakosol, T.; Singkarat, S.; Choopun, S.

    2007-01-01

    Large-scale production of ZnO nanowires has been demonstrated by current heating deposition. Based on a solid-vapor phase carbothermal sublimation technique, a ZnO-graphite mixed rod was placed between two copper bars and gradually heated by passing current through it under a constant flow of argon gas at atmospheric pressure. The product, seen as white films deposited on the rod surface, was separated for further characterization. The results show mainly comb-like structures of ZnO nanowires with diameters ranging from 50 to 200 nm and lengths up to several tens of micrometers. In optical testing, ionoluminescence spectra of as-grown and annealed samples showed high green emission intensities centered at 510 nm. In contrast, the small UV peak centered at 390 nm was observed clearly in the as-grown sample but almost disappeared after the annealing treatment

  12. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024^3 grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  13. Toyota production system beyond large-scale production

    CERN Document Server

    Ohno, Taiichi

    1998-01-01

    In this classic text, Taiichi Ohno--inventor of the Toyota Production System and Lean manufacturing--shares the genius that sets him apart as one of the most disciplined and creative thinkers of our time. Combining his candid insights with a rigorous analysis of Toyota's attempts at Lean production, Ohno's book explains how Lean principles can improve any production endeavor. A historical and philosophical description of just-in-time and Lean manufacturing, this work is a must read for all students of human progress. On a more practical level, it continues to provide inspiration and instruction for those seeking to improve efficiency through the elimination of waste.

  14. Logistics of large scale commercial IVF embryo production.

    Science.gov (United States)

    Blondin, P

    2016-01-01

    The use of IVF in agriculture is growing worldwide. This can be explained by the development of better IVF media and techniques, development of sexed semen and the recent introduction of bovine genomics on farms. Being able to perform IVF on a large scale, with multiple on-farm experts to perform ovum pick-up and IVF laboratories capable of handling large volumes in a consistent and sustainable way, remains a huge challenge. To be successful, there has to be a partnership between veterinarians on farms, embryologists in the laboratory and animal owners. Farmers must understand the limits of what IVF can or cannot do under different conditions; veterinarians must manage expectations of farmers once strategies have been developed regarding potential donors; and embryologists must maintain fluent communication with both groups to make sure that objectives are met within predetermined budgets. The logistics of such operations can be very overwhelming, but the return can be considerable if done right. The present mini review describes how such operations can become a reality, with an emphasis on the different aspects that must be considered by all parties.

  15. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of design problems, ones that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full-scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp

  16. Large-scale production of graphitic carbon nitride with outstanding nitrogen photofixation ability via a convenient microwave treatment

    International Nuclear Information System (INIS)

    Ma, Huiqiang; Shi, Zhenyu; Li, Shuang; Liu, Na

    2016-01-01

    Highlights: • A microwave method for synthesizing g-C3N4 with N2 photofixation ability is reported. • Nitrogen vacancies play an important role in the nitrogen photofixation ability. • The present process is a convenient method for large-scale production of g-C3N4. - Abstract: A convenient microwave treatment for synthesizing graphitic carbon nitride (g-C3N4) with outstanding nitrogen photofixation ability under visible light is reported. X-ray diffraction (XRD), N2 adsorption, UV-vis spectroscopy, SEM, N2-TPD, EPR, photoluminescence (PL) and photocurrent measurements were used to characterize the prepared catalysts. The results indicate that microwave treatment forms many irregular pores in the as-prepared g-C3N4, which increases the surface area and the separation rate of electrons and holes. More importantly, microwave treatment causes the formation of many nitrogen vacancies in the as-prepared g-C3N4. These nitrogen vacancies not only serve as active sites to adsorb and activate N2 molecules but also promote interfacial charge transfer from the catalyst to N2 molecules, thus significantly improving the nitrogen photofixation ability. Moreover, the present process is a convenient method for the large-scale production of g-C3N4, which is significantly important for practical applications.

  17. All-solid-state lithium-ion and lithium metal batteries - paving the way to large-scale production

    Science.gov (United States)

    Schnell, Joscha; Günther, Till; Knoche, Thomas; Vieider, Christoph; Köhler, Larissa; Just, Alexander; Keller, Marlou; Passerini, Stefano; Reinhart, Gunther

    2018-04-01

    Challenges and requirements for the large-scale production of all-solid-state lithium-ion and lithium metal batteries are herein evaluated via workshops with experts from renowned research institutes, material suppliers, and automotive manufacturers. Aiming to bridge the gap between materials research and industrial mass production, possible solutions for the production chains of sulfide and oxide based all-solid-state batteries from electrode fabrication to cell assembly and quality control are presented. Based on these findings, a detailed comparison of the production processes for a sulfide based all-solid-state battery with conventional lithium-ion cell production is given, showing that processes for composite electrode fabrication can be adapted with some effort, while the fabrication of the solid electrolyte separator layer and the integration of a lithium metal anode will require completely new processes. This work identifies the major steps towards mass production of all-solid-state batteries, giving insight into promising manufacturing technologies and helping stakeholders, such as machine engineering, cell producers, and original equipment manufacturers, to plan the next steps towards safer batteries with increased storage capacity.

  18. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  19. Large-scale production and study of a synthetic G protein-coupled receptor: Human olfactory receptor 17-4

    OpenAIRE

    Cook, Brian L.; Steuerwald, Dirk; Kaiser, Liselotte; Graveland-Bikker, Johanna; Vanberghem, Melanie; Berke, Allison P.; Herlihy, Kara; Pick, Horst; Vogel, Horst; Zhang, Shuguang

    2009-01-01

    Although understanding of the olfactory system has progressed at the level of downstream receptor signaling and the wiring of olfactory neurons, the system remains poorly understood at the molecular level of the receptors and their interaction with and recognition of odorant ligands. The structure and functional mechanisms of these receptors still remain a tantalizing enigma, because numerous previous attempts at the large-scale production of functional olfactory receptors (ORs) have not been...

  20. Production of recombinant antigens and antibodies in Nicotiana benthamiana using 'magnifection' technology: GMP-compliant facilities for small- and large-scale manufacturing.

    Science.gov (United States)

    Klimyuk, Victor; Pogue, Gregory; Herz, Stefan; Butler, John; Haydon, Hugh

    2014-01-01

    This review describes the adaptation of the plant virus-based transient expression system magnICON(®) for the at-scale manufacturing of pharmaceutical proteins. The system utilizes so-called "deconstructed" viral vectors that rely on Agrobacterium-mediated systemic delivery into the plant cells for recombinant protein production. The system is also suitable for production of hetero-oligomeric proteins like immunoglobulins. By taking advantage of well-established R&D tools for optimizing the expression of the protein of interest using this system, product concepts can reach the manufacturing stage in highly competitive time periods. At the manufacturing stage, the system offers many remarkable features, including rapid production cycles, high product yield, virtually unlimited scale-up potential, and flexibility for different manufacturing schemes. The magnICON system has been successfully adapted to very different logistical manufacturing formats: (1) speedy production of multiple small batches of individualized pharmaceutical proteins (e.g. antigens comprising individualized vaccines to treat non-Hodgkin's lymphoma patients) and (2) large-scale production of other pharmaceutical proteins such as therapeutic antibodies. General descriptions of the prototype GMP-compliant manufacturing processes and facilities for the product formats that are in preclinical and clinical testing are provided.

  1. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series

  2. Large-scale distribution of tritium in a commercial product

    International Nuclear Information System (INIS)

    Combs, F.; Doda, R.J.

    1979-01-01

    Tritium enters the environment from various sources, including nuclear reactor operations, weapons testing, natural production, and the manufacture, use and ultimate disposal of commercial products containing tritium. A recent commercial application of tritium in the United States of America involves the backlighting of liquid crystal displays (LCD) in digital electronic watches. These watches are distributed through normal commercial channels to the general public. One million curies (1 MCi) of tritium were distributed in 1977 in this product. This is a significant quantity of tritium compared with power reactor-produced tritium (3 MCi yearly) or with naturally produced tritium (6 MCi yearly), and it is the single largest commercial application involving tritium to date. The final disposition of tritium from large quantities of this product, after its useful life, must be estimated by considering the means of disposal and the possibility of dispersal of tritium concurrent with disposal. The most likely method of final disposition of this product will be disposal in solid refuse; this includes burial in landfills and incineration. Burial in landfills will probably contain the tritium for its effective lifetime, whereas incineration will release all the tritium gas (as the oxide) to the atmosphere. The use and disposal of this product will be studied as part of an environmental study that is at present being prepared for the U.S. Nuclear Regulatory Commission. (author)
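To put the curie figures above in perspective, the mass corresponding to 1 MCi of tritium can be derived from its half-life; a quick sketch (the half-life and physical constants below are standard values, not taken from the abstract):

```python
import math

# What 1 MCi of tritium means in grams, derived from the half-life.
T_HALF_YR = 12.32         # tritium half-life, years
SEC_PER_YR = 3.156e7
AVOGADRO = 6.022e23
MOLAR_MASS = 3.016        # g/mol for tritium
CI_TO_BQ = 3.7e10         # 1 curie in becquerels

lam = math.log(2) / (T_HALF_YR * SEC_PER_YR)               # decay constant, 1/s
specific_activity = lam * AVOGADRO / MOLAR_MASS / CI_TO_BQ  # Ci per gram

grams_per_mci = 1e6 / specific_activity
print(f"specific activity: {specific_activity:,.0f} Ci/g")
print(f"1 MCi of tritium is roughly {grams_per_mci:.0f} g")
```

So the 1977 distribution of 1 MCi corresponds to only on the order of a hundred grams of tritium spread across all those watches.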

  3. Towards a Quantitative Use of Satellite Remote Sensing in Crop Growth Models for Large Scale Agricultural Production Estimate (Invited)

    Science.gov (United States)

    Defourny, P.

    2013-12-01

    The development of better agricultural monitoring capabilities is clearly considered a critical step for strengthening food production information and market transparency, thanks to timely information about crop status, crop area and yield forecasts. The documentation of global production will contribute to tackling price volatility by allowing local, national and international operators to make decisions and anticipate market trends with reduced uncertainty. Several operational agricultural monitoring systems currently operate at national and international scales. Most are based on methods derived from pioneering experiences completed some decades ago, and use remote sensing to qualitatively compare one year with others to estimate the risk of deviation from a normal year. The GEO Agricultural Monitoring Community of Practice described the current monitoring capabilities at the national and global levels, and an overall diagram summarized the diverse relationships between satellite EO and agricultural information. There is now a large gap between the current operational large-scale systems and the scientific state of the art in crop remote sensing, probably because the latter has mainly focused on local studies. The poor availability of suitable in-situ and satellite data over extended areas hampers large-scale demonstrations, preventing the much-needed upscaling research effort. For the cropland extent, this paper reports a recent research achievement using the full ENVISAT MERIS 300 m archive in the context of the ESA Climate Change Initiative. A flexible combination of classification methods, depending on the region of the world, allows mapping the land cover as well as the global croplands at 300 m for the period 2008-2012. This wall-to-wall product is then compared with the FP7 Geoland-2 results obtained using a Landsat-based sampling strategy over the IGADD countries. On the other hand, the vegetation indices and the biophysical variables

  4. Scale-up and large-scale production of Tetraselmis sp. CTP4 (Chlorophyta) for CO2 mitigation: from an agar plate to 100-m3 industrial photobioreactors.

    Science.gov (United States)

    Pereira, Hugo; Páramo, Jaime; Silva, Joana; Marques, Ana; Barros, Ana; Maurício, Dinis; Santos, Tamára; Schulze, Peter; Barros, Raúl; Gouveia, Luísa; Barreira, Luísa; Varela, João

    2018-03-23

    Industrial production of novel microalgal isolates is key to improving the current portfolio of available strains that are able to grow in large-scale production systems for different biotechnological applications, including carbon mitigation. In this context, Tetraselmis sp. CTP4 was successfully scaled up from an agar plate to 35- and 100-m³ industrial-scale tubular photobioreactors (PBR). Growth was performed semi-continuously for 60 days in the autumn-winter season (17 October - 14 December). Optimisation of tubular PBR operations showed that improved productivities were obtained at a culture velocity of 0.65-1.35 m s⁻¹ and a pH set-point for CO₂ injection of 8.0. The highest volumetric (0.08 ± 0.01 g L⁻¹ d⁻¹) and areal (20.3 ± 3.2 g m⁻² d⁻¹) biomass productivities were attained in the 100-m³ PBR, compared to those of the 35-m³ PBR (0.05 ± 0.02 g L⁻¹ d⁻¹ and 13.5 ± 4.3 g m⁻² d⁻¹, respectively). Lipid contents were similar in both PBRs (9-10% of ash-free dry weight). CO₂ sequestration was followed in the 100-m³ PBR, revealing a mean CO₂ mitigation efficiency of 65% and a biomass-to-carbon ratio of 1.80. Tetraselmis sp. CTP4 is thus a robust candidate for industrial-scale production, with promising biomass productivities and photosynthetic efficiencies up to 3.5% of total solar irradiance.
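The CO₂ figures in this abstract can be combined into a rough carbon balance; the sketch below uses only the quoted biomass-to-carbon ratio, mitigation efficiency and areal productivity (the molar masses are standard constants):

```python
# Back-of-envelope CO2 balance for the 100-m3 PBR, using figures
# quoted in the abstract.
M_CO2_PER_C = 44.01 / 12.011   # g CO2 fixed per g carbon

biomass_to_carbon = 1.80       # g biomass per g carbon (abstract)
mitigation_eff = 0.65          # fraction of injected CO2 actually fixed
areal_prod = 20.3              # g biomass m-2 d-1 (100-m3 PBR)

carbon_fraction = 1.0 / biomass_to_carbon              # ~0.56 g C per g biomass
co2_fixed = areal_prod * carbon_fraction * M_CO2_PER_C  # g CO2 m-2 d-1 fixed
co2_injected = co2_fixed / mitigation_eff               # g CO2 m-2 d-1 supplied

print(f"CO2 fixed:    {co2_fixed:.1f} g m-2 d-1")
print(f"CO2 injected: {co2_injected:.1f} g m-2 d-1")
```

That is, each square metre of reactor footprint fixes roughly 40 g of CO₂ per day at the reported productivity, out of roughly 60 g supplied.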

  5. Large-scale production of graphitic carbon nitride with outstanding nitrogen photofixation ability via a convenient microwave treatment

    Energy Technology Data Exchange (ETDEWEB)

    Ma, Huiqiang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China); Shi, Zhenyu; Li, Shuang [College of Chemistry, Chemical Engineering, and Environmental Engineering, Liaoning Shihua University, Fushun 113001 (China); Liu, Na, E-mail: Naliujlu@163.com [College of Environment and Resources, Key Lab of Groundwater Resources and Environment, Ministry of Education, Jilin University, Changchun 130021 (China)

    2016-08-30

    Highlights: • Microwave method for synthesizing g-C₃N₄ with N₂ photofixation ability is reported. • Nitrogen vacancies play an important role in the nitrogen photofixation ability. • The present process is a convenient method for large-scale production of g-C₃N₄. - Abstract: A convenient microwave treatment for synthesizing graphitic carbon nitride (g-C₃N₄) with outstanding nitrogen photofixation ability under visible light is reported. X-ray diffraction (XRD), N₂ adsorption, UV–vis spectroscopy, SEM, N₂-TPD, EPR, photoluminescence (PL) and photocurrent measurements were used to characterize the prepared catalysts. The results indicate that microwave treatment forms many irregular pores in the as-prepared g-C₃N₄, which increases the surface area and the separation rate of electrons and holes. More importantly, microwave treatment creates many nitrogen vacancies in the as-prepared g-C₃N₄. These nitrogen vacancies not only serve as active sites that adsorb and activate N₂ molecules but also promote interfacial charge transfer from the catalyst to N₂ molecules, thus significantly improving the nitrogen photofixation ability. Moreover, the present process is a convenient method for the large-scale production of g-C₃N₄, which is important for practical applications.

  6. Large transverse momentum processes in a non-scaling parton model

    International Nuclear Information System (INIS)

    Stirling, W.J.

    1977-01-01

    The production of large transverse momentum mesons in hadronic collisions by the quark fusion mechanism is discussed in a parton model which gives logarithmic corrections to Bjorken scaling. It is found that the moments of the large transverse momentum structure function exhibit a simple scale breaking behaviour similar to the behaviour of the Drell-Yan and deep inelastic structure functions of the model. An estimate of corresponding experimental consequences is made and the extent to which analogous results can be expected in an asymptotically free gauge theory is discussed. A simple set of rules is presented for incorporating the logarithmic corrections to scaling into all covariant parton model calculations. (Auth.)

  7. Potential for large-scale uses for fission-product Xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-03-01

    Of all fission products in spent, low-enrichment-uranium power-reactor fuels, xenon is produced in the highest yield - nearly one cubic meter, STP, per metric ton. In aged fuels which may be considered for processing in the US, radioactive xenon isotopes approach the lowest limits of detection. Separation from the accompanying radioactive ⁸⁵Kr is the essential problem; however, this is state-of-the-art technology which has been demonstrated on the pilot scale to yield xenon with picocurie levels of ⁸⁵Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission-product krypton during fuel processing. Economic restraints ensure that the bulk of this krypton will need to be separated from the much more voluminous xenon fraction of the total amount of fission gas. Xenon may thus be discarded or made available for use at probably very low cost. In contrast with many other fission products, which have unique radioactive characteristics that make them useful as sources of heat, gamma and x-rays, and luminescence - as well as for medicinal diagnostics and therapeutics - fission-product xenon differs from naturally occurring xenon only in its isotopic composition, which gives it a slightly higher atomic weight because of the much higher concentrations of the ¹³⁴Xe and ¹³⁶Xe isotopes. Therefore, fission-product xenon can most likely find uses in applications which already exist but which cannot be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent lighting in place of krypton and in human anesthesia

  8. Engineering microbial cell factories for the production of plant natural products: from design principles to industrial-scale production.

    Science.gov (United States)

    Liu, Xiaonan; Ding, Wentao; Jiang, Huifeng

    2017-07-19

    Plant natural products (PNPs) are widely used as pharmaceuticals, nutraceuticals, seasonings, pigments, etc., with huge commercial value on the global market. However, most of these PNPs are still being extracted from plants. A resource-conserving and environment-friendly synthesis route for PNPs that utilizes microbial cell factories has attracted increasing attention since the 1940s. However, at present only a handful of PNPs are produced by microbial cell factories at an industrial scale, and there are still many challenges in their large-scale application. One of the challenges is that most biosynthetic pathways of PNPs are still unknown, which largely limits the number of candidate PNPs for heterologous microbial production. Another challenge is that the metabolic fluxes toward the target products in microbial hosts are often hindered by poor precursor supply, low catalytic activity of enzymes and obstructed product transport. Consequently, despite intensive studies on the metabolic engineering of microbial hosts, the fermentation costs of most heterologously produced PNPs are still too high for industrial-scale production. In this paper, we review several aspects of PNP production in microbial cell factories, including important design principles and recent progress in pathway mining and metabolic engineering. In addition, implemented cases of industrial-scale production of PNPs in microbial cell factories are also highlighted.

  9. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential hazards to the public and property from accidental, and more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data cover fires much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates, for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
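The nondimensional heat release rate mentioned above is conventionally defined as Q* = Q̇ / (ρ∞ cp T∞ √(gD) D²). As an illustration (not the Phoenix analysis itself), the sketch below evaluates Q* and Heskestad's widely used flame-height correlation for the 21 m pool, with an assumed total heat release rate:

```python
import math

# Illustrative pool-fire scaling; input heat release rate is an assumption,
# not a value from the Phoenix tests.

def q_star(Q_dot, D, rho=1.2, cp=1.0, T=293.0, g=9.81):
    """Nondimensional heat release rate; Q_dot in kW, cp in kJ/(kg K), D in m."""
    return Q_dot / (rho * cp * T * math.sqrt(g * D) * D**2)

def flame_height_heskestad(Q_dot, D):
    """Heskestad correlation: L = 0.235 * Q_dot**0.4 - 1.02 * D (kW, m)."""
    return 0.235 * Q_dot**0.4 - 1.02 * D

D = 21.0        # m, the smaller Phoenix pool diameter
Q_dot = 2.0e6   # kW, assumed total heat release rate for a pool this size

print(f"Q*  = {q_star(Q_dot, D):.2f}")
print(f"L/D = {flame_height_heskestad(Q_dot, D) / D:.1f}")
```

A Q* below 1, as here, is typical of large pool fires and is the regime where flame height to diameter ratios flatten out, which is why extrapolation from small burner tests needs care.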

  10. Large-scale production of megakaryocytes from human pluripotent stem cells by chemically defined forward programming.

    Science.gov (United States)

    Moreau, Thomas; Evans, Amanda L; Vasquez, Louella; Tijssen, Marloes R; Yan, Ying; Trotter, Matthew W; Howard, Daniel; Colzani, Maria; Arumugam, Meera; Wu, Wing Han; Dalby, Amanda; Lampela, Riina; Bouet, Guenaelle; Hobbs, Catherine M; Pask, Dean C; Payne, Holly; Ponomaryov, Tatyana; Brill, Alexander; Soranzo, Nicole; Ouwehand, Willem H; Pedersen, Roger A; Ghevaert, Cedric

    2016-04-07

    The production of megakaryocytes (MKs), the precursors of blood platelets, from human pluripotent stem cells (hPSCs) offers exciting clinical opportunities for transfusion medicine. Here we describe an original approach for the large-scale generation of MKs in chemically defined conditions using a forward programming strategy relying on the concurrent exogenous expression of three transcription factors: GATA1, FLI1 and TAL1. The forward programmed MKs proliferate and differentiate in culture for several months with MK purity over 90%, reaching up to 2 × 10⁵ mature MKs per input hPSC. Functional platelets are generated throughout the culture, allowing the prospective collection of several transfusion units from as few as 1 million starting hPSCs. The high cell purity and yield achieved by MK forward programming, combined with efficient cryopreservation and good manufacturing practice (GMP)-compatible culture, make this approach eminently suitable both for the in vitro production of platelets for transfusion and for basic research in MK and platelet biology.
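The yield claim ("several transfusion units from 1 million hPSCs") can be checked with simple arithmetic; in the sketch below, the platelets-per-MK and platelets-per-unit figures are illustrative assumptions, not values from the paper:

```python
# Rough yield arithmetic behind the abstract's transfusion-unit claim.
mk_per_hpsc = 2e5           # mature MKs per input hPSC (abstract)
starting_hpscs = 1e6        # starting cells (abstract)
platelets_per_mk = 5        # assumed in-vitro platelet yield per MK
platelets_per_unit = 3e11   # assumed platelets in one adult transfusion dose

total_mks = mk_per_hpsc * starting_hpscs                 # 2e11 MKs
units = total_mks * platelets_per_mk / platelets_per_unit
print(f"{total_mks:.1e} MKs -> roughly {units:.1f} transfusion units")
```

Even with a modest assumed platelet yield per MK, the arithmetic lands in the "several units" range the abstract describes.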

  11. Large-scale production of bioenergy by the side of fuel-peat; Bioenergian suurtuotanto polttoturpeen rinnalla

    Energy Technology Data Exchange (ETDEWEB)

    Heikkilae, K. [Vapo Oy, Jyvaeskylae (Finland)

    1996-12-31

    The objective of the project was to clarify the possibilities for, and the cost structure of, large-scale bioenergy production, and to develop operating practices so that smaller volumes of biomass are integrated into existing peat production and delivered in such a way that peat ensures the quality of the fuel supply as well as the prices and reliability of deliveries. The same organisation, machinery and volumes can thus be utilized. Operations will be designed to run all year round so that profitability can be improved. Another aim is to bring currently non-utilizable wood wastes into use, which would also serve silvicultural purposes. Utilizable municipal and other wastes and sludges could be blended into the biomass to make, using proper mixing ratios, biofuels precisely suited to the purposes of the customer. In grain-growing areas it is possible to utilize straw, and in coastal areas reed grass.

  12. Understory fern community structure, growth and spore production responses to a large-scale hurricane experiment in a Puerto Rico rainforest

    Science.gov (United States)

    Joanne M. Sharpe; Aaron B. Shiels

    2014-01-01

    Ferns are abundant in most rainforest understories yet their responses to hurricanes have not been well studied. Fern community structure, growth and spore production were monitored for two years before and five years after a large-scale experiment that simulated two key components of severe hurricane disturbance: canopy openness and debris deposition. The canopy was...

  13. Large-scale bioenergy production from soybeans and switchgrass in Argentina: Part A: Potential and economic feasibility for national and international markets

    NARCIS (Netherlands)

    van Dam, J.; Faaij, A.P.C.; Hilbert, J.; Petruzzi, H.; Turkenburg, W.C.

    2009-01-01

    This study focuses on the economic feasibility for large-scale biomass production from soybeans or switchgrass from a region in Argentina. This is determined, firstly, by estimating whether the potential supply of biomass, when food and feed demand are met, is sufficient under different scenarios to

  14. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories: model outputs, including coupled models and ensembles; data products that have been processed to a level of usability; and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, far exceeding the raw instrument outputs in volume. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on the infrastructure that supports how the information is managed reliably across distributed resources. Users necessarily rely on these underlying "black boxes" in order to productively generate new scientific outcomes. The software for these systems depends on computational infrastructure, interconnected software systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, through system software stacks and libraries, to the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which previous outcomes remain relevant and can be updated with the new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.

  15. Development of large scale production of Nd-doped phosphate glasses for megajoule-scale laser systems

    International Nuclear Information System (INIS)

    Ficini, G.; Campbell, J.H.

    1996-01-01

    Nd-doped phosphate glasses are the preferred gain medium for high-peak-power lasers used for Inertial Confinement Fusion research because they have excellent energy storage and extraction characteristics. In addition, these glasses can be manufactured defect-free in large sizes and at relatively low cost. To meet the requirements of future megajoule-size lasers, advanced laser glass manufacturing methods are being developed that would enable laser glass to be continuously produced at the rate of several thousand large (790 × 440 × 44 mm³) plates of glass per year. This represents a 10- to 100-fold improvement in the scale of present manufacturing technology.
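The quoted plate dimensions imply a substantial annual melt volume; a back-of-envelope sketch follows, in which the plate count and glass density are assumptions (the abstract only says "several thousand" plates):

```python
# Scale of the continuous-melting requirement implied by the abstract.
plate_mm = (790, 440, 44)    # plate dimensions from the abstract, mm
plates_per_year = 3000       # assumed ("several thousand")
density = 2.6e3              # kg/m3, typical phosphate laser glass (assumed)

volume_m3 = plate_mm[0] * plate_mm[1] * plate_mm[2] * 1e-9  # per plate, m3
annual_volume = volume_m3 * plates_per_year                 # m3 per year
annual_mass_t = annual_volume * density / 1e3               # tonnes per year

print(f"{volume_m3 * 1e3:.1f} L of glass per plate")
print(f"about {annual_volume:.0f} m3 ({annual_mass_t:.0f} t) of glass per year")
```

Each plate is about 15 litres of finished glass, so "several thousand plates per year" means melting on the order of tens of cubic metres, i.e. over a hundred tonnes, of defect-free laser glass annually.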

  16. Rain forest nutrient cycling and productivity in response to large-scale litter manipulation.

    Science.gov (United States)

    Wood, Tana E; Lawrence, Deborah; Clark, Deborah A; Chazdon, Robin L

    2009-01-01

    Litter-induced pulses of nutrient availability could play an important role in the productivity and nutrient cycling of forested ecosystems, especially tropical forests. Tropical forests experience such pulses as a result of wet-dry seasonality and during major climatic events, such as strong El Niños. We hypothesized that (1) an increase in the quantity and quality of litter inputs would stimulate leaf litter production, woody growth, and leaf litter nutrient cycling, and (2) the timing and magnitude of this response would be influenced by soil fertility and forest age. To test these hypotheses in a Costa Rican wet tropical forest, we established a large-scale litter manipulation experiment in two secondary forest sites and four old-growth forest sites of differing soil fertility. In replicated plots at each site, leaves and twigs were removed from, or added to, the forest floor. We analyzed leaf litter mass, [N] and [P], and N and P inputs for addition, removal, and control plots over a two-year period. We also evaluated the basal area increment of trees in removal and addition plots. There was no response of forest productivity or nutrient cycling to litter removal; however, litter addition significantly increased leaf litter production and N and P inputs 4-5 months following litter application. Litter production increased as much as 92%, and P and N inputs as much as 85% and 156%, respectively. In contrast, litter manipulation had no significant effect on woody growth. The increases in leaf litter production and N and P inputs were significantly positively related to the total P that was applied in litter form. Neither litter treatment nor forest type influenced the temporal pattern of any of the variables measured. Thus, environmental factors such as rainfall drive temporal variability in litter and nutrient inputs, while nutrient release from decomposing litter influences the magnitude. Seasonal or annual variation in leaf litter mass, such as occurs in strong El Niño events, could positively

  17. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  18. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. It is now widely recognized that updating these established geospatial databases and keeping them up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there exist two main types of methods for geospatial database updating: direct updating with remote sensing images or field surveying materials, and indirect updating with other updated data, such as newly updated larger-scale data. The former method is the basis, because the update data sources in both methods ultimately derive from field surveying and remote sensing; the latter method, however, is often more economical and faster. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the multi-scale geospatial databases consistent. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. This is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from large-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing the updating of geospatial data from large-scale sources, including technical
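The indirect, generalization-based updating workflow described above can be sketched in a few lines; all names and data structures below are hypothetical simplifications, not part of the paper:

```python
# Hypothetical sketch of propagating updates from a large-scale database to a
# smaller-scale one via generalization.  Real systems also need feature
# matching, conflict resolution and proper generalization operators.

def generalize(feature, target_scale):
    """Placeholder for map generalization (selection, simplification, ...)."""
    return {**feature, "scale": target_scale}

def propagate_updates(changes, small_scale_db, target_scale):
    """Apply each large-scale change to the smaller-scale database."""
    for feature in changes:
        small_scale_db[feature["id"]] = generalize(feature, target_scale)
    return small_scale_db

# A newly surveyed road captured in the 1:10,000 database...
changes_10k = [{"id": "road_17", "scale": 10_000,
                "geometry": [(0, 0), (1, 1), (2, 1)]}]

# ...is propagated into the 1:50,000 database.
db_50k = propagate_updates(changes_10k, {}, target_scale=50_000)
print(db_50k["road_17"]["scale"])
```

The design point is that only the large-scale database is updated from survey or imagery; smaller scales are derived from it, which is what keeps the multi-scale databases consistent.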

  19. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ~10⁶ cores and sustained performance over ~2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios. (paper)

  20. Development of small-scale peat production; Pienturvetuotannon kehittaeminen

    Energy Technology Data Exchange (ETDEWEB)

    Erkkilae, A.; Kallio, E. [VTT Energy, Jyvaeskylae (Finland)

    1997-12-01

    The aim of the project is to develop the production conditions, methods and technology of small-scale peat production to such a level that productivity is improved and competitiveness maintained. The aim in 1996 was to survey the present status of small-scale peat production and its research and development needs, and to prepare a development plan for small-scale peat production for a continued project in 1997 and for the longer term. A questionnaire was sent to producers by mail, and its results were completed by phone interviews. Responses were obtained from 164 producers, i.e. from about 75 - 85 % of small-scale peat producers. The quantity of energy peat produced by these producers amounted to 3.3 TWh and that of other peat to 265 000 m{sup 3}. The total production of energy peat (large-scale producers Vapo Oy and Turveruukki Oy included) amounted to 25.0 TWh in 1996 in Finland, of which 91 % (22.8 TWh) was milled peat and 9 % (2.2 TWh) sod peat. The total production of peat other than energy peat amounted to 1.4 million m{sup 3}. The proportion of small-scale production was 13 % of energy peat, 11 % of milled peat and 38 % of sod peat. Small-scale producers accounted for 18 % of other peat production. The results deviate clearly from those obtained in a study of small-scale production in the 1980s: the amount of small-scale production is clearly larger than generally assessed, and small-scale production focuses more on milled peat than on sod peat. The work will be continued in 1997. Based on the development needs that emerged from the questionnaire, the aims are to reduce the environmental impacts and runoff effluents of small-scale production, to increase the efficiency of peat deliveries, and to reduce peat production costs by improving the service value of machines through increased co-operative use. (orig.)
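    The reported shares are internally consistent and can be checked directly from the production figures quoted in the abstract:

```python
# Figures quoted in the abstract (TWh).
small_energy_peat = 3.3
total_energy_peat = 25.0
milled, sod = 22.8, 2.2

print(round(100 * small_energy_peat / total_energy_peat))  # small-scale share of energy peat
print(round(100 * milled / total_energy_peat))             # milled peat share
print(round(100 * sod / total_energy_peat))                # sod peat share
```

These reproduce the abstract's 13 %, 91 % and 9 % respectively.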

  1. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
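    The flavor of the decomposition described above can be illustrated with a minimal dual-decomposition sketch (an assumption about the method's general shape, not the paper's actual formulation): each unit solves a small local quadratic subproblem for a given price, and the aggregator adjusts the price until total production meets demand. The cost coefficients and demand are hypothetical.

```python
def balance(a, demand, steps=200, lr=0.5):
    """Dual decomposition: each unit i minimizes a[i]*p**2 - price*p
    locally (closed form: p = price / (2*a[i])); the aggregator raises
    the price while total production falls short of demand."""
    price = 0.0
    p = [0.0] * len(a)
    for _ in range(steps):
        p = [price / (2.0 * ai) for ai in a]   # local optima, solved in parallel
        price += lr * (demand - sum(p))        # subgradient price update
    return p

p = balance([1.0, 2.0, 4.0], demand=7.0)
print([round(x, 3) for x in p], round(sum(p), 3))  # -> [4.0, 2.0, 1.0] 7.0
```

Cheaper units (small a) end up producing more, and the units never share their cost functions with each other, which is the point of solving the subproblems locally.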

  2. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  3. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Jeff [Colorado School of Mines, Golden, CO (United States); Boak, Jeremy [Colorado School of Mines, Golden, CO (United States); Washington, Jeremy [Colorado School of Mines, Golden, CO (United States); Sharp, Cory [Colorado School of Mines, Golden, CO (United States)

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  4. Inkjet printing as a roll-to-roll compatible technology for the production of large area electronic devices on a pre-industrial scale

    NARCIS (Netherlands)

    Teunissen, P.; Rubingh, E.; Lammeren, T. van; Abbel, R.J.; Groen, P.

    2014-01-01

    Inkjet printing is a promising approach towards the solution processing of electronic devices on an industrial scale. Of particular interest is the production of high-end applications such as large area OLEDs on flexible substrates. Roll-to-roll (R2R) processing technologies involving inkjet

  5. A new large-scale manufacturing platform for complex biopharmaceuticals.

    Science.gov (United States)

    Vogel, Jens H; Nguyen, Huong; Giovannini, Roberto; Ignowski, Jolene; Garger, Steve; Salgotra, Anil; Tom, Jennifer

    2012-12-01

    Complex biopharmaceuticals, such as recombinant blood coagulation factors, are addressing critical medical needs and represent a growing multibillion-dollar market. For commercial manufacturing of such molecules, which are sometimes inherently unstable, it is important to minimize product residence time in non-ideal milieu in order to obtain acceptable yields and consistently high product quality. Continuous perfusion cell culture allows minimization of residence time in the bioreactor, but also brings unique challenges in product recovery, which require innovative solutions. In order to maximize yield, process efficiency, and facility and equipment utilization, we have developed, scaled up and successfully implemented a new integrated manufacturing platform at commercial scale. This platform consists of a (semi-)continuous cell separation process based on a disposable flow path and integrated with the upstream perfusion operation, followed by membrane chromatography on large-scale adsorber capsules in rapid cycling mode. Implementation of the platform at commercial scale for a new product candidate led to a yield improvement of 40% compared to the conventional process technology, while product quality has been shown to be more consistently high. Over 1,000,000 L of cell culture harvest have been processed with 100% success rate to date, demonstrating the robustness of the new platform process in GMP manufacturing. While membrane chromatography is well established for polishing in flow-through mode, this is its first commercial-scale application for bind/elute chromatography in the biopharmaceutical industry and demonstrates its potential in particular for manufacturing of potent, low-dose biopharmaceuticals. Copyright © 2012 Wiley Periodicals, Inc.

  6. Iodine oxides in large-scale THAI tests

    International Nuclear Information System (INIS)

    Funke, F.; Langrock, G.; Kanzleiter, T.; Poss, G.; Fischer, K.; Kühnel, A.; Weber, G.; Allelein, H.-J.

    2012-01-01

    Highlights: ► Iodine oxide particles were produced from gaseous iodine and ozone. ► Ozone replaced the effect of ionizing radiation in the large-scale THAI facility. ► The mean diameter of the iodine oxide particles was about 0.35 μm. ► Particle formation was faster than the chemical reaction between iodine and ozone. ► Deposition of iodine oxide particles was slow in the absence of other aerosols. - Abstract: The conversion of gaseous molecular iodine into iodine oxide aerosols has significant relevance for the understanding of fission product iodine volatility in a LWR containment during severe accidents. In containment, the high radiation field caused by fission products released from the reactor core induces radiolytic oxidation into iodine oxides. To study the characteristics and behaviour of iodine oxides at large scale, two THAI tests, Iod-13 and Iod-14, were performed, simulating the radiolytic oxidation of molecular iodine by the reaction of iodine with ozone, with the ozone injected from an ozone generator. The observed iodine oxides form submicron particles with mean volume-related diameters of about 0.35 μm and show low deposition rates in the THAI tests performed in the absence of other nuclear aerosols. Formation of iodine aerosols from the gaseous precursors iodine and ozone is fast compared to their chemical interaction. The current approach in empirical iodine containment behaviour models for severe accidents, including the radiolytic production of I2-oxidizing agents followed by the I2 oxidation itself, is confirmed by these THAI tests.

  7. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless operation of large scale integration (LSI) and very large scale integration (VLSI) circuits. It presents a comparative analysis of the factors that determine the faultlessness of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless operation of LSI and VLSI. The main part describes a proposed algorithm and a program for analyzing the fault rate in LSI and VLSI circuits.

  8. Large-scale production of UO2 kernels by sol–gel process at INET

    International Nuclear Information System (INIS)

    Hao, Shaochang; Ma, Jingtao; Zhao, Xingyu; Wang, Yang; Zhou, Xiangwen; Deng, Changsheng

    2014-01-01

    In order to supply fuel elements (300,000 elements per year) for the Chinese pebble bed modular high temperature gas cooled reactor (HTR-PM), it is necessary to scale up the production of UO2 kernels to 3–6 kgU per batch. The sol–gel process for the preparation of UO2 kernels has been improved and optimized at the Institute of Nuclear and New Energy Technology (INET), Tsinghua University, PR China, and a complete set of facilities was designed and constructed based on the process. This report briefly describes the main steps of the process, the key equipment, and the production capacity of each step. Six batches of kernels for scale-up verification and four batches of kernels for fuel elements for in-pile irradiation tests have been successfully produced. The quality of the produced kernels meets the design requirements, and the production capacity of the process reaches 3–6 kgU per batch

  9. EFFECTS OF LARGE-SCALE POULTRY FARMS ON AQUATIC MICROBIAL COMMUNITIES: A MOLECULAR INVESTIGATION.

    Science.gov (United States)

    The effects of large-scale poultry production operations on water quality and human health are largely unknown. Poultry litter is frequently applied as fertilizer to agricultural lands adjacent to large poultry farms. Run-off from the land introduces a variety of stressors into t...

  10. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source of the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies of eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
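    The core quantity in this record, the Detrended Fluctuation Analysis scaling exponent, can be sketched in a few lines. This is the textbook DFA-1 procedure, not the authors' code: integrate the mean-subtracted series, detrend non-overlapping windows with a linear fit, and read the exponent α off the log-log slope of the fluctuation function F(n). For uncorrelated noise, α should come out near 0.5.

```python
import math
import random

def linfit(xs, ys):
    """Ordinary least-squares line fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def dfa_alpha(series, scales):
    """DFA-1: cumulate the mean-subtracted series, linearly detrend
    non-overlapping windows of size n, and return the log-log slope
    of the RMS fluctuation F(n) versus n."""
    mean = sum(series) / len(series)
    profile, acc = [], 0.0
    for x in series:
        acc += x - mean
        profile.append(acc)
    log_n, log_f = [], []
    for n in scales:
        resid2, count = 0.0, 0
        for start in range(0, len(profile) - n + 1, n):
            seg = profile[start:start + n]
            xs = list(range(n))
            slope, intercept = linfit(xs, seg)
            resid2 += sum((y - (slope * x + intercept)) ** 2
                          for x, y in zip(xs, seg))
            count += n
        log_n.append(math.log(n))
        log_f.append(0.5 * math.log(resid2 / count))
    alpha, _ = linfit(log_n, log_f)
    return alpha

random.seed(1)
noise = [random.gauss(0, 1) for _ in range(2048)]
alpha = dfa_alpha(noise, [8, 16, 32, 64, 128])
print(round(alpha, 2))
```

The paper's method goes further, tracking *local* variations of α along the sequence to delimit isochore-like regions; this sketch computes only the single global exponent.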

  11. Charm production and mass scales in deep inelastic processes

    International Nuclear Information System (INIS)

    Close, F.E.; Scott, D.M.; Sivers, D.

    1976-07-01

    Because of their large mass, the production of charmed particles offers the possibility of new insight into fundamental dynamics. An approach to deep inelastic processes is discussed in which Generalized Vector Meson Dominance is used to extend parton model results away from the strict Bjorken scaling limit into regions where mass scales play an important role. The processes of e⁺e⁻ annihilation, photoproduction, deep inelastic leptoproduction, photon-photon scattering and the production of lepton pairs in hadronic collisions are discussed. The GVMD approach provides a reasonably unified framework and makes specific predictions concerning the way in which these reactions reflect an underlying flavour symmetry broken by large mass differences. (author)

  12. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Experience in operating and developing such a system shows that the only reasonable way to gain strong management control is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified; then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application to large scale models and data bases

  13. Biological hydrogen production by dark fermentation: challenges and prospects towards scaled-up production.

    Science.gov (United States)

    Ren Nanqi; Guo Wanqian; Liu Bingfeng; Cao Guangli; Ding Jie

    2011-06-01

    Among different technologies of hydrogen production, bio-hydrogen production exhibits perhaps the greatest potential to replace fossil fuels. Based on recent research on dark fermentative hydrogen production, this article reviews the following aspects towards scaled-up application of this technology: bioreactor development and parameter optimization, process modeling and simulation, exploitation of cheaper raw materials and combining dark-fermentation with photo-fermentation. Bioreactors are necessary for dark-fermentation hydrogen production, so the design of reactor type and optimization of parameters are essential. Process modeling and simulation can help engineers design and optimize large-scale systems and operations. Use of cheaper raw materials will surely accelerate the pace of scaled-up production of biological hydrogen. And finally, combining dark-fermentation with photo-fermentation holds considerable promise, and has successfully achieved maximum overall hydrogen yield from a single substrate. Future development of bio-hydrogen production will also be discussed. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with large number of high capacity nodes and transmission links, and shared by a large number of users...

  15. Engineered catalytic biofilms for continuous large scale production of n-octanol and (S)-styrene oxide.

    Science.gov (United States)

    Gross, Rainer; Buehler, Katja; Schmid, Andreas

    2013-02-01

    This study evaluates the technical feasibility of biofilm-based biotransformations at an industrial scale by theoretically designing a process employing membrane fiber modules as used in the chemical industry, and compares the resulting process parameters to classical stirred-tank studies. To our knowledge, catalytic biofilm processes for fine chemicals production have so far not been reported on a technical scale. As model reactions, we applied the previously studied asymmetric styrene epoxidation employing Pseudomonas sp. strain VLB120ΔC biofilms and the selective alkane hydroxylation described here. Using the non-heme iron containing alkane hydroxylase system (AlkBGT) from P. putida Gpo1 in the recombinant P. putida PpS81 pBT10 biofilm, we were able to continuously produce 1-octanol from octane with a maximal productivity of 1.3 g L⁻¹(aq) day⁻¹ in a single-tube micro reactor. For a possible industrial application, a cylindrical membrane fiber module packed with 84,000 polypropylene fibers is proposed. Based on the calculations presented here, 59 membrane fiber modules (of 0.9 m diameter and 2 m length) would suffice to realize a production process of 1,000 tons/year of styrene oxide. Moreover, the product yield on carbon can at least be doubled, and over 400-fold less biomass waste would be generated compared to classical stirred-tank reactor processes. For the octanol process, instead, further intensification of biological activity and/or enlargement of the membrane surface is required to reach production scale. Taking into consideration challenges such as controlling biomass growth and maintaining a constant biological activity, this study shows that a biofilm process at an industrial scale for the production of fine chemicals is a sustainable alternative in terms of product yield and biomass waste production. Copyright © 2012 Wiley Periodicals, Inc.

  16. Large-scale freestanding nanometer-thick graphite pellicles for mass production of nanodevices beyond 10 nm.

    Science.gov (United States)

    Kim, Seul-Gi; Shin, Dong-Wook; Kim, Taesung; Kim, Sooyoung; Lee, Jung Hun; Lee, Chang Gu; Yang, Cheol-Woong; Lee, Sungjoo; Cho, Sang Jin; Jeon, Hwan Chul; Kim, Mun Ja; Kim, Byung-Gook; Yoo, Ji-Beom

    2015-09-21

    Extreme ultraviolet lithography (EUVL) has received much attention in the semiconductor industry as a promising candidate to extend dimensional scaling beyond 10 nm. We present a new pellicle material, nanometer-thick graphite film (NGF), which shows an extreme ultraviolet (EUV) transmission of 92% at a thickness of 18 nm. The maximum temperature induced by laser irradiation (λ = 800 nm) at 9.9 W cm⁻² was 267 °C, due to the high thermal conductivity of the NGF. The freestanding NGF was found to be chemically stable during annealing at 500 °C in a hydrogen environment. A 50 × 50 mm large-area freestanding NGF was fabricated using the wet and dry transfer (WaDT) method. The NGF can be used as an EUVL pellicle for the mass production of nanodevices beyond 10 nm.

  17. Prelude to rational scale-up of penicillin production: a scale-down study.

    Science.gov (United States)

    Wang, Guan; Chu, Ju; Noorman, Henk; Xia, Jianye; Tang, Wenjun; Zhuang, Yingping; Zhang, Siliang

    2014-03-01

    Penicillin is one of the best known pharmaceuticals and is also an important member of the β-lactam antibiotics. Over the years, ambitious yields, titers, productivities, and low costs in the production of the β-lactam antibiotics have been stepwise realized through successive rounds of strain improvement and process optimization. Penicillium chrysogenum was proven to be an ideal cell factory for the production of penicillin, and successful approaches were exploited to elevate the production titer. However, the industrial production of penicillin faces the serious challenge that environmental gradients, which are caused by insufficient mixing and mass transfer limitations, exert a considerably negative impact on the ultimate productivity and yield. Scale-down studies regarding diverse environmental gradients have been carried out on bacteria, yeasts, and filamentous fungi as well as animal cells. In accordance, a variety of scale-down devices combined with fast sampling and quenching protocols have been established to acquire the true snapshots of the perturbed cellular conditions. The perturbed metabolome information stemming from scale-down studies contributed to the comprehension of the production process and the identification of improvement approaches. However, little is known about the influence of the flow field and the mechanisms of intracellular metabolism. Consequently, it is still rather difficult to realize a fully rational scale-up. In the future, developing a computer framework to simulate the flow field of the large-scale fermenters is highly recommended. Furthermore, a metabolically structured kinetic model directly related to the production of penicillin will be further coupled to the fluid flow dynamics. A mathematical model including the information from both computational fluid dynamics and chemical reaction dynamics will then be established for the prediction of detailed information over the entire period of the fermentation process and

  18. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale of 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  19. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of such systems. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes housed in large machine rooms, managers can easily confuse machines. How can accurate management be carried out effectively in a large-scale cluster system? This article introduces ELFms for large-scale cluster systems, and furthermore proposes how to realize automatic management of such systems. (authors)

  20. Parameter and State Estimation of Large-Scale Complex Systems Using Python Tools

    Directory of Open Access Journals (Sweden)

    M. Anushka S. Perera

    2015-07-01

    This paper discusses topics related to automating the parameter, disturbance and state estimation analysis of large-scale complex nonlinear dynamic systems using free programming tools. For large-scale complex systems, before implementing any state estimator, the system should be analyzed for structural observability, and the structural observability analysis can be automated using Modelica and Python. As a result of the structural observability analysis, the system may be decomposed into subsystems, some of which may be observable (with respect to parameters, disturbances, and states) while others may not. The state estimation process is carried out for the observable subsystems, and the optimum number of additional measurements is prescribed for the unobservable subsystems to make them observable. In this paper, an industrial case study is considered: the copper production process at Glencore Nikkelverk, Kristiansand, Norway. The copper production process is a large-scale complex system. It is shown how to implement various state estimators, in Python, to estimate parameters and disturbances, in addition to states, based on available measurements.
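    A necessary condition in structural observability analysis is that every state can influence some measured output (output connectability); states outside that set are candidates for additional measurements. A minimal pure-Python sketch of that check, with hypothetical variable names (the paper itself uses Modelica models together with Python tooling, not this code):

```python
from collections import deque

def output_connected(edges, measured):
    """edges: (u, v) pairs meaning variable u directly influences v.
    Returns the set of variables with a directed path to a measured
    output; states outside this set violate a necessary condition
    for structural observability."""
    reverse = {}
    for u, v in edges:
        reverse.setdefault(v, []).append(u)
    # Walk the influence graph backwards from the measured outputs.
    seen, queue = set(measured), deque(measured)
    while queue:
        v = queue.popleft()
        for u in reverse.get(v, []):
            if u not in seen:
                seen.add(u)
                queue.append(u)
    return seen

# Hypothetical influence graph: x1 -> x2 -> y1, x4 -> x2, x3 isolated.
edges = [("x1", "x2"), ("x2", "y1"), ("x3", "x3"), ("x4", "x2")]
reachable = output_connected(edges, ["y1"])
print(sorted(n for n in ("x1", "x2", "x3", "x4") if n not in reachable))
```

Here the check flags x3 as unreachable from the single sensor, mirroring the paper's workflow of prescribing extra measurements for unobservable subsystems. Full structural observability additionally requires a rank (maximum-matching) condition, which this sketch omits.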

  1. Estimating GHG emission mitigation supply curves of large-scale biomass use on a country level

    International Nuclear Information System (INIS)

    Dornburg, Veronika; Dam, Jinke van; Faaij, Andre

    2007-01-01

    This study evaluates the possible influences of a large-scale introduction of biomass material and energy systems, and of their market volumes, on land, material and energy market prices, and the feedback of these prices on greenhouse gas (GHG) emission mitigation costs. GHG emission mitigation supply curves for large-scale biomass use were compiled using a methodology that combines a bottom-up analysis of biomass applications, biomass cost supply curves, and market prices of land, biomaterials and bioenergy carriers. These market prices depend on the scale of biomass use and the market volume of materials and energy carriers, and were estimated using own-price elasticities of demand. The methodology was demonstrated for a case study of Poland in the year 2015, applying different scenarios of economic development and trade in Europe. For the key technologies considered, i.e. medium density fibreboard, polylactic acid, electricity and methanol production, GHG emission mitigation costs increase strongly with the scale of biomass production. Large-scale introduction of biomass use decreases the GHG emission reduction potential at costs below 50 Euro/Mg CO2-eq by about 13-70%, depending on the scenario. Biomaterial production accounts for only a small part of this GHG emission reduction potential, due to the relatively small material markets and the resulting strong decrease of biomaterial market prices at large scales of production. GHG emission mitigation costs depend strongly on biomass supply curves, the own-price elasticity of land, and the market volumes of bioenergy carriers. The analysis shows that these influences should be taken into account when developing biomass implementation strategies
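    The role of own-price elasticities in this analysis can be made concrete with a constant-elasticity demand curve, Q = Q0·(P/P0)^ε, inverted to give the market price at a new supplied quantity. The numbers below are hypothetical illustrations, not values from the Poland case study:

```python
def price_at_volume(p0, q0, q, elasticity):
    """Constant-elasticity demand Q = q0 * (P/p0)**elasticity, inverted
    to give the market price when the supplied quantity moves to q.
    elasticity is negative for ordinary goods."""
    return p0 * (q / q0) ** (1.0 / elasticity)

# Hypothetical numbers: doubling supply into a small, inelastic market
# (own-price elasticity -0.5) collapses the price to a quarter.
print(price_at_volume(100.0, 1.0, 2.0, -0.5))  # -> 25.0
```

This is the mechanism behind the paper's finding that biomaterial prices fall steeply at large production scales: small markets with inelastic demand absorb extra volume only at sharply lower prices, which in turn raises the GHG mitigation cost per tonne.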

  2. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  3. Potential Impact of Large Scale Abstraction on the Quality of Shallow ...

    African Journals Online (AJOL)

    PRO

    Significant increase in crop production would not, however, be ... sounding) using Geonics EM34-3 and Abem SAS300C Terrameter to determine the aquifer (fresh water lens) ..... Final report on environmental impact assessment of large scale.

  4. The potential for large scale uses for fission product xenon

    International Nuclear Information System (INIS)

    Rohrmann, C.A.

    1983-01-01

    Of all fission products in spent, low enrichment, uranium, power reactor fuels xenon is produced in the highest yield - nearly one cubic meter, STP, per metric ton. In aged fuels which may be considered for processing in the U.S. radioactive xenon isotopes approach the lowest limits of detection. The separation from accompanying radioactive 85 Kr is the essential problem; however, this is state of the art technology which has been demonstrated on the pilot scale to yield xenon with pico-curie levels of 85 Kr contamination. If needed for special applications, such levels could be further reduced. Environmental considerations require the isolation of essentially all fission product krypton during fuel processing. Economic restraints assure that the bulk of this krypton will need to be separated from the much more voluminous xenon fraction of the total amount of fission gas. Xenon may thus be discarded or made available for uses at probably very low cost. In contrast with many other fission products which have unique radioactive characteristics which make them useful as sources of heat, gamma and x-rays and luminescence as well as for medicinal diagnostics and therapeutics fission product xenon differs from naturally occurring xenon only in its isotopic composition which gives it a slightly higher atomic weight, because of the much higher concentrations of the 134 X and 136 Xe isotopes. Therefore, fission product xenon can most likely find uses in applications which already exist but which can not be exploited most beneficially because of the high cost and scarcity of natural xenon. Unique uses would probably include applications in improved incandescent light illumination in place of krypton and in human anesthesia

  5. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    \\catcode`\\@=11 \\ialign{m @th#1hfil ##hfil \\crcr#2\\crcr\\sim\\crcr}}} \\catcode`\\@=12 Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~{{1} /{4}} of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  6. Development of polymers for large scale roll-to-roll processing of polymer solar cells

    DEFF Research Database (Denmark)

    Carlé, Jon Eggert

    Conjugated polymers' potential to both absorb light and transport current, together with the prospect of low-cost, large-scale production, has made these kinds of materials attractive in solar cell research. The research field of polymer solar cells (PSCs) is rapidly progressing along three lines: improvement of efficiency and stability, together with the introduction of large-scale production methods. All three lines are explored in this work. The thesis describes low band gap polymers and why these are needed. Polymers of this type display broader absorption, resulting in better overlap with the solar spectrum and potentially higher current density. The synthesis, characterization and device performance of three series of polymers illustrate how the absorption spectrum of polymers can be manipulated synthetically

  7. Using value stream mapping technique through the lean production transformation process: An implementation in a large-scaled tractor company

    Directory of Open Access Journals (Sweden)

    Mehmet Rıza Adalı

    2017-04-01

Full Text Available In today's competitive world, manufacturing industries must continually reduce their costs to sustain their development and continuity. The first step in a lean production transformation is to distinguish value-adding activities from non-value-adding activities. This study applies the concepts of Value Stream Mapping (VSM) in a large-scale tractor company in Sakarya. Waste and process times were identified by mapping the current state of the platform production line. A future state was proposed with improvements to eliminate waste and reduce lead time, which fell from 13.08 to 4.35 days. Analyses of the current and future states support the suggested improvements, and the cycle time of the platform production line was improved by 8%. The results show that VSM is a good alternative for decision-making about production process change.
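The arithmetic behind the reported improvements is simple enough to sketch. Only the 13.08 and 4.35 day lead times come from the abstract; the helper names below are hypothetical, not from the study.

```python
# Minimal sketch of the basic VSM metrics. The lead times (13.08 -> 4.35
# days) are the figures reported in the abstract; the function names are
# illustrative, not from the study.

def lead_time_reduction(current_days: float, future_days: float) -> float:
    """Percentage reduction in production lead time."""
    return (current_days - future_days) / current_days * 100.0

def value_added_ratio(value_added_days: float, lead_time_days: float) -> float:
    """Share of the lead time spent on value-adding work."""
    return value_added_days / lead_time_days * 100.0

reduction = lead_time_reduction(13.08, 4.35)
print(f"Lead time cut by {reduction:.1f}%")
```

A ratio of value-adding time to total lead time, computed before and after the transformation, is the usual headline metric of such a study.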

  8. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Strzalka, Aneta; Alam, Nazmul; Duminil, Eric; Coors, Volker; Eicker, Ursula

    2012-01-01

Highlights: ► We implement the photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large-scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building users' electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart/Germany was used to calculate the potential of photovoltaic energy and to evaluate the local self-consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available, and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter's total electricity consumption, and half of this generated electricity is directly used within the buildings.
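The hour-by-hour comparison of production and demand that yields the self-consumption figure can be sketched as follows. The 24-hour profiles are invented placeholders (the study used simulated PV output and metered consumption); only the method, taking the hourly minimum of production and demand, is the point.

```python
# Sketch of the hourly self-consumption calculation described above.
# Both profiles are hypothetical placeholders, not the study's data.

pv_kwh = [0.0]*6 + [0.2, 0.8, 1.5, 2.2, 2.8, 3.0,
                    3.0, 2.7, 2.1, 1.3, 0.6, 0.1] + [0.0]*6
load_kwh = [2.4]*24  # flat demand placeholder

# Energy used directly in the building: hourly min of production and demand.
direct_use = sum(min(p, l) for p, l in zip(pv_kwh, load_kwh))
production = sum(pv_kwh)
consumption = sum(load_kwh)

print(f"PV covers {production / consumption:.0%} of demand")
print(f"{direct_use / production:.0%} of PV output is used directly")
```

With realistic, non-flat demand profiles the directly-used share drops, which is how the study arrives at roughly half of the generated electricity being consumed on site.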

  9. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location. Large-scale positive fluctuations correlate with stronger small-scale activity on the low-speed side of the mixing layer, and with reduced activity on the high-speed side. However, on physical grounds we would expect the scales to interact in a qualitatively similar way throughout the flow and across different turbulent flows. Therefore, the modulation of the small scales by the large-scale gradients, rather than by the large-scale fluctuations, has additionally been investigated.

  10. Large-scale Lurgi plant would be uneconomic: study group

    Energy Technology Data Exchange (ETDEWEB)

    1964-03-21

The Gas Council and the National Coal Board agreed that building a large-scale Lurgi plant on the basis of the study is not at present acceptable on economic grounds. The committee considered that new processes based on naphtha offered more economic sources of base- and peak-load production. Tables listing data provided in the contractors' design studies and a summary of the contractors' process designs are included.

  11. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

Galactic conformity is an observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations however further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  12. Scale-up and optimization of biohydrogen production reactor from laboratory-scale to industrial-scale on the basis of computational fluid dynamics simulation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Xu; Ding, Jie; Guo, Wan-Qian; Ren, Nan-Qi [State Key Laboratory of Urban Water Resource and Environment, Harbin Institute of Technology, 202 Haihe Road, Nangang District, Harbin, Heilongjiang 150090 (China)

    2010-10-15

    The objective of conducting experiments in a laboratory is to gain data that helps in designing and operating large-scale biological processes. However, the scale-up and design of industrial-scale biohydrogen production reactors is still uncertain. In this paper, an established and proven Eulerian-Eulerian computational fluid dynamics (CFD) model was employed to perform hydrodynamics assessments of an industrial-scale continuous stirred-tank reactor (CSTR) for biohydrogen production. The merits of the laboratory-scale CSTR and industrial-scale CSTR were compared and analyzed on the basis of CFD simulation. The outcomes demonstrated that there are many parameters that need to be optimized in the industrial-scale reactor, such as the velocity field and stagnation zone. According to the results of hydrodynamics evaluation, the structure of industrial-scale CSTR was optimized and the results are positive in terms of advancing the industrialization of biohydrogen production. (author)

  13. Testing of Large-Scale ICV Glasses with Hanford LAW Simulant

    Energy Technology Data Exchange (ETDEWEB)

    Hrma, Pavel R.; Kim, Dong-Sang; Vienna, John D.; Matyas, Josef; Smith, Donald E.; Schweiger, Michael J.; Yeager, John D.

    2005-03-01

Preliminary glass compositions for immobilizing Hanford low-activity waste (LAW) by the in-container vitrification (ICV) process were initially fabricated at crucible- and engineering-scale, including simulants and actual (radioactive) LAW. Glasses were characterized for vapor hydration test (VHT) and product consistency test (PCT) responses and crystallinity (both quenched and slow-cooled samples). Selected glasses were tested for toxicity characteristic leach procedure (TCLP) responses, viscosity, and electrical conductivity. This testing showed that glasses with LAW loading of 20 mass% can be made readily and meet all product constraints by a wide margin. Glasses with over 22 mass% Na2O can be made to meet all other product quality and process constraints. Large-scale testing was performed at the AMEC, Geomelt Division facility in Richland. Three tests were conducted using simulated LAW with increasing loadings of 12, 17, and 20 mass% Na2O. Glass samples were taken from the test products in a manner to represent the full expected range of product performance. These samples were characterized for composition, density, crystalline and non-crystalline phase assemblage, and durability using the VHT, PCT, and TCLP tests. The results, presented in this report, show that the AMEC ICV product meets all waste form requirements with a large margin. These results provide strong evidence that the Hanford LAW can be successfully vitrified by the ICV technology and can meet all the constraints related to product quality. The economic feasibility of the ICV technology can be further enhanced by subsequent optimization.

  14. How Close We Are to Achieving Commercially Viable Large-Scale Photobiological Hydrogen Production by Cyanobacteria: A Review of the Biological Aspects

    Science.gov (United States)

    Sakurai, Hidehiro; Masukawa, Hajime; Kitashima, Masaharu; Inoue, Kazuhito

    2015-01-01

    Photobiological production of H2 by cyanobacteria is considered to be an ideal source of renewable energy because the inputs, water and sunlight, are abundant. The products of photobiological systems are H2 and O2; the H2 can be used as the energy source of fuel cells, etc., which generate electricity at high efficiencies and minimal pollution, as the waste product is H2O. Overall, production of commercially viable algal fuels in any form, including biomass and biodiesel, is challenging, and the very few systems that are operational have yet to be evaluated. In this paper we will: briefly review some of the necessary conditions for economical production, summarize the reports of photobiological H2 production by cyanobacteria, present our schemes for future production, and discuss the necessity for further progress in the research needed to achieve commercially viable large-scale H2 production. PMID:25793279

  15. Market competitive Fischer-Tropsch diesel production. Techno-economic and environmental analysis of a thermo-chemical Biorefinery process for large scale biosyngas-derived FT-diesel production

    International Nuclear Information System (INIS)

    Van Ree, R.; Van der Drift, A.; Zwart, R.W.R.; Boerrigter, H.

    2005-08-01

    The contents of the presentation are summarized as follows: Introduction of the Dutch policy framework, Biomass availability and contractibility, and Biomass transportation fuels: current use and perspectives; Next subject concerns Large-scale BioSyngas production: optimum gasification technology; slagging EF-gasifier; identification and modelling biomass-conversion chains; overall energetic chain efficiencies, economics, environmental char; and a comparison with fossil-derived diesel. Further subjects are Current technological SOTA and R, D and D-trajectory; Pre-design 600 MWth demonstration plant; and the Conclusions

  16. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  17. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords : algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009
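One of the record's keywords, limited-memory quasi-Newton methods, can be illustrated with the textbook two-loop recursion. This is a generic sketch on a small quadratic test problem, not code from the LSA package itself.

```python
import numpy as np

# Generic L-BFGS two-loop recursion (a textbook sketch of the
# "limited-memory methods" keyword; NOT code from the LSA package).
# Test problem: minimize f(x) = 0.5 x'Ax - b'x for a well-conditioned
# SPD matrix A, i.e. solve A x = b.

n, m = 200, 5                                         # dimension, history length
A = 4 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)  # SPD test matrix
b = np.ones(n)
grad = lambda x: A @ x - b

def two_loop(g, S, Y):
    """Apply the implicit inverse-Hessian approximation to g."""
    q, alphas = g.copy(), []
    for s, y in zip(reversed(S), reversed(Y)):
        a = (s @ q) / (y @ s)
        alphas.append(a)
        q -= a * y
    if S:                                             # initial scaling (s'y)/(y'y)
        q *= (S[-1] @ Y[-1]) / (Y[-1] @ Y[-1])
    for (s, y), a in zip(zip(S, Y), reversed(alphas)):
        q += (a - (y @ q) / (y @ s)) * s
    return q

x, S, Y = np.zeros(n), [], []
for _ in range(500):
    g = grad(x)
    if np.linalg.norm(g) < 1e-10:
        break
    p = -two_loop(g, S, Y)
    t = -(g @ p) / (p @ (A @ p))                      # exact line search (quadratic f)
    x_new = x + t * p
    S.append(x_new - x); Y.append(grad(x_new) - g)
    S, Y = S[-m:], Y[-m:]                             # keep only the last m pairs
    x = x_new

print("residual:", np.linalg.norm(A @ x - b))
```

The point of the limited memory is that only m pairs of n-vectors are stored, so the cost per iteration is O(mn) instead of the O(n²) a dense quasi-Newton update would require.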

  18. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

© 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
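The kind of correlation analysis described, splitting a signal into large and small scales by filtering, then correlating a local small-scale activity measure against the large-scale field and its gradient, can be sketched on a one-dimensional signal. The signal here is synthetic noise, not DNS data, so only the procedure (not the physics) is shown.

```python
import numpy as np

# Procedure sketch of the scale-interaction analysis: low-pass filter a
# signal, measure local small-scale activity as the r.m.s. of the residual,
# and correlate it with the large-scale field and with its gradient.
# The input is synthetic noise, not turbulence data.

rng = np.random.default_rng(0)
u = np.cumsum(rng.standard_normal(20000)) * 0.01 + rng.standard_normal(20000)

def box_filter(x, w):
    """Simple moving-average low-pass filter of width w."""
    return np.convolve(x, np.ones(w) / w, mode="same")

w = 201
u_large = box_filter(u, w)                        # large scales (low-pass)
u_small = u - u_large                             # small-scale residual
activity = np.sqrt(box_filter(u_small**2, w))     # local small-scale r.m.s.
dudx = np.gradient(u_large)                       # large-scale gradient

r_fluct = np.corrcoef(activity, u_large)[0, 1]
r_grad = np.corrcoef(activity, np.abs(dudx))[0, 1]
print(f"corr(activity, large-scale u)      = {r_fluct:+.2f}")
print(f"corr(activity, |large-scale dudx|) = {r_grad:+.2f}")
```

The study's finding is that, in the mixing layer, the second correlation is nearly constant across the flow while the first changes sign from one side to the other.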

  19. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines: basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  20. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  1. New systems for the large-scale production of male tsetse flies (Diptera: Glossinidae)

    International Nuclear Information System (INIS)

    Opiyo, E.; Luger, D.; Robinson, A.S.

    2000-01-01

morsitans morsitans Westwood produced a total of 500,000 sterile males. In Burkina Faso, between 1976 and 1984, a colony of 330,000 G. palpalis gambiensis Vanderplank and G. tachinoides Westwood provided 950,000 sterile males for release into an area of 3,000 km² (Clair et al. 1990) while during the Bicot project in Nigeria in an area of 1,500 km², 1.5 million sterile male G. p. palpalis Robineau-Desvoidy were released (Olandunmade et al. 1990). Recently, 8.5 million sterile males were released on Unguja Island, Zanzibar, the United Republic of Tanzania in an area of 1,600 km² produced by a colony of about 600,000 G. austeni Newstead (Saleh et al. 1997, Kitwika et al. 1997). This led to the eradication of the tsetse population and a massive reduction in disease incidence in cattle (Saleh et al. 1997). Tsetse fly SIT has been applied on a limited scale because of the inability to provide large numbers of sterile males for release. The present rearing system is labour intensive and too many quality sensitive steps in the mass production system are not sufficiently standardised to transfer the system directly to large-scale production. Tsetse rearing evolved from feeding on live hosts to an in vitro rearing system where blood is fed to flies through a silicone membrane (Feldmann 1994a). At present, cages are small, hold a small number of flies and have to be manually transferred for feeding and then returned for pupal collection. This limits the number of flies that can be handled at any one time. In order to improve these processes, a Tsetse Production Unit (TPU) was developed and evaluated. During conventional tsetse rearing, flies need to be sexed with the correct number and sex of flies, whether for stocking production cages or for the release of males only. This has to be done by hand on an individual fly basis following the immobilisation of adults at C. 
A procedure is reported in this paper for the self-stocking of production cages (SSPC) which enables flies to

  2. Large-Scale Fabrication of Silicon Nanowires for Solar Energy Applications.

    Science.gov (United States)

    Zhang, Bingchang; Jie, Jiansheng; Zhang, Xiujuan; Ou, Xuemei; Zhang, Xiaohong

    2017-10-11

The development of silicon (Si) materials during past decades has boosted the prosperity of the modern semiconductor industry. In comparison with bulk-Si materials, Si nanowires (SiNWs) possess superior structural, optical, and electrical properties and have attracted increasing attention in solar energy applications. To achieve practical applications of SiNWs, both large-scale synthesis of SiNWs at low cost and rational design of energy conversion devices with high efficiency are prerequisites. This review focuses on the recent progress in large-scale production of SiNWs, as well as the construction of high-efficiency SiNW-based solar energy conversion devices, including photovoltaic devices and photo-electrochemical cells. Finally, the outlook and challenges in this emerging field are presented.

  3. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA.

    Science.gov (United States)

    Nandi, Anirban; Pan, Sharadwata; Potumarthi, Ravichandra; Danquah, Michael K; Sarethy, Indira P

    2014-01-01

    Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D). In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA). It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  4. A Proposal for Six Sigma Integration for Large-Scale Production of Penicillin G and Subsequent Conversion to 6-APA

    Directory of Open Access Journals (Sweden)

    Anirban Nandi

    2014-01-01

    Full Text Available Six Sigma methodology has been successfully applied to daily operations by several leading global private firms including GE and Motorola, to leverage their net profits. Comparatively, limited studies have been conducted to find out whether this highly successful methodology can be applied to research and development (R&D. In the current study, we have reviewed and proposed a process for a probable integration of Six Sigma methodology to large-scale production of Penicillin G and its subsequent conversion to 6-aminopenicillanic acid (6-APA. It is anticipated that the important aspects of quality control and quality assurance will highly benefit from the integration of Six Sigma methodology in mass production of Penicillin G and/or its conversion to 6-APA.

  5. Survey of large-scale solar water heaters installed in Taiwan, China

    Energy Technology Data Exchange (ETDEWEB)

    Chang Keh-Chin; Lee Tsong-Sheng; Chung Kung-Ming [Cheng Kung Univ., Tainan (China); Lien Ya-Feng; Lee Chine-An [Cheng Kung Univ. Research and Development Foundation, Tainan (China)

    2008-07-01

Almost all the solar collectors installed in Taiwan, China were used for production of hot water for homeowners (residential systems), in which the area of solar collectors is less than 10 square meters. From 2001 to 2006, only 39 large-scale systems (defined as having a solar collector area over 100 m{sup 2}) were installed. They are used for rooming houses (dormitories), swimming pools, restaurants, and manufacturing processes. A comprehensive survey of those large-scale solar water heaters was conducted in 2006. The objectives of the survey were to assess the systems' performance and to obtain feedback from the individual users. It was found that lack of experience in system design and maintenance are the key obstacles to reliable operation of a system. For further promotion of large-scale solar water heaters in Taiwan, a more comprehensive program on system design for manufacturing processes should be conducted. (orig.)

  6. Large-scale additive manufacturing with bioinspired cellulosic materials.

    Science.gov (United States)

    Sanandiya, Naresh D; Vijay, Yadunund; Dimopoulou, Marina; Dritsas, Stylianos; Fernandez, Javier G

    2018-06-05

Cellulose is the most abundant and broadly distributed organic compound and industrial by-product on Earth. However, despite decades of extensive research, the bottom-up use of cellulose to fabricate 3D objects is still plagued with problems that restrict its practical applications: derivatives with vast polluting effects, use in combination with plastics, lack of scalability and high production cost. Here we demonstrate the general use of cellulose to manufacture large 3D objects. Our approach diverges from the common association of cellulose with green plants and is inspired by the wall of the fungus-like oomycetes, which is reproduced by introducing small amounts of chitin between cellulose fibers. The resulting fungal-like adhesive material(s) (FLAM) are strong, lightweight and inexpensive, and can be molded or processed using woodworking techniques. We believe this first large-scale additive manufacture with ubiquitous biological polymers will be the catalyst for the transition to environmentally benign and circular manufacturing models.

  7. Evaluation of hollow fiber culture for large-scale production of mouse embryonic stem cell-derived hematopoietic stem cells.

    Science.gov (United States)

    Nakano, Yu; Iwanaga, Shinya; Mizumoto, Hiroshi; Kajiwara, Toshihisa

    2018-03-03

    Hematopoietic stem cells (HSCs) have the ability to differentiate into all types of blood cells and can be transplanted to treat blood disorders. However, it is difficult to obtain HSCs in large quantities because of the shortage of donors. Recent efforts have focused on acquiring HSCs by differentiation of pluripotent stem cells. As a conventional differentiation method of pluripotent stem cells, the formation of embryoid bodies (EBs) is often employed. However, the size of EBs is limited by depletion of oxygen and nutrients, which prevents them from being efficient for the production of HSCs. In this study, we developed a large-scale hematopoietic differentiation approach for mouse embryonic stem (ES) cells by applying a hollow fiber (HF)/organoid culture method. Cylindrical organoids, which had the potential for further spontaneous differentiation, were established inside of hollow fibers. Using this method, we improved the proliferation rate of mouse ES cells to produce an increased HSC population and achieved around a 40-fold higher production volume of HSCs in HF culture than in conventional EB culture. Therefore, the HF/organoid culture method may be a new mass culture method to acquire pluripotent stem cell-derived HSCs.

  8. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  9. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

The network companies in the Norwegian electricity industry now have to establish a large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  10. A review of large-scale solar heating systems in Europe

    International Nuclear Information System (INIS)

    Fisch, M.N.; Guigas, M.; Dalenback, J.O.

    1998-01-01

Large-scale solar applications benefit from the effect of scale. Compared to small solar domestic hot water (DHW) systems for single-family houses, the solar heat cost can be cut by at least a third. The most interesting projects for replacing fossil fuels and reducing CO₂ emissions are solar systems with seasonal storage in combination with gas or biomass boilers. In the framework of the EU-APAS project Large-scale Solar Heating Systems, thirteen existing plants in six European countries have been evaluated. The yearly solar gains of the systems are between 300 and 550 kWh per m² collector area. The investment cost of solar plants with short-term storage varies from 300 up to 600 ECU per m². Systems with seasonal storage show investment costs twice as high. Results of studies concerning the market potential for solar heating plants, taking new collector concepts and industrial production into account, are presented. Site-specific studies and predesign of large-scale solar heating plants for housing developments in six European countries show a 50% cost reduction compared to existing projects. The cost-benefit ratio for the planned systems with long-term storage is between 0.7 and 1.5 ECU per kWh per year. (author)

  11. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  12. Biofuel Development and Large-Scale Land Deals in Sub-Saharan Africa

    OpenAIRE

    Giorgia Giovannetti; Elisa Ticci

    2013-01-01

Africa's biofuel potential has increasingly attracted foreign investors' attention over the last ten years. We estimate the determinants of foreign investors' demand for land for biofuel production in SSA, using Poisson specifications of the gravity model. Our estimates suggest that land availability, abundance of water resources and weak land governance are significant determinants of large-scale land acquisitions for biofuel production. This in turn suggests that this type of investment is mainl...

  13. Efficient large-scale protein production of larvae and pupae of silkworm by Bombyx mori nuclear polyhedrosis virus bacmid system

    International Nuclear Information System (INIS)

    Motohashi, Tomoko; Shimojima, Tsukasa; Fukagawa, Tatsuo; Maenaka, Katsumi; Park, Enoch Y.

    2005-01-01

Silkworm is one of the most attractive hosts for large-scale production of eukaryotic proteins, as well as of recombinant baculoviruses for gene transfer to mammalian cells. The bacmid system of Autographa californica nuclear polyhedrosis virus (AcNPV) has already been established and is widely used. However, AcNPV cannot infect silkworm. We developed the first practical Bombyx mori nuclear polyhedrosis virus bacmid system directly applicable to protein expression in silkworm. Using this system, green fluorescence protein was successfully expressed in silkworm larvae and pupae not only by infection with its recombinant virus but also by direct injection of its bacmid DNA. This method provides rapid protein production in silkworm within as few as 10 days and is free from biohazard, and thus will be a powerful tool for the future production factory of recombinant eukaryotic proteins and baculoviruses

  14. Computational Modelling of Large Scale Phage Production Using a Two-Stage Batch Process

    Directory of Open Access Journals (Sweden)

    Konrad Krysiak-Baltyn

    2018-04-01

    Full Text Available Cost effective and scalable methods for phage production are required to meet an increasing demand for phage, as an alternative to antibiotics. Computational models can assist the optimization of such production processes. A model is developed here that can simulate the dynamics of phage population growth and production in a two-stage, self-cycling process. The model incorporates variable infection parameters as a function of bacterial growth rate and employs ordinary differential equations, allowing application to a setup with multiple reactors. The model provides simple cost estimates as a function of key operational parameters including substrate concentration, feed volume and cycling times. For the phage and bacteria pairing examined, costs and productivity varied by three orders of magnitude, with the lowest cost found to be most sensitive to the influent substrate concentration and low level setting in the first vessel. An example case study of phage production is also presented, showing how parameter values affect the production costs and estimating production times. The approach presented is flexible and can be used to optimize phage production at laboratory or factory scale by minimizing costs or maximizing productivity.
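The core of such a model, coupled differential equations for bacteria and phage with infection kinetics, can be sketched in stripped-down form. The structure below (susceptible bacteria S, infected bacteria I, free phage P, with a latent period and burst size) is a generic batch infection model, and all parameter values are illustrative placeholders, not the paper's.

```python
# Stripped-down batch phage-infection model of the kind the paper builds
# on, integrated with explicit Euler. Parameters are placeholders.

mu, K = 0.5, 1e9         # bacterial growth rate (1/h), carrying capacity (1/mL)
k = 1e-9                 # phage adsorption rate constant (mL/h)
tau, beta = 0.5, 100.0   # latent period (h), burst size (phage per lysis)
S, I, P = 1e8, 0.0, 1e6  # initial densities (per mL)

dt = 0.001
for _ in range(int(10 / dt)):            # simulate 10 h
    infection = k * S * P                # new infections per mL per h
    dS = mu * S * (1 - S / K) - infection
    dI = infection - I / tau             # infected cells lyse after ~tau
    dP = beta * I / tau - infection      # burst release minus adsorption
    S += dt * dS; I += dt * dI; P += dt * dP

print(f"after 10 h: S = {S:.2e}, P = {P:.2e} per mL")
```

In a run like this the phage population amplifies by several orders of magnitude while the bacterial host crashes; the paper's model layers cost terms and two-stage cycling on top of dynamics of this type.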

  15. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor-optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
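
    The tile low-rank idea can be illustrated in a few lines: an off-diagonal tile that is (numerically) low rank is stored as two thin factors instead of a dense block. The sketch below uses a greedy cross approximation with full pivoting on an exactly rank-3 tile (pure Python for clarity; HiCMA itself uses optimized compression kernels and an application-specified accuracy threshold):

```python
import random

def aca_compress(A, tol=1e-10, max_rank=None):
    """Greedy cross approximation: A ~= sum_k outer(u_k, v_k)."""
    n, m = len(A), len(A[0])
    R = [row[:] for row in A]          # explicit residual (fine for a toy)
    U, V = [], []
    max_rank = max_rank or min(n, m)
    for _ in range(max_rank):
        # pivot = entry of largest magnitude in the residual
        i, j = max(((r, c) for r in range(n) for c in range(m)),
                   key=lambda rc: abs(R[rc[0]][rc[1]]))
        piv = R[i][j]
        if abs(piv) < tol:             # accuracy threshold reached
            break
        u = [R[r][j] / piv for r in range(n)]
        v = R[i][:]
        U.append(u); V.append(v)
        for r in range(n):             # subtract the rank-1 cross
            for c in range(m):
                R[r][c] -= u[r] * v[c]
    return U, V

# Build an exactly rank-3 "off-diagonal tile" of size 64x64.
random.seed(0)
n, rank = 64, 3
B = [[random.gauss(0, 1) for _ in range(rank)] for _ in range(n)]
C = [[random.gauss(0, 1) for _ in range(n)] for _ in range(rank)]
A = [[sum(B[i][k] * C[k][j] for k in range(rank)) for j in range(n)]
     for i in range(n)]

U, V = aca_compress(A)
approx = lambda i, j: sum(U[k][i] * V[k][j] for k in range(len(U)))
err = max(abs(A[i][j] - approx(i, j)) for i in range(n) for j in range(n))
dense, tlr = n * n, len(U) * 2 * n
print(f"rank={len(U)} error={err:.2e} storage: {dense} vs {tlr} entries")
```

    For a 64 × 64 tile of rank 3 the factors need 384 entries instead of 4096, illustrating in miniature the order-of-magnitude memory saving the abstract reports at scale.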

  16. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor-optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.

  17. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are reported from tests of the material resistance to non-ductile fracture, covering both base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  18. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology. Enables practitioners to study distributed large scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  19. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large scale system integration that are not being addressed in other forums, such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean the time required to deploy, integrate and stabilize a large scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large scale system integration.

  20. Most experiments done so far with limited plants. Large-scale testing ...

    Indian Academy of Sciences (India)

    Most experiments done so far with limited plants. Large-scale testing needs to be done with objectives such as: Apart from primary transformants, their progenies must be tested. Experiments on segregation, production of homozygous lines, analysis of expression levels in ...

  1. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  2. The Role of Small-Scale Biofuel Production in Brazil: Lessons for Developing Countries

    Directory of Open Access Journals (Sweden)

    Arielle Muniz Kubota

    2017-07-01

    Full Text Available Small-scale biofuel initiatives to produce sugarcane ethanol are claimed to be a sustainable opportunity for ethanol supply, particularly for regions with price-restricted or no access to modern biofuels, such as communities located far from the large ethanol production centers in Brazil and family-farm communities in Sub-Saharan Africa, respectively. However, smallholders often struggle to achieve economic sustainability with ethanol microdistilleries. The aim of this paper is to provide an assessment of the challenges faced by small-scale bioenergy initiatives and discuss the conditions that would potentially make these initiatives economically feasible. Ethanol microdistilleries were assessed through a critical discussion of existing models and through an economic analysis of different sugarcane ethanol production models. The technical-economic analysis showed that the lack of competitiveness against large-scale ethanol distilleries, largely due to both low crop productivity and low process efficiency, makes it unlikely that small-scale distilleries can compete in the national/international ethanol market without governmental policies and subsidies. Nevertheless, small-scale projects intended for local supply and integrated food–fuel systems seem to be an interesting alternative that can potentially make ethanol production on small farms viable as well as increase food security and project sustainability, particularly for local communities in developing countries.

  3. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept: reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has great potential to improve productivity through a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in the organization of engineering problems. Aspects of systems engineering that provide an understanding of the management of large scale systems are broadly covered here. Due to the focus and application of the research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized, though they are considered.

  4. Advances in Large-Scale Solar Heating and Long Term Storage in Denmark

    DEFF Research Database (Denmark)

    Heller, Alfred

    2000-01-01

    According to information from the European Large-Scale Solar Heating Network (see http://www.hvac.chalmers.se/cshp/), the area of installed solar collectors for large-scale applications in Europe is approximately 8 million m2, corresponding to about 4000 MW of thermal power. Eleven of the total 51 plants are equipped with long-term storage. In Denmark, 7 plants are installed, comprising approx. 18,000 m2 of collector area, with new plants planned. Over the last 10 years the cost per collector area for the final installed plant has been kept constant, even as the solar production increased. Unfortunately, large-scale seasonal storage has not been able to keep up with the advances in solar technology, at least for pit water and gravel storage. The development of these plants and the involved technologies will be presented in this paper, with a focus on the improvements for Danish ...

  5. Innovation-driven efficient development of the Longwangmiao Fm large-scale sulfur gas reservoir in Moxi block, Sichuan Basin

    Directory of Open Access Journals (Sweden)

    Xinhua Ma

    2016-03-01

    Full Text Available The Lower Cambrian Longwangmiao Fm gas reservoir in the Moxi block of the Anyue Gas field, Sichuan Basin, is the largest single-sandbody integrated carbonate gas reservoir proved so far in China. Notwithstanding this reservoir's advantages, such as large-scale reserves and high single-well productivity, multiple complicated factors restrict its efficient development: a medium content of hydrogen sulfide, low porosity and strong heterogeneity of fracture–cave formations, various modes of gas–water occurrence, and a close relation between overpressure and stress sensitivity. Since only a few Cambrian large-scale carbonate gas reservoirs have ever been developed in the world, blind spots remain, especially regarding exploration and production rules. Besides, for large-scale sulfur gas reservoirs, exploration and construction are costly, and production testing in the early evaluation stage is severely limited, which brings great challenges and high potential risks to productivity construction. In this regard, in line with China's strategic demand for strengthening clean energy supply security, the PetroChina Southwest Oil & Gas Field Company has carried out research and field tests for the purpose of providing high-production wells, optimizing development design, rapidly constructing high-quality productivity and upgrading HSE security in the Longwangmiao Fm gas reservoir in the Moxi block. Through innovations in technology and management mode within 3 years, this gas reservoir has been built into a modern large-scale gas field with high quality, high efficiency and high benefit, and its annual capacity is now over 100 × 10^8 m3, with a desirable production capacity and development indexes as originally anticipated. It has become a new model of efficient development of large-scale gas reservoirs, providing a reference for other types of gas reservoirs in China.

  6. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size, e.g. in the number of degrees of freedom, the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  7. Preserving biological diversity in the face of large-scale demands for biofuels

    International Nuclear Information System (INIS)

    Cook, J.J.; Beyea, J.; Keeler, K.H.

    1991-01-01

    Large-scale production and harvesting of biomass to replace fossil fuels could reduce biological diversity by eliminating habitat for native species. Forests would be managed and harvested more intensively, and virtually all arable land unsuitable for high-value agriculture or silviculture might be used to grow crops dedicated to energy. Given the prospects for a potentially large increase in biofuel production, it is time now to develop strategies for mitigating the loss of biodiversity that might ensue. Planning at micro to macro scales will be crucial to minimize the ecological impacts of producing biofuels. In particular, cropping and harvesting systems will need to provide the biological, spatial, and temporal diversity characteristics of natural ecosystems and successional sequences, if we are to have this technology support the environmental health of the world rather than compromise it. Incorporation of these ecological values will be necessary to forestall costly environmental restoration, even at the cost of submaximal biomass productivity. It is therefore doubtful that all managers will take the longer view. Since the costs of biodiversity loss are largely external to economic markets, society cannot rely on the market to protect biodiversity, and some sort of intervention will be necessary. 116 refs., 1 tab

  8. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  9. Large-scale solvothermal synthesis of fluorescent carbon nanoparticles

    International Nuclear Information System (INIS)

    Ku, Kahoe; Park, Jinwoo; Kim, Nayon; Kim, Woong; Lee, Seung-Wook; Chung, Haegeun; Han, Chi-Hwan

    2014-01-01

    The large-scale production of high-quality carbon nanomaterials is highly desirable for a variety of applications. We demonstrate a novel synthetic route to the production of fluorescent carbon nanoparticles (CNPs) in large quantities via a single-step reaction. The simple heating of a mixture of benzaldehyde, ethanol and graphite oxide (GO) with residual sulfuric acid in an autoclave produced 7 g of CNPs with a quantum yield of 20%. The CNPs can be dispersed in various organic solvents; hence, they are easily incorporated into polymer composites in forms such as nanofibers and thin films. Additionally, we observed that the GO present during the CNP synthesis was reduced. The reduced GO (RGO) was sufficiently conductive (σ ≈ 282 S m⁻¹) such that it could be used as an electrode material in a supercapacitor; in addition, it can provide excellent capacitive behavior and high-rate capability. This work will contribute greatly to the development of efficient synthetic routes to diverse carbon nanomaterials, including CNPs and RGO, that are suitable for a wide range of applications. (paper)

  10. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems ...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  11. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures, held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  12. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  13. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints on Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  14. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  15. The Proposal of Scaling the Roles in Scrum of Scrums for Distributed Large Projects

    OpenAIRE

    Abeer M. AlMutairi; M. Rizwan Jameel Qureshi

    2015-01-01

    Scrum of scrums is an approach used to scale the traditional Scrum methodology to fit the development of complex and large projects. However, scaling the roles of scrum members has brought new challenges, especially in distributed and large software projects. This paper describes in detail the roles of each scrum member in scrum of scrums and proposes the use of a dedicated product owner for each team and the inclusion of a sub-backlog. The main goal of the proposed solution i...

  16. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P [PA Energy, Malling (Denmark); Vedde, J [SiCon. Silicon and PV consulting, Birkeroed (Denmark)

    2011-04-15

    Large scale PV (LPV) plants, plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of the global PV installations. In 2009 large scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers for LPV projects, but as so far no real LPV projects have been processed, these findings have to be regarded as preliminary. The fast growing number of very large scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with very low solar elevation during several winter months, PV installations on flat surfaces will have to balance the requirements of physical space and cost against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on flat roofs of large commercial buildings, PV installations on other large scale infrastructure such as noise barriers, and ground mounted PV installations. The technical potential for all three categories is found to be significant, in the range of 50-250 km2. In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10% in conversion of the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark of 33-35 TWh is about 300 km2. The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines.
A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
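
    The quoted figure of about 300 km2 follows from a back-of-envelope calculation. The sketch below assumes a rough Danish horizontal irradiation of 1000 kWh/m2 per year (a value not stated in the report) together with the 10% overall conversion efficiency from the text:

```python
# Back-of-envelope check of the ~300 km2 figure (insolation value assumed).
annual_demand_twh = 34            # midpoint of the 33-35 TWh quoted
insolation_kwh_per_m2 = 1000      # rough Danish horizontal irradiation/year
efficiency = 0.10                 # overall conversion efficiency from the text

yield_kwh_per_m2 = insolation_kwh_per_m2 * efficiency   # 100 kWh/m2/year
area_m2 = annual_demand_twh * 1e9 / yield_kwh_per_m2    # TWh -> kWh
print(f"required area = {area_m2 / 1e6:.0f} km2")       # -> 340 km2
```

    The result of roughly 340 km2 is consistent with the "about 300 km2" in the abstract, given the rounded inputs.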

  17. Analysis of Utilization of Fecal Resources in Large-scale Livestock and Poultry Breeding in China

    Directory of Open Access Journals (Sweden)

    XUAN Meng

    2018-02-01

    Full Text Available The purpose of this paper is to systematically investigate the serious problems of livestock and poultry breeding in China and the technical demand for promoting the utilization of manure. Based on the status quo of large-scale livestock and poultry farming in typical areas of China, statistics were compiled and analysed on the modes and proportions of utilization of manure resources. This statistical method was applied to the country-identified large-scale farms whose total pollutant reduction was in accordance with the "12th Five-Year Plan" standards. The results showed that there were some differences in the modes of resource utilization of livestock and poultry manure at different scales and types: (1) hogs, dairy cattle and beef cattle in total accounted for more than 75% of the agricultural manure storage; (2) laying hens and broiler chickens accounted for about 65% of the total production of organic manure. It is demonstrated that the major modes of resource utilization of dung and urine were related to the natural characteristics, agricultural production methods, farming scale and economic development level of the area. It was concluded that unreasonable planning, lack of cleaning during breeding, and inappropriate selection of manure utilization modes were the major problems in China's large-scale livestock and poultry fecal resource utilization.

  18. History matching of large scale fractures to production data; Calage de la geometrie des reseaux de fractures aux donnees hydrodynamiques de production d'un champ petrolier

    Energy Technology Data Exchange (ETDEWEB)

    Jenni, S.

    2005-01-01

    Object-based models are very helpful for representing complex geological media such as fractured reservoirs. To build realistic fracture networks, these models have to be constrained to both static (seismic, geomechanics, geology) and dynamic data (well tests and production history). In this report we present a procedure for the calibration of large-scale fracture networks to production history. The history matching procedure includes realistic geological modeling and a parameterization method that is coherent with the geological model and allows efficient optimization. Fluid flow modeling is based on a double medium approach. The calibration procedure was applied to a semi-synthetic case based on a real fractured reservoir, and calibration to water-cut data was performed. (author)

  19. Economic Model Predictive Control for Large-Scale and Distributed Energy Systems

    DEFF Research Database (Denmark)

    Standardi, Laura

    In this thesis, we consider control strategies for large and distributed energy systems that are important for the implementation of smart grid technologies. An electrical grid has to ensure reliability and avoid long-term interruptions in the power supply. Moreover, the share of Renewable Energy Sources (RESs) in the smart grids is increasing. These energy sources bring uncertainty to the production due to their fluctuations. Hence, smart grids need suitable control systems that are able to continuously balance power production and consumption. We apply the Economic Model Predictive Control (EMPC) strategy to optimise the economic performance of the energy systems and to balance the power production and consumption. In the case of large-scale energy systems, the electrical grid connects a high number of power units. Because of this, the related control problem involves a high number of variables......
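
    In spirit, EMPC repeatedly solves an optimization that balances production and consumption at minimum economic cost over a forecast horizon. The stripped-down sketch below dispatches hypothetical generating units in merit order against a demand forecast; because it includes no storage or ramp dynamics, greedy per-hour dispatch is already optimal here, whereas a real EMPC re-solves a coupled optimization at every time step:

```python
# Toy economic dispatch over a short horizon: meet forecast demand at minimum
# cost using units sorted by marginal cost (all unit data and demands invented).
units = [                      # (name, capacity MW, marginal cost EUR/MWh)
    ("wind", 40, 0.0),
    ("coal", 60, 30.0),
    ("gas",  50, 55.0),
]
demand_forecast = [70, 95, 120, 80]    # MW per hour over the horizon

plan, total_cost = [], 0.0
for demand in demand_forecast:
    dispatch, remaining = {}, demand
    for name, cap, cost in sorted(units, key=lambda u: u[2]):  # merit order
        mw = min(cap, remaining)       # cheapest units are loaded first
        dispatch[name] = mw
        total_cost += mw * cost
        remaining -= mw
    assert remaining == 0, "demand exceeds total capacity"
    plan.append(dispatch)

print(plan[2], f"total cost = {total_cost:.0f} EUR")
```

    In the peak hour (120 MW) the expensive gas unit covers only the residual 20 MW left after wind and coal, which is exactly the balancing behaviour the thesis optimises at grid scale with many more units and constraints.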

  20. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of the large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of the large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulations, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.

  1. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. For a better understanding of hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step.
This approach
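The discrete wavelet multiresolution step described in this record can be illustrated with a minimal Haar-style decomposition in NumPy. This is a sketch under our own assumptions: the study's actual wavelet basis, decomposition levels and software are not given here, and the data below are synthetic.

```python
import numpy as np

def haar_multiresolution(x, levels):
    """Split a signal into additive components, one per time scale, by
    successive block averaging (a simple Haar-style multiresolution).
    The components plus the final smooth trend sum back to the signal."""
    x = np.asarray(x, dtype=float)
    components = []
    approx = x.copy()
    block = 1
    for _ in range(levels):
        block *= 2
        n = len(x) - len(x) % block          # largest span divisible by block
        coarse = approx.copy()
        means = approx[:n].reshape(-1, block).mean(axis=1)
        coarse[:n] = np.repeat(means, block) # smooth at this scale
        components.append(approx - coarse)   # detail (high-frequency) part
        approx = coarse
    components.append(approx)                # residual long-term trend
    return components

rng = np.random.default_rng(0)
signal = rng.normal(size=120)                # e.g. 10 years of monthly data
parts = haar_multiresolution(signal, levels=3)
# The decomposition is additive: details + trend reconstruct the signal
assert np.allclose(sum(parts), signal)
```

Each detail component isolates variability at one temporal wavelength, which is what allows scale-dependent predictors (here, SLP patterns) to be fitted per component.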

  2. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  3. Application of plant metabonomics in quality assessment for large-scale production of traditional Chinese medicine.

    Science.gov (United States)

    Ning, Zhangchi; Lu, Cheng; Zhang, Yuxin; Zhao, Siyu; Liu, Baoqin; Xu, Xuegong; Liu, Yuanyan

    2013-07-01

    The curative effects of traditional Chinese medicines are principally based on the synergistic effect of their multi-targeting, multi-ingredient preparations, in contrast to modern pharmacology and drug development, which often focus on a single chemical entity. Methods that employ only a few markers or pharmacologically active constituents to assess the quality and authenticity of these complex preparations therefore face severe challenges. Metabonomics can provide an effective platform for complex sample analysis, and it has been applied to the quality analysis of traditional Chinese medicine. By means of chemometrics, metabonomics enables comprehensive assessment of complex traditional Chinese medicines or herbal remedies and the classification of samples of diverse biological status, origin, or quality. Identification, processing, and pharmaceutical preparation are the main procedures in the large-scale production of Chinese medicinal preparations. Through complete scans, plant metabonomics addresses some of the shortfalls of single-marker analyses and has considerable potential to become a powerful tool for traditional Chinese medicine quality assessment.

  4. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

    The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future

  5. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe

    NARCIS (Netherlands)

    Blaas, H.; Kroeze, C.

    2014-01-01

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed.

  6. Algorithm 873: LSTRS: MATLAB Software for Large-Scale Trust-Region Subproblems and Regularization

    DEFF Research Database (Denmark)

    Rojas Larrazabal, Marielba de la Caridad; Santos, Sandra A.; Sorensen, Danny C.

    2008-01-01

    A MATLAB 6.0 implementation of the LSTRS method is presented. LSTRS was described in Rojas, M., Santos, S.A., and Sorensen, D.C., A new matrix-free method for the large-scale trust-region subproblem, SIAM J. Optim., 11(3):611-646, 2000. LSTRS is designed for large-scale quadratic problems with one...... at each step. LSTRS relies on matrix-vector products only and has low and fixed storage requirements, features that make it suitable for large-scale computations. In the MATLAB implementation, the Hessian matrix of the quadratic objective function can be specified either explicitly, or in the form...... of a matrix-vector multiplication routine. Therefore, the implementation preserves the matrix-free nature of the method. A description of the LSTRS method and of the MATLAB software, version 1.2, is presented. Comparisons with other techniques and applications of the method are also included. A guide...
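LSTRS's defining feature, per the abstract, is that it touches the Hessian only through matrix-vector products. The NumPy sketch below illustrates that matrix-free interface; it is not the actual LSTRS API, and a plain power iteration stands in for the implicitly restarted Lanczos machinery that LSTRS actually uses.

```python
import numpy as np

def hessvec(v):
    """Apply H = tridiag(-1, 2, -1) without ever storing the matrix."""
    Hv = 2.0 * v
    Hv[:-1] -= v[1:]
    Hv[1:] -= v[:-1]
    return Hv

def largest_eig(matvec, n, iters=1000, seed=0):
    """Estimate the largest eigenvalue using matrix-vector products only."""
    rng = np.random.default_rng(seed)
    v = rng.normal(size=n)
    v /= np.linalg.norm(v)
    for _ in range(iters):
        w = matvec(v)
        v = w / np.linalg.norm(w)
    return v @ matvec(v)          # Rayleigh quotient at the converged vector

n = 20
lam = largest_eig(hessvec, n)
# Closed form for this tridiagonal matrix: 2 + 2*cos(pi/(n+1))
assert abs(lam - (2 + 2 * np.cos(np.pi / (n + 1)))) < 1e-6
```

Because the solver only ever calls `matvec`, storage stays O(n) regardless of how large (or dense) the implicit matrix is, which is the property the abstract highlights.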

  7. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI addresses the trust issues that arise in a large-scale healthcare network comprising multi-domain PKIs.

  8. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    Science.gov (United States)

    He, Qing; Li, Hong

    Belt conveyors are among the most important devices for transporting bulk solid material over long distances. Dynamic analysis is key to deciding whether a design is technically sound, safe and reliable in operation, and economically feasible, and studying dynamic properties is essential for improving efficiency and productivity and for guaranteeing safe, reliable and stable conveyor operation. The dynamic research on, and applications of, large-scale belt conveyors are discussed, and the main research topics and the state of the art of dynamic research on belt conveyors are analyzed. The main future work focuses on dynamic analysis, modeling and simulation of the main components and the whole system, as well as nonlinear modeling, simulation and vibration analysis of large-scale conveyor systems.
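A common way to make the dynamic analysis described above concrete is to model the belt as a chain of lumped masses joined by spring-damper elements. The NumPy sketch below integrates such a chain with a drive force at one end; all parameters are made up for illustration and do not come from the paper.

```python
import numpy as np

def simulate_belt(n=20, m=10.0, k=1.0e4, c=50.0, f_drive=100.0,
                  dt=1.0e-3, steps=20000):
    """Semi-implicit Euler integration of a 1-D mass-spring-damper chain.
    Returns the final nodal displacements; the drive force acts on node 0."""
    x = np.zeros(n)   # displacements
    v = np.zeros(n)   # velocities
    for _ in range(steps):
        stretch = np.diff(x)              # elongation of each belt segment
        rel_vel = np.diff(v)
        f_el = k * stretch + c * rel_vel  # internal spring-damper forces
        f = np.zeros(n)
        f[:-1] += f_el                    # pulls node i toward node i+1
        f[1:] -= f_el                     # and node i+1 toward node i
        f[0] += f_drive                   # drive pulley
        f -= 1.0 * v                      # linear ground friction (illustrative)
        v += dt * f / m                   # update velocities first (symplectic)
        x += dt * v
    return x

x = simulate_belt()
# The driven end leads the chain: displacement decays along the belt
assert x[0] > x[-1]
```

Even this toy model exhibits the phenomena the review discusses (elastic wave propagation along the belt, transient oscillations at start-up), which is why lumped-parameter chains are a standard starting point for conveyor dynamics.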

  9. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  10. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  11. Risk Management Challenges in Large-scale Energy PSS

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2017-01-01

    Probabilistic risk management approaches have a long tradition in engineering. A large variety of tools and techniques based on the probabilistic view of risk is available and applied in PSS practice. However, uncertainties that arise due to lack of knowledge and information are still missing...... adequate representations. We focus on a large-scale energy company in Denmark as one case of current product/servicesystems risk management best practices. We analyze their risk management process and investigate the tools they use in order to support decision making processes within the company. First, we...... identify the following challenges in the current risk management practices that are in line with literature: (1) current methods are not appropriate for the situations dominated by weak knowledge and information; (2) quality of traditional models in such situations is open to debate; (3) quality of input...

  12. Feasibility of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    International Nuclear Information System (INIS)

    Gnanapragasam, Nirmal V.; Reddy, Bale V.; Rosen, Marc A.

    2010-01-01

    A large-scale hydrogen production system is proposed using solid fuels and designed to increase the sustainability of alternative energy forms in Canada, and the technical and economic aspects of the system within the Canadian energy market are examined. The work investigates the feasibility and constraints in implementing such a system within the energy infrastructure of Canada. The proposed multi-conversion and single-function system produces hydrogen in large quantities using energy from solid fuels such as coal, tar sands, biomass, municipal solid waste (MSW) and agricultural/forest/industrial residue. The proposed system involves significant technology integration, with various energy conversion processes (such as gasification, chemical looping combustion, anaerobic digestion, combustion power cycles-electrolysis and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as feasible within cost, environmental and other constraints. The analysis involves quantitative and qualitative assessments based on (i) energy resources availability and demand for hydrogen, (ii) commercial viability of primary energy conversion technologies, (iii) academia, industry and government participation, (iv) sustainability and (v) economics. An illustrative example provides an initial road map for implementing such a system. (author)

  13. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  14. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  15. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  16. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200.000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo......

  17. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. (Only fragments of the abstract are legible; they concern reducing the spread in the LSGT 50% gap value for the worst charges, such as those with the highest or lowest densities.)

  18. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  19. Comparative Study of Laboratory-Scale and Prototypic Production-Scale Fuel Fabrication Processes and Product Characteristics

    International Nuclear Information System (INIS)

    2014-01-01

    An objective of the High Temperature Gas Reactor fuel development and qualification program for the United States Department of Energy has been to qualify fuel fabricated in prototypic production-scale equipment. The quality and characteristics of the tristructural isotropic (TRISO) coatings on fuel kernels are influenced by the equipment scale and processing parameters. Some characteristics affecting product quality were suppressed while others became more significant in the larger equipment. Changes to the composition and method of producing resinated graphite matrix material have eliminated the use of hazardous, flammable liquids and enabled it to be procured as a vendor-supplied feedstock. A new method of overcoating TRISO particles with the resinated graphite matrix eliminates the use of hazardous, flammable liquids, produces highly spherical particles with a narrow size distribution, and attains product yields in excess of 99%. Compact fabrication processes have been scaled up and automated with relatively minor changes in compact quality compared to manual laboratory-scale processes. The impact on statistical variability of the processes and the products as the equipment was scaled up is discussed. The prototypic production-scale processes produce test fuels that meet fuel quality specifications.

  20. Review of AVLIS technology for production-scale LIS systems and construction

    International Nuclear Information System (INIS)

    Davis, J.I.; Moses, E.I.

    1983-12-01

    The use of lasers for uranium and/or plutonium isotope separation is expected to be the first application of lasers utilizing specific atomic processes for large-scale materials processing. Specific accomplishments toward the development of production-scale technology for LIS systems will be presented, along with the status of major construction projects. 24 figures

  1. Biomethanol production from gasification of non-woody plant in South Africa: Optimum scale and economic performance

    International Nuclear Information System (INIS)

    Amigun, Bamikole; Gorgens, Johann; Knoetze, Hansie

    2010-01-01

    Methanol produced from biomass is a promising carbon-neutral fuel, well suited for use in fuel cell vehicles (FCVs), as a transportation fuel and as a chemical building block. The concept used in this study incorporates an innovative Absorption Enhanced Reforming (AER) gasification process, which enables an efficient conversion of biomass into a hydrogen-rich gas (syngas), and then uses the Mitsubishi methanol converter (superconverter) for methanol synthesis. Technical and economic prospects for the production of methanol have been evaluated. The methanol plants described have a biomass input between 10 and 2000 MWth. The economy of the methanol production plants is very dependent on the production capacity, and large-scale facilities are required to benefit from economies of scale. However, large-scale plants are likely to have higher transportation costs per unit biomass transported as a result of longer transportation distances. Analyses show that the lower unit investment costs accompanying increased production scale outweigh the cost of transporting larger quantities of biomass. The unit cost of methanol production mostly depends on the capital investments. The total unit cost of methanol is found to decrease from about 10.66 R/l for a 10 MWth plant to about 6.44 R/l for a 60 MWth plant and 3.95 R/l for a 400 MWth plant. The unit costs stabilise (a near-flat profile was observed) for plant sizes between 400 and 2000 MWth, but do continue to decrease, to about 2.89 R/l for a 2000 MWth plant. Long-term cost reduction mainly resides in technological learning and large-scale production. Therefore, technology development towards large-scale technology that takes into account sustainable biomass production could be a better choice for economic reasons.

  2. Biomethanol production from gasification of non-woody plant in South Africa: Optimum scale and economic performance

    Energy Technology Data Exchange (ETDEWEB)

    Amigun, Bamikole, E-mail: bamigun@csir.co.z [Sustainable Energy Futures, Natural Resources and the Environment, Council for Scientific and Industrial Research (CSIR), Pretoria (South Africa); Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa); Gorgens, Johann; Knoetze, Hansie [Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa)

    2010-01-15

    Methanol produced from biomass is a promising carbon-neutral fuel, well suited for use in fuel cell vehicles (FCVs), as a transportation fuel and as a chemical building block. The concept used in this study incorporates an innovative Absorption Enhanced Reforming (AER) gasification process, which enables an efficient conversion of biomass into a hydrogen-rich gas (syngas), and then uses the Mitsubishi methanol converter (superconverter) for methanol synthesis. Technical and economic prospects for the production of methanol have been evaluated. The methanol plants described have a biomass input between 10 and 2000 MW{sub th}. The economy of the methanol production plants is very dependent on the production capacity, and large-scale facilities are required to benefit from economies of scale. However, large-scale plants are likely to have higher transportation costs per unit biomass transported as a result of longer transportation distances. Analyses show that the lower unit investment costs accompanying increased production scale outweigh the cost of transporting larger quantities of biomass. The unit cost of methanol production mostly depends on the capital investments. The total unit cost of methanol is found to decrease from about 10.66 R/l for a 10 MW{sub th} plant to about 6.44 R/l for a 60 MW{sub th} plant and 3.95 R/l for a 400 MW{sub th} plant. The unit costs stabilise (a near-flat profile was observed) for plant sizes between 400 and 2000 MW{sub th}, but do continue to decrease, to about 2.89 R/l for a 2000 MW{sub th} plant. Long-term cost reduction mainly resides in technological learning and large-scale production. Therefore, technology development towards large-scale technology that takes into account sustainable biomass production could be a better choice for economic reasons.

  3. Biomethanol production from gasification of non-woody plant in South Africa. Optimum scale and economic performance

    Energy Technology Data Exchange (ETDEWEB)

    Amigun, Bamikole [Sustainable Energy Futures, Natural Resources and the Environment, Council for Scientific and Industrial Research (CSIR), Pretoria (South Africa); Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa); Gorgens, Johann; Knoetze, Hansie [Process Engineering Department, Stellenbosch University, Private Bag X1, Matieland, Stellenbosch 7602 (South Africa)

    2010-01-15

    Methanol produced from biomass is a promising carbon-neutral fuel, well suited for use in fuel cell vehicles (FCVs), as a transportation fuel and as a chemical building block. The concept used in this study incorporates an innovative Absorption Enhanced Reforming (AER) gasification process, which enables an efficient conversion of biomass into a hydrogen-rich gas (syngas), and then uses the Mitsubishi methanol converter (superconverter) for methanol synthesis. Technical and economic prospects for the production of methanol have been evaluated. The methanol plants described have a biomass input between 10 and 2000 MW{sub th}. The economy of the methanol production plants is very dependent on the production capacity, and large-scale facilities are required to benefit from economies of scale. However, large-scale plants are likely to have higher transportation costs per unit biomass transported as a result of longer transportation distances. Analyses show that the lower unit investment costs accompanying increased production scale outweigh the cost of transporting larger quantities of biomass. The unit cost of methanol production mostly depends on the capital investments. The total unit cost of methanol is found to decrease from about 10.66 R/l for a 10 MW{sub th} plant to about 6.44 R/l for a 60 MW{sub th} plant and 3.95 R/l for a 400 MW{sub th} plant. The unit costs stabilise (a near-flat profile was observed) for plant sizes between 400 and 2000 MW{sub th}, but do continue to decrease, to about 2.89 R/l for a 2000 MW{sub th} plant. Long-term cost reduction mainly resides in technological learning and large-scale production. Therefore, technology development towards large-scale technology that takes into account sustainable biomass production could be a better choice for economic reasons. (author)
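The unit costs quoted in these abstracts (about 10.66, 6.44, 3.95 and 2.89 R/l at 10, 60, 400 and 2000 MWth) trace out a classic economy-of-scale curve. A log-log least-squares fit illustrates the relationship; the power-law form and the fitted exponent are our illustration, not figures from the paper.

```python
import numpy as np

# Plant scale and unit methanol cost, as quoted in the abstract
scale = np.array([10.0, 60.0, 400.0, 2000.0])   # MWth
cost = np.array([10.66, 6.44, 3.95, 2.89])      # R/l

# Fit cost ~ c * scale**a in log-log space; polyfit returns [slope, intercept]
a, logc = np.polyfit(np.log(scale), np.log(cost), 1)
predicted = np.exp(logc) * scale ** a

# The negative exponent captures the economy of scale; the four quoted
# points follow the power law to within roughly 5-10%
assert a < 0
assert np.all(np.abs(predicted - cost) / cost < 0.10)
```

The flattening of the curve above 400 MWth that the authors describe is exactly what a small negative exponent predicts: each doubling of scale shaves off a progressively smaller absolute amount per litre.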

  4. Large Scale Generation and Characterization of Anti-Human IgA Monoclonal Antibody in Ascitic Fluid of Balb/c Mice

    OpenAIRE

    Fatemeh Ezzatifar; Jafar Majidi; Behzad Baradaran; Leili Aghebati Maleki; Jalal Abdolalizadeh; Mehdi Yousefi

    2015-01-01

    Purpose: Monoclonal antibodies are potentially powerful tools used in biomedical research, diagnosis, and treatment of infectious diseases and cancers. The monoclonal antibody against human IgA can be used as a diagnostic application to detect infectious diseases. The aim of this study was to establish an appropriate protocol for large-scale production of mAbs against IgA. Methods: For large-scale production of the monoclonal antibody, hybridoma cells that produce monoclonal antibodies again...

  5. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe.

    Science.gov (United States)

    Blaas, Harry; Kroeze, Carolien

    2014-10-15

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed. To this end, scenarios for the year 2050 are analysed, assuming that in the 27 countries of the European Union fossil diesel will be replaced by biodiesel from algae. Estimates are made for the required fertiliser inputs to algae parks, and how this may increase concentrations of nitrogen and phosphorus in coastal waters, potentially leading to eutrophication. The Global NEWS (Nutrient Export from WaterSheds) model has been used to estimate the transport of nitrogen and phosphorus to the European coastal waters. The results indicate that the amount of nitrogen and phosphorus in the coastal waters may increase considerably in the future as a result of large-scale production of algae for the production of biodiesel, even in scenarios assuming effective waste water treatment and recycling of waste water in algae production. To ensure sustainable production of biodiesel from micro-algae, it is important to develop cultivation systems with low nutrient losses to the environment.

  6. VisualRank: applying PageRank to large-scale image search.

    Science.gov (United States)

    Jing, Yushi; Baluja, Shumeet

    2008-11-01

    Because of the relative ease in understanding and processing text, commercial image-search systems often rely on techniques that are largely indistinguishable from text-search. Recently, academic studies have demonstrated the effectiveness of employing image-based features to provide alternative or additional signals. However, it remains uncertain whether such techniques will generalize to a large number of popular web queries, and whether the potential improvement to search quality warrants the additional computational cost. In this work, we cast the image-ranking problem into the task of identifying "authority" nodes on an inferred visual similarity graph and propose VisualRank to analyze the visual link structures among images. The images found to be "authorities" are chosen as those that answer the image-queries well. To understand the performance of such an approach in a real system, we conducted a series of large-scale experiments based on the task of retrieving images for 2000 of the most popular product queries. Our experimental results show significant improvement, in terms of user satisfaction and relevancy, in comparison to the most recent Google Image Search results. Maintaining modest computational cost is vital to ensuring that this procedure can be used in practice; we describe the techniques required to make this system practical for large-scale deployment in commercial search engines.
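The core of VisualRank as described above is ordinary PageRank run on a visual-similarity graph. A minimal NumPy sketch follows; the similarity matrix is made up for illustration, whereas a real system would build it from image features.

```python
import numpy as np

def pagerank(S, damping=0.85, iters=100):
    """Power iteration for PageRank on a weighted similarity matrix S."""
    S = np.array(S, dtype=float)      # copy; we modify it below
    n = S.shape[0]
    np.fill_diagonal(S, 0.0)          # no self-similarity votes
    col_sums = S.sum(axis=0)
    col_sums[col_sums == 0] = 1.0     # guard against dangling nodes
    M = S / col_sums                  # column-stochastic transition matrix
    r = np.full(n, 1.0 / n)           # uniform initial rank
    for _ in range(iters):
        r = (1 - damping) / n + damping * (M @ r)
    return r

# Image 0 is strongly similar to all others, so it should emerge
# as the "authority" node that best answers the query.
S = np.array([[0, 9, 8, 7],
              [9, 0, 1, 0],
              [8, 1, 0, 1],
              [7, 0, 1, 0]], dtype=float)
r = pagerank(S)
assert r.argmax() == 0
```

The damping term plays the same role as in web PageRank: it keeps the iteration well defined even when parts of the similarity graph are weakly connected.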

  7. Abnormal binding and disruption in large scale networks involved in human partial seizures

    Directory of Open Access Journals (Sweden)

    Bartolomei Fabrice

    2013-12-01

    There is a marked increase in the number of electrophysiological and neuroimaging studies dealing with large-scale brain connectivity in the epileptic brain. Our view of the epileptogenic process in the brain has largely evolved over the last twenty years, from the historical concept of an “epileptic focus” to a more complex description of “epileptogenic networks” involved in the genesis and “propagation” of epileptic activities. In particular, a large number of studies have been dedicated to the analysis of intracerebral EEG signals to characterize the dynamics of interactions between brain areas during temporal lobe seizures. These studies have reported that large-scale functional connectivity is dramatically altered during seizures, particularly during temporal lobe seizure genesis and development. Dramatic changes in neural synchrony provoked by epileptic rhythms are also responsible for the production of ictal symptoms or changes in patients’ behaviour, such as automatisms, emotional changes or consciousness alteration. Besides these studies dedicated to seizures, large-scale network connectivity during the interictal state has also been investigated, not only to define biomarkers of epileptogenicity but also to better understand the cognitive impairments observed between seizures.

  8. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  9. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such a reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability with which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
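The dimensionality-reduction step in this abstract can be illustrated with plain (unsupervised) kernel PCA in NumPy. The paper's supervised variant and its real moisture-flux fields differ from this synthetic sketch.

```python
import numpy as np

def rbf_kernel_pca(X, n_components=2, gamma=0.1):
    """Kernel PCA with an RBF kernel, implemented from scratch.
    Returns the coordinates of each sample in the reduced space."""
    X = np.asarray(X, dtype=float)
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise sq. distances
    K = np.exp(-gamma * sq)                              # RBF kernel matrix
    n = K.shape[0]
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one           # center in feature space
    vals, vecs = np.linalg.eigh(Kc)                      # ascending eigenvalues
    idx = np.argsort(vals)[::-1][:n_components]          # take the largest ones
    vals, vecs = vals[idx], vecs[:, idx]
    return vecs * np.sqrt(np.maximum(vals, 0.0))         # projected coordinates

rng = np.random.default_rng(1)
# Two well-separated groups of synthetic "moisture flux" vectors
X = np.vstack([rng.normal(0.0, 0.3, (20, 5)),
               rng.normal(3.0, 0.3, (20, 5))])
Z = rbf_kernel_pca(X)
# The first kernel principal component separates the two regimes
assert (Z[:20, 0].mean() > 0) != (Z[20:, 0].mean() > 0)
```

Clustering is then performed in the low-dimensional space `Z` rather than on the raw fields, which is what makes the regime interpretation in the abstract tractable.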

  10. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  11. Large scale hydrogen production from wind energy in the Magallanes area for consumption in the central zone of Chile

    International Nuclear Information System (INIS)

    Zolezzi, J.M.; Garay, A.; Reveco, M.

    2010-01-01

    The energy proposal of this research suggests the use of places with abundant wind resources for the production of H2 on a large scale, to be transported and used in the central zone of Chile with the purpose of diversifying the country's energy matrix in order to decrease its dependence on fossil fuels, increase its autonomy, and cover future increases in energy demand. This research showed that the load factor of the proposed wind park reaches a value of 54.5%, putting in evidence the excellent wind conditions of the zone. This implies that the electricity produced by the wind park located in the Chilean Patagonia would cost 0.0213 US$ kWh⁻¹ in the year 2030. The low price of the electricity obtained from the park, thanks to economies of scale and the huge wind potential, represents a very attractive scenario for the production of H2 in the future. The study concludes that by the year 2030 the cost of the H2 generated in Magallanes and transported to the port of Quinteros would be 18.36 US$ MBTU⁻¹, while by that time the cost of oil would be about 17.241 US$ MBTU⁻¹, a situation that places H2 in a very competitive position as a fuel. (author)
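
The headline figures can be sanity-checked with back-of-the-envelope arithmetic (cost figures from the abstract; the load-factor formula is the standard definition, the 100 MW park size is hypothetical, and neither reproduces the study's full cost model):

```python
# Load (capacity) factor: delivered energy over energy at continuous rated power.
def load_factor(energy_mwh, rated_mw, hours=8760):
    return energy_mwh / (rated_mw * hours)

# A hypothetical 100 MW park at the reported 54.5% load factor would deliver
# 0.545 * 100 MW * 8760 h = 477,420 MWh per year.

# Cost figures quoted for 2030 in the abstract:
h2_cost = 18.36       # US$/MBTU, H2 delivered to Quinteros
oil_cost = 17.241     # US$/MBTU
premium = (h2_cost - oil_cost) / oil_cost  # relative cost gap, about 6.5%
```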

  12. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

    necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation was proposed in this paper. The first phase is investment decision, while the second phase is production...... optimization decision. A multi-objective PSO (MOPSO) algorithm was introduced to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of Pareto-optimal front set as well. The feasibility and effectiveness of the proposed bi-level planning approach and the MOPSO...
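
The Pareto-optimality notion behind the MOPSO's "Pareto-optimal front set" can be sketched as follows (hypothetical objective pairs, minimizing both investment cost and production cost; this is the core dominance test, not the paper's actual bi-level formulation):

```python
def dominates(a, b):
    """a dominates b: no worse in every objective, strictly better in one (minimization)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Non-dominated subset -- the set a MOPSO approximates."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical (investment cost, expected production cost) pairs:
plans = [(10, 5), (8, 7), (12, 4), (9, 9), (8, 6)]
front = pareto_front(plans)
```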

  13. On the Large-Scaling Issues of Cloud-based Applications for Earth Science Data

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as NASA's SWOT and NISAR, whose SAR data volumes and data throughput rates are orders of magnitude larger than those of present-day missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. Additionally, traditional means of procuring hardware on-premise are already limited due to facilities capacity constraints for these new missions. Experience has shown that embracing efficient cloud computing approaches for large-scale science data systems requires more than just moving existing code to cloud environments: at large cloud scales, we need to deal with scaling and cost issues. We present our experiences in deploying multiple instances of our hybrid-cloud computing science data system (HySDS) to support large-scale processing of Earth Science data products. We explore optimization approaches to getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized to do processing on Amazon's spot market, which can potentially offer 75%-90% cost savings, but with an unpredictable computing environment driven by market forces.
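
A rough, assumed retry model illustrates why spot pricing can cut costs even though interrupted work must be re-run (prices and overhead fractions are hypothetical, not measured HySDS figures):

```python
# Expected cost per compute-hour of completed work on the spot market, under a
# simple (assumed) retry model: interrupted work is re-run, adding fractional overhead.
def expected_spot_cost(on_demand_price, discount, rework_fraction):
    spot_price = on_demand_price * (1.0 - discount)
    return spot_price * (1.0 + rework_fraction)

on_demand = 1.00                                   # $/instance-hour, hypothetical
best = expected_spot_cost(on_demand, 0.90, 0.10)   # 90% discount, 10% rework
worst = expected_spot_cost(on_demand, 0.75, 0.10)  # 75% discount, 10% rework
```

Even at the low end of the quoted discount range, the interruption overhead would have to be very large before spot computing loses to on-demand pricing.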

  14. HFSB-seeding for large-scale tomographic PIV in wind tunnels

    Science.gov (United States)

    Caridi, Giuseppe Carlo Alp; Ragni, Daniele; Sciacchitano, Andrea; Scarano, Fulvio

    2016-12-01

    A new system for large-scale tomographic particle image velocimetry in low-speed wind tunnels is presented. The system relies upon the use of sub-millimetre helium-filled soap bubbles as flow tracers, which scatter light with intensity several orders of magnitude higher than micron-sized droplets. With respect to a single bubble generator, the system increases the bubble emission rate by means of transient accumulation and rapid release. The governing parameters of the system are identified and discussed, namely the bubble production rate, the accumulation and release times, the size of the bubble injector and its location with respect to the wind tunnel contraction. The relations between the above parameters, the resulting spatial concentration of tracers and the measurement dynamic spatial range are obtained and discussed. Large-scale experiments are carried out in a large low-speed wind tunnel with a 2.85 × 2.85 m² test section, where a vertical-axis wind turbine of 1 m diameter is operated. Time-resolved tomographic PIV measurements are taken over a measurement volume of 40 × 20 × 15 cm³, allowing the quantitative analysis of the tip-vortex structure and its dynamical evolution.
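
A simple mass-balance sketch (an assumption of this edit, not the paper's exact relations) links the production rate and the accumulation/release times to the mean tracer concentration in the test section:

```python
def tracer_concentration(production_rate, t_accumulate, t_release, velocity, cross_section):
    """Mean bubble concentration during release: bubbles stored over t_accumulate
    are released over t_release into the volume swept by the free stream."""
    n_released = production_rate * t_accumulate
    swept_volume = velocity * cross_section * t_release
    return n_released / swept_volume

# Hypothetical numbers: 50,000 bubbles/s generated, 30 s accumulation,
# 3 s release, 5 m/s flow through a 0.4 m x 0.2 m measurement cross-section.
C = tracer_concentration(5e4, 30.0, 3.0, 5.0, 0.4 * 0.2)   # bubbles per m^3
```

The accumulation-to-release time ratio acts as the concentration multiplier over a single generator running continuously.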

  15. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  16. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  17. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  18. Large Scale Production of Densified Hydrogen Using Integrated Refrigeration and Storage

    Science.gov (United States)

    Notardonato, William U.; Swanger, Adam Michael; Jumper, Kevin M.; Fesmire, James E.; Tomsik, Thomas M.; Johnson, Wesley L.

    2017-01-01

    Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage (IRAS) technology at NASA Kennedy Space Center led to the production of large quantities of densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels, and LH2 temperatures were measured by twenty silicon diode temperature sensors recorded over approximately one month of testing at two different fill levels (33 and 67). System energy balances and solid mass fractions are calculated. Experimental data reveal hydrogen temperatures dropped well below the triple point during testing (up to 1 K below), and were continuing to trend downward prior to system shutdown. Sub-triple-point temperatures were seen to evolve in a time-dependent manner along the length of the horizontal, cylindrical vessel; the phenomenon, observed at both fill levels, is detailed and explained herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  19. Comparative Study of Laboratory-Scale and Prototypic Production-Scale Fuel Fabrication Processes and Product Characteristics

    International Nuclear Information System (INIS)

    Marshall, Douglas W.

    2014-01-01

    An objective of the High Temperature Gas Reactor fuel development and qualification program for the United States Department of Energy has been to qualify fuel fabricated in prototypic production-scale equipment. The quality and characteristics of the tristructural isotropic (TRISO) coatings on fuel kernels are influenced by the equipment scale and processing parameters. The standard deviations of some TRISO layer characteristics were diminished, while others became more significant, in the larger processing equipment. The impact of equipment scale-up on the statistical variability of the processes and the products is discussed. The prototypic production-scale processes produce test fuels meeting all fuel quality specifications. (author)

  20. Scale up of proteoliposome derived Cochleate production.

    Science.gov (United States)

    Zayas, Caridad; Bracho, Gustavo; Lastre, Miriam; González, Domingo; Gil, Danay; Acevedo, Reinaldo; del Campo, Judith; Taboada, Carlos; Solís, Rosa L; Barberá, Ramón; Pérez, Oliver

    2006-04-12

    Cochleates are highly stable structures with promising immunological features. Cochleate structures are usually obtained from commercial lipids; proteoliposome-derived cochleates are instead derived from outer membrane vesicles of Neisseria meningitidis B. Previously, we obtained cochleates using dialysis procedures. In order to scale up the production process, we used a crossflow system (CFS) that allows easy scale-up to obtain large batches in an aseptic environment. The raw materials and solutions used in the production process are already approved for human application. This work demonstrates that the CFS is a very efficient process for obtaining cochleate structures, with a yield of more than 80% and immunogenicity comparable to that obtained with the dialysis membrane procedure.

  1. Voltage stability issues in a distribution grid with large scale PV plant

    Energy Technology Data Exchange (ETDEWEB)

    Perez, Alvaro Ruiz; Marinopoulos, Antonios; Reza, Muhamad; Srivastava, Kailash [ABB AB, Vaesteraas (Sweden). Corporate Research Center; Hertem, Dirk van [Katholieke Univ. Leuven, Heverlee (Belgium). ESAT-ELECTA

    2011-07-01

    Solar photovoltaics (PV) has become a competitive renewable energy source. The production of solar PV cells and panels has increased significantly, while costs have been reduced due to economies of scale and technological achievements in the field. At the same time, the increase in efficiency of PV power systems and high energy prices are expected to lead PV systems to grid parity in the coming decade. This is expected to further boost the large scale implementation of PV power plants (utility-scale PV), and therefore the impact of such large scale PV plants on the power system needs to be studied. This paper investigates the voltage stability issues arising from the connection of a large PV power plant to the power grid. For this purpose, a 15 MW PV power plant was implemented into a distribution grid, modeled and simulated using DIgSILENT PowerFactory. Two scenarios were developed: in the first scenario, the active power injected into the grid by the PV power plant was varied and the resulting U-Q curve was analyzed. In the second scenario, the impact of connecting PV power plants to different points in the grid - resulting in different strengths of connection - was investigated. (orig.)
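
The voltage-versus-injection behaviour behind such U-Q studies can be illustrated with the classic lossless two-bus model (a textbook sketch, not the DIgSILENT distribution-grid model used in the paper):

```python
import math

def receiving_voltage(e_pu, x_pu, p_pu, q_pu):
    """Voltage at a bus drawing (p, q) through a lossless reactance x from a stiff
    source at e_pu. Solves V^4 + (2qX - E^2)V^2 + X^2(p^2 + q^2) = 0 and returns
    the high-voltage (stable) root, or None beyond the nose of the PV/QV curve."""
    b = 2.0 * q_pu * x_pu - e_pu**2
    c = x_pu**2 * (p_pu**2 + q_pu**2)
    disc = b * b - 4.0 * c
    if disc < 0:
        return None
    return math.sqrt((-b + math.sqrt(disc)) / 2.0)

# PV generation enters as negative net active load at the connection bus:
v_load_only = receiving_voltage(1.0, 0.1, 0.3, 0.1)
v_with_pv = receiving_voltage(1.0, 0.1, 0.3 - 0.5, 0.1)  # 0.5 pu PV injection
```

Sweeping the injected active power and recording the reactive power needed to hold a target voltage traces out a U-Q curve of the kind analyzed in the first scenario.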

  2. Biotechnological lignite conversion - a large-scale concept

    Energy Technology Data Exchange (ETDEWEB)

    Reich-Walber, M.; Meyrahn, H.; Felgener, G.W. [Rheinbraun AG, Koeln (Germany). Fuel Technology and Lab. Dept.

    1997-12-31

    Concerning the research on biotechnological lignite upgrading, Rheinbraun's overall objective is the large-scale production of liquid and gaseous products for the energy and chemical/refinery sectors. The presentation outlines Rheinbraun's technical concept for electricity production on the basis of biotechnologically solubilized lignite. A first rough cost estimate, based on the assumptions described in detail in the paper and compared with the latest power plant generation, shows the general cost efficiency of this technology despite the additional costs of coal solubilization. The main reasons are low-cost process techniques for coal conversion on the one hand, and cost reductions mainly in power plant technology (more efficient combustion processes and simplified gas clean-up) but also in coal transport (easy fuel handling) on the other hand. Moreover, it is hoped that an extended range of products will make it possible to widen the fields of lignite application. The presentation also points out that there is still a huge gap between this scenario and reality, owing to limited microbiological knowledge. To close this gap Rheinbraun started a research project supported by the North-Rhine Westphalian government in 1995. Several leading biotechnological companies and institutes in Germany and the United States are involved in the project. The latest results of the current project will be presented in the paper. This includes fundamental research activities in the field of microbial coal conversion as well as investigations into bioreactor design and product treatment (dewatering, deashing and desulphurization). (orig.)

  3. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power system with additional wind power capacity, there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  4. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has not yet been fully clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before reaching a stable state where Heaps' law still holds despite the disappearance of strict Zipf's law. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early stage of a pandemic disease.
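
The two laws can be observed on synthetic data: repeated draws from a power-law (Zipf-like) popularity distribution produce sublinear (Heaps-like) growth in the number of distinct items seen over time (an illustrative toy, not the paper's metapopulation epidemic model):

```python
import numpy as np

rng = np.random.default_rng(1)
ranks = np.arange(1, 1001)
p = 1.0 / ranks            # Zipf-like popularity: frequency ~ 1/rank
p /= p.sum()
sequence = rng.choice(ranks, size=20000, p=p)   # e.g. visited locations over time

# Heaps-like growth: the number of distinct items grows sublinearly with time.
seen, growth = set(), []
for s in sequence:
    seen.add(int(s))
    growth.append(len(seen))
```

Far fewer new distinct items appear in the second half of the sequence than in the first, which is the sublinearity Heaps' law describes.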

  5. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  6. Large-scale hydrological model river storage and discharge correction using a satellite altimetry-based discharge product

    Science.gov (United States)

    Emery, Charlotte Marie; Paris, Adrien; Biancamaria, Sylvain; Boone, Aaron; Calmant, Stéphane; Garambois, Pierre-André; Santos da Silva, Joecila

    2018-04-01

    Land surface models (LSMs) are widely used to study the continental part of the water cycle. However, even though their accuracy is increasing, inherent model uncertainties cannot be avoided. In the meantime, remotely sensed observations of continental water cycle variables such as soil moisture, lake and river elevations are becoming more frequent and accurate. Therefore, these two types of information can be combined, using data assimilation techniques, to reduce a model's uncertainties in its state variables and/or its input parameters. The objective of this study is to present a data assimilation platform that assimilates into the large-scale ISBA-CTRIP LSM a punctual river discharge product, derived from ENVISAT nadir altimeter water elevation measurements and rating curves, over the whole Amazon basin. To deal with the scale difference between the model and the observations, the study also presents an initial development of a localization treatment that limits the impact of observations to areas close to the observation and within the same hydrological network. The assimilation platform is based on the ensemble Kalman filter and can correct either the CTRIP river water storage or the discharge. Root mean square error (RMSE) compared to gauge discharges is globally reduced by up to 21%, and at Óbidos, near the outlet, RMSE is reduced by up to 52% compared to the ENVISAT-based discharge. Finally, it is shown that localization improves results along the main tributaries.
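
A minimal stochastic ensemble Kalman filter update with a hard localization mask conveys the idea (a sketch under strong simplifications — one scalar observation, a 0/1 mask instead of a network-aware taper — not the ISBA-CTRIP platform):

```python
import numpy as np

def enkf_update(ensemble, obs, obs_idx, obs_err_std, loc_mask, seed=0):
    """Stochastic EnKF analysis for one scalar observation at state index obs_idx.
    loc_mask (n_state,) tapers the gain to zero far from the observation."""
    n_state, n_ens = ensemble.shape
    Hx = ensemble[obs_idx, :]                        # observation-equivalent ensemble
    X = ensemble - ensemble.mean(axis=1, keepdims=True)
    cov_xy = X @ (Hx - Hx.mean()) / (n_ens - 1)      # cov(state, obs-equivalent)
    var_y = np.var(Hx, ddof=1) + obs_err_std**2
    K = loc_mask * cov_xy / var_y                    # localized Kalman gain
    perturbed = obs + np.random.default_rng(seed).normal(0.0, obs_err_std, n_ens)
    return ensemble + K[:, None] * (perturbed - Hx)[None, :]

rng = np.random.default_rng(42)
prior = 100.0 + 10.0 * rng.normal(size=(10, 50))     # 10 reaches, 50 members
mask = np.zeros(10)
mask[3:8] = 1.0                                      # only reaches 3..7 feel the obs
posterior = enkf_update(prior, obs=150.0, obs_idx=5, obs_err_std=2.0, loc_mask=mask)
```

The mask plays the role of the localization treatment described above: reaches outside it (or on another tributary) are left untouched by the observation.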

  7. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies; however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection...... can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate...... the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  8. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; the research and development of the new electronic custody transfer method

  9. An integrated assessment of a large-scale biodiesel production in Italy: Killing several birds with one stone?

    International Nuclear Information System (INIS)

    Russi, Daniela

    2008-01-01

    Biofuels are often presented as a contribution towards the solution of the problems related to our strong dependency on fossil fuels, i.e. greenhouse effect, energy dependency, urban pollution, besides being a way to support rural development. In this paper, an integrated assessment approach is employed to discuss the social desirability of a large-scale biodiesel production in Italy, taking into account social, environmental and economic factors. The conclusion is that the advantages in terms of reduction of greenhouse gas emissions, energy dependency and urban pollution would be very modest. The small benefits would not be enough to offset the huge costs in terms of land requirement: if the target of the European Directive 2003/30/EC were reached (5.75% of the energy used for transport by 2010) the equivalent of about one-third of the Italian agricultural land would be needed. The consequences would be a considerable increase in food imports and large environmental impacts in the agricultural phase. Also, since biodiesel must be de-taxed in order to make it competitive with oil-derived diesel, the Italian energy revenues would be reduced. In the end, rural development remains the only sound reason to promote biodiesel, but even for this objective other strategies look more advisable, like supporting organic agriculture. (author)

  10. Synthesis and sintering Ni-Zn ferrite obtained for combustion reaction in large scale

    International Nuclear Information System (INIS)

    Vieira, D.A.; Diniz, V.C.S.; Costa, A.C.F.M.; Cornejo, D.R.; Kiminami, R.H.G.A.

    2014-01-01

    This research evaluates the magnetic properties of Ni-Zn ferrite synthesized by combustion reaction on a large scale and sintered at 1250 °C in a resistive furnace. The sample was characterized by X-ray diffraction (XRD), scanning electron microscopy (SEM), and magnetic measurements. The results show that the product synthesized on a large scale was a soft magnetic material with a saturation magnetization of 40 emu·g⁻¹ and a coercivity of 0.080 kOe; after sintering, the magnetization increased to 68 emu·g⁻¹ and the coercivity decreased to 0.016 kOe, indicating that the obtained material has promising characteristics for applications in electro-electronic devices. (author)

  11. Scaling net ecosystem production and net biome production over a heterogeneous region in the Western United States

    Science.gov (United States)

    D.P. Turner; W.D. Ritts; B.E. Law; W.B. Cohen; Z. Yan; T. Hudiburg; J.L. Campbell; M. Duane

    2007-01-01

    Bottom-up scaling of net ecosystem production (NEP) and net biome production (NBP) was used to generate a carbon budget for a large heterogeneous region (the state of Oregon, 2.5 × 10⁵ km²) in the Western United States. Landsat resolution (30 m) remote sensing provided the basis for mapping land cover and disturbance history...

  12. European-scale modelling of groundwater denitrification and associated N2O production

    International Nuclear Information System (INIS)

    Keuskamp, J.A.; Drecht, G. van; Bouwman, A.F.

    2012-01-01

    This paper presents a spatially explicit model for simulating the fate of nitrogen (N) in soil and groundwater and nitrous oxide (N2O) production in groundwater, with a 1 km resolution at the European scale. The results show large heterogeneity in nitrate outflow from groundwater to surface water and in the production of N2O. This heterogeneity is the result of variability in agricultural and hydrological systems. Large parts of Europe have no groundwater aquifers and short travel times from soil to surface water; in these regions no groundwater denitrification or N2O production is expected. Predicted N leaching (16% of the N inputs) and N2O emissions (0.014% of N leaching) are much less than the IPCC default leaching rate and combined emission factor for groundwater and riparian zones, respectively. - Highlights: ► Groundwater denitrification and N2O production was modelled at the European scale. ► In large parts of Europe no groundwater denitrification is expected. ► N leaching and N2O emission in Europe are much less than the IPCC default values. - European groundwater denitrification is spatially variable, and associated nitrous oxide production is much less than the IPCC default estimate.
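
Chaining the two reported fractions shows how far the European estimates fall below the commonly cited IPCC 2006 defaults (FracLEACH = 0.30, EF5 = 0.0075; quoted here for comparison only):

```python
def n2o_from_inputs(n_input_kg, frac_leach, frac_n2o):
    """Chain the leaching fraction and the N2O emission fraction into an N2O-N flux."""
    return n_input_kg * frac_leach * frac_n2o

study = n2o_from_inputs(1000.0, 0.16, 0.00014)  # fractions reported in the abstract
ipcc = n2o_from_inputs(1000.0, 0.30, 0.0075)    # commonly cited IPCC 2006 defaults
```

For the same 1,000 kg of N input, the default factors yield roughly two orders of magnitude more N2O-N than the modelled European fractions.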

  13. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  14. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. An analysis of the possibilities of small-scale model experiments is also presented, mostly in connection with the application of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  15. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  16. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular. (author)

  17. Possible implications of large scale radiation processing of food

    Science.gov (United States)

    Zagórski, Z. P.

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular.

  18. Adopting small-scale production of electricity

    Energy Technology Data Exchange (ETDEWEB)

    Tengvard, Maria; Palm, Jenny (Linkoeping Univ., Dept. of Technology and Social Change, Linkoeping (Sweden)). e-mail: maria.tengvard@liu.se

    2009-07-01

    In Sweden in 2008, a 'new' concept for small-scale electricity production attracted massive media attention. This was mainly due to the efforts of Swedish company Egen El, which is marketing small-scale photovoltaics (PVs) and wind turbines to households, both homeowners and tenants. Their main selling point is simplicity: their products are so easy to install that everyone can do it. Autumn 2008 also saw IKEA announce that within three years it would market solar panels. How, then, do households perceive these products? Why would households choose to buy them? How do households think about producing their own electricity? Analysis of material based on in-depth interviews with members of 20 households reveals that environmental concerns supply the main motive for adopting PVs or micro wind power generation. In some cases, the adopting households have an extensively ecological lifestyle and such adoption represents a way to take action in the energy area. For some, this investment is symbolic: a way of displaying environmental consciousness or setting an example to others. For still others, the adoption is a protest against 'the system' with its large dominant actors or is a way to become self-sufficient. These microgeneration installations are rejected mainly on economic grounds; other motives are respect for neighbours and difficulties finding a place to install a wind turbine.

  19. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulics & Coastal Engineering Laboratory, Aalborg University, Denmark, and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small- and large-scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping, scale effects...

  20. Large scale production of densified hydrogen to the triple point and below

    Science.gov (United States)

    Swanger, A. M.; Notardonato, W. U.; Fesmire, J. E.; Jumper, K. M.; Johnson, W. L.; Tomsik, T. M.

    2017-12-01

    Recent demonstration of advanced liquid hydrogen storage techniques using Integrated Refrigeration and Storage technology at NASA Kennedy Space Center led to the production of large quantities of densified liquid and slush hydrogen in a 125,000 L tank. Production of densified hydrogen was performed at three different liquid levels and LH2 temperatures were measured by twenty silicon diode temperature sensors. Overall densification performance of the system is explored, and solid mass fractions are calculated. Experimental data reveal hydrogen temperatures dropped well below the triple point during testing, and were continuing to trend downward prior to system shutdown. Sub-triple point temperatures were seen to evolve in a time dependent manner along the length of the horizontal, cylindrical vessel. The phenomenon, observed at two fill levels, is detailed herein. The implications of using IRAS for energy storage, propellant densification, and future cryofuel systems are discussed.

  1. Hydrogen combustion modelling in large-scale geometries

    International Nuclear Information System (INIS)

    Studer, E.; Beccantini, A.; Kudriakov, S.; Velikorodny, A.

    2014-01-01

    Hydrogen risk mitigation based on catalytic recombiners cannot exclude the formation of flammable clouds during the course of a severe accident in a Nuclear Power Plant. The consequences of combustion processes have to be assessed based on existing knowledge and the state of the art in CFD combustion modelling. The Fukushima accidents have also revealed the need to take hydrogen explosion phenomena into account in risk management. Combustion modelling in large-scale geometries is thus one of the remaining severe accident safety issues. At present no combustion model exists that can accurately describe a combustion process inside a geometrical configuration typical of the Nuclear Power Plant (NPP) environment. Major attention in model development must therefore be paid to adapting existing approaches, or creating new ones, capable of reliably predicting the possibility of flame acceleration in geometries of that type. A set of experiments performed previously in the RUT facility and the Heiss Dampf Reaktor (HDR) facility is used as a validation database for the development of a three-dimensional gas dynamic model for the simulation of hydrogen-air-steam combustion in large-scale geometries. The combustion regimes include slow deflagration, fast deflagration, and detonation. Modelling is based on the Reactive Discrete Equation Method (RDEM), where the flame is represented as an interface separating reactants and combustion products. The transport of the progress variable is governed by different flame surface wrinkling factors. The results of the numerical simulations are presented together with comparisons, critical discussion and conclusions. (authors)

  2. High-Resiliency and Auto-Scaling of Large-Scale Cloud Computing for OCO-2 L2 Full Physics Processing

    Science.gov (United States)

    Hua, H.; Manipon, G.; Starch, M.; Dang, L. B.; Southam, P.; Wilson, B. D.; Avis, C.; Chang, A.; Cheng, C.; Smyth, M.; McDuffie, J. L.; Ramirez, P.

    2015-12-01

    Next-generation science data systems are needed to address the incoming flood of data from new missions such as SWOT and NISAR, where data volumes and data throughput rates are orders of magnitude larger than in present-day missions. Additionally, traditional means of procuring hardware on-premise are already limited by facility capacity constraints for these new missions. Existing missions, such as OCO-2, may also require rapid turn-around for processing different science scenarios, where on-premise and even traditional HPC computing environments may not meet the high processing needs. We present our experiences deploying a hybrid-cloud computing science data system (HySDS) for the OCO-2 Science Computing Facility to support large-scale processing of its Level-2 full physics data products. We explore optimization approaches for getting the best performance out of hybrid-cloud computing, as well as common issues that arise when dealing with large-scale computing. Novel approaches were utilized for processing on Amazon's spot market, which can potentially offer ~10X cost savings but with an unpredictable computing environment driven by market forces. We present how we enabled fault-tolerant computing in order to achieve large-scale computing as well as operational cost savings.

  3. An Integrated Assessment of Location-Dependent Scaling for Microalgae Biofuel Production Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Andre M.; Abodeely, Jared; Skaggs, Richard; Moeglein, William AM; Newby, Deborah T.; Venteris, Erik R.; Wigmosta, Mark S.

    2014-06-19

    Successful development of a large-scale microalgae-based biofuels industry requires comprehensive analysis and understanding of the feedstock supply chain—from facility siting/design through processing/upgrading of the feedstock to a fuel product. The evolution from pilot-scale production facilities to energy-scale operations presents many multi-disciplinary challenges, including a sustainable supply of water and nutrients, operational and infrastructure logistics, and economic competitiveness with petroleum-based fuels. These challenges are addressed in part by applying the Integrated Assessment Framework (IAF)—an integrated multi-scale modeling, analysis, and data management suite—to address key issues in developing and operating an open-pond facility by analyzing how variability and uncertainty in space and time affect algal feedstock production rates, and determining the site-specific “optimum” facility scale to minimize capital and operational expenses. This approach explicitly and systematically assesses the interdependence of biofuel production potential, associated resource requirements, and production system design trade-offs. The IAF was applied to a set of sites previously identified as having the potential to cumulatively produce 5 billion gallons per year in the southeastern U.S., and results indicate costs can be reduced by selecting the most effective processing technology pathway and scaling downstream processing capabilities to fit site-specific growing conditions, available resources, and algal strains.

  4. Water limited agriculture in Africa: Climate change sensitivity of large scale land investments

    Science.gov (United States)

    Rulli, M. C.; D'Odorico, P.; Chiarelli, D. D.; Davis, K. F.

    2015-12-01

    The past few decades have seen unprecedented changes in the global agricultural system, with a dramatic increase in the rates of food production fueled by an escalating demand for food calories, as a result of demographic growth, dietary changes, and - more recently - new bioenergy policies. Food prices have become consistently higher and increasingly volatile, with dramatic spikes in 2007-08 and 2010-11. The confluence of these factors has heightened demand for land and brought a wave of land investment to the developing world: some of the more affluent countries are trying to secure land rights in areas suitable for agriculture. According to some estimates, roughly 38 million hectares have been acquired worldwide to date by large-scale investors, 16 million of which are in Africa. More than 85% of large-scale land acquisitions in Africa are by foreign investors. Many land deals are motivated not only by the need for fertile land but also for the water resources required for crop production. Despite some recent assessments of the water appropriation associated with large-scale land investments, their impact on the water resources of the target countries under present conditions and climate change scenarios remains poorly understood. Here we investigate the irrigation water requirements of the various crops planted on the acquired land as an indicator of the pressure land investors are likely to place on the ("blue") water resources of target regions in Africa, and evaluate their sensitivity to climate change scenarios.

  5. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26-27, 1984 in Pittsburgh with nine panel members and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  6. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. Described are the cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of galaxy distributions. Images of cell structures, obtained after computer processing, are given. Three hypotheses - vortical, entropic and adiabatic - suggesting various processes of galaxy and galaxy cluster origin are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a method of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters, and of the interactions within galaxy clusters and with the inter-galaxy medium, is recognized to be a notable contribution to the development of theoretical and observational cosmology

  7. Uncertainty of measurement for large product verification: evaluation of large aero gas turbine engine datums

    International Nuclear Information System (INIS)

    Muelaner, J E; Wang, Z; Keogh, P S; Brownell, J; Fisher, D

    2016-01-01

    Understanding the uncertainty of dimensional measurements for large products such as aircraft, spacecraft and wind turbines is fundamental to improving efficiency in these products. Much work has been done to ascertain the uncertainty associated with the main types of instruments used, based on laser tracking and photogrammetry, and the propagation of this uncertainty through networked measurements. Unfortunately this is not sufficient to understand the combined uncertainty of industrial measurements, which include secondary tooling and datum structures used to locate the coordinate frame. This paper presents for the first time a complete evaluation of the uncertainty of large scale industrial measurement processes. Generic analysis and design rules are proven through uncertainty evaluation and optimization for the measurement of a large aero gas turbine engine. This shows how the instrument uncertainty can be considered to be negligible. Before optimization the dominant source of uncertainty was the tooling design, after optimization the dominant source was thermal expansion of the engine; meaning that no further improvement can be made without measurement in a temperature controlled environment. These results will have a significant impact on the ability of aircraft and wind turbines to improve efficiency and therefore reduce carbon emissions, as well as the improved reliability of these products. (paper)

  8. THE DECAY OF A WEAK LARGE-SCALE MAGNETIC FIELD IN TWO-DIMENSIONAL TURBULENCE

    Energy Technology Data Exchange (ETDEWEB)

    Kondić, Todor; Hughes, David W.; Tobias, Steven M., E-mail: t.kondic@leeds.ac.uk [Department of Applied Mathematics, University of Leeds, Leeds LS2 9JT (United Kingdom)

    2016-06-01

    We investigate the decay of a large-scale magnetic field in the context of incompressible, two-dimensional magnetohydrodynamic turbulence. It is well established that a very weak mean field, of strength significantly below the equipartition value, induces a small-scale field strong enough to inhibit the process of turbulent magnetic diffusion. In light of ever-increasing computer power, we revisit this problem to investigate fluid and magnetic Reynolds numbers that were previously inaccessible. Furthermore, by exploiting the relation between the turbulent diffusion of the magnetic potential and that of the magnetic field, we are able to calculate the turbulent magnetic diffusivity extremely accurately through the imposition of a uniform mean magnetic field. We confirm the strong dependence of the turbulent diffusivity on the product of the magnetic Reynolds number and the energy of the large-scale magnetic field. We compare our findings with various theoretical descriptions of this process.
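    The dependence on the product of magnetic Reynolds number and large-scale field energy noted in this abstract is often summarized, in the spirit of the two-dimensional suppression argument of Cattaneo and Vainshtein, by a formula of the following form (the exact expression is an illustrative assumption, not quoted from the paper; here η_T is the effective turbulent diffusivity, η_{T,0} its kinematic value, R_m the magnetic Reynolds number, B_0 the mean field and B_eq the equipartition field strength):

    ```latex
    \eta_T \;\simeq\; \frac{\eta_{T,0}}{1 + R_m\, B_0^2 / B_{\mathrm{eq}}^2}
    ```

    In this form, turbulent diffusion of the mean field is strongly suppressed once R_m B_0^2 becomes comparable to B_eq^2, even when B_0 itself is far below equipartition.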

  9. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
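    The per-pixel linked list mentioned in the abstract can be modelled in a few lines: each insertion stores a fragment together with the index of the previous list head, so every pixel's segments can be walked and filtered without re-reading the source data. The sketch below is a hedged CPU-side illustration in Python (the thesis builds this structure on the GPU); all names and sample data are hypothetical.

    ```python
    # Minimal per-pixel linked list: a head-pointer image plus a flat node buffer.
    WIDTH, HEIGHT = 4, 4

    head = [[-1] * WIDTH for _ in range(HEIGHT)]  # -1 marks an empty list
    nodes = []  # flat buffer of (segment_data, next_index) pairs

    def insert(x, y, segment):
        """Prepend a pathline segment to pixel (x, y)'s list."""
        nodes.append((segment, head[y][x]))  # new node links to the old head
        head[y][x] = len(nodes) - 1          # head now points at the new node

    def segments_at(x, y):
        """Walk pixel (x, y)'s list, newest segment first."""
        i = head[y][x]
        while i != -1:
            segment, i = nodes[i]
            yield segment

    insert(1, 2, "pathline-A/seg0")
    insert(1, 2, "pathline-B/seg3")
    print(list(segments_at(1, 2)))  # ['pathline-B/seg3', 'pathline-A/seg0']
    ```

    On the GPU the same idea uses an atomic counter to allocate nodes and an atomic exchange to swap the head pointer, which is what makes concurrent per-pixel insertion safe.
    
    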

  10. Development of Anti-Insect Microencapsulated Polypropylene Films Using a Large Scale Film Coating System.

    Science.gov (United States)

    Song, Ah Young; Choi, Ha Young; Lee, Eun Song; Han, Jaejoon; Min, Sea C

    2018-04-01

    Films containing microencapsulated cinnamon oil (CO) were developed using a large-scale production system to protect against the Indian meal moth (Plodia interpunctella). CO at concentrations of 0%, 0.8%, or 1.7% (w/w ink mixture) was microencapsulated with polyvinyl alcohol. The microencapsulated CO emulsion was mixed with ink (47% or 59%, w/w) and thinner (20% or 25%, w/w) and coated on polypropylene (PP) films. The PP film was then laminated with a low-density polyethylene (LDPE) film on the coated side. The film with microencapsulated CO at 1.7% repelled P. interpunctella most effectively. Microencapsulation did not negatively affect insect-repelling activity. The release rate of cinnamaldehyde, an active repellent, was lower when CO was microencapsulated than when it was not. Thermogravimetric analysis showed that microencapsulation prevented the volatilization of CO. The tensile strength, percentage elongation at break, elastic modulus, and water vapor permeability of the films indicated that microencapsulation did not affect the tensile and moisture barrier properties (P > 0.05). The results of this study suggest that effective films for the prevention of Indian meal moth invasion can be produced by the microencapsulation of CO using a large-scale film production system. Low-density polyethylene-laminated polypropylene films printed with ink incorporating microencapsulated cinnamon oil using a large-scale film production system effectively repelled Indian meal moth larvae. Without altering the tensile and moisture barrier properties of the film, microencapsulation resulted in the release of an active repellent for extended periods with a high thermal stability of cinnamon oil, enabling commercial film production at high temperatures. This anti-insect film system may have applications to other food-packaging films that use the same ink-printing platform. © 2018 Institute of Food Technologists®.

  11. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. 
And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and

  12. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
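    A useful intuition for this kind of homogenization is that the effective large-scale motility is a harmonic-type average of the small-scale motility, so a single patch of slow (low-motility) habitat dominates the upscaled coefficient far more than an arithmetic mean would suggest. The sketch below illustrates that contrast; the numbers are illustrative assumptions, not values from the paper.

    ```python
    # Compare arithmetic vs. harmonic averaging of a patchy motility field.
    # One slow cell (0.5) among fast cells (10.0) drags the harmonic mean
    # down sharply, mimicking how slow habitat controls upscaled movement.
    import numpy as np

    mu = np.array([10.0, 10.0, 0.5, 10.0])  # motility per habitat cell (m^2/day)

    arithmetic = mu.mean()
    harmonic = len(mu) / np.sum(1.0 / mu)

    print(arithmetic)  # 7.625
    print(harmonic)    # ~1.74
    ```

    The gap between the two averages is exactly why a naive arithmetic upscaling of habitat-dependent motility can badly overestimate large-scale spread.
    
    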

  13. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  14. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  15. Toxic Combustion Product Yields as a Function of Equivalence Ratio and Flame Retardants in Under-Ventilated Fires: Bench-Large-Scale Comparisons

    Directory of Open Access Journals (Sweden)

    David A. Purser

    2016-09-01

    In large-scale compartment fires, combustion product yields vary with combustion conditions, mainly in relation to the fuel:air equivalence ratio (Φ) and the effects of gas-phase flame retardants. Yields of products of inefficient combustion, including the major toxic products CO, HCN and organic irritants, increase considerably as combustion changes from well-ventilated (Φ < 1) to under-ventilated (Φ = 1–3). It is therefore essential that bench-scale toxicity tests reproduce this behaviour across the Φ range. Yield data from repeat compartment fire tests for any specific fuel show some variation on either side of a best-fit curve for CO yield as a function of Φ. In order to quantify the extent to which data from the steady state tube furnace (SSTF) [1], ISO TS 19700 [2], represent compartment fire yields, the range and average deviations of SSTF CO yields from the compartment fire best-fit curve were compared to those for direct compartment fire measurements for six different polymeric fuels with textile and non-textile applications, and for generic post-flashover fire CO yield data. The average yields, range and standard deviations of the SSTF data around the best-fit compartment fire curves were found to be close to those for the compartment fire data. It is concluded that SSTF data predict compartment fire yields as well as repeat compartment fire test data do.
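    The equivalence ratio Φ that organizes this abstract is the actual fuel:air mass ratio normalized by its stoichiometric value. A minimal sketch follows; the stoichiometric air-to-fuel ratio used below is an illustrative assumption for a generic hydrocarbon fuel, not a value from the paper.

    ```python
    # Phi = (m_fuel / m_air) / (m_fuel / m_air)_stoichiometric.
    # Phi < 1: well-ventilated (excess air); Phi = 1-3: under-ventilated.
    def equivalence_ratio(fuel_mass_rate, air_mass_rate, stoich_air_to_fuel=14.7):
        """Return Phi given fuel and air mass supply rates (same units)."""
        return (fuel_mass_rate / air_mass_rate) * stoich_air_to_fuel

    print(equivalence_ratio(1.0, 29.4))  # 0.5 -> well-ventilated
    print(equivalence_ratio(2.0, 14.7))  # 2.0 -> under-ventilated
    ```

    Plotting a yield (e.g. CO in g per g fuel) against Φ computed this way is what produces the best-fit curves the SSTF data are compared against.
    
    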

  16. Use of a large-scale rainfall simulator reveals novel insights into stemflow generation

    Science.gov (United States)

    Levia, D. F., Jr.; Iida, S. I.; Nanko, K.; Sun, X.; Shinohara, Y.; Sakai, N.

    2017-12-01

    Detailed knowledge of stemflow generation and its effects on both hydrological and biogeochemical cycling is important to achieve a holistic understanding of forest ecosystems. Field studies and a smaller set of experiments performed under laboratory conditions have increased our process-based knowledge of stemflow production. Building upon these earlier works, a large-scale rainfall simulator was employed to deepen our understanding of stemflow generation processes. The large-scale rainfall simulator provides a unique opportunity to examine a range of rainfall intensities under constant conditions, which is difficult in the field due to the variable nature of natural rainfall. Stemflow generation was examined for three species - Cryptomeria japonica D. Don (Japanese cedar), Chamaecyparis obtusa (Siebold & Zucc.) Endl. (Japanese cypress) and Zelkova serrata Thunb. (Japanese zelkova) - under both leafed and leafless conditions at several rainfall intensities (15, 20, 30, 40, 50, and 100 mm h-1) using the large-scale rainfall simulator at the National Research Institute for Earth Science and Disaster Resilience (Tsukuba, Japan). Stemflow production rates and funneling ratios were examined in relation to both rainfall intensity and canopy structure. Preliminary results indicate a dynamic and complex response of the funneling ratios of individual trees to different rainfall intensities among the species examined. This is partly the result of differences in canopy structure, hydrophobicity of vegetative surfaces, and differential wet-up processes across species and rainfall intensities. This presentation delves into these differences and attempts to distill them into generalizable patterns, which can advance our theories of stemflow generation processes and ultimately permit better stewardship of forest resources. 
Funding note: This research was supported by a JSPS Invitation Fellowship for Research in
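    The funneling ratio examined in this record is conventionally defined (following Herwitz, 1986) as stemflow volume divided by the rain that would have fallen directly on the trunk's basal area. A hedged sketch, with hypothetical numbers chosen to match one of the simulated intensities (40 mm/h):

    ```python
    # Funneling ratio F = V / (P * B):
    #   V = stemflow volume (L), P = rainfall depth (mm), B = basal area (m^2).
    # Units work out because 1 mm of rain over 1 m^2 equals exactly 1 L.
    def funneling_ratio(stemflow_volume_l, rainfall_depth_mm, basal_area_m2):
        # F > 1 means the canopy funnels more water to the trunk base than
        # an equivalent open patch of ground would receive.
        return stemflow_volume_l / (rainfall_depth_mm * basal_area_m2)

    # Hypothetical event: 30 min at 40 mm/h -> 20 mm of rain,
    # trunk basal area 0.05 m^2, 15 L of stemflow collected.
    print(funneling_ratio(15.0, 20.0, 0.05))  # 15.0
    ```

    Comparing F across the simulator's fixed intensities is what lets the authors isolate the intensity effect from canopy structure.
    
    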

  17. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5KW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  18. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. In cluttered images, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose an online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.

  19. A Novel Architecture of Large-scale Communication in IoT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described in detail a large-scale communications architecture for the IoT. In fact, the non-uniform technologies between IPv6 and access points have led to a lack of broad principles for large-scale communications architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communications in the IoT.

  20. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  1. State-of-the-art of large scale biogas plants

    International Nuclear Information System (INIS)

    Prisum, J.M.; Noergaard, P.

    1992-01-01

    A survey of the technological state of large-scale biogas plants in Europe treating manure is given. 83 plants are in operation at present. Of these, 16 are centralised digestion plants. Transport costs at centralised digestion plants amount to between 25 and 40 percent of the total operational costs. Various transport equipment is used. Most large-scale digesters are CSTRs, but serial, contact, 2-step, and plug-flow digesters are also found. Construction materials are mostly steel and concrete. Mesophilic digestion is most common (56%), thermophilic digestion is used in 17% of the plants, and combined mesophilic and thermophilic digestion is used in 28% of the centralised plants. Mixing of the digester content is performed with gas injection, propellers, and gas-liquid displacement. Heating is carried out using external or internal heat exchangers. Heat recovery is only used in Denmark. Gas purification equipment is commonplace, but not often needed. Several plants use separation of the digested manure, often as part of a post-treatment/purification process or for the production of 'compost'. Screens, sieve belt separators, centrifuges and filter presses are employed. The use of biogas varies considerably. In some cases, combined heat and power stations supply the grid and district heating systems. Other plants use only either the electricity or the heat. (au)

  2. Breakthrough In Current In Plane Metrology For Monitoring Large Scale MRAM Production

    DEFF Research Database (Denmark)

    Cagliani, Alberto; Østerberg, Frederik Westergaard; Hansen, Ole

    2017-01-01

    The current-in-plane tunneling technique (CIPT) has been a crucial tool in the development of magnetic tunnel junction stacks suitable for Magnetic Random Access Memories (MRAM) for more than a decade. The MRAM development has now reached the maturity to make the transition from R&D to large...... of the Resistance Area product (RA) and the Tunnel Magnetoresistance (TMR) measurements, compared to state of the art CIPT metrology tools dedicated to R&D. On two test wafers, the repeatability of RA and MR was improved up to 350% and the measurement reproducibility up to 1700%. We believe that CIPT metrology now...

  3. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutronic sensor during three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given, and the results obtained so far in that domain are presented. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale commutation is described and the test results are given. In this large scale channel, the data processing method is analog. - To become independent of the problems generated by analog processing of the fluctuation signal, a digital data processing method is tested and its validity is demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr

  4. Large-scale synthesis of single-crystalline MgO with bone-like nanostructures

    International Nuclear Information System (INIS)

    Niu Haixia; Yang Qing; Tang Kaibin; Xie Yi

    2006-01-01

    Uniform bone-like MgO nanocrystals have been prepared via a solvothermal process, using commercial Mg powder as the starting material in the absence of any catalyst or surfactant, followed by subsequent calcination. Field emission scanning electron microscopy (FE-SEM) and transmission electron microscopy (TEM) measurements indicate that the product consists of a large quantity of bone-like nanocrystals with lengths of 120-200 nm. The widths of these nanocrystals at both ends are in the range of 20-50 nm, 3-20 nm wider than their middle parts. X-ray diffraction (XRD) and selected area electron diffraction (SAED) show that the product consists of high-quality cubic single-crystalline nanocrystals. The photoluminescence (PL) measurement shows an intense emission centered at 410 nm, suggesting that the product has potential applications in optical devices. The advantages of our method lie in the high yield, the easy availability of the starting materials, and the possibility of large-scale production at low cost. The growth mechanism is proposed to be related to the solvent's oxidation during precursor formation, followed by nucleation and mass transfer during decomposition of the precursor.

  5. Radial scaling in inclusive jet production at hadron colliders

    Science.gov (United States)

    Taylor, Frank E.

    2018-03-01

    Inclusive jet production in p-p and p̄-p collisions shows many of the same kinematic systematics as observed in single-particle inclusive production at much lower energies. In an earlier study (1974) a phenomenology, called radial scaling, was developed for the single-particle inclusive cross sections that attempted to capture the essential underlying physics of pointlike parton scattering and the fragmentation of partons into hadrons suppressed by the kinematic boundary. The phenomenology was successful in emphasizing the underlying systematics of the inclusive particle productions. Here we demonstrate that inclusive jet production at the Large Hadron Collider (LHC) in high-energy p-p collisions and at the Tevatron in p̄-p inelastic scattering shows similar behavior. The ATLAS inclusive jet production plotted as a function of this scaling variable is studied for √s of 2.76, 7 and 13 TeV and is compared to p̄-p inclusive jet production at 1.96 TeV measured by CDF and D0 at the Tevatron and p-Pb inclusive jet production at the LHC ATLAS at √s_NN = 5.02 TeV. Inclusive single-particle production at Fermi National Accelerator Laboratory fixed-target and Intersecting Storage Rings energies is compared to inclusive J/ψ production at the LHC measured in ATLAS, CMS and LHCb. Striking common features of the data are discussed.

  6. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    ERDC/CHL CHETN-I-88, April 2016. Approved for public release; distribution is unlimited. This technical note describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase... A detailed discussion of the original LSTF features and capabilities can be...

  7. On the Soft Limit of the Large Scale Structure Power Spectrum: UV Dependence

    CERN Document Server

    Garny, Mathias; Porto, Rafael A; Sagunski, Laura

    2015-01-01

    We derive a non-perturbative equation for the large scale structure power spectrum of long-wavelength modes. To do so, we use an operator product expansion together with relations between the three-point function and the power spectrum in the soft limit. The resulting equation encodes the coupling to ultraviolet (UV) modes in two time-dependent coefficients, which may be obtained from response functions to (anisotropic) parameters, such as spatial curvature, in a modified cosmology. We argue that both depend weakly on fluctuations deep in the UV. As a byproduct, this implies that the renormalized leading order coefficient(s) in the effective field theory (EFT) of large scale structures receive most of their contribution from modes close to the non-linear scale. Consequently, the UV dependence found in explicit computations within standard perturbation theory stems mostly from counter-term(s). We confront a simplified version of our non-perturbative equation against existing numerical simulations, and find good agr...

  8. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied owing to their potential for human behavior prediction and recommendation and for the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, covering both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and a high potential for prediction. Furthermore, a scale-free mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
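    The power-law step lengths characteristic of the Lévy-flight picture in this abstract can be reproduced with a few lines of inverse-transform sampling. This is a minimal sketch, not the paper's model; the exponent, cutoff, and sample size are illustrative choices.

```python
import math
import random

# Minimal Levy-flight sketch (illustrative, not the paper's model):
# draw step lengths from a power-law (Pareto) distribution with density
# p(x) ~ x^(-alpha) for x >= xmin, via inverse-transform sampling, then
# recover the tail exponent with the Hill estimator.
random.seed(42)
alpha, xmin = 2.0, 1.0  # assumed values for the demo

def levy_step():
    u = random.random()                      # u in [0, 1)
    return xmin * (1.0 - u) ** (-1.0 / (alpha - 1.0))

steps = [levy_step() for _ in range(10000)]

# Hill estimator: alpha_hat = 1 + n / sum(ln(x / xmin)),
# since E[ln(x / xmin)] = 1 / (alpha - 1) for this distribution.
alpha_hat = 1.0 + len(steps) / sum(math.log(x / xmin) for x in steps)
```

    With enough samples the estimate lands close to the true exponent; the same estimator can be applied piecewise to detect the two-segment power law the paper reports for dwelling times.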

  9. Expanded Large-Scale Forcing Properties Derived from the Multiscale Data Assimilation System and Its Application to Single-Column Models

    Science.gov (United States)

    Feng, S.; Li, Z.; Liu, Y.; Lin, W.; Toto, T.; Vogelmann, A. M.; Fridlind, A. M.

    2013-12-01

    We present an approach to derive the large-scale forcing used to drive single-column models (SCMs) and cloud-resolving models (CRMs)/large eddy simulations (LES) for evaluating fast-physics parameterizations in climate models. The forcing fields are derived using a newly developed multi-scale data assimilation (MS-DA) system. This DA system is built on top of the NCEP Gridpoint Statistical Interpolation (GSI) system and is implemented in the Weather Research and Forecasting (WRF) model at a cloud-resolving resolution of 2 km. The approach has been applied to the generation of large-scale forcing for a set of Intensive Operation Periods (IOPs) over the Atmospheric Radiation Measurement (ARM) Climate Research Facility's Southern Great Plains (SGP) site. The dense ARM in-situ observations and high-resolution satellite data effectively constrain the WRF model. The evaluation shows that the derived forcing displays accuracies comparable to the existing continuous forcing product and, overall, a better dynamic consistency with observed cloud and precipitation. One important application of this approach is to derive large-scale hydrometeor forcing and multiscale forcing, which are not provided in the existing continuous forcing product. It is shown that the hydrometeor forcing has an appreciable impact on cloud and precipitation fields in the single-column model simulations. The large-scale forcing exhibits a significant dependence on the domain size, which represents the SCM grid size. Subgrid processes often contribute a significant component to the large-scale forcing, and this contribution is sensitive to the grid size and cloud regime.

  10. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they concern species limitations (in the market, biological and technological), site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  11. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  12. Theory and algorithms for solving large-scale numerical problems. Application to the management of electricity production

    International Nuclear Information System (INIS)

    Chiche, A.

    2012-01-01

    This manuscript deals with large-scale optimization problems, and more specifically with solving the electricity unit-commitment problem arising at EDF. First, we focus on the augmented Lagrangian algorithm. The behavior of that algorithm on an infeasible convex quadratic optimization problem is analyzed. It is shown that the algorithm finds a point that satisfies the shifted constraints with the smallest possible shift in the sense of the Euclidean norm, and that it minimizes the objective on the corresponding shifted constrained set. The convergence to such a point occurs at a global linear rate, which depends explicitly on the augmentation parameter. This suggests a rule for choosing the augmentation parameter to control the speed of convergence of the shifted constraint norm to zero. This rule has the advantage of generating bounded augmentation parameters even when the problem is infeasible. As a by-product, the algorithm computes the smallest translation in the Euclidean norm that makes the constraints feasible. Furthermore, this work provides solution methods for stochastic optimization of industrial problems decomposed on a scenario tree, based on the progressive hedging algorithm introduced by [Rockafellar and Wets, 1991]. We also focus on the convergence of that algorithm. On the one hand, we offer a counter-example showing that the algorithm can diverge if its augmentation parameter is iteratively updated. On the other hand, we show how to recover the multipliers associated with the non-dualized constraints defined on the scenario tree from those associated with the corresponding constraints of the scenario subproblems. Their convergence is also analyzed for convex problems. The practical interest of these solution techniques is corroborated by numerical experiments performed on the electricity production management problem. We apply the progressive hedging algorithm to a realistic industrial problem. More precisely, we solve the French medium...
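    The augmented-Lagrangian behavior this abstract describes (an inner minimization in the primal variable, a first-order multiplier update, and a linear rate governed by the augmentation parameter) can be illustrated on a one-dimensional toy problem. This is a hedged sketch of the standard method, not the thesis code, and the problem and parameter values are invented.

```python
# Toy augmented-Lagrangian iteration (illustrative only):
#   minimize f(x) = x^2   subject to   x = 1.
# L(x, lam) = x^2 + lam*(x - 1) + (mu/2)*(x - 1)^2.
# Each outer iteration minimizes L exactly in x, then updates lam.
mu = 10.0   # augmentation parameter: larger mu -> faster linear rate
lam = 0.0   # initial multiplier estimate

for _ in range(30):
    x = (mu - lam) / (2.0 + mu)   # argmin_x L(x, lam), from dL/dx = 0
    lam += mu * (x - 1.0)         # first-order multiplier update

# At the optimum x* = 1 with multiplier lam* = -2 (stationarity:
# 2*x* + lam* = 0). The multiplier error contracts by 2/(2 + mu)
# per iteration, i.e. a global linear rate set by mu.
```

    The contraction factor 2/(2 + mu) makes concrete the abstract's point that the linear convergence rate depends explicitly on the augmentation parameter, which motivates a rule for choosing it.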

  13. Integrating large-scale functional genomics data to dissect metabolic networks for hydrogen production

    Energy Technology Data Exchange (ETDEWEB)

    Harwood, Caroline S

    2012-12-17

    The goal of this project is to identify gene networks that are critical for efficient biohydrogen production by leveraging variation in gene content and gene expression in independently isolated Rhodopseudomonas palustris strains. Coexpression methods were applied to the large data sets that we have collected to define probabilistic causal gene networks. To our knowledge, this is the first systems-level approach that takes advantage of strain-to-strain variability to computationally define networks critical for a particular bacterial phenotypic trait.

  14. How to correct long-term system externality of large scale wind power development by a capacity mechanism?

    International Nuclear Information System (INIS)

    Cepeda, Mauricio; Finon, Dominique

    2013-04-01

    This paper deals with the practical problems related to long-term security of supply in electricity markets in the presence of large-scale wind power development. The success of renewable promotion schemes adds a new dimension to ensuring long-term security of supply. It necessitates designing second-best policies to prevent large-scale wind power development from distorting long-run equilibrium prices and investments in conventional generation, in particular in peaking units. We rely upon a long-term simulation model which simulates electricity market players' investment decisions in a market regime and incorporates large-scale wind power development, either with subsidised wind production or with market-driven development. We test the use of capacity mechanisms to compensate for the long-term effects of large-scale wind power development on system reliability. The first finding is that capacity mechanisms can help to reduce the social cost of large-scale wind power development in terms of a reduced loss-of-load probability. The second finding is that, in a market-based wind power deployment without subsidy, wind generators are penalized for insufficient contribution to long-term system reliability. (authors)

  15. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent a significant challenge, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  16. Aeroelastic Stability Investigations for Large-scale Vertical Axis Wind Turbines

    International Nuclear Information System (INIS)

    Owens, B. C. (Senior Member of Technical Staff, Analytical Structural Dynamics, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185, United States); Griffith, D. T. (Principal Member of Technical Staff, Wind Energy Technologies, Sandia National Laboratories, P.O. Box 5800, Albuquerque, NM 87185, United States)

    2014-01-01

    The availability of offshore wind resources in coastal regions, along with a high concentration of load centers in these areas, makes offshore wind energy an attractive opportunity for clean renewable electricity production. High infrastructure costs such as the offshore support structure and operation and maintenance costs for offshore wind technology, however, are significant obstacles that need to be overcome to make offshore wind a more cost-effective option. A vertical-axis wind turbine (VAWT) rotor configuration offers a potential transformative technology solution that significantly lowers cost of energy for offshore wind due to its inherent advantages for the offshore market. However, several potential challenges exist for VAWTs and this paper addresses one of them with an initial investigation of dynamic aeroelastic stability for large-scale, multi-megawatt VAWTs. The aeroelastic formulation and solution method from the BLade Aeroelastic STability Tool (BLAST) for HAWT blades was employed to extend the analysis capability of a newly developed structural dynamics design tool for VAWTs. This investigation considers the effect of configuration geometry, material system choice, and number of blades on the aeroelastic stability of a VAWT, and provides an initial scoping for potential aeroelastic instabilities in large-scale VAWT designs

  17. Aeroelastic Stability Investigations for Large-scale Vertical Axis Wind Turbines

    Science.gov (United States)

    Owens, B. C.; Griffith, D. T.

    2014-06-01

    The availability of offshore wind resources in coastal regions, along with a high concentration of load centers in these areas, makes offshore wind energy an attractive opportunity for clean renewable electricity production. High infrastructure costs such as the offshore support structure and operation and maintenance costs for offshore wind technology, however, are significant obstacles that need to be overcome to make offshore wind a more cost-effective option. A vertical-axis wind turbine (VAWT) rotor configuration offers a potential transformative technology solution that significantly lowers cost of energy for offshore wind due to its inherent advantages for the offshore market. However, several potential challenges exist for VAWTs and this paper addresses one of them with an initial investigation of dynamic aeroelastic stability for large-scale, multi-megawatt VAWTs. The aeroelastic formulation and solution method from the BLade Aeroelastic STability Tool (BLAST) for HAWT blades was employed to extend the analysis capability of a newly developed structural dynamics design tool for VAWTs. This investigation considers the effect of configuration geometry, material system choice, and number of blades on the aeroelastic stability of a VAWT, and provides an initial scoping for potential aeroelastic instabilities in large-scale VAWT designs.

  18. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Irrigation is one of the best ways for agriculture to become independent of precipitation shortages. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province; the average size of a system was 95 ha. In 1989 there were 98 systems covering more than 10,130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinkler systems suffered significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs; there was also a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers and limitations: system design constraints, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A field survey of the area showed the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  19. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    ...structure and chemical purity of 99.1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords: sol-gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. Introduction: Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical...

  20. Analysis of supply chain, scale factor, and optimum plant capacity for the production of ethanol from corn stover

    International Nuclear Information System (INIS)

    Leboreiro, Jose; Hilaly, Ahmad K.

    2013-01-01

    A detailed model is used to perform a thorough analysis of ethanol production from corn stover via the dilute acid process. The biomass supply chain cost model accounts for all steps needed to source corn stover, including collection, transportation, and storage. The manufacturing cost model is based on work done at NREL; attainable conversions of key process parameters are used to calculate production cost. The choice of capital investment scaling function and scaling parameter has a significant impact on the optimum plant capacity. For the widely used exponential function, the scaling factors are functions of plant capacity: the pre-exponential factor decreases with increasing plant capacity while the exponential factor increases as the plant capacity increases. The use of scaling parameters calculated for small plant capacities leads to falsely large optimum plants; data from a wide range of plant capacities are required to produce accurate results. A mathematical expression to scale capital investment for fermentation-based biorefineries is proposed which accounts for the linear scaling behavior of bio-reactors (such as saccharification vessels and fermentors) as well as the exponential nature of all other plant equipment. Ignoring the linear scaling behavior of bio-reactors leads to artificially large optimum plant capacities. The minimum production cost is found to be in the range of 789–830 $ m⁻³, which is significantly higher than previously reported. Optimum plant capacities are in the range of 5750–9850 Mg d⁻¹. The optimum plant capacity and production cost are highly sensitive to farmer participation in biomass harvest at low participation rates. -- Highlights: •A detailed model is used to perform a technoeconomic analysis for the production of ethanol from corn stover. •The capital investment scaling factors were found to be a function of plant capacity. •Bio-reactors (such as saccharification vessels and fermentors) in large size
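    The trade-off this abstract describes, economies of scale in capital cost against feedstock supply costs that grow with plant capacity, can be sketched with a toy per-unit cost model. All coefficients and functional details below are illustrative placeholders, not the paper's values.

```python
# Illustrative optimum-plant-capacity calculation (made-up coefficients,
# not the paper's model). Per-unit cost combines:
#   - a capital term with exponential scaling, exponent n < 1
#     (economies of scale: per-unit capital falls with capacity), and
#   - a feedstock transport term that grows with capacity, since the
#     collection radius grows roughly with sqrt(supply area).
def unit_cost(capacity, a=1000.0, n=0.6, b=1.0, c=50.0):
    capital = a * capacity ** (n - 1.0)   # per-unit capital charge
    transport = b * capacity ** 0.5       # per-unit feedstock transport
    return capital + transport + c        # c: fixed per-unit costs

# Grid search for the capacity that minimizes per-unit production cost;
# the opposing terms give an interior optimum.
grid = range(100, 20001, 100)
optimum = min(grid, key=unit_cost)
```

    The key behavior the abstract emphasizes carries over: if the capital term's scaling exponent is fitted only to small capacities (or the near-linear scaling of bio-reactors is ignored), the capital curve is too optimistic at large sizes and the computed optimum capacity is artificially large.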

  1. Probing high scale physics with top quarks at the Large Hadron Collider

    Science.gov (United States)

    Dong, Zhe

    With the Large Hadron Collider (LHC) running at the TeV scale, we expect to find deviations from the Standard Model in the experiments and to understand the origin of these deviations. Being the heaviest elementary particle observed so far, with a mass at the electroweak scale, the top quark is a powerful probe for new phenomena of high-scale physics at the LHC. We therefore concentrate on studying high-scale physics phenomena through top-quark pair production or decay at the LHC. In this thesis, we study the discovery potential of string resonances decaying to the tt̄ final state and examine the possibility of observing baryon-number-violating top-quark production or decay at the LHC. We point out that string resonances for a string scale below 4 TeV can be detected via the tt̄ channel by reconstructing center-of-mass frame kinematics of the resonances from either the tt̄ semi-leptonic decay or recent techniques for identifying highly boosted tops. For the study of baryon-number-violating processes, using a model-independent effective approach and focusing on operators of minimal mass dimension, we find that the corresponding effective coefficients could be directly probed at the LHC already with an integrated luminosity of 1 fb⁻¹ at 7 TeV, and further constrained with 30 (100) fb⁻¹ at 7 (14) TeV.

  2. Large scale synthesis of α-Si3N4 nanowires through a kinetically favored chemical vapour deposition process

    Science.gov (United States)

    Liu, Haitao; Huang, Zhaohui; Zhang, Xiaoguang; Fang, Minghao; Liu, Yan-gai; Wu, Xiaowen; Min, Xin

    2018-01-01

    Understanding the kinetic barrier and driving force for crystal nucleation and growth is decisive for the synthesis of nanowires with controllable yield and morphology. In this research, we developed an effective reaction system to synthesize α-Si3N4 nanowires on a very large scale (hundreds of milligrams) and carried out a comparative study to characterize the kinetic influence of gas-precursor supersaturation and the liquid metal catalyst. The phase composition, morphology, microstructure and photoluminescence properties of the as-synthesized products were characterized by X-ray diffraction, Fourier-transform infrared spectroscopy, field emission scanning electron microscopy, transmission electron microscopy and room-temperature photoluminescence measurements. The yield of the products relates not only to the reaction temperature (a thermodynamic condition) but also to the distribution of gas precursors (a kinetic condition). As revealed in this research, by controlling the gas diffusion process, the yield of the nanowire products can be greatly improved. The experimental results indicate that supersaturation, rather than the catalyst, is the dominant factor in the as-designed system. With excellent non-flammability and high thermal stability, the large-scale α-Si3N4 products have potential applications in improving the strength of high-temperature ceramic composites. The photoluminescence spectrum of α-Si3N4 shows a blue shift which could be valuable for future applications in blue-green emitting devices. These large-scale products are undoubtedly the basis of such applications.

  3. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  4. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  5. Combining offshore wind energy and large-scale mussel farming: background & technical, ecological and economic considerations

    NARCIS (Netherlands)

    Lagerveld, S.; Rockmann, C.; Scholl, M.M.; Bartelings, H.; Burg, van den S.W.K.; Jak, R.G.; Jansen, H.M.; Klijnstra, J.; Leopold, M.F.; Poelman, M.; Smith, S.R.; Stavenuiter, J.; Veenstra, F.A.; Veltman, C.; Westra, C.

    2014-01-01

    This Blauwdruk project report presents background and technical, ecological and economic considerations of the potential combination of offshore wind energy production and large-scale mussel farming in offshore areas in the North Sea. The main objective of the Blauwdruk project was to study the

  6. Large-scale atomistic simulations of nanostructured materials based on divide-and-conquer density functional theory

    Directory of Open Access Journals (Sweden)

    Vashishta P.

    2011-05-01

    A linear-scaling algorithm based on a divide-and-conquer (DC) scheme is designed to perform large-scale molecular-dynamics simulations, in which interatomic forces are computed quantum mechanically in the framework of density functional theory (DFT). This scheme is applied to the thermite reaction at an Al/Fe2O3 interface. It is found that mass diffusion and the reaction rate at the interface are enhanced by a concerted metal-oxygen flip mechanism. Preliminary simulations are carried out for an aluminum particle in water based on conventional DFT, as a target system for large-scale DC-DFT simulations. A pair of Lewis acid and base sites on the aluminum surface preferentially catalyzes hydrogen production in a low-activation-barrier mechanism found in the simulations.

  7. Mesoderm Lineage 3D Tissue Constructs Are Produced at Large-Scale in a 3D Stem Cell Bioprocess.

    Science.gov (United States)

    Cha, Jae Min; Mantalaris, Athanasios; Jung, Sunyoung; Ji, Yurim; Bang, Oh Young; Bae, Hojae

    2017-09-01

    Various studies have presented different approaches to direct pluripotent stem cell differentiation, such as applying defined sets of exogenous biochemical signals and genetic/epigenetic modifications. Although differentiation to target lineages can be successfully regulated, such conventional methods are often complicated, laborious, and not cost-effective for the large-scale production of 3D stem cell-based tissue constructs. A 3D-culture platform that can realize the large-scale production of mesoderm-lineage tissue constructs from embryonic stem cells (ESCs) is developed. ESCs are cultured using our previously established 3D-bioprocess platform, which is amenable to the mass production of 3D ESC-based tissue constructs. Hepatocarcinoma cell line conditioned medium is introduced to the large-scale 3D culture to provide a specific biomolecular microenvironment that mimics the in vivo mesoderm formation process. After a 5-day spontaneous differentiation period, the resulting 3D tissue constructs are composed of multipotent mesodermal progenitor cells, as verified by gene and molecular expression profiles. Subsequently, the optimal time points to trigger terminal differentiation towards cardiomyogenesis or osteogenesis from the mesodermal tissue constructs are found. A simple and affordable 3D ESC bioprocess that achieves the scalable production of mesoderm-origin tissues with significantly improved corresponding tissue properties is demonstrated. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  8. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader to this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  9. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land-atmosphere interactions, an essential hydrologic variable impacting rainfall-runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years, creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  10. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo = Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio = Ti⊥/Ti||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each with a lateral length 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering timescale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and enables even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  11. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  12. Large Scale Software Building with CMake in ATLAS

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.

  13. Implementation of a large-scale hospital information infrastructure for multi-unit health-care services.

    Science.gov (United States)

    Yoo, Sun K; Kim, Dong Keun; Kim, Jung C; Park, Youn Jung; Chang, Byung Chul

    2008-01-01

    With the increase in demand for high quality medical services, the need for an innovative hospital information system has become essential. An improved system has been implemented in all hospital units of the Yonsei University Health System. Interoperability between multi-units required appropriate hardware infrastructure and software architecture. This large-scale hospital information system encompassed PACS (Picture Archiving and Communications Systems), EMR (Electronic Medical Records) and ERP (Enterprise Resource Planning). It involved two tertiary hospitals and 50 community hospitals. The monthly data production rate by the integrated hospital information system is about 1.8 TByte and the total quantity of data produced so far is about 60 TByte. Large scale information exchange and sharing will be particularly useful for telemedicine applications.

  14. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  15. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC.Marc/Mentat and its fatigue strain is analyzed; the flange fatigue load case is simulated with the Bladed software; the flange fatigue load spectrum is obtained with the rain-flow counting method; and finally, fatigue analysis of the top flange is carried out with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide a new approach for flange fatigue analysis of large-scale wind turbine generators and have practical engineering value.
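    The Palmgren-Miner rule cited above sums fractional damage n_i/N_i over the stress-range bins produced by rain-flow counting, predicting failure once the sum reaches 1. A minimal sketch, assuming a hypothetical Basquin-type S-N curve N(S) = C/S^m; the constants and the rain-flow histogram below are illustrative, not values from the paper:

```python
# Palmgren-Miner linear cumulative damage: D = sum(n_i / N_i).
# S-N curve constants C, m and the histogram are hypothetical examples.

def cycles_to_failure(stress_range, C=1e12, m=3.0):
    """Allowable cycles at a given stress range (Basquin-type S-N curve)."""
    return C / stress_range ** m

def miner_damage(rainflow_histogram):
    """Sum n_i / N_i over all stress-range bins; failure predicted at D >= 1."""
    return sum(n / cycles_to_failure(s) for s, n in rainflow_histogram)

# (stress range in MPa, counted cycles) pairs, e.g. from rain-flow counting
histogram = [(40.0, 2.0e5), (80.0, 5.0e4), (120.0, 1.0e4)]
D = miner_damage(histogram)
print(f"cumulative damage D = {D:.3f}")
```

    A damage sum well below 1 indicates remaining fatigue life under the assumed load spectrum.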

  16. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complex real-time water situation, real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
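    A Godunov-type finite-volume scheme of the kind named above advances cell averages with interface fluxes obtained from local Riemann problems. A minimal one-dimensional sketch for linear advection with a first-order upwind flux; the paper's unstructured 2D shallow-water solver with wet/dry front treatment is far more involved:

```python
# First-order upwind Godunov update for u_t + a u_x = 0 on a periodic grid.
# Conservative form: u_i^{n+1} = u_i - dt/dx * (F_{i+1/2} - F_{i-1/2}).

def godunov_step(u, a, dt, dx):
    """One conservative finite-volume update with upwind interface fluxes."""
    n = len(u)
    flux = [0.0] * (n + 1)
    for i in range(n + 1):
        left, right = u[(i - 1) % n], u[i % n]      # periodic boundaries
        flux[i] = a * (left if a >= 0 else right)   # upwind Riemann flux
    return [u[i] - dt / dx * (flux[i + 1] - flux[i]) for i in range(n)]

cells = [1.0 if 4 <= i < 8 else 0.0 for i in range(16)]  # square pulse
for _ in range(10):
    cells = godunov_step(cells, a=1.0, dt=0.05, dx=0.1)  # CFL = 0.5
print(f"total mass = {sum(cells):.6f}")  # conserved by construction
```

    Because the update is written in flux-difference form, total mass is conserved exactly, and with CFL ≤ 1 the upwind scheme keeps the solution within its initial bounds.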

  17. Evaluation of enzymatic reactors for large-scale panose production.

    Science.gov (United States)

    Fernandes, Fabiano A N; Rodrigues, Sueli

    2007-07-01

    Panose is a trisaccharide consisting of a maltose molecule bonded to a glucose molecule by an alpha-1,6-glycosidic bond. This trisaccharide has potential use in the food industry as a noncariogenic sweetener, as the oral flora does not ferment it. Panose can also be considered a prebiotic, since it stimulates the growth of beneficial microorganisms, such as lactobacilli and bifidobacteria, and inhibits the growth of undesired microorganisms such as E. coli and Salmonella. In this paper, the production of panose by enzymatic synthesis in batch and fed-batch reactors was optimized using a mathematical model developed to simulate the process. Results show that optimum production is obtained in a fed-batch process, with an optimum productivity of 11.23 g/l h of panose, which is 51.5% higher than production in a batch reactor.
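    As a quick arithmetic check of the reported figures: if the fed-batch productivity of 11.23 g/(l·h) is 51.5% higher than the batch value, the implied batch productivity follows directly.

```python
# Back-calculating the batch productivity implied by the abstract's numbers.
fed_batch = 11.23   # g/(l·h), reported fed-batch optimum
gain = 0.515        # fed-batch is "51.5% higher" than batch
batch = fed_batch / (1 + gain)
print(f"implied batch productivity ≈ {batch:.2f} g/(l·h)")
```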

  18. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  19. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for the flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory, was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. 
Inhomogeneous nearly

  20. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  1. Timing of Formal Phase Safety Reviews for Large-Scale Integrated Hazard Analysis

    Science.gov (United States)

    Massie, Michael J.; Morris, A. Terry

    2010-01-01

    Integrated hazard analysis (IHA) is a process used to identify and control unacceptable risk. As such, it does not occur in a vacuum. IHA approaches must be tailored to fit the system being analyzed. Physical, resource, organizational and temporal constraints on large-scale integrated systems impose additional direct or derived requirements on the IHA. The timing and interaction between engineering and safety organizations can provide either benefits or hindrances to the overall end product. The traditional approach for formal phase safety review timing and content, which generally works well for small- to moderate-scale systems, does not work well for very large-scale integrated systems. This paper proposes a modified approach to timing and content of formal phase safety reviews for IHA. Details of the tailoring process for IHA will describe how to avoid temporary disconnects in major milestone reviews and how to maintain a cohesive end-to-end integration story particularly for systems where the integrator inherently has little to no insight into lower level systems. The proposal has the advantage of allowing the hazard analysis development process to occur as technical data normally matures.

  2. Large-scale tropospheric transport in the Chemistry-Climate Model Initiative (CCMI) simulations

    Science.gov (United States)

    Orbe, Clara; Yang, Huang; Waugh, Darryn W.; Zeng, Guang; Morgenstern, Olaf; Kinnison, Douglas E.; Lamarque, Jean-Francois; Tilmes, Simone; Plummer, David A.; Scinocca, John F.; Josse, Beatrice; Marecal, Virginie; Jöckel, Patrick; Oman, Luke D.; Strahan, Susan E.; Deushi, Makoto; Tanaka, Taichu Y.; Yoshida, Kohei; Akiyoshi, Hideharu; Yamashita, Yousuke; Stenke, Andreas; Revell, Laura; Sukhodolov, Timofei; Rozanov, Eugene; Pitari, Giovanni; Visioni, Daniele; Stone, Kane A.; Schofield, Robyn; Banerjee, Antara

    2018-05-01

    Understanding and modeling the large-scale transport of trace gases and aerosols is important for interpreting past (and projecting future) changes in atmospheric composition. Here we show that there are large differences in the global-scale atmospheric transport properties among the models participating in the IGAC SPARC Chemistry-Climate Model Initiative (CCMI). Specifically, we find up to 40 % differences in the transport timescales connecting the Northern Hemisphere (NH) midlatitude surface to the Arctic and to Southern Hemisphere high latitudes, where the mean age ranges between 1.7 and 2.6 years. We show that these differences are related to large differences in vertical transport among the simulations, in particular to differences in parameterized convection over the oceans. While stronger convection over NH midlatitudes is associated with slower transport to the Arctic, stronger convection in the tropics and subtropics is associated with faster interhemispheric transport. We also show that the differences among simulations constrained with fields derived from the same reanalysis products are as large as (and in some cases larger than) the differences among free-running simulations, most likely due to larger differences in parameterized convection. Our results indicate that care must be taken when using simulations constrained with analyzed winds to interpret the influence of meteorology on tropospheric composition.

  3. Large-scale tropospheric transport in the Chemistry–Climate Model Initiative (CCMI) simulations

    Directory of Open Access Journals (Sweden)

    C. Orbe

    2018-05-01

    Understanding and modeling the large-scale transport of trace gases and aerosols is important for interpreting past (and projecting future) changes in atmospheric composition. Here we show that there are large differences in the global-scale atmospheric transport properties among the models participating in the IGAC SPARC Chemistry–Climate Model Initiative (CCMI). Specifically, we find up to 40 % differences in the transport timescales connecting the Northern Hemisphere (NH) midlatitude surface to the Arctic and to Southern Hemisphere high latitudes, where the mean age ranges between 1.7 and 2.6 years. We show that these differences are related to large differences in vertical transport among the simulations, in particular to differences in parameterized convection over the oceans. While stronger convection over NH midlatitudes is associated with slower transport to the Arctic, stronger convection in the tropics and subtropics is associated with faster interhemispheric transport. We also show that the differences among simulations constrained with fields derived from the same reanalysis products are as large as (and in some cases larger than) the differences among free-running simulations, most likely due to larger differences in parameterized convection. Our results indicate that care must be taken when using simulations constrained with analyzed winds to interpret the influence of meteorology on tropospheric composition.

  4. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  5. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  6. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  7. Large-scale diversity of slope fishes: pattern inconsistency between multiple diversity indices.

    Science.gov (United States)

    Gaertner, Jean-Claude; Maiorano, Porzia; Mérigot, Bastien; Colloca, Francesco; Politou, Chrissi-Yianna; Gil De Sola, Luis; Bertrand, Jacques A; Murenu, Matteo; Durbec, Jean-Pierre; Kallianiotis, Argyris; Mannini, Alessandro

    2013-01-01

    Large-scale studies focused on the diversity of continental slope ecosystems are still rare, usually restricted to a limited number of diversity indices and mainly based on the empirical comparison of heterogeneous local data sets. In contrast, we investigate large-scale fish diversity on the basis of multiple diversity indices and using 1454 standardized trawl hauls collected throughout the upper and middle slope of the whole northern Mediterranean Sea (36°3'-45°7' N; 5°3'W-28°E). We have analyzed (1) the empirical relationships between a set of 11 diversity indices in order to assess their degree of complementarity/redundancy and (2) the consistency of spatial patterns exhibited by each of the complementary groups of indices. Regarding species richness, our results contradicted both the traditional view based on the hump-shaped theory of bathymetric patterns and the commonly admitted hypothesis of a large-scale decreasing trend correlated with a similar gradient of primary production in the Mediterranean Sea. More generally, we found that the components of slope fish diversity we analyzed did not always show a consistent pattern of distribution according either to depth or to spatial areas, suggesting that they are not driven by the same factors. These results, which stress the need to extend the number of indices traditionally considered in diversity monitoring networks, could provide a basis for rethinking not only the methodological approach used in monitoring systems, but also the definition of priority zones for protection. Finally, our results call into question the feasibility of properly investigating large-scale diversity patterns using a widespread approach in ecology, which is based on the compilation of pre-existing heterogeneous and disparate data sets, in particular when focusing on indices that are very sensitive to sampling design standardization, such as species richness.

  8. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor–Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Ro_t = −0.0909 to Ro_t = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, "over-damped" LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
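
The static Smagorinsky model benchmarked above closes the subgrid stress with an eddy viscosity ν_t = (c_s Δ)² |S|, where |S| is the resolved strain-rate magnitude. A minimal sketch for a single 2D grid point (the gradient values and filter width are illustrative; the actual LES evaluates this field-wide in 3D):

```python
import numpy as np

def smagorinsky_nu_t(dudx, dudy, dvdx, dvdy, delta, cs=0.1):
    """Eddy viscosity nu_t = (cs*delta)^2 * |S| from a 2D velocity gradient,
    with |S| = sqrt(2 S_ij S_ij) and S_ij the symmetric strain-rate tensor."""
    s11 = dudx
    s22 = dvdy
    s12 = 0.5 * (dudy + dvdx)
    s_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
    return (cs * delta) ** 2 * s_mag

# Pure shear du/dy = 1 gives |S| = 1, so nu_t = (0.1 * 0.1)^2 = 1e-4.
nu = smagorinsky_nu_t(0.0, 1.0, 0.0, 0.0, delta=0.1, cs=0.1)
```

The dynamic model replaces the fixed c_s by a coefficient computed on the fly from a test filter, which is why it adapts better to rotation until anisotropy grows too strong.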

  9. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit electrochemical performance superior to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit electrochemical performance superior to graphite.

  10. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were each solved with the algorithm running on the GPU to test the performance of the package. From a comparison of the calculation results between the solver executed on a single CPU and the one on the GPU, it was found that the GPU version runs up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
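
The semi-implicit Fourier method named above treats the stiff linear (Laplacian) term implicitly in Fourier space while the nonlinearity stays explicit. A CPU-side NumPy sketch for the Allen-Cahn equation (all parameters illustrative; a GPU version would move the FFTs and update into CUDA kernels):

```python
import numpy as np

def allen_cahn_step(phi, dt=0.1, M=1.0, kappa=1.0):
    """One semi-implicit Fourier step of the Allen-Cahn equation
    d(phi)/dt = -M * (phi^3 - phi - kappa * lap(phi)) on a periodic grid.
    The Laplacian is treated implicitly, the cubic term explicitly."""
    n = phi.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    g_hat = np.fft.fft2(phi**3 - phi)      # explicit nonlinear term
    phi_hat = np.fft.fft2(phi)
    phi_hat = (phi_hat - dt * M * g_hat) / (1.0 + dt * M * kappa * k2)
    return np.real(np.fft.ifft2(phi_hat))

rng = np.random.default_rng(0)
phi = 0.1 * rng.standard_normal((64, 64))   # small random initial field
for _ in range(50):
    phi = allen_cahn_step(phi)
```

The implicit division in Fourier space is what allows much larger time steps than a fully explicit scheme, and it is embarrassingly parallel per mode, which is why the method maps well onto a GPU.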

  11. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.

  12. Thermal power generation projects "Large Scale Solar Heating"; EU Thermie projects "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for a Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the reviewers, but was not immediately (1996) accepted for funding. In November 1997 the EU Commission provided 1.5 million ECU, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved, applied for under the lead of Chalmers Industriteknik (CIT) in Sweden, which mainly serves technology transfer. (orig.)

  13. Large Scale Generation and Characterization of Anti-Human CD34 Monoclonal Antibody in Ascetic Fluid of Balb/c Mice

    OpenAIRE

    Aghebati Maleki, Leili; Majidi, Jafar; Baradaran, Behzad; Abdolalizadeh, Jalal; Kazemi, Tohid; Aghebati Maleki, Ali; Sineh sepehr, Koushan

    2013-01-01

    Purpose: Monoclonal antibodies or specific antibodies are now an essential tool of biomedical research and are of great commercial and medical value. The purpose of this study was the large-scale production of a monoclonal antibody against CD34, for diagnostic application in leukemia and for purification of human hematopoietic stem/progenitor cells. Methods: For large-scale production of the monoclonal antibody, hybridoma cells that produce monoclonal antibody against human CD34 were injected into t...

  14. Production of black holes in TeV-scale gravity

    International Nuclear Information System (INIS)

    Ringwald, A.

    2002-12-01

    Copious production of microscopic black holes is one of the least model-dependent predictions of TeV-scale gravity scenarios. We review the arguments behind this assertion and discuss opportunities to track the striking associated signatures in the near future. These include searches at neutrino telescopes, such as AMANDA and RICE, at cosmic ray air shower facilities, such as the Pierre Auger Observatory, and at colliders, such as the Large Hadron Collider. (orig.)

  15. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis.
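
The retrieval pipeline described (feature representation, indexing, searching) reduces, in its simplest form, to nearest-neighbour search over feature vectors. A brute-force cosine-similarity sketch (dimensions and data are illustrative; production systems replace the linear scan with approximate indexing such as hashing or inverted files):

```python
import numpy as np

def build_index(features):
    """L2-normalise feature vectors so cosine similarity becomes a dot product."""
    norms = np.linalg.norm(features, axis=1, keepdims=True)
    return features / np.clip(norms, 1e-12, None)

def search(index, query, top_k=3):
    """Return the indices of the top_k database entries most similar to query."""
    q = query / max(np.linalg.norm(query), 1e-12)
    sims = index @ q
    return np.argsort(-sims)[:top_k]

# Hypothetical 16-dimensional image descriptors for a 100-image database.
rng = np.random.default_rng(1)
db = rng.standard_normal((100, 16))
index = build_index(db)
hits = search(index, db[7])   # querying with entry 7 should rank it first
```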

  16. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains time-consuming, manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  17. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
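
The low-rank kernel approximation that the prototype vectors perform can be illustrated with a Nyström-style construction, K ≈ C W⁺ Cᵀ, where C is the kernel between all points and the prototypes and W the kernel among the prototypes. This is a generic sketch of the idea, not the authors' exact PVM algorithm:

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """Gaussian (RBF) kernel matrix between two point sets."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_approx(X, prototypes, gamma=0.5):
    """Low-rank approximation K ~ C W^+ C^T using a small prototype set."""
    C = rbf(X, prototypes, gamma)            # n x m cross-kernel
    W = rbf(prototypes, prototypes, gamma)   # m x m prototype kernel
    return C @ np.linalg.pinv(W) @ C.T

rng = np.random.default_rng(2)
X = rng.standard_normal((50, 3))
K = rbf(X, X)                                # full 50 x 50 kernel
K_hat = nystrom_approx(X, X[:10])            # rank limited by 10 prototypes
err = np.linalg.norm(K - K_hat) / np.linalg.norm(K)
```

Storing only C and W costs O(nm) instead of O(n²), which is the source of the scalability claimed for prototype-based SSL.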

  18. Life cycle assessment of rapeseed oil, rape methyl ester and ethanol as fuels - a comparison between large- and small-scale production

    Energy Technology Data Exchange (ETDEWEB)

    Bernesson, Sven [Swedish Univ. of Agriculture Sciences, Uppsala (Sweden). Dep. of Biometry and Engineering

    2004-05-01

    Production of rapeseed oil, rape methyl ester (RME) and ethanol fuel for heavy diesel engines can be carried out with different systems solutions, in which the choice of system is usually related to the scale of the production. The main purpose of this study was to analyse whether the use of a small-scale rapeseed oil, RME and ethanol fuel production system reduced the environmental load in comparison to a medium- and a large-scale system. To fulfil this purpose, a limited LCA, including air-emissions and energy requirements, was carried out for the three fuels and the three plant sizes. Four different methods to allocate the environmental burden between different products were compared: physical allocation according to the lower heat value in the products [MJ/kg], economic allocation according to the product prices [SEK/kg], no allocation and allocation with a system expansion so that rapemeal and distiller's waste could replace soymeal mixed with soyoil and glycerine could replace glycerine produced from fossil raw material. The functional unit, to which the total environmental load was related, was 1.0 MJ of energy delivered on the engine shaft to the final consumer. Production of raw materials, cultivation, transport, fuel production and use of the fuels produced were included in the systems studied. It was shown in the study that the differences in environmental impact and energy requirement between small-, medium- and large-scale systems were small or even negligible in most cases for all three fuels, except for the photochemical ozone creation potential (POCP) during ethanol fuel production. The longer transport distances to a certain degree outweighed the higher oil extraction efficiency, the higher energy efficiency and the more efficient use of machinery and buildings in the large-scale system. 
The dominating production step was the cultivation, in which production of fertilisers, followed by soil emissions and tractive power, made major contributions to
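
Both the physical (lower-heating-value) and economic (price) allocation rules compared in the study reduce to splitting the total burden in proportion to a chosen property of the co-products. A sketch with hypothetical numbers, not the study's data:

```python
def allocate(burden, shares):
    """Split a total environmental burden over co-products in proportion
    to some property (energy content in MJ, price in SEK, ...)."""
    total = sum(shares.values())
    return {product: burden * s / total for product, s in shares.items()}

# Hypothetical co-product properties per kg of rapeseed processed:
# energy content [MJ] for physical allocation, price [SEK] for economic.
energy = {"RME": 16.0, "rapemeal": 8.0}
price = {"RME": 6.0, "rapemeal": 1.0}

co2_by_energy = allocate(12.0, energy)  # kg CO2-eq, physical allocation
co2_by_price = allocate(12.0, price)    # kg CO2-eq, economic allocation
```

Because the fuel carries a larger share of the price than of the energy content, economic allocation assigns it a larger fraction of the burden, which is exactly the kind of method sensitivity the study quantifies.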

  20. KINETIC ALFVÉN WAVE GENERATION BY LARGE-SCALE PHASE MIXING

    International Nuclear Information System (INIS)

    Vásconez, C. L.; Pucci, F.; Valentini, F.; Servidio, S.; Malara, F.; Matthaeus, W. H.

    2015-01-01

    One view of the solar wind turbulence is that the observed highly anisotropic fluctuations at spatial scales near the proton inertial length d_p may be considered as kinetic Alfvén waves (KAWs). In the present paper, we show how phase mixing of large-scale parallel-propagating Alfvén waves is an efficient mechanism for the production of KAWs at wavelengths close to d_p and at a large propagation angle with respect to the magnetic field. Magnetohydrodynamic (MHD), Hall magnetohydrodynamic (HMHD), and hybrid Vlasov–Maxwell (HVM) simulations modeling the propagation of Alfvén waves in inhomogeneous plasmas are performed. In the linear regime, the role of dispersive effects is singled out by comparing MHD and HMHD results. Fluctuations produced by phase mixing are identified as KAWs through a comparison of polarization of magnetic fluctuations and wave-group velocity with analytical linear predictions. In the nonlinear regime, a comparison of HMHD and HVM simulations allows us to point out the role of kinetic effects in shaping the proton-distribution function. We observe the generation of temperature anisotropy with respect to the local magnetic field and the production of field-aligned beams. The regions where the proton-distribution function highly departs from thermal equilibrium are located inside the shear layers, where the KAWs are excited, suggesting that the distortions of the proton distribution are driven by a resonant interaction of protons with KAW fluctuations. Our results are relevant in configurations where magnetic-field inhomogeneities are present, as, for example, in the solar corona, where the presence of Alfvén waves has been ascertained.

  1. KINETIC ALFVÉN WAVE GENERATION BY LARGE-SCALE PHASE MIXING

    Energy Technology Data Exchange (ETDEWEB)

    Vásconez, C. L.; Pucci, F.; Valentini, F.; Servidio, S.; Malara, F. [Dipartimento di Fisica, Università della Calabria, I-87036, Rende (CS) (Italy); Matthaeus, W. H. [Department of Physics and Astronomy, University of Delaware, DE 19716 (United States)

    2015-12-10

    One view of the solar wind turbulence is that the observed highly anisotropic fluctuations at spatial scales near the proton inertial length d_p may be considered as kinetic Alfvén waves (KAWs). In the present paper, we show how phase mixing of large-scale parallel-propagating Alfvén waves is an efficient mechanism for the production of KAWs at wavelengths close to d_p and at a large propagation angle with respect to the magnetic field. Magnetohydrodynamic (MHD), Hall magnetohydrodynamic (HMHD), and hybrid Vlasov–Maxwell (HVM) simulations modeling the propagation of Alfvén waves in inhomogeneous plasmas are performed. In the linear regime, the role of dispersive effects is singled out by comparing MHD and HMHD results. Fluctuations produced by phase mixing are identified as KAWs through a comparison of polarization of magnetic fluctuations and wave-group velocity with analytical linear predictions. In the nonlinear regime, a comparison of HMHD and HVM simulations allows us to point out the role of kinetic effects in shaping the proton-distribution function. We observe the generation of temperature anisotropy with respect to the local magnetic field and the production of field-aligned beams. The regions where the proton-distribution function highly departs from thermal equilibrium are located inside the shear layers, where the KAWs are excited, suggesting that the distortions of the proton distribution are driven by a resonant interaction of protons with KAW fluctuations. Our results are relevant in configurations where magnetic-field inhomogeneities are present, as, for example, in the solar corona, where the presence of Alfvén waves has been ascertained.
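
The phase-mixing mechanism invoked here follows a standard estimate: in a shear layer where the Alfvén speed v_A varies across the field, the perpendicular wavenumber of a parallel-propagating wave grows secularly in time, carrying the fluctuation toward the kinetic-Alfvén range. In LaTeX (a textbook result, not this paper's specific derivation):

```latex
% A wave packet with local dispersion \omega = k_\parallel v_A(x)
% accumulates perpendicular wavenumber linearly in time,
\[
  k_x(t) = k_x(0) - k_\parallel \, \frac{\mathrm{d}v_A}{\mathrm{d}x}\, t ,
\]
% so it reaches the kinetic-Alfv\'en range $k_\perp d_p \sim 1$ after
\[
  t_{\mathrm{KAW}} \sim \left( k_\parallel
    \left|\frac{\mathrm{d}v_A}{\mathrm{d}x}\right| d_p \right)^{-1},
\]
% i.e. steeper Alfv\'en-speed gradients produce KAWs faster.
```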

  2. Optimization of large-scale industrial systems : an emerging method

    Energy Technology Data Exchange (ETDEWEB)

    Hammache, A.; Aube, F.; Benali, M.; Cantave, R. [Natural Resources Canada, Varennes, PQ (Canada). CANMET Energy Technology Centre

    2006-07-01

    This paper reviewed optimization methods of large-scale industrial production systems and presented a novel systematic multi-objective and multi-scale optimization methodology. The methodology was based on a combined local optimality search with global optimality determination, and advanced system decomposition and constraint handling. The proposed method focused on the simultaneous optimization of the energy, economy and ecology aspects of industrial systems (E³-ISO). The aim of the methodology was to provide guidelines for decision-making strategies. The approach was based on evolutionary algorithms (EA) with specifications including hybridization of global optimality determination with a local optimality search; a self-adaptive algorithm to account for the dynamic changes of operating parameters and design variables occurring during the optimization process; interactive optimization; advanced constraint handling and decomposition strategy; and object-oriented programming and parallelization techniques. Flowcharts of the working principles of the basic EA were presented. It was concluded that the EA uses a novel decomposition and constraint handling technique to enhance the Pareto solution search procedure for multi-objective problems. 6 refs., 9 figs.
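
The Pareto solution search mentioned above rests on the notion of dominance between objective vectors. A minimal sketch under a minimisation convention (the objective values are hypothetical, e.g. energy cost versus emissions for candidate designs):

```python
def dominates(a, b):
    """True if objective vector a Pareto-dominates b (minimisation):
    a is no worse in every objective and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of a list of objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Hypothetical (energy cost, emissions) pairs for four candidate designs.
pts = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = pareto_front(pts)   # (3.0, 4.0) is dominated by (2.0, 3.0)
```

A multi-objective EA keeps and refines an archive of such non-dominated points rather than collapsing the objectives into a single scalar.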

  3. SCALES: SEVIRI and GERB CaL/VaL area for large-scale field experiments

    Science.gov (United States)

    Lopez-Baeza, Ernesto; Belda, Fernando; Bodas, Alejandro; Crommelynck, Dominique; Dewitte, Steven; Domenech, Carlos; Gimeno, Jaume F.; Harries, John E.; Jorge Sanchez, Joan; Pineda, Nicolau; Pino, David; Rius, Antonio; Saleh, Kauzar; Tarruella, Ramon; Velazquez, Almudena

    2004-02-01

    The main objective of the SCALES Project is to exploit the unique opportunity offered by the recent launch of the first European METEOSAT Second Generation geostationary satellite (MSG-1) to generate and validate new radiation budget and cloud products provided by the GERB (Geostationary Earth Radiation Budget) instrument. SCALES' specific objectives are: (i) definition and characterization of a large, reasonably homogeneous area compatible with the GERB pixel size (around 50 × 50 km²), (ii) validation of GERB TOA radiances and fluxes derived by means of angular distribution models, (iii) development of algorithms to estimate surface net radiation from GERB TOA measurements, and (iv) development of accurate methodologies to measure radiation flux divergence and analyze its influence on the thermal regime and dynamics of the atmosphere, also using GERB data. SCALES is highly innovative: it focuses on a new and unique space instrument and develops a new specific validation methodology for low-resolution sensors that is based on the use of a robust reference meteorological station (Valencia Anchor Station) around which 3D high-resolution meteorological fields are obtained from the MM5 Meteorological Model. During the 1st GERB Ground Validation Campaign (18th-24th June, 2003), CERES instruments on Aqua and Terra provided additional radiance measurements to support validation efforts. The CERES instruments operated in the PAPS mode (Programmable Azimuth Plane Scanning), pointed at the station. Ground measurements were taken by lidar, sun photometer, GPS precipitable water content, radiosounding ascents, Anchor Station operational meteorological measurements at 2 m and 15 m, the 4 radiation components at 2 m, and mobile stations to characterize a large area. In addition, measurements during LANDSAT overpasses on June 14th and 30th were also performed. These activities were carried out within the GIST (GERB International Science Team) framework, during the GERB Commissioning Period.

  4. Topology assisted self-organization of colloidal nanoparticles: application to 2D large-scale nanomastering

    Directory of Open Access Journals (Sweden)

    Hind Kadiri

    2014-08-01

    Our aim was to elaborate a novel method for fully controllable large-scale nanopatterning. We investigated the influence of the surface topology, i.e., a pre-pattern of hydrogen silsesquioxane (HSQ) posts, on the self-organization of polystyrene (PS) beads dispersed over a large surface. Depending on the post size and spacing, long-range ordering of the self-organized polystyrene beads is observed where guide posts were used, leading to a single-crystal structure. Topology-assisted self-organization has proved to be one of the solutions for obtaining large-scale ordering. Besides post size and spacing, the colloidal concentration and the nature of the solvent were found to have a significant effect on the self-organization of the PS beads. Scanning electron microscopy and associated Fourier transform analysis were used to characterize the morphology of the ordered surfaces. Finally, the production of silicon molds is demonstrated by using the beads as a template for dry etching.

  5. Large-Scale, Continuous-Flow Production of Stressed Biomass (Desulfovibrio vulgaris Hildenborough)

    Energy Technology Data Exchange (ETDEWEB)

    Geller, Jil T.; Borglin, Sharon E.; Fortney, Julian L.; Lam, Bonita R.; Hazen, Terry C.; Biggin, Mark D.

    2010-05-01

    The Protein Complex Analysis Project (PCAP, http://pcap.lbl.gov/) focuses on high-throughput analysis of microbial protein complexes in the anaerobic, sulfate-reducing organism Desulfovibrio vulgaris Hildenborough (DvH). Interest in DvH as a model organism for bioremediation of contaminated groundwater sites arises from its ability to reduce heavy metals. D. vulgaris has been isolated from contaminated groundwater of sites in the DOE complex. To understand the effect of environmental changes on the organism, mid-log-phase cultures are exposed to nitrate and salt stresses (at the minimum inhibitory concentration, which reduces growth rates by 50 percent), and compared to controls of cultures at mid-log and stationary phases. Large volumes of culture of consistent quality (up to 100 liters) are needed because of the relatively low cell density of DvH cultures (one order of magnitude lower than E. coli, for example) and PCAP's challenge to characterize low-abundance membrane proteins. Cultures are grown in continuous-flow stirred tank reactors (CFSTRs) to produce consistent cell densities. Stressor is added to the outflow from the CFSTR, and the mixture is pumped through a plug flow reactor (PFR) to provide a stress exposure time of 2 hours. Effluent is chilled and held in large carboys until it is centrifuged. A variety of analyses -- including metabolites, total proteins, cell density and phospholipid fatty acids -- track culture consistency within a production run, and differences due to stress exposure and growth phase for the different conditions used. With our system we are able to produce the requisite 100 L of culture for a given condition within a week.

  6. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane-segmentation intelligent workshop is a new concept, and little research has been done in related fields either domestically or abroad. The mode of production must be transformed from the existing Industry 2.0 (or partial Industry 3.0) level: from "human analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation, a great many questions of management and technology need to be settled, such as the evolution of the workshop structure, the development of intelligent equipment, and changes in the business model; together these amount to a reformation of the whole workshop. The process simulation in this project verifies the general layout and process flow of the large-scale ship plane-segmentation intelligent workshop and analyzes its working efficiency, which is significant for the next step of the transformation.

  7. Ocean Acidification Experiments in Large-Scale Mesocosms Reveal Similar Dynamics of Dissolved Organic Matter Production and Biotransformation

    Directory of Open Access Journals (Sweden)

    Maren Zark

    2017-09-01

    Dissolved organic matter (DOM) represents a major reservoir of carbon in the oceans. Environmental stressors such as ocean acidification (OA) potentially affect DOM production and degradation processes, e.g., phytoplankton exudation or microbial uptake and biotransformation of molecules. Resulting changes in the carbon storage capacity of the ocean may thus cause feedbacks on the global carbon cycle. Previous experiments studying OA effects on the DOM pool under natural conditions, however, were mostly conducted in temperate and coastal eutrophic areas. Here, we report on OA effects on the existing and newly produced DOM pool during an experiment in the subtropical North Atlantic Ocean at the Canary Islands during (1) an oligotrophic phase and (2) after simulated deep-water upwelling. The latter is a frequently occurring event in this region controlling nutrient and phytoplankton dynamics. We manipulated nine large-scale mesocosms with a gradient of pCO2 ranging from ~350 up to ~1,030 μatm and monitored the DOM molecular composition using ultrahigh-resolution mass spectrometry via Fourier-transform ion cyclotron resonance mass spectrometry (FT-ICR-MS). An increase of 37 μmol L⁻¹ DOC was observed in all mesocosms during a phytoplankton bloom induced by the simulated upwelling. Indications for enhanced DOC accumulation under elevated CO2 became apparent during a phase of nutrient recycling toward the end of the experiment. The production of DOM was reflected in changes of the molecular DOM composition. Out of the 7,212 molecular formulae, which were detected throughout the experiment, ~50% correlated significantly in mass spectrometric signal intensity with cumulative bacterial protein production (BPP) and are likely a product of microbial transformation. However, no differences in the produced compounds were found with respect to CO2 levels. Comparing the results of this experiment with a comparable OA experiment in the Swedish Gullmar Fjord, reveals
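
The reported screening, correlating each molecular formula's FT-ICR-MS signal intensity with cumulative bacterial protein production (BPP), amounts to a per-formula correlation test. A sketch with hypothetical time series (formulae, values, and the significance threshold are all illustrative):

```python
import numpy as np

def significant_correlations(intensities, bpp, threshold=0.8):
    """Pearson correlation of each formula's signal-intensity time series
    against cumulative BPP; keep formulae above an absolute threshold."""
    out = {}
    for formula, series in intensities.items():
        r = np.corrcoef(series, bpp)[0, 1]
        if abs(r) >= threshold:
            out[formula] = r
    return out

# Hypothetical cumulative BPP over 5 sampling days.
bpp = np.array([1.0, 2.0, 4.0, 7.0, 11.0])
intensities = {
    "C16H22O8": bpp * 0.5 + 3.0,                      # tracks BPP linearly
    "C20H30O4": np.array([5.0, 4.0, 5.0, 4.0, 5.0]),  # essentially flat
}
hits = significant_correlations(intensities, bpp)
```

The real analysis additionally has to control for multiple testing across thousands of formulae before calling a correlation significant.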

  8. Integrating weather and geotechnical monitoring data for assessing the stability of large scale surface mining operations

    Directory of Open Access Journals (Sweden)

    Steiakakis Chrysanthos

    2016-01-01

    Full Text Available The geotechnical challenges for safe slope design in large scale surface mining operations are enormous. Sometimes one degree of slope inclination can significantly reduce the overburden to ore ratio and therefore dramatically improve the economics of the operation, while large scale slope failures may have a significant impact on human lives. Furthermore, adverse weather conditions, such as high precipitation rates, may unfavorably affect the already delicate balance between operations and safety. Geotechnical, weather and production parameters should be systematically monitored and evaluated in order to safely operate such pits. Appropriate data management, processing and storage are critical to ensure timely and informed decisions.

  9. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    Cristina Rulli, Maria; D’Odorico, Paolo

    2014-01-01

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of a lack of modern technology. It is expected that in the long run large-scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show that up to 300–550 million people could be fed by crops grown on the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced on the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested on the acquired land could ensure food security for the local populations. (letter)

  10. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Full Text Available Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical application, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. To solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this, we exploit the abundant training samples of large-scale datasets and propose all-features-boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result, two proposed approaches on Spark for different types of large-scale datasets are available.
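    The abstract does not detail the boosting scheme, but the Discrete AdaBoost reweighting it builds on can be sketched with a simple weak learner. The decision stumps and toy dataset below are stand-ins for the RVM base classifiers, not the authors' implementation:

```python
import numpy as np

def stump_train(X, y, w):
    """Find the best threshold stump (feature, threshold, polarity) under weights w."""
    best = None
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for pol in (1, -1):
                pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
                err = w[pred != y].sum()
                if best is None or err < best[0]:
                    best = (err, j, t, pol)
    return best

def adaboost(X, y, rounds=10):
    """Discrete AdaBoost: reweight samples toward the previous round's mistakes."""
    n = len(y)
    w = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(rounds):
        err, j, t, pol = stump_train(X, y, w)
        err = max(err, 1e-10)                      # avoid log(0) on a perfect stump
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, j] - t) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)             # up-weight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, j, t, pol))
    return ensemble

def predict(ensemble, X):
    """Weighted vote of all weak classifiers."""
    score = np.zeros(len(X))
    for alpha, j, t, pol in ensemble:
        score += alpha * np.where(pol * (X[:, j] - t) >= 0, 1, -1)
    return np.where(score >= 0, 1, -1)

# Toy separable dataset, labels in {-1, +1}
X = np.array([[1., 1.], [2., 1.], [1., 2.], [4., 4.], [5., 4.], [4., 5.]])
y = np.array([-1, -1, -1, 1, 1, 1])
ens = adaboost(X, y, rounds=5)
print((predict(ens, X) == y).all())   # → True
```

    The same reweighting loop applies unchanged when the weak learner is an RVM trained on a weighted bootstrap sample, which is closer to the setting the abstract describes.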

  11. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over traditional approaches using simulations and OMICS data analysis.
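    The hierarchical model itself is not given in the abstract. As an illustration of why regularized estimators help in this setting, the sketch below contrasts the singular sample covariance in an n < p regime (typical of omics data) with a simple shrinkage-toward-identity estimator; the shrinkage intensity is hand-picked and the whole construction is a stand-in for the Bayesian approach, not the authors' method:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 20, 50                        # fewer samples than variables, as in omics data
X = rng.normal(size=(n, p))

S = np.cov(X, rowvar=False)          # sample covariance: rank <= n-1 < p, singular
target = np.trace(S) / p * np.eye(p) # scaled identity sharing the average variance

lam = 0.3                            # shrinkage intensity (hand-picked here)
S_shrunk = (1 - lam) * S + lam * target

print(np.linalg.matrix_rank(S))        # < p: cannot be inverted
print(np.linalg.matrix_rank(S_shrunk)) # == p: full rank, positive definite
```

    Every eigenvalue of the shrunk estimator is at least lam times the average variance, so it stays invertible even when the sample covariance is rank-deficient.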

  12. Gram-scale production of a basidiomycetous laccase in Aspergillus niger.

    Science.gov (United States)

    Mekmouche, Yasmina; Zhou, Simeng; Cusano, Angela M; Record, Eric; Lomascolo, Anne; Robert, Viviane; Simaan, A Jalila; Rousselot-Pailley, Pierre; Ullah, Sana; Chaspoul, Florence; Tron, Thierry

    2014-01-01

    We report on the expression in Aspergillus niger of a laccase gene we used to produce variants in Saccharomyces cerevisiae. Grams of recombinant enzyme can easily be obtained. This highlights the potential of combining this generic laccase sequence with yeast and fungal expression systems for the large-scale production of variants. Copyright © 2013 The Society for Biotechnology, Japan. Published by Elsevier B.V. All rights reserved.

  13. Productivity and production efficiency among small scale irrigated ...

    African Journals Online (AJOL)

    The study examined productivity and production efficiency among small scale irrigated sugarcane farmers in Niger State, Nigeria using a stochastic translog frontier function. Data for the study were obtained using structured questionnaires administered to 100 randomly selected sugarcane farmers from Paiko and Gurara ...

  14. Entropy Production of Emerging Turbulent Scales in a Temporal Supercritical N-Neptane/Nitrogen Three-Dimensional Mixing Layer

    Science.gov (United States)

    Bellan, J.; Okongo, N.

    2000-01-01

    A study of the entropy production of emerging turbulent scales is conducted for a supercritical shear layer, as a precursor to the eventual modeling of Subgrid Scales (from a turbulent state) leading to Large Eddy Simulations.

  15. Biopolitics problems of large-scale hydraulic engineering construction

    International Nuclear Information System (INIS)

    Romanenko, V.D.

    1997-01-01

    The 20th century, which will enter history as a century of large-scale hydraulic engineering construction, is coming to a close. On the European continent alone, 517 large reservoirs (detaining more than 1000 million km³ of water) were constructed between 1901 and 1985. In the Danube basin a large number of reservoirs for power stations, navigation sluices and other hydraulic engineering structures have been built; among them, more than 40 especially large objects are located along the main bed of the river. A number of hydro-complexes, such as Dnieper-Danube and Gabcikovo, Danube-Oder-Labe (project), Danube-Tissa, Danube-Adriatic Sea (project), Danube-Aegean Sea and Danube-Black Sea, have entered operation or are at the design stage. Hydraulic engineering construction has been especially intensive in Ukraine. On its territory several large reservoirs were constructed on the Dnieper and Yuzhny Bug, strongly changing the hydrological regime of these rivers. Summarizing the results of river-system regulation in Ukraine, more than 27 thousand ponds (3 km³ per year), 1098 reservoirs with a total volume of 55 km³, and 11 large channels with a total length of more than 2000 km and a capacity of 1000 m³/s have been created. Hydraulic engineering construction has played an important role in the development of industry and agriculture, the water supply of cities and settlements, environmental effects, and the maintenance of safe navigation on the Danube, Dnieper and other rivers. The last part of the paper discusses the environmental changes in the Aral Sea region of Middle Asia following construction of the Karakum Channel.

  16. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  17. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  18. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction; whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to be able to adequately address the complex coupled phenomena to the level of detail that is necessary.

  19. Large Scale Product Recommendation of Supermarket Ware Based on Customer Behaviour Analysis

    Directory of Open Access Journals (Sweden)

    Andreas Kanavos

    2018-05-01

    Full Text Available In this manuscript, we present a prediction model based on the behaviour of each customer using data mining techniques. The proposed model utilizes a supermarket database and an additional database from Amazon, both containing information about customers’ purchases. Subsequently, our model analyzes these data in order to classify customers as well as products, being trained and validated with real data. This model is targeted towards classifying customers according to their consuming behaviour and consequently proposes new products more likely to be purchased by them. The corresponding prediction model is intended as a tool for marketers, providing analytically targeted and specific insight into consumer behaviour. Our algorithmic framework and the subsequent implementation employ the cloud infrastructure and use the MapReduce Programming Environment, a model for processing large datasets in a parallel manner with a distributed algorithm on computer clusters, as well as Apache Spark, which is a newer framework built on the same principles as Hadoop. By applying the MapReduce model at each step of the proposed method, text processing speed and scalability are enhanced relative to traditional methods. Our results show that the proposed method predicts with high accuracy the purchases of a supermarket.
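    The map/shuffle/reduce pattern the abstract invokes can be illustrated in miniature. The pure-Python sketch below counts product co-purchases from a toy transaction log; the product names and the pairing logic are illustrative assumptions, not the paper's actual pipeline:

```python
from collections import defaultdict
from itertools import combinations

# Toy transaction log: (customer, basket). Names are illustrative only.
transactions = [
    ("c1", ["milk", "bread", "eggs"]),
    ("c2", ["milk", "bread"]),
    ("c3", ["bread", "eggs"]),
]

def map_phase(record):
    """Map: emit a ((product_a, product_b), 1) pair for every pair in a basket."""
    _, basket = record
    for a, b in combinations(sorted(basket), 2):
        yield (a, b), 1

def shuffle(mapped):
    """Shuffle: group emitted values by key."""
    groups = defaultdict(list)
    for k, v in mapped:
        groups[k].append(v)
    return groups

def reduce_phase(groups):
    """Reduce: sum the counts for each product pair."""
    return {k: sum(vs) for k, vs in groups.items()}

mapped = (kv for rec in transactions for kv in map_phase(rec))
pair_counts = reduce_phase(shuffle(mapped))
print(pair_counts[("bread", "milk")])   # → 2: bread and milk co-purchased twice
```

    On Spark the same three steps would be expressed as a `flatMap` followed by `reduceByKey`, with the shuffle handled by the cluster runtime rather than by hand.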

  20. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  1. Large scale production of antitumor cucurbitacins from Ecballium ...

    African Journals Online (AJOL)

    ajl6

    2012-08-16

    Aug 16, 2012 ... 1Department of Plant Biotechnology, National Research Center, Cairo, 12622 Egypt. ... Bioreactor plays a vital role in the commercial production of secondary metabolites .... comparing the peak area with that at the same retention time with ... air dried by rotatory evaporator and then extracted using ethanol:.

  2. Production of black holes in TeV-scale gravity

    International Nuclear Information System (INIS)

    Ringwald, A.

    2003-01-01

    Copious production of microscopic black holes is one of the least model-dependent predictions of TeV-scale gravity scenarios. We review the arguments behind this assertion and discuss opportunities to track the striking associated signatures in the near future. These include searches at neutrino telescopes, such as AMANDA and RICE, at cosmic ray air shower facilities, such as the Pierre Auger Observatory, and at colliders, such as the Large Hadron Collider. (Abstract Copyright [2003], Wiley Periodicals, Inc.)

  3. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  4. Nutrient removal from Chinese coastal waters by large-scale seaweed aquaculture

    KAUST Repository

    Xiao, Xi

    2017-04-21

    China is facing intense coastal eutrophication. Large-scale seaweed aquaculture in China is popular, now accounting for over two-thirds of global production. Here, we estimate the nutrient removal capability of large-scale Chinese seaweed farms to determine its significance in mitigating eutrophication. We combined estimates of yield and nutrient concentration of Chinese seaweed aquaculture to quantify that one hectare of seaweed aquaculture removes the equivalent nutrient inputs entering 17.8 ha for nitrogen and 126.7 ha for phosphorus of Chinese coastal waters, respectively. Chinese seaweed aquaculture annually removes approximately 75,000 t nitrogen and 9,500 t phosphorus. Whereas removal of the total N inputs to Chinese coastal waters requires a seaweed farming area 17 times larger than the extant area, one and a half times more of the seaweed area would be able to remove close to 100% of the P inputs. With the current growth rate of seaweed aquaculture, we project this industry will remove 100% of the current phosphorus inputs to Chinese coastal waters by 2026. Hence, seaweed aquaculture already plays a hitherto unrealized role in mitigating coastal eutrophication, a role that may be greatly expanded with future growth of seaweed aquaculture.

  5. Nutrient removal from Chinese coastal waters by large-scale seaweed aquaculture

    KAUST Repository

    Xiao, Xi; Agusti, Susana; Lin, Fang; Li, Ke; Pan, Yaoru; Yu, Yan; Zheng, Yuhan; Wu, Jiaping; Duarte, Carlos M.

    2017-01-01

    China is facing intense coastal eutrophication. Large-scale seaweed aquaculture in China is popular, now accounting for over two-thirds of global production. Here, we estimate the nutrient removal capability of large-scale Chinese seaweed farms to determine its significance in mitigating eutrophication. We combined estimates of yield and nutrient concentration of Chinese seaweed aquaculture to quantify that one hectare of seaweed aquaculture removes the equivalent nutrient inputs entering 17.8 ha for nitrogen and 126.7 ha for phosphorus of Chinese coastal waters, respectively. Chinese seaweed aquaculture annually removes approximately 75,000 t nitrogen and 9,500 t phosphorus. Whereas removal of the total N inputs to Chinese coastal waters requires a seaweed farming area 17 times larger than the extant area, one and a half times more of the seaweed area would be able to remove close to 100% of the P inputs. With the current growth rate of seaweed aquaculture, we project this industry will remove 100% of the current phosphorus inputs to Chinese coastal waters by 2026. Hence, seaweed aquaculture already plays a hitherto unrealized role in mitigating coastal eutrophication, a role that may be greatly expanded with future growth of seaweed aquaculture.
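    Taken at face value, the abstract's area figures imply the fraction of current nutrient inputs the extant farms already remove. The arithmetic below assumes "one and a half times more of the seaweed area" means 2.5 times the extant area; that reading is an interpretation, not stated explicitly:

```python
# Figures from the abstract, expressed as multiples of the extant farm area.
area_needed_N = 17.0   # farm area (x extant) needed to remove 100% of N inputs
area_needed_P = 2.5    # "one and a half times more" read as 1 + 1.5 = 2.5x extant

# Assuming removal scales linearly with farmed area, the extant farms remove:
frac_N_removed_now = 1 / area_needed_N   # fraction of nitrogen inputs
frac_P_removed_now = 1 / area_needed_P   # fraction of phosphorus inputs

print(round(frac_N_removed_now * 100, 1))  # → 5.9  (percent of N inputs)
print(round(frac_P_removed_now * 100, 1))  # → 40.0 (percent of P inputs)
```

    The roughly 40% figure for phosphorus is what makes the projected 100% removal by 2026 plausible under continued growth, whereas nitrogen removal starts from a much lower base.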

  6. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Full Text Available Scaled-down models are widely used for experimental investigations of large structures due to limitations in the capacities of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e. geometry, loading and properties) between a model and a large structural element such as those present in huge existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, representing the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is made, and the results are compared with those obtained from numerical computations on the full-scale structure. The effect of scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
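    As a minimal illustration of the similitude reasoning (not the paper's actual model), the sketch below picks a geometric scale factor and derives the load scale that keeps stresses identical between model and prototype; the load and cross-section values are assumed:

```python
# Elastic similitude sketch (same material in model and prototype).
# With three base dimensions (mass, length, time), choosing a geometric
# scale fixes how loads must scale to preserve stress: since stress = F/A
# and areas scale with length squared, scaling F by the area ratio keeps
# stress unchanged. All numbers below are illustrative assumptions.

lam = 1 / 10                 # length scale: a 1:10 model
area_scale = lam ** 2        # areas scale with length squared
force_scale = area_scale     # scale loads with area -> stress scale = 1

F_prototype = 5.0e6          # N, assumed service load on the full structure
A_prototype = 0.25           # m^2, assumed load-bearing cross-section

F_model = F_prototype * force_scale
A_model = A_prototype * area_scale

stress_prototype = F_prototype / A_prototype
stress_model = F_model / A_model
print(abs(stress_model - stress_prototype) < 1e-6)   # → True: stresses match
```

    With identical stresses and the same material, measured strains on the model transfer directly to the prototype, which is what makes the scaled experiment meaningful.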

  7. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: • Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. • The preparation is simple, effective and eco-friendly. • The in situ yielded MgO nanocrystals promote graphitization. • The HGCNSs exhibit superior electrochemical performance to graphite.

  8. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, it is noted that large-scale impact events continue to affect the local geology of the planets. 92 references

  9. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  10. Optimizing in vitro large scale production of giant reed (Arundo donax L.) by liquid medium culture

    International Nuclear Information System (INIS)

    Cavallaro, Valeria; Patanè, Cristina; Cosentino, Salvatore L.; Di Silvestro, Isabella; Copani, Venera

    2014-01-01

    Tissue culture methods offer the potential for large-scale propagation of giant reed (Arundo donax L.), a promising energy biomass crop. In previous trials, giant reed proved particularly suitable for in vitro culture. In this paper, with the final goal of enhancing the efficiency of the in vitro production process and reducing costs, the influence of four different culture media (agar- or gellan-gum-solidified medium, and liquid medium in a temporary immersion system (RITA®) or in a stationary state) on in vitro shoot proliferation of giant reed was evaluated. Giant reed exhibited a particular sensitivity to gelling agents during the phase of secondary shoot formation. Gellan gum, as compared to agar, improved the efficiency of in vitro culture, giving more shoots with higher mean fresh and dry weight. Moreover, cultivation of this species in a liquid medium, under temporary immersion conditions or in a stationary state, was as effective as, and cheaper than, cultivation on a gellan-gum medium. Increasing 6-benzylaminopurine (BA) up to 4 mg l⁻¹ also resulted in a further enhancement of secondary shoot proliferation. The good adaptability of this species to liquid medium and the high multiplication rates observed indicate the possibility of obtaining from a single node at least 1200 plantlets every six multiplication cycles (about 6 months), a number 100-fold higher than that obtained yearly per plant by conventional methods of vegetative multiplication. In the open field, micropropagated plantlets guaranteed a higher number of surviving plants, secondary stems and above-ground biomass as compared to rhizome-derived ones. - Highlights: • In vitro propagation offers the potential for large-scale propagation of giant reed. • The success of an in vitro protocol depends on the rate and mode of shoot proliferation. • Substituting liquid media for solid ones may decrease propagation costs in Arundo donax. • Giant reed showed good proliferation rates in

  11. Large-scale heat pumps in sustainable energy systems: System and project perspectives

    Directory of Open Access Journals (Sweden)

    Blarke Morten B.

    2007-01-01

    Full Text Available This paper shows that, in support of its ability to improve the overall economic cost-effectiveness and flexibility of the Danish energy system, the financially feasible integration of large-scale heat pumps (HP) with existing combined heat and power (CHP) plants is critically sensitive to the operational mode of the HP vis-à-vis the operational coefficient of performance, mainly given by the temperature level of the heat source. When using a ground source as the low-temperature heat source, heat production costs increase by about 10%, while partial use of condensed flue gases as the low-temperature heat source results in an 8% cost reduction. Furthermore, the analysis shows that when a large-scale HP is integrated with an existing CHP plant, the projected spot market situation in The Nordic Power Exchange (Nord Pool) towards 2025, which reflects a growing share of wind power and heat-supply-constrained power generation, further reduces the operational hours of the CHP unit over time, while increasing the operational hours of the HP unit. As a result, an HP unit with half the heat production capacity of the CHP unit, in combination with a heat-only boiler, represents a possibly financially feasible alternative to CHP operation, rather than a supplement to CHP unit operation. While such a revised operational strategy would have impacts on policies to promote co-generation, these results indicate that the integration of large-scale HP may jeopardize efforts to promote co-generation. Policy instruments should be designed to promote the integration of HPs with lower than half of the heating capacity of the CHP unit. It is also found that CHP-HP plant designs should allow for the utilization of heat recovered from the CHP unit’s flue gases for both concurrent (CHP unit and HP unit) and independent operation (HP unit only). For independent operation, the recovered heat is required to be stored.
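    The sensitivity to heat-source temperature described above follows from the coefficient of performance (COP). The sketch below uses a Carnot-based COP estimate with an assumed 50% Carnot efficiency, an assumed 75 °C supply temperature and an assumed electricity price, purely to illustrate the direction of the effect, not the paper's figures:

```python
# Illustrative only: how heat-source temperature drives heat-pump heat cost.
def cop_estimate(t_sink_c, t_source_c, carnot_eff=0.5):
    """Carnot COP scaled by a typical real-world efficiency factor (assumed)."""
    t_sink = t_sink_c + 273.15      # convert to kelvin
    t_source = t_source_c + 273.15
    return carnot_eff * t_sink / (t_sink - t_source)

elec_price = 60.0  # EUR per MWh of electricity (assumed)

for label, t_src in [("ground source", 8.0), ("condensed flue gas", 45.0)]:
    cop = cop_estimate(75.0, t_src)
    heat_cost = elec_price / cop    # EUR per MWh of delivered heat
    print(f"{label}: COP {cop:.1f}, heat cost {heat_cost:.0f} EUR/MWh")
```

    The warmer flue-gas source narrows the temperature lift, raising the COP and lowering the cost per MWh of heat, which matches the direction of the cost differences reported in the abstract.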

  12. Evaluation of the social and economical impacts related to the large scale production of bioethanol in Brazil; Avaliacao dos impactos socioeconomicos relacionados a producao em larga escala do bioetanol no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2009-10-15

    The social and economic impacts related to large-scale bioethanol production have been evaluated considering the replacement of 10% of the world's gasoline-equivalent consumption forecast for the year 2025, evaluating the impacts not only on the sectors directly involved (bioethanol and sugar cane production) but also taking into account the effects on the entire production chain of the economy (direct, indirect and induced effects). For this analysis, an input-output model was developed, allowing the simulation of productivity gains during the agricultural phase and the combination of different technologies for bioethanol production, in order to quantify the impacts of advancing second-generation technology.

  13. Quality Assurance in Large Scale Online Course Production

    Science.gov (United States)

    Holsombach-Ebner, Cinda

    2013-01-01

    The course design and development process (often referred to here as the "production process") at Embry-Riddle Aeronautical University (ERAU-Worldwide) aims to produce turnkey style courses to be taught by a highly-qualified pool of over 800 instructors. Given the high number of online courses and tremendous number of live sections…

  14. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, represents enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For evacuation of casualties, a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  15. Novel approach for extinguishing large-scale coal fires using gas-liquid foams in open pit mines.

    Science.gov (United States)

    Lu, Xinxiao; Wang, Deming; Qin, Botao; Tian, Fuchao; Shi, Guangyi; Dong, Shuaijun

    2015-12-01

    Coal fires are a serious threat to worker safety and production in open pit mines. Coal fire sources are hidden and numerous, and large cavities are prevalent in the coal seam after the coal has burned, making conventional extinguishing technology difficult to apply. Foams are considered an efficient means of fire extinguishment in these large-scale workplaces. A novel foam preparation method is introduced, and an original cavitation jet device is designed to add foaming agent stably. Jet cavitation occurs when the water flow rate and pressure ratio reach specified values. Through a self-built foaming system, high-performance foams are produced and then infused into the blast drilling holes at a large flow rate. Requiring no complicated operation, this system is found to be very suitable for extinguishing large-scale coal fires. Field application shows that foam generation adopting the proposed key technology achieves a good extinguishing effect. The temperature reduction using foams is 6-7 times greater than with water, and the CO concentration in the drilling hole is reduced from 9.43 to 0.092‰. The coal fires were controlled successfully in open pit mines, ensuring normal production as well as the safety of personnel and equipment.

  16. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, 'Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large-scale structures is considered within a model with a string on a toroidal space-time. First, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large-scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large-scale distribution and with the theory of a Cantorian space-time.

  17. An Automated Approach to Map Winter Cropped Area of Smallholder Farms across Large Scales Using MODIS Imagery

    Directory of Open Access Journals (Sweden)

    Meha Jain

    2017-06-01

    Full Text Available Fine-scale agricultural statistics are an important tool for understanding trends in food production and their associated drivers, yet these data are rarely collected in smallholder systems. These statistics are particularly important for smallholder systems given the large amount of fine-scale heterogeneity in production that occurs in these regions. To overcome the lack of ground data, satellite data are often used to map fine-scale agricultural statistics. However, doing so is challenging for smallholder systems because of (1) complex sub-pixel heterogeneity; (2) little to no available calibration data; and (3) high amounts of cloud cover, as most smallholder systems occur in the tropics. We develop an automated method termed the MODIS Scaling Approach (MSA) to map smallholder cropped area across large spatial and temporal scales using MODIS Enhanced Vegetation Index (EVI) satellite data. We use this method to map winter cropped area, a key measure of cropping intensity, across the Indian subcontinent annually from 2000–2001 to 2015–2016. The MSA defines a pixel as cropped based on winter growing season phenology and scales the percent of cropped area within a single MODIS pixel based on observed EVI values at peak phenology. We validated the result with eleven high-resolution scenes (spatial scale of 5 × 5 m² or finer) that we classified into cropped versus non-cropped maps using training data collected by visual inspection of the high-resolution imagery. The MSA had moderate to high accuracies when validated using these eleven scenes across India (R² ranging between 0.19 and 0.89, with an overall R² of 0.71 across all sites). This method requires no calibration data, making it easy to implement across large spatial and temporal scales, with 100% spatial coverage due to the compositing of EVI to generate cloud-free data sets. The accuracies found in this study are similar to those of other studies that map crop production using automated methods
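
    The scaling step described in the record above can be illustrated with a toy sketch: a pixel counts as cropped when its winter-season EVI peak clears a threshold, and the within-pixel cropped fraction is scaled linearly with that peak. The function name and the calibration bounds (`evi_min`, `evi_max`) here are hypothetical illustrations, not the MSA's published parameters.

```python
import numpy as np

def winter_cropped_fraction(evi, winter_mask, evi_min=0.35, evi_max=0.70):
    """Illustrative sketch of the MSA idea: a pixel is 'cropped' if its EVI
    peaks during the winter growing season, and the cropped fraction within
    the pixel scales linearly with the peak EVI.

    evi         : (n_dates, n_pixels) array of EVI composites
    winter_mask : boolean (n_dates,) mask selecting winter-season dates
    evi_min/max : hypothetical calibration bounds, NOT the paper's values
    """
    peak = evi[winter_mask].max(axis=0)               # peak winter EVI per pixel
    frac = np.clip((peak - evi_min) / (evi_max - evi_min), 0.0, 1.0)
    return np.where(peak >= evi_min, frac, 0.0)       # below threshold -> not cropped

# Two pixels, three composite dates; only pixel 0 greens up in winter.
evi = np.array([[0.20, 0.30],
                [0.70, 0.30],
                [0.25, 0.25]])
winter = np.array([False, True, True])
print(winter_cropped_fraction(evi, winter))  # [1. 0.]
```

    In the real method the threshold and scaling come from the observed phenology itself, which is why no external calibration data are needed.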

  18. Large-Scale Sentinel-1 Processing for Solid Earth Science and Urgent Response using Cloud Computing and Machine Learning

    Science.gov (United States)

    Hua, H.; Owen, S. E.; Yun, S. H.; Agram, P. S.; Manipon, G.; Starch, M.; Sacco, G. F.; Bue, B. D.; Dang, L. B.; Linick, J. P.; Malarout, N.; Rosen, P. A.; Fielding, E. J.; Lundgren, P.; Moore, A. W.; Liu, Z.; Farr, T.; Webb, F.; Simons, M.; Gurrola, E. M.

    2017-12-01

    With the increased availability of open SAR data (e.g. Sentinel-1 A/B), new challenges are being faced in processing and analyzing the voluminous SAR datasets to make geodetic measurements. Upcoming SAR missions such as NISAR are expected to generate close to 100TB per day. The Advanced Rapid Imaging and Analysis (ARIA) project can now generate geocoded unwrapped phase and coherence products from Sentinel-1 TOPS mode data in an automated fashion, using the ISCE software. This capability is currently being exercised on various study sites across the United States and around the globe, including Hawaii, Central California, Iceland and South America. The automated and large-scale SAR data processing and analysis capabilities use cloud computing techniques to speed the computations and provide scalable processing power and storage. Aspects such as how to process these voluminous SLCs and interferograms at global scales and how to keep up with the large daily SAR data volumes are being explored. Scene-partitioning approaches in the processing pipeline help in handling global-scale processing up to unwrapped interferograms, with stitching done at a late stage. We have built an advanced science data system with rapid search functions to enable access to the derived data products. Rapid image processing of Sentinel-1 data to interferograms and time series is already being applied to natural hazards including earthquakes, floods, volcanic eruptions, and land subsidence due to fluid withdrawal. We will present the status of the ARIA science data system for generating science-ready data products and the challenges that arise from processing SAR datasets to derived time series data products at large scales. For example, how do we perform large-scale data quality screening on interferograms? What approaches can be used to minimize compute, storage, and data movement costs for time series analysis in the cloud? We will also

  19. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  20. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  1. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1,125,000 elements in 2D and 128,000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  2. Spontaneous large-scale autolysis in Clostridium acetobutylicum contributes to generation of more spores

    Directory of Open Access Journals (Sweden)

    Zhen eLiu

    2015-09-01

    Full Text Available Autolysis is a widespread phenomenon in bacteria. In batch fermentation of Clostridium acetobutylicum ATCC 824, there is a spontaneous large-scale autolysis phenomenon with a significant decrease of cell density immediately after the exponential phase. To unravel the role of autolysis, an autolysin-coding gene, CA_C0554, was disrupted using the ClosTron system to obtain the mutant C. acetobutylicum lyc::int(72). The lower final cell density and faster cell density decrease rate of C. acetobutylicum ATCC 824 compared with C. acetobutylicum lyc::int(72) indicate that CA_C0554 is an important but not the sole autolysin-coding gene responsible for the large-scale autolysis. Similar glucose utilization and solvent production but obviously lower cell density of C. acetobutylicum ATCC 824 compared with C. acetobutylicum lyc::int(72) suggest that lysed C. acetobutylicum ATCC 824 cells were metabolically inactive. In contrast, the spore density of C. acetobutylicum ATCC 824 was 26.1% higher than that of C. acetobutylicum lyc::int(72) in the final culture broth of batch fermentation. We speculate that spontaneous autolysis of metabolically inactive cells provided nutrients for the sporulating cells. The present study suggests that one important biological role of spontaneous large-scale autolysis in C. acetobutylicum ATCC 824 batch fermentation is contributing to the generation of more spores during sporulation.

  3. Spontaneous large-scale autolysis in Clostridium acetobutylicum contributes to generation of more spores.

    Science.gov (United States)

    Liu, Zhen; Qiao, Kai; Tian, Lei; Zhang, Quan; Liu, Zi-Yong; Li, Fu-Li

    2015-01-01

    Autolysis is a widespread phenomenon in bacteria. In batch fermentation of Clostridium acetobutylicum ATCC 824, there is a spontaneous large-scale autolysis phenomenon with a significant decrease of cell density immediately after the exponential phase. To unravel the role of autolysis, an autolysin-coding gene, CA_C0554, was disrupted using the ClosTron system to obtain the mutant C. acetobutylicum lyc::int(72). The lower final cell density and faster cell density decrease rate of C. acetobutylicum ATCC 824 compared with C. acetobutylicum lyc::int(72) indicate that CA_C0554 is an important but not the sole autolysin-coding gene responsible for the large-scale autolysis. Similar glucose utilization and solvent production but obviously lower cell density of C. acetobutylicum ATCC 824 compared with C. acetobutylicum lyc::int(72) suggest that lysed C. acetobutylicum ATCC 824 cells were metabolically inactive. In contrast, the spore density of C. acetobutylicum ATCC 824 was 26.1% higher than that of C. acetobutylicum lyc::int(72) in the final culture broth of batch fermentation. We speculate that spontaneous autolysis of metabolically inactive cells provided nutrients for the sporulating cells. The present study suggests that one important biological role of spontaneous large-scale autolysis in C. acetobutylicum ATCC 824 batch fermentation is contributing to the generation of more spores during sporulation.

  4. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  5. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used and showed an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale showed performance equal to that of other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
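
    The scoring rule in the abstract above is simple enough to sketch directly; the function names below are hypothetical, but the three items and the ≥2 cut point are as published.

```python
def pass_score(loc_questions_abnormal: bool,
               gaze_palsy: bool,
               arm_weakness: bool) -> int:
    """Prehospital Acute Stroke Severity (PASS) score: one point for each
    abnormal NIHSS-derived item (level-of-consciousness month/age questions,
    gaze palsy/deviation, arm weakness)."""
    return int(loc_questions_abnormal) + int(gaze_palsy) + int(arm_weakness)

def suspect_elvo(score: int) -> bool:
    """Apply the published cut point: >= 2 abnormal items suggests ELVO."""
    return score >= 2

# A patient with gaze deviation and arm weakness but intact LOC questions:
print(suspect_elvo(pass_score(False, True, True)))  # True
```

    A three-item binary sum is exactly what makes the scale practical in the field: no lookup tables and no partial scores to remember.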

  6. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogeneous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real life example; the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource constraint project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...

  7. Experience with LHC Magnets from Prototyping to Large Scale Industrial Production and Integration

    CERN Multimedia

    Rossi, L

    2004-01-01

    The construction of the LHC superconducting magnets is approaching the halfway point. At the end of 2003, main dipole cold masses for more than one octant had been delivered; meanwhile the winding for the second octant was almost completed. The other large magnets, like the main quadrupoles and the insertion quadrupoles, have entered series production as well. Providing more than 20 km of superconducting magnets, with the quality required for an accelerator like the LHC, is an unprecedented challenge in terms of complexity that has required many steps, from the construction of 1 meter-long magnets in the laboratory to today's production of more than one 15 meter-long magnet per day in industry. The work and its organization are made even more complex by the fact that CERN supplies most of the critical components and part of the main tooling to the magnet manufacturers, both for cost reduction and for quality issues. In this paper the critical aspects of the construction will be reviewed and the actual ...

  8. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Full Text Available Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  9. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  10. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  11. Large-scale continuous process to vitrify nuclear defense waste: operating experience with nonradioactive waste

    International Nuclear Information System (INIS)

    Cosper, M.B.; Randall, C.T.; Traverso, G.M.

    1982-01-01

    The developmental program underway at SRL has demonstrated the vitrification process proposed for the sludge processing facility of the DWPF on a large scale. DWPF design criteria for production rate, equipment lifetime, and operability have all been met. The expected authorization and construction of the DWPF will result in the safe and permanent immobilization of a major quantity of existing high level waste. 11 figures, 4 tables

  12. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction procedure and the affinity propagation algorithm. A memory-shared architecture is used to construct the similarity matrix, and a distributed system is taken for the affinity propagation algorithm because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
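
    The paper's parallel code is not reproduced in the record, but the message-passing updates it distributes are the standard affinity propagation iterations. A minimal serial NumPy reference for those update rules (with a hypothetical `preference` value controlling cluster granularity) might look like:

```python
import numpy as np

def affinity_propagation(S, damping=0.7, iters=200):
    """Minimal dense affinity propagation: iterate responsibility and
    availability messages; exemplars are points whose self-messages sum > 0.
    The paper parallelizes exactly these per-row/per-column updates; this
    serial version is only a reference for the update rules."""
    n = S.shape[0]
    R = np.zeros((n, n))
    A = np.zeros((n, n))
    for _ in range(iters):
        # Responsibilities: r(i,k) = s(i,k) - max_{k' != k} [a(i,k') + s(i,k')]
        AS = A + S
        top = AS.argmax(axis=1)
        first = AS[np.arange(n), top]
        AS[np.arange(n), top] = -np.inf
        second = AS.max(axis=1)
        Rnew = S - first[:, None]
        Rnew[np.arange(n), top] = S[np.arange(n), top] - second
        R = damping * R + (1 - damping) * Rnew
        # Availabilities: a(i,k) = min(0, r(k,k) + sum_{i' not in {i,k}} max(0, r(i',k)))
        Rp = np.maximum(R, 0)
        np.fill_diagonal(Rp, R.diagonal())
        Anew = Rp.sum(axis=0)[None, :] - Rp
        diag = Anew.diagonal().copy()
        Anew = np.minimum(Anew, 0)
        np.fill_diagonal(Anew, diag)
        A = damping * A + (1 - damping) * Anew
    exemplars = np.flatnonzero((A + R).diagonal() > 0)
    labels = S[:, exemplars].argmax(axis=1)
    return exemplars, labels

# Two well-separated point clouds; similarity = negative squared distance.
pts = np.array([[0, 0], [0, 1], [1, 0], [10, 10], [10, 11], [11, 10]], float)
S = -((pts[:, None, :] - pts[None, :, :]) ** 2).sum(-1)
np.fill_diagonal(S, -10.0)  # hypothetical preference; smaller -> fewer clusters
exemplars, labels = affinity_propagation(S)
```

    The O(n²) similarity matrix `S` is what the memory-shared architecture builds, while the distributed system carries the R/A message updates; damping is the usual safeguard against oscillation.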

  13. Position Paper on Jatropha curcas. State of the Art Small and Large Scale Project Development

    Energy Technology Data Exchange (ETDEWEB)

    Daey Ouwens, K.; Franken, Y.J.; Rijssenbeek, W. [Fuels from Agriculture in Communal Technology FACT, Eindhoven (Netherlands); Francis, G. [University of Hohenheim, Hohenheim (Germany); Riedacker, A. [French National Institute for Agricultural Research INRA, Paris (France); Foidl, N.; Jongschaap, R.; Bindraban, P. [Plant Research International PRI, Wageningen (Netherlands)

    2007-06-15

    Much information was collected during the Seminar on Jatropha held in Wageningen, Netherlands, in March 2007, and is summarized in this paper. Much research is still necessary to improve yield, to allow the use of biological products such as oil cake as animal fodder, etc. Well-documented yield data are still scarce. Cooperation with research institutions is therefore recommended. At this stage it is still particularly important to distinguish between reality, promises and dangerous extrapolations. To avoid spectacular and regrettable failures and waste of money for investors, as well as great disappointment among local populations, promoters of large-scale plantations are invited to adopt stepwise approaches: large-scale plantations should only be considered after some 4 to 5 years of obtaining experimental data (annual seed yield and oil yield, economic viability, etc.) from a sufficient number of small-scale experimental plots (about 1 ha) covering the whole range of soil and climatic conditions of such projects.

  14. Large scale Brownian dynamics of confined suspensions of rigid particles

    Science.gov (United States)

    Sprinkle, Brennan; Balboa Usabiaga, Florencio; Patankar, Neelesh A.; Donev, Aleksandar

    2017-12-01

    We introduce methods for large-scale Brownian Dynamics (BD) simulation of many rigid particles of arbitrary shape suspended in a fluctuating fluid. Our method adds Brownian motion to the rigid multiblob method [F. Balboa Usabiaga et al., Commun. Appl. Math. Comput. Sci. 11(2), 217-296 (2016)] at a cost comparable to the cost of deterministic simulations. We demonstrate that we can efficiently generate deterministic and random displacements for many particles using preconditioned Krylov iterative methods, if kernel methods to efficiently compute the action of the Rotne-Prager-Yamakawa (RPY) mobility matrix and its "square" root are available for the given boundary conditions. These kernel operations can be computed with near linear scaling for periodic domains using the positively split Ewald method. Here we study particles partially confined by gravity above a no-slip bottom wall using a graphical processing unit implementation of the mobility matrix-vector product, combined with a preconditioned Lanczos iteration for generating Brownian displacements. We address a major challenge in large-scale BD simulations, capturing the stochastic drift term that arises because of the configuration-dependent mobility. Unlike the widely used Fixman midpoint scheme, our methods utilize random finite differences and do not require the solution of resistance problems or the computation of the action of the inverse square root of the RPY mobility matrix. We construct two temporal schemes which are viable for large-scale simulations, an Euler-Maruyama traction scheme and a trapezoidal slip scheme, which minimize the number of mobility problems to be solved per time step while capturing the required stochastic drift terms. We validate and compare these schemes numerically by modeling suspensions of boomerang-shaped particles sedimented near a bottom wall. 
Using the trapezoidal scheme, we investigate the steady-state active motion in dense suspensions of confined microrollers, whose
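
    The random finite difference (RFD) idea mentioned above — estimating the stochastic drift kT ∇·M without analytic derivatives or resistance solves — can be sketched in one dimension with a scalar mobility. The mobility function and all parameter values here are hypothetical; the real method applies the same identity to the full RPY mobility of many rigid bodies.

```python
import numpy as np

rng = np.random.default_rng(0)
kT, dt, delta = 1.0, 1e-3, 1e-4  # hypothetical units and step sizes

def mobility(x):
    # Hypothetical configuration-dependent scalar mobility (e.g. mimicking
    # hindered motion near a bottom wall).
    return 1.0 + 0.5 * np.tanh(x)

def rfd_drift(x):
    """Random finite difference estimate of the drift kT * dM/dx:
    E[(M(x + d/2 W) - M(x - d/2 W)) W / d] = dM/dx for W ~ N(0,1)."""
    W = rng.standard_normal()
    return kT * (mobility(x + 0.5 * delta * W)
                 - mobility(x - 0.5 * delta * W)) * W / delta

def em_step(x, force):
    """One Euler-Maruyama step: deterministic, Brownian, and RFD drift parts."""
    M = mobility(x)
    return (x + dt * M * force
              + np.sqrt(2.0 * kT * dt * M) * rng.standard_normal()
              + dt * rfd_drift(x))
```

    Averaged over realizations, `rfd_drift` converges to kT dM/dx, which is why no Fixman midpoint or inverse-square-root mobility computation is needed.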

  15. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationships between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
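
    The selection rule described above — pick a graph style from the nodal density in each display area — can be sketched as a simple dispatch. The style names and density thresholds below are hypothetical, not the paper's.

```python
def choose_style(node_count: int, area_px: float,
                 dense: float = 1e-3, sparse: float = 1e-5) -> str:
    """Pick a rendering style for a region from its nodal density
    (nodes per pixel). Thresholds are illustrative only."""
    density = node_count / area_px
    if density > dense:
        return "aggregated"   # collapse subtrees; drawing every link would overlap
    if density > sparse:
        return "compact"      # abbreviated node glyphs, bundled links
    return "node-link"        # full node-link diagram with labels

print(choose_style(5000, 1e6))  # dense region  -> "aggregated"
print(choose_style(50, 1e6))    # medium region -> "compact"
print(choose_style(5, 1e6))     # sparse region -> "node-link"
```

    Evaluating the rule per display region, rather than once globally, is what lets dense subtrees collapse while sparse ones remain fully drawn.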

  16. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  17. Optimization of large-scale fabrication of dielectric elastomer transducers

    DEFF Research Database (Denmark)

    Hassouneh, Suzan Sager

    Dielectric elastomers (DEs) have gained substantial ground in many different applications, such as wave energy harvesting, valves and loudspeakers. For DE technology to be commercially viable, it is necessary that any large-scale production operation is nondestructive, efficient and cheap. Danfoss......-strength laminates to perform as monolithic elements. For the front-to-back and front-to-front configurations, conductive elastomers were utilised. One approach involved adding the cheap and conductive filler, exfoliated graphite (EG) to a PDMS matrix to increase dielectric permittivity. The results showed that even...... as conductive adhesives were rejected. Dielectric properties below the percolation threshold were subsequently investigated, in order to conclude the study. In order to avoid destroying the network structure, carbon nanotubes (CNTs) were used as fillers during the preparation of the conductive elastomers...

  18. Water Resources Implications of Cellulosic Biofuel Production at a Regional Scale

    Science.gov (United States)

    Christopher, S. F.; Schoenholtz, S. H.; Nettles, J. E.

    2011-12-01

    Recent increases in oil prices, a strong national interest in greater energy independence, and concern over the role of fossil fuels in global climate change have led to a dramatic expansion in the use of alternative renewable energy sources in the U.S. The U.S. government has mandated production of 36 billion gallons of renewable fuels by 2022, of which 16 billion gallons are required to be cellulosic biofuels. Production of cellulosic biomass offers a promising alternative to corn-based systems because large-scale production of corn-based ethanol often requires irrigation and is associated with increased erosion, excess sediment export, and enhanced leaching of nitrogen and phosphorus. Although cultivation of switchgrass using standard agricultural practices is one option being considered for production of cellulosic biomass, intercropping cellulosic biofuel crops within managed forests could provide feedstock without primary land use change or the water quality impacts associated with annual crops. Catchlight Energy LLC is examining the feasibility and sustainability of intercropping switchgrass in loblolly pine plantations in the southeastern U.S. Ongoing research is determining efficient operational techniques and the information needed to evaluate effects of these practices on water resources in small watershed-scale (~25 ha) studies. Three sets of four to five sub-watersheds are fully instrumented and currently collecting calibration data in North Carolina, Alabama, and Mississippi. These watershed studies will provide detailed information to understand processes and guide management decisions. However, the environmental implications of cellulosic systems need to be examined at a regional scale. We used the Soil and Water Assessment Tool (SWAT), a physically-based hydrologic model, to examine water quantity effects of various land use change scenarios ranging from switchgrass intercropping on a small percentage of managed pine forest land to conversion of all managed

  19. African aerosol and large-scale precipitation variability over West Africa

    International Nuclear Information System (INIS)

    Huang Jingfeng; Zhang Chidong; Prospero, Joseph M

    2009-01-01

    We investigated the large-scale connection between African aerosol and precipitation in the West African Monsoon (WAM) region using 8-year (2000-2007) monthly and daily Moderate Resolution Imaging Spectroradiometer (MODIS) aerosol products (aerosol optical depth, fine mode fraction) and Tropical Rainfall Measuring Mission (TRMM) precipitation and rain type. These high-quality data further confirmed our previous results that the large-scale link between aerosol and precipitation in this region undergoes distinct seasonal and spatial variability. Previously detected suppression of precipitation during months of high aerosol concentration occurs in both convective and stratiform rain, but not systematically in shallow rain. This suggests the suppression of deep convection due to the aerosol. Based on the seasonal cycle of dust and smoke and their geographical distribution, our data suggest that both dust (coarse mode aerosol) and smoke (fine mode aerosol) contribute to the precipitation suppression. However, the dust effect is evident over the Gulf of Guinea while the smoke effect is evident over both land and ocean. A back trajectory analysis further demonstrates that the precipitation reduction is statistically linked to the upwind aerosol concentration. This study suggests that African aerosol outbreaks in the WAM region can influence precipitation in the local monsoon system which has direct societal impact on the local community. It calls for more systematic investigations to determine the modulating mechanisms using both observational and modeling approaches.

  20. An innovative large scale integration of silicon nanowire-based field effect transistors

    Science.gov (United States)

    Legallais, M.; Nguyen, T. T. T.; Mouis, M.; Salem, B.; Robin, E.; Chenevier, P.; Ternon, C.

    2018-05-01

    Since the early 2000s, silicon nanowire field effect transistors have emerged as ultrasensitive biosensors offering label-free, portable and rapid detection. Nevertheless, their large-scale production remains an ongoing challenge because the required technology is time-consuming, complex and costly. In order to bypass these issues, we report here on the first integration of silicon nanowire networks, called nanonets, into long-channel field effect transistors using a standard microelectronic process. Special attention is paid to the silicidation of the contacts, which involves a large number of SiNWs. The electrical characteristics of these FETs, built from randomly oriented silicon nanowires, are also studied. Compatibility with back-end integration on CMOS readout circuitry and promising electrical performance open new opportunities for sensing applications.

  1. Large-scale CO2 storage — Is it feasible?

    Directory of Open Access Journals (Sweden)

    Johansen H.

    2013-06-01

    CCS is generally estimated to have to account for about 20% of the reduction of CO2 emissions to the atmosphere. This paper focuses on the technical aspects of CO2 storage, even though the CCS challenge is equally dependent upon finding viable international solutions to a wide range of economic, political and cultural issues. It has already been demonstrated that it is technically possible to store adequate amounts of CO2 in the subsurface (Sleipner, InSalah, Snøhvit). The large-scale storage challenge (several gigatons of CO2 per year) is more an issue of minimizing cost without compromising safety, and of making international regulations. The storage challenge may be split into 4 main parts: 1) finding reservoirs with adequate storage capacity, 2) making sure that the sealing capacity above the reservoir is sufficient, 3) building the infrastructure for transport, drilling and injection, and 4) setting up and performing the necessary monitoring activities. More than 150 years of worldwide experience from the production of oil and gas is an important source of competence for CO2 storage. The storage challenge is however different in three important aspects: 1) the storage activity results in a pressure increase in the subsurface, 2) there is no production of fluids that gives important feedback on reservoir performance, and 3) the monitoring requirement will have to extend much longer into the future than is needed during oil and gas production. An important property of CO2 is that its behaviour in the subsurface is significantly different from that of oil and gas. CO2 in contact with water is reactive and corrosive, and may cause great damage to both man-made and natural materials if proper precautions are not taken. On the other hand, the long-term effect of most of these reactions is that a large amount of CO2 will become immobilized and permanently stored as solid carbonate minerals. The reduced opportunity for direct monitoring of fluid samples

  2. Large-scale CO2 storage — Is it feasible?

    Science.gov (United States)

    Johansen, H.

    2013-06-01

    CCS is generally estimated to have to account for about 20% of the reduction of CO2 emissions to the atmosphere. This paper focuses on the technical aspects of CO2 storage, even though the CCS challenge is equally dependent upon finding viable international solutions to a wide range of economic, political and cultural issues. It has already been demonstrated that it is technically possible to store adequate amounts of CO2 in the subsurface (Sleipner, InSalah, Snøhvit). The large-scale storage challenge (several gigatons of CO2 per year) is more an issue of minimizing cost without compromising safety, and of making international regulations. The storage challenge may be split into 4 main parts: 1) finding reservoirs with adequate storage capacity, 2) making sure that the sealing capacity above the reservoir is sufficient, 3) building the infrastructure for transport, drilling and injection, and 4) setting up and performing the necessary monitoring activities. More than 150 years of worldwide experience from the production of oil and gas is an important source of competence for CO2 storage. The storage challenge is however different in three important aspects: 1) the storage activity results in a pressure increase in the subsurface, 2) there is no production of fluids that gives important feedback on reservoir performance, and 3) the monitoring requirement will have to extend much longer into the future than is needed during oil and gas production. An important property of CO2 is that its behaviour in the subsurface is significantly different from that of oil and gas. CO2 in contact with water is reactive and corrosive, and may cause great damage to both man-made and natural materials if proper precautions are not taken. On the other hand, the long-term effect of most of these reactions is that a large amount of CO2 will become immobilized and permanently stored as solid carbonate minerals. The reduced opportunity for direct monitoring of fluid samples close to the

  3. Developing technology for large-scale production of forest chips. Wood Energy Technology Programme 1999-2003. Interim report

    International Nuclear Information System (INIS)

    Hakkila, P.

    2003-01-01

    Finland is enhancing its use of renewable sources in energy production. From the 1995 level, the use of renewable energy is to be increased by 50% by 2010, and by 100% by 2025. Wood-based fuels will play a leading role in this development. The main source of wood-based fuels is processing residues from the forest industries. However, as all processing residues are already in use, an increase is possible only insofar as the capacity and wood consumption of the forest industries grow. Energy policy affects the production and availability of processing residues only indirectly. Another large source of wood-based energy is forest fuels, consisting of traditional firewood and chips comminuted from low-quality biomass. It is estimated that the reserve of technically harvestable forest biomass is 10-16 Mm³ annually when no specific cost limit is applied. This corresponds to 2-3 Mtoe, or 6-9% of the present consumption of primary energy in Finland. How much of this reserve it will actually be possible to harvest and utilize depends on the cost competitiveness of forest chips against alternative sources of energy. A goal of Finnish energy and climate strategies is to use 5 Mm³ of forest chips annually by 2010. The use of wood fuels is being promoted by means of taxation, investment aid and support for chip production from young forests. Furthermore, research and development is being supported in order to create techno-economic conditions for the competitive production of forest chips. In 1999, the National Technology Agency Tekes established the five-year Wood Energy Technology Programme to stimulate the development of efficient systems for the large-scale production of forest chips. Key targets are competitive costs, reliable supply and good chip quality. The two guiding principles of the programme are (1) close cooperation between researchers and practitioners and (2) applying research and development to practical applications and commercialization. As of November

  4. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; hide

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on a 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  5. Economies of scale in biogas production and the significance of flexible regulation

    International Nuclear Information System (INIS)

    Skovsgaard, Lise; Jacobsen, Henrik Klinge

    2017-01-01

    Biogas production is characterised by economies of scale in the capital and operational costs of the plant and diseconomies of scale from the transport of input materials. We analyse biogas in a Danish setting, where most biogas is based on manure, using a case study with actual distances, and find that the benefits of scale in capital and operational costs dominate the diseconomies of increasing transport distances to collect manure. To boost the yield, it is common to use co-substrates in biogas production. We investigate how costs and income change when sugar beet is added in this case study, and demonstrate that transport cost can be critical in relation to co-substrates. Further, we compare the new Danish support for upgraded biogas with the traditional support for biogas used in Combined Heat and Power production in relation to scale economies. We argue that economies of scale are facilitated by the new regulation providing similar support to upgraded biogas fed into the natural gas grid; however, in order to keep transport costs low, we suggest that biogas plants should be allowed to use and combine as many co-substrates as possible, respecting the sustainability criteria regarding energy crops in Danish legislation. - Highlights: • For Denmark we find economies of scale in biogas production based on pure manure. • Adding sugar beet can outweigh economies of scale due to increased transport costs. • We investigate the main risks associated with input prices, yield and output prices. • Biogas fed into the gas grid should receive support similar to biogas used directly in CHP. • Regulation should allow large biogas plants with few restrictions on co-substrates.
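
The scale trade-off described in this record can be sketched with a toy cost model: capital and operational cost per tonne falls with plant capacity (scale exponent below 1), while the average manure haul distance grows roughly with the square root of the collection area, and hence of capacity. All coefficients below are illustrative assumptions, not values from the study.

```python
def unit_cost(capacity_t_per_year,
              capital_coeff=500.0, scale_exponent=0.7,
              transport_coeff=0.05):
    """Cost per tonne of input at a given plant capacity (hypothetical units)."""
    # economies of scale: per-tonne capital cost falls as capacity grows
    capital_per_t = capital_coeff * capacity_t_per_year ** (scale_exponent - 1.0)
    # diseconomies of scale: mean haul distance ~ sqrt(capacity)
    transport_per_t = transport_coeff * capacity_t_per_year ** 0.5
    return capital_per_t + transport_per_t

# Scan capacities from 1e3 to 1e6 t/yr to locate the cost-minimising scale.
capacities = [int(10 ** (3 + i * 0.1)) for i in range(31)]
best = min(capacities, key=unit_cost)
```

With these (hypothetical) coefficients the minimum lies at an intermediate capacity: below it, capital costs dominate; above it, transport costs do, which is the qualitative pattern the abstract describes.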

  6. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor has been selected as a promising concept for a sodium-cooled large-scale reactor with the possibility of fulfilling the design requirements of the F/S. In Phase 2, design improvements for further cost reduction and establishment of the plant concept have been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics which were identified in the previous fiscal year were examined and the plant concept was modified. Furthermore, fundamental specifications of the main systems and components were set and the economics were evaluated. In addition, as the interim evaluation of the candidate concepts of the FBR fuel cycle was to be conducted, cost effectiveness and achievability of the development goal were evaluated, and the data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has a prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection and narrowing down of candidate concepts at the end of Phase 2. (author)

  7. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

    In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor has been selected as a promising concept for a sodium-cooled large-scale reactor with the possibility of fulfilling the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop type reactor. Through the design study, it is intended to construct a plant concept that can demonstrate its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has a prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection and narrowing down of candidate concepts at the end of Phase 2. (author)

  8. Large scale CMB anomalies from thawing cosmic strings

    Energy Technology Data Exchange (ETDEWEB)

    Ringeval, Christophe [Centre for Cosmology, Particle Physics and Phenomenology, Institute of Mathematics and Physics, Louvain University, 2 Chemin du Cyclotron, 1348 Louvain-la-Neuve (Belgium); Yamauchi, Daisuke; Yokoyama, Jun' ichi [Research Center for the Early Universe (RESCEU), Graduate School of Science, The University of Tokyo, Tokyo 113-0033 (Japan); Bouchet, François R., E-mail: christophe.ringeval@uclouvain.be, E-mail: yamauchi@resceu.s.u-tokyo.ac.jp, E-mail: yokoyama@resceu.s.u-tokyo.ac.jp, E-mail: bouchet@iap.fr [Institut d' Astrophysique de Paris, UMR 7095-CNRS, Université Pierre et Marie Curie, 98bis boulevard Arago, 75014 Paris (France)

    2016-02-01

    Cosmic strings formed during inflation are expected to be either diluted over super-Hubble distances, i.e., invisible today, or to have crossed our past light cone very recently. We discuss the latter situation, in which a few strings imprint their signature in the Cosmic Microwave Background (CMB) anisotropies after recombination. Being almost frozen in the Hubble flow, these strings are quasi-static and evade almost all of the previously derived constraints on their tension while being able to source large-scale anisotropies in the CMB sky. Using a local variance estimator on thousands of numerically simulated Nambu-Goto all-sky maps, we compute the expected signal and show that it can mimic a dipole modulation at large angular scales while being negligible at small angles. Interestingly, such a scenario generically produces one cold spot from the thawing of a cosmic string loop. Mixed with anisotropies of inflationary origin, we find that a few strings of tension Gμ = O(1) × 10⁻⁶ match the amplitude of the dipole modulation reported in the Planck satellite measurements and could be at the origin of other large-scale anomalies.

  9. Scaling up the diversity-resilience relationship with trait databases and remote sensing data: the recovery of productivity after wildfire.

    Science.gov (United States)

    Spasojevic, Marko J; Bahlai, Christie A; Bradley, Bethany A; Butterfield, Bradley J; Tuanmu, Mao-Ning; Sistla, Seeta; Wiederholt, Ruscena; Suding, Katharine N

    2016-04-01

    Understanding the mechanisms underlying ecosystem resilience - why some systems have an irreversible response to disturbances while others recover - is critical for conserving biodiversity and ecosystem function in the face of global change. Despite the widespread acceptance of a positive relationship between biodiversity and resilience, empirical evidence for this relationship remains fairly limited in scope and localized in scale. Assessing resilience at the large landscape and regional scales most relevant to land management and conservation practices has been limited by the ability to measure both diversity and resilience over large spatial scales. Here, we combined tools used in large-scale studies of biodiversity (remote sensing and trait databases) with theoretical advances developed from small-scale experiments to ask whether the functional diversity within a range of woodland and forest ecosystems influences the recovery of productivity after wildfires across the Four Corners region of the United States. We additionally asked how environmental variation (topography, macroclimate) across this geographic region influences such resilience, either directly or indirectly via changes in functional diversity. Using path analysis, we found that functional diversity in regeneration traits (fire tolerance, fire resistance, resprout ability) was a stronger predictor of the recovery of productivity after wildfire than the functional diversity of seed mass or species richness. Moreover, slope, elevation, and aspect either directly or indirectly influenced the recovery of productivity, likely via their effect on microclimate, while macroclimate had no direct or indirect effects. Our study provides some of the first direct empirical evidence for functional diversity increasing resilience at large spatial scales. Our approach highlights the power of combining theory based on local-scale studies with tools used in studies at large spatial scales and trait databases to

  10. Cuprous Oxide Scale up: Gram Production via Bulk Synthesis using Classic Solvents at Low Temperatures

    Energy Technology Data Exchange (ETDEWEB)

    Hall, A. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Han, T. Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-05-07

    Cuprous oxide is a p-type semiconducting material that has been highly researched for its interesting properties. Many small-scale syntheses have exhibited excellent control over size and morphology. As the demand for cuprous oxide grows, the synthesis method needs to evolve to facilitate large-scale production. This paper supplies a facile bulk synthesis method for Cu₂O: on average, a 1-liter reaction volume can produce 1 gram of particles. In order to study the shape and size control mechanisms on such a scale, the reaction volume was reduced to 250 mL, producing on average 0.3 grams of nanoparticles per batch. Well-shaped nanoparticles have been synthesized using an aqueous solution of CuCl₂, NaOH, SDS surfactant, and NH₂OH-HCl at mild temperatures. The time allotted between the addition of NaOH and NH₂OH-HCl was determined to be critical for Cu(OH)₂ production, an important precursor to the final product. The effects of stirring rate at large scale were also analyzed during reagent addition and post reagent addition. A morphological change from rhombic dodecahedra to spheres occurred as the stirring speed was increased. The effects of NH₂OH-HCl concentration were also studied to control the etching of the final product.

  11. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. The processing techniques determine the quality of the products and thus to a large extent the performance and reliability of the products that are manufactured. The dielectric electroactive polymer (DEAP...

  12. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large-scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of power system balance and thus deviations in the power system...... frequency in small or islanded power systems or tie-line power flows in interconnected power systems. Therefore, the large-scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for active power balance control in future power systems with large-scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  13. Centralized manure digestion. Selection of locations and estimation of costs of large-scale manure storage application

    International Nuclear Information System (INIS)

    1995-03-01

    A study to assess the possibilities and the consequences of using existing Dutch large-scale manure silos at centralised anaerobic digestion plants (CAD-plants) for manure and energy-rich organic wastes has been carried out. Reconstruction of these large-scale manure silos into digesters for a CAD-plant is not self-evident, due to the high height/diameter ratio of these silos and the extra investments that have to be made for additional facilities for roofing, insulation, mixing and heating. From the results of an inventory and selection of large-scale manure silos with a storage capacity above 1,500 m³, it appeared that there are 21 locations in The Netherlands that qualify for realisation of a CAD-plant with a processing capacity of 100 m³ of biomass (80% manure, 20% additives) per day. These locations are found in particular in the 'shortage areas' for manure fertilisation in the Dutch provinces of Groningen and Drenthe. Three of these 21 locations with large-scale silos are considered the most suitable for realisation of a large-scale CAD-plant. The selection is based on an optimal scale for a CAD-plant of 300 m³ of material (80% manure, 20% additives) to be processed per day and the most suitable consuming markets for the biogas produced at the CAD-plant. The three locations are at Middelharnis, Veendam, and Klazinaveen. Applying the conditions used in this study and accounting for all costs for transport of manure, additives and end-product, including the costs for the storage facilities, a break-even operation might be realised at a minimum income for the additives of approximately 50 Dutch guilders per m³ (including TAV). This income price is considerably lower than the prevailing costs for tipping or processing of organic wastes in The Netherlands. This study revealed that a break-even exploitation of a large-scale CAD-plant for the processing of manure with energy-rich additives is possible. (Abstract Truncated)

  14. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  15. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
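
Giraph applications are written in Java against Giraph's own API; as a language-neutral illustration of the Pregel-style "think like a vertex" superstep model the book describes, the following is a minimal Python sketch of single-source shortest paths, in which each vertex holds a value and exchanges messages between supersteps.

```python
INF = float("inf")

def sssp(edges, source):
    """edges: {vertex: [(neighbour, weight), ...]}; returns distances from source."""
    value = {v: INF for v in edges}          # each vertex's tentative distance
    messages = {v: [] for v in edges}
    messages[source] = [0]                   # initial message activates the source
    while any(messages.values()):            # superstep loop: run until quiescence
        outbox = {v: [] for v in edges}
        for v, inbox in messages.items():
            if inbox and min(inbox) < value[v]:
                value[v] = min(inbox)        # improved distance: update and notify
                for neighbour, weight in edges[v]:
                    outbox[neighbour].append(value[v] + weight)
        messages = outbox                    # messages are delivered next superstep
    return value

graph = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
distances = sssp(graph, "a")                 # {'a': 0, 'b': 1, 'c': 3}
```

In Giraph the same pattern is distributed: vertices are partitioned across workers, compute() runs once per active vertex per superstep, and the framework handles message delivery and termination.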

  16. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

    With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult for common 3D display software, such as MeshLab, to achieve real-time display of and interaction with large-scale 3D models. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance, and then use an out-of-core, view-dependent multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with over millions of 3D points or triangular meshes on a regular PC with only 4 GB of RAM.
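
View-dependent LOD selection of the kind used by such renderers can be sketched as a screen-space-error test: pick the coarsest level whose geometric error, projected at the current viewing distance, stays under a pixel budget. The function below is an illustrative sketch under assumed names and parameters, not the paper's implementation.

```python
import math

def select_lod(distance_m, fov_deg, viewport_px, geometric_errors_m,
               max_screen_error_px=1.0):
    """Index of the coarsest LOD whose projected geometric error stays
    within the on-screen error budget (level 0 = finest)."""
    # pixels per metre at the given distance, for a pinhole projection
    px_per_m = viewport_px / (2.0 * distance_m
                              * math.tan(math.radians(fov_deg) / 2.0))
    chosen = 0
    for level, err_m in enumerate(geometric_errors_m):
        if err_m * px_per_m <= max_screen_error_px:
            chosen = level                   # a coarser level is still acceptable
    return chosen

# Geometric error doubles per level: finest 1 mm, coarsest 16 mm.
errors = [0.001 * 2 ** i for i in range(5)]
near = select_lod(2.0, 60.0, 1080, errors)   # close to the camera: fine level
far = select_lod(200.0, 60.0, 1080, errors)  # far away: coarsest level suffices
```

Distant geometry thus renders from coarse, memory-cheap levels, which is what makes out-of-core roaming through a large reconstructed scene feasible on a modest PC.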

  17. Remote sensing of the biological dynamics of large-scale salt evaporation ponds

    Science.gov (United States)

    Richardson, Laurie L.; Bachoon, Dave; Ingram-Willey, Vebbra; Chow, Colin C.; Weinstock, Kenneth

    1992-01-01

    Optical properties of salt evaporation ponds associated with Exportadora de Sal, a salt production company in Baja California Sur, Mexico, were analyzed using a combination of spectroradiometer and extracted pigment data, and Landsat-5 Thematic Mapper imagery. The optical characteristics of each pond are determined by the biota, which consists of dense populations of algae and photosynthetic bacteria containing a wide variety of photosynthetic and photoprotective pigments. Analysis has shown that spectral and image data can differentiate between taxonomic groups of the microbiota, detect changes in population distributions, and reveal large-scale seasonal dynamics.

  18. Large-scale laboratory study of breaking wave hydrodynamics over a fixed bar

    Science.gov (United States)

    van der A, Dominic A.; van der Zanden, Joep; O'Donoghue, Tom; Hurther, David; Cáceres, Iván.; McLelland, Stuart J.; Ribberink, Jan S.

    2017-04-01

    A large-scale wave flume experiment has been carried out involving a T = 4 s regular wave with H = 0.85 m wave height plunging over a fixed barred beach profile. Velocity profiles were measured at 12 locations along the breaker bar using LDA and ADV. A strong undertow is generated reaching magnitudes of 0.8 m/s on the shoreward side of the breaker bar. A circulation pattern occurs between the breaking area and the inner surf zone. Time-averaged turbulent kinetic energy (TKE) is largest in the breaking area on the shoreward side of the bar where the plunging jet penetrates the water column. At this location, and on the bar crest, TKE generated at the water surface in the breaking process reaches the bottom boundary layer. In the breaking area, TKE does not reduce to zero within a wave cycle which leads to a high level of "residual" turbulence and therefore lower temporal variation in TKE compared to previous studies of breaking waves on plane beach slopes. It is argued that this residual turbulence results from the breaker bar-trough geometry, which enables larger length scales and time scales of breaking-generated vortices and which enhances turbulence production within the water column compared to plane beaches. Transport of TKE is dominated by the undertow-related flux, whereas the wave-related and turbulent fluxes are approximately an order of magnitude smaller. Turbulence production and dissipation are largest in the breaker zone and of similar magnitude, but in the shoaling zone and inner surf zone production is negligible and dissipation dominates.
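
For reference, the time-averaged turbulent kinetic energy discussed above is, in standard Reynolds-decomposition notation,

```latex
k = \frac{1}{2}\left( \overline{u'^2} + \overline{v'^2} + \overline{w'^2} \right)
```

where u', v' and w' are the velocity fluctuations about the mean flow; with two-component instruments such as LDA, the unmeasured variance term is commonly estimated from the measured components.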

  19. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff therefore provides a foundation for approaching European hydrology, both with respect to observed patterns on large scales and with regard to the ability of models to capture these. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also enable the detection of shortcomings in model assumptions and thus a refinement of the current perception of hydrological systems.
The ability of a multi-model ensemble of nine large-scale

  20. Large-scale perturbations from the waterfall field in hybrid inflation

    International Nuclear Information System (INIS)

    Fonseca, José; Wands, David; Sasaki, Misao

    2010-01-01

    We estimate large-scale curvature perturbations from isocurvature fluctuations in the waterfall field during hybrid inflation, in addition to the usual inflaton field perturbations. The tachyonic instability at the end of inflation leads to an explosive growth of super-Hubble scale perturbations, but they retain the steep blue spectrum characteristic of vacuum fluctuations in a massive field during inflation. The power spectrum thus peaks around the Hubble-horizon scale at the end of inflation. We extend the usual δN formalism to include the essential role of these small fluctuations when estimating the large-scale curvature perturbation. The resulting curvature perturbation due to fluctuations in the waterfall field is second-order and the spectrum is expected to be of order 10^-54 on cosmological scales.

  1. Decoupling local mechanics from large-scale structure in modular metamaterials

    Science.gov (United States)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  2. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biased galaxy formation are all discussed. (UK)

  3. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  4. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall...... objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  5. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Full Text Available Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computational complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computational complexity of the conventional exhaustive search method increases significantly when large-scale antennas are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
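The prohibitive cost of exhaustive search that motivates the paper comes from the combinatorial number of candidate antenna subsets. A minimal sketch; the array sizes are hypothetical, not from the paper:

```python
from math import comb

# Exhaustive transmit-antenna selection must evaluate every size-k subset
# of the N available antennas: C(N, k) candidates in total.
def exhaustive_candidates(n_antennas: int, n_selected: int) -> int:
    return comb(n_antennas, n_selected)

print(exhaustive_candidates(8, 4))     # 70 subsets: tractable
print(exhaustive_candidates(128, 16))  # ~10^20 subsets: hopeless for brute force
```

This blow-up is why a selection method whose runtime does not grow with the number of selected antennas matters at large scale.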

  6. Scaling of the Urban Water Footprint: An Analysis of 65 Mid- to Large-Sized U.S. Metropolitan Areas

    Science.gov (United States)

    Mahjabin, T.; Garcia, S.; Grady, C.; Mejia, A.

    2017-12-01

    Scaling laws have been shown to be relevant to a range of disciplines including biology, ecology, hydrology, and physics, among others. Recently, scaling was shown to be important for understanding and characterizing cities. For instance, it was found that urban infrastructure (water supply pipes and electrical wires) tends to scale sublinearly with city population, implying that large cities are more efficient. In this study, we explore the scaling of the water footprint of cities. The water footprint is a measure of water appropriation that considers both the direct and indirect (virtual) water use of a consumer or producer. Here we compute the water footprint of 65 mid- to large-sized U.S. metropolitan areas, accounting for direct and indirect water uses associated with agricultural and industrial commodities, and residential and commercial water uses. We find that the urban water footprint, computed as the sum of the water footprint of consumption and production, exhibits sublinear scaling with an exponent of 0.89. This suggests the possibility of large cities being more water-efficient than small ones. To further assess this result, we conduct additional analysis by accounting for international flows, and the effects of green water and city boundary definition on the scaling. The analysis confirms the scaling and provides additional insight about its interpretation.
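The reported sublinear exponent can be made concrete with a toy calculation. Only the exponent 0.89 comes from the study above; the prefactor of the power law cancels out of the ratio computed here:

```python
# Sublinear power-law scaling of the urban water footprint:
#   WF = c * population**beta, with beta = 0.89 (the fitted exponent above).
beta = 0.89

def footprint_ratio(pop_ratio: float) -> float:
    """Factor by which the water footprint grows when population grows
    by pop_ratio, independent of the (unknown) prefactor c."""
    return pop_ratio ** beta

growth = footprint_ratio(2.0)          # doubling the population
print(round(growth, 3))                # 1.853: footprint grows less than 2x
print(round(growth / 2.0, 3))          # 0.927: per-capita footprint shrinks
```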

  7. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s^2 ≈ 10^-6 c^2 and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)^4. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc^-1.
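The role of the measured speed of sound can be sketched schematically: the leading EFT correction enters the power spectrum as a k²-suppressed counterterm. The toy linear spectrum and counterterm amplitude below are illustrative placeholders, not the paper's fitted values:

```python
import numpy as np

# Schematic leading-order correction in the EFT of large-scale structure:
#   P_EFT(k) ~ P_lin(k) * (1 - 2 * c_eff * k**2)
# The k^2 counterterm encodes the effective sound speed / viscosity of the
# IR fluid. Spectrum shape and c_eff are made up for illustration only.
k = np.linspace(0.01, 0.24, 50)    # h/Mpc, up to the quoted reach of the theory
P_lin = 1e4 * k ** (-1.5)          # toy power-law "linear" spectrum
c_eff = 1.0                        # (Mpc/h)^2, placeholder amplitude

P_eft = P_lin * (1.0 - 2.0 * c_eff * k**2)

# The fractional correction grows as k^2: tiny on large scales,
# percent-level by k ~ 0.1 h/Mpc for this placeholder amplitude.
print(2 * c_eff * 0.01**2)   # fractional correction at k = 0.01
print(2 * c_eff * 0.1**2)    # fractional correction at k = 0.1
```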

  8. European-scale modelling of groundwater denitrification and associated N2O production

    KAUST Repository

    Keuskamp, J.A.

    2012-06-01

    This paper presents a spatially explicit model for simulating the fate of nitrogen (N) in soil and groundwater and nitrous oxide (N2O) production in groundwater with a 1 km resolution at the European scale. The results show large heterogeneity of nitrate outflow from groundwater to surface water and production of N2O. This heterogeneity is the result of variability in agricultural and hydrological systems. Large parts of Europe have no groundwater aquifers and short travel times from soil to surface water. In these regions no groundwater denitrification and N2O production is expected. Predicted N leaching (16% of the N inputs) and N2O emissions (0.014% of N leaching) are much less than the IPCC default leaching rate and combined emission factor for groundwater and riparian zones, respectively. © 2012 Elsevier Ltd. All rights reserved.
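The quoted fractions chain into a simple back-of-envelope calculation; the N input below is a made-up example value, not a figure from the paper:

```python
# Chain implied by the reported fractions: 16% of N inputs leach to
# groundwater, and 0.014% of the leached N is emitted as N2O.
n_input_kg = 100_000.0            # hypothetical N input to soils, kg N

n_leached = n_input_kg * 0.16     # 16% of inputs leach
n2o_n = n_leached * 0.00014       # 0.014% of leaching emitted as N2O-N

print(round(n_leached, 1))        # 16000.0 kg N leached
print(round(n2o_n, 4))            # 2.24 kg emitted as N2O-N
```

Both figures are well below what the IPCC default leaching rate and emission factor would give for the same input.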

  9. Temporal flexibility and careers: The role of large-scale organizations for physicians

    OpenAIRE

    Forrest Briscoe

    2006-01-01

    This study investigates how employment in large-scale organizations affects the work lives of practicing physicians. Well-established theory associates larger organizations with bureaucratic constraint, loss of workplace control, and dissatisfaction, but this author finds that large scale is also associated with greater schedule and career flexibility. Ironically, the bureaucratic p...

  10. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large-scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large-scale features of the turbulent channel flow is obtained using two-point correlations of velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of large-scale features of turbulence and the temperature field.
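The POD step mentioned above amounts to a singular value decomposition of a mean-subtracted snapshot matrix. A minimal sketch on a synthetic field (not the DNS data):

```python
import numpy as np

# Proper orthogonal decomposition (POD) via SVD: rows are spatial points,
# columns are time snapshots; the left singular vectors are the POD modes.
rng = np.random.default_rng(0)
x = np.linspace(0, 2 * np.pi, 64)
t = np.linspace(0, 10, 200)
# Two coherent "large-scale" modes plus small-scale noise.
snapshots = (np.outer(np.sin(x), np.cos(t))
             + 0.5 * np.outer(np.cos(2 * x), np.sin(3 * t))
             + 0.01 * rng.standard_normal((64, 200)))

fluct = snapshots - snapshots.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(fluct, full_matrices=False)   # U[:, i] = mode i

energy = s**2 / np.sum(s**2)       # fractional "energy" captured per mode
print(np.round(energy[:3], 3))     # the two constructed modes dominate
```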

  11. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF6, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
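The conditioning procedure can be sketched in a few lines on a synthetic one-dimensional signal, a stand-in for the measured velocities rather than the experimental data:

```python
import numpy as np

# Second-order structure function conditioned on the instantaneous
# large-scale velocity: D2(r | u_L) = <(u(x+r) - u(x))^2 given u_L>.
rng = np.random.default_rng(1)
u = np.cumsum(rng.standard_normal(20000)) * 0.01        # toy correlated signal
u_L = np.convolve(u, np.ones(501) / 501, mode="same")   # large-scale component

r = 10                                # separation, in samples
du2 = (u[r:] - u[:-r]) ** 2           # squared velocity increments
uL_loc = u_L[:-r]                     # large-scale velocity at each point

thresh = 0.5 * u_L.std()
high = uL_loc > u_L.mean() + thresh   # large-scale velocity well above the mean
low = uL_loc < u_L.mean() - thresh    # ... and well below

# Compare the conditional averages against the unconditional one.
print(du2[high].mean(), du2[low].mean(), du2.mean())
```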

  12. Consultancy on Large-Scale Submerged Aerobic Cultivation Process Design - Final Technical Report: February 1, 2016 -- June 30, 2016

    Energy Technology Data Exchange (ETDEWEB)

    Crater, Jason [Gemomatica, Inc., San Diego, CA (United States); Galleher, Connor [Gemomatica, Inc., San Diego, CA (United States); Lievense, Jeff [Gemomatica, Inc., San Diego, CA (United States)

    2017-05-12

    NREL is developing an advanced aerobic bubble column model using Aspen Custom Modeler (ACM). The objective of this work is to integrate the new fermentor model with existing techno-economic models in Aspen Plus and Excel to establish a new methodology for guiding process design. To assist this effort, NREL has contracted Genomatica to critique and make recommendations for improving NREL's bioreactor model and large-scale aerobic bioreactor design for biologically producing lipids at commercial scale. Genomatica has highlighted a few areas for improving the functionality and effectiveness of the model. Genomatica recommends using a compartment model approach with an integrated black-box kinetic model of the production microbe. We also suggest including calculations for stirred tank reactors to extend the model's functionality and adaptability for future process designs. Genomatica also suggests making several modifications to NREL's large-scale lipid production process design. The recommended process modifications are based on Genomatica's internal techno-economic assessment experience and are focused primarily on minimizing capital and operating costs. These recommendations include selecting/engineering a thermotolerant yeast strain with lipid excretion; using bubble column fermentors; increasing the size of production fermentors; reducing the number of vessels; employing semi-continuous operation; and recycling cell mass.

  13. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  14. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    NARCIS (Netherlands)

    Loon, van A.F.; Huijgevoort, van M.H.J.; Lanen, van H.A.J.

    2012-01-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well large-scale models simulate the propagation from meteorological to hydrological

  15. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, railways and other civil engineering (water)works are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  16. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

    Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation...
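The decomposition described in the abstract can be sketched for a toy balancing problem: each unit minimizes its own quadratic cost given a price, and a coordinator adjusts the price by the balance violation. All the numbers below are illustrative:

```python
import numpy as np

# Dual decomposition for power balancing:
#   minimize  sum_i (1/2)*a_i*(p_i - p_ref_i)**2   s.t.   sum_i p_i = D
# Each unit solves the argmin of its local cost plus lam*p_i; the
# coordinator updates the price lam by a subgradient step.
a = np.array([1.0, 2.0, 4.0])       # local cost curvatures
p_ref = np.array([1.0, 2.0, 3.0])   # preferred set-points (sum to 6)
D = 5.0                             # total demand to balance
lam, step = 0.0, 0.5

for _ in range(200):
    p = p_ref - lam / a             # each unit's local minimizer, given lam
    lam += step * (p.sum() - D)     # raise the price if consumption exceeds D

print(np.round(p, 3))               # balanced dispatch across the units
print(round(p.sum(), 3))            # 5.0: the coupling constraint is met
```

The appeal of the scheme is that only the scalar price, not the units' private cost functions, needs to be communicated.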

  17. Generation of large-scale vortices in compressible helical turbulence

    International Nuclear Information System (INIS)

    Chkhetiani, O.G.; Gvaramadze, V.V.

    1989-01-01

    We consider the generation of large-scale vortices in a compressible self-gravitating turbulent medium. The closed equation describing the evolution of the large-scale vortices in helical turbulence with finite correlation time is obtained. This equation has a form similar to the hydromagnetic dynamo equation, which allows us to call the vortex generation effect the vortex dynamo. It is possible that essentially the same mechanism is responsible for both the amplification and maintenance of density waves and magnetic fields in gaseous disks of spiral galaxies. (author). 29 refs

  18. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large scale integration of wind power in future power systems when 50% of load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish...... power system model with large scale of wind power is developed and a case study for an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts relevant dynamic features of the power plants and compensates for load generation...... imbalances, caused by inaccurate wind speed forecast, by an appropriate control of the active power production from power plants....

  19. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). These observations have led us to a great deal of consensus on the so-called LambdaCDM cosmological model and tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement in cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, which is quantified by its direction and amplitude. We measured a huge dipolar modulation in the CMB, which mainly originated from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a number of well-identified objects. In this thesis, we explore measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for detection of the kinematic dipole in future surveys.

  20. Impact of large-scale tides on cosmological distortions via redshift-space power spectrum

    Science.gov (United States)

    Akitsu, Kazuyuki; Takada, Masahiro

    2018-03-01

    Although large-scale perturbations beyond a finite-volume survey region are not direct observables, they affect measurements of clustering statistics of small-scale (subsurvey) perturbations in large-scale structure, compared with the ensemble average, via the mode-coupling effect. In this paper we show that a large-scale tide induced by scalar perturbations causes apparent anisotropic distortions in the redshift-space power spectrum of galaxies in a way that depends on the alignment between the tide, the wave vector of the small-scale modes and the line-of-sight direction. Using the perturbation theory of structure formation, we derive a response function of the redshift-space power spectrum to a large-scale tide. We then investigate the impact of the large-scale tide on estimation of cosmological distances and the redshift-space distortion parameter via the measured redshift-space power spectrum for a hypothetical large-volume survey, based on the Fisher matrix formalism. To do this, we treat the large-scale tide as a signal, rather than an additional source of statistical errors, and show that the degradation in the parameter constraints is restored if we can employ the prior on the rms amplitude expected for the standard cold dark matter (CDM) model. We also discuss whether the large-scale tide can be constrained at an accuracy better than the CDM prediction, if the effects up to a larger wave number in the nonlinear regime can be included.
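The Fisher-matrix logic of treating the tide as a signal with a prior can be illustrated with a two-parameter toy forecast; all matrix entries below are invented for illustration, not taken from the paper:

```python
import numpy as np

# Toy Fisher forecast: parameter 0 is a target cosmological parameter,
# parameter 1 is the large-scale tide amplitude treated as a signal.
F = np.array([[50.0, 30.0],
              [30.0, 20.0]])        # invented Fisher matrix with a strong degeneracy

# Marginalized error on parameter 0 with the tide amplitude left free:
sigma_free = np.sqrt(np.linalg.inv(F)[0, 0])

# A Gaussian prior sigma_T on the tide (e.g. a CDM rms prediction)
# adds 1/sigma_T^2 to the corresponding diagonal entry.
sigma_T = 0.1
F_prior = F + np.diag([0.0, 1.0 / sigma_T**2])
sigma_with_prior = np.sqrt(np.linalg.inv(F_prior)[0, 0])

print(round(sigma_free, 3))         # 0.447: degraded by the degeneracy
print(round(sigma_with_prior, 3))   # 0.153: the prior restores the constraint
```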

  1. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  2. On the use of Cloud Computing and Machine Learning for Large-Scale SAR Science Data Processing and Quality Assessment Analysis

    Science.gov (United States)

    Hua, H.

    2016-12-01

    Geodetic imaging is revolutionizing geophysics, but the scope of discovery has been limited by labor-intensive technological implementation of the analyses. The Advanced Rapid Imaging and Analysis (ARIA) project has proven capability to automate SAR data processing and analysis. Existing and upcoming SAR missions such as Sentinel-1A/B and NISAR are also expected to generate massive amounts of SAR data. This has brought to the forefront the need for analytical tools for SAR quality assessment (QA) on large volumes of SAR data, a critical step before higher-level time series and velocity products can be reliably generated. We initially leveraged an advanced hybrid-cloud computing science data system for large-scale processing, then augmented it with machine learning approaches for automated analysis of various quality metrics. Machine-learning-based feature training, cross-validation, and prediction models were integrated into our cloud-based science data processing flow to enable large-scale, high-throughput QA analytics, enabling improvements to the production quality of geodetic data products.
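The cross-validation step can be sketched with a stand-in classifier; the data, labels, and nearest-centroid model below are synthetic placeholders for the pipeline's actual QA metrics and learner:

```python
import numpy as np

# 5-fold cross-validation over labeled quality-metric vectors, with a
# nearest-centroid classifier standing in for the production model.
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(0, 1, (50, 3)),      # class 0: good products
               rng.normal(3, 1, (50, 3))])     # class 1: bad products
y = np.array([0] * 50 + [1] * 50)

idx = rng.permutation(len(y))
folds = np.array_split(idx, 5)
accs = []
for k in range(5):
    test = folds[k]
    train = np.concatenate(folds[:k] + folds[k + 1:])
    # Fit: one centroid per class from the training folds only.
    centroids = np.array([X[train][y[train] == c].mean(axis=0) for c in (0, 1)])
    # Predict: assign each held-out sample to the nearest centroid.
    dist = np.linalg.norm(X[test][:, None, :] - centroids[None], axis=2)
    accs.append(float((dist.argmin(axis=1) == y[test]).mean()))

print(round(sum(accs) / len(accs), 3))   # mean held-out accuracy
```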

  3. The Hamburg large scale geostrophic ocean general circulation model. Cycle 1

    International Nuclear Information System (INIS)

    Maier-Reimer, E.; Mikolajewicz, U.

    1992-02-01

    The rationale for the Large Scale Geostrophic ocean circulation model (LSG-OGCM) is based on the observations that for a large scale ocean circulation model designed for climate studies, the relevant characteristic spatial scales are large compared with the internal Rossby radius throughout most of the ocean, while the characteristic time scales are large compared with the periods of gravity modes and barotropic Rossby wave modes. In the present version of the model, the fast modes have been filtered out by a conventional technique of integrating the full primitive equations, including all terms except the nonlinear advection of momentum, by an implicit time integration method. The free surface is also treated prognostically, without invoking a rigid lid approximation. The numerical scheme is unconditionally stable and has the additional advantage that it can be applied uniformly to the entire globe, including the equatorial and coastal current regions. (orig.)
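The benefit of the implicit time integration mentioned above can be seen on a single fast mode: explicit stepping is unstable once the mode frequency exceeds the stability limit, while implicit stepping merely damps it. The values below are illustrative:

```python
# Fast mode du/dt = -omega * u stepped far beyond the explicit stability
# limit (forward Euler is unstable for omega*dt > 2).
omega, dt, steps = 10.0, 1.0, 20      # omega*dt = 10

u_explicit = u_implicit = 1.0
for _ in range(steps):
    u_explicit *= (1.0 - omega * dt)        # amplification factor -9 per step
    u_implicit /= (1.0 + omega * dt)        # damping factor 1/11 per step

print(abs(u_explicit) > 1e10)   # True: the explicit scheme blows up
print(abs(u_implicit) < 1e-10)  # True: the implicit scheme damps the fast mode
```

This is the sense in which implicit integration "filters out" the fast gravity and barotropic modes and allows the long, climate-scale time steps the LSG model is built for.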

  4. Soft X-ray Emission from Large-Scale Galactic Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S.; O'Dea, C.; Veilleux, S.

    1998-01-01

    Kiloparsec-scale soft X-ray nebulae extend along the galaxy minor axes in several Seyfert galaxies, including NGC 2992, NGC 4388 and NGC 5506. In these three galaxies, the extended X-ray emission observed in ROSAT HRI images has 0.2-2.4 keV X-ray luminosities of 0.4-3.5 x 10^40 erg s^-1. The X-ray nebulae are roughly co-spatial with the large-scale radio emission, suggesting that both are produced by large-scale galactic outflows. Assuming pressure balance between the radio and X-ray plasmas, the X-ray filling factor is >~ 10^4 times as large as the radio plasma filling factor, suggesting that large-scale outflows in Seyfert galaxies are predominantly winds of thermal X-ray emitting gas. We favor an interpretation in which large-scale outflows originate as AGN-driven jets that entrain and heat gas on kpc scales as they make their way out of the galaxy. AGN- and starburst-driven winds are also possible explanations if the winds are oriented along the rotation axis of the galaxy disk. Since large-scale outflows are present in at least 50 percent of Seyfert galaxies, the soft X-ray emission from the outflowing gas may, in many cases, explain the "soft excess" X-ray feature observed below 2 keV in X-ray spectra of many Seyfert 2 galaxies.

  5. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  6. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  7. Evaluation of Large-scale Public Sector Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Hanne Foss

    2017-01-01

    and more delimited policy areas take place. In our analysis we apply four governance perspectives (rational-instrumental, rational-interest based, institutional-cultural and a chaos perspective) in a comparative analysis of the evaluations of two large-scale public sector reforms in Denmark and Norway. We...

  8. On the soft limit of the large scale structure power spectrum. UV dependence

    International Nuclear Information System (INIS)

    Garny, Mathias

    2015-08-01

    We derive a non-perturbative equation for the large scale structure power spectrum of long-wavelength modes. Thereby, we use an operator product expansion together with relations between the three-point function and power spectrum in the soft limit. The resulting equation encodes the coupling to ultraviolet (UV) modes in two time-dependent coefficients, which may be obtained from response functions to (anisotropic) parameters, such as spatial curvature, in a modified cosmology. We argue that both depend weakly on fluctuations deep in the UV. As a byproduct, this implies that the renormalized leading order coefficient(s) in the effective field theory (EFT) of large scale structures receive most of their contribution from modes close to the non-linear scale. Consequently, the UV dependence found in explicit computations within standard perturbation theory stems mostly from counter-term(s). We confront a simplified version of our non-perturbative equation against existing numerical simulations, and find good agreement within the expected uncertainties. Our approach can in principle be used to precisely infer the relevance of the leading order EFT coefficient(s) using small volume simulations in an 'anisotropic separate universe' framework. Our results suggest that the importance of these coefficient(s) is a ∼ 10% effect, and plausibly smaller.

  9. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...

  10. Large-scale coastal impact induced by a catastrophic storm

    DEFF Research Database (Denmark)

    Fruergaard, Mikkel; Andersen, Thorbjørn Joest; Johannessen, Peter N

    breaching. Our results demonstrate that violent, millennial-scale storms can trigger significant large-scale and long-term changes on barrier coasts, and that coastal changes assumed to take place over centuries or even millennia may occur in association with a single extreme storm event....

  11. Large-eddy simulation with accurate implicit subgrid-scale diffusion

    NARCIS (Netherlands)

    B. Koren (Barry); C. Beets

    1996-01-01

    A method for large-eddy simulation is presented that does not use an explicit subgrid-scale diffusion term. Subgrid-scale effects are modelled implicitly through an appropriate monotone (in the sense of Spekreijse 1987) discretization method for the advective terms. Special attention is

  12. Relating large-scale subsidence to convection development in Arctic mixed-phase marine stratocumulus

    Science.gov (United States)

    Young, Gillian; Connolly, Paul J.; Dearden, Christopher; Choularton, Thomas W.

    2018-02-01

    Large-scale subsidence, associated with high-pressure systems, is often imposed in large-eddy simulation (LES) models to maintain the height of boundary layer (BL) clouds. Previous studies have considered the influence of subsidence on warm liquid clouds in subtropical regions; however, the relationship between subsidence and mixed-phase cloud microphysics has not specifically been studied. For the first time, we investigate how widespread subsidence associated with synoptic-scale meteorological features can affect the microphysics of Arctic mixed-phase marine stratocumulus (Sc) clouds. Modelled with LES, four idealised scenarios - a stable Sc, varied droplet (Ndrop) or ice (Nice) number concentrations, and a warming surface (representing motion southwards) - were subjected to different levels of subsidence to investigate the cloud microphysical response. We find strong sensitivities to large-scale subsidence, indicating that high-pressure systems in the ocean-exposed Arctic regions have the potential to generate turbulence and changes in cloud microphysics in any resident BL mixed-phase clouds. Increased cloud convection is modelled with increased subsidence, driven by longwave radiative cooling at cloud top and rain evaporative cooling and latent heating from snow growth below cloud. Subsidence strengthens the BL temperature inversion, thus reducing entrainment and allowing the liquid- and ice-water paths (LWPs, IWPs) to increase. Through increased cloud-top radiative cooling and subsequent convective overturning, precipitation production is enhanced: rain particle number concentrations (Nrain), in-cloud rain mass production rates, and below-cloud evaporation rates increase with increased subsidence. Ice number concentrations (Nice) play an important role, as greater concentrations suppress the liquid phase; therefore, Nice acts to mediate the strength of turbulent overturning promoted by increased subsidence. With a warming surface, a lack of - or low - subsidence

  13. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  14. Macroecological factors explain large-scale spatial population patterns of ancient agriculturalists

    NARCIS (Netherlands)

    Xu, C.; Chen, B.; Abades, S.; Reino, L.; Teng, S.; Ljungqvist, F.C.; Huang, Z.Y.X.; Liu, X.

    2015-01-01

    Aim: It has been well demonstrated that the large-scale distribution patterns of numerous species are driven by similar macroecological factors. However, understanding of this topic remains limited when applied to our own species. Here we take a large-scale look at ancient agriculturalist

  15. Large Scale Investments in Infrastructure : Competing Policy regimes to Control Connections

    NARCIS (Netherlands)

    Otsuki, K.; Read, M.L.; Zoomers, E.B.

    2016-01-01

    This paper proposes to analyse implications of large-scale investments in physical infrastructure for social and environmental justice. While case studies on the global land rush and climate change have advanced our understanding of how large-scale investments in land, forests and water affect

  16. Rotation invariant fast features for large-scale recognition

    Science.gov (United States)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF [1] while producing large-scale retrieval results that are comparable to SIFT [2]. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  17. Biogrid--a microfluidic device for large-scale enzyme-free dissociation of stem cell aggregates.

    Science.gov (United States)

    Wallman, Lars; Åkesson, Elisabet; Ceric, Dario; Andersson, Per Henrik; Day, Kelly; Hovatta, Outi; Falci, Scott; Laurell, Thomas; Sundström, Erik

    2011-10-07

    Culturing stem cells as free-floating aggregates in suspension facilitates large-scale production of cells in closed systems, for clinical use. To comply with GMP standards, the use of substances such as proteolytic enzymes should be avoided. Instead of enzymatic dissociation, the growing cell aggregates may be mechanically cut at passage, but available methods are not compatible with large-scale cell production and hence translation into the clinic becomes a severe bottleneck. We have developed the Biogrid device, which consists of an array of micrometer-scale knife edges, micro-fabricated in silicon, and a manifold in which the microgrid is placed across the central fluid channel. By connecting one side of the Biogrid to a syringe or a pump and the other side to the cell culture, the culture medium with suspended cell aggregates can be aspirated, forcing the aggregates through the microgrid, and ejected back to the cell culture container. Large aggregates are thereby dissociated into smaller fragments while small aggregates pass through the microgrid unaffected. As proof-of-concept, we demonstrate that the Biogrid device can be successfully used for repeated passage of human neural stem/progenitor cells cultured as so-called neurospheres, as well as for passage of suspension cultures of human embryonic stem cells. We also show that human neural stem/progenitor cells tolerate transient pressure changes far exceeding those that will occur in a fluidic system incorporating the Biogrid microgrids. Thus, by using the Biogrid device it is possible to mechanically passage large quantities of cells in suspension cultures in closed fluidic systems, without the use of proteolytic enzymes.

  18. Economic viability of large-scale fusion systems

    Energy Technology Data Exchange (ETDEWEB)

    Helsley, Charles E., E-mail: cehelsley@fusionpowercorporation.com; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (Gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities, for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is necessary for only a few microseconds or nanoseconds per energy pulse. The economics of large systems that can effectively use more of the driver capacity need to be examined. The assumptions used in this model are specific for the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each, with up to 10 to be serviced by one accelerator. At first this seems expensive, but that impression has to be tempered by the energy output, which is equal to that of 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix but synthetic fuels and water may also need to be incorporated to make the large system
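    The abstract's cost reasoning can be reproduced as back-of-envelope arithmetic. Only the dollar figures and the 35-plant output come from the abstract; the $6/We fission capital benchmark is our own illustrative assumption:

```python
# Capital-cost sketch using only the figures quoted in the abstract.
# The fission benchmark of $6 per watt (electric) is an invented comparison
# value, not a number from the paper.
accelerator_cost = 20e9           # USD, one accelerator (from the abstract)
chamber_cost = 1.5e9              # USD per ignition chamber (from the abstract)
n_chambers = 10                   # chambers served by one accelerator
output_gwe = 35.0                 # output equal to 35 one-GWe plants

fusion_capital = accelerator_cost + n_chambers * chamber_cost  # 35e9 USD
fusion_per_we = fusion_capital / (output_gwe * 1e9)            # USD per We
fission_per_we = 6.0              # assumed benchmark, USD per We

print(round(fusion_per_we, 2))               # 1.0 USD/We of capital
print(fusion_per_we < 0.5 * fission_per_we)  # well under half the benchmark
```

    Spreading the driver cost over many ignition chambers is what drives the per-watt figure down; a single-chamber facility would not amortize the $20 billion accelerator.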

  19. Economic viability of large-scale fusion systems

    International Nuclear Information System (INIS)

    Helsley, Charles E.; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (Gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities, for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is necessary for only a few microseconds or nanoseconds per energy pulse. The economics of large systems that can effectively use more of the driver capacity need to be examined. The assumptions used in this model are specific for the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each, with up to 10 to be serviced by one accelerator. At first this seems expensive, but that impression has to be tempered by the energy output, which is equal to that of 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix but synthetic fuels and water may also need to be incorporated to make the large system economically

  20. Production of margarine fats by enzymatic interesterification with silica-granulated Thermomyces lanuginosa lipase in a large-scale study

    DEFF Research Database (Denmark)

    Zhang, Hong; Xu, Xuebing; Nilsson, Jörgen

    2001-01-01

    Interesterification of a blend of palm stearin and coconut oil (75:25, w/w), catalyzed by an immobilized Thermomyces lanuginosa lipase by silica granulation, Lipozyme TL IM, was studied for production of margarine fats in a 1- or 300-kg pilot-scale batch-stirred tank reactor. Parameters...

  1. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  2. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is: how well do large-scale models simulate the propagation from meteorological to hydrological drought? To answer this question, we evaluated the simulation of drought propagation in an ensemble mean of ten large-scale models, both land-surface models and global hydrological models, that participated in the model intercomparison project of WATCH (WaterMIP). For a selection of case study areas, we studied drought characteristics (number of droughts, duration, severity), drought propagation features (pooling, attenuation, lag, lengthening), and hydrological drought typology (classical rainfall deficit drought, rain-to-snow-season drought, wet-to-dry-season drought, cold snow season drought, warm snow season drought, composite drought).

    Drought characteristics simulated by large-scale models clearly reflected drought propagation; i.e. drought events became fewer and longer when moving through the hydrological cycle. However, more differentiation was expected between fast and slowly responding systems, with slowly responding systems having fewer and longer droughts in runoff than fast responding systems. This was not found using large-scale models. Drought propagation features were poorly reproduced by the large-scale models, because runoff reacted immediately to precipitation in all case study areas. This fast reaction to precipitation, even in cold climates in winter and in semi-arid climates in summer, also greatly influenced the hydrological drought typology as identified by the large-scale models. In general, the large-scale models had the correct representation of drought types, but the percentages of occurrence had some important mismatches, e.g. an overestimation of classical rainfall deficit droughts, and an
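    The drought characteristics studied here (number of events, duration, severity) are conventionally derived with a threshold-level method. The sketch below is a generic illustration on synthetic data, not the WATCH ensemble analysis: smoothing a precipitation-like series into a slow, runoff-like response reproduces the propagation signature described above of fewer but longer events.

```python
import numpy as np

# Threshold-level sketch of drought characteristics: values below the 20th
# percentile form a drought event; duration is the run length and severity
# the accumulated deficit.  A 30-day moving average stands in for a slowly
# responding store (runoff); all parameters are illustrative.
rng = np.random.default_rng(0)
precip = rng.gamma(2.0, 1.0, size=3650)                  # synthetic daily series
runoff = np.convolve(precip, np.ones(30) / 30, "same")   # smoothed response

def drought_events(series, q=0.2):
    """Return (duration, severity) for each run below the q-quantile."""
    thresh = np.quantile(series, q)
    events, run, deficit = [], 0, 0.0
    for val in series:
        if val < thresh:
            run += 1
            deficit += thresh - val
        elif run:
            events.append((run, deficit))
            run, deficit = 0, 0.0
    if run:
        events.append((run, deficit))
    return events

p_events = drought_events(precip)
r_events = drought_events(runoff)
print(len(r_events) < len(p_events))                     # fewer events in "runoff"
print(max(d for d, _ in r_events) > max(d for d, _ in p_events))  # and longer
```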

  3. Large scale analysis of signal reachability.

    Science.gov (United States)

    Todor, Andrei; Gabr, Haitham; Dobra, Alin; Kahveci, Tamer

    2014-06-15

    Major disorders, such as leukemia, have been shown to alter the transcription of genes. Understanding how gene regulation is affected by such aberrations is of utmost importance. One promising strategy toward this objective is to compute whether signals can reach the transcription factors through the transcription regulatory network (TRN). Due to the uncertainty of the regulatory interactions, this is a #P-complete problem and thus solving it for very large TRNs remains a challenge. We develop a novel and scalable method to compute the probability that a signal originating at any given set of source genes can arrive at any given set of target genes (i.e., transcription factors) when the topology of the underlying signaling network is uncertain. Our method tackles this problem for large networks while providing a provably accurate result. It follows a divide-and-conquer strategy: we break down the given network into a sequence of non-overlapping subnetworks such that reachability can be computed autonomously and sequentially on each subnetwork. We represent each interaction using a small polynomial. The product of these polynomials expresses the different scenarios in which a signal can or cannot reach the target genes from the source genes. We introduce polynomial collapsing operators for each subnetwork. These operators reduce the size of the resulting polynomial and thus reduce the computational complexity dramatically. We show that our method scales to entire human regulatory networks in only seconds, while existing methods fail beyond a few tens of genes and interactions. We demonstrate that our method can successfully characterize key reachability characteristics of the entire transcription regulatory networks of patients affected by eight different subtypes of leukemia, as well as those from healthy control samples. All the datasets and code used in this article are available at bioinformatics.cise.ufl.edu/PReach/scalable.htm. © The Author 2014
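    The #P-hard core of the problem can be made concrete on a toy network. The sketch below (an invented three-edge network, not the PReach implementation) enumerates all edge subsets and weights each by its probability, which is exactly the brute force that the paper's polynomial collapsing operators are designed to avoid at genome scale.

```python
from itertools import product

# Each regulatory edge exists independently with its own probability.
# Brute-force reachability: sum the probabilities of all edge subsets in
# which the target is reachable from the source.  Feasible only for toy
# networks (2^|E| subsets) -- hence the need for a scalable method.
edges = {("src", "a"): 0.9, ("a", "tf"): 0.8, ("src", "tf"): 0.3}

def reaches(present, source="src", target="tf"):
    """Depth-first search over the edges that are present."""
    frontier, seen = [source], {source}
    while frontier:
        node = frontier.pop()
        if node == target:
            return True
        for (u, v) in present:
            if u == node and v not in seen:
                seen.add(v)
                frontier.append(v)
    return False

edge_list = list(edges)
prob = 0.0
for mask in product([False, True], repeat=len(edge_list)):
    present = [e for e, keep in zip(edge_list, mask) if keep]
    weight = 1.0
    for e, keep in zip(edge_list, mask):
        weight *= edges[e] if keep else 1.0 - edges[e]
    if reaches(present):
        prob += weight

# Closed form for this graph: 1 - (1 - 0.9*0.8) * (1 - 0.3) = 0.804
print(round(prob, 4))  # 0.804
```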

  4. Environmental Impacts of Large Scale Biochar Application Through Spatial Modeling

    Science.gov (United States)

    Huber, I.; Archontoulis, S.

    2017-12-01

    In an effort to study the environmental (emissions, soil quality) and production (yield) impacts of biochar application at regional scales, we coupled the APSIM-Biochar model with the pSIMS parallel platform. So far the majority of biochar research has concentrated on lab-to-field studies to advance scientific knowledge. Regional-scale assessments are highly needed to assist decision making. The overall objective of this simulation study was to identify areas in the USA that gain the most environmentally from biochar application, as well as areas where our model predicts a notable yield increase due to the addition of biochar. We present the modifications in both the APSIM-Biochar and pSIMS components that were necessary to facilitate these large-scale model runs across several regions in the United States at a resolution of 5 arcminutes. This study uses the AgMERRA global climate data set (1980-2010) and the Global Soil Dataset for Earth Systems modeling as a basis for creating its simulations, as well as local management operations for maize and soybean cropping systems and different biochar application rates. The regional-scale simulation analysis is in progress. Preliminary results showed that the model predicts that high-quality soils (particularly those common to Iowa cropping systems) do not receive much, if any, production benefit from biochar. However, soils with low soil organic matter (around 0.5%) do get a noteworthy yield increase of around 5-10% in the best cases. We also found N2O emissions to be spatially and temporally specific, increasing in some areas and decreasing in others due to biochar application. In contrast, we found increases in soil organic carbon and plant-available water in all soils (top 30 cm) due to biochar application. The magnitude of these increases (% change from the control) was larger in soils with low organic matter (below 1.5%) and smaller in soils with high organic matter (above 3%) and also dependent on biochar

  5. Toward Instructional Leadership: Principals' Perceptions of Large-Scale Assessment in Schools

    Science.gov (United States)

    Prytula, Michelle; Noonan, Brian; Hellsten, Laurie

    2013-01-01

    This paper describes a study of the perceptions that Saskatchewan school principals have regarding large-scale assessment reform and their perceptions of how assessment reform has affected their roles as principals. The findings revealed that large-scale assessments, especially provincial assessments, have affected the principal in Saskatchewan…

  6. A large scale field experiment in the Amazon basin (LAMBADA/BATERISTA)

    NARCIS (Netherlands)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C.

    1995-01-01

    A description is given of a large-scale field experiment planned in the Amazon basin, aimed at assessing the large-scale balances of energy, water and carbon dioxide. The embedding of this experiment in global change programmes is described, viz. the Biospheric Aspects of the Hydrological Cycle

  7. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input into the catchment models. A long-term simulation of this combined system enables the derivation of very long discharge series at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data at 528 stations covering not only all of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany including upstream headwater catchments in neighbouring countries. This continuous large-scale approach overcomes the several
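    The continuous-simulation chain can be sketched end to end in a few lines. Everything below is an illustrative toy (wet-day probability, gamma rainfall, a single linear reservoir), not the weather-generator/SWIM setup described above: synthetic daily rainfall drives a catchment store, and flood quantiles are read from the empirical distribution of annual maxima.

```python
import numpy as np

# Derived flood frequency by continuous simulation, in miniature:
# stochastic daily rainfall -> toy linear-reservoir catchment -> annual
# maxima -> empirical flood quantiles.  All parameter values are invented.
rng = np.random.default_rng(42)
years, days = 1000, 365                 # long synthetic record

wet = rng.random((years, days)) < 0.3   # wet-day occurrence
rain = np.where(wet, rng.gamma(2.0, 5.0, (years, days)), 0.0)  # mm/day

k = 0.1                                 # reservoir recession constant, 1/day
storage = np.zeros(years)               # one store per (independent) year
annual_max = np.zeros(years)
for d in range(days):
    storage = storage + rain[:, d]      # rainfall fills the store
    discharge = k * storage             # linear-reservoir outflow
    storage -= discharge
    annual_max = np.maximum(annual_max, discharge)

# Empirical ~100-year flood from the annual-maxima distribution
q100 = np.quantile(annual_max, 1 - 1 / 100)
print(q100 > np.median(annual_max))     # the tail exceeds a typical year
```

    The long synthetic record is the point: with 10,000 simulated years, a 100-year quantile is estimated from roughly 100 exceedances instead of being extrapolated from a few decades of observations.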

  8. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
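    The link between proper motion and transverse velocity is the standard conversion v_t [km/s] = 4.74 × μ [arcsec/yr] × d [pc], which shows why galaxy-scale transverse motions demand microarcsecond astrometry. The galaxy in this sketch is invented, not a Gaia measurement:

```python
# Standard astrometric conversion: v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc].
# The example galaxy (distance and proper motion) is invented for illustration.
K = 4.74  # km/s per (arcsec/yr * pc)

def transverse_velocity_km_s(mu_microarcsec_yr, distance_mpc):
    """Transverse velocity from a proper motion and a distance."""
    mu_arcsec = mu_microarcsec_yr * 1e-6   # microarcsec/yr -> arcsec/yr
    d_pc = distance_mpc * 1e6              # Mpc -> pc
    return K * mu_arcsec * d_pc

# A galaxy at 10 Mpc moving transversely at ~600 km/s shows a proper motion
# of only ~12.7 microarcseconds per year -- the regime Gaia-like astrometry
# must reach for this probe to work.
v = transverse_velocity_km_s(12.66, 10.0)
print(round(v))  # 600
```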

  9. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured ...

  10. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized based on a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into one topology using an automatic force-based placement algorithm. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and placement problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
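    The final combination step, placing sub-topologies by force analysis, can be sketched with a minimal Fruchterman-Reingold-style layout. This is our own generic illustration (parameter values invented), not the paper's distribution algorithm:

```python
import math
import random

# Minimal force-directed layout: every pair of nodes repels, linked nodes
# attract, and positions relax iteratively with a capped step for stability.
# In the paper's setting the "nodes" would be whole sub-topologies.
def force_layout(nodes, edges, iters=200, k=1.0, cap=0.1, seed=1):
    rng = random.Random(seed)
    pos = {n: [rng.random(), rng.random()] for n in nodes}
    for _ in range(iters):
        disp = {n: [0.0, 0.0] for n in nodes}
        for a in nodes:                          # pairwise repulsion ~ k^2/d
            for b in nodes:
                if a == b:
                    continue
                dx = pos[a][0] - pos[b][0]
                dy = pos[a][1] - pos[b][1]
                dist = math.hypot(dx, dy) or 1e-9
                f = k * k / dist
                disp[a][0] += f * dx / dist
                disp[a][1] += f * dy / dist
        for a, b in edges:                       # attraction along edges ~ d^2/k
            dx = pos[a][0] - pos[b][0]
            dy = pos[a][1] - pos[b][1]
            dist = math.hypot(dx, dy) or 1e-9
            f = dist * dist / k
            disp[a][0] -= f * dx / dist
            disp[a][1] -= f * dy / dist
            disp[b][0] += f * dx / dist
            disp[b][1] += f * dy / dist
        for n in nodes:                          # capped position update
            dx, dy = disp[n]
            d = math.hypot(dx, dy) or 1e-9
            move = min(d, cap)
            pos[n][0] += dx / d * move
            pos[n][1] += dy / d * move
    return pos

pos = force_layout(["a", "b", "c", "d"], [("a", "b"), ("b", "c"), ("c", "d")])
d_ab = math.dist(pos["a"], pos["b"])
d_ad = math.dist(pos["a"], pos["d"])
print(d_ab < d_ad)  # linked nodes end up closer than the chain endpoints
```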

  11. Enhanced saturated fatty acids accumulation in cultures of newly-isolated strains of Schizochytrium sp. and Thraustochytriidae sp. for large-scale biodiesel production.

    Science.gov (United States)

    Wang, Qiuzhen; Sen, Biswarup; Liu, Xianhua; He, Yaodong; Xie, Yunxuan; Wang, Guangyi

    2018-08-01

    Heterotrophic marine protists (thraustochytrids) have received increasing global attention as a renewable, sustainable and alternative source of biodiesel because of their high capacity to accumulate saturated fatty acids (SFAs). Yet, the influence of extrinsic factors (nutrients and environmental conditions) on thraustochytrid culture and the optimal conditions for high SFAs production are poorly described. In the present study, two different thraustochytrid strains, Schizochytrium sp. PKU#Mn4 and Thraustochytriidae sp. PKU#Mn16, were studied for their growth and SFAs production profiles under various conditions (carbon, nitrogen, temperature, pH, KH2PO4, salinity, and agitation speed). Of the culture conditions, substrate (C and N) source and concentration, temperature, and agitation speed significantly influenced the cell growth and SFAs production of both strains. Although both strains were capable of growth and SFAs production over a broad range of culture conditions, their physiological responses to KH2PO4, pH, and salinity were dissimilar. Under their optimal batch culture conditions, peak SFAs production of 3.3 g/L and 2.2 g/L with 62% and 49% SFAs contents (relative to total fatty acids) was achieved, respectively. The results of 5-L fed-batch fermentation under optimal conditions showed a nearly 4.5-fold increase in SFAs production (i.e., 7.5 g/L) by both strains compared to unoptimized conditions. Of the two strains, the quality of biodiesel produced from the fatty acids of PKU#Mn4 met the biodiesel standard defined by ASTM D6751. This study, to the knowledge of the authors, is the first comprehensive report of optimal fermentation conditions demonstrating enhanced SFAs production by strains belonging to two different thraustochytrid genera, and it provides the basis for large-scale biodiesel production. Copyright © 2018. Published by Elsevier B.V.

  12. No Large Scale Curvature Perturbations during Waterfall of Hybrid Inflation

    OpenAIRE

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2010-01-01

    In this paper the possibility of generating large scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical back-reactions to terminate inflation. If one considers only the clas...

  13. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    Full Text Available The concept of “large scale” obviously depends on the phenomenon we are interested in. For example, in the field of the foundation of thermodynamics from microscopic dynamics, the large spatial and time scales are of the order of fractions of a millimetre and of microseconds, respectively, or less, and are defined in relation to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics problems, the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all cases a Zwanzig projection approach is, at least in principle, an effective tool for obtaining classes of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundation-of-thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).
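The projection approach referred to above is often written in the Nakajima-Zwanzig form; as a schematic reminder (the standard textbook form, not an equation taken from this abstract), with P the projector onto the degrees of freedom of interest and Q = 1 - P:

```latex
\frac{\partial}{\partial t}\, P\rho(t)
  = P L P \rho(t)
  + \int_0^{t} P L\, e^{Q L s}\, Q L\, P\rho(t-s)\, \mathrm{d}s
  + P L\, e^{Q L t}\, Q\rho(0), \qquad Q \equiv 1 - P .
```

Here L is the (generally non-Hamiltonian) Liouville-like generator of the full dynamics; the memory integral is the term whose differential-operator series the Lie-algebra machinery described in the abstract is meant to make tractable.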

  14. Pseudoscalar-photon mixing and the large scale alignment of QSO ...

    Indian Academy of Sciences (India)

    physics pp. 679-682. Pseudoscalar-photon mixing and the large scale alignment of QSO optical polarizations. PANKAJ JAIN, SUKANTA PANDA and S SARALA. Physics Department, Indian Institute of Technology, Kanpur 208 016, India. Abstract. We review the observation of large scale alignment of QSO optical polariza-.

  15. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

    We review different theories of the formation of the large scale structure of the Universe. Special emphasis is put on the theory of inertial instability. We show that for a large class of initial spectra the resulting two-point correlation functions are similar. We also discuss the adhesion theory, which uses the Burgers equation, the Navier-Stokes equation or a coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally, we discuss the role of the velocity potential in determining the global characteristics of large scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and the main successes of the theory of formation of large scale structure. (orig.)

  16. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt, Patrick Hulin, Tim Leek, Fredrich Ulrich, Ryan Whelan (Authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  17. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  18. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

    This thesis presents my contribution to the measurement of the homogeneity scale using galaxies, with a cosmological interpretation of the results. In physics, any model is characterized by a set of principles. Most models in cosmology are based on the Cosmological Principle, which states that the universe is statistically homogeneous and isotropic on large scales. Today, this principle is considered to be true since it is respected by the cosmological models that accurately describe the observations. However, while the isotropy of the universe is now confirmed by many experiments, this is not the case for homogeneity. To study cosmic homogeneity, we propose not only to test a model but to test directly one of the postulates of modern cosmology. Since the 1998 measurements of cosmic distances using type Ia supernovae, we have known that the universe is now in a phase of accelerated expansion. This phenomenon can be explained by the addition of an unknown energy component, which is called dark energy. Since dark energy is responsible for the accelerated expansion of the universe, we can study this mysterious fluid by measuring the rate of expansion of the universe. The universe has imprinted in its matter distribution a standard ruler, the Baryon Acoustic Oscillation (BAO) scale. By measuring this scale at different times during the evolution of our universe, it is possible to measure the rate of expansion of the universe and thus characterize this dark energy. Alternatively, we can use the homogeneity scale to study this dark energy. Studying the homogeneity and the BAO scale requires the statistical study of the matter distribution of the universe at large scales, larger than tens of megaparsecs. Galaxies and quasars form in the vast overdensities of matter and are very luminous: these sources trace the distribution of matter. 
By measuring the emission spectra of these sources using large spectroscopic surveys, such as BOSS and eBOSS, we can measure their positions

  19. Impacts of large-scale climatic disturbances on the terrestrial carbon cycle

    Directory of Open Access Journals (Sweden)

    Lucht Wolfgang

    2006-07-01

    Full Text Available Abstract Background The amount of carbon dioxide in the atmosphere steadily increases as a consequence of anthropogenic emissions, but with large interannual variability caused by the terrestrial biosphere. These variations in the CO2 growth rate are caused by large-scale climate anomalies, but the relative contributions of vegetation growth and soil decomposition are uncertain. We use a biogeochemical model of the terrestrial biosphere to differentiate the effects of temperature and precipitation on net primary production (NPP) and heterotrophic respiration (Rh) during the two largest anomalies in atmospheric CO2 increase during the last 25 years. One of these, the smallest atmospheric year-to-year increase (largest land carbon uptake) in that period, was caused by global cooling in 1992/93 after the Pinatubo volcanic eruption. The other, the largest atmospheric increase on record (largest land carbon release), was caused by the strong El Niño event of 1997/98. Results We find that the LPJ model correctly simulates the magnitude of terrestrial modulation of atmospheric carbon anomalies for these two extreme disturbances. The response of soil respiration to changes in temperature and precipitation explains most of the modelled anomalous CO2 flux. Conclusion Observed and modelled NEE anomalies are in good agreement, and we therefore suggest that the temporal variability of heterotrophic respiration produced by our model is reasonably realistic. We conclude that during the last 25 years the two largest disturbances of the global carbon cycle were strongly controlled by soil processes rather than by the response of vegetation to these large-scale climatic events.

  20. Basic physical phenomena, neutron production and scaling of the dense plasma focus

    International Nuclear Information System (INIS)

    Kaeppeler, H.J.

    This paper presents an attempt at establishing a model theory for the dense plasma focus in order to present a consistent interpretation of the basic physical phenomena leading to neutron production from both acceleration and thermal processes. To achieve this, the temporal history of the focus is divided into the compression of the plasma sheath, a quiescent and very dense phase with ensuing expansion, and an unstable phase where the focus plasma is disrupted by instabilities. Finally, the decay of the density, velocity and thermal fields is considered. Under the assumption that I₀²/(σ₀R₀²) = const and t₀/T_c = const, scaling laws for plasma focus devices are derived. It is shown that while the neutron yield generally scales with the fourth power of the maximum current, neutron production from thermal processes becomes increasingly important for large devices, while in small devices neutron production from acceleration processes is by far predominant. (orig.) [de
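The quoted fourth-power current scaling implies strong returns on device size. A minimal sketch (only the exponent comes from the abstract; the helper function and example currents are hypothetical):

```python
def neutron_yield_ratio(i1, i2, exponent=4.0):
    """Ratio of expected neutron yields for two peak currents,
    assuming the Y_n ∝ I_max^4 scaling stated in the abstract."""
    return (i2 / i1) ** exponent

# Doubling the peak current predicts a 16-fold yield increase.
print(neutron_yield_ratio(1.0, 2.0))  # → 16.0
```

Tripling the current likewise predicts an 81-fold increase, which is why the thermal contribution identified in the paper matters mainly for large devices.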

  1. A refined regional modeling approach for the Corn Belt - Experiences and recommendations for large-scale integrated modeling

    Science.gov (United States)

    Panagopoulos, Yiannis; Gassman, Philip W.; Jha, Manoj K.; Kling, Catherine L.; Campbell, Todd; Srinivasan, Raghavan; White, Michael; Arnold, Jeffrey G.

    2015-05-01

    Nonpoint source pollution from agriculture is the main source of nitrogen and phosphorus in the stream systems of the Corn Belt region in the Midwestern US. This region comprises two large river basins, the intensely row-cropped Upper Mississippi River Basin (UMRB) and Ohio-Tennessee River Basin (OTRB), which are considered the key contributing areas for the Northern Gulf of Mexico hypoxic zone according to the US Environmental Protection Agency. Thus, in this area it is of utmost importance to ensure that intensive agriculture for food, feed and biofuel production can coexist with a healthy water environment. To address these objectives within a river basin management context, an integrated modeling system has been constructed with the hydrologic Soil and Water Assessment Tool (SWAT) model, capable of estimating river basin responses to alternative cropping and/or management strategies. To improve modeling performance compared to previous studies and provide a spatially detailed basis for scenario development, this SWAT Corn Belt application incorporates a greatly refined subwatershed structure based on 12-digit hydrologic units or 'subwatersheds' as defined by the US Geological Survey. The model setup, calibration and validation are time-demanding and challenging tasks for these large systems, given the intensive data requirements at this scale and the need to ensure the reliability of flow and pollutant load predictions at multiple locations. Thus, the objectives of this study are both to comprehensively describe this large-scale modeling approach, providing estimates of pollution and crop production in the region, and to present strengths and weaknesses of integrated modeling at such a large scale along with how it can be improved on the basis of the current modeling structure and results. 
The predictions were based on a semi-automatic hydrologic calibration approach for large-scale and spatially detailed modeling studies, with the use of the Sequential

  2. The impact of a large-scale quality improvement programme on work engagement: preliminary results from a national cross-sectional-survey of the 'Productive Ward'.

    Science.gov (United States)

    White, Mark; Wells, John S G; Butterworth, Tony

    2014-12-01

    Quality improvement (QI) programmes, like the Productive Ward: Releasing-time-to-care initiative, aim to 'engage' and 'empower' ward teams to actively participate, innovate and lead quality improvement at the front line. However, little is known about the relationship and impact that QI work has on the 'engagement' of the clinical teams who participate, and vice versa. This paper explores and examines the impact of a large-scale QI programme, the Productive Ward, on the 'work engagement' of the nurses and ward teams involved. Using the Utrecht Work Engagement Scale (UWES), we surveyed, measured and analysed work engagement in a representative test group of hospital-based ward teams who had recently commenced the latest phase of the national 'Productive Ward' initiative in Ireland, and compared them to a control group of similar size, matched (as far as possible) on variables such as ward size, employment grade and clinical specialty area. 338 individual datasets were recorded: n=180 (53.6%) from the Productive Ward group and n=158 (46.4%) from the control group; the overall response rate was 67%, and did not differ significantly between the Productive Ward and control groups. The work engagement mean score (±standard deviation) was 4.33 (±0.88) in the Productive group and 4.07 (±1.06) in the control group, representing a modest but statistically significant between-group difference (p=0.013, independent samples t-test). Similarly modest differences were observed in all three dimensions of the work engagement construct. Employment grade and clinical specialty area were also significantly related to the work engagement score. The results suggest that the Productive Ward has a positive impact on the work engagement (the vigour, absorption and dedication) of ward-based teams. The use and suitability of the UWES as an appropriate measure of 'engagement' in QI interventions was confirmed. The engagement of nurses and front-line clinical teams is a major component of creating, developing and sustaining a culture of improvement. Copyright
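The between-group comparison above can be reproduced from the summary statistics alone. A minimal sketch (assuming Welch's unequal-variance form of the independent samples t-test, which the abstract does not specify):

```python
import math

def welch_t(mean1, sd1, n1, mean2, sd2, n2):
    """Welch's independent-samples t statistic from summary data.
    Whether the paper used Welch's or the pooled-variance variant
    is an assumption here."""
    se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
    return (mean1 - mean2) / se

# Summary statistics quoted in the abstract:
t = welch_t(4.33, 0.88, 180, 4.07, 1.06, 158)
print(round(t, 2))  # → 2.43, consistent with the reported p = 0.013
```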

  3. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
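The "reduce then sample" strategy can be illustrated with the sampling side in isolation: a random-walk Metropolis sampler that would be pointed at a cheap reduced-order log-posterior. Everything below is an illustrative toy (a stand-in 1-D Gaussian posterior), not the SAGUARO code:

```python
import math
import random

def metropolis(logpost, n_samples=5000, step=1.0, x0=0.0, seed=7):
    """Random-walk Metropolis sampler. In a 'reduce then sample'
    setting, `logpost` would be a cheap reduced-order surrogate of
    the full forward model rather than the full model itself."""
    rng = random.Random(seed)
    x, lp = x0, logpost(x0)
    samples = []
    for _ in range(n_samples):
        cand = x + rng.gauss(0.0, step)
        lp_cand = logpost(cand)
        # Accept with probability min(1, posterior ratio).
        if math.log(rng.random()) < lp_cand - lp:
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Toy 1-D standard-normal posterior standing in for a reduced model:
chain = metropolis(lambda x: -0.5 * x * x)
print(round(sum(chain) / len(chain), 2))  # sample mean, near the true mean 0
```

Swapping the surrogate for the full model inside the loop is exactly the cost the reduced-order approach avoids.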

  4. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  5. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    Full Text Available This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four sets of large-scale map data. Errors in the map data were used for a risk assessment of decision-making about the localization of objects, e.g. for land-use planning in the realization of investments. An analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of map data).
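Identifying shift vectors of control points with position errors suggests a simple summary computation. A hypothetical sketch (illustrative numbers, not the paper's data, and a far simpler treatment than its risk analysis):

```python
import math

def position_errors(shift_vectors):
    """Radial position errors |(dx, dy)| of control points and their RMSE."""
    radial = [math.hypot(dx, dy) for dx, dy in shift_vectors]
    rmse = math.sqrt(sum(r * r for r in radial) / len(radial))
    return radial, rmse

# Hypothetical shift vectors (metres) between mapped and true positions:
shifts = [(0.10, 0.05), (-0.20, 0.10), (0.05, -0.15), (0.00, 0.25)]
_, rmse = position_errors(shifts)
print(round(rmse, 3))  # → 0.194
```

An RMSE like this, compared against the positional tolerance of a planned object, is one simple way to express the decision-making risk the paper estimates.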

  6. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. 
It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  7. Sub-surface laser nanostructuring in stratified metal/dielectric media: a versatile platform towards flexible, durable and large-scale plasmonic writing

    International Nuclear Information System (INIS)

    Siozios, A; Bellas, D V; Lidorikis, E; Patsalas, P; Kalfagiannis, N; Cranton, W M; Koutsogeorgis, D C; Bazioti, C; Dimitrakopulos, G P; Vourlias, G

    2015-01-01

    Laser nanostructuring of pure ultrathin metal layers or ceramic/metal composite thin films has emerged as a promising route for the fabrication of plasmonic patterns with applications in information storage, cryptography, and security tagging. However, the environmental sensitivity of pure Ag layers and the complexity of ceramic/metal composite film growth hinder the implementation of this technology to large-scale production, as well as its combination with flexible substrates. In the present work we investigate an alternative pathway, namely, starting from non-plasmonic multilayer metal/dielectric layers, whose growth is compatible with large scale production such as in-line sputtering and roll-to-roll deposition, which are then transformed into plasmonic templates by single-shot UV-laser annealing (LA). This entirely cold, large-scale process leads to a subsurface nanoconstruction involving plasmonic Ag nanoparticles (NPs) embedded in a hard and inert dielectric matrix on top of both rigid and flexible substrates. The subsurface encapsulation of Ag NPs provides durability and long-term stability, while the cold character of LA suits the use of sensitive flexible substrates. The morphology of the final composite film depends primarily on the nanocrystalline character of the dielectric host and its thermal conductivity. We demonstrate the emergence of a localized surface plasmon resonance, and its tunability depending on the applied fluence and environmental pressure. The results are well explained by theoretical photothermal modeling. Overall, our findings qualify the proposed process as an excellent candidate for versatile, large-scale optical encoding applications. (paper)

  8. Large-scale Flow and Transport of Magnetic Flux in the Solar ...

    Indian Academy of Sciences (India)


    Abstract. The horizontal large-scale velocity field describes the horizontal displacement of the photospheric magnetic flux in the zonal and meridional directions. The flow systems of solar plasma, constructed according to the velocity field, create large-scale cellular-like patterns with up-flow in the center and down-flow on the ...

  9. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large-scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses at the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large-scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
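The classic Boolean visibility mentioned above reduces, along a single terrain profile, to checking whether any intermediate cell rises above the observer-target sight line. A minimal 1-D sketch (hypothetical profiles; real viewshed analyses operate on 2-D rasters):

```python
def visible(profile, observer_height=1.7):
    """Boolean line-of-sight along a terrain profile from the observer
    (index 0) to the target (last index). The target is hidden if any
    intermediate cell's elevation angle exceeds the target's."""
    z0 = profile[0] + observer_height
    n = len(profile) - 1
    target_slope = (profile[n] - z0) / n
    return all((profile[i] - z0) / i < target_slope for i in range(1, n))

flat = [100, 100, 100, 100, 100]
ridge = [100, 100, 120, 100, 100]   # a barrier between observer and target
print(visible(flat), visible(ridge))  # → True False
```

Extended viewsheds such as the angle-above-horizon measure in the paper would return the slope differences themselves rather than collapsing them to a Boolean.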

  10. Large-scale modeling of rain fields from a rain cell deterministic model

    Science.gov (United States)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (~20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (~150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
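The thresholding step that turns the Gaussian field into a binary rain/no-rain field can be sketched as follows. This is a 1-D stdlib illustration with a crude moving-average correlation standing in for the anisotropic covariance; all parameter values are hypothetical, not the paper's:

```python
import random

def binary_rain_field(n=64, smooth=3, occupation_rate=0.3, seed=1):
    """Turn a (crudely) correlated Gaussian field into a binary
    rain/no-rain mask whose raining fraction matches a prescribed
    occupation rate, via quantile thresholding."""
    rng = random.Random(seed)
    noise = [rng.gauss(0.0, 1.0) for _ in range(n)]
    # Moving-average smoothing stands in for the covariance structure.
    field = []
    for i in range(n):
        window = noise[max(0, i - smooth):i + smooth + 1]
        field.append(sum(window) / len(window))
    # Threshold at the quantile that yields the requested raining fraction.
    threshold = sorted(field)[int(n * (1.0 - occupation_rate))]
    return [1 if v >= threshold else 0 for v in field]

mask = binary_rain_field()
print(sum(mask) / len(mask))  # → 0.3125, close to the 0.3 occupation rate
```

In the paper the occupation rate itself is drawn from a distribution fitted to ARAMIS radar observations rather than fixed as here.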

  11. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    Science.gov (United States)

    2012-01-01

    Large-scale cooperative task distribution on peer-to-peer networks...disadvantages of ML-Chord are its fixed size (two layers) and limited scalability for large-scale systems. RC-Chord extends ML-Chord. D. Karrels et al...configurable before runtime. This can be improved by incorporating a distributed learning algorithm to tune the number and range of the DLoE tracking

  12. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years the number, complexity and size of Large Scale Networks have increased. The best example of a Large Scale Network is the Internet, and more recent ones are data centers in Cloud Environments. In this setting, the involvement of several management tasks such as traffic monitoring, security and performance optimization is a big task for the Network Administrator. This research report studies different protocols, i.e. conventional protocols like Simple Network Management Protocol and newly Gossip bas...

  13. Large-scale ash recycling in Central Sweden; Storskalig askhantering i mellansverige

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Mats [Stora Skog (Sweden)

    1998-08-01

    When logging residues (tops, branches) are withdrawn from the forest, most of the nutrient content of the trees is also lost. Some of the nutrient content of the soil is restored by weathering, but not all. When biomass is burnt as fuel, most of the nutrients will be found in the ash. By recycling wood ash in similar amounts as was withdrawn with the biomass, it is possible to compensate for the nutrient losses. This project was initiated to study how a rational recycling of wood ash could be performed under conditions valid for Stora, a large forest company in the middle of Sweden. A second aim was to give guiding principles for Stora's own ash recycling while awaiting instructions from the authorities. In the project both theoretical studies and practical field studies were carried out. Studied areas are production of a stabilised ash product and different systems for transport and spreading of the ash product. The costs and results of spreading have also been monitored. The project showed that spreading of the ash can normally only take place when there is no snow. If production or transport is carried out during another time of the year, the ash has to be stored, either at the industry, in an intermediate storage, or in the forest. One important conclusion from the test period was that the result of the spreading depends heavily on the quality of the ash. Some of the ashes hardened in the spreading equipment, causing a complete stop of the spreading. It also caused problems if the ash was too wet. Plate-spreaders led to unequal quality of spreading, where some areas got more ash and some got less. Granulated ash was the easiest to spread. The recommended system for spreading ash is: granulated ash transported unpacked in separate transports with lorries with exchangeable platforms, with a large forest tractor spreading the ash in clearings in the summer. The project has shown that large-scale ash recycling is possible to realize. 22 figs, 5 tabs, 13 appendices

  14. Operation Modeling of Power Systems Integrated with Large-Scale New Energy Power Sources

    Directory of Open Access Journals (Sweden)

    Hui Li

    2016-10-01

    Full Text Available In most current methods of probabilistic power system production simulation, the output characteristics of new energy power generation (NEPG) have not been comprehensively considered. In this paper, the power output characteristics of wind power generation and photovoltaic power generation are first analyzed with statistical methods applied to their historical operating data. Then the characteristic indexes and the filtering principle of the NEPG historical output scenarios are introduced with the confidence level, and a calculation model of NEPG's credible capacity is proposed. On this basis, taking the minimum production cost or the best energy-saving and emission-reduction effect as the optimization objective, a power system operation model with large-scale integration of NEPG is established, considering the power balance, the electricity balance and the peak balance. Besides, the constraints of the operating characteristics of different power generation types, the maintenance schedule, the load reserve, the emergency reserve, water abandonment and the transmission capacity between different areas are also considered. With the proposed power system operation model, operation simulations are carried out on the actual Northwest power grid of China, resolving new energy power accommodation under different system operating conditions. The simulation results verify the validity of the proposed power system operation model in the accommodation analysis for a power system penetrated with large-scale NEPG.
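The accommodation question at the heart of such a model can be illustrated with a one-hour power balance. This hypothetical sketch keeps only must-run thermal limits and wind curtailment, omitting the reserve, maintenance and inter-area constraints the paper includes:

```python
def dispatch_hour(demand, wind_available, thermal_min, thermal_max):
    """One-hour power-balance sketch: wind is accommodated up to the
    room left above must-run thermal generation; the excess is curtailed.
    All quantities in MW; a deliberate simplification, not the paper's model."""
    wind_room = max(0, demand - thermal_min)
    wind_used = min(wind_available, wind_room)
    thermal = min(max(demand - wind_used, thermal_min), thermal_max)
    curtailed = wind_available - wind_used
    return wind_used, thermal, curtailed

# Night hour: low demand and must-run thermal force wind curtailment.
print(dispatch_hour(demand=800, wind_available=400,
                    thermal_min=500, thermal_max=1200))  # → (300, 500, 100)
```

Repeating this balance over all hours of a year, with the credible-capacity and reserve constraints added, is essentially what the operation simulation in the paper does at scale.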

  15. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

    We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as well as a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'Scaled' Quantum Mechanical like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N. Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4)).

  16. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik

    2013-01-01

    with this imbalance and to reduce its high dependence on oil production. For this reason, it is interesting to analyse the extent to which transport electrification can further the renewable energy integration. This paper quantifies this issue in Inner Mongolia, where the share of wind power in the electricity supply...... was 6.5% in 2009 and which has the plan to develop large-scale wind power. The results show that electric vehicles (EVs) have the ability to balance the electricity demand and supply and to further the wind power integration. In the best case, the energy system with EV can increase wind power...... integration by 8%. The application of EVs benefits from saving both energy system cost and fuel cost. However, the negative consequences of decreasing energy system efficiency and increasing the CO2 emission should be noted when applying the hydrogen fuel cell vehicle (HFCV). The results also indicate...

  17. Multi-scale modeling for sustainable chemical production.

    Science.gov (United States)

    Zhuang, Kai; Bakshi, Bhavik R; Herrgård, Markus J

    2013-09-01

    With recent advances in metabolic engineering, it is now technically possible to produce a wide portfolio of existing petrochemical products from biomass feedstock. In recent years, a number of modeling approaches have been developed to support the engineering and decision-making processes associated with the development and implementation of a sustainable biochemical industry. The temporal and spatial scales of modeling approaches for sustainable chemical production vary greatly, ranging from metabolic models that aid the design of fermentative microbial strains to material and monetary flow models that explore the ecological impacts of all economic activities. Research efforts that attempt to connect the models at different scales have been limited. Here, we review a number of existing modeling approaches and their applications at the scales of metabolism, bioreactor, overall process, chemical industry, economy, and ecosystem. In addition, we propose a multi-scale approach for integrating the existing models into a cohesive framework. The major benefit of this proposed framework is that the design and decision-making at each scale can be informed, guided, and constrained by simulations and predictions at every other scale. In addition, the development of this multi-scale framework would promote cohesive collaborations across multiple traditionally disconnected modeling disciplines to achieve sustainable chemical production. Copyright © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  19. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs

  20. Cattle mammary bioreactor generated by a novel procedure of transgenic cloning for large-scale production of functional human lactoferrin.

    Directory of Open Access Journals (Sweden)

    Penghua Yang

    Full Text Available Large-scale production of biopharmaceuticals by current bioreactor techniques is limited by low transgenic efficiency and low expression of foreign proteins. In general, a bacterial artificial chromosome (BAC) harboring most regulatory elements is capable of overcoming these limitations, but transferring a BAC into donor cells is difficult. We describe here the use of a cattle mammary bioreactor to produce functional recombinant human lactoferrin (rhLF) by a novel procedure of transgenic cloning, which employs microinjection to generate transgenic somatic cells as donor cells. Bovine fibroblast cells were co-microinjected for the first time with a 150-kb BAC carrying the human lactoferrin gene and a marker gene. The resulting transfection efficiency of up to 15.79 x 10^(-2) percent was notably higher than that of electroporation and lipofection. Following somatic cell nuclear transfer, we obtained two transgenic cows that secreted rhLF at high levels, 2.5 g/l and 3.4 g/l, respectively. The rhLF had a similar pattern of glycosylation and proteolytic susceptibility to that of its natural human counterpart. Biochemical analysis revealed that the iron-binding and releasing properties of rhLF were identical to those of native hLF. Importantly, an antibacterial experiment further demonstrated that rhLF was functional. Our results indicate that co-microinjection of a BAC and a marker gene into donor cells for somatic cell cloning indeed improves transgenic efficiency. Moreover, cattle mammary bioreactors generated with this novel procedure produce functional rhLF on an industrial scale.

  1. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large scale cross hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous. This heterogeneity is not smoothed out even over scales of hundreds of meters. Results of the interpretation validate the hypothesis of the major fracture zones, A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie within a narrow range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually; this could be improved by specifically testing the high hydraulic conductivity regions. The Piezomac and single hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response. This could be either because there is no hydraulic connection, or because there is a connection but a response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  2. Investigation of factors influencing biogas production in a large-scale thermophilic municipal biogas plant

    Energy Technology Data Exchange (ETDEWEB)

    Weiss, Agnes; Jerome, Valerie; Freitag, Ruth [Bayreuth Univ. (Germany). Chair for Process Biotechnology; Burghardt, Diana; Likke, Likke; Peiffer, Stefan [Bayreuth Univ. (Germany). Dept. of Hydrology; Hofstetter, Eugen M. [RVT Process Equipment GmbH, Steinwiesen (Germany); Gabler, Ralf [BKW Biokraftwerke Fuerstenwalde GmbH, Fuerstenwalde (Germany)

    2009-10-15

    A continuously operated, thermophilic, municipal biogas plant was observed over 26 months (sampling twice per month) with regard to a number of physicochemical parameters and the biogas production. Biogas yields were correlated with parameters such as the volatile fatty acid concentration, the pH and the ammonium concentration. When the resident microbiota was classified via analysis of the 16S rRNA genes, most bacterial sequences matched unidentified or uncultured bacteria from similar habitats. Of the archaeal sequences, 78.4% were identified as belonging to the genus Methanoculleus, which has not previously been reported for biogas plants, but is known to efficiently use the H{sub 2} and CO{sub 2} produced during the degradation of fatty acids by syntrophic microorganisms. In order to further investigate the influence of varied amounts of ammonia (2-8 g/L) and volatile fatty acids on biogas production and composition (methane/CO{sub 2}), laboratory scale satellite experiments were performed in parallel to the technical plant. Finally, ammonia stripping of the process water of the technical plant was implemented, a measure through which the ammonia entering the biogas reactor via the mash could be nearly halved and which increased the energy output of the biogas plant by almost 20%. (orig.)
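The kind of yield-versus-parameter correlation described above can be sketched in a few lines. The ammonium and biogas values below are invented stand-ins, not data from the plant; they merely illustrate computing a Pearson coefficient over a sampling campaign.

```python
# Correlating biogas yield with ammonium concentration across
# sampling dates (hypothetical numbers for illustration only).
import numpy as np

ammonium = np.array([2.1, 3.4, 4.8, 6.0, 7.5, 7.9])   # g/L
biogas = np.array([410, 395, 380, 340, 300, 285])     # yield per sample

# Pearson correlation coefficient between the two series.
r = np.corrcoef(ammonium, biogas)[0, 1]
```

A strongly negative `r` would indicate that rising ammonium levels accompany falling biogas yields, consistent with ammonia inhibition of methanogenesis.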

  3. On the Renormalization of the Effective Field Theory of Large Scale Structures

    OpenAIRE

    Pajer, Enrico; Zaldarriaga, Matias

    2013-01-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory o...

  4. A decomposition heuristics based on multi-bottleneck machines for large-scale job shop scheduling problems

    Directory of Open Access Journals (Sweden)

    Yingni Zhai

    2014-10-01

    large-scale job shops, and will be helpful for the discrete manufacturing industry for improving the production efficiency and effectiveness.

  5. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    Science.gov (United States)

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
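As a rough sketch of the supervised gradient-boosting setup (not Envision's actual features, data, or hyperparameters), one can train a boosted regressor on synthetic variant features and score it on a held-out split:

```python
# Gradient-boosting regression on synthetic "variant effect" data.
# Features and target are invented; Envision's real inputs are
# biochemical/structural properties of each missense variant.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 8))                 # stand-in variant features
y = 0.7 * X[:, 0] - 0.3 * X[:, 1] + rng.normal(scale=0.1, size=500)

model = GradientBoostingRegressor(n_estimators=200, learning_rate=0.05)
model.fit(X[:400], y[:400])                   # train on 400 "variants"
score = model.score(X[400:], y[400:])         # R^2 on held-out variants
```

The held-out score mimics Envision's use of an independent test set that never touched hyperparameter tuning or training.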

  6. Mass dependence of Higgs boson production at large transverse momentum through a bottom-quark loop

    Science.gov (United States)

    Braaten, Eric; Zhang, Hong; Zhang, Jia-Wei

    2018-05-01

    In the production of the Higgs through a bottom-quark loop, the transverse momentum distribution of the Higgs at large P_T is complicated by its dependence on two other important scales: the bottom quark mass m_b and the Higgs mass m_H. A strategy for simplifying the calculation of the cross section at large P_T is to calculate only the leading terms in its expansion in m_b^2/P_T^2. In this paper, we consider the bottom-quark-loop contribution to the parton process q q̄ → H + g at leading order in α_s. We show that the leading power of 1/P_T^2 can be expressed in the form of a factorization formula that separates the large scale P_T from the scale of the masses. All the dependence on m_b and m_H can be factorized into a distribution amplitude for b b̄ in the Higgs, a distribution amplitude for b b̄ in a real gluon, and an end point contribution. The factorization formula can be used to organize the calculation of the leading terms in the expansion in m_b^2/P_T^2 so that every calculation involves at most two scales.
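The expansion strategy, keeping only the leading power of a small mass ratio, can be illustrated with a toy symbolic computation. The function below is an invented stand-in, not the actual cross section; it only demonstrates truncating a series in a small parameter r playing the role of m_b^2/P_T^2.

```python
# Expand a stand-in function of the small ratio r = mb^2/PT^2 around
# r = 0 and keep only the leading power, mimicking the strategy of
# dropping mass-suppressed terms at large transverse momentum.
import sympy as sp

r = sp.symbols('r', positive=True)       # r stands in for mb^2/PT^2
f = sp.log(1 + r) / (1 + r)              # invented mass-dependent factor
leading = sp.series(f, r, 0, 2).removeO()  # terms below O(r^2)
```

Here the leading behaviour is simply r itself; in the actual calculation the analogous truncation is what lets each piece of the factorization formula involve at most two scales.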

  7. Improving the Communication Pattern in Matrix-Vector Operations for Large Scale-Free Graphs by Disaggregation

    Energy Technology Data Exchange (ETDEWEB)

    Kuhlemann, Verena [Emory Univ., Atlanta, GA (United States); Vassilevski, Panayot S. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2013-10-28

    Matrix-vector multiplication is the key operation in any Krylov-subspace iteration method. We are interested in Krylov methods applied to problems associated with the graph Laplacian arising from large scale-free graphs. Furthermore, computations with graphs of this type on parallel distributed-memory computers are challenging. This is due to the fact that scale-free graphs have a degree distribution that follows a power law, and currently available graph partitioners are not efficient for such an irregular degree distribution. The lack of a good partitioning leads to excessive interprocessor communication requirements during every matrix-vector product. Here, we present an approach to alleviate this problem based on embedding the original irregular graph into a more regular one by disaggregating (splitting up) vertices in the original graph. The matrix-vector operations for the original graph are performed via a factored triple matrix-vector product involving the embedding graph. And even though the latter graph is larger, we are able to decrease the communication requirements considerably and improve the performance of the matrix-vector product.
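The factored triple matrix-vector product described above can be sketched with tiny stand-in matrices: rather than forming the original-graph operator, one applies three sparse matvecs through the embedding graph. The matrices below are illustrative, not from the report.

```python
# Factored triple product: A = R @ B @ P, where P maps original
# vertices to disaggregated copies, B acts on the embedding graph,
# and R aggregates back. Applying A as three matvecs avoids forming A.
import numpy as np
import scipy.sparse as sp

# Original graph has 3 vertices; vertex 0 is split into two copies,
# giving a 4-vertex embedding graph.
P = sp.csr_matrix(np.array([[1, 0, 0],
                            [1, 0, 0],
                            [0, 1, 0],
                            [0, 0, 1]], dtype=float))
B = sp.random(4, 4, density=0.5, random_state=0, format="csr")
R = P.T  # aggregation back to the original vertices

x = np.ones(3)
y_factored = R @ (B @ (P @ x))   # three sparse matvecs
y_direct = (R @ B @ P) @ x       # explicit triple product, same result
```

Both forms agree; the payoff in the paper is that the embedding graph's more regular degree distribution partitions better, cutting interprocessor communication.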

  8. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

    STANFORD, CALIFORNIA 94305 METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright...typical iteration can be partitioned so that where B is an m X m basis matrix. This partition effectively divides the variables into three classes... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library

  9. Generation of large-scale vorticity in rotating stratified turbulence with inhomogeneous helicity: mean-field theory

    Science.gov (United States)

    Kleeorin, N.

    2018-06-01

    We discuss a mean-field theory of the generation of large-scale vorticity in a rotating density stratified developed turbulence with inhomogeneous kinetic helicity. We show that the large-scale non-uniform flow is produced due to either a combined action of a density stratified rotating turbulence and uniform kinetic helicity or a combined effect of a rotating incompressible turbulence and inhomogeneous kinetic helicity. These effects result in the formation of a large-scale shear, and in turn its interaction with the small-scale turbulence causes an excitation of the large-scale instability (known as a vorticity dynamo) due to a combined effect of the large-scale shear and Reynolds stress-induced generation of the mean vorticity. The latter is due to the effect of large-scale shear on the Reynolds stress. A fast rotation suppresses this large-scale instability.

  10. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise, for example, in LNG marine transportation accidents, or in liquid cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large scale situations. More recently, there has been a trend towards larger experiments, and some of these have produced explosions of moderately high efficiency. Although the occurrence of such large scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single drop experiments. In the last two years several theoretical models for large scale explosions have appeared which attempt a self contained explanation of at least some stages of such high yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full

  11. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
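Broyden's rank-one Jacobian update, the core of the method compared in the report, can be sketched for a small system. The test problem and the finite-difference initialization below are illustrative choices, not the report's limited-memory implementation.

```python
# Broyden's method for a 2x2 nonlinear system: each step solves with
# an approximate Jacobian, then applies a rank-one (secant) update so
# no further Jacobian evaluations are needed.
import numpy as np

def F(x):
    # Example system: x0^2 + x1^2 = 1 and x0 = x1,
    # with positive solution x0 = x1 = 1/sqrt(2).
    return np.array([x[0]**2 + x[1]**2 - 1.0, x[0] - x[1]])

def fd_jacobian(F, x, h=1e-6):
    # Cheap finite-difference Jacobian used only to initialize.
    f0, n = F(x), len(x)
    J = np.zeros((n, n))
    for i in range(n):
        e = np.zeros(n); e[i] = h
        J[:, i] = (F(x + e) - f0) / h
    return J

x = np.array([1.0, 0.5])
J = fd_jacobian(F, x)
for _ in range(50):
    dx = np.linalg.solve(J, -F(x))               # quasi-Newton step
    x_new = x + dx
    dF = F(x_new) - F(x)
    J += np.outer(dF - J @ dx, dx) / (dx @ dx)   # Broyden rank-one update
    x = x_new
    if np.linalg.norm(F(x)) < 1e-12:
        break
```

The update keeps the approximate Jacobian consistent with the latest secant condition, which is why Broyden-type solvers can converge even when an exact Jacobian is unavailable or inaccurate.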

  12. Lightweight computational steering of very large scale molecular dynamics simulations

    International Nuclear Information System (INIS)

    Beazley, D.M.

    1996-01-01

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages

  13. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  14. Cosmic ray acceleration by large scale galactic shocks

    International Nuclear Information System (INIS)

    Cesarsky, C.J.; Lagage, P.O.

    1987-01-01

    The mechanism of diffusive shock acceleration may account for the existence of galactic cosmic rays; detailed applications to stellar wind shocks and especially to supernova shocks have been developed. Existing models can usually deal with the energetics or the spectral slope, but the observed energy range of cosmic rays is not explained. It therefore seems worthwhile to examine the effect that large scale, long-lived galactic shocks may have on galactic cosmic rays, in the frame of the diffusive shock acceleration mechanism. Large scale fast shocks can only be expected to exist in the galactic halo. We consider three situations where they may arise: expansion of a supernova shock in the halo, a galactic wind, and galactic infall; and discuss the possible existence of these shocks and their role in accelerating cosmic rays.

  15. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space

  16. Electron drift in a large scale solid xenon

    International Nuclear Information System (INIS)

    Yoo, J.; Jaskierny, W.F.

    2015-01-01

    A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. This demonstrates an electron drift speed in large scale solid xenon that is a factor of two faster than in the liquid phase.
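The quoted speeds imply the following drift times across the 8.0 cm uniform-field region, confirming the "factor two" comparison:

```python
# Drift times implied by the measured speeds over the 8.0 cm
# uniform-field region, and the solid-to-liquid speed ratio.
drift_length_cm = 8.0
v_liquid = 0.193   # cm/us at 163 K (liquid phase)
v_solid = 0.397    # cm/us at 157 K (solid phase)

t_liquid = drift_length_cm / v_liquid   # crossing time in liquid, us
t_solid = drift_length_cm / v_solid     # crossing time in solid, us
ratio = v_solid / v_liquid              # the reported "factor two"
```

The crossing time drops from roughly 41 μs in the liquid to roughly 20 μs in the solid, a speed ratio of about 2.06.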

  17. Energy efficiency supervision strategy selection of Chinese large-scale public buildings

    International Nuclear Information System (INIS)

    Jin Zhenxing; Wu Yong; Li Baizhan; Gao Yafeng

    2009-01-01

    This paper discusses energy consumption, building development and building energy consumption in China, and points out that energy efficiency management and maintenance of large-scale public buildings is the breakthrough point for building energy saving in China. Three obstacles, namely the lack of basic statistical data, the lack of a service market for building energy saving, and the lack of effective management measures, account for the necessity of energy efficiency supervision for large-scale public buildings. The paper then introduces the supervision aims, the supervision system and the roles of the five basic systems within it, and analyzes the working mechanism of the five basic systems. The energy efficiency supervision system for large-scale public buildings takes energy consumption statistics as its data basis, energy auditing as technical support, the energy consumption quota as a benchmark for energy saving, price increases beyond the quota as a price lever, and public notice of energy efficiency as an amplifier. The supervision system promotes energy efficient operation and maintenance of large-scale public buildings, and drives comprehensive building energy saving in China.

  18. Energy efficiency supervision strategy selection of Chinese large-scale public buildings

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Zhenxing; Li, Baizhan; Gao, Yafeng [The Faculty of Urban Construction and Environmental Engineering, Chongqing University, Chongqing (China); Key Laboratory of the Three Gorges Reservoir Region' s Eco-Environment, Ministry of Education, Chongqing 400045 (China); Wu, Yong [The Department of Science and Technology, Ministry of Construction, Beijing 100835 (China)

    2009-06-15

    This paper discusses energy consumption, building development and building energy consumption in China, and points out that energy efficiency management and maintenance of large-scale public buildings is the breakthrough point for building energy saving in China. Three obstacles, namely the lack of basic statistical data, the lack of a service market for building energy saving, and the lack of effective management measures, account for the necessity of energy efficiency supervision for large-scale public buildings. The paper then introduces the supervision aims, the supervision system and the roles of the five basic systems within it, and analyzes the working mechanism of the five basic systems. The energy efficiency supervision system for large-scale public buildings takes energy consumption statistics as its data basis, energy auditing as technical support, the energy consumption quota as a benchmark for energy saving, price increases beyond the quota as a price lever, and public notice of energy efficiency as an amplifier. The supervision system promotes energy efficient operation and maintenance of large-scale public buildings, and drives comprehensive building energy saving in China. (author)

  19. Energy efficiency supervision strategy selection of Chinese large-scale public buildings

    Energy Technology Data Exchange (ETDEWEB)

    Jin Zhenxing [Faculty of Urban Construction and Environmental Engineering, Chongqing University, Chongqing (China); Key Laboratory of the Three Gorges Reservoir Region' s Eco-Environment, Ministry of Education, Chongqing 400045 (China)], E-mail: jinzhenxing33@sina.com; Wu Yong [Department of Science and Technology, Ministry of Construction, Beijing 100835 (China); Li Baizhan; Gao Yafeng [Faculty of Urban Construction and Environmental Engineering, Chongqing University, Chongqing (China); Key Laboratory of the Three Gorges Reservoir Region' s Eco-Environment, Ministry of Education, Chongqing 400045 (China)

    2009-06-15

    This paper discusses energy consumption, building development and building energy consumption in China, and points out that energy efficiency management and maintenance of large-scale public buildings is the breakthrough point for building energy saving in China. Three obstacles, namely the lack of basic statistical data, the lack of a service market for building energy saving, and the lack of effective management measures, account for the necessity of energy efficiency supervision for large-scale public buildings. The paper then introduces the supervision aims, the supervision system and the roles of the five basic systems within it, and analyzes the working mechanism of the five basic systems. The energy efficiency supervision system for large-scale public buildings takes energy consumption statistics as its data basis, energy auditing as technical support, the energy consumption quota as a benchmark for energy saving, price increases beyond the quota as a price lever, and public notice of energy efficiency as an amplifier. The supervision system promotes energy efficient operation and maintenance of large-scale public buildings, and drives comprehensive building energy saving in China.

  20. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary plasma. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally to the nonlinear regime, because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter.