WorldWideScience

Sample records for technology large scale

  1. SDI Large-Scale System Technology Study

    National Research Council Canada - National Science Library

    1986-01-01

    .... This coordination is addressed by the Battle Management function. The algorithms and technologies required to support Battle Management are the subject of the SDC Large Scale Systems Technology Study...

  2. Large-scale demonstration of D ampersand D technologies

    International Nuclear Information System (INIS)

    Bhattacharyya, S.K.; Black, D.B.; Rose, R.W.

    1997-01-01

    It is becoming increasingly evident that new technologies will need to be utilized for decontamination and decommissioning (D&D) activities in order to assure safe and cost effective operations. The magnitude of the international D&D problem is sufficiently large in anticipated cost (100's of billions of dollars) and in elapsed time (decades), that the utilization of new technologies should lead to substantial improvements in cost and safety performance. Adoption of new technologies in the generally highly contaminated D&D environments requires assurances that the technology will perform as advertised. Such assurances can be obtained from demonstrations of the technology in environments that are similar to the actual environments without being quite as contaminated and hazardous. The Large Scale Demonstration Project (LSDP) concept was designed to provide such a function. The first LSDP funded by the U.S. Department of Energy's Environmental Management Office (EM) was on the Chicago Pile 5 (CP-5) Reactor at Argonne National Laboratory. The project, conducted by a Strategic Alliance for Environmental Restoration, has completed demonstrations of 10 D&D technologies and is in the process of comparing the performance to baseline technologies. At the conclusion of the project, a catalog of performance comparisons of these technologies will be developed that will be suitable for use by future D&D planners.

  3. Optimization of FTA technology for large scale plant DNA isolation ...

    African Journals Online (AJOL)

    Conventional methods for DNA acquisition and storage require expensive reagents and equipment. Experimental fields located in remote areas and large sample sizes present greater challenges to financially constrained institutions in developing countries. FTA™ technology uses a single format utilizing basic tools found in ...

  4. Optimization of FTA technology for large scale plant DNA isolation ...

    African Journals Online (AJOL)

    GRACE

    2006-05-02

    May 2, 2006 ... Key words: FTA™, large-scale, DNA sampling, field set up, marker assisted selection. ... application. FTA™ classic card (Whatman Inc., Clifton, NJ) is a Whatman paper that has been impregnated with a patented chemical formulation that lyses cells, ... bands for both normal agarose (data not shown) and ...

  5. Survey of large-scale isotope applications: nuclear technology field

    Energy Technology Data Exchange (ETDEWEB)

    Dewitt, R.

    1977-01-21

    A preliminary literature survey of potential large-scale isotope applications was made according to topical fields; i.e., nuclear, biological, medical, environmental, agricultural, geological, and industrial. Other than the possible expansion of established large-scale isotope applications such as uranium, boron, lithium, and hydrogen, no new immediate isotope usage appears to be developing. Over the long term, a change in emphasis for isotope applications was identified which appears to be more responsive to societal concerns for health, the environment, and the conservation of materials and energy. For gram-scale applications, a variety of isotopes may be required for use as nonradioactive 'activable' tracers. A more detailed survey of the nuclear field identified a potential need for large amounts (tons) of special isotopic materials for advanced reactor components and structures. As this need for special materials and the development of efficient separation methods progress, the utilization of isotopes from nuclear wastes for beneficial uses should also progress.

  6. Technologies and challenges in large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Engholm-Keller, Kasper; Larsen, Martin Røssel

    2013-01-01

    Phosphorylation, the reversible addition of a phosphate group to amino acid side chains of proteins, is a fundamental regulator of protein activity, stability, and molecular interactions. Most cellular processes, such as inter- and intracellular signaling, protein synthesis, degradation, ... and apoptosis, rely on phosphorylation. This PTM is thus involved in many diseases, rendering localization and assessment of extent of phosphorylation of major scientific interest. MS-based phosphoproteomics, which aims at describing all phosphorylation sites in a specific type of cell, tissue, or organism, has ... become the main technique for discovery and characterization of phosphoproteins in a nonhypothesis driven fashion. In this review, we describe methods for state-of-the-art MS-based analysis of protein phosphorylation as well as the strategies employed in large-scale phosphoproteomic experiments ...

  7. Vision for single flux quantum very large scale integrated technology

    Science.gov (United States)

    Silver, Arnold; Bunyk, Paul; Kleinsasser, Alan; Spargo, John

    2006-05-01

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm⁻² to achieve junction speeds approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm⁻² into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip.
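
    The density and clock-rate figures quoted above can be sanity-checked with a short back-of-the-envelope calculation. In the sketch below, the on-chip signal propagation speed (~c/3) and the ~100 Josephson junctions per logic gate are illustrative assumptions, not figures from the paper; only the 200 GHz clock and the 250 M junctions/cm² come from the abstract.

    ```python
    # Back-of-the-envelope check of the SFQ figures quoted in the abstract above.
    # Assumptions (not from the abstract): signal propagation ~ c/3 on-chip,
    # ~100 Josephson junctions per logic gate including interconnect and clocking.
    import math

    C = 3.0e8                  # speed of light, m/s
    CLOCK_HZ = 200e9           # clock frequency from the abstract
    JJ_PER_CM2 = 250e6         # junction density from the abstract
    JJ_PER_GATE = 100          # assumed junctions per gate

    period_s = 1.0 / CLOCK_HZ                   # one clock period (~5 ps)
    radius_cm = (C / 3.0) * period_s * 100.0    # distance a signal travels in one period
    gates = JJ_PER_CM2 * math.pi * radius_cm**2 / JJ_PER_GATE

    print(f"radius ~ {radius_cm * 10:.2f} mm, gates within one clock period ~ {gates:,.0f}")
    # Prints roughly 0.5 mm and ~20,000 gates, consistent with the abstract's "20 K gates".
    ```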

  8. Vision for single flux quantum very large scale integrated technology

    International Nuclear Information System (INIS)

    Silver, Arnold; Bunyk, Paul; Kleinsasser, Alan; Spargo, John

    2006-01-01

    Single flux quantum (SFQ) electronics is extremely fast and has very low on-chip power dissipation. SFQ VLSI is an excellent candidate for high-performance computing and other applications requiring extremely high-speed signal processing. Despite this, SFQ technology has generally not been accepted for system implementation. We argue that this is due, at least in part, to the use of outdated tools to produce SFQ circuits and chips. Assuming the use of tools equivalent to those employed in the semiconductor industry, we estimate the density of Josephson junctions, circuit speed, and power dissipation that could be achieved with SFQ technology. Today, CMOS lithography is at 90-65 nm with about 20 layers. Assuming equivalent technology, aggressively increasing the current density above 100 kA cm⁻² to achieve junction speeds approximately 1000 GHz, and reducing device footprints by converting device profiles from planar to vertical, one could expect to integrate about 250 M Josephson junctions cm⁻² into SFQ digital circuits. This should enable circuit operation with clock frequencies above 200 GHz and place approximately 20 K gates within a radius of one clock period. As a result, complete microprocessors, including integrated memory registers, could be fabricated on a single chip.

  9. Workshop on the Federal Role in the Commercialization of Large Scale Windmill Technology (summary and papers)

    Science.gov (United States)

    Lerner, J. I.; Miller, G.

    Large-scale wind system and windmill technology and prospects for commercial applications are discussed. Barriers that may affect the commercial viability of large-scale windmill systems are identified, including the relatively poor financial condition of much of the utility industry, which effectively prevents many utilities from investing substantially in any new projects. The potential market addressed by the Federal program in large-scale windmill systems is examined. Some of the factors that may limit the degree of market penetration for wind energy systems are: costs of competing fossil and nuclear fuels and technologies; rate of acceptance of new technologies; and competition from other solar technologies, including biomass, solar thermal, and photovoltaic systems. Workshop participants agreed that existing Federal legislation provides significant incentives for the commercialization of large-scale wind machines.

  10. A review of sensing technologies for small and large-scale touch panels

    Science.gov (United States)

    Akhtar, Humza; Kemao, Qian; Kakarala, Ramakrishna

    2017-06-01

    A touch panel is an input device for human-computer interaction. It consists of a network of sensors, a sampling circuit and a microcontroller for detecting and locating a touch input. Touch input can come from either a finger or a stylus, depending upon the type of touch technology. These touch panels provide an intuitive and collaborative workspace so that people can perform various tasks with their fingers instead of traditional input devices like the keyboard and mouse. Touch sensing technology is not new. At the time of this writing, various technologies are available in the market, and this paper reviews the most common ones. We review traditional designs and sensing algorithms for touch technology. We also observe that, due to its various strengths, capacitive touch will dominate the large-scale touch panel industry in years to come. In the end, we discuss the motivation for doing academic research on large-scale panels.
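
    The review above stays at the survey level; as a minimal illustration of how a projected-capacitive panel typically localizes a touch, the sketch below computes a weighted centroid over a grid of per-node capacitance changes. The grid values and noise threshold are made up for the example and are not taken from the paper.

    ```python
    # Minimal sketch: locating a touch on a capacitive sensor grid with a weighted centroid.
    # The capacitance-delta grid and the threshold are illustrative values only.
    grid = [
        [0, 1, 2, 1, 0],
        [1, 5, 9, 4, 1],
        [0, 3, 6, 3, 0],
        [0, 0, 1, 0, 0],
    ]
    THRESHOLD = 2  # ignore nodes below the assumed noise floor

    total = sx = sy = 0.0
    for y, row in enumerate(grid):
        for x, delta in enumerate(row):
            if delta >= THRESHOLD:
                total += delta
                sx += x * delta
                sy += y * delta

    if total:
        print(f"touch centroid at column {sx / total:.2f}, row {sy / total:.2f}")
    else:
        print("no touch detected")
    ```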

  11. Developing a Strategy for Using Technology-Enhanced Items in Large-Scale Standardized Tests

    Science.gov (United States)

    Bryant, William

    2017-01-01

    As large-scale standardized tests move from paper-based to computer-based delivery, opportunities arise for test developers to make use of items beyond traditional selected and constructed response types. Technology-enhanced items (TEIs) have the potential to provide advantages over conventional items, including broadening construct measurement,…

  12. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of the world's hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared. Then, a state of the art of the currently available electrolysis modules was compiled. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis is evaluated. (authors)
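
    The study's point that energy prices drive the hydrogen production cost can be illustrated with a rough sensitivity sketch. The ~50 kWh of electricity per kg of H2 used below is a typical order of magnitude for water electrolysis, not a figure from the ALPHEA study, and capital, maintenance and water costs are ignored.

    ```python
    # Rough sketch of how the electricity price drives hydrogen cost in large-scale electrolysis.
    # Assumption (not from the study): ~50 kWh of electricity per kg of H2; capital and
    # maintenance costs are ignored, so this is only the electricity share of the cost.
    SPECIFIC_ENERGY_KWH_PER_KG = 50.0

    def electricity_cost_per_kg_h2(price_per_kwh: float) -> float:
        """Electricity share of the hydrogen production cost, per kg of H2."""
        return SPECIFIC_ENERGY_KWH_PER_KG * price_per_kwh

    for price in (0.02, 0.05, 0.10):  # currency units per kWh
        print(f"{price:.2f}/kWh -> {electricity_cost_per_kg_h2(price):.2f} per kg H2")
    ```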

  13. Automatic Measurement in Large-Scale Space with the Laser Theodolite and Vision Guiding Technology

    Directory of Open Access Journals (Sweden)

    Bin Wu

    2013-01-01

    The multitheodolite intersection measurement is a traditional approach to coordinate measurement in large-scale space. However, the procedure of manual labeling and aiming results in a low automation level and low measuring efficiency, and the measurement accuracy is easily affected by manual aiming error. Based on traditional theodolite measuring methods, this paper introduces the mechanism of the vision measurement principle and presents a novel automatic measurement method for large-scale space and large workpieces (equipment), combining laser theodolite measuring and vision guiding technologies. The measuring mark is established on the surface of the measured workpiece by the collimating laser, which is coaxial with the sight axis of the theodolite, so cooperation targets or manual marks are no longer needed. With the theoretical model data and multiresolution visual imaging and tracking technology, the method can realize automatic, quick, and accurate measurement of large workpieces in large-scale space. Meanwhile, the impact of artificial error is reduced and measuring efficiency is improved. Therefore, this method has significant ramifications for the measurement of large workpieces, such as measuring the geometric appearance characteristics of ships, large aircraft, and spacecraft, and deformation monitoring of large buildings and dams.
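
    The intersection principle behind the method above can be shown with a tiny worked example: two theodolite stations at known positions each measure a horizontal direction to the same point, and the point is recovered where the two rays cross. The station coordinates and angles below are made up for illustration; the paper's actual system adds vision-guided aiming and 3D measurement on top of this.

    ```python
    # Minimal sketch of two-station forward intersection (the core of multitheodolite measurement).
    # Station positions and sighting directions are illustrative values only.
    import math

    def intersect(p1, az1, p2, az2):
        """2D forward intersection; directions are in radians measured from the +x axis."""
        d1 = (math.cos(az1), math.sin(az1))
        d2 = (math.cos(az2), math.sin(az2))
        # Solve p1 + t1*d1 = p2 + t2*d2 for t1 via a 2x2 determinant (Cramer's rule).
        det = d1[0] * (-d2[1]) - d1[1] * (-d2[0])
        if abs(det) < 1e-12:
            raise ValueError("rays are parallel; no unique intersection")
        rx, ry = p2[0] - p1[0], p2[1] - p1[1]
        t1 = (rx * (-d2[1]) - ry * (-d2[0])) / det
        return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

    # Stations 10 m apart both sighting a target that is actually at (4, 3).
    print(intersect((0.0, 0.0), math.atan2(3, 4), (10.0, 0.0), math.atan2(3, -6)))
    # -> approximately (4.0, 3.0)
    ```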

  14. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R&S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules on the basis of multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) program of the Dutch Ministry of Economic Affairs and the Dutch Ministry of Education, Culture and Sciences. The aim of the EET project is to reduce the costs of a solar module by 50%, by means of increasing the conversion efficiency as well as developing cheap processes for large-scale production.

  15. Review of DC System Technologies for Large Scale Integration of Wind Energy Systems with Electricity Grids

    Directory of Open Access Journals (Sweden)

    Sheng Jie Shao

    2010-06-01

    The ever-increasing development and availability of power electronic systems is the underpinning technology that enables large-scale integration of wind generation plants with the electricity grid. As the size and power capacity of wind turbines continue to increase, so does the need to place these significantly large structures at off-shore locations. DC grids and associated power transmission technologies provide opportunities for cost reduction and minimization of the impact on the electricity grid, as the bulk power is concentrated at a single point of entry. As a result, planning, optimization and impact can be studied and carefully controlled, minimizing both the risk of the investment and power system stability issues. This paper discusses the key technologies associated with DC grids for offshore wind farm applications.

  16. Los Alamos National Laboratory Tritium Technology Deployments Large Scale Demonstration and Deployment Project

    International Nuclear Information System (INIS)

    McFee, J.; Blauvelt, D.; Stallings, E.; Willms, S.

    2002-01-01

    This paper describes the organization, planning and initial implementation of a DOE OST program to deploy proven, cost-effective technologies into D&D programs throughout the complex. The primary intent is to accelerate closure of the projects, thereby saving considerable funds while remaining protective of worker health and the environment. Most of the technologies in the 'toolkit' for this program have been demonstrated at a DOE site as part of a Large Scale Demonstration and Deployment Project (LSDDP). The Mound Tritium D&D LSDDP served as the base program for the technologies being deployed in this project, but other LSDDP-demonstrated technologies or ready-for-use commercial technologies will also be considered. The project team will evaluate needs provided by site D&D project managers, match technologies against those needs and rank deployments using a criteria listing. After selecting deployments, the project will purchase the equipment and provide a deployment engineer to facilitate the technology implementation. Other costs associated with the use of the technology will be borne by the site, including operating staff, safety and health reviews, etc. A cost and performance report will be prepared following the deployment to document the results.

  17. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    International Nuclear Information System (INIS)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L.

    1997-01-01

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With the FEMP's selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, which was an ongoing D&D project for which a firm fixed price contract had been issued to the D&D contractor. Thus, interferences with the baseline D&D project could have significant financial implications. Other challenges include defining and selecting meaningful technology demonstrations, finding/selecting technology providers, and integrating the technology into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of "winners." All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D strategy.

  18. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    Energy Technology Data Exchange (ETDEWEB)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L. [and others

    1997-03-05

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With the FEMP's selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, which was an ongoing D&D project for which a firm fixed price contract had been issued to the D&D contractor. Thus, interferences with the baseline D&D project could have significant financial implications. Other challenges include defining and selecting meaningful technology demonstrations, finding/selecting technology providers, and integrating the technology into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of "winners." All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D strategy.

  19. Data warehousing technologies for large-scale and right-time data

    DEFF Research Database (Denmark)

    Xiufeng, Liu

    This thesis is about data warehousing technologies for large-scale and right-time data. Today, due to the exponential growth of data, it has become a common practice for many enterprises to process hundreds of gigabytes of data per day. Traditionally, data warehousing populates data from heterogeneous sources into a central data warehouse (DW) by Extract-Transform-Load (ETL) at regular time intervals, e.g., monthly, weekly, or daily. But now, this becomes challenging for large-scale data and makes it hard to meet near real-time/right-time business decisions. This thesis considers some ... but big, tables with few columns, the presented triple-store spreads the data over more tables that may have many columns. The triple-store is optimized by an extensive use of bulk techniques, which makes it very efficient to insert and extract data. The DBMS-based solution makes it very flexible ...

  20. Integrated Technologies for Large-Scale Trapped-Ion Quantum Information Processing

    Science.gov (United States)

    Sorace-Agaskar, C.; Bramhavar, S.; Kharas, D.; Mehta, K. K.; Loh, W.; Panock, R.; Bruzewicz, C. D.; McConnell, R.; Ram, R. J.; Sage, J. M.; Chiaverini, J.

    2016-05-01

    Atomic ions trapped and controlled using electromagnetic fields hold great promise for practical quantum information processing due to their inherent coherence properties and controllability. However, to realize this promise, the ability to maintain and manipulate large-scale systems is required. We present progress toward the development of, and proof-of-principle demonstrations and characterization of, several technologies that can be integrated with ion-trap arrays on-chip to enable such scaling to practically useful sizes. Of particular use are integrated photonic elements for routing and focusing light throughout a chip without the need for free-space optics. The integration of CMOS electronics and photo-detectors for on-chip control and readout, and methods for monolithic fabrication and wafer-scale integration to incorporate these capabilities into tile-able 2D ion-trap array cells, are also explored.

  1. A Methodology for Integrated, Multiregional Life Cycle Assessment Scenarios under Large-Scale Technological Change.

    Science.gov (United States)

    Gibon, Thomas; Wood, Richard; Arvesen, Anders; Bergesen, Joseph D; Suh, Sangwon; Hertwich, Edgar G

    2015-09-15

    Climate change mitigation demands large-scale technological change on a global level and, if successfully implemented, will significantly affect how products and services are produced and consumed. In order to anticipate the life cycle environmental impacts of products under climate mitigation scenarios, we present the modeling framework of an integrated hybrid life cycle assessment model covering nine world regions. Life cycle assessment databases and multiregional input-output tables are adapted using forecasted changes in technology and resources up to 2050 under a 2 °C scenario. We call the result of this modeling "technology hybridized environmental-economic model with integrated scenarios" (THEMIS). As a case study, we apply THEMIS in an integrated environmental assessment of concentrating solar power. Life-cycle greenhouse gas emissions for this plant range from 33 to 95 g CO2 eq./kWh across different world regions in 2010, falling to 30-87 g CO2 eq./kWh in 2050. Using regional life cycle data yields insightful results. More generally, these results also highlight the need for systematic life cycle frameworks that capture the actual consequences and feedback effects of large-scale policies in the long term.
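
    The abstract does not spell out the underlying computation; for orientation, the sketch below shows the standard matrix form of an environmentally extended input-output / life cycle calculation (characterization x stressor intensities x Leontief inverse x final demand) with tiny made-up matrices. THEMIS itself is a far larger multiregional hybrid model, so this is only a structural illustration, not its actual data or code.

    ```python
    # Structural illustration of an input-output style life cycle calculation:
    # impacts = C @ S @ inv(I - A) @ y, using tiny made-up matrices (not THEMIS data).
    import numpy as np

    A = np.array([[0.1, 0.2],      # inter-industry (technology) requirements matrix
                  [0.3, 0.1]])
    S = np.array([[0.5, 0.9]])     # stressor intensities, e.g. kg CO2-eq per unit of output
    C = np.array([[1.0]])          # characterization factors (identity here, GHG only)
    y = np.array([1.0, 0.0])       # final demand: one unit of the first product

    total_output = np.linalg.solve(np.eye(2) - A, y)  # Leontief inverse applied to demand
    impacts = C @ S @ total_output
    print(f"sector outputs: {total_output}, life cycle impact: {impacts[0]:.3f}")
    ```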

  2. Polymer optical circuits technology for large-scale integration of passive functions

    Science.gov (United States)

    Maalouf, Azar; Bosc, Dominique; Henrio, Frédéric; Haesaert, Séverine; Grosso, Philippe; Hardy, Isabelle; Gadonna, Michel

    2006-04-01

    Polymers are attractive for realizing integrated circuits, especially because they are very simple to process and are promising for low-cost devices. Moreover, besides the low-cost technology, the large possible range of refractive indices could lead to large-scale integration, further lowering fabrication costs. In some cases, it could be an alternative solution to semiconductor or inorganic dielectric technologies. Using standard UV photolithography, this work shows that it is possible to fabricate small guides in order to provide relatively high circuit densification. The refractive index contrast between the optical core and cladding can be as high as 0.07, instead of 0.02 for the highest contrast in Ge-doped silica waveguides. Recently, this contrast has been increased to 0.11 at a wavelength of 1550 nm. These materials make possible the patterning of guides having radii of curvature smaller than 200 μm. Such curvatures open the way to functions based on microrings that potentially lead to compact wavelength multiplexers. With a view to controlling the fabrication of polymer waveguides, some features of the process are reported here. For example, shortcomings such as unsuitable worm-like film aspects are described and solutions are given, with requirements assigned to the raw materials. Mechanical and thermal properties of the polymers have to be adjusted to withstand integrated circuit processing. This paper also presents results concerning the realization of integrated passive microring resonators with this technology.

  3. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    Science.gov (United States)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10 year objective of a program to establish the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5 year program were derived from a set of 5 year objectives deduced from the 10 year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10^8 peak W/year of $0.50 cells was projected for the year 1985. The developments of other photovoltaic conversion systems were assigned to longer range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5 year phase of the program is $268.5M.

  4. New technologies for large-scale micropatterning of functional nanocomposite polymers

    Science.gov (United States)

    Khosla, A.; Gray, B. L.

    2012-04-01

    We present a review of different micropatterning technologies for flexible elastomeric functional nanocomposites, with a particular emphasis on mold materials and processes for the production of large substrates. The functional polymers include electrically conducting and magnetic materials developed at the Micro-instrumentation Laboratory at Simon Fraser University, Canada. We present a chart that compares many of these different conductive and magnetic functional nanocomposites and their measured characteristics. Furthermore, we have previously reported hybrid processes for nanocomposite polymers micromolded against SU-8 photoepoxy masters. However, SU-8 is typically limited to substrate sizes that are compatible with microelectronics processing, as a microelectronics UV-patterning step is typically involved, and de-molding problems are observed. Recently, we have developed new processes that address the problems faced with SU-8 molds. These new technologies for micropatterning nanocomposites involve new substrate materials. A low-cost poly(methyl methacrylate) (PMMA) microfabrication technology has been developed, which involves fabrication of micromolds via either CO2 laser ablation or deep-UV exposure. We have previously reported this large-scale patterning technique using laser ablation. Finally, we compare the two PMMA processes for producing micromolds for nanocomposites.

  5. Microfluidic very large-scale integration for biochips: Technology, testing and fault-tolerant design

    DEFF Research Database (Denmark)

    Araci, Ismail Emre; Pop, Paul; Chakrabarty, Krishnendu

    2015-01-01

    Microfluidic biochips are replacing the conventional biochemical analyzers by integrating all the necessary functions for biochemical analysis using microfluidics. Biochips are used in many application areas, such as in vitro diagnostics, drug discovery, biotech and ecology. The focus of this paper is on continuous-flow biochips, where the basic building block is a microvalve. By combining these microvalves, more complex units such as mixers, switches, and multiplexers can be built, hence the name of the technology, "microfluidic Very Large-Scale Integration" (mVLSI). A roadblock ... presents the state-of-the-art in mVLSI platforms and emerging research challenges in the area of continuous-flow microfluidics, focusing on testing techniques and fault-tolerant design.

  6. Implementing Large-Scale Instructional Technology in Kenya: Changing Instructional Practice and Developing Accountability in a National Education System

    Science.gov (United States)

    Piper, Benjamin; Oyanga, Arbogast; Mejia, Jessica; Pouezevara, Sarah

    2017-01-01

    Previous large-scale education technology interventions have shown only modest impacts on student achievement. Building on results from an earlier randomized controlled trial of three different applications of information and communication technologies (ICTs) on primary education in Kenya, the Tusome Early Grade Reading Activity developed the…

  7. Integrating large-scale data and RNA technology to protect crops from fungal pathogens

    Directory of Open Access Journals (Sweden)

    Ian Joseph Girard

    2016-05-01

    With a rapidly growing human population it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of invigorating next generation sequencing strategies to study plant-pathogen interactions has and will provide unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space directly at the site of infection and over the infection period. The integration of cutting edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability in the production of stably transformed lines expressing RNA interference molecules, or through foliar applications of double stranded RNA.

  8. Integrating Large-Scale Data and RNA Technology to Protect Crops from Fungal Pathogens.

    Science.gov (United States)

    Girard, Ian J; Mcloughlin, Austein G; de Kievit, Teresa R; Fernando, Dilantha W G; Belmonte, Mark F

    2016-01-01

    With a rapidly growing human population it is expected that plant science researchers and the agricultural community will need to increase food productivity using less arable land. This challenge is complicated by fungal pathogens and diseases, many of which can severely impact crop yield. Current measures to control fungal pathogens are either ineffective or have adverse effects on the agricultural enterprise. Thus, developing new strategies through research innovation to protect plants from pathogenic fungi is necessary to overcome these hurdles. RNA sequencing technologies are increasing our understanding of the underlying genes and gene regulatory networks mediating disease outcomes. The application of invigorating next generation sequencing strategies to study plant-pathogen interactions has and will provide unprecedented insight into the complex patterns of gene activity responsible for crop protection. However, questions remain about how biological processes in both the pathogen and the host are specified in space directly at the site of infection and over the infection period. The integration of cutting edge molecular and computational tools will provide plant scientists with the arsenal required to identify genes and molecules that play a role in plant protection. Large scale RNA sequence data can then be used to protect plants by targeting genes essential for pathogen viability in the production of stably transformed lines expressing RNA interference molecules, or through foliar applications of double stranded RNA.

  9. Ten key considerations for the successful optimization of large-scale health information technology.

    Science.gov (United States)

    Cresswell, Kathrin M; Bates, David W; Sheikh, Aziz

    2017-01-01

    Implementation and adoption of complex health information technology (HIT) is gaining momentum internationally. This is underpinned by the drive to improve the safety, quality, and efficiency of care. Although most of the benefits associated with HIT will only be realized through optimization of these systems, relatively few health care organizations currently have the expertise or experience needed to undertake this. It is extremely important to have systems working before embarking on HIT optimization, which, much like implementation, is an ongoing, difficult, and often expensive process. We discuss some key organization-level activities that are important in optimizing large-scale HIT systems. These include considerations relating to leadership, strategy, vision, and continuous cycles of improvement. Although these alone are not sufficient to fully optimize complex HIT, they provide a starting point for conceptualizing this important area.

  10. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, aimed at ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, entrusted by the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents through joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  11. Large scale deployment of non power applications (NPAs) and spin-off technologies in rural sector

    International Nuclear Information System (INIS)

    Patankar, A.M.; Mule, S.S.

    2009-01-01

    Over the past 50 years a large body of indigenous Science and Technology (S&T) know-how has been generated in various national laboratories and, in parallel, several technologies have been imported. The urban sector has received the most attention by way of deployment of a large number of these technologies and know-how in urban areas, resulting in rapid urban development and leading to an urban-rural divide in terms of prosperity and opportunities. Further, India's young population is expected to be the largest in the world in the decades ahead, at over 500 million. Creating gainful and productive work for them is the greatest challenge. Technical know-how generated in national laboratories related to basic needs such as water, food, energy and environment has been underutilized. Deployment and adaptation of this know-how to rural needs could provide a creative opportunity for the expected 500 million youths in rural and urban India to contribute to the national wealth, with prosperity for everybody, including villages.

  12. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    Science.gov (United States)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  13. Developing technology for large-scale production of forest chips. Wood Energy Technology Programme 1999-2003. Interim report

    International Nuclear Information System (INIS)

    Hakkila, P.

    2003-01-01

    Finland is enhancing its use of renewable sources in energy production. From the 1995 level, the use of renewable energy is to be increased by 50% by 2010, and 100% by 2025. Wood-based fuels will play a leading role in this development. The main source of wood-based fuels is processing residues from the forest industries. However, as all processing residues are already in use, an increase is possible only as far as the capacity and wood consumption of the forest industries grow. Energy policy affects the production and availability of processing residues only indirectly. Another large source of wood-based energy is forest fuels, consisting of traditional firewood and chips comminuted from low-quality biomass. It is estimated that the reserve of technically harvestable forest biomass is 10-16 Mm³ annually, when no specific cost limit is applied. This corresponds to 2-3 Mtoe or 6-9% of the present consumption of primary energy in Finland. How much of this reserve it will actually be possible to harvest and utilize depends on the cost competitiveness of forest chips against alternative sources of energy. A goal of Finnish energy and climate strategies is to use 5 Mm³ of forest chips annually by 2010. The use of wood fuels is being promoted by means of taxation, investment aid and support for chip production from young forests. Furthermore, research and development is being supported in order to create techno-economic conditions for the competitive production of forest chips. In 1999, the National Technology Agency Tekes established the five-year Wood Energy Technology Programme to stimulate the development of efficient systems for the large-scale production of forest chips. Key targets are competitive costs, reliable supply and good quality chips. The two guiding principles of the programme are: (1) close cooperation between researchers and practitioners, and (2) applying research and development to practical applications and commercialization. As of November

  14. Development of technology for the large-scale preparation of 60Co polymer film source

    International Nuclear Information System (INIS)

    Udhayakumar, J.; Pardeshi, G.S.; Gandhi, Shymala S.; Chakravarty, Rubel; Kumar, Manoj; Dash, Ashutosh; Venkatesh, Meera

    2008-01-01

    60Co sources (∼37 kBq) in the form of a thin film are widely used in position identification of perforation in offshore oil-well explorations. This paper describes the large-scale preparation of such sources using a radioactive polymer containing 60Co. 60Co was extracted into chloroform containing 8-hydroxyquinoline. The chloroform layer was mixed with polymethyl methacrylate (PMMA) polymer. A large film was prepared using the polymer solution containing the complex. The polymer film was then cut into circular sources, mounted on a source holder and supplied to various users.

  15. Economic Impact of Large-Scale Deployment of Offshore Marine and Hydrokinetic Technology in Oregon Coastal Counties

    Energy Technology Data Exchange (ETDEWEB)

    Jimenez, T. [Bureau of Ocean Energy Management (BOEM), Washington, DC (United States); Tegen, S. [Bureau of Ocean Energy Management (BOEM), Washington, DC (United States); Beiter, P. [Bureau of Ocean Energy Management (BOEM), Washington, DC (United States)

    2015-03-01

    To begin understanding the potential economic impacts of large-scale wave energy converter (WEC) technology, the Bureau of Ocean Energy Management (BOEM) commissioned the National Renewable Energy Laboratory (NREL) to conduct an economic impact analysis of large-scale WEC deployment for Oregon coastal counties. This report follows a previously published report by BOEM and NREL on the jobs and economic impacts of WEC technology for the entire state (Jimenez and Tegen 2015). As in Jimenez and Tegen (2015), this analysis examined two deployment scenarios in the 2026-2045 timeframe: the first scenario assumed 13,000 megawatts (MW) of WEC technology deployed during the analysis period, and the second assumed 18,000 MW of WEC technology deployed by 2045. Both scenarios require major technology and cost improvements in the WEC devices. The study considers very large-scale deployment so that readers can examine and discuss the potential of a successful and very large WEC industry. The 13,000-MW scenario is used as the basis for the county analysis, as it is the smaller of the two scenarios. Sensitivity studies examined the effects of a robust in-state WEC supply chain. The region of analysis comprises the seven coastal counties in Oregon—Clatsop, Coos, Curry, Douglas, Lane, Lincoln, and Tillamook—so estimates of jobs and other economic impacts are specific to this coastal county area.

  16. Multicriteria assessment of advanced treatment technologies for micropollutants removal at large-scale applications.

    Science.gov (United States)

    Bui, X T; Vo, T P T; Ngo, H H; Guo, W S; Nguyen, T T

    2016-09-01

    With the introduction and discharge of thousands of new micropollutants (MPs) every year, traditional water and wastewater treatment plants may be incapable of tackling them all. Because of their low concentrations and diverse nature, MP removal encounters numerous challenges. Although some MPs are effectively eliminated via conventional treatment methods, most of them can easily escape and are retained in the discharged effluent. Therefore, advanced methods such as (i) adsorption, (ii) oxidation and advanced oxidation processes (O3 and O3-based advanced oxidation processes, UV/H2O2), (iii) membrane processes, and (iv) membrane bioreactors become an inevitable approach. Despite the unsurprisingly vast number of papers on MP treatment available at present, most of these studies were carried out at a laboratory scale while only a few pilot- and full-scale studies have been conducted. Nevertheless, an in-depth assessment of real-world MP treatment methods is extremely crucial for practitioners. To date, no paper has been dedicated to examining this issue. Therefore, this paper aims to review these large-scale treatment methods. First, the paper goes through the regulations and standards which deal with MPs in water courses. It then assesses these methods in various case studies with reference to different criteria, to serve as a reference for further practical applications.

  17. Exploring Large Scale Data Analysis and Visualization for ARM Data Discovery Using NoSQL Technologies

    Science.gov (United States)

    Krishna, B.; Gustafson, W. I., Jr.; Vogelmann, A. M.; Toto, T.; Devarakonda, R.; Palanisamy, G.

    2016-12-01

    This paper presents a new way of providing ARM data discovery through data analysis and visualization services. ARM stands for Atmospheric Radiation Measurement. This program was created to study cloud formation processes and their influence on radiative transfer, and it also includes additional measurements of aerosol and precipitation at various highly instrumented ground and mobile stations. The total volume of ARM data is roughly 900 TB. The current search for ARM data is performed using its metadata, such as the site name, instrument name, date, etc. NoSQL technologies were explored to improve data searching capabilities, not only by metadata but also by measurement values. Two technologies currently being implemented for testing are Apache Cassandra (a NoSQL database) and Apache Spark (a NoSQL-based analytics framework). Both of these technologies were developed to work in a distributed environment and hence can handle large data volumes for storage and analytics. D3.js is a JavaScript library that can generate interactive data visualizations in web browsers by making use of the commonly used SVG, HTML5, and CSS standards. To test the performance of NoSQL for ARM data, we will be using ARM's popular measurements to locate the data based on their values. Recently, NoSQL technology has been applied to a pilot project called LASSO, which stands for LES ARM Symbiotic Simulation and Observation Workflow. LASSO will be packaging LES output and observations in "data bundles", and analyses will require the ability for users to analyze both observations and LES model output, either individually or together, across multiple time periods. The LASSO implementation strategy suggests that enormous data storage is required to store the above mentioned quantities. Thus, NoSQL was used to provide a powerful means to store portions of the data, giving users search capabilities on each simulation's traits through a web application. Based on the user selection
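
    The record above describes value-based search with Cassandra and Spark but gives no implementation detail; the snippet below is only a generic PySpark illustration of filtering columnar measurement records by value. The file path and column names are hypothetical and are not ARM's actual schema.

    ```python
    # Generic illustration of value-based search with Spark (not ARM's actual schema or code).
    # The parquet path and the column names are made up for the example.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("measurement-value-search").getOrCreate()

    df = spark.read.parquet("/data/example/measurements.parquet")  # hypothetical path

    # Find records where a measured quantity exceeds a threshold within a date range.
    hits = (df.filter((F.col("precip_rate") > 5.0) &
                      (F.col("time").between("2016-06-01", "2016-06-30")))
              .select("site", "time", "precip_rate"))
    hits.show(10)
    ```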

  18. Large scale tracking algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Love, Joshua Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Melgaard, David Kennett [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Karelitz, David B. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Pitts, Todd Alan [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Zollweg, Joshua David [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Anderson, Dylan Z. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Nandy, Prabal [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Whitlow, Gary L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Bender, Daniel A. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Byrne, Raymond Harry [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.

  19. Assessment of Vehicle Sizing, Energy Consumption and Cost Through Large Scale Simulation of Advanced Vehicle Technologies

    Energy Technology Data Exchange (ETDEWEB)

    Moawad, Ayman [Argonne National Lab. (ANL), Argonne, IL (United States); Kim, Namdoo [Argonne National Lab. (ANL), Argonne, IL (United States); Shidore, Neeraj [Argonne National Lab. (ANL), Argonne, IL (United States); Rousseau, Aymeric [Argonne National Lab. (ANL), Argonne, IL (United States)

    2016-01-01

    The U.S. Department of Energy (DOE) Vehicle Technologies Office (VTO) has been developing more energy-efficient and environmentally friendly highway transportation technologies that will enable America to use less petroleum. The long-term aim is to develop "leapfrog" technologies that will provide Americans with greater freedom of mobility and energy security, while lowering costs and reducing impacts on the environment. This report reviews the results of the DOE VTO. It gives an assessment of the fuel and light-duty vehicle technologies that are most likely to be established, developed, and eventually commercialized during the next 30 years (up to 2045). Because of the rapid evolution of component technologies, this study is performed every two years to continuously update the results based on the latest state-of-the-art technologies.

  20. Large-Scale Campus Computer Technology Implementation: Lessons from the First Year.

    Science.gov (United States)

    Nichols, Todd; Frazer, Linda H.

    The purpose of the Elementary Technology Demonstration Schools (ETDS) Project, funded by IBM and Apple, Inc., was to demonstrate the effectiveness of technology in accelerating the learning of low achieving at-risk students and enhancing the education of high achieving students. The paper begins by giving background information on the district,…

  1. Large-scale nanofabrication of periodic nanostructures using nanosphere-related techniques for green technology applications (Conference Presentation)

    Science.gov (United States)

    Yen, Chen-Chung; Wu, Jyun-De; Chien, Yi-Hsin; Wang, Chang-Han; Liu, Chi-Ching; Ku, Chen-Ta; Chen, Yen-Jon; Chou, Meng-Cheng; Chang, Yun-Chorng

    2016-09-01

    Nanotechnology has been developed for decades and many interesting optical properties have been demonstrated. However, the major hurdle for the further development of nanotechnology is finding economic ways to fabricate such nanostructures at large scale. Here, we demonstrate how to achieve low-cost fabrication using nanosphere-related techniques, such as Nanosphere Lithography (NSL) and Nanospherical-Lens Lithography (NLL). NSL is a low-cost nano-fabrication technique that has the ability to fabricate nano-triangle arrays that cover a very large area. NLL is a very similar technique that uses polystyrene nanospheres to focus the incoming ultraviolet light and expose the underlying photoresist (PR) layer. PR hole arrays form after developing. Metal nanodisk arrays can be fabricated following metal evaporation and lift-off processes. Nanodisk or nano-ellipse arrays with various sizes and aspect ratios are routinely fabricated in our research group. We also demonstrate that we can fabricate more complicated nanostructures, such as nanodisk oligomers; by combining several other key technologies, such as angled exposure and deposition, we can modify these methods to obtain various metallic nanostructures. The metallic structures are of high fidelity and fabricated at large scale. The metallic nanostructures can be transformed into semiconductor nanostructures and be used in several green technology applications.

  2. Personal attitudes towards large-scale technologies. The perception of risks and benefits of electricity generation with uranium and coal

    Energy Technology Data Exchange (ETDEWEB)

    Midden, C.J.H.; Daamen, D.D.L.; Verplanken, B.

    1984-06-01

    The distribution of attitudes towards the large-scale application of coal and uranium, the belief systems underlying these attitudes, the perceived probabilities of a number of consequences of these energy sources, and the consequences of these attitudes for behaviour and behavioural intentions are discussed. Attention is also paid to other aspects of people's evaluations of these energy technologies: involvement with the perceived problems, perceived personal efficacy in influencing collective decisions, information acquisition and level, imaginability of accidents, anxiety, and reactions to local plants. The study was designed following an extended and adapted version of the attitude-behaviour model of Fishbein and Ajzen.

  3. The Feedback of Star Formation Based on Large-scale Spectroscopic Mapping Technology

    Science.gov (United States)

    Li, H. X.

    2017-05-01

    ... estimated to be ~3.9 × 10^45 erg, which is 1% of the cloud turbulent energy. The total kinetic energy of the detected bubbles is estimated to be ~9.2 × 10^46 erg, which is 29% of the turbulent energy of Taurus. The energy injection rate from the outflows is ~1.3 × 10^33 erg s^-1, 0.4-2 times the turbulent dissipation rate of the cloud. The energy injection rate from bubbles is ~6.4 × 10^33 erg s^-1, 2-10 times the turbulent dissipation rate of the cloud. The gravitational binding energy of the cloud is ~1.5 × 10^48 erg, 385 and 16 times the energy of outflows and bubbles, respectively. We conclude that neither outflows nor bubbles can provide sufficient energy to balance the overall gravitational binding energy and the turbulent energy of Taurus. However, in the current epoch, stellar feedback is sufficient to maintain the observed turbulence in Taurus. We studied the methods of spectral data processing for large-scale surveys, which is helpful in developing the data-processing software of FAST (Five-hundred-meter Aperture Spherical radio Telescope).
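
    The ratios quoted in the record can be recomputed directly from the energies it lists; the short check below uses only numbers taken from the abstract.

    ```python
    # Recompute the energy ratios quoted in the abstract (values in erg).
    E_OUTFLOWS = 3.9e45   # total energy of the detected outflows
    E_BUBBLES  = 9.2e46   # total kinetic energy of the detected bubbles
    E_BINDING  = 1.5e48   # gravitational binding energy of the Taurus cloud

    print(f"binding / outflows ~ {E_BINDING / E_OUTFLOWS:.0f}")  # ~385
    print(f"binding / bubbles  ~ {E_BINDING / E_BUBBLES:.0f}")   # ~16
    ```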

  4. Developing Server-Side Infrastructure for Large-Scale E-Learning of Web Technology

    Science.gov (United States)

    Simpkins, Neil

    2010-01-01

    The growth of E-business has made experience in server-side technology an increasingly important area for educators. Server-side skills are in increasing demand and recognised to be of relatively greater value than comparable client-side aspects (Ehie, 2002). In response to this, many educational organisations have developed E-business courses,…

  5. Strategic Planning Tools for Large-Scale Technology-Based Assessments

    Science.gov (United States)

    Koomen, Marten; Zoanetti, Nathan

    2018-01-01

    Education systems are increasingly being called upon to implement new technology-based assessment systems that generate efficiencies, better meet changing stakeholder expectations, or fulfil new assessment purposes. These assessment systems require coordinated organisational effort to implement and can be expensive in time, skill and other…

  6. LARGE SCALE GLAZED

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    World famous architects challenge today the exposure of concrete in their architecture. It is my hope to be able to complement these. I try to develop new aesthetic potentials for the concrete and ceramics, in large scales that have not been seen before in the ceramic area. It is expected to resul...

  7. Sodium nickel chloride battery technology for large-scale stationary storage in the high voltage network

    Science.gov (United States)

    Benato, Roberto; Cosciani, Nicola; Crugnola, Giorgio; Dambone Sessa, Sebastian; Lodi, Giuseppe; Parmeggiani, Carlo; Todeschini, Marco

    2015-10-01

    The extensive application of Sodium-Nickel Chloride (Na-NiCl2) secondary batteries in electric and hybrid vehicles, where the safety requirements are more restrictive than those of stationary storage applications, shows that the Na-NiCl2 technology is well suited to stationary storage. The risk of fire is negligible because of the intrinsic safety of the cell chemical reactions, related to the sodium tetrachloroaluminate (NaAlCl4) contained in the cell, which acts as a secondary electrolyte (the primary one being the ceramic β″-alumina, as is common for Na-Beta batteries). The 3 h rate discharge time makes this technology very attractive for load levelling, voltage regulation, time shifting and the mitigation of power fluctuations from renewable energy sources in both HV and EHV networks.

  8. Characterizing agricultural impacts of recent large-scale US droughts and changing technology and management

    Energy Technology Data Exchange (ETDEWEB)

    Elliott, Joshua [Univ. of Chicago, IL (United States). Computation Inst.; Argonne National Lab. (ANL), Lemont, IL (United States); Glotter, Michael [Univ. of Chicago, IL (United States). Dept. of the Geophysical Sciences; Ruane, Alex C. [NASA Goddard Inst. for Space Studies (GISS), New York, NY (United States); Boote, Kenneth J. [Univ. of Florida, Gainesville, FL (United States). Agricultural and Biological Engineering Dept.; Hatfield, Jerry L. [US Dept. of Agriculture (USDA), Ames, IA (United States). National Lab. for Agriculture and the Environment; Jones, James W. [Univ. of Florida, Gainesville, FL (United States). Agricultural and Biological Engineering Dept.; Rosenzweig, Cynthia [NASA Goddard Inst. for Space Studies (GISS), New York, NY (United States); Smith, Leonard A. [London School of Economics, London (United Kingdom). Center for Analysis of Time Series; Foster, Ian [Univ. of Chicago, IL (United States). Computation Inst.; Argonne National Lab. (ANL), Lemont, IL (United States)

    2018-01-01

    Process-based agricultural models, applied in novel ways, can reproduce historical crop yield anomalies in the US, with median absolute deviation from observations of 6.7% at national-level and 11% at state-level. In seasons for which drought is the overriding factor, performance is further improved. Historical counterfactual scenarios for the 1988 and 2012 droughts show that changes in agricultural technologies and management have reduced system-level drought sensitivity in US maize production by about 25% in the intervening years. Finally, we estimate the economic costs of the two droughts in terms of insured and uninsured crop losses in each US county (for a total, adjusted for inflation, of $9 billion in 1988 and $21.6 billion in 2012). We compare these with cost estimates from the counterfactual scenarios and with crop indemnity data where available. Model-based measures are capable of accurately reproducing the direct agro-economic losses associated with extreme drought and can be used to characterize and compare events that occurred under very different conditions. This work suggests new approaches to modeling, monitoring, forecasting, and evaluating drought impacts on agriculture, as well as evaluating technological changes to inform adaptation strategies for future climate change and extreme events.

  9. Characterizing Agricultural Impacts of Recent Large-Scale US Droughts and Changing Technology and Management

    Science.gov (United States)

    Elliott, Joshua; Glotter, Michael; Ruane, Alex C.; Boote, Kenneth J.; Hatfield, Jerry L.; Jones, James W.; Rosenzweig, Cynthia; Smith, Leonard A.; Foster, Ian

    2017-01-01

    Process-based agricultural models, applied in novel ways, can reproduce historical crop yield anomalies in the US, with median absolute deviation from observations of 6.7% at the national level and 11% at the state level. In seasons for which drought is the overriding factor, performance is further improved. Historical counterfactual scenarios for the 1988 and 2012 droughts show that changes in agricultural technologies and management have reduced system-level drought sensitivity in US maize production by about 25% in the intervening years. Finally, we estimate the economic costs of the two droughts in terms of insured and uninsured crop losses in each US county (for a total, adjusted for inflation, of $9 billion in 1988 and $21.6 billion in 2012). We compare these with cost estimates from the counterfactual scenarios and with crop indemnity data where available. Model-based measures are capable of accurately reproducing the direct agro-economic losses associated with extreme drought and can be used to characterize and compare events that occurred under very different conditions. This work suggests new approaches to modeling, monitoring, forecasting, and evaluating drought impacts on agriculture, as well as evaluating technological changes to inform adaptation strategies for future climate change and extreme events.
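
    The skill metric quoted in this record (median absolute deviation between simulated and observed yields, expressed in percent) is simple to reproduce. A minimal sketch follows, with made-up yield values purely to illustrate the metric rather than the authors' pipeline:

```python
import numpy as np

def median_absolute_pct_deviation(simulated, observed):
    """Median of |simulated - observed| / |observed|, expressed in percent."""
    simulated = np.asarray(simulated, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return 100.0 * np.median(np.abs(simulated - observed) / np.abs(observed))

# Hypothetical national-level maize yields (t/ha) for a handful of seasons.
observed  = [9.5, 7.7, 10.1, 8.9, 6.2]
simulated = [9.1, 8.2, 10.6, 9.3, 6.6]
print(f"{median_absolute_pct_deviation(simulated, observed):.1f}%")
```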

  10. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are reported from tests of the material resistance to non-ductile fracture, covering both base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  11. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  12. Development of innovative technological base for large-scale nuclear power

    Energy Technology Data Exchange (ETDEWEB)

    Adamov, E.O.; Dedul, A.V.; Orlov, V.V.; Rachkov, V.I.; Slesarev, I.S. [ITC ' ' PRORYV' ' Project, Moscow (Russian Federation)

    2017-04-15

    The problems of the further development of Nuclear Power (NP), and the ways to resolve them on the basis of innovative fast reactor concepts and the Closed Equilibrium Fuel Cycle (CEFC), are analyzed. A new paradigm for NP and the corresponding NP super-task are declared: a transition to nuclear power free of vital risk, through the guaranteed elimination or suppression of all vital risks and threats (or their transformation into the category of ordinary risks and threats) on the basis of the "natural safety" principle. The Rosatom State Corporation project "PRORYV" has been launched within the Federal Target Program "Nuclear power technologies of new generation for 2010 to 2015 and in perspective till 2020" and is planned precisely for achieving these goals. The solution of this super-task is well within the reach of the PRORYV project, which is focused from the outset on realizing natural safety. The project is aimed, in particular, at construction of the demonstration lead-cooled reactor BREST-300-OD and the enterprise for closing the equilibrium fuel cycle.

  13. Chemical vapor deposition of amorphous tungsten nitride for applications in ultra-large scale interconnect technologies

    Science.gov (United States)

    Kelsey, Jean E.

    Increasing demands on computer chip technology require exploration of novel materials and deposition techniques. The driving need to reduce device dimensions without increasing device delay time has forced a move towards copper interconnects. Copper interconnects require an encapsulating barrier layer to prevent diffusion into the dielectric layer, as well as a passivation layer to protect against oxidation. One potential material for the barrier layer is tungsten nitride (WNx). Tungsten nitride is expected to perform well as a barrier because of its refractory nature and excellent thermal, chemical, and mechanical properties. In addition, it can be deposited in amorphous form. Amorphous materials have no grain boundaries, thereby making grain boundary diffusion, a fast path diffusion mechanism, impossible. In this work, a chemical vapor deposition (CVD) process was developed for the deposition of tungsten nitride. CVD was selected because it has the potential to deposit highly conformal film. High conformality is critical in a barrier layer in order to ensure viable coverage at the bottom and sides of device structures without sacrificing critical space that would be better used by the copper metal. In this manner, the total resistivity of the interconnect is minimized. The CVD WNx process was systematically optimized for film conformality, resistivity and growth rate. This was achieved by thoroughly examining film nucleation and growth characteristics, and analyzing resulting film properties. Adhesion of copper to the CVD films was qualified using stud pull tests, while X-ray diffraction was implemented to determine crystallization temperature of the amorphous phase. Additionally, diffusion barrier properties of the CVD tungsten nitride were assessed using sputter deposited copper, and compared to those of sputter deposited tungsten nitride. Thermally activated barrier failure was studied as a function of barrier thickness using Rutherford backscattering

  14. Novel electrode structures for large scale dielectrophoretic separations based on textile technology.

    Science.gov (United States)

    Abidin, Zurina Z; Downes, Les; Markx, Gerard H

    2007-06-15

    The use of dielectrophoresis (DEP) to date has mainly been limited to processing small volumes due to difficulties in the fabrication of microelectrodes over large surface areas. To overcome this problem a novel approach to the construction of micro-electrode arrays has been developed based on weaving. A plain weave cloth was made from 100 microm diameter stainless steel wires and 75 decitex polyester yarns. The stainless steel wires formed the weft, and were kept parallel and apart by a warp of flexible polyester yarns, with a gap of around 150 microm between the metal wires. The metal wires were alternately connected to earth and signal of an AC power source, and it was shown that it was possible to collect yeast cells suspended in deionised water at the metal wire surfaces by dielectrophoresis. The polyester yarn was also found to distort the electric field, creating further areas of electric field non-uniformity around the polyester yarns, further enhancing the capability of the system to attract cells. A 14 ml separation chamber was built from the cloth by alternately sandwiching perspex slabs and cloth together. The DEP chamber was able to effectively collect live yeast from a flow of suspended cells through the cloth using an applied field of 1 MHz at flow rates up to 5 ml min^-1. However, some loss occurred due to sedimentation. Also, the chamber was able to separate dead and live yeast cells at 30 Vpk-pk, 2 MHz, with some cell loss due to sedimentation.

  15. Carbon dioxide recycling: emerging large-scale technologies with industrial potential.

    Science.gov (United States)

    Quadrelli, Elsje Alessandra; Centi, Gabriele; Duplan, Jean-Luc; Perathoner, Siglinda

    2011-09-19

    This Review introduces this special issue of ChemSusChem dedicated to CO(2) recycling. Its aim is to offer an up-to-date overview of CO(2) chemical utilization (inorganic mineralization, organic carboxylation, reduction reactions, and biochemical conversion), as a continuation and extension of earlier books and reviews on this topic, but with a specific focus on large-volume routes and projects/pilot plants that are currently emerging at (pre-)industrial level. The Review also highlights how some of these routes will offer a valuable opportunity to introduce renewable energy into the existing energy and chemical infrastructure (i.e., "drop-in" renewable energy) by synthesis of chemicals from CO(2) that are easy to transport and store. CO(2) conversion therefore has the potential to become a key pillar of the sustainable and resource-efficient production of chemicals and energy from renewables. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. The MedAustron project: an example of large-scale technology transfer

    CERN Multimedia

    Antonella Del Rosso & Michael Benedikt

    2013-01-01

    In January this year, CERN’s Director-General Rolf Heuer handed over the first ion source to the MedAustron therapy centre in the town of Wiener Neustadt in the presence of the Austrian authorities. This milestone marks the beginning of the transition from the development and design phase to the commissioning of the new facility.   Handover of the ion source to MedAustron on 11 January, 2013. From left to right: Michael Benedikt (Project Leader MedAustron at CERN), Karlheinz Töchterle (Austrian Federal Minister of Science and Research), Erwin Pröll (Governor of Lower Austria), Rolf Heuer (Director-General CERN), Klaus Schneeberger (Lower Austrian State Parliament, Head of EBG MedAustron Council). The goal of the MedAustron project is the construction of an ion-therapy and research centre, based on a synchrotron accelerator complex, in Austria. “MedAustron will be the first large-sca...

  17. LARGE-SCALE MERCURY CONTROL TECHNOLOGY TESTING FOR LIGNITE-FIRED UTILITIES - OXIDATION SYSTEMS FOR WET FGD

    Energy Technology Data Exchange (ETDEWEB)

    Michael J. Holmes; Steven A. Benson; Jeffrey S. Thompson

    2004-03-01

    The Energy & Environmental Research Center (EERC) is conducting a consortium-based effort directed toward resolving the mercury (Hg) control issues facing the lignite industry. Specifically, the EERC team--the EERC, EPRI, URS, ADA-ES, Babcock & Wilcox, the North Dakota Industrial Commission, SaskPower, and the Mercury Task Force, which includes Basin Electric Power Cooperative, Otter Tail Power Company, Great River Energy, Texas Utilities (TXU), Montana-Dakota Utilities Co., Minnkota Power Cooperative, BNI Coal Ltd., Dakota Westmoreland Corporation, and the North American Coal Company--has undertaken a project to significantly and cost-effectively oxidize elemental mercury in lignite combustion gases, followed by capture in a wet scrubber. This approach will be applicable to virtually every lignite utility in the United States and Canada and potentially impact subbituminous utilities. The oxidation process is proven at the pilot-scale and in short-term full-scale tests. Additional optimization is continuing on oxidation technologies, and this project focuses on longer-term full-scale testing. The lignite industry has been proactive in advancing the understanding of and identifying control options for Hg in lignite combustion flue gases. Approximately 1 year ago, the EERC and EPRI began a series of Hg-related discussions with the Mercury Task Force as well as utilities firing Texas and Saskatchewan lignites. This project is one of three being undertaken by the consortium to perform large-scale Hg control technology testing to address the specific needs and challenges to be met in controlling Hg from lignite-fired power plants. This project involves Hg oxidation upstream of a system equipped with an electrostatic precipitator (ESP) followed by wet flue gas desulfurization (FGD). The team involved in conducting the technical aspects of the project includes the EERC, Babcock & Wilcox, URS, and ADA-ES. The host sites include Minnkota Power Cooperative Milton R. Young

  18. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out based on measurements on the Marstal plant, Denmark, and through comparison with published and unpublished data from other plants. Evaluations of the thermal, economic and environmental performance are reported, based on experiences from the last decade. For detailed designing, a computer simulation model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors...

  19. Large-scale laboratory testing of bedload-monitoring technologies: overview of the StreamLab06 Experiments

    Science.gov (United States)

    Marr, Jeffrey D.G.; Gray, John R.; Davis, Broderick E.; Ellis, Chris; Johnson, Sara; Gray, John R.; Laronne, Jonathan B.; Marr, Jeffrey D.G.

    2010-01-01

    A 3-month-long, large-scale flume experiment involving research and testing of selected conventional and surrogate bedload-monitoring technologies was conducted in the Main Channel at the St. Anthony Falls Laboratory under the auspices of the National Center for Earth-surface Dynamics. These experiments, dubbed StreamLab06, involved 25 researchers and volunteers from academia, government, and the private sector. The research channel was equipped with a sediment-recirculation system and a sediment-flux monitoring system that allowed continuous measurement of sediment flux in the flume and provided a data set by which samplers were evaluated. Selected bedload-measurement technologies were tested under a range of flow and sediment-transport conditions. The experiment was conducted in two phases. The bed material in phase I was well-sorted siliceous sand (0.6-1.8 mm median diameter). A gravel mixture (1-32 mm median diameter) composed the bed material in phase II. Four conventional bedload samplers – a standard Helley-Smith, Elwha, BLH-84, and Toutle River II (TR-2) sampler – were manually deployed as part of both experiment phases. Bedload traps were deployed in study phase II. Two surrogate bedload samplers – stationary-mounted down-looking 600 kHz and 1200 kHz acoustic Doppler current profilers – were deployed in experiment phase II. This paper presents an overview of the experiment including the specific data-collection technologies used and the ambient hydraulic, sediment-transport and environmental conditions measured as part of the experiment. All data collected as part of the StreamLab06 experiments are, or will be, available to the research community.

  20. A Web-based Multi-user Interactive Visualization System For Large-Scale Computing Using Google Web Toolkit Technology

    Science.gov (United States)

    Weiss, R. M.; McLane, J. C.; Yuen, D. A.; Wang, S.

    2009-12-01

    We have created a web-based, interactive system for multi-user collaborative visualization of large data sets (on the order of terabytes) that allows users in geographically disparate locations to simultaneously and collectively visualize large data sets over the Internet. By leveraging asynchronous JavaScript and XML (AJAX) web development paradigms via the Google Web Toolkit (http://code.google.com/webtoolkit/), we are able to provide remote, web-based users with a web portal to LCSE's (http://www.lcse.umn.edu) large-scale interactive visualization system already in place at the University of Minnesota that provides high resolution visualizations on the order of 15 million pixels. In the current version of our software, we have implemented a new, highly extensible back-end framework built around HTTP "server push" technology to provide a rich collaborative environment and a smooth end-user experience. Furthermore, the web application is accessible via a variety of devices including netbooks, iPhones, and other web- and javascript-enabled cell phones. New features in the current version include the ability for (1) users to launch multiple visualizations, (2) a user to invite one or more other users to view their visualization in real-time (multiple observers), (3) users to delegate control aspects of the visualization to others (multiple controllers), and (4) users to engage in collaborative chat and instant messaging with other users within the user interface of the web application. We will explain choices made regarding implementation, overall system architecture and method of operation, and the benefits of an extensible, modular design. We will also discuss future goals, features, and our plans for increasing scalability of the system, which includes a discussion of the benefits potentially afforded us by a migration of server-side components to the Google Application Engine (http://code.google.com/appengine/).
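
    The "server push" pattern mentioned above can be illustrated with a minimal long-polling endpoint. The sketch below is written in Python rather than the authors' GWT/Java stack, and the endpoint name and state layout are hypothetical; it only shows how waiting observers can be notified when a controlling user updates the shared visualization state:

```python
# Minimal HTTP long-polling ("server push") sketch using Python's standard library.
# Illustration of the general technique only, not the system described in the abstract.
import json
import threading
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

state = {"version": 0, "view": {}}          # shared visualization state
changed = threading.Condition()

class PushHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # GET /poll?since=N blocks until the state version exceeds N, then returns it.
        since = int(self.path.rsplit("=", 1)[-1]) if "=" in self.path else -1
        with changed:
            changed.wait_for(lambda: state["version"] > since, timeout=25)
            body = json.dumps(state).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

def update_view(new_view):
    """Called whenever a controlling user changes the visualization parameters."""
    with changed:
        state["version"] += 1
        state["view"] = new_view
        changed.notify_all()           # wake every waiting observer

if __name__ == "__main__":
    ThreadingHTTPServer(("", 8000), PushHandler).serve_forever()
```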

  1. Digital footprints: facilitating large-scale environmental psychiatric research in naturalistic settings through data from everyday technologies.

    Science.gov (United States)

    Bidargaddi, N; Musiat, P; Makinen, V-P; Ermes, M; Schrader, G; Licinio, J

    2017-02-01

    Digital footprints, the automatically accumulated by-products of our technology-saturated lives, offer an exciting opportunity for psychiatric research. The commercial sector has already embraced the electronic trails of customers as an enabling tool for guiding consumer behaviour, and analogous efforts are ongoing to monitor and improve the mental health of psychiatric patients. The untargeted collection of digital footprints that may or may not be health orientated comprises a large untapped information resource for epidemiological scale research into psychiatric disorders. Real-time monitoring of mood, sleep and physical and social activity in a substantial portion of the affected population in a naturalistic setting is unprecedented in psychiatry. We propose that digital footprints can provide these measurements from real-world settings unobtrusively and in a longitudinal fashion. In this perspective article, we outline the concept of digital footprints and the services and devices that create them, and present examples where digital footprints have been successfully used in research. We then critically discuss the opportunities and fundamental challenges associated with digital footprints in psychiatric research, such as collecting data from different sources, analysis, ethical and research design challenges.

  2. Social Network Analysis and Mining to Monitor and Identify Problems with Large-Scale Information and Communication Technology Interventions.

    Directory of Open Access Journals (Sweden)

    Aleksandra do Socorro da Silva

    Full Text Available The published literature reveals several arguments concerning the strategic importance of information and communication technology (ICT) interventions for developing countries where the digital divide is a challenge. Large-scale ICT interventions can be an option for countries whose regions, both urban and rural, present a high number of digitally excluded people. Our goal was to monitor and identify problems in interventions aimed at certification for a large number of participants in different geographical regions. Our case study is the training at the Telecentros.BR, a program created in Brazil to install telecenters and certify individuals to use ICT resources. We propose an approach that applies social network analysis and mining techniques to data collected from Telecentros.BR dataset and from the socioeconomics and telecommunications infrastructure indicators of the participants' municipalities. We found that (i) the analysis of interactions in different time periods reflects the objectives of each phase of training, highlighting the increased density in the phase in which participants develop and disseminate their projects; (ii) analysis according to the roles of participants (i.e., tutors or community members) reveals that the interactions were influenced by the center (or region) to which the participant belongs (that is, a community contained mainly members of the same region and always with the presence of tutors, contradicting expectations of the training project, which aimed for intense collaboration of the participants, regardless of the geographic region); (iii) the social network of participants influences the success of the training: that is, given evidence that the degree of the community member is in the highest range, the probability of this individual concluding the training is 0.689; (iv) the North region presented the lowest probability of participant certification, whereas the Northeast, which served municipalities with similar

  3. Social Network Analysis and Mining to Monitor and Identify Problems with Large-Scale Information and Communication Technology Interventions.

    Science.gov (United States)

    da Silva, Aleksandra do Socorro; de Brito, Silvana Rossy; Vijaykumar, Nandamudi Lankalapalli; da Rocha, Cláudio Alex Jorge; Monteiro, Maurílio de Abreu; Costa, João Crisóstomo Weyl Albuquerque; Francês, Carlos Renato Lisboa

    2016-01-01

    The published literature reveals several arguments concerning the strategic importance of information and communication technology (ICT) interventions for developing countries where the digital divide is a challenge. Large-scale ICT interventions can be an option for countries whose regions, both urban and rural, present a high number of digitally excluded people. Our goal was to monitor and identify problems in interventions aimed at certification for a large number of participants in different geographical regions. Our case study is the training at the Telecentros.BR, a program created in Brazil to install telecenters and certify individuals to use ICT resources. We propose an approach that applies social network analysis and mining techniques to data collected from Telecentros.BR dataset and from the socioeconomics and telecommunications infrastructure indicators of the participants' municipalities. We found that (i) the analysis of interactions in different time periods reflects the objectives of each phase of training, highlighting the increased density in the phase in which participants develop and disseminate their projects; (ii) analysis according to the roles of participants (i.e., tutors or community members) reveals that the interactions were influenced by the center (or region) to which the participant belongs (that is, a community contained mainly members of the same region and always with the presence of tutors, contradicting expectations of the training project, which aimed for intense collaboration of the participants, regardless of the geographic region); (iii) the social network of participants influences the success of the training: that is, given evidence that the degree of the community member is in the highest range, the probability of this individual concluding the training is 0.689; (iv) the North region presented the lowest probability of participant certification, whereas the Northeast, which served municipalities with similar
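
    The degree-based result in item (iii) can be illustrated with a few lines of network analysis. The sketch below is not the study's code: the edge list, completion labels, and the cut used for the "highest range" of degree are all hypothetical, and serve only to show how such a conditional probability is computed from an interaction network:

```python
# Illustrative sketch: relate a member's degree in the interaction network to the
# probability of concluding the training. All data below are made up.
import networkx as nx

# (member_a, member_b) interaction pairs observed during the training.
edges = [("ana", "tutor1"), ("ana", "bia"), ("bia", "tutor1"),
         ("carlos", "tutor2"), ("dora", "tutor2"), ("dora", "ana")]
completed = {"ana": True, "bia": True, "carlos": False, "dora": True,
             "tutor1": True, "tutor2": True}

g = nx.Graph(edges)
degrees = dict(g.degree())

# Probability of concluding the training among participants in the top degree range.
threshold = sorted(degrees.values())[len(degrees) // 2]   # crude "highest range" cut
high = [n for n, d in degrees.items() if d >= threshold]
p_conclude = sum(completed[n] for n in high) / len(high)
print(f"P(conclude training | high degree) = {p_conclude:.3f}")
```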

  4. Large scale composting model

    OpenAIRE

    Henon , Florent; Debenest , Gérald; Tremier , Anne; Quintard , Michel; Martel , Jean-Luc; Duchalais , Guy

    2012-01-01

    International audience; One way to treat organic wastes in accordance with environmental policies is to develop biological treatments such as composting. Nevertheless, this development largely relies on the quality of the final product and, as a consequence, on the quality of the biological activity during the treatment. Favourable conditions (oxygen concentration, temperature and moisture content) in the waste bed largely contribute to the establishment of a good aerobic biological activity an...

  5. Inkjet printing as a roll-to-roll compatible technology for the production of large area electronic devices on a pre-industrial scale

    NARCIS (Netherlands)

    Teunissen, P.; Rubingh, E.; Lammeren, T. van; Abbel, R.J.; Groen, P.

    2014-01-01

    Inkjet printing is a promising approach towards the solution processing of electronic devices on an industrial scale. Of particular interest is the production of high-end applications such as large area OLEDs on flexible substrates. Roll-to-roll (R2R) processing technologies involving inkjet

  6. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  7. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  8. Interactive Visualization of Large-Scale Hydrological Data using Emerging Technologies in Web Systems and Parallel Programming

    Science.gov (United States)

    Demir, I.; Krajewski, W. F.

    2013-12-01

    As geoscientists are confronted with increasingly massive datasets from environmental observations to simulations, one of the biggest challenges is having the right tools to gain scientific insight from the data and communicate the understanding to stakeholders. Recent developments in web technologies make it easy to manage, visualize and share large data sets with the general public. Novel visualization techniques and dynamic user interfaces allow users to interact with data, and modify the parameters to create custom views of the data to gain insight from simulations and environmental observations. This requires developing new data models and intelligent knowledge discovery techniques to explore and extract information from complex computational simulations or large data repositories. Scientific visualization will be an increasingly important component to build comprehensive environmental information platforms. This presentation provides an overview of the trends and challenges in the field of scientific visualization, and demonstrates information visualization and communication tools developed in the light of these challenges.

  9. Interfacing Detectors and Collecting Data for Large-Scale Experiments in High Energy Physics Using COTS Technology

    CERN Document Server

    Schumacher, Jorn; Wandelli, Wainer

    Data-acquisition systems for high-energy physics experiments like the ATLAS experiment at the European particle-physics research institute CERN are used to record experimental physics data and are essential for the effective operation of an experiment. Located in underground facilities with limited space, power, cooling, and exposed to ionizing radiation and strong magnetic fields, data-acquisition systems have unique requirements and are challenging to design and build. Traditionally, these systems have been composed of custom-designed electronic components to be able to cope with the large data volumes that high-energy physics experiments generate and at the same time meet technological and environmental requirements. Custom-designed electronics is costly to develop, effortful to maintain and typically not very flexible. This thesis explores an alternative architecture for data-acquisition systems based on commercial off-the-shelf (COTS) components. A COTS-based data distribution device called FELIX that w...

  10. Interfacing detectors and collecting data for large-scale experiments in high energy physics using COTS technology

    International Nuclear Information System (INIS)

    Schumacher, Joern

    2017-01-01

    Data-acquisition systems for high-energy physics experiments like the ATLAS experiment at the European particle-physics research institute CERN are used to record experimental physics data and are essential for the effective operation of an experiment. Located in underground facilities with limited space, power, cooling, and exposed to ionizing radiation and strong magnetic fields, data-acquisition systems have unique requirements and are challenging to design and build. Traditionally, these systems have been composed of custom-designed electronic components to be able to cope with the large data volumes that high-energy physics experiments generate and at the same time meet technological and environmental requirements. Custom-designed electronics is costly to develop, effortful to maintain and typically not very flexible. This thesis explores an alternative architecture for data-acquisition systems based on commercial off-the-shelf (COTS) components. A COTS-based data distribution device called FELIX that will be integrated in ATLAS is presented. The hardware and software implementation of this device is discussed, with a specific focus on performance, heterogeneity of systems and traffic patterns. The COTS-based readout approach is evaluated in the context of the future requirements of the ATLAS experiment. The main contributions of the thesis are an analysis of the ATLAS data-acquisition system with a focus on the readout system, a software architecture for the main application on FELIX hosts, a performance analysis and tuning based on computer science methods for central FELIX software components with respect to the requirements of the ATLAS experiment, a network communication library with a high-level software interface to utilize high-performance computing network technology for the purpose of data-acquisition systems, and an evaluation and discussion of ATLAS data-acquisition using FELIX systems as a case study for COTS-based data-acquisition in high

  11. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h^-1 Mpc in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale of 120 h^-1 Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  12. Large scale processing of dielectric electroactive polymers

    DEFF Research Database (Denmark)

    Vudayagiri, Sindhu

    Efficient processing techniques are vital to the success of any manufacturing industry. The processing techniques determine the quality of the products and thus, to a large extent, the performance and reliability of the products that are manufactured. The dielectric electroactive polymer (DEAP) technology is relatively new and is in the initial stages of development, with no established large scale manufacturing techniques. Danfoss Polypower A/S has set up a large scale manufacturing process to make thin film DEAP transducers. The DEAP transducers developed by Danfoss Polypower consist of microstructured elastomer surfaces on which the compliant metallic electrodes are sputtered, thus enabling large strains of the otherwise non-stretchable metal electrode. Thin microstructured polydimethylsiloxane (PDMS) films are quintessential in DEAP technology due to scaling of their actuation strain with the reciprocal...

  13. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    ...the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its main focus. Here the general perception of the nature and role in society of large scale networks as a fundamental infrastructure is analysed. This analysis focuses on the effects of the technical DDN projects and on the perception of network infrastructure as expressed by key decision makers...

  14. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the design stage. (orig.)

  15. Integration of Technology, Curriculum, and Professional Development for Advancing Middle School Mathematics: Three Large-Scale Studies

    Science.gov (United States)

    Roschelle, Jeremy; Shechtman, Nicole; Tatar, Deborah; Hegedus, Stephen; Hopkins, Bill; Empson, Susan; Knudsen, Jennifer; Gallagher, Lawrence P.

    2010-01-01

    The authors present three studies (two randomized controlled experiments and one embedded quasi-experiment) designed to evaluate the impact of replacement units targeting student learning of advanced middle school mathematics. The studies evaluated the SimCalc approach, which integrates an interactive representational technology, paper curriculum,…

  16. Institutions, technology and water control; water users associations and irrigation management reform in two large-scale systems in India

    NARCIS (Netherlands)

    Narain, V.

    2003-01-01

    Few studies of resource management have paid as much attention or intelligently surveyed the operational aspects of Water User Associations (WUAs) as Institutions, Technology and Water Control. The implementation of WUAs policies, argues this pioneering study, is shaped by the aspirations of its

  17. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    ...in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and Zaha Hadid’s Ordrupgård’s black curved smooth concrete surfaces. Furthermore, one can point to initiatives such as “Synlig beton” (visible concrete) that can be seen on the website www.synligbeton.dk, and spæncom’s aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research-development project “Lasting large scale glazed concrete formwork,” which I am working on at DTU, Department of Architectural Engineering, will be able to complement these. It is a project where I try to develop new aesthetic potentials for the concrete, in large scales that have not been seen before in the ceramic area. It is expected to result in new types of large scale and very thin, glazed concrete façades in building. If such are introduced in an architectural context as exposed surfaces...

  18. Large-Scale Mercury Control Technology Testing for Lignite-Fired Utilities - Oxidation Systems for Wet FGD

    Energy Technology Data Exchange (ETDEWEB)

    Steven A. Benson; Michael J. Holmes; Donald P. McCollor; Jill M. Mackenzie; Charlene R. Crocker; Lingbu Kong; Kevin C. Galbreath

    2007-03-31

    Mercury (Hg) control technologies were evaluated at Minnkota Power Cooperative's Milton R. Young (MRY) Station Unit 2, a 450-MW lignite-fired cyclone unit near Center, North Dakota, and TXU Energy's Monticello Steam Electric Station (MoSES) Unit 3, a 793-MW lignite--Powder River Basin (PRB) subbituminous coal-fired unit near Mt. Pleasant, Texas. A cold-side electrostatic precipitator (ESP) and wet flue gas desulfurization (FGD) scrubber are used at MRY and MoSES for controlling particulate and sulfur dioxide (SO2) emissions, respectively. Several approaches for significantly and cost-effectively oxidizing elemental mercury (Hg^0) in lignite combustion flue gases, followed by capture in an ESP and/or FGD scrubber, were evaluated. The project team involved in performing the technical aspects of the project included Babcock & Wilcox, the Energy & Environmental Research Center (EERC), the Electric Power Research Institute, and URS Corporation. Calcium bromide (CaBr2), calcium chloride (CaCl2), magnesium chloride (MgCl2), and a proprietary sorbent enhancement additive (SEA), hereafter referred to as SEA2, were added to the lignite feeds to enhance Hg capture in the ESP and/or wet FGD. In addition, powdered activated carbon (PAC) was injected upstream of the ESP at MRY Unit 2. The work involved establishing Hg concentrations and removal rates across existing ESP and FGD units, determining costs associated with a given Hg removal efficiency, quantifying the balance-of-plant impacts of the control technologies, and facilitating technology commercialization. The primary project goal was to achieve ESP-FGD Hg removal efficiencies of ≥55% at MRY and MoSES for about a month.

  19. Preinoculation of Soybean Seeds Treated with Agrichemicals up to 30 Days before Sowing: Technological Innovation for Large-Scale Agriculture

    Science.gov (United States)

    da Cruz, Sonia Purin; Martin, Thomas Newton; Nakatani, André Shigueyoshi; Nogueira, Marco Antonio; Hungria, Mariangela

    2017-01-01

    The cultivation of soybean in Brazil experienced an expressive growth in the last decades. Soybean is highly demanding on nitrogen (N) that must come from fertilizers or from biological fixation. The N supply to the soybean crop in Brazil relies on the inoculation with elite strains of Bradyrhizobium japonicum, B. elkanii, and B. diazoefficiens, which are able to fulfill the crop's N requirements and enrich the soil for the following crop. The effectiveness of the association between N2-fixing bacteria and soybean plants depends on the efficacy of the inoculation process. Seed treatment with pesticides, especially fungicides or micronutrients, may rapidly kill the inoculated bacteria, affecting the establishment and outcome of the symbiosis. The development of technologies that allow inoculation to become a successful component of industrial seed treatment represents a valuable tool for the seed industry, as well as for the soybean crop worldwide. In this article, we report the results of new technologies, developed by the company Total Biotecnologia Indústria e Comércio S/A of Brazil, for preinoculation of soybean seeds with bradyrhizobia, in the presence of agrichemicals. Our results demonstrate improved bacterial survival for up to 30 days after inoculation, without compromising nodulation, N2-fixation, and yield in the field. PMID:29129977

  20. Preinoculation of Soybean Seeds Treated with Agrichemicals up to 30 Days before Sowing: Technological Innovation for Large-Scale Agriculture

    Directory of Open Access Journals (Sweden)

    Ricardo Silva Araujo

    2017-01-01

    Full Text Available The cultivation of soybean in Brazil experienced an expressive growth in the last decades. Soybean is highly demanding on nitrogen (N that must come from fertilizers or from biological fixation. The N supply to the soybean crop in Brazil relies on the inoculation with elite strains of Bradyrhizobium japonicum, B. elkanii, and B. diazoefficiens, which are able to fulfill the crop’s N requirements and enrich the soil for the following crop. The effectiveness of the association between N2-fixing bacteria and soybean plants depends on the efficacy of the inoculation process. Seed treatment with pesticides, especially fungicides or micronutrients, may rapidly kill the inoculated bacteria, affecting the establishment and outcome of the symbiosis. The development of technologies that allow inoculation to become a successful component of industrial seed treatment represents a valuable tool for the seed industry, as well as for the soybean crop worldwide. In this article, we report the results of new technologies, developed by the company Total Biotecnologia Indústria e Comércio S/A of Brazil, for preinoculation of soybean seeds with bradyrhizobia, in the presence of agrichemicals. Our results demonstrate improved bacterial survival for up to 30 days after inoculation, without compromising nodulation, N2-fixation, and yield in the field.

  1. Preinoculation of Soybean Seeds Treated with Agrichemicals up to 30 Days before Sowing: Technological Innovation for Large-Scale Agriculture.

    Science.gov (United States)

    Araujo, Ricardo Silva; da Cruz, Sonia Purin; Souchie, Edson Luiz; Martin, Thomas Newton; Nakatani, André Shigueyoshi; Nogueira, Marco Antonio; Hungria, Mariangela

    2017-01-01

    The cultivation of soybean in Brazil experienced an expressive growth in the last decades. Soybean is highly demanding on nitrogen (N) that must come from fertilizers or from biological fixation. The N supply to the soybean crop in Brazil relies on the inoculation with elite strains of Bradyrhizobium japonicum, B. elkanii, and B. diazoefficiens, which are able to fulfill the crop's N requirements and enrich the soil for the following crop. The effectiveness of the association between N2-fixing bacteria and soybean plants depends on the efficacy of the inoculation process. Seed treatment with pesticides, especially fungicides or micronutrients, may rapidly kill the inoculated bacteria, affecting the establishment and outcome of the symbiosis. The development of technologies that allow inoculation to become a successful component of industrial seed treatment represents a valuable tool for the seed industry, as well as for the soybean crop worldwide. In this article, we report the results of new technologies, developed by the company Total Biotecnologia Indústria e Comércio S/A of Brazil, for preinoculation of soybean seeds with bradyrhizobia, in the presence of agrichemicals. Our results demonstrate improved bacterial survival for up to 30 days after inoculation, without compromising nodulation, N2-fixation, and yield in the field.

  2. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on the 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation with science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  3. Large Scale Coordination of Small Scale Structures

    Science.gov (United States)

    Kobelski, Adam; Tarr, Lucas A.; Jaeggli, Sarah A.; Savage, Sabrina

    2017-08-01

    Transient brightenings are ubiquitous features of the solar atmosphere across many length and energy scales, the most energetic of which manifest as large-class solar flares. Often, transient brightenings originate in regions of strong magnetic activity and create strong observable enhancements across wavelengths from X-ray to radio, with notable dynamics on timescales of seconds to hours. The coronal aspects of these brightenings have often been studied by way of EUV and X-ray imaging and spectra. These events are likely driven by photospheric activity (such as flux emergence) with the coronal brightenings originating largely from chromospheric ablation (evaporation). Until recently, chromospheric and transition region observations of these events have been limited. However, new observational capabilities have become available which significantly enhance our ability to understand the bi-directional flow of energy through the chromosphere between the photosphere and the corona. We have recently obtained a unique data set with which to study this flow of energy through the chromosphere via the Interface Region Imaging Spectrograph (IRIS), Hinode EUV Imaging Spectrometer (EIS), Hinode X-Ray Telescope (XRT), Hinode Solar Optical Telescope (SOT), Solar Dynamics Observatory (SDO) Atmospheric Imaging Assembly (AIA), SDO Helioseismic and Magnetic Imager (HMI), Nuclear Spectroscopic Telescope Array (NuStar), Atacama Large Millimeter Array (ALMA), and Interferometric BIdimensional Spectropolarimeter (IBIS) at the Dunn Solar Telescope (DST). This data set targets a small active area near disk center which was tracked simultaneously for approximately four hours. Within this region, many transient brightenings were detected through multiple layers of the solar atmosphere. In this study, we combine the imaging data and use the spectra from EIS and IRIS to track flows from the photosphere (HMI, SOT) through the chromosphere and transition region (AIA, IBIS, IRIS, ALMA) into the corona

  4. Large-scale nanophotonic phased array.

    Science.gov (United States)

    Sun, Jie; Timurdogan, Erman; Yaacobi, Ami; Hosseini, Ehsan Shah; Watts, Michael R

    2013-01-10

    Electromagnetic phased arrays at radio frequencies are well known and have enabled applications ranging from communications to radar, broadcasting and astronomy. The ability to generate arbitrary radiation patterns with large-scale phased arrays has long been pursued. Although it is extremely expensive and cumbersome to deploy large-scale radiofrequency phased arrays, optical phased arrays have a unique advantage in that the much shorter optical wavelength holds promise for large-scale integration. However, the short optical wavelength also imposes stringent requirements on fabrication. As a consequence, although optical phased arrays have been studied with various platforms and recently with chip-scale nanophotonics, all of the demonstrations so far are restricted to one-dimensional or small-scale two-dimensional arrays. Here we report the demonstration of a large-scale two-dimensional nanophotonic phased array (NPA), in which 64 × 64 (4,096) optical nanoantennas are densely integrated on a silicon chip within a footprint of 576 μm × 576 μm with all of the nanoantennas precisely balanced in power and aligned in phase to generate a designed, sophisticated radiation pattern in the far field. We also show that active phase tunability can be realized in the proposed NPA by demonstrating dynamic beam steering and shaping with an 8 × 8 array. This work demonstrates that a robust design, together with state-of-the-art complementary metal-oxide-semiconductor technology, allows large-scale NPAs to be implemented on compact and inexpensive nanophotonic chips. In turn, this enables arbitrary radiation pattern generation using NPAs and therefore extends the functionalities of phased arrays beyond conventional beam focusing and steering, opening up possibilities for large-scale deployment in applications such as communication, laser detection and ranging, three-dimensional holography and biomedical sciences, to name just a few.
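
    The beam forming described above follows from the textbook array-factor expression for a uniformly spaced phased array. A toy calculation is sketched below; the wavelength and element pitch are assumed values chosen for illustration, not the device parameters reported in the paper, and it only shows how a linear phase gradient across a 64-element row steers the main lobe:

```python
# Toy array-factor calculation for one row of an N x N uniformly spaced phased array,
# with a linear phase gradient applied to steer the main beam. Parameters are assumed.
import numpy as np

N = 64                      # elements per side (64 x 64 array, as in the abstract)
wavelength = 1.55e-6        # assumed operating wavelength [m]
pitch = 9.0e-6              # assumed element spacing [m]
k = 2 * np.pi / wavelength

m = np.arange(N) - (N - 1) / 2                 # element indices, centred on the array
theta_steer = np.deg2rad(2.0)                  # desired steering angle in one plane
phase = -k * pitch * m * np.sin(theta_steer)   # per-element phase for steering

theta = np.deg2rad(np.linspace(-5, 5, 1001))   # observation angles
# Array factor of one 64-element row (the 2-D pattern is the product of two such factors).
af = np.exp(1j * (k * pitch * np.outer(np.sin(theta), m) + phase)).sum(axis=1)
power_db = 20 * np.log10(np.abs(af) / np.abs(af).max())
print("beam peak at %.2f deg" % np.rad2deg(theta[np.argmax(power_db)]))
```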

  5. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg

    2009-01-01

    To establish planar biomimetic membranes across large scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro...... peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further developments of sensitive biosensor assays...

  6. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  7. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  8. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abro...

  9. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  10. Large scale in-situ BOrehole and Geofluid Simulator (i.BOGS) for the development and testing of borehole technologies at reservoir conditions

    Science.gov (United States)

    Duda, Mandy; Bracke, Rolf; Stöckhert, Ferdinand; Wittig, Volker

    2017-04-01

    A fundamental problem of technological applications related to the exploration and provision of geothermal energy is the inaccessibility of subsurface processes. As a result, actual reservoir properties can only be determined using (a) indirect measurement techniques such as seismic surveys, machine feedback and geophysical borehole logging, (b) laboratory experiments capable of simulating in-situ properties, but failing to preserve temporal and spatial scales, or vice versa, and (c) numerical simulations. Moreover, technological applications related to the drilling process, the completion and cementation of a wellbore or the stimulation and exploitation of the reservoir are exposed to high pressure and temperature conditions as well as corrosive environments resulting from both rock formation and geofluid characteristics. To address fundamental and applied questions in the context of geothermal energy provision and subsurface exploration in general, one of Europe's largest geoscientific laboratory infrastructures is introduced. The in-situ Borehole and Geofluid Simulator (i.BOGS) allows the simulation of quasi scale-preserving processes at reservoir conditions up to depths of 5000 m and represents a large scale pressure vessel for iso-/hydrostatic and pore pressures up to 125 MPa and temperatures from -10°C to 180°C. The autoclave can either be filled with large rock core samples (25 cm in diameter, up to 3 m length) or with fluids and technical borehole devices (e.g. pumps, sensors). The pressure vessel is equipped with an ultrasound system for active transmission and passive recording of acoustic emissions, and can be complemented by additional sensors. The i.BOGS forms the basic module for the Match.BOGS, which will finally consist of three modules, i.e. (A) the i.BOGS, (B) the Drill.BOGS, a drilling module to be attached to the i.BOGS capable of applying realistic torques and contact forces to a drilling device that enters the i.BOGS, and (C) the Fluid.BOGS, a geofluid

  11. Large-scale galaxy bias

    Science.gov (United States)

    Jeong, Donghui; Desjacques, Vincent; Schmidt, Fabian

    2018-01-01

    Here, we briefly introduce the key results of the recent review (arXiv:1611.09787), whose abstract is as follows. This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy (or halo) statistics. We then review the excursion set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.

  12. Large-scale galaxy bias

    Science.gov (United States)

    Desjacques, Vincent; Jeong, Donghui; Schmidt, Fabian

    2018-02-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy statistics. We then review the excursion-set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.
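
    For reference, the leading terms of such a perturbative bias expansion can be written schematically as follows (our shorthand notation; the review develops the complete expansion, including higher-derivative and stochastic terms, in full detail):

        \delta_g(\mathbf{x},\tau) = b_1\,\delta(\mathbf{x},\tau)
            + \tfrac{1}{2}\,b_2\,\delta^2(\mathbf{x},\tau)
            + b_{K^2}\,(K_{ij}K^{ij})(\mathbf{x},\tau) + \dots + \epsilon(\mathbf{x},\tau),

    where \delta is the matter overdensity, K_{ij} the tidal field, the b_n are the bias parameters, and \epsilon is a stochastic term uncorrelated with the large-scale density field.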

  13. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  14. Large-Scale Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  15. Large scale homing in honeybees.

    Directory of Open Access Journals (Sweden)

    Mario Pahl

    Honeybee foragers frequently fly several kilometres to and from vital resources, and communicate those locations to their nest mates by a symbolic dance language. Research has shown that they achieve this feat by memorizing landmarks and the skyline panorama, using the sun and polarized skylight as compasses and by integrating their outbound flight paths. In order to investigate the capacity of the honeybees' homing abilities, we artificially displaced foragers to novel release spots at various distances up to 13 km in the four cardinal directions. Returning bees were individually registered by a radio frequency identification (RFID) system at the hive entrance. We found that homing rate, homing speed and the maximum homing distance depend on the release direction. Bees released in the east were more likely to find their way back home, and returned faster than bees released in any other direction, due to the familiarity of global landmarks seen from the hive. Our findings suggest that such large scale homing is facilitated by global landmarks acting as beacons, and possibly the entire skyline panorama.

  16. Large floating structures technological advances

    CERN Document Server

    Wang, BT

    2015-01-01

    This book surveys key projects that have seen the construction of large floating structures or have attained detailed conceptual designs. This compilation of key floating structures in a single volume captures the innovative features that mark the technological advances made in this field of engineering, and will provide a useful reference for ideas, analysis, design, and construction of these unique and emerging urban projects to offshore and marine engineers, urban planners, architects and students.

  17. Large Pilot Scale Testing of Linde/BASF Post-Combustion CO2 Capture Technology at the Abbott Coal-Fired Power Plant

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, Kevin C. [University of Illinois, Champaign, IL (United States)

    2017-08-18

    The work summarized in this report is the first step towards a project that will re-train and create jobs for personnel in the coal industry and continue regional economic development to benefit regions impacted by previous downturns. The larger project is aimed at capturing ~300 tons/day (272 metric tonnes/day) CO2 at a 90% capture rate from existing coal- fired boilers at the Abbott Power Plant on the campus of University of Illinois (UI). It will employ the Linde-BASF novel amine-based advanced CO2 capture technology, which has already shown the potential to be cost-effective, energy efficient and compact at the 0.5-1.5 MWe pilot scales. The overall objective of the project is to design and install a scaled-up system of nominal 15 MWe size, integrate it with the Abbott Power Plant flue gas, steam and other utility systems, and demonstrate the viability of continuous operation under realistic conditions with high efficiency and capacity. The project will also begin to build a workforce that understands how to operate and maintain the capture plants by including students from regional community colleges and universities in the operation and evaluation of the capture system. This project will also lay the groundwork for follow-on projects that pilot utilization of the captured CO2 from coal-fired power plants. The net impact will be to demonstrate a replicable means to (1) use a standardized procedure to evaluate power plants for their ability to be retrofitted with a pilot capture unit; (2) design and construct reliable capture systems based on the Linde-BASF technology; (3) operate and maintain these systems; (4) implement training programs with local community colleges and universities to establish a workforce to operate and maintain the systems; and (5) prepare to evaluate at the large pilot scale level various methods to utilize the resulting captured CO2. Towards the larger project goal, the UI-led team, together

  18. Handbook of Large-Scale Random Networks

    CERN Document Server

    Bollobas, Bela; Miklos, Dezso

    2008-01-01

    Covers various aspects of large-scale networks, including mathematical foundations and rigorous results of random graph theory, modeling and computational aspects of large-scale networks, as well as areas in physics, biology, neuroscience, sociology and technical areas

  19. Large-scale field application of RNAi technology reducing Israeli acute paralysis virus disease in honey bees (Apis mellifera, Hymenoptera: Apidae).

    Directory of Open Access Journals (Sweden)

    Wayne Hunter

    The importance of honey bees to the world economy far surpasses their contribution in terms of honey production; they are responsible for up to 30% of the world's food production through pollination of crops. Since fall 2006, honey bees in the U.S. have faced a serious population decline, due in part to a phenomenon called Colony Collapse Disorder (CCD), which is a disease syndrome that is likely caused by several factors. Data from an initial study in which investigators compared pathogens in honey bees affected by CCD suggested a putative role for Israeli Acute Paralysis Virus, IAPV. This is a single stranded RNA virus with no DNA stage placed taxonomically within the family Dicistroviridae. Although subsequent studies have failed to find IAPV in all CCD diagnosed colonies, IAPV has been shown to cause honey bee mortality. RNA interference technology (RNAi) has been used successfully to silence endogenous insect (including honey bee) genes both by injection and feeding. Moreover, RNAi was shown to prevent bees from succumbing to infection from IAPV under laboratory conditions. In the current study IAPV specific homologous dsRNA was used in the field, under natural beekeeping conditions in order to prevent mortality and improve the overall health of bees infected with IAPV. This controlled study included a total of 160 honey bee hives in two discrete climates, seasons and geographical locations (Florida and Pennsylvania). To our knowledge, this is the first successful large-scale real world use of RNAi for disease control.

  20. Large-scale field application of RNAi technology reducing Israeli acute paralysis virus disease in honey bees (Apis mellifera, Hymenoptera: Apidae).

    Science.gov (United States)

    Hunter, Wayne; Ellis, James; Vanengelsdorp, Dennis; Hayes, Jerry; Westervelt, Dave; Glick, Eitan; Williams, Michael; Sela, Ilan; Maori, Eyal; Pettis, Jeffery; Cox-Foster, Diana; Paldi, Nitzan

    2010-12-23

    The importance of honey bees to the world economy far surpasses their contribution in terms of honey production; they are responsible for up to 30% of the world's food production through pollination of crops. Since fall 2006, honey bees in the U.S. have faced a serious population decline, due in part to a phenomenon called Colony Collapse Disorder (CCD), which is a disease syndrome that is likely caused by several factors. Data from an initial study in which investigators compared pathogens in honey bees affected by CCD suggested a putative role for Israeli Acute Paralysis Virus, IAPV. This is a single stranded RNA virus with no DNA stage placed taxonomically within the family Dicistroviridae. Although subsequent studies have failed to find IAPV in all CCD diagnosed colonies, IAPV has been shown to cause honey bee mortality. RNA interference technology (RNAi) has been used successfully to silence endogenous insect (including honey bee) genes both by injection and feeding. Moreover, RNAi was shown to prevent bees from succumbing to infection from IAPV under laboratory conditions. In the current study IAPV specific homologous dsRNA was used in the field, under natural beekeeping conditions in order to prevent mortality and improve the overall health of bees infected with IAPV. This controlled study included a total of 160 honey bee hives in two discrete climates, seasons and geographical locations (Florida and Pennsylvania). To our knowledge, this is the first successful large-scale real world use of RNAi for disease control.

  1. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  2. Based on Real Time Remote Health Monitoring Systems: A New Approach for Prioritization "Large Scales Data" Patients with Chronic Heart Diseases Using Body Sensors and Communication Technology.

    Science.gov (United States)

    Kalid, Naser; Zaidan, A A; Zaidan, B B; Salman, Omar H; Hashim, M; Albahri, O S; Albahri, A S

    2018-03-02

    This paper presents a new approach to prioritize "Large-scale Data" of patients with chronic heart diseases by using body sensors and communication technology during disasters and peak seasons. An evaluation matrix is used for emergency evaluation and large-scale data scoring of patients with chronic heart diseases in telemedicine environment. However, one major problem in the emergency evaluation of these patients is establishing a reasonable threshold for patients with the most and least critical conditions. This threshold can be used to detect the highest and lowest priority levels when all the scores of patients are identical during disasters and peak seasons. A practical study was performed on 500 patients with chronic heart diseases and different symptoms, and their emergency levels were evaluated based on four main measurements: electrocardiogram, oxygen saturation sensor, blood pressure monitoring, and non-sensory measurement tool, namely, text frame. Data alignment was conducted for the raw data and decision-making matrix by converting each extracted feature into an integer. This integer represents their state in the triage level based on medical guidelines to determine the features from different sources in a platform. The patients were then scored based on a decision matrix by using multi-criteria decision-making techniques, namely, integrated multi-layer for analytic hierarchy process (MLAHP) and technique for order performance by similarity to ideal solution (TOPSIS). For subjective validation, cardiologists were consulted to confirm the ranking results. For objective validation, mean ± standard deviation was computed to check the accuracy of the systematic ranking. This study provides scenarios and checklist benchmarking to evaluate the proposed and existing prioritization methods. Experimental results revealed the following. (1) The integration of TOPSIS and MLAHP effectively and systematically solved the patient settings on triage and
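
    The TOPSIS step of the ranking described above can be illustrated with a minimal sketch (the decision matrix, weights and benefit/cost coding below are toy values for illustration only; the study derives its weights with MLAHP and codes features according to medical triage guidelines, neither of which is reproduced here):

        import numpy as np

        # Toy decision matrix: rows = patients, columns = criteria
        # (e.g., ECG score, SpO2, blood-pressure score, text-frame triage code).
        # All numbers are made up for illustration.
        X = np.array([
            [3.0, 88.0, 2.0, 3.0],
            [1.0, 96.0, 1.0, 1.0],
            [2.0, 92.0, 3.0, 2.0],
        ])
        weights = np.array([0.4, 0.3, 0.2, 0.1])       # assumed; MLAHP would supply these
        benefit = np.array([True, False, True, True])  # True = larger value means more urgent

        # 1. Vector-normalize each criterion and apply the weights.
        V = weights * X / np.linalg.norm(X, axis=0)

        # 2. Ideal (most urgent) and anti-ideal profiles per criterion.
        ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
        anti = np.where(benefit, V.min(axis=0), V.max(axis=0))

        # 3. Relative closeness to the ideal profile -> priority score in [0, 1].
        d_plus = np.linalg.norm(V - ideal, axis=1)
        d_minus = np.linalg.norm(V - anti, axis=1)
        closeness = d_minus / (d_plus + d_minus)

        print("triage order (most urgent first):", np.argsort(-closeness))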

  3. A large-scale industrial CT's data transfer system

    International Nuclear Information System (INIS)

    Chen Xuesong

    2004-01-01

    The large-scale industrial CT generates a large amount of data during operation. To guarantee reliable real-time transfer of those data, the author designs a transfer scheme based on WLAN technology and resolves the bottleneck caused by the limited data rate by using multi-thread technology. (author)
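
    A minimal sketch of the multi-threaded idea (not the author's implementation; send_block() stands in for whatever WLAN socket write the real system performs, and the acquisition loop is simulated) is to decouple data acquisition from transmission with a bounded queue and several sender threads, so that no single connection limits throughput:

        import queue
        import threading

        NUM_SENDERS = 4
        blocks = queue.Queue(maxsize=64)      # bounded buffer between acquisition and senders

        def send_block(block: bytes) -> None:
            pass                              # placeholder for a socket write over one WLAN link

        def sender() -> None:
            while True:
                block = blocks.get()
                if block is None:             # sentinel: acquisition has finished
                    blocks.task_done()
                    break
                send_block(block)
                blocks.task_done()

        threads = [threading.Thread(target=sender) for _ in range(NUM_SENDERS)]
        for t in threads:
            t.start()

        for _ in range(1000):                 # simulated CT projection frames
            blocks.put(bytes(4096))
        for _ in threads:
            blocks.put(None)

        blocks.join()
        for t in threads:
            t.join()
        print("all blocks handed off to sender threads")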

  4. Large-scale turbulence structures in shallow separating flows

    NARCIS (Netherlands)

    Talstra, H.

    2011-01-01

    The Ph.D. thesis “Large-scale turbulence structures in shallow separating flows” by Harmen Talstra is the result of a Ph.D. research project on large-scale shallow-flow turbulence, which has been performed in the Environmental Fluid Mechanics Laboratory at Delft University of Technology. The

  5. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    were the main reasons for the failure of large-scale jatropha plantations in Ethiopia. The findings in Chapter 3 show that when the use of family labour is combined with easy access to credit and technology, outgrowers on average achieve higher productivity than that obtained on large-scale plantations...

  6. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  7. Introducing Large-Scale Innovation in Schools

    Science.gov (United States)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  8. Large-scale field application of RNAi technology reducing Israeli Acute Paralysis Virus Disease in honey bees (Apis mellifera, Hymenoptera; Apidae)

    Science.gov (United States)

    We present the first successful use of RNAi in a large-scale real-world application for disease control. Israeli acute paralysis virus, IAPV, has been linked as a contributing factor in colony collapse disorder (CCD) of honey bees. IAPV specific homologous dsRNA were designed to reduce impacts from IAPV i...

  9. Large-scale solar heating

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Advanced Energy Systems

    1998-10-01

    The solar heating market is growing in many European countries and the annually installed collector area has exceeded one million square meters. There are dozens of collector manufacturers and hundreds of firms making solar heating installations in Europe. One tendency in solar heating is towards larger systems. These can be roof integrated, consisting of some tens or hundreds of square meters of collectors, or they can be larger centralized solar district heating plants consisting of a few thousand square meters of collectors. The increase in size can reduce the specific investments of solar heating systems, because e.g. the costs of some components (controllers, pumps, and pipes), planning and installation can be smaller in larger systems. The solar heat output can also be higher in large systems, because more advanced techniques are economically viable.

  10. Engineering management of large scale systems

    Science.gov (United States)

    Sanders, Serita; Gill, Tepper L.; Paul, Arthur S.

    1989-01-01

    The organization of high technology and engineering problem solving has given rise to an emerging concept. Reasoning principles for integrating traditional engineering problem solving with system theory, management sciences, behavioral decision theory, and planning and design approaches can be incorporated into a methodological approach to solving problems with a long range perspective. Long range planning has a great potential to improve productivity by using a systematic and organized approach. Thus, efficiency and cost effectiveness are the driving forces in promoting the organization of engineering problems. Aspects of systems engineering that provide an understanding of management of large scale systems are broadly covered here. Due to the focus and application of research, other significant factors (e.g., human behavior, decision making, etc.) are not emphasized but are considered.

  11. Pressing technology for large bottoms

    International Nuclear Information System (INIS)

    Jilek, L.

    1986-01-01

    The technology has been selected of a circular plate bent into the shape of a trough, for pressing bottoms of pressure vessels from a circular plate of large diameter. The initial sheet is first bent in the middle by heating with the edges remaining straight. These are then welded longitudinally by electroslag welding and the circular shape is flame cut. The result will be a plate with a straight surface in the middle with raised edges which may be pressed into the desired shape. In this manner it is also possible to press pressure vessel bottoms with tube couplings from plates which are thickened in the middle and drilled; additional welding is then eliminated. Deformation from heat treatment may be avoided by the use of a fixture in the shape of a ring with a groove into which is fixed the edge of the bottom. During hardening of the bottom it will be necessary to care for the withdrawal of vapours and gases which would hamper uniform cooling. Bottom hardening with the grill and the cupola downwards has been proven. Deformation which occurs during treatment may to a certain extent be removed by calibration which cannot, however, be made without special fixtures and instruments. (J.B.)

  12. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  13. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  14. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    Cristina Rulli, Maria; D’Odorico, Paolo

    2014-01-01

    The increasing demand for agricultural products and the uncertainty of international food markets has recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crops yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300–550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing of the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations. (letter)

  15. Food appropriation through large scale land acquisitions

    Science.gov (United States)

    Rulli, Maria Cristina; D'Odorico, Paolo

    2014-05-01

    The increasing demand for agricultural products and the uncertainty of international food markets has recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crops yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300-550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190-370 million people could be supported by this land without closing of the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations.
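
    The order of magnitude of such estimates can be checked with a simple back-of-the-envelope calculation (every input below is an assumption chosen for illustration, not a figure from the paper's land-deal dataset):

        # People potentially fed = area x yield x energy content / per-capita energy need.
        acquired_area_ha = 40e6          # assumed order of magnitude of land under large deals
        cereal_yield_t_per_ha = 3.0      # assumed yield once modern technology closes the gap
        kcal_per_tonne = 3.3e6           # roughly the energy content of cereal grain
        kcal_per_person_per_year = 2500 * 365

        people_fed = (acquired_area_ha * cereal_yield_t_per_ha * kcal_per_tonne
                      / kcal_per_person_per_year)
        print(f"~{people_fed / 1e6:.0f} million people")   # lands in the few-hundred-million range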

  16. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...... with high wind power penetration. This paper presents a review of the electricity storage technologies relevant for large power systems. The paper also presents an estimation of the economic feasibility of electricity storage using the west Danish power market area as a case....

  17. EFG Technology and Diagnostic R&D for Large-Scale PV Manufacturing; Final Subcontract Report, 1 March 2002 - 31 March 2005

    Energy Technology Data Exchange (ETDEWEB)

    Kalejs, J.; Aurora, P.; Bathey, B.; Cao, J.; Doedderlein, J.; Gonsiorawski, R.; Heath, B.; Kubasti, J.; Mackintosh, B.; Ouellette, M.; Rosenblum, M.; Southimath, S.; Xavier, G.

    2005-10-01

    The objective of this subcontract was to carry out R&D to advance the technology, processes, and performance of RWE Schott-Solar's wafer, cell, and module manufacturing lines, and help configure these lines for scaling up of edge-defined, film-fed growth (EFG) ribbon technology to the 50-100 MW PV factory level. EFG ribbon manufacturing continued to expand during this subcontract period and now has reached a capacity of 40 MW. EFG wafer products were diversified over this time period. In addition to 10 cm x 10 cm and 10 cm x 15 cm wafer areas, which were the standard products at the beginning of this program, R&D has focused on new EFG technology to extend production to 12.5 cm x 12.5 cm EFG wafers. Cell and module production also has continued to expand in Billerica. A new 12-MW cell line was installed and brought on line in 2003. R&D on this subcontract improved cell yield and throughput, and optimized the cell performance, with special emphasis on work to speed up wafer transfer, hence enhancing throughput. Improvements of wafer transfer processes during this program have raised cell line capacity from 12 MW to over 18 MW. Optimization of module manufacturing processes was carried out on new equipment installed during a manufacturing upgrade in Billerica to a 12-MW capacity to improve yield and reliability of products.

  18. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  19. Achieving Agility and Stability in Large-Scale Software Development

    Science.gov (United States)

    2013-01-16

    Presentation record (fragmentary slide text): an SEI virtual event, "Architecting in a Complex World" (Carnegie Mellon University, 2013), on achieving agility and stability in large-scale software development, presented by staff of the SEI Research, Technology, and System Solutions Program. The surviving text includes an audience poll on the development process currently in use (e.g., agile development with Scrum, XP practices, or test-driven development) and references to large-scale agile practice.

  20. An Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have carried out a great deal of research on the development of the Internet of Things (IOT) and networked physical systems. However, few have given a detailed picture of the large-scale communication architecture of the IOT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  1. Inflation, large scale structure and particle physics

    Indian Academy of Sciences (India)

    We review experimental and theoretical developments in inflation and its application to structure formation, including the curvaton idea. We then discuss a particle physics model of supersymmetric hybrid inflation at the intermediate scale in which the Higgs scalar field is responsible for large scale structure, show how such ...

  2. Thermal power generation projects ``Large Scale Solar Heating``; EU-Thermie-Projekte ``Large Scale Solar Heating``

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the ``Large-Scale Solar Heating`` programme for a Europe-wide development of this technology. The resulting demonstration programme was judged favourably by the experts but was not immediately (1996) accepted for financial support. In November 1997 the EU Commission provided 1.5 million ECU, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved; it had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves the transfer of technology. (orig.)

  3. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10^6 m^3/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet) including liquid fuels with no net greenhouse gas emissions and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen, and (3) create the industrial partnerships to commercialize such technologies. (author)

  4. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has evolved rapidly, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  5. Large scale digital atlases in neuroscience

    Science.gov (United States)

    Hawrylycz, M.; Feng, D.; Lau, C.; Kuan, C.; Miller, J.; Dang, C.; Ng, L.

    2014-03-01

    Imaging in neuroscience has revolutionized our current understanding of brain structure, architecture and increasingly its function. Many characteristics of morphology, cell type, and neuronal circuitry have been elucidated through methods of neuroimaging. Combining this data in a meaningful, standardized, and accessible manner is the scope and goal of the digital brain atlas. Digital brain atlases are used today in neuroscience to characterize the spatial organization of neuronal structures, for planning and guidance during neurosurgery, and as a reference for interpreting other data modalities such as gene expression and connectivity data. The field of digital atlases is extensive and in addition to atlases of the human includes high quality brain atlases of the mouse, rat, rhesus macaque, and other model organisms. Using techniques based on histology, structural and functional magnetic resonance imaging as well as gene expression data, modern digital atlases use probabilistic and multimodal techniques, as well as sophisticated visualization software to form an integrated product. Toward this goal, brain atlases form a common coordinate framework for summarizing, accessing, and organizing this knowledge and will undoubtedly remain a key technology in neuroscience in the future. Since the development of its flagship project of a genome wide image-based atlas of the mouse brain, the Allen Institute for Brain Science has used imaging as a primary data modality for many of its large scale atlas projects. We present an overview of Allen Institute digital atlases in neuroscience, with a focus on the challenges and opportunities for image processing and computation.

  6. A new technology of CO2 supplementary for microalgae cultivation on large scale - A spraying absorption tower coupled with an outdoor open runway pond.

    Science.gov (United States)

    Zhang, Chun-Dan; Li, Wei; Shi, Yun-Hai; Li, Yuan-Guang; Huang, Jian-Ke; Li, Hong-Xia

    2016-06-01

    An effective CO2 supply system, consisting of a spraying absorption tower coupled with an outdoor open raceway pond (ORWP), is developed in this paper for photoautotrophic microalgae cultivation. The microalgae yield, productivity and CO2 fixation efficiency were investigated and compared with those of the bubbling method. The maximum biomass yield and productivity reached 0.927 g L(-1) and 0.114 g L(-1) day(-1), respectively. The CO2 fixation efficiency achieved with the spraying tower reached 50%, compared with only 11.17% for the bubbling method. Pure CO2 can be used in the spraying absorption tower, and the flow rate was only about one third of that used in the bubbling cultivation. These results show that this method of quantitatively controlled CO2 supply can meet the requirements of large-scale microalgae cultivation. Copyright © 2016 Elsevier Ltd. All rights reserved.
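
    To make the efficiency figures concrete, the underlying mass balance can be sketched as follows (the pond volume and the CO2-per-biomass factor are assumptions; only the reported productivity and the two fixation efficiencies are taken from the abstract):

        # CO2 demand implied by a given biomass productivity and fixation efficiency.
        productivity_g_per_L_day = 0.114      # reported maximum productivity
        pond_volume_L = 10_000                # assumed raceway-pond working volume
        co2_per_biomass = 1.83                # g CO2 per g dry biomass, assuming ~50% carbon content

        co2_fixed_kg_day = productivity_g_per_L_day * pond_volume_L * co2_per_biomass / 1000
        for method, efficiency in [("spraying tower", 0.50), ("bubbling", 0.1117)]:
            co2_supplied_kg_day = co2_fixed_kg_day / efficiency
            print(f"{method}: supply ~{co2_supplied_kg_day:.1f} kg CO2/day "
                  f"to fix {co2_fixed_kg_day:.2f} kg/day")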

  7. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, have further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  8. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes the invention of large-area and low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample changes back to its brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the
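
    The color tuning described above follows the usual Bragg-Snell picture for an opal-type lattice; the sketch below uses textbook values (fcc close-packed pores and an assumed polymer refractive index of about 1.48) rather than the thesis' measured parameters:

        import math

        def bragg_peak_nm(pore_diameter_nm, n_pore, n_matrix=1.48, pore_fraction=0.74):
            """Normal-incidence Bragg-Snell reflection peak of an fcc macroporous film."""
            d111 = math.sqrt(2.0 / 3.0) * pore_diameter_nm          # fcc (111) plane spacing
            n_eff = math.sqrt(pore_fraction * n_pore**2 + (1.0 - pore_fraction) * n_matrix**2)
            return 2.0 * d111 * n_eff

        print(bragg_peak_nm(300, n_pore=1.00))   # air-filled cavities: ~560 nm (green)
        # Filling the cavities with an index-matched solvent (n_pore ~ 1.48) removes the
        # refractive-index contrast, so diffraction vanishes and the film turns transparent.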

  9. Large-scale fuel cycle centres

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The US Nuclear Regulatory Commission (NRC) has considered the nuclear energy centre concept for fuel cycle plants in the Nuclear Energy Centre Site Survey 1975 (NECSS-75) Rep. No. NUREG-0001, an important study mandated by the US Congress in the Energy Reorganization Act of 1974 which created the NRC. For this study, the NRC defined fuel cycle centres as consisting of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle centre sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000MW(e). The types of fuel cycle facilities located at the fuel cycle centre permit the assessment of the role of fuel cycle centres in enhancing the safeguard of strategic special nuclear materials - plutonium and mixed oxides. Siting fuel cycle centres presents a smaller problem than siting reactors. A single reprocessing plant of the scale projected for use in the USA (1500-2000t/a) can reprocess fuel from reactors producing 50,000-65,000MW(e). Only two or three fuel cycle centres of the upper limit size considered in the NECSS-75 would be required in the USA by the year 2000. The NECSS-75 fuel cycle centre evaluation showed that large-scale fuel cycle centres present no real technical siting difficulties from a radiological effluent and safety standpoint. Some construction economies may be achievable with fuel cycle centres, which offer opportunities to improve waste-management systems. Combined centres consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centres can eliminate shipment not only of Pu but also mixed-oxide fuel. Increased fuel cycle costs result from implementation of combined centres unless the fuel reprocessing plants are commercial-sized. Development of Pu-burning reactors could reduce any economic penalties of combined centres. The need for effective fissile

  10. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, and then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases.

  11. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  12. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  13. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  14. 77 FR 58416 - Large Scale Networking (LSN); Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2012-09-20

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Middleware and Grid Interagency Coordination (MAGIC) Team AGENCY: The Networking and Information Technology Research and Development (NITRD..., Grid, and cloud projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group...

  15. 78 FR 70076 - Large Scale Networking (LSN)-Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2013-11-22

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Middleware and Grid Interagency Coordination (MAGIC) Team AGENCY: The Networking and Information Technology Research and Development (NITRD... projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group (CG). Public...

  16. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

    [The record text is OCR residue from the scanned report documentation page; the recoverable details are: 'Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism', G. Agha et al., Artificial Intelligence Laboratory, Massachusetts Institute of Technology, 545 Technology Square, Cambridge.]

  17. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

    Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshop's objectives were...

  18. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  19. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, railways and other civil engineering (water)works are tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  20. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  1. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  2. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter.

  3. Application of GRA method, dynamic analysis and fuzzy set theory in evaluation and selection of emergency treatment technology for large scale phenol spill incidents

    Science.gov (United States)

    Zhao, Jingjing; Yu, Lean; Li, Lian

    2017-05-01

    Selecting an appropriate technology in an emergency response is a very important issue, as various kinds of chemical contingency spills frequently take place. Due to the complexity, fuzziness and uncertainty of chemical contingency spills, the GRA method and dynamic analysis, combined with fuzzy set theory, are applied to the selection and evaluation of emergency treatment technologies. Finally, an emergency phenol spill accident that occurred on a highway is used to illustrate the applicability and feasibility of the proposed methods.
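
    The abstract does not include the computation itself; the sketch below shows a generic grey relational analysis (GRA) ranking step under assumed data. The decision matrix, criteria types and weights are hypothetical placeholders, and the fuzzy and dynamic extensions described in the paper are not reproduced.

```python
import numpy as np

def grey_relational_grades(matrix, benefit, weights=None, zeta=0.5):
    """Rank alternatives (rows) against criteria (columns) with classic GRA.

    matrix  -- raw decision matrix, shape (n_alternatives, n_criteria)
    benefit -- boolean per criterion: True = larger is better, False = smaller is better
    zeta    -- distinguishing coefficient, conventionally 0.5
    """
    m = matrix.astype(float)
    lo, hi = m.min(axis=0), m.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)            # avoid division by zero
    norm = np.where(benefit, (m - lo) / span, (hi - m) / span)

    delta = np.abs(1.0 - norm)                        # distance to the ideal (all-ones) sequence
    coeff = (delta.min() + zeta * delta.max()) / (delta + zeta * delta.max())

    if weights is None:
        weights = np.full(m.shape[1], 1.0 / m.shape[1])
    return coeff @ weights                            # grey relational grade per alternative

# Hypothetical example: 3 candidate treatment technologies scored on
# response time (min, cost criterion), removal efficiency (%, benefit) and cost (cost).
scores = np.array([[30.0, 92.0, 120.0],
                   [45.0, 97.0, 200.0],
                   [20.0, 85.0, 90.0]])
grades = grey_relational_grades(scores, benefit=np.array([False, True, False]))
print(grades, "-> best alternative:", int(grades.argmax()))
```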

  4. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives are overlapped within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
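
    The record describes the idea only at a high level; the following sketch shows one possible way to pick a drawing style per region from nodal density. The thresholds, style names and grid size are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def choose_styles(node_xy, extent, cells=8, dense=50, medium=10):
    """Assign a rendering style to each grid cell based on how many nodes fall in it.

    node_xy -- (n, 2) array of node positions
    extent  -- (xmin, xmax, ymin, ymax) of the display area
    Thresholds `dense` and `medium` (nodes per cell) are illustrative only.
    """
    xmin, xmax, ymin, ymax = extent
    counts, _, _ = np.histogram2d(node_xy[:, 0], node_xy[:, 1],
                                  bins=cells,
                                  range=[[xmin, xmax], [ymin, ymax]])
    styles = np.full(counts.shape, "node-link", dtype=object)   # sparse: draw every node and edge
    styles[counts >= medium] = "clustered"                      # medium: collapse subtrees to glyphs
    styles[counts >= dense] = "density-map"                     # dense: show an aggregated heat map
    return styles

rng = np.random.default_rng(0)
positions = rng.random((2000, 2))
print(choose_styles(positions, extent=(0, 1, 0, 1)))
```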

  5. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...
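
    As an illustration of the kind of regularization the thesis studies, the sketch below computes Tikhonov solutions of a small ill-posed least-squares problem over a range of regularization parameters and records the residual and solution norms that make up the L-curve. The test problem, parameter grid and crude corner pick are arbitrary stand-ins, not the MOORe Tools code referred to in the record.

```python
import numpy as np

# A small, deliberately ill-conditioned test problem (a stand-in for a discretized inverse problem).
n = 64
U, _ = np.linalg.qr(np.random.default_rng(1).normal(size=(n, n)))
V, _ = np.linalg.qr(np.random.default_rng(2).normal(size=(n, n)))
s = np.logspace(0, -8, n)                      # rapidly decaying singular values
A = U @ np.diag(s) @ V.T
x_true = np.sin(np.linspace(0, np.pi, n))
b = A @ x_true + 1e-4 * np.random.default_rng(3).normal(size=n)

# Tikhonov solutions x_lam = argmin ||A x - b||^2 + lam^2 ||x||^2 for a grid of lambdas.
lambdas = np.logspace(-8, 0, 30)
residual_norms, solution_norms = [], []
for lam in lambdas:
    x_lam = np.linalg.solve(A.T @ A + lam**2 * np.eye(n), A.T @ b)
    residual_norms.append(np.linalg.norm(A @ x_lam - b))
    solution_norms.append(np.linalg.norm(x_lam))

# The "corner" of the log-log L-curve balances the two norms; this proxy simply minimizes
# the product of the norms (real corner-finding uses the curvature of the L-curve).
corner = int(np.argmin(np.array(residual_norms) * np.array(solution_norms)))
print("chosen lambda ~", lambdas[corner])
```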

  6. Fractals and cosmological large-scale structure

    Science.gov (United States)

    Luo, Xiaochun; Schramm, David N.

    1992-01-01

    Observations of galaxy-galaxy and cluster-cluster correlations as well as other large-scale structure can be fit with a 'limited' fractal with dimension D of about 1.2. This is not a 'pure' fractal out to the horizon: the distribution shifts from power law to random behavior at some large scale. If the observed patterns and structures are formed through an aggregation growth process, the fractal dimension D can serve as an interesting constraint on the properties of the stochastic motion responsible for limiting the fractal structure. In particular, it is found that the observed fractal should have grown from two-dimensional sheetlike objects such as pancakes, domain walls, or string wakes. This result is generic and does not depend on the details of the growth process.
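
    For orientation, the fractal dimension quoted above is related to the number counts and the two-point correlation function through the standard relations below; this is a textbook statement rather than an equation reproduced from the paper, with r the separation scale and D the fractal dimension.

```latex
N(<r) \propto r^{D}, \qquad \xi(r) \propto r^{-(3-D)}, \qquad D \simeq 1.2 \;\Rightarrow\; \gamma = 3 - D \simeq 1.8 .
```

    The implied slope γ ≈ 1.8 is the familiar value measured for the galaxy-galaxy correlation function, which is why D ≈ 1.2 fits the observations cited in the abstract.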

  7. Analysis using large-scale ringing data

    OpenAIRE

    Baillie, S. R.; Doherty, P. F.

    2004-01-01

    Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchro...

  8. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  9. Large-scale computer-aided design

    OpenAIRE

    Adeli, Hojjat

    1997-01-01

    The author and his associates have been working on creating novel design theories and computational models with two broad objectives: automation and optimization. This paper is a summary of the author's Keynote Lecture based on the research done by the author and his associates recently. Novel neurocomputing algorithms are presented for large-scale computer-aided design and optimization. This research demonstrates how a new level is achieved in design automation through the ingenious use and...

  10. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamic and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamic and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  11. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    Lukacs, B.; Meszaros, A.

    1984-12-01

    The compatibility of cosmologic principles and possible large scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle which is still compatible with reasonable inhomogeneities, is a full conformal symmetry in the 3-space defined by the cosmological velocity field, but even in such a case, the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  12. The consistency problems of large scale structure

    International Nuclear Information System (INIS)

    Schramm, D.N.

    1986-01-01

    Studies of the early universe are reviewed, with emphasis on galaxy formation, dark matter and the generation of large scale structure. The paper was presented at the conference on ''The early universe and its evolution'', Erice, Italy, 1986. Dark matter, Big Bang nucleosynthesis, baryonic halos, flatness arguments, cosmological constant, galaxy formation, neutrinos plus strings or explosions and string models, are all discussed. (U.K.)

  13. Large-Scale Visual Data Analysis

    Science.gov (United States)

    Johnson, Chris

    2014-04-01

    Modern high performance computers have speeds measured in petaflops and handle data set sizes measured in terabytes and petabytes. Although these machines offer enormous potential for solving very large-scale realistic computational problems, their effectiveness will hinge upon the ability of human experts to interact with their simulation results and extract useful information. One of the greatest scientific challenges of the 21st century is to effectively understand and make use of the vast amount of information being produced. Visual data analysis will be among our most important tools in helping to understand such large-scale information. Our research at the Scientific Computing and Imaging (SCI) Institute at the University of Utah has focused on innovative, scalable techniques for large-scale 3D visual data analysis. In this talk, I will present state-of-the-art visualization techniques, including scalable visualization algorithms and software, cluster-based visualization methods and innovative visualization techniques applied to problems in computational science, engineering, and medicine. I will conclude with an outline of future high performance visualization research challenges and opportunities.

  14. Boosting the adoption and the reliability of renewable energy sources: Mitigating the large-scale wind power intermittency through vehicle to grid technology

    International Nuclear Information System (INIS)

    Zhao, Yang; Noori, Mehdi; Tatari, Omer

    2017-01-01

    The integration of wind energy in the electricity sector and the adoption of electric vehicles in the transportation sector both have the potential to significantly reduce greenhouse gas emissions individually as well as in tandem with Vehicle-to-Grid technology. This study aims to evaluate the greenhouse gas emission savings of mitigating intermittency resulting from the introduction of wind power through Vehicle-to-Grid technologies, as well as the extent to which the marginal electricity consumption from charging an electric vehicle fleet may weaken this overall environmental benefit. To this end, the comparisons are conducted in seven independent system operator regions. The results indicate that, in most cases, the emission savings of a combination of wind power and Vehicle-to-Grid technology outweighs the additional emissions from marginal electricity generation for electric vehicles. In addition, the fluctuations in newly-integrated wind power could be balanced in the future using EVs and V2G technology, provided that a moderate portion of EV owners is willing to provide V2G services. On the other hand, such a combination is not favorable if the Vehicle-to-Grid service participation rate is less than 5% of all electric vehicle owners within a particular region. - Highlights: • The environmental benefit of vehicle to grid systems as grid stabilizer is analyzed. • Emission savings of vehicle to grid and impacts of electric vehicles are compared. • Seven independent system operator regions are studied. • Uncertainty and sensitivity analysis are performed through a Monte Carlo Simulation.
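
    The record mentions Monte Carlo treatment of uncertainty; the sketch below is a generic illustration of that idea with entirely hypothetical numbers (the participation rate range, per-MWh emission factors and charging emissions are placeholders), not the study's model or data.

```python
import numpy as np

rng = np.random.default_rng(42)
n_draws = 100_000

# Hypothetical inputs; the distributions and magnitudes are placeholders for illustration only.
participation = rng.uniform(0.00, 0.20, n_draws)                     # share of EV owners offering V2G
wind_balanced_mwh = participation * rng.normal(5e5, 5e4, n_draws)    # extra wind energy V2G can absorb
avoided_tco2 = wind_balanced_mwh * rng.normal(0.45, 0.05, n_draws)   # tCO2 avoided per MWh displaced
extra_charging_tco2 = rng.normal(1.5e4, 2e3, n_draws)                # marginal emissions from EV charging

net_savings = avoided_tco2 - extra_charging_tco2
print("P(net savings > 0) =", (net_savings > 0).mean())
print("P(net savings > 0 | participation < 5%) =",
      (net_savings[participation < 0.05] > 0).mean())
```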

  15. Large scale structure statistics: Finite volume effects

    Science.gov (United States)

    Colombi, S.; Bouchet, F. R.; Schaeffer, R.

    1994-01-01

    We study finite volume effects on the count probability distribution function P_N(l) and the averaged Q-body correlations ξ̄_Q (2 ≤ Q ≤ 5). These statistics are computed for cubic cells of size l. We use as an example the case of the matter distribution of a cold dark matter (CDM) universe involving approximately 3 × 10^5 particles. The main effect of the finiteness of the sampled volume is to induce an abrupt cut-off on the function P_N(l) at large N. This clear signature makes an analysis of the consequences easy, and one can envisage a correction procedure. As a matter of fact, we demonstrate how an unfair sample can strongly affect the estimates of the functions ξ̄_Q for Q ≥ 3 (and decrease the measured zero of the two-body correlation function). We propose a method to correct for this artefact, or at least to evaluate the corresponding errors. We show that the correlations are systematically underestimated by direct measurements. We find that, once corrected, the statistical properties of the CDM universe appear compatible with the scaling relation S_Q ≡ ξ̄_Q/ξ̄_2^(Q-1) = constant with respect to scale, in the non-linear regime; this was not the case with direct measurements. However, we note a deviation from scaling at scales close to the correlation length. It is probably due to the transition between the highly non-linear regime and the weakly correlated regime, where the functions S_Q also seem to present a plateau. We apply the same procedure to simulations with hot dark matter (HDM) and white noise initial conditions, with similar results. Our method thus provides the first accurate measurement of the normalized skewness, S_3, and the normalized kurtosis, S_4, for three typical models of large scale structure formation in an expanding universe.
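
    In the notation above, the hierarchical scaling relation tested in the paper reads as follows (these are the standard definitions; the paper's own normalization conventions may differ in detail):

```latex
S_Q \equiv \frac{\bar{\xi}_Q}{\bar{\xi}_2^{\,Q-1}} = \text{const.}, \qquad
S_3 = \frac{\bar{\xi}_3}{\bar{\xi}_2^{2}} \;\;(\text{normalized skewness}), \qquad
S_4 = \frac{\bar{\xi}_4}{\bar{\xi}_2^{3}} \;\;(\text{normalized kurtosis}).
```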

  16. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  17. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Harrington, Kathleen; Marriange, Tobias; Aamir, Ali; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  18. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). This observational evidence has led us to a broad consensus on the so-called LambdaCDM cosmological model and to tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement in cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, which is quantified by its direction and amplitude. We measured a huge dipolar modulation in the CMB, which mainly originated from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a number of well-identified objects. In this thesis, we explore measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE matched with 2MASS sources. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for detection of the kinematic dipole in future surveys.

  19. Large-scale impacts of hydroelectric development

    International Nuclear Information System (INIS)

    Rosenberg, D.M.; Bodaly, R.A.; Hecky, R.E.; Rudd, J.W.M.; Berkes, F.; Kelly, C.A.

    1997-01-01

    A study was conducted in which the cumulative environmental effects of mega-hydroelectric development projects such as the James Bay development in Canada, the Sardar Sarovar development in India and the Three Gorges development in China were examined. The extent of flooding as a result of these projects and of many others around the world was presented. The study showed that several factors are responsible for methyl mercury (MeHg) bioaccumulation in reservoirs. The study also revealed that reservoirs can be a significant source of greenhouse gas emissions. Boreal forests in particular, when flooded, become a strong source of greenhouse gases to the atmosphere. This results from the fact that after flooding a boreal forest changes from being a small carbon sink to a large source of carbon to the atmosphere, due to stimulated microbial production of CO2 and CH4 by decomposition of plant tissues and peat. This increased decomposition also results in an increase of another microbial activity, namely the methylation of inorganic mercury to the much more toxic MeHg. Selected examples of the downstream effects of altered flows caused by large-scale hydroelectric developments world-wide were summarized. A similar tabulation provided examples of social impacts of relocation of people necessitated by large-scale hydroelectric development. 209 refs., 10 tabs., 3 figs

  20. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    Full Text Available In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics (Majority) tool, considering the minimum area condition in cartographic generalization, in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with a hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
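
    A minimal sketch of the generalization step described above, a focal majority filter applied to a classified landform raster, is given below; it uses scipy's generic_filter as a stand-in for the GIS Focal Statistics (Majority) tool, and the window size and class codes are assumptions.

```python
import numpy as np
from scipy.ndimage import generic_filter

def majority(values):
    """Return the most frequent class code in the moving window."""
    counts = np.bincount(values.astype(int))
    return counts.argmax()

# Hypothetical classified landform raster (integer class codes per cell).
rng = np.random.default_rng(7)
landform = rng.integers(0, 5, size=(200, 200))

# Focal majority with a 5x5 window, analogous to Focal Statistics (Majority);
# larger windows give stronger generalization for smaller target map scales.
generalized = generic_filter(landform, majority, size=5, mode="nearest")
print(generalized.shape)
```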

  1. Modelling large-scale hydrogen infrastructure development

    International Nuclear Information System (INIS)

    De Groot, A.; Smit, R.; Weeda, M.

    2005-08-01

    In modelling a possible H2 infrastructure development the following questions are answered in this presentation: How could the future demand for H2 develop in the Netherlands?; and In which year and where would it be economically viable to construct a H2 infrastructure in the Netherlands? Conclusions are that: A model for describing a possible future H2 infrastructure is successfully developed; The model is strongly regional and time dependent; Decrease of fuel cell cost appears to be a sensitive parameter for development of H2 demand; Cost-margin between large-scale and small-scale H2 production is a main driver for development of a H2 infrastructure; A H2 infrastructure seems economically viable in the Netherlands starting from the year 2022

  2. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square), with the majority of images having a proportion larger than one, but less than e.g. the golden ratio. Furthermore, more images have the inverse proportion, meaning that portrait paintings are more common than landscape paintings. The inverse is true for photographs, i.e. more landscape than portrait format.
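
    A tiny sketch of the kind of measurement behind this analysis, computing width/height proportions for a folder of images and comparing them to the golden ratio, is shown below; the directory path and the use of Pillow are assumptions, not the study's pipeline.

```python
from pathlib import Path
from PIL import Image

GOLDEN = (1 + 5 ** 0.5) / 2   # ~1.618

def proportions(folder):
    """Yield width/height for every JPEG image found in `folder` (hypothetical location)."""
    for path in Path(folder).glob("*.jpg"):
        with Image.open(path) as img:
            w, h = img.size
            yield w / h

ratios = list(proportions("paintings/"))          # placeholder directory
if ratios:
    landscape = sum(r > 1 for r in ratios) / len(ratios)
    near_golden = sum(abs(r - GOLDEN) < 0.05 for r in ratios) / len(ratios)
    print(f"{landscape:.0%} landscape format, {near_golden:.0%} close to the golden ratio")
```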

  3. Large scale phononic metamaterials for seismic isolation

    Energy Technology Data Exchange (ETDEWEB)

    Aravantinos-Zafiris, N. [Department of Sound and Musical Instruments Technology, Ionian Islands Technological Educational Institute, Stylianou Typaldou ave., Lixouri 28200 (Greece)]; Sigalas, M. M. [Department of Materials Science, University of Patras, Patras 26504 (Greece)]

    2015-08-14

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, thus raising the belief that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.

  4. Large scale phononic metamaterials for seismic isolation

    International Nuclear Information System (INIS)

    Aravantinos-Zafiris, N.; Sigalas, M. M.

    2015-01-01

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, thus raising the belief that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to calculate the band structures of the proposed metamaterials.

  5. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    Full Text Available This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  6. Large scale study of tooth enamel

    International Nuclear Information System (INIS)

    Bodart, F.; Deconninck, G.; Martin, M.T.

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces in a large scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred eighty samples of teeth were first analyzed using PIXE, backscattering and nuclear reaction techniques. The results were analyzed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population. (author)

  7. Large-scale ATLAS production on EGEE

    CERN Document Server

    Espinal, X; Walker, R

    2008-01-01

    In preparation for first data at the LHC, a series of Data Challenges, of increasing scale and complexity, have been performed. Large quantities of simulated data have been produced on three different Grids, integrated into the ATLAS production system. During 2006, the emphasis moved towards providing stable continuous production, as is required in the immediate run-up to first data, and thereafter. Here, we discuss the experience of the production done on EGEE resources, using submission based on the gLite WMS, CondorG and a system using Condor Glide-ins. The overall wall time efficiency of around 90% is largely independent of the submission method, and the dominant source of wasted CPU comes from data handling issues. The efficiency of grid job submission is significantly worse than this, and the glide-in method benefits greatly from factorising this out.

  8. Large-Scale Physical Separation of Depleted Uranium from Soil

    Science.gov (United States)

    2012-09-01

    [Only fragments of the record text survive extraction:] ... unweathered depleted uranium rods illustrating the formation of uranyl oxides and salts. Unfired penetrator rods can range from 10 to 50 cm in length ... specific area ratio (as thin sections, fine particles, or molten states). Uranium in finely divided form is prone to ignition. Uranium also has an ... [The remainder is cover-page residue: ERDC/EL TR-12-25, Army Range Technology Program, 'Large-Scale Physical Separation of Depleted Uranium from Soil', Environmental Laboratory.]

  9. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Full Text Available Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  10. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Cleary, Joseph

    2018-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an array of four telescopes designed to measure the polarization of the Cosmic Microwave Background. CLASS aims to detect the B-mode polarization from primordial gravitational waves predicted by cosmic inflation theory, as well as the imprint left by reionization upon the CMB E-mode polarization. This will be achieved through a combination of observing strategy and state-of-the-art instrumentation. CLASS is observing 70% of the sky to characterize the CMB at large angular scales, which will measure the entire CMB power spectrum from the reionization peak to the recombination peak. The four telescopes operate at frequencies of 38, 93, 145, and 217 GHz, in order to estimate Galactic synchrotron and dust foregrounds while avoiding atmospheric absorption. CLASS employs rapid polarization modulation to overcome atmospheric and instrumental noise. Polarization sensitive cryogenic detectors with low noise levels provide CLASS the sensitivity required to constrain the tensor-to-scalar ratio down to levels of r ~ 0.01 while also measuring the optical depth to reionization to sample-variance levels. These improved constraints on the optical depth to reionization are required to pin down the mass of neutrinos from complementary cosmological data. CLASS has completed a year of observations at 38 GHz and is in the process of deploying the rest of the telescope array. This poster provides an overview and update on the CLASS science, hardware and survey operations.

  11. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As the global competition among universities for the recruitment of the brightest brains has increased, effective internationalization measures have become hot topics for universities and LSRP alike. Nevertheless, most projects and universities are challenged with little experience on how to conduct these measures and make internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools to improve the capacity of the project and the research location. There is a variety of measures suited to supporting universities in international recruitment. These include e.g. institutional partnerships, research marketing, a welcome culture, support for science mobility and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful measures if interfaced in an effective way. On this poster we display a number of internationalization measures for various target groups, identify interfaces between project management, university administration, researchers and international partners to work together, exchange information and improve processes in order to be able to recruit, support and keep the brightest heads in the project.

  12. Large-scale Intelligent Transporation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented 'smart' vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.
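
    The architecture described above, vehicles as autonomous processes that plan their own routes and report to a Traffic Management Center, can be sketched roughly as follows; the process layout, message format and route logic are illustrative assumptions, not the ANL prototype.

```python
import multiprocessing as mp
import random
import time

def vehicle(vehicle_id, links, tmc_queue, advisories):
    """One autonomous vehicle process: picks the currently cheapest link and reports its position."""
    position = 0.0
    for _ in range(5):                                   # a few simulation steps
        link_times = advisories.get("link_times", {})    # advisories published by the TMC
        route = min(links, key=lambda l: link_times.get(l, 1.0))
        position += random.uniform(0.5, 1.5)
        tmc_queue.put((vehicle_id, route, position))     # probe-vehicle report to the TMC
        time.sleep(0.01)

if __name__ == "__main__":
    manager = mp.Manager()
    advisories = manager.dict({"link_times": {"A": 1.2, "B": 0.8, "C": 1.0}})
    reports = mp.Queue()
    fleet = [mp.Process(target=vehicle, args=(i, ["A", "B", "C"], reports, advisories))
             for i in range(4)]
    for p in fleet:
        p.start()
    for p in fleet:
        p.join()
    while not reports.empty():                           # TMC side: display probe-vehicle reports
        print("TMC received:", reports.get())
```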

  13. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    is the IBA Emscher Park in the Ruhr area in Germany. Over a 10 years period (1988-1998), more than a 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines......Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach......-construction designers simultaneously unfold local design issues, i.e. the design brief and possible design interventions. These non-linear explorative proceedings are similar to what researchers in science and technology studies have described as translation. Translation is a central concept within actor...

  14. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulation applications. The intention is to identify new research directions in this field and

  15. Cardiovascular autonomic neuropathy is associated with macrovascular risk factors in type 2 diabetes: new technology used for routine large-scale screening adds new insight.

    Science.gov (United States)

    Fleischer, Jesper; Yderstraede, Knud; Gulichsen, Elisabeth; Jakobsen, Poul Erik; Lervang, Hans Henrik; Eldrup, Ebbe; Nygaard, Hans; Tarnow, Lise; Ejskjaer, Niels

    2014-07-01

    The objective was to identify the presence of cardiovascular autonomic neuropathy (CAN) in a cohort of individuals with diabetes in outpatient clinics from 4 different parts of Denmark and to explore the difference between type 1 and type 2 diabetes in relation to CAN. The DAN-Study is a Danish multicenter study focusing on diabetic autonomic neuropathy. Over a period of 12 months, 382 type 1 and 271 type 2 individuals with diabetes were tested for CAN. Patients were randomly recruited and tested during normal visits to outpatient clinics at 4 Danish hospitals. The presence of CAN was quantified by performing 3 cardiovascular reflex tests (response to standing, deep breathing, and valsalva). To describe possible associations, multivariate analysis with CAN as the dependent variable was performed. The prevalence of CAN was higher among patients with type 2 diabetes (35%) compared to patients with type 1 diabetes (25%). Multivariate analysis revealed significant associations between CAN and different risk markers in the 2 populations. In type 1 diabetes patients CAN was associated with microalbuminuria (P < …), whereas in type 2 diabetes patients CAN was independently associated with high pulse pressure (P < …). CAN was thus linked to microvascular markers in type 1, whereas in type 2 CAN was associated with macrovascular risk factors. © 2014 Diabetes Technology Society.

  16. Predictors of Information Technology Integration in Secondary Schools: Evidence from a Large Scale Study of More than 30,000 Students.

    Science.gov (United States)

    Hew, Khe Foon; Tan, Cheng Yong

    2016-01-01

    The present study examined the predictors of information technology (IT) integration in secondary school mathematics lessons. The predictors pertained to IT resource availability in schools, school contextual/institutional variables, accountability pressure faced by schools, subject culture in mathematics, and mathematics teachers' pedagogical beliefs and practices. Data from 32,256 secondary school students from 2,519 schools in 16 developed economies who participated in the Program for International Student Assessment (PISA) 2012 were analyzed using hierarchical linear modeling (HLM). Results showed that after controlling for student-level (gender, prior academic achievement and socioeconomic status) and school-level (class size, number of mathematics teachers) variables, students in schools with more computers per student, with more IT resources, with higher levels of IT curricular expectations, with an explicit policy on the use of IT in mathematics, whose teachers believed in student-centered teaching-learning, and whose teachers provided more problem-solving activities in class reported higher levels of IT integration. On the other hand, students who studied in schools with more positive teacher-related school learning climate, and with more academically demanding parents reported lower levels of IT integration. Student-related school learning climate, principal leadership behaviors, schools' public posting of achievement data, tracking of school's achievement data by administrative authorities, and pedagogical and curricular differentiation in mathematics lessons were not related to levels of IT integration. Put together, the predictors explained a total of 15.90% of the school-level variance in levels of IT integration. In particular, school IT resource availability, and mathematics teachers' pedagogical beliefs and practices stood out as the most important determinants of IT integration in mathematics lessons.
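
    The hierarchical linear model used in the study (students nested in schools) can be approximated with a random-intercept mixed model; the sketch below uses statsmodels with hypothetical file and column names for a data set laid out like the PISA extract, and includes only a few of the predictors discussed.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical flat file: one row per student, with a school identifier column.
df = pd.read_csv("pisa2012_extract.csv")   # placeholder path and column names

model = smf.mixedlm(
    "it_integration ~ gender + prior_achievement + ses"          # student-level controls
    " + class_size + n_math_teachers"                             # school-level controls
    " + computers_per_student + it_resources + it_expectations"   # IT availability / policy
    " + student_centered_beliefs + problem_solving_activities",   # teacher beliefs and practices
    data=df,
    groups=df["school_id"],                                       # random intercept per school
)
result = model.fit()
print(result.summary())
```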

  17. Predictors of Information Technology Integration in Secondary Schools: Evidence from a Large Scale Study of More than 30,000 Students.

    Directory of Open Access Journals (Sweden)

    Khe Foon Hew

    Full Text Available The present study examined the predictors of information technology (IT) integration in secondary school mathematics lessons. The predictors pertained to IT resource availability in schools, school contextual/institutional variables, accountability pressure faced by schools, subject culture in mathematics, and mathematics teachers' pedagogical beliefs and practices. Data from 32,256 secondary school students from 2,519 schools in 16 developed economies who participated in the Program for International Student Assessment (PISA) 2012 were analyzed using hierarchical linear modeling (HLM). Results showed that after controlling for student-level (gender, prior academic achievement and socioeconomic status) and school-level (class size, number of mathematics teachers) variables, students in schools with more computers per student, with more IT resources, with higher levels of IT curricular expectations, with an explicit policy on the use of IT in mathematics, whose teachers believed in student-centered teaching-learning, and whose teachers provided more problem-solving activities in class reported higher levels of IT integration. On the other hand, students who studied in schools with more positive teacher-related school learning climate, and with more academically demanding parents reported lower levels of IT integration. Student-related school learning climate, principal leadership behaviors, schools' public posting of achievement data, tracking of school's achievement data by administrative authorities, and pedagogical and curricular differentiation in mathematics lessons were not related to levels of IT integration. Put together, the predictors explained a total of 15.90% of the school-level variance in levels of IT integration. In particular, school IT resource availability, and mathematics teachers' pedagogical beliefs and practices stood out as the most important determinants of IT integration in mathematics lessons.

  18. Implementation factors affecting the large-scale deployment of digital health and well-being technologies: A qualitative study of the initial phases of the 'Living-It-Up' programme.

    Science.gov (United States)

    Agbakoba, Ruth; McGee-Lennon, Marilyn; Bouamrane, Matt-Mouley; Watson, Nicholas; Mair, Frances S

    2016-12-01

    Little is known about the factors which facilitate or impede the large-scale deployment of health and well-being consumer technologies. The Living-It-Up project is a large-scale digital intervention led by NHS 24, aiming to transform health and well-being services delivery throughout Scotland. We conducted a qualitative study of the factors affecting the implementation and deployment of the Living-It-Up services. We collected a range of data during the initial phase of deployment, including semi-structured interviews (N = 6); participant observation sessions (N = 5) and meetings with key stakeholders (N = 3). We used the Normalisation Process Theory as an explanatory framework to interpret the social processes at play during the initial phases of deployment. Initial findings illustrate that it is clear - and perhaps not surprising - that the size and diversity of the Living-It-Up consortium made implementation processes more complex within a 'multi-stakeholder' environment. To overcome these barriers, there is a need to clearly define roles, tasks and responsibilities among the consortium partners. Furthermore, varying levels of expectations and requirements, as well as diverse cultures and ways of working, must be effectively managed. Factors which facilitated implementation included extensive stakeholder engagement, such as co-design activities, which can contribute to an increased 'buy-in' from users in the long term. An important lesson from the Living-It-Up initiative is that attempting to co-design innovative digital services, but at the same time, recruiting large numbers of users is likely to generate conflicting implementation priorities which hinder - or at least substantially slow down - the effective rollout of services at scale. The deployment of Living-It-Up services is ongoing, but our results to date suggest that - in order to be successful - the roll-out of digital health and well-being technologies at scale requires a delicate and pragmatic trade

  19. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    Linton, M.; Khalatbari, A.

    2011-01-01

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large scale epidemiological study has been launched by the Fukushima medical university. It concerns the two million inhabitants of the Fukushima Prefecture. On the national level and with the support of public funds, medical care and follow-up, as well as systematic controls, are foreseen, notably to check the thyroids of 360,000 young people less than 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low measured values, and because they know that some parts of the area are at least as contaminated as was the case around Chernobyl, some people are reluctant to go back home.

  20. [Stress management in large-scale establishments].

    Science.gov (United States)

    Fukasawa, Kenji

    2002-07-01

    Due to a recent dramatic change in industrial structures in Japan, the role of large-scale enterprises is changing. Mass production used to be the major income source of companies, but nowadays the emphasis has shifted to high value-added products, including software development. As a consequence of highly competitive inter-corporate development, there are various sources of job stress which induce health problems in employees, especially those concerned with development or management. Simply obeying the law or offering medical care is not enough to manage these problems. Occupational health staff need to act according to the disease type and provide care with support from the supervisor and the Personnel Division. For the training, development and consultation system, occupational health staff must work with the Personnel Division and the Safety Division, and be approved by management supervisors.

  1. Large - scale Rectangular Ruler Automated Verification Device

    Science.gov (United States)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces a large-scale rectangular ruler automated verification device, which consists of a photoelectric autocollimator, a self-designed mechanical drive car, and an automatic data acquisition system. The design of the mechanical structure of the device covers the optical axis design, the drive part, the fixture device and the wheel design. The design of the control system covers hardware and software: the hardware is mainly a single-chip microcomputer system, and the software implements the photoelectric autocollimator measurement and the automatic data acquisition process. The device can acquire verticality measurement data automatically. The reliability of the device is verified by experimental comparison. The results meet the requirements of the right angle test procedure.

  2. Large-scale digitizer system, analog converters

    International Nuclear Information System (INIS)

    Althaus, R.F.; Lee, K.L.; Kirsten, F.A.; Wagner, L.J.

    1976-10-01

    Analog to digital converter circuits that are based on the sharing of common resources, including those which are critical to the linearity and stability of the individual channels, are described. Simplicity of circuit composition is valued over other more costly approaches. These are intended to be applied in a large-scale processing and digitizing system for use with high-energy physics detectors such as drift-chambers or phototube-scintillator arrays. Signal distribution techniques are of paramount importance in maintaining adequate signal-to-noise ratio. Noise in both amplitude and time-jitter senses is held sufficiently low so that conversions with 10-bit charge resolution and 12-bit time resolution are achieved

  3. Large-scale sequential quadratic programming algorithms

    Energy Technology Data Exchange (ETDEWEB)

    Eldersveld, S.K.

    1992-09-01

    The problem addressed is the general nonlinear programming problem: finding a local minimizer for a nonlinear function subject to a mixture of nonlinear equality and inequality constraints. The methods studied are in the class of sequential quadratic programming (SQP) algorithms, which have previously proved successful for problems of moderate size. Our goal is to devise an SQP algorithm that is applicable to large-scale optimization problems, using sparse data structures and storing less curvature information but maintaining the property of superlinear convergence. The main features are: 1. The use of a quasi-Newton approximation to the reduced Hessian of the Lagrangian function. Only an estimate of the reduced Hessian matrix is required by our algorithm. The impact of not having available the full Hessian approximation is studied and alternative estimates are constructed. 2. The use of a transformation matrix Q. This allows the QP gradient to be computed easily when only the reduced Hessian approximation is maintained. 3. The use of a reduced-gradient form of the basis for the null space of the working set. This choice of basis is more practical than an orthogonal null-space basis for large-scale problems. The continuity condition for this choice is proven. 4. The use of incomplete solutions of quadratic programming subproblems. Certain iterates generated by an active-set method for the QP subproblem are used in place of the QP minimizer to define the search direction for the nonlinear problem. An implementation of the new algorithm has been obtained by modifying the code MINOS. Results and comparisons with MINOS and NPSOL are given for the new algorithm on a set of 92 test problems.
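
    As a minimal illustration of the class of problems SQP methods target, the sketch below solves a small nonlinearly constrained problem with SciPy's SLSQP, which is a dense, small-scale SQP implementation; it does not reproduce the large-scale reduced-Hessian algorithm or the MINOS-based code described above, and the test problem is made up.

```python
# Minimal illustration of the problem class SQP methods address:
# minimize a nonlinear objective subject to equality and inequality
# constraints.  SciPy's SLSQP is a small, dense SQP solver; the
# large-scale reduced-Hessian algorithm described above is not shown.
import numpy as np
from scipy.optimize import minimize

def objective(x):
    return (x[0] - 1.0) ** 2 + (x[1] - 2.5) ** 2

constraints = [
    {"type": "eq",   "fun": lambda x: x[0] + x[1] - 3.0},   # x0 + x1 = 3
    {"type": "ineq", "fun": lambda x: x[0] - 0.5},          # x0 >= 0.5
]

result = minimize(objective, x0=np.array([2.0, 0.0]),
                  method="SLSQP", constraints=constraints)
print(result.x, result.fun)
```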

  4. Large-scale stochasticity in Hamiltonian systems

    International Nuclear Information System (INIS)

    Escande, D.F.

    1982-01-01

    Large scale stochasticity (L.S.S.) in Hamiltonian systems is defined on the paradigm Hamiltonian H(v,x,t) =v 2 /2-M cos x-P cos k(x-t) which describes the motion of one particle in two electrostatic waves. A renormalization transformation Tsub(r) is described which acts as a microscope that focusses on a given KAM (Kolmogorov-Arnold-Moser) torus in phase space. Though approximate, Tsub(r) yields the threshold of L.S.S. in H with an error of 5-10%. The universal behaviour of KAM tori is predicted: for instance the scale invariance of KAM tori and the critical exponent of the Lyapunov exponent of Cantori. The Fourier expansion of KAM tori is computed and several conjectures by L. Kadanoff and S. Shenker are proved. Chirikov's standard mapping for stochastic layers is derived in a simpler way and the width of the layers is computed. A simpler renormalization scheme for these layers is defined. A Mathieu equation for describing the stability of a discrete family of cycles is derived. When combined with Tsub(r), it allows to prove the link between KAM tori and nearby cycles, conjectured by J. Greene and, in particular, to compute the mean residue of a torus. The fractal diagrams defined by G. Schmidt are computed. A sketch of a methodology for computing the L.S.S. threshold in any two-degree-of-freedom Hamiltonian system is given. (Auth.)
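
    The Chirikov standard map mentioned in the abstract is the usual discrete model for the onset of large-scale stochasticity, and is simple enough to iterate directly. The sketch below does so for one trajectory; the initial condition, stochasticity parameter K and iteration count are illustrative only.

```python
# Iterate the Chirikov standard map referred to above:
#   p' = p + K sin(x),  x' = x + p'  (mod 2*pi).
# K is the stochasticity parameter; near K ~ 1 the last KAM torus breaks
# up and large-scale stochasticity sets in.
import numpy as np

def standard_map(x0, p0, K, n_steps):
    xs, ps = [x0], [p0]
    x, p = x0, p0
    for _ in range(n_steps):
        p = p + K * np.sin(x)
        x = (x + p) % (2.0 * np.pi)
        xs.append(x)
        ps.append(p)
    return np.array(xs), np.array(ps)

# One trajectory near the critical coupling (illustrative values).
xs, ps = standard_map(x0=0.5, p0=0.2, K=0.97, n_steps=1000)
print(xs[:5], ps[:5])
```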

  5. 8th International Symposium on Intelligent Distributed Computing & Workshop on Cyber Security and Resilience of Large-Scale Systems & 6th International Workshop on Multi-Agent Systems Technology and Semantics

    CERN Document Server

    Braubach, Lars; Venticinque, Salvatore; Badica, Costin

    2015-01-01

    This book represents the combined peer-reviewed proceedings of the Eighth International Symposium on Intelligent Distributed Computing - IDC'2014, of the Workshop on Cyber Security and Resilience of Large-Scale Systems - WSRL-2014, and of the Sixth International Workshop on Multi-Agent Systems Technology and Semantics - MASTS-2014. All the events were held in Madrid, Spain, during September 3-5, 2014. The 47 contributions published in this book address several topics related to the theory and applications of intelligent distributed computing and multi-agent systems, including: agent-based data processing, ambient intelligence, collaborative systems, cryptography and security, distributed algorithms, grid and cloud computing, information extraction, knowledge management, big data and ontologies, social networks, swarm intelligence and videogames, amongst others.

  6. CLASS: The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Essinger-Hileman, Thomas; Ali, Aamir; Amiri, Mandana; Appel, John W.; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Chuss, David T.; hide

    2014-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is an experiment to measure the signature of a gravitational wave background from inflation in the polarization of the cosmic microwave background (CMB). CLASS is a multi-frequency array of four telescopes operating from a high-altitude site in the Atacama Desert in Chile. CLASS will survey 70% of the sky in four frequency bands centered at 38, 93, 148, and 217 GHz, which are chosen to straddle the Galactic-foreground minimum while avoiding strong atmospheric emission lines. This broad frequency coverage ensures that CLASS can distinguish Galactic emission from the CMB. The sky fraction of the CLASS survey will allow the full shape of the primordial B-mode power spectrum to be characterized, including the signal from reionization at low multipole ℓ. Its unique combination of large sky coverage, control of systematic errors, and high sensitivity will allow CLASS to measure or place upper limits on the tensor-to-scalar ratio at a level of r = 0.01 and make a cosmic-variance-limited measurement of the optical depth to the surface of last scattering, tau. (c) (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE). Downloading of the abstract is permitted for personal use only.

  7. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous

  8. Biomass for energy - small scale technologies

    Energy Technology Data Exchange (ETDEWEB)

    Salvesen, F.; Joergensen, P.F. [KanEnergi, Rud (Norway)

    1997-12-31

    The bioenergy markets and potential in the EU region, the different types of biofuels, the energy technology, and the relevant applications of these for small-scale energy production are reviewed in this presentation.

  9. Low Power Analog Design in Scaled Technologies

    CERN Document Server

    Baschirotto, A; Cocciolo, G; D’Amico, S; De Matteis, M; Delizia, P

    2009-01-01

    In this paper an overview of the main issues in analog IC design in scaled CMOS technologies is presented. Decreasing the MOS channel length and gate oxide thickness has led to undoubted advantages in terms of chip area, speed and power consumption (mainly exploited in the digital parts). However, some drawbacks are introduced in terms of power leakage and reliability. Moreover, the lower supply voltage required by scaled technologies has led analog designers to find new circuit solutions to guarantee the required performance.

  10. Multidimensional quantum entanglement with large-scale integrated optics.

    Science.gov (United States)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong; Santagati, Raffaele; Skrzypczyk, Paul; Salavrakos, Alexia; Tura, Jordi; Augusiak, Remigiusz; Mančinska, Laura; Bacco, Davide; Bonneau, Damien; Silverstone, Joshua W; Gong, Qihuang; Acín, Antonio; Rottwitt, Karsten; Oxenløwe, Leif K; O'Brien, Jeremy L; Laing, Anthony; Thompson, Mark G

    2018-04-20

    The ability to control multidimensional quantum systems is central to the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control, and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimensions up to 15 × 15 on a large-scale silicon photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality, and controllability of our multidimensional technology, and further exploit these abilities to demonstrate previously unexplored quantum applications, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies. Copyright © 2018 The Authors, some rights reserved; exclusive licensee American Association for the Advancement of Science. No claim to original U.S. Government Works.

  11. Computational solutions to large-scale data management and analysis.

    Science.gov (United States)

    Schadt, Eric E; Linderman, Michael D; Sorenson, Jon; Lee, Lawrence; Nolan, Garry P

    2010-09-01

    Today we can generate hundreds of gigabases of DNA and RNA sequencing data in a week for less than US$5,000. The astonishing rate of data generation by these low-cost, high-throughput technologies in genomics is being matched by that of other technologies, such as real-time imaging and mass spectrometry-based flow cytometry. Success in the life sciences will depend on our ability to properly interpret the large-scale, high-dimensional data sets that are generated by these technologies, which in turn requires us to adopt advances in informatics. Here we discuss how we can master the different types of computational environments that exist - such as cloud and heterogeneous computing - to successfully tackle our big data problems.

  12. Large scale molecular simulations of nanotoxicity.

    Science.gov (United States)

    Jimenez-Cruz, Camilo A; Kang, Seung-gu; Zhou, Ruhong

    2014-01-01

    The widespread use of nanomaterials in biomedical applications has been accompanied by an increasing interest in understanding their interactions with tissues, cells, and biomolecules, and in particular, on how they might affect the integrity of cell membranes and proteins. In this mini-review, we present a summary of some of the recent studies on this important subject, especially from the point of view of large scale molecular simulations. The carbon-based nanomaterials and noble metal nanoparticles are the main focus, with additional discussions on quantum dots and other nanoparticles as well. The driving forces for adsorption of fullerenes, carbon nanotubes, and graphene nanosheets onto proteins or cell membranes are found to be mainly hydrophobic interactions and the so-called π-π stacking (between aromatic rings), while for the noble metal nanoparticles the long-range electrostatic interactions play a bigger role. More interestingly, there is also growing evidence showing that nanotoxicity can have implications for the de novo design of nanomedicine. For example, the endohedral metallofullerenol Gd@C₈₂(OH)₂₂ is shown to inhibit tumor growth and metastasis by inhibiting the enzyme MMP-9, and graphene is shown to disrupt bacterial cell membranes by insertion/cutting as well as destructive extraction of lipid molecules. These recent findings have provided a better understanding of nanotoxicity at the molecular level and also suggested therapeutic potential by using the cytotoxicity of nanoparticles against cancer or bacterial cells. © 2014 Wiley Periodicals, Inc.

  13. Large-scale tides in general relativity

    Energy Technology Data Exchange (ETDEWEB)

    Ip, Hiu Yan; Schmidt, Fabian, E-mail: iphys@mpa-garching.mpg.de, E-mail: fabians@mpa-garching.mpg.de [Max-Planck-Institut für Astrophysik, Karl-Schwarzschild-Str. 1, 85741 Garching (Germany)

    2017-02-01

    Density perturbations in cosmology, i.e. spherically symmetric adiabatic perturbations of a Friedmann-Lemaître-Robertson-Walker (FLRW) spacetime, are locally exactly equivalent to a different FLRW solution, as long as their wavelength is much larger than the sound horizon of all fluid components. This fact is known as the 'separate universe' paradigm. However, no such relation is known for anisotropic adiabatic perturbations, which correspond to an FLRW spacetime with large-scale tidal fields. Here, we provide a closed, fully relativistic set of evolutionary equations for the nonlinear evolution of such modes, based on the conformal Fermi (CFC) frame. We show explicitly that the tidal effects are encoded by the Weyl tensor, and are hence entirely different from an anisotropic Bianchi I spacetime, where the anisotropy is sourced by the Ricci tensor. In order to close the system, certain higher derivative terms have to be dropped. We show that this approximation is equivalent to the local tidal approximation of Hui and Bertschinger [1]. We also show that this very simple set of equations matches the exact evolution of the density field at second order, but fails at third and higher order. This provides a useful, easy-to-use framework for computing the fully relativistic growth of structure at second order.

  14. Wind technology development: Large and small turbines

    Science.gov (United States)

    Thresher, R. W.; Hock, S. M.; Loose, R. R.; Goldman, P.

    1994-12-01

    Wind technology has developed rapidly over the last decade with the design and development of advanced systems with improved performance, higher reliability, and lower costs. During the past several years, substantial gains have been made in wind turbine designs, lowering costs to an average of $0.05/kWh while further technology development is expected to allow the cost to drop below $0.04/kWh by 2000. As a result, wind is expected to be one of the least expensive forms of new electric generation in the next century. This paper will present the technology developments for both utility-scale wind turbines and remote, small-village wind turbines that are currently available or in development. Technology innovations are being adapted for remote and stand-alone power applications with smaller wind turbines. Hybrid power systems using smaller 1 to 50 (kW) wind turbines are being developed for non-grid-connected electrical generation applications. These village power systems typically use wind energy, photovoltaics, battery storage, and conventional diesel generators to power remote communities. Smaller turbines are being explored for application as distributed generation sources on utility grids to supply power during periods of peak demand, avoiding costly upgrades in distribution equipment. New turbine designs now account for turbulence-induced loads, unsteady aerodynamic stall effects, and complex fatigue loads, making use of new technology developments such as advanced airfoils. The new airfoils increase the energy capture, improve the operating efficiency, and reduce the sensitivity of the airfoils to operation roughness. Electronic controls are allowing variable rotor speed operation; while aerodynamic control devices, such as ailerons and flaps, are used to modulate power or stop the rotor in high-speed conditions. These technology trends and future turbine configurations are being sponsored and explored by the U.S. Department of Energy's Wind Energy Program.

  15. The Uneven Diffusion of Collaborative Technology in a Large Organization

    Science.gov (United States)

    Jarulaitis, Gasparas

    This paper investigates the large-scale diffusion of a collaborative technology in a range of different business contexts. The empirical data used in the article were obtained from a longitudinal (2007-2009) case study of a global oil and gas company (OGC). Our study reports on ongoing efforts to deploy an integrated collaborative system that uses Microsoft SharePoint (MSP) technology. We assess MSP as a configurational technology and analyze the diffusion of a metadata standard developed in-house, which forms an embedded component of MSP. We focus on two different organizational contexts, namely research and development (R&D) and oil and gas production (OGP), and illustrate the key differences between the ways in which configurational technology is managed and used in these contexts, which results in an uneven diffusion. In contrast with previous studies, we unravel the organizational and technological complexity involved, and thus empirically illustrate the flexibility of large-scale technology and show how the trajectories of the various components are influenced by multiple modes of ordering.

  16. Successful application of FTA Classic Card technology and use of bacteriophage phi29 DNA polymerase for large-scale field sampling and cloning of complete maize streak virus genomes.

    Science.gov (United States)

    Owor, Betty E; Shepherd, Dionne N; Taylor, Nigel J; Edema, Richard; Monjane, Adérito L; Thomson, Jennifer A; Martin, Darren P; Varsani, Arvind

    2007-03-01

    Leaf samples from 155 maize streak virus (MSV)-infected maize plants were collected from 155 farmers' fields in 23 districts in Uganda in May/June 2005 by leaf-pressing infected samples onto FTA Classic Cards. Viral DNA was successfully extracted from cards stored at room temperature for 9 months. The diversity of 127 MSV isolates was analysed by PCR-generated RFLPs. Six representative isolates having different RFLP patterns and causing either severe, moderate or mild disease symptoms, were chosen for amplification from FTA cards by bacteriophage phi29 DNA polymerase using the TempliPhi system. Full-length genomes were inserted into a cloning vector using a unique restriction enzyme site, and sequenced. The 1.3-kb PCR product amplified directly from FTA-eluted DNA and used for RFLP analysis was also cloned and sequenced. Comparison of cloned whole genome sequences with those of the original PCR products indicated that the correct virus genome had been cloned and that no errors were introduced by the phi29 polymerase. This is the first successful large-scale application of FTA card technology to the field, and illustrates the ease with which large numbers of infected samples can be collected and stored for downstream molecular applications such as diversity analysis and cloning of potentially new virus genomes.

  17. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. These planet-size data bring serious challenges to storage and computing technologies. Cloud computing is an alternative that can crack this nut because it addresses storage and high-performance computing on large-scale data together. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications can help biomedical research make the vast amount of diverse data meaningful and usable.

  18. Computational Approach to large Scale Process Optimization through Pinch Analysis

    Directory of Open Access Journals (Sweden)

    Nasser Al-Azri

    2015-08-01

    Since its debut in the last quarter of the twentieth century, pinch technology has become an established tool for efficient and cost-effective engineering process design. This method allows the integration of mass and heat streams in such a way that waste and the external purchase of mass and utilities are minimized. Moreover, integrating process streams internally will minimize fuel consumption and hence carbon emissions to the atmosphere. This paper discusses a programmable approach to the design of mass and heat exchange networks that can be used easily for large-scale engineering processes.
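
    As an illustration of the kind of computation such a programmable pinch-analysis approach automates, the sketch below implements the problem-table (heat cascade) step of heat-integration targeting for a handful of textbook-style streams; it is a minimal targeting sketch only, not the paper's full network-design method, and the stream data are illustrative.

```python
# Minimal problem-table (heat cascade) sketch for heat-integration targeting.
# Streams: (kind, supply T [C], target T [C], CP [MW/K]); data are illustrative.
DT_MIN = 10.0
streams = [
    ("hot",  250.0,  40.0, 0.15),
    ("hot",  200.0,  80.0, 0.25),
    ("cold",  20.0, 180.0, 0.20),
    ("cold", 140.0, 230.0, 0.30),
]

# Shift temperatures by DT_MIN/2 (hot down, cold up) and build intervals.
def shifted(kind, T):
    return T - DT_MIN / 2 if kind == "hot" else T + DT_MIN / 2

bounds = sorted({shifted(k, T) for k, Ts, Tt, cp in streams for T in (Ts, Tt)},
                reverse=True)

surpluses = []
for top, bottom in zip(bounds[:-1], bounds[1:]):
    net_cp = 0.0
    for kind, Ts, Tt, cp in streams:
        hi, lo = sorted((shifted(kind, Ts), shifted(kind, Tt)), reverse=True)
        if hi >= top and lo <= bottom:          # stream spans this interval
            net_cp += cp if kind == "hot" else -cp
    surpluses.append(net_cp * (top - bottom))

# Cascade the surpluses; the most negative cumulative value sets Q_hot,min.
cumulative, running = [], 0.0
for s in surpluses:
    running += s
    cumulative.append(running)
q_hot_min = max(0.0, -min(cumulative))
q_cold_min = q_hot_min + sum(surpluses)
pinch = [b for b, c in zip(bounds[1:], cumulative) if abs(c + q_hot_min) < 1e-9]
print(f"Q_hot,min = {q_hot_min:.1f} MW, Q_cold,min = {q_cold_min:.1f} MW, "
      f"pinch (shifted T) = {pinch}")
```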

  19. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  20. Developing Large-Scale Bayesian Networks by Composition

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, we investigate the use of Bayesian networks to construct large-scale diagnostic systems. In particular, we consider the development of large-scale...

  1. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
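
    The "reduce then sample" idea can be illustrated with a deliberately small toy: fit a cheap surrogate to an (assumed expensive) forward model offline, then run Metropolis-Hastings against the surrogate likelihood online. All models, data, prior and tuning values below are hypothetical stand-ins, not the SAGUARO codes or their reduced-order models.

```python
# Toy "reduce then sample" sketch: build a cheap polynomial surrogate of an
# (assumed expensive) forward model, then sample the posterior with
# random-walk Metropolis using only the surrogate.  Everything here is a
# hypothetical stand-in for the large-scale solvers described above.
import numpy as np

rng = np.random.default_rng(0)

def expensive_forward(theta):             # stand-in for a costly PDE solve
    return np.tanh(2.0 * theta) + 0.3 * theta

# Offline stage: a few full solves train the surrogate ("reduce").
train_theta = np.linspace(-2.0, 2.0, 15)
surrogate = np.polynomial.Chebyshev.fit(train_theta,
                                        expensive_forward(train_theta), deg=8)

# Synthetic observation and Gaussian likelihood (noise level assumed).
theta_true, sigma = 0.7, 0.05
y_obs = expensive_forward(theta_true) + rng.normal(0.0, sigma)

def log_post(theta):                      # flat prior on [-2, 2]
    if not -2.0 <= theta <= 2.0:
        return -np.inf
    return -0.5 * ((y_obs - surrogate(theta)) / sigma) ** 2

# Online stage: Metropolis-Hastings against the surrogate ("sample").
theta, lp, samples = 0.0, log_post(0.0), []
for _ in range(5000):
    prop = theta + 0.2 * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta)
print("posterior mean ~", np.mean(samples[1000:]))
```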

  2. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there are two main types of methods for geospatial database updating: direct updating with remote sensing images or field surveying materials, and indirect updating with other, already updated data such as newly updated larger-scale data. The former is the basis, because the update data in both methods ultimately originate from field surveying and remote sensing; the latter is often more economical and faster. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep multi-scale geospatial databases consistent. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. This is recognized as one of the most promising approaches, especially in a collaborative updating environment across map scales, i.e., where databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger scales in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale including technical
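
    Map generalization, mentioned above as the key technology for deriving a smaller-scale database from updated large-scale data, includes elementary operations such as line simplification. The sketch below applies Shapely's Douglas-Peucker `simplify()` to a hypothetical road centreline; it illustrates only this one operation, not the collaborative updating workflow described in the paper.

```python
# One elementary map-generalization operation used when deriving a smaller-
# scale representation from large-scale data: line simplification.
# Shapely's simplify() implements Douglas-Peucker; the coordinates and
# tolerance below are purely illustrative.
from shapely.geometry import LineString

detailed_road = LineString([
    (0.0, 0.0), (1.0, 0.1), (2.0, -0.1), (3.0, 0.05),
    (4.0, 0.0), (5.0, 2.0), (6.0, 2.1), (7.0, 2.0),
])

generalized_road = detailed_road.simplify(0.2, preserve_topology=True)
print(len(detailed_road.coords), "->", len(generalized_road.coords), "vertices")
```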

  3. Large-scale energy consumers pay less

    International Nuclear Information System (INIS)

    Denneman, A.

    2012-01-01

    The price of electricity in the Netherlands rose by 6 percent in the first quarter of 2012, whereas large business consumers are paying less. The natural gas price has risen by about 10 percent in the last year, both for households and for large business consumers. Meanwhile, households are paying twice as much for electricity and gas as large business consumers. [nl]

  4. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the magnetic field and the disk is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences for planetary formation. As a second step, I study the launching of disk winds via a global model of a stratified disk embedded in a warm atmosphere. This model is the first to compute non-ideal effects from

  5. Mechanisation of large-scale agricultural fields in developing countries - a review.

    Science.gov (United States)

    Onwude, Daniel I; Abdulstter, Rafia; Gomes, Chandima; Hashim, Norhashila

    2016-09-01

    Mechanisation of large-scale agricultural fields often requires the application of modern technologies such as mechanical power, automation, control and robotics. These technologies are generally associated with relatively well developed economies. The application of these technologies in some developing countries in Africa and Asia is limited by factors such as technology compatibility with the environment, availability of resources to facilitate the technology adoption, cost of technology purchase, government policies, adequacy of technology and appropriateness in addressing the needs of the population. As a result, many of the available resources have been used inadequately by farmers, who continue to rely mostly on conventional means of agricultural production, using traditional tools and equipment in most cases. This has led to low productivity and high production costs, among other problems. Therefore this paper attempts to evaluate the application of present-day technology and its limitations to the advancement of large-scale mechanisation in developing countries of Africa and Asia. Particular emphasis is given to a general understanding of the various levels of mechanisation, present-day technology, its management and application to large-scale agricultural fields. This review also emphasizes a future outlook that will enable a gradual, evolutionary and sustainable technological change. The study concludes that large-scale agricultural mechanisation for sustainable food production in Africa and Asia must be anchored on a coherent strategy based on the actual needs and priorities of the large-scale farmers. © 2016 Society of Chemical Industry.

  6. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  7. Large-scale visualization system for grid environment

    International Nuclear Information System (INIS)

    Suzuki, Yoshio

    2007-01-01

    Center for Computational Science and E-systems of Japan Atomic Energy Agency (CCSE/JAEA) has been conducting R&D on distributed computing (grid computing) environments: Seamless Thinking Aid (STA), Information Technology Based Laboratory (ITBL) and Atomic Energy Grid InfraStructure (AEGIS). In this R&D, we have developed visualization technology suitable for distributed computing environments. As one of the visualization tools, we have developed the Parallel Support Toolkit (PST), which can execute the visualization process in parallel on a computer. We have now improved PST so that it can execute simultaneously on multiple heterogeneous computers using the Seamless Thinking Aid Message Passing Interface (STAMPI). STAMPI, which we developed in the same R&D, is an MPI library executable on a heterogeneous computing environment. The improvement realizes the visualization of extremely large-scale data and enables more efficient visualization processes in a distributed computing environment. (author)

  8. Large-scale Ising-machines composed of magnetic neurons

    Science.gov (United States)

    Mizushima, Koichi; Goto, Hayato; Sato, Rie

    2017-10-01

    We propose Ising-machines composed of magnetic neurons, that is, magnetic bits in a recording track. In large-scale machines, the sizes of both neurons and synapses need to be reduced, and neat and smart connections among neurons are also required to achieve all-to-all connectivity among them. These requirements can be fulfilled by adopting magnetic recording technologies such as race-track memories and skyrmion tracks because the area of a magnetic bit is almost two orders of magnitude smaller than that of static random access memory, which has normally been used as a semiconductor neuron, and the smart connections among neurons are realized by using the read and write methods of these technologies.
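
    As a software-level illustration of the optimization task such an Ising machine performs, the sketch below anneals a small all-to-all coupled Ising model to find a low-energy spin configuration. The random couplings, system size and annealing schedule are arbitrary stand-ins; the hardware described above would replace these updates with magnetic-bit dynamics.

```python
# Toy illustration of the problem an Ising machine targets: find a low-energy
# configuration of N spins s_i = +-1 with all-to-all couplings J_ij,
#   E(s) = -1/2 * sum_{i,j} J_ij s_i s_j   (zero diagonal).
# Here it is solved in software by simulated annealing.
import numpy as np

rng = np.random.default_rng(1)
N = 32
J = np.triu(rng.normal(size=(N, N)), 1)
J = J + J.T                                  # symmetric couplings, zero diagonal

def energy(s):
    return -0.5 * s @ J @ s

s = rng.choice([-1, 1], size=N)
for T in np.geomspace(2.0, 0.05, 20000):     # slowly lowered temperature
    i = rng.integers(N)
    dE = 2.0 * s[i] * (J[i] @ s)             # energy change of flipping spin i
    if dE < 0 or rng.uniform() < np.exp(-dE / T):
        s[i] = -s[i]
print("final energy:", energy(s))
```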

  9. Multidimensional quantum entanglement with large-scale integrated optics

    DEFF Research Database (Denmark)

    Wang, Jianwei; Paesani, Stefano; Ding, Yunhong

    2018-01-01

    The ability to control multidimensional quantum systems is key for the investigation of fundamental science and for the development of advanced quantum technologies. We demonstrate a multidimensional integrated quantum photonic platform able to generate, control and analyze high-dimensional entanglement. A programmable bipartite entangled system is realized with dimension up to 15 × 15 on a large-scale silicon-photonics quantum circuit. The device integrates more than 550 photonic components on a single chip, including 16 identical photon-pair sources. We verify the high precision, generality and controllability of our multidimensional technology, and further exploit these abilities to demonstrate key quantum applications experimentally unexplored before, such as quantum randomness expansion and self-testing on multidimensional states. Our work provides an experimental platform for the development of multidimensional quantum technologies.

  10. HTS cables open the window for large-scale renewables

    International Nuclear Information System (INIS)

    Geschiere, A; Willen, D; Piga, E; Barendregt, P

    2008-01-01

    In a realistic approach to future energy consumption, the effects of sustainable power sources and the effects of growing welfare with increased use of electricity need to be considered. These factors lead to an increased transfer of electric energy over the networks. A dominant part of the energy need will come from expanded large-scale renewable sources. To use them efficiently over Europe, large energy transits between different countries are required. Bottlenecks in the existing infrastructure will be avoided by strengthening the network. For environmental reasons more infrastructure will be built underground. Nuon is studying the HTS technology as a component to solve these challenges. This technology offers a tremendously large power transport capacity as well as the possibility to reduce short circuit currents, making integration of renewables easier. Furthermore, power transport will be possible at lower voltage levels, giving the opportunity to upgrade the existing network while re-using it. This will result in large cost savings while reaching the future energy challenges. In a 6 km backbone structure in Amsterdam Nuon wants to install a 50 kV HTS Triax cable for a significant increase of the transport capacity, while developing its capabilities. Nevertheless several barriers have to be overcome

  11. Large scale road network generalization for vario-scale map

    NARCIS (Netherlands)

    Suba, R.; Meijers, B.M.; Van Oosterom, P.J.M.

    2015-01-01

    The classical approach for road network generalization consists of producing multiple maps, for a different scale or purpose, from a single detailed data source and quite often roads are represented by line objects. Our target is the generalization of a road network for the whole scale range from

  12. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , whereas Chapter 4 indicates that sugarcane outgrowers’ easy access to credit and technology and their high productivity compared to the plantation does not necessarily improve their income and asset stocks particularly when participation in outgrower schemes is mandatory, the buyer has monopsony market...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land.... On the other hand, the results in Chapter 4 show that participation in a sugarcane outgrower scheme has a negative impact on households’ income and total asset stock. From the findings in Chapter 3 it can be concluded that outgrower-operated plots have higher productivity than factory-operated plantations...

  13. Stability and Control of Large-Scale Dynamical Systems A Vector Dissipative Systems Approach

    CERN Document Server

    Haddad, Wassim M

    2011-01-01

    Modern complex large-scale dynamical systems exist in virtually every aspect of science and engineering, and are associated with a wide variety of physical, technological, environmental, and social phenomena, including aerospace, power, communications, and network systems, to name just a few. This book develops a general stability analysis and control design framework for nonlinear large-scale interconnected dynamical systems, and presents the most complete treatment on vector Lyapunov function methods, vector dissipativity theory, and decentralized control architectures. Large-scale dynami

  14. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  15. Parallel Index and Query for Large Scale Data Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chou, Jerry; Wu, Kesheng; Ruebel, Oliver; Howison, Mark; Qiang, Ji; Prabhat,; Austin, Brian; Bethel, E. Wes; Ryne, Rob D.; Shoshani, Arie

    2011-07-18

    Modern scientific datasets present numerous data management and analysis challenges. State-of-the-art index and query technologies are critical for facilitating interactive exploration of large datasets, but numerous challenges remain in terms of designing a system for processing general scientific datasets. The system needs to be able to run on distributed multi-core platforms, efficiently utilize underlying I/O infrastructure, and scale to massive datasets. We present FastQuery, a novel software framework that address these challenges. FastQuery utilizes a state-of-the-art index and query technology (FastBit) and is designed to process massive datasets on modern supercomputing platforms. We apply FastQuery to processing of a massive 50TB dataset generated by a large scale accelerator modeling code. We demonstrate the scalability of the tool to 11,520 cores. Motivated by the scientific need to search for interesting particles in this dataset, we use our framework to reduce search time from hours to tens of seconds.
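
    The core idea behind FastBit-style indexing can be illustrated in a few lines: precompute per-bin bitmaps for a variable once, then answer range queries by combining bitmaps rather than rescanning the raw values. The sketch below is a NumPy toy with hypothetical data and bin edges; it is not the FastQuery API and omits the compression and parallel I/O that make the real system scale.

```python
# Minimal illustration of the bitmap-index idea underlying FastBit/FastQuery:
# bin a variable once into per-bin boolean bitmaps, then answer range queries
# by combining bitmaps instead of rescanning raw values.  Data and bin edges
# are hypothetical; values beyond the last edge fall into the last bin.
import numpy as np

rng = np.random.default_rng(7)
energy = rng.exponential(scale=2.0, size=1_000_000)   # one "particle" variable

edges = np.linspace(0.0, 20.0, 41)                     # 40 equal-width bins
bin_ids = np.clip(np.digitize(energy, edges) - 1, 0, len(edges) - 2)
bitmaps = [(bin_ids == b) for b in range(len(edges) - 1)]

def range_query(lo, hi):
    """Rows with lo <= energy < hi: OR candidate bin bitmaps, refine edges."""
    first, last = np.searchsorted(edges, [lo, hi])
    mask = np.zeros(energy.shape, dtype=bool)
    for b in range(max(first - 1, 0), min(last, len(bitmaps))):
        mask |= bitmaps[b]
    return mask & (energy >= lo) & (energy < hi)

hits = range_query(5.0, 7.5)
print(hits.sum(), "of", energy.size, "rows selected")
```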

  16. High Fidelity Simulations of Large-Scale Wireless Networks

    Energy Technology Data Exchange (ETDEWEB)

    Onunkwo, Uzoma [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Benz, Zachary [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    The worldwide proliferation of wireless connected devices continues to accelerate. There are 10s of billions of wireless links across the planet with an additional explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies do not only provide convenience for mobile applications, but are also extremely cost-effective to deploy. Thus, this trend towards wireless connectivity will only continue and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulations (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively large turnaround time. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES and fail to scale (e.g., OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute computation workload while mitigating communication overhead associated with synchronizations. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia’s simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia’s current highly-regarded capabilities in large-scale emulations have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static and (b) the nodes have fixed locations.
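
    For readers unfamiliar with the DES style mentioned above, the sketch below shows the essential pattern - a priority queue of time-stamped events processed in order - for a toy set of wireless nodes broadcasting to neighbours within range. It is sequential and purely illustrative; it is not the proposed PDES tool, nor an ns-3 or OPNET model, and all parameters are made up.

```python
# Minimal sequential discrete-event simulation (DES) loop: events are kept in
# a priority queue ordered by time.  Toy model of nodes that periodically
# broadcast to neighbours within radio range; purely illustrative.
import heapq
import random

random.seed(0)
NODES = {i: (random.uniform(0, 100), random.uniform(0, 100)) for i in range(20)}
RANGE = 30.0                                   # radio range (illustrative)
received = {i: 0 for i in NODES}

events = [(random.uniform(0, 1), i, "send") for i in NODES]   # (time, node, kind)
heapq.heapify(events)

now = 0.0
while events and now < 10.0:
    now, node, kind = heapq.heappop(events)
    if kind == "send":
        x, y = NODES[node]
        for other, (ox, oy) in NODES.items():
            if other != node and (ox - x) ** 2 + (oy - y) ** 2 <= RANGE ** 2:
                heapq.heappush(events, (now + 0.001, other, "recv"))
        heapq.heappush(events, (now + 1.0, node, "send"))      # next beacon
    else:
        received[node] += 1

print("messages received per node:", received)
```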

  17. Fast large-scale reionization simulations

    NARCIS (Netherlands)

    Thomas, Rajat M.; Zaroubi, Saleem; Ciardi, Benedetta; Pawlik, Andreas H.; Labropoulos, Panagiotis; Jelic, Vibor; Bernardi, Gianni; Brentjens, Michiel A.; de Bruyn, A. G.; Harker, Geraint J. A.; Koopmans, Leon V. E.; Pandey, V. N.; Schaye, Joop; Yatawatta, Sarod; Mellema, G.

    2009-01-01

    We present an efficient method to generate large simulations of the epoch of reionization without the need for a full three-dimensional radiative transfer code. Large dark-matter-only simulations are post-processed to produce maps of the redshifted 21-cm emission from neutral hydrogen. Dark matter

  18. Challenges and options for large scale integration of wind power

    International Nuclear Information System (INIS)

    Tande, John Olav Giaever

    2006-01-01

    Challenges and options for large-scale integration of wind power are examined. Immediate challenges are related to weak grids. Assessment of system stability requires numerical simulation. Models are being developed - validation is essential. Coordination of wind and hydro generation is a key to allowing more wind power capacity in areas with limited transmission corridors. For the case-study grid, depending on technology and control, the allowed wind farm size is increased from 50 to 200 MW. The real-life example from 8 January 2005 demonstrates that existing market-based mechanisms can handle large amounts of wind power. In wind integration studies it is essential to take account of the controllability of modern wind farms, the power system flexibility and the smoothing effect of geographically dispersed wind farms. Modern wind farms contribute to system adequacy - combining wind and hydro constitutes a win-win system (ml)

  19. Towards large-scale plasma-assisted synthesis of nanowires

    Science.gov (United States)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents a three-dimensional problem of large-scale synthesis connected with the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  20. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research]|[Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results of large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
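
    A deliberately small example of the two-stage structure being solved: the sketch below builds a sample-average (extensive-form) LP for a newsvendor-style recourse problem and solves it with SciPy. It illustrates only the problem structure; the decomposition and importance-sampling machinery described above is not shown, and the costs and demand distribution are hypothetical.

```python
# Toy sample-average approximation of a two-stage stochastic LP with recourse
# (newsvendor style), solved as one extensive-form LP.  Illustrative only.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(42)
N = 200                                   # sampled demand scenarios
demand = rng.normal(100.0, 20.0, size=N)
c, q = 1.0, 3.0                           # first-stage order cost, shortage penalty

# Variables: [x, y_1, ..., y_N]; minimize  c*x + (1/N) * sum_s q*y_s
cost = np.concatenate([[c], np.full(N, q / N)])

# Recourse constraints  x + y_s >= d_s  rewritten as  -x - y_s <= -d_s
A_ub = np.hstack([-np.ones((N, 1)), -np.eye(N)])
b_ub = -demand

res = linprog(cost, A_ub=A_ub, b_ub=b_ub, bounds=(0, None), method="highs")
print("first-stage order quantity:", round(res.x[0], 2))
```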

  1. Large scale structure from viscous dark matter

    CERN Document Server

    Blas, Diego; Garny, Mathias; Tetradis, Nikolaos; Wiedemann, Urs Achim

    2015-01-01

    Cosmological perturbations of sufficiently long wavelength admit a fluid dynamic description. We consider modes with wavevectors below a scale $k_m$ for which the dynamics is only mildly non-linear. The leading effect of modes above that scale can be accounted for by effective non-equilibrium viscosity and pressure terms. For mildly non-linear scales, these mainly arise from momentum transport within the ideal and cold but inhomogeneous fluid, while momentum transport due to more microscopic degrees of freedom is suppressed. As a consequence, concrete expressions with no free parameters, except the matching scale $k_m$, can be derived from matching evolution equations to standard cosmological perturbation theory. Two-loop calculations of the matter power spectrum in the viscous theory lead to excellent agreement with $N$-body simulations up to scales $k=0.2 \\, h/$Mpc. The convergence properties in the ultraviolet are better than for standard perturbation theory and the results are robust with respect to varia...

  2. 78 FR 7464 - Large Scale Networking (LSN)-Middleware And Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2013-02-01

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Middleware And Grid Interagency Coordination (MAGIC) Team AGENCY: The Networking and Information Technology Research and Development (NITRD... middleware, Grid, and cloud projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating...

  3. Compressed Air Energy Storage – An Option for Medium to Large Scale Electrical-energy Storage

    OpenAIRE

    Budt, Marcus; Wolf, Daniel; Span, Roland; Yan, Jinyue

    2016-01-01

    This contribution presents the theoretical background of compressed air energy storage, examples for large scale application of this technology, chances and obstacles for its future development, and areas of research aiming at the development of commercially viable plants in the medium to large scale range.

  4. Pseudoscalar-photon mixing and the large scale alignment of QSO ...

    Indian Academy of Sciences (India)

    Pseudoscalar-photon mixing and the large scale alignment of QSO optical polarizations. Pankaj Jain, Sukanta Panda and S. Sarala, Physics Department, Indian Institute of Technology, Kanpur 208 016, India. Abstract: We review the observation of large scale alignment of QSO optical polarizations ...

  5. Superconductivity for Large Scale Wind Turbines

    Energy Technology Data Exchange (ETDEWEB)

    R. Fair; W. Stautner; M. Douglass; R. Rajput-Ghoshal; M. Moscinski; P. Riley; D. Wagner; J. Kim; S. Hou; F. Lopez; K. Haran; J. Bray; T. Laskaris; J. Rochford; R. Duckworth

    2012-10-12

    A conceptual design has been completed for a 10MW superconducting direct drive wind turbine generator employing low temperature superconductors for the field winding. Key technology building blocks from the GE Wind and GE Healthcare businesses have been transferred across to the design of this concept machine. Wherever possible, conventional technology and production techniques have been used in order to support the case for commercialization of such a machine. Appendices A and B provide further details of the layout of the machine and the complete specification table for the concept design. Phase 1 of the program has allowed us to understand the trade-offs between the various sub-systems of such a generator and its integration with a wind turbine. A Failure Modes and Effects Analysis (FMEA) and a Technology Readiness Level (TRL) analysis have been completed resulting in the identification of high risk components within the design. The design has been analyzed from a commercial and economic point of view and Cost of Energy (COE) calculations have been carried out with the potential to reduce COE by up to 18% when compared with a permanent magnet direct drive 5MW baseline machine, resulting in a potential COE of 0.075 $/kWh. Finally, a top-level commercialization plan has been proposed to enable this technology to be transitioned to full volume production. The main body of this report will present the design processes employed and the main findings and conclusions.

  6. High speed and large scale scientific computing

    CERN Document Server

    Gentzsch, W; Joubert, GR

    2010-01-01

    Over the years parallel technologies have completely transformed mainstream computing. This book deals with issues related to the area of cloud computing and discusses developments in grids, applications and information processing, as well as e-science. It is suitable for computer scientists, IT engineers and IT managers.

  7. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been additionally investigated.

  8. 77 FR 58415 - Large Scale Networking (LSN); Joint Engineering Team (JET)

    Science.gov (United States)

    2012-09-20

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN); Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO... agencies and non-Federal participants with interest in high performance research networking and networking...

  9. 78 FR 7464 - Large Scale Networking (LSN) ; Joint Engineering Team (JET)

    Science.gov (United States)

    2013-02-01

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN) ; Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination... interest in high performance research networking and networking to support science applications. The JET...

  10. Large scale solar district heating. Evaluation, modelling and designing

    Energy Technology Data Exchange (ETDEWEB)

    Heller, A.

    2000-07-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the tool for design studies and on a local energy planning case. The evaluation of the central solar heating technology is based on measurements on the case plant in Marstal, Denmark, and on published and unpublished data for other, mainly Danish, CSDHP plants. Evaluations of the thermal, economic and environmental performance are reported, based on the experience of the last decade. The measurements from the Marstal case are analysed, experiences extracted and minor improvements to the plant design proposed. For the detailed designing and energy planning of CSDHPs, a computer simulation model is developed and validated on the measurements from the Marstal case. The final model is then generalised to a 'generic' model for CSDHPs in general. The meteorological reference data set, the Danish Reference Year, is applied to find the mean performance of the plant designs. To find the expected variability of the thermal performance of such plants, a method is proposed in which data from a year with poor solar irradiation and a year with strong solar irradiation are applied. Equipped with the simulation tool, design studies are carried out, ranging from parameter analysis and energy planning for a new settlement to a proposal for combining flat-plate solar collectors with high-performance collectors, exemplified by a trough collector. The methodology of utilising computer simulation proved to be a cheap and relevant tool in the design of future solar heating plants. The thesis also exposed the need to develop computer models for the more advanced solar collector designs and especially for the control and operation of CSDHPs. In the final chapter, the CSDHP technology is put into perspective with respect to other possible technologies to assess the relevance of the application

  11. Modeling Human Behavior at a Large Scale

    Science.gov (United States)

    2012-01-01

    online messages, along with text analysis of those messages, enables us to predict the progress of a contagion from person to person at a population scale...tation, we represent probabilities and likelihoods with their log-counterparts to avoid arithmetic underflow. At testing time, we are interested in...patterns of people taking taxis, rating movies, choosing a cell phone provider, or sharing music are best explained and predicted by the habits of
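    The record above mentions representing probabilities and likelihoods by their logarithms to avoid arithmetic underflow. As a minimal illustration of that standard trick (not code from the cited work), the product of many small per-message likelihoods can be kept as a sum of logs, and mixtures can be combined with a log-sum-exp:

    ```python
    import numpy as np

    def log_sum_exp(log_vals):
        """Numerically stable log(sum(exp(log_vals)))."""
        m = np.max(log_vals)
        return m + np.log(np.sum(np.exp(log_vals - m)))

    # 10,000 per-message likelihoods around 1e-5 underflow to 0.0 if multiplied directly,
    # but their product is easily representable as a sum of logs.
    log_likelihoods = np.log(np.full(10_000, 1e-5))
    print("log joint likelihood:", log_likelihoods.sum())              # about -115129.25
    print("log of uniform mixture:",
          log_sum_exp(log_likelihoods) - np.log(len(log_likelihoods)))  # back to log(1e-5)
    ```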

  12. Large-Scale Atmosphere-Ocean Coupling.

    Science.gov (United States)

    1984-05-01

    stronger tropical teleconnection linking the reference region and the Caribbean Sea and vicinities is...scale coupling between the tropical atmosphere and ocean in relation to the El Nino/Southern Oscillation (ENSO) phenomenon is studied using both...connection between Pacific tropical diabatic heating anomalies and extratropical circulation systems over the North Pacific from East Asia to the

  13. Large scale features and assessment of spatial scale ...

    Indian Academy of Sciences (India)

    We have proposed here a new analysis procedure to assess the minimum spatial scale at which the two datasets ... Xie et al. (2007) studied the performance of five ...... The financial support received from ISRO RESPOND Pro- gramme is gratefully acknowledged. TMPA data were obtained from NASA website. References.

  14. Jansky Very Large Array: technology advancing science

    Science.gov (United States)

    Carilli, Christopher

    2015-08-01

    Over the last decade, the NRAO has completed on time, and on budget, a major reconstruction of the Very Large Array. Building on existing infrastructure to maximize efficiency, the entire VLA electronics system, including correlator, receivers, data transmission, and monitor and control, have been replaced with state of the art systems. This complete rebuild establishes the new Jansky VLA, operating between 75MHz and 50GHz, as the most powerful radio telescope in the world for the coming decade.I will review the technical improvements of the array, including:- Correlator: Increased bandwidth from 100MHz to 8GHz, with thousands of spectral channels.- Receivers: replaced the previous narrow bands with receivers covering the full frequency range from 1 GHz to 50GHz. New systems are also being tested to cover from 50MHz to 400MHz.- Data transmission: 8GHz over optical fiber out to 30km.I will then highlight some of the science enabled by these improvements, including:- Large cosmic volume searches for atomic and molecular gas, from the nearby Universe to the most distant galaxies, plus kpc-scale imaging of the cool gas in distant starburst galaxies.- High resolution studies of star and planet formation.- Innovative interferometric searches for transient phenomena.- The first radio continuum deep fields with sensitivities pipelines to provide science ready data products to the community.- Algorithmic development for ultra-deep, wide band, wide field polarimetric imaging.- Exploring the time domain with interferometers.

  15. Recent Progress in Large-Scale Structure

    CERN Multimedia

    CERN. Geneva

    2014-01-01

    I will discuss recent progress in the understanding of how to model galaxy clustering. While recent analyses have focussed on the baryon acoustic oscillations as a probe of cosmology, galaxy redshift surveys contain a lot more information than the acoustic scale. In extracting this additional information three main issues need to be well understood: nonlinear evolution of matter fluctuations, galaxy bias and redshift-space distortions. I will present recent progress in modeling these three effects that pave the way to constraining cosmology and galaxy formation with increased precision.

  16. Large-scale cryopumping for controlled fusion

    International Nuclear Information System (INIS)

    Pittenger, L.C.

    1977-01-01

    Vacuum pumping by freezing out or otherwise immobilizing the pumped gas is an old concept. In several plasma physics experiments for controlled fusion research, cryopumping has been used to provide clean, ultrahigh vacua. Present day fusion research devices, which rely almost universally upon neutral beams for heating, are high gas throughput systems, the pumping of which is best accomplished by cryopumping in the high mass-flow, moderate-to-high vacuum regime. Cryopumping systems have been developed for neutral beam injection systems on several fusion experiments (HVTS, TFTR) and are being developed for the overall pumping of a large, high-throughput mirror containment experiment (MFTF). In operation, these large cryopumps will require periodic defrosting, some schemes for which are discussed, along with other operational considerations. The development of cryopumps for fusion reactors is begun with the TFTR and MFTF systems. Likely paths for necessary further development for power-producing reactors are also discussed

  17. Towards Large-scale Inconsistency Measurement

    OpenAIRE

    Thimm, Matthias

    2015-01-01

    We investigate the problem of inconsistency measurement on large knowledge bases by considering stream-based inconsistency measurement, i.e., we investigate inconsistency measures that cannot consider a knowledge base as a whole but process it within a stream. For that, we present, first, a novel inconsistency measure that is apt to be applied to the streaming case and, second, stream-based approximations for the new and some existing inconsistency measures. We conduct an extensive empirical ...

  18. Large scale fuel oil production experiments

    Energy Technology Data Exchange (ETDEWEB)

    1943-08-04

    The effect of the coal throughput and the composition of the pasting oil, in particular the effect of different middle oil contents in the pasting oil, was previously tested in small scale experiments of hydrogenation of coal. Possibilities of increasing the throughput through the converter when producing heavy oil together with middle oil is shown in this work. The proper industrial detail for the production of heavy oil had to be developed first on a semi-commercial plant. The Upper Silesian coal was used to study the production of gasoline, middle oil, and heavy oil at 700 atm in a 1.6 m/sup 3/ converter and to relate the results with the small scale experiments (10-liter converter). Paste heat exchange was carried out successfully. The following experiments, among others, were carried out: mixed coals were hydrogenated to 100% gasoline plus middle oil, to 65% gasoline and middle oil and 35% heavy oil, as well as 50% gasoline and middle oil plus 50% heavy oil, in part with the usual iron catalyst combination and in part with the sulfurated Bayer mass together with the iron sulfate and sulfigran. The Heinity coal had been hydrogenated with the usual iron catalyst to 65% gasoline and middle oil plus 35% heavy oil. The important results were summarized in a table. Details of the experiments and processes used were given in 3 graphs and 42 tables.

  19. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run the humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. The modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets has recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  20. Autonomous Sensors for Large Scale Data Collection

    Science.gov (United States)

    Noto, J.; Kerr, R.; Riccobono, J.; Kapali, S.; Migliozzi, M. A.; Goenka, C.

    2017-12-01

    Presented here is a novel implementation of a "Doppler imager" which remotely measures winds and temperatures of the neutral background atmosphere at ionospheric altitudes of 87-300 km and possibly above. It incorporates recent optical manufacturing developments, modern network awareness and machine learning techniques for intelligent self-monitoring and data classification, and it achieves cost savings in manufacturing, deployment and lifetime operating costs. Deployed in both ground- and space-based modalities, this cost-disruptive technology will allow computer models of ionospheric variability and other space weather models to operate with higher precision. Other sensors can easily be folded into the data collection and analysis architecture, creating autonomous virtual observatories. A prototype version of this sensor has recently been deployed in Trivandrum, India, for the Indian Government. This Doppler imager is capable of operation even within the restricted CubeSat environment. The CubeSat bus offers a very challenging environment, even for small instruments. The tight SWaP (size, weight and power) budget and the challenging thermal environment demand the development of a new generation of instruments; the Doppler imager presented is well suited to this environment. Concurrent with this CubeSat development is the development and construction of ground-based arrays of inexpensive sensors using the proposed technology. This instrument could be flown inexpensively on one or more CubeSats to provide valuable data to space weather forecasters and ionospheric scientists. Arrays of magnetometers have been deployed for the last 20 years [Alabi, 2005]. Other examples of ground-based arrays include an array of white-light all-sky imagers (THEMIS) deployed across Canada [Donovan et al., 2006], ocean sensors on buoys [McPhaden et al., 2010], and arrays of seismic sensors [Schweitzer et al., 2002]. A comparable array of Doppler imagers can be constructed and deployed on the

  1. Networking in a Large-Scale Distributed Agile Project

    OpenAIRE

    Moe, Nils Brede; Šmite, Darja; Šāblis, Aivars; Börjesson, Anne-Lie; Andréasson, Pia

    2014-01-01

    Context: In large-scale distributed software projects the expertise may be scattered across multiple locations. Goal: We describe and discuss a large-scale distributed agile project at Ericsson, a multinational telecommunications company headquartered in Sweden. The project is distributed across four development locations (one in Sweden, one in Korea and two in China) and employs 17 teams. In such a large scale environment the challenge is to have as few dependences between teams as possible,...

  2. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach...... is the IBA Emscher Park in the Ruhr area in Germany. Over a 10-year period (1988-1998), more than 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines...... within the development of our urban landscapes. At the same time, urban and landscape designers are confronted with new methodological problems. Within a strategic transformation perspective, the formulation of the design problem or brief becomes an integrated part of the design process. This paper...

  3. Large Scale Experiments on Spacecraft Fire Safety

    DEFF Research Database (Denmark)

    Urban, David L.; Ruff, Gary A.; Minster, Olivier

    2012-01-01

    structure. As a result, the prediction of the behaviour of fires in reduced gravity is at present not validated. To address this gap in knowledge, a collaborative international project, Spacecraft Fire Safety, has been established with its cornerstone being the development of an experiment (Fire Safety 1...... to ensure the carrier vehicle does not sustain damage, the absence of a crew removes the need for strict containment of combustion products. This will facilitate the possibility of examining fire behaviour on a scale that is relevant to spacecraft fire safety and will provide unique data for fire model...... validation. This unprecedented opportunity will expand the understanding of the fundamentals of fire behaviour in spacecraft. The experiment is being developed by an international topical team that is collaboratively defining the experiment requirements and performing supporting analysis, experimentation

  4. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P. (PA Energy, Malling (Denmark)); Vedde, J. (SiCon. Silicon and PV consulting, Birkeroed (Denmark))

    2011-04-15

    Large scale PV (LPV) plants, plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of the global PV installations. In 2009 large scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers for LPV projects, but as so far no real LPV projects have been processed, these findings have to be regarded as preliminary. The fast growing number of very large scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the lay-out of LPV plants. Under the Danish irradiance conditions, with several winter months with very low solar height, PV installations on flat surfaces will have to balance the requirements of physical space - and cost - against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on flat roofs of large commercial buildings, PV installations on other large scale infrastructure such as noise barriers, and ground mounted PV installations. The technical potential for all three categories is found to be significant and in the range of 50-250 km2. In terms of energy harvest, PV plants under Danish conditions will exhibit an overall efficiency of about 10 % in conversion of the energy content of the light, compared to about 0.3 % for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark at 33-35 TWh is about 300 km2. The Danish grid codes and the electricity safety regulations mention very little about PV and nothing about LPV plants. It is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
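    The ground-area figure quoted above can be checked with a back-of-envelope calculation. The demand (33-35 TWh) and the roughly 10 % overall conversion efficiency come from the record; the annual irradiation of about 1000 kWh/m2 is an assumed round number for Denmark:

    ```python
    # Back-of-envelope check of the ~300 km2 figure quoted in the record.
    annual_demand_twh = 34.0            # mid-point of the 33-35 TWh quoted above
    irradiation_kwh_per_m2 = 1000.0     # assumed annual irradiation for Denmark
    system_efficiency = 0.10            # ~10 % overall conversion, as quoted above

    yield_kwh_per_m2 = irradiation_kwh_per_m2 * system_efficiency   # ~100 kWh per m2 per year
    area_m2 = annual_demand_twh * 1e9 / yield_kwh_per_m2            # TWh -> kWh, then divide
    print(f"required ground area: {area_m2 / 1e6:.0f} km2")         # roughly 340 km2
    ```

    With these assumptions the estimate lands close to the ~300 km2 stated in the record; the difference is well within the uncertainty of the assumed irradiation and efficiency.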

  5. Responses in large-scale structure

    Science.gov (United States)

    Barreira, Alexandre; Schmidt, Fabian

    2017-06-01

    We introduce a rigorous definition of general power-spectrum responses as resummed vertices with two hard and n soft momenta in cosmological perturbation theory. These responses measure the impact of long-wavelength perturbations on the local small-scale power spectrum. The kinematic structure of the responses (i.e., their angular dependence) can be decomposed unambiguously through a "bias" expansion of the local power spectrum, with a fixed number of physical response coefficients, which are only a function of the hard wavenumber k. Further, the responses up to n-th order completely describe the (n+2)-point function in the squeezed limit, i.e. with two hard and n soft modes, which one can use to derive the response coefficients. This generalizes previous results, which relate the angle-averaged squeezed limit to isotropic response coefficients. We derive the complete expression of first- and second-order responses at leading order in perturbation theory, and present extrapolations to nonlinear scales based on simulation measurements of the isotropic response coefficients. As an application, we use these results to predict the non-Gaussian part of the angle-averaged matter power spectrum covariance Cov^NG_{l=0}(k1, k2), in the limit where one of the modes, say k2, is much smaller than the other. Without any free parameters, our model results are in very good agreement with simulations for k2 ≲ 0.06 h Mpc^-1, and for any k1 ≳ 2 k2. The well-defined kinematic structure of the power spectrum response also permits a quick evaluation of the angular dependence of the covariance matrix. While we focus on the matter density field, the formalism presented here can be generalized to generic tracers such as galaxies.
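    Schematically, and in notation chosen here rather than taken from the paper, the first-order response describes how the local power spectrum reacts to a long-wavelength overdensity and thereby fixes the angle-averaged squeezed limit of the three-point function:

    ```latex
    % Illustrative notation only; see the paper for the precise definitions.
    P(k \mid \delta_L) \;\simeq\; P(k)\left[\, 1 + R_1(k)\, \delta_L \,\right],
    \qquad
    R_1(k) \;\equiv\; \left. \frac{\mathrm{d}\ln P(k \mid \delta_L)}{\mathrm{d}\delta_L} \right|_{\delta_L = 0},
    \qquad
    \lim_{q \to 0} B(k_1 \simeq k,\, k_2 \simeq k,\, q) \;\approx\; R_1(k)\, P_L(q)\, P(k).
    ```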

  6. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  7. Large Scale Demand Response of Thermostatic Loads

    DEFF Research Database (Denmark)

    Totu, Luminita Cristiana

    This study is concerned with large populations of residential thermostatic loads (e.g. refrigerators, air conditioning or heat pumps). The purpose is to gain control over the aggregate power consumption in order to provide balancing services for the electrical grid. Without affecting the temperature limits and other operational constraints, and by using only limited communication, it is possible to make use of the individual thermostat deadband flexibility to step-up or step-down the power consumption of the population as if it were a power plant. The individual thermostatic loads experience no loss of service or quality, and the electrical grid gains a fast power resource of hundreds of MW or more. This study proposes and analyses a mechanism that introduces random on/off and off/on switches in the normal thermostat operation of the units. This mechanism is called Switching Actuation.
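    As a rough sketch of the idea described above (not the mechanism analysed in the thesis), the following toy simulation applies random on/off switch requests to a population of deadband-controlled cooling loads and reads off the change in aggregate power; all numbers are invented:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    N = 10_000                 # population size (assumed)
    P_unit = 0.1               # kW drawn by one compressor when on (assumed)
    T_min, T_max = 2.0, 5.0    # deadband limits in deg C (assumed)
    a, b = 0.02, 0.06          # cooling / warming rates per time step (assumed)

    T = rng.uniform(T_min, T_max, N)    # initial temperatures
    on = rng.random(N) < 0.4            # initial compressor states

    def step(T, on, p_force_on=0.0, p_force_off=0.0):
        # Random switch requests, honoured only while inside the deadband.
        force_on = (rng.random(N) < p_force_on) & ~on & (T > T_min)
        force_off = (rng.random(N) < p_force_off) & on & (T < T_max)
        on = (on | force_on) & ~force_off
        # Thermostat overrides at the deadband edges keep every unit within limits.
        on = np.where(T >= T_max, True, on)
        on = np.where(T <= T_min, False, on)
        T = T - a * on + b * ~on
        return T, on

    for t in range(200):
        # Request a step-up in consumption between steps 100 and 120.
        p_on = 0.05 if 100 <= t < 120 else 0.0
        T, on = step(T, on, p_force_on=p_on)
        if t % 50 == 0 or t == 110:
            print(t, f"aggregate power: {on.sum() * P_unit:.0f} kW")
    ```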

  8. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    Science.gov (United States)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    When it comes to large scale mapping of limited areas especially for cultural heritage sites, things become critical. Optical and non-optical sensors are developed to such sizes and weights that can be lifted by such platforms, like e.g. LiDAR units. At the same time there is an increase in emphasis on solutions that enable users to get access to 3D information faster and cheaper. Considering the multitude of platforms, cameras and the advancement of algorithms in conjunction with the increase of available computing power this challenge should and indeed is further investigated. In this paper a short review of the UAS technologies today is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The on-board cameras available are also compared and evaluated for large scale mapping. Furthermore a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequence of digital images is also conducted and presented. As test data set, we use a rich optical and thermal data set from both fixed wing and multi-rotor platforms over an archaeological excavation with adverse height variations and using different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  9. State-of-the-art of large scale biogas plants

    International Nuclear Information System (INIS)

    Prisum, J.M.; Noergaard, P.

    1992-01-01

    A survey of the technological state of large scale biogas plants in Europe treating manure is given. 83 plants are in operation at present. Of these, 16 are centralised digestion plants. Transport costs at centralised digestion plants amounts to between 25 and 40 percent of the total operational costs. Various transport equipment is used. Most large scale digesters are CSTRs, but serial, contact, 2-step, and plug-flow digesters are also found. Construction materials are mostly steel and concrete. Mesophilic digestion is most common (56%), thermophilic digestion is used in 17% of the plants, combined mesophilic and thermophilic digestion is used in 28% of the centralised plants. Mixing of digester content is performed with gas injection, propellers, and gas-liquid displacement. Heating is carried out using external or internal heat exchangers. Heat recovery is only used in Denmark. Gas purification equipment is commonplace, but not often needed. Several plants use separation of the digested manure, often as part of a post-treatment/-purification process or for the production of 'compost'. Screens, sieve belt separaters, centrifuges and filter presses are employed. The use of biogas varies considerably. In some cases, combined heat and power stations are supplying the grid and district heating systems. Other plants use only either the electricity or heat. (au)

  10. Large-Scale Cultivation of Euglena.

    Science.gov (United States)

    Suzuki, Kengo

    2017-01-01

    From the middle of the twentieth century, microalgae have been exploited as a candidate biomass source of food and other products. One such candidate source is the fast-proliferating microalga Euglena gracilis. The commercial cultivation of E. gracilis began in 2007, after the success of its outdoor mass cultivation and improvement of the harvesting and drying methods suitable for Euglena cells. The commercialization of Euglena production is based on the strategy of "5Fs of Biomass," which refers to the development and production of commercial products including food, fiber, feed, fertilizer, and fuel from biomass. Although room for improvement remains in the productivity of Euglena biomass, the product with the highest value, food, is already profitable. By enhancing the productivity of its biomass, other Euglena products, including fiber, feed, fertilizer, and fuel, can be commercialized. Breeding and recombinant DNA technology studies are being conducted to accomplish more extensive application of Euglena. In addition, the search for a better place for outdoor mass cultivation of Euglena is ongoing.

  11. Optimization of Large-Scale Structural Systems

    DEFF Research Database (Denmark)

    Jensen, F. M.

    solutions to small problems with one or two variables to the optimization of large structures such as bridges, ships and offshore structures. The methods used for solving these problems have evolved from classical differential calculus and the calculus of variations to very advanced numerical techniques...... In the same period of time the problems have grown in size and the ongoing research in the various engineering fields has introduced new areas that complicate the optimization task further. These are e.g. structural reliability theory (including new, more complex constraints), discrete optimization (introducing new narrow bounds on the optimization variables), stochastic FEM, vibration theory or multiobjective optimization. At the same time researchers always try to solve problems ahead of today's capabilities, thereby utilising current mathematical programming (MP) methods to the limit. However, when

  12. Large-Scale Structures of Planetary Systems

    Science.gov (United States)

    Murray-Clay, Ruth; Rogers, Leslie A.

    2015-12-01

    A class of solar system analogs has yet to be identified among the large crop of planetary systems now observed. However, since most observed worlds are more easily detectable than direct analogs of the Sun's planets, the frequency of systems with structures similar to our own remains unknown. Identifying the range of possible planetary system architectures is complicated by the large number of physical processes that affect the formation and dynamical evolution of planets. I will present two ways of organizing planetary system structures. First, I will suggest that relatively few physical parameters are likely to differentiate the qualitative architectures of different systems. Solid mass in a protoplanetary disk is perhaps the most obvious possible controlling parameter, and I will give predictions for correlations between planetary system properties that we would expect to be present if this is the case. In particular, I will suggest that the solar system's structure is representative of low-metallicity systems that nevertheless host giant planets. Second, the disk structures produced as young stars are fed by their host clouds may play a crucial role. Using the observed distribution of RV giant planets as a function of stellar mass, I will demonstrate that invoking ice lines to determine where gas giants can form requires fine tuning. I will suggest that instead, disk structures built during early accretion have lasting impacts on giant planet distributions, and disk clean-up differentially affects the orbital distributions of giant and lower-mass planets. These two organizational hypotheses have different implications for the solar system's context, and I will suggest observational tests that may allow them to be validated or falsified.

  13. Tidal power plant may develop into large-scale industry

    International Nuclear Information System (INIS)

    2001-01-01

    Hammerfest was the first city in Norway with hydroelectric power production and the first city in Northern Europe to have electric street lights. Recently, technologists within the city's electricity supply industry have suggested that Hammerfest should pioneer the field of tidal energy. The idea is to create a new Norwegian large-scale industry. The technology is being developed by the company Hammerfest Stroem. A complete plant is planned to be installed in Kvalsundet. It will include turbine, generator, converters, transmission to land and delivery to the network. Once fully developed, in 2004, the plant will be sold. The company expects to install similar plants elsewhere in Norway and abroad. It is calculated that for a tidewater current of 2.5 m/s, the worldwide potential is about 450 TWh
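    For orientation, the kinetic power density of a tidal current follows the standard one-half times density times velocity cubed relation. With the 2.5 m/s current quoted above and an assumed seawater density, this gives roughly 8 kW per square metre of swept rotor area, of which a turbine can capture only a fraction:

    ```python
    rho = 1025.0   # seawater density in kg/m3 (assumed, not from the record)
    v = 2.5        # tidal current speed from the record, in m/s

    power_density = 0.5 * rho * v**3                                   # W per m2 of swept area
    print(f"kinetic power density: {power_density / 1000:.1f} kW/m2")  # about 8.0 kW/m2
    # A real turbine extracts only part of this (Betz-like limits and losses apply).
    ```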

  14. Goethite Bench-scale and Large-scale Preparation Tests

    Energy Technology Data Exchange (ETDEWEB)

    Josephson, Gary B.; Westsik, Joseph H.

    2011-10-23

    The Hanford Waste Treatment and Immobilization Plant (WTP) is the keystone for cleanup of high-level radioactive waste from our nation's nuclear defense program. The WTP will process high-level waste from the Hanford tanks and produce immobilized high-level waste glass for disposal at a national repository, low activity waste (LAW) glass, and liquid effluent from the vitrification off-gas scrubbers. The liquid effluent will be stabilized into a secondary waste form (e.g. grout-like material) and disposed on the Hanford site in the Integrated Disposal Facility (IDF) along with the low-activity waste glass. The major long-term environmental impact at Hanford results from technetium that volatilizes from the WTP melters and finally resides in the secondary waste. Laboratory studies have indicated that pertechnetate ({sup 99}TcO{sub 4}{sup -}) can be reduced and captured into a solid solution of {alpha}-FeOOH, goethite (Um 2010). Goethite is a stable mineral and can significantly retard the release of technetium to the environment from the IDF. The laboratory studies were conducted using reaction times of many days, which is typical of environmental subsurface reactions that were the genesis of this new process. This study was the first step in considering adaptation of the slow laboratory steps to a larger-scale and faster process that could be conducted either within the WTP or within the effluent treatment facility (ETF). Two levels of scale-up tests were conducted (25x and 400x). The largest scale-up produced slurries of Fe-rich precipitates that contained rhenium as a nonradioactive surrogate for {sup 99}Tc. The slurries were used in melter tests at Vitreous State Laboratory (VSL) to determine whether captured rhenium was less volatile in the vitrification process than rhenium in an unmodified feed. A critical step in the technetium immobilization process is to chemically reduce Tc(VII) in the pertechnetate (TcO{sub 4}{sup -}) to Tc(Iv)by reaction with the

  15. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  16. Superconducting materials for large scale applications

    Energy Technology Data Exchange (ETDEWEB)

    Scanlan, Ronald M.; Malozemoff, Alexis P.; Larbalestier, David C.

    2004-05-06

    Significant improvements in the properties of superconducting materials have occurred recently. These improvements are being incorporated into the latest generation of wires, cables, and tapes that are being used in a broad range of prototype devices. These devices include new, high field accelerator and NMR magnets, magnets for fusion power experiments, motors, generators, and power transmission lines. These prototype magnets are joining a wide array of existing applications that utilize the unique capabilities of superconducting magnets: accelerators such as the Large Hadron Collider, fusion experiments such as ITER, 930 MHz NMR, and 4 Tesla MRI. In addition, promising new materials such as MgB2 have been discovered and are being studied in order to assess their potential for new applications. In this paper, we will review the key developments that are leading to these new applications for superconducting materials. In some cases, the key factor is improved understanding or development of materials with significantly improved properties. An example of the former is the development of Nb3Sn for use in high field magnets for accelerators. In other cases, the development is being driven by the application. The aggressive effort to develop HTS tapes is being driven primarily by the need for materials that can operate at temperatures of 50 K and higher. The implications of these two drivers for further developments will be discussed. Finally, we will discuss the areas where further improvements are needed in order for new applications to be realized.

  17. Toward Increasing Fairness in Score Scale Calibrations Employed in International Large-Scale Assessments

    Science.gov (United States)

    Oliveri, Maria Elena; von Davier, Matthias

    2014-01-01

    In this article, we investigate the creation of comparable score scales across countries in international assessments. We examine potential improvements to current score scale calibration procedures used in international large-scale assessments. Our approach seeks to improve fairness in scoring international large-scale assessments, which often…

  18. Advances in Modelling of Large Scale Coastal Evolution

    NARCIS (Netherlands)

    Stive, M.J.F.; De Vriend, H.J.

    1995-01-01

    The attention for climate change impact on the world's coastlines has established large scale coastal evolution as a topic of wide interest. Some more recent advances in this field, focusing on the potential of mathematical models for the prediction of large scale coastal evolution, are discussed.

  19. The Large Scale Magnetic Field and Sunspot Cycles

    Indian Academy of Sciences (India)

    tribpo

    We report on the correlation between the large scale magnetic field and sunspot cycles during the last 80 years that was found by Makarov et al. (1999) and Makarov & Tlatov (2000) in Hα spherical harmonics of the large scale magnetic field for 1915-1999. The sum of intensities of the low modes l = 1 and 3, A(t), was used ...

  20. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99⋅1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords. Sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  1. ACTIVE DIMENSIONAL CONTROL OF LARGE-SCALED STEEL STRUCTURES

    Directory of Open Access Journals (Sweden)

    Radosław Rutkowski

    2013-09-01

    The article discusses the issues of dimensional control in the construction process of large-scaled steel structures. The main focus is on the analysis of manufacturing tolerances. The article presents a procedure for using tolerance analysis in the design and manufacturing of large-scaled steel structures. The proposed solution could significantly improve the manufacturing process.

  2. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  3. Transmission of large amounts of scientific data using laser technology

    Science.gov (United States)

    Isaev, E. A.; Tarasov, P. A.

    2016-08-01

    Currently, the volume of data generated by large research projects (the Large Hadron Collider (LHC), the Square Kilometre Array (SKA)) can reach tens of petabytes per day. The only technical solution that allows such large amounts of scientific data to be transferred to the places where they are processed is the transfer of information by means of laser technology, using different propagation environments. This article discusses the possibility of data transmission via fiber-optic networks, data transmission by modulating a binary stream with a special LED light source, the necessity of applying laser technologies for deep-space communications, and the principle of an unlimited expansion of the capacity of a laser data link. The study also shows the need for a substantial increase in data transfer speed, both via pre-existing communication networks and via the construction of new communication channels that can cope with the transfer of very large data volumes, taking into account the projected rate of growth.

  4. Technology Management on Large Construction Projects

    DEFF Research Database (Denmark)

    Bonke, Sten

    The aim of this text is to discuss and to develop the concept of technology management in relation to the empirical field of construction projects. In the first of the two main sections central theories and their derived assertions concerning technology management criteria are summed up...... Fixed Link construction project. Finally on this basis the concluding remarks are pointing to the main theoretical problems and their practical implementations for the introduction of a technology management discipline in construction....... in a schematic theoretical framework. Hereafter the general characteristics of construction are examined from the point of view of serving as an empirical field for technology management analysis. In the second section the technology management theme is associated with the empirical properties of the Great Belt...

  5. Large-scale seismic waveform quality metric calculation using Hadoop

    Science.gov (United States)

    Magana-Zook, S.; Gaylord, J. M.; Knapp, D. R.; Dodge, D. A.; Ruppert, S. D.

    2016-09-01

    In this work we investigated the suitability of Hadoop MapReduce and Apache Spark for large-scale computation of seismic waveform quality metrics by comparing their performance with that of a traditional distributed implementation. The Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) provided 43 terabytes of broadband waveform data of which 5.1 TB of data were processed with the traditional architecture, and the full 43 TB were processed using MapReduce and Spark. Maximum performance of 0.56 terabytes per hour was achieved using all 5 nodes of the traditional implementation. We noted that I/O dominated processing, and that I/O performance was deteriorating with the addition of the 5th node. Data collected from this experiment provided the baseline against which the Hadoop results were compared. Next, we processed the full 43 TB dataset using both MapReduce and Apache Spark on our 18-node Hadoop cluster. These experiments were conducted multiple times with various subsets of the data so that we could build models to predict performance as a function of dataset size. We found that both MapReduce and Spark significantly outperformed the traditional reference implementation. At a dataset size of 5.1 terabytes, both Spark and MapReduce were about 15 times faster than the reference implementation. Furthermore, our performance models predict that for a dataset of 350 terabytes, Spark running on a 100-node cluster would be about 265 times faster than the reference implementation. We do not expect that the reference implementation deployed on a 100-node cluster would perform significantly better than on the 5-node cluster because the I/O performance cannot be made to scale. Finally, we note that although Big Data technologies clearly provide a way to process seismic waveform datasets in a high-performance and scalable manner, the technology is still rapidly changing, requires a high degree of investment in personnel, and will likely
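    A minimal sketch of the kind of computation described above, assuming PySpark is available: per-trace quality metrics are evaluated in parallel over a partitioned collection. The metric definitions and the in-memory placeholder traces are illustrative only and are not taken from the study (real inputs would be waveform files read with a seismology library such as ObsPy):

    ```python
    import numpy as np
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.appName("waveform-qc-sketch").getOrCreate()
    sc = spark.sparkContext

    # Placeholder inputs: (station id, one day of samples at 1 Hz).
    traces = [("STA%02d" % i, np.random.randn(86_400).tolist()) for i in range(8)]

    def quality_metrics(record):
        sta, samples = record
        x = np.asarray(samples)
        return {
            "station": sta,
            "rms": float(np.sqrt(np.mean(x ** 2))),                              # amplitude metric
            "dead_channel": bool(np.allclose(x, x[0])),                          # flat-line check
            "clipped_fraction": float(np.mean(np.abs(x) >= 0.999 * np.max(np.abs(x)))),
        }

    results = sc.parallelize(traces, numSlices=8).map(quality_metrics).collect()
    for r in results:
        print(r)
    spark.stop()
    ```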

  6. Development of a Large Scale, High Speed Wheel Test Facility

    Science.gov (United States)

    Kondoleon, Anthony; Seltzer, Donald; Thornton, Richard; Thompson, Marc

    1996-01-01

    Draper Laboratory, with its internal research and development budget, has for the past two years been funding a joint effort with the Massachusetts Institute of Technology (MIT) for the development of a large scale, high speed wheel test facility. This facility was developed to perform experiments and carry out evaluations on levitation and propulsion designs for MagLev systems currently under consideration. The facility was developed to rotate a large (2 meter) wheel which could operate with peripheral speeds of greater than 100 meters/second. The rim of the wheel was constructed of a non-magnetic, non-conductive composite material to avoid the generation of errors from spurious forces. A sensor package containing a multi-axis force and torque sensor mounted to the base of the station, provides a signal of the lift and drag forces on the package being tested. Position tables mounted on the station allow for the introduction of errors in real time. A computer controlled data acquisition system was developed around a Macintosh IIfx to record the test data and control the speed of the wheel. This paper describes the development of this test facility. A detailed description of the major components is presented. Recently completed tests carried out on a novel Electrodynamic (EDS) suspension system, developed by MIT as part of this joint effort are described and presented. Adaptation of this facility for linear motor and other propulsion and levitation testing is described.

  7. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
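    The diagnostic described above can be sketched as follows on a synthetic field standing in for the DNS data (a two-dimensional toy, purely for illustration): low-pass filter the velocity to define the large scales, take the local r.m.s. of the residual vorticity as the small-scale activity, and correlate it with both the large-scale fluctuation and the large-scale gradient:

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    rng = np.random.default_rng(1)

    # Toy 2-D velocity field with small- and large-scale content (assumption).
    u = gaussian_filter(rng.standard_normal((512, 512)), 2) \
        + 0.5 * gaussian_filter(rng.standard_normal((512, 512)), 20)
    v = gaussian_filter(rng.standard_normal((512, 512)), 2) \
        + 0.5 * gaussian_filter(rng.standard_normal((512, 512)), 20)

    sigma_L = 16                                   # cut-off between "large" and "small" scales
    uL, vL = gaussian_filter(u, sigma_L), gaussian_filter(v, sigma_L)
    us, vs = u - uL, v - vL                        # small-scale residual

    # Small-scale activity: local r.m.s. of the residual vorticity.
    omega_s = np.gradient(vs, axis=1) - np.gradient(us, axis=0)
    activity = np.sqrt(gaussian_filter(omega_s ** 2, sigma_L))

    # Large-scale signals to correlate against: fluctuation vs. gradient.
    duL_dy = np.gradient(uL, axis=0)

    def corr(a, b):
        a, b = a - a.mean(), b - b.mean()
        return float((a * b).mean() / (a.std() * b.std()))

    print("corr(activity, large-scale u):       ", corr(activity, uL))
    print("corr(activity, large-scale gradient):", corr(activity, duL_dy))
    ```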

  8. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  9. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  10. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied
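    For reference, the control parameters used in such studies are the Rayleigh number and the aspect ratio (standard definitions, not specific to this paper):

    ```latex
    \mathrm{Ra} \;=\; \frac{g\,\alpha\,\Delta T\,H^{3}}{\nu\,\kappa},
    \qquad
    \Gamma \;=\; \frac{L}{H},
    ```

    with g the gravitational acceleration, α the thermal expansion coefficient, ΔT the imposed temperature difference, H the layer depth, ν the kinematic viscosity, κ the thermal diffusivity, and L the horizontal extent of the domain.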

  11. Robust scene stitching in large scale mobile mapping

    OpenAIRE

    Schouwenaars, Filip; Timofte, Radu; Van Gool, Luc

    2013-01-01

    Schouwenaars F., Timofte R., Van Gool L., ''Robust scene stitching in large scale mobile mapping'', 24th British machine vision conference - BMVC 2013, 11 pp., September 9-13, 2013, Bristol, United Kingdom.

  12. Large-scale linear programs in planning and prediction.

    Science.gov (United States)

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...
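    As a toy illustration of the robust treatment of uncertainty mentioned above (the scenario and all numbers are invented, and SciPy is assumed), a capacity-constrained LP can simply be re-solved with worst-case capacities:

    ```python
    import numpy as np
    from scipy.optimize import linprog

    # Toy traffic LP: route two demands over three links, maximising served demand.
    c = np.array([-1.0, -1.0])                 # linprog minimises, so negate to maximise x1 + x2
    A = np.array([[1.0, 0.0],                  # link 1 carries route 1
                  [0.0, 1.0],                  # link 2 carries route 2
                  [1.0, 1.0]])                 # link 3 is shared by both routes
    b_nominal = np.array([30.0, 40.0, 50.0])   # nominal link capacities
    b_uncertainty = np.array([5.0, 5.0, 10.0]) # worst-case capacity reduction

    for label, b in (("nominal", b_nominal), ("robust ", b_nominal - b_uncertainty)):
        res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)], method="highs")
        print(label, "served demand:", -res.fun, "flows:", res.x)
    ```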

  13. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  14. USAGE OF DISSIMILARITY MEASURES AND MULTIDIMENSIONAL SCALING FOR LARGE SCALE SOLAR DATA ANALYSIS

    Data.gov (United States)

    National Aeronautics and Space Administration — USAGE OF DISSIMILARITY MEASURES AND MULTIDIMENSIONAL SCALING FOR LARGE SCALE SOLAR DATA ANALYSIS Juan M Banda, Rafal Angryk ABSTRACT: This work describes the...
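    A minimal sketch of the workflow named in the title, assuming scikit-learn and SciPy are available: build a dissimilarity matrix from feature vectors (placeholders here, not actual solar-image descriptors) and embed it with metric MDS for visual exploration:

    ```python
    import numpy as np
    from scipy.spatial.distance import pdist, squareform
    from sklearn.manifold import MDS

    rng = np.random.default_rng(0)

    # Placeholder feature vectors standing in for solar-image descriptors (assumption).
    features = rng.standard_normal((200, 64))

    # Any dissimilarity measure can be plugged in here; Euclidean is only an example.
    D = squareform(pdist(features, metric="euclidean"))

    embedding = MDS(n_components=2, dissimilarity="precomputed",
                    random_state=0).fit_transform(D)
    print(embedding.shape)   # (200, 2) coordinates suitable for 2-D visual exploration
    ```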

  15. Efficient algorithms for collaborative decision making for large scale settings

    DEFF Research Database (Denmark)

    Assent, Ira

    2011-01-01

    Collaborative decision making is a successful approach in settings where data analysis and querying can be done interactively. In large scale systems with huge data volumes or many users, collaboration is often hindered by impractical runtimes. Existing work on improving collaboration focuses on ...... to bring about more effective and more efficient retrieval systems that support the users' decision making process. We sketch promising research directions for more efficient algorithms for collaborative decision making, especially for large scale systems....

  16. Temporal Variation of Large Scale Flows in the Solar Interior ...

    Indian Academy of Sciences (India)

    tribpo

    Abstract. We attempt to detect short term temporal variations in the rotation rate and other large scale velocity fields in the outer part of the solar convection zone using the ring diagram technique applied to. Michelson Doppler Imager (MDI) data. The measured velocity field shows variations by about 10 m/s on the scale of ...

  17. The Large Scale Structure: Polarization Aspects R. F. Pizzo

    Indian Academy of Sciences (India)

    c Indian Academy of Sciences. The Large Scale Structure: Polarization Aspects. R. F. Pizzo. ASTRON, Postbus 2, 7990 AA Dwingeloo, The Netherlands e-mail: pizzo@astron.nl. Abstract. Polarized radio emission is detected at various scales in the. Universe. In this document, I will briefly review our knowledge on polar-.

  18. Large scale particle image velocimetry with helium filled soap bubbles

    Science.gov (United States)

    Bosbach, Johannes; Kühn, Matthias; Wagner, Claus

    2009-03-01

    The application of Particle Image Velocimetry (PIV) to measurement of flows on large scales is a challenging necessity especially for the investigation of convective air flows. Combining helium filled soap bubbles as tracer particles with high power quality switched solid state lasers as light sources allows conducting PIV on scales of the order of several square meters. The technique was applied to mixed convection in a full scale double aisle aircraft cabin mock-up for validation of Computational Fluid Dynamics simulations.

  19. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (e.g., in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  20. Challenges with Scaling Scrum to Large-Scale Software Development: A Case Study

    OpenAIRE

    Jensen, Simen

    2017-01-01

    Agile software development methods have become popular since the introduction of the Agile Manifesto in 2001. Agile methods, such as Scrum, were originally created for small co-located teams but have been adopted by large-scale development organizations. The accompanying challenges of using Scrum in large-scale development are not fully explored and understood. This thesis aims to explore and identify challenges regarding large-scale agile development in a global software development organizati...

  1. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore wind power. This was done by bringing together the key industry stakeholders and competent research organisations in the project.

  2. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    Science.gov (United States)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation of and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.
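    For reference, the quantity discussed above is conventionally defined, together with the realizability bound that links the helicity and magnetic-energy spectra (standard relations, not specific to the article):

    ```latex
    H_M \;=\; \int_V \mathbf{A} \cdot \mathbf{B}\; \mathrm{d}V,
    \qquad \mathbf{B} = \nabla \times \mathbf{A},
    \qquad
    \lvert H_M(k) \rvert \;\le\; \frac{2\, E_M(k)}{k}.
    ```

    The inequality is what underlies the statement in the record that a fixed amount of magnetic helicity is stored with the least magnetic energy when the helical structure relaxes to the largest available scale.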

  3. Cost Overruns in Large-scale Transportation Infrastructure Projects

    DEFF Research Database (Denmark)

    Cantarelli, Chantal C; Flyvbjerg, Bent; Molin, Eric J. E

    2010-01-01

    Managing large-scale transportation infrastructure projects is difficult due to frequent misinformation about the costs which results in large cost overruns that often threaten the overall project viability. This paper investigates the explanations for cost overruns that are given in the literature...

  4. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Recent increases in commodity prices have led some governments and private investors to purchase or lease large tracts of land in foreign countries for producing their own food and biofuel. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  5. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  6. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured...

  7. Large scale photovoltaic field trials. Second technical report: monitoring phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This report provides an update on the Large-Scale Building Integrated Photovoltaic Field Trials (LS-BIPV FT) programme commissioned by the Department of Trade and Industry (later the Department for Business, Enterprise and Regulatory Reform; BERR). It provides detailed profiles of the 12 projects making up this programme, which is part of the UK programme on photovoltaics and has run in parallel with the Domestic Field Trial. These field trials aim to record the experience and use the lessons learnt to raise awareness of, and confidence in, the technology and increase UK capabilities. The projects involved: the visitor centre at the Gaia Energy Centre in Cornwall; a community church hall in London; council offices in West Oxfordshire; a sports science centre at Gloucester University; the visitor centre at Cotswold Water Park; the headquarters of the Insolvency Service; a Welsh Development Agency building; an athletics centre in Birmingham; a research facility at the University of East Anglia; a primary school in Belfast; and Barnstaple civic centre in Devon. The report describes the aims of the field trials, monitoring issues, performance, observations and trends, lessons learnt and the results of occupancy surveys.

  8. Countercurrent tangential chromatography for large-scale protein purification.

    Science.gov (United States)

    Shinkazh, Oleg; Kanani, Dharmesh; Barth, Morgan; Long, Matthew; Hussain, Daniar; Zydney, Andrew L

    2011-03-01

    Recent advances in cell culture technology have created significant pressure on the downstream purification process, leading to a "downstream bottleneck" in the production of recombinant therapeutic proteins for the treatment of cancer, genetic disorders, and cardiovascular disease. Countercurrent tangential chromatography overcomes many of the limitations of conventional column chromatography by having the resin (in the form of a slurry) flow through a series of static mixers and hollow fiber membrane modules. The buffers used in the binding, washing, and elution steps flow countercurrent to the resin, enabling high-resolution separations while reducing the amount of buffer needed for protein purification. The results obtained in this study provide the first experimental demonstration of the feasibility of using countercurrent tangential chromatography for the separation of a model protein mixture containing bovine serum albumin and myoglobin using a commercially available anion exchange resin. Batch uptake/desorption experiments were used in combination with critical flux data for the hollow fiber filters to design the countercurrent tangential chromatography system. A two-stage batch separation yielded the purified target protein at >99% purity with 94% recovery. The results clearly demonstrate the potential of using countercurrent tangential chromatography for the large-scale purification of therapeutic proteins. Copyright © 2010 Wiley Periodicals, Inc.

  9. Siemens: Smart Technologies for Large Control Systems

    CERN Multimedia

    CERN. Geneva; BAKANY, Elisabeth

    2015-01-01

    The CERN Large Hadron Collider (LHC) is known to be one of the most complex scientific machines ever built by mankind. Its correct functioning relies on the integration of a multitude of interdependent industrial control systems, which provide different and essential services to run and protect the accelerators and experiments. These systems have to deal with several millions of data points (e.g. sensors, actuators, configuration parameters, etc…) which need to be acquired, processed, archived and analysed. For more than 20 years, CERN and Siemens have developed a strong collaboration to deal with the challenges of these large systems. The presentation will cover the current work on the SCADA (Supervisory Control and Data Acquisition) systems and Data Analytics Frameworks.

  10. Technology requirements and readiness for very large aircraft

    Science.gov (United States)

    Conner, D. W.; Vaughan, J. C., III

    1980-01-01

    The technology requirements for designing, manufacturing and operating any vehicle depend largely on the configuration of that vehicle. Under the general heading of Very Large Aircraft (VLA), configurations are many and varied, so, therefore, are the technology requirements. The present work is limited to technology requirements of particular interest to VLA. While many are of common interest, a few technology requirements critical to specific VLA types are also covered. Addressed in turn are common VLA concerns and how they influence configurations and technology; the methodology followed in selecting requirements and assessing readiness; the resultant technology requirements and readiness; and some overall observations regarding technology areas judged to be particularly critical. Over 50 technology requirements are identified as unique or particularly critical to VLA. However, none of the requirements is considered to have an excellent state of technology readiness.

  11. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and associated uncertainty quantification has become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  12. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...

  13. Enhancing microelectronics education with large-scale student projects

    OpenAIRE

    Rumpf, Clemens; Lidtke, Aleksander; Weddell, Alex; Maunder, Rob

    2016-01-01

    This paper discusses the benefits of using large-scale projects, involving many groups of students with different backgrounds, in the education of undergraduate microelectronics engineering students. The benefits of involving students in large, industry-like projects are first briefly reviewed. The organisation of undergraduate programmes is presented, and it is described how students can be involved in such large projects, while maintaining compatibility with undergraduate programmes. The ge...

  14. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  15. Acoustic Studies of the Large Scale Ocean Circulation

    Science.gov (United States)

    Menemenlis, Dimitris

    1999-01-01

    Detailed knowledge of ocean circulation and its transport properties is prerequisite to an understanding of the earth's climate and of important biological and chemical cycles. Results from two recent experiments, THETIS-2 in the Western Mediterranean and ATOC in the North Pacific, illustrate the use of ocean acoustic tomography for studies of the large scale circulation. The attraction of acoustic tomography is its ability to sample and average the large-scale oceanic thermal structure, synoptically, along several sections, and at regular intervals. In both studies, the acoustic data are compared to, and then combined with, general circulation models, meteorological analyses, satellite altimetry, and direct measurements from ships. Both studies provide complete regional descriptions of the time-evolving, three-dimensional, large scale circulation, albeit with large uncertainties. The studies raise serious issues about existing ocean observing capability and provide guidelines for future efforts.

  16. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
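
    The prototype idea described above can be illustrated with a Nystrom-style low-rank kernel approximation, one standard way of realizing prototype-based approximation; the Python sketch below is only an illustration under that assumption (the data, prototype count and kernel width are invented) and is not the authors' PVM implementation.

      # Illustrative sketch: prototype-based low-rank approximation of an RBF kernel
      # matrix (Nystrom style), in the spirit of the prototype idea described above.
      # Not the authors' PVM code; prototype count and kernel width are example choices.
      import numpy as np
      from sklearn.cluster import KMeans
      from sklearn.metrics.pairwise import rbf_kernel

      rng = np.random.default_rng(0)
      X = rng.normal(size=(2000, 10))          # labeled + unlabeled points
      m = 50                                   # number of prototype vectors

      # Choose prototypes as k-means centers of the data.
      prototypes = KMeans(n_clusters=m, n_init=10, random_state=0).fit(X).cluster_centers_

      # Low-rank approximation K ~ K_nm K_mm^{-1} K_mn of the full kernel matrix.
      K_nm = rbf_kernel(X, prototypes, gamma=0.1)
      K_mm = rbf_kernel(prototypes, prototypes, gamma=0.1)
      K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T

      # Compare against the exact kernel on a small subsample to gauge the error.
      idx = rng.choice(len(X), size=200, replace=False)
      K_exact = rbf_kernel(X[idx], X[idx], gamma=0.1)
      err = np.linalg.norm(K_approx[np.ix_(idx, idx)] - K_exact) / np.linalg.norm(K_exact)
      print(f"relative error of the low-rank approximation: {err:.3f}")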

  17. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of the power system balance and thus cause deviations in the power system...... frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns the secure and stable grid operation. To ensure the stable power system operation, the evolving power system has...... to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with the large scale wind power integration, where a power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  18. FEASIBILITY OF LARGE-SCALE OCEAN CO2 SEQUESTRATION

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Peter Brewer; Dr. James Barry

    2002-09-30

    We have continued to carry out creative small-scale experiments in the deep ocean to investigate the science underlying questions of possible future large-scale deep-ocean CO2 sequestration as a means of ameliorating greenhouse gas growth rates in the atmosphere. This project is closely linked to additional research funded by the DoE Office of Science, and to support from the Monterey Bay Aquarium Research Institute. The listing of project achievements here over the past year reflects these combined resources. Within the last project year we have: (1) Published a significant workshop report (58 pages) entitled "Direct Ocean Sequestration Expert's Workshop", based upon a meeting held at MBARI in 2001. The report is available both in hard copy, and on the NETL web site. (2) Carried out three major, deep ocean (3600 m) cruises to examine the physical chemistry, and biological consequences, of several liter quantities released on the ocean floor. (3) Carried out two successful short cruises in collaboration with Dr. Izuo Aya and colleagues (NMRI, Osaka, Japan) to examine the fate of cold (-55 C) CO2 released at relatively shallow ocean depth. (4) Carried out two short cruises in collaboration with Dr. Costas Tsouris, ORNL, to field test an injection nozzle designed to transform liquid CO2 into a hydrate slurry at ~1000 m depth. (5) In collaboration with Prof. Jill Pasteris (Washington University) we have successfully accomplished the first field test of a deep ocean laser Raman spectrometer for probing in situ the physical chemistry of the CO2 system. (6) Submitted the first major paper on biological impacts as determined from our field studies. (7) Submitted a paper on our measurements of the fate of a rising stream of liquid CO2 droplets to Environmental Science & Technology. (8) Have had accepted for publication in Eos the first brief account of the laser Raman spectrometer success. (9) Have had two

  19. Advanced I/O for large-scale scientific applications.

    Energy Technology Data Exchange (ETDEWEB)

    Klasky, Scott (Oak Ridge National Laboratory, Oak Ridge, TN); Schwan, Karsten (Georgia Institute of Technology, Atlanta, GA); Oldfield, Ron A.; Lofstead, Gerald F., II (Georgia Institute of Technology, Atlanta, GA)

    2010-01-01

    As scientific simulations scale to use petascale machines and beyond, the data volumes generated pose a dual problem. First, with increasing machine sizes, the careful tuning of IO routines becomes more and more important to keep the time spent in IO acceptable. It is not uncommon, for instance, to have 20% of an application's runtime spent performing IO in a 'tuned' system. Careful management of the IO routines can move that to 5% or even less in some cases. Second, the data volumes are so large, on the order of 10s to 100s of TB, that trying to discover the scientifically valid contributions requires assistance at runtime to both organize and annotate the data. Waiting for offline processing is not feasible due both to the impact on the IO system and the time required. To reduce this load and improve the ability of scientists to use the large amounts of data being produced, new techniques for data management are required. First, there is a need for techniques for efficient movement of data from the compute space to storage. These techniques should understand the underlying system infrastructure and adapt to changing system conditions. Technologies include aggregation networks, data staging nodes for a closer parity to the IO subsystem, and autonomic IO routines that can detect system bottlenecks and choose different approaches, such as splitting the output into multiple targets, staggering output processes. Such methods must be end-to-end, meaning that even with properly managed asynchronous techniques, it is still essential to properly manage the later synchronous interaction with the storage system to maintain acceptable performance. Second, for the data being generated, annotations and other metadata must be incorporated to help the scientist understand output data for the simulation run as a whole, to select data and data features without concern for what files or other storage technologies were employed. All of these features should be

  20. Advanced I/O for large-scale scientific applications

    International Nuclear Information System (INIS)

    Klasky, Scott; Schwan, Karsten; Oldfield, Ron A.; Lofstead, Gerald F. II

    2010-01-01

    As scientific simulations scale to use petascale machines and beyond, the data volumes generated pose a dual problem. First, with increasing machine sizes, the careful tuning of IO routines becomes more and more important to keep the time spent in IO acceptable. It is not uncommon, for instance, to have 20% of an application's runtime spent performing IO in a 'tuned' system. Careful management of the IO routines can move that to 5% or even less in some cases. Second, the data volumes are so large, on the order of 10s to 100s of TB, that trying to discover the scientifically valid contributions requires assistance at runtime to both organize and annotate the data. Waiting for offline processing is not feasible due both to the impact on the IO system and the time required. To reduce this load and improve the ability of scientists to use the large amounts of data being produced, new techniques for data management are required. First, there is a need for techniques for efficient movement of data from the compute space to storage. These techniques should understand the underlying system infrastructure and adapt to changing system conditions. Technologies include aggregation networks, data staging nodes for a closer parity to the IO subsystem, and autonomic IO routines that can detect system bottlenecks and choose different approaches, such as splitting the output into multiple targets, staggering output processes. Such methods must be end-to-end, meaning that even with properly managed asynchronous techniques, it is still essential to properly manage the later synchronous interaction with the storage system to maintain acceptable performance. Second, for the data being generated, annotations and other metadata must be incorporated to help the scientist understand output data for the simulation run as a whole, to select data and data features without concern for what files or other storage technologies were employed. All of these features should be attained while

  1. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in the last years as a new tool to improve the traditional, stationary based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activities. We investigate for individual sites the exceedance probability in which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
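
    The reduced-dimension clustering workflow mentioned above can be sketched as follows; the study uses supervised kernel principal component analysis on reanalysis moisture-flux fields, whereas this Python stand-in uses ordinary kernel PCA and k-means on synthetic data, so it only illustrates the shape of the analysis.

      # Minimal sketch of the dimensionality-reduction-plus-clustering idea above,
      # using ordinary kernel PCA and k-means on synthetic "moisture flux" fields.
      # The study uses supervised kernel PCA on reanalysis data; this stand-in only
      # illustrates the workflow, with invented data and arbitrary parameters.
      import numpy as np
      from sklearn.decomposition import KernelPCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(1)
      n_events, n_gridcells = 300, 500
      flux_fields = rng.normal(size=(n_events, n_gridcells))   # one row per flood event

      # Project the high-dimensional fields into a low-dimensional space.
      kpca = KernelPCA(n_components=3, kernel="rbf", gamma=1e-3)
      low_dim = kpca.fit_transform(flux_fields)

      # Cluster events in the reduced space, e.g. to separate locally driven floods
      # from floods dominated by large-scale moisture transport.
      labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(low_dim)
      print("events per cluster:", np.bincount(labels))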

  2. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory inter-related steps, from finding gene families, to generating alignments and phylogenetic trees, and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  3. Large aperture nanocomposite deformable mirror technology

    Science.gov (United States)

    Chen, Peter C.; Hale, Richard D.

    2007-12-01

    We report progress in the development of deformable mirrors (DM) using nanocomposite materials. For the extremely large telescopes (ELTs) currently being planned, a new generation of DMs with unprecedented performance is a critical path item. The DMs need to have large apertures (meters), continuous surfaces, and low microroughness. Most importantly, they must have excellent static optical figures and yet be sufficiently thin (1-2 mm) and flexible to function with small, low powered actuators. Carbon fiber reinforced plastics (CFRP) have the potential to fulfill these requirements. However, CFRP mirrors made using direct optical replication have encountered a number of problems. Firstly, it is difficult if not impossible for a CFRP mirror to maintain a good static optical figure if a small number of plies are used, but adding more plies to the laminate tends to make the substrate too thick and stiff. Secondly, direct optical replication requires precision mandrels, the costs of which become prohibitive at multi-meter apertures. We report development of a new approach. By using a combination of a novel support structure, selected fibers, and binding resins infused with nanoparticles, it is possible to make millimeter thick optical mirrors that can both maintain good static optical figures and yet still have the required flexibility for actuation. Development and refinement of a non-contact, deterministic process of fine figuring permits generation of accurate optical surfaces without the need for precision optical mandrels. We present data from tests that have been carried out to demonstrate these new processes. A number of flat DMs have been fabricated, as well as concave and convex DMs in spherical, parabolic, and other forms.

  4. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards from a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.

  5. Modeling large scale cohesive sediment transport affected by small scale biological activity

    NARCIS (Netherlands)

    Borsje, Bastiaan Wijnand; de Vries, Mindert; Hulscher, Suzanne J.M.H.; de Boer, Gerben J.

    2008-01-01

    Biological activity on the bottom of the seabed is known to have significant influence on the dynamics of cohesive sediment on a small spatial and temporal scale. In this study, we aim to understand the large-scale effects of small-scale biological activity. Hereto, effects of biology are

  6. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
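
    The per-pixel linked list described above is built on the GPU in the thesis; the toy CPU-side Python sketch below only illustrates the data layout and a simple filter query, with invented segment attributes.

      # Toy CPU-side sketch of a per-pixel list of pathline segments. The thesis
      # builds this structure on the GPU for real-time use; this only illustrates
      # the data layout and a simple filter/color-code style query.
      from collections import defaultdict

      class PixelSegmentLists:
          def __init__(self, width, height):
              self.width, self.height = width, height
              self.lists = defaultdict(list)     # (x, y) -> list of segment records

          def insert(self, x, y, pathline_id, depth, attribute):
              if 0 <= x < self.width and 0 <= y < self.height:
                  self.lists[(x, y)].append((depth, pathline_id, attribute))

          def query(self, x, y, attr_min=None, attr_max=None):
              """Return segments at a pixel, front-to-back, optionally filtered."""
              segs = sorted(self.lists.get((x, y), []))
              if attr_min is not None:
                  segs = [s for s in segs if s[2] >= attr_min]
              if attr_max is not None:
                  segs = [s for s in segs if s[2] <= attr_max]
              return segs

      # Example: two pathline segments projected onto the same pixel.
      img = PixelSegmentLists(640, 480)
      img.insert(10, 20, pathline_id=1, depth=0.3, attribute=5.0)   # e.g. velocity
      img.insert(10, 20, pathline_id=7, depth=0.1, attribute=9.0)
      print(img.query(10, 20, attr_min=4.0))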

  7. Dynamic scaling and large scale effects in turbulence in compressible stratified fluid

    International Nuclear Information System (INIS)

    Pharasi, Hirdesh K.; Bhattacharjee, Jayanta K.

    2016-01-01

    We consider the propagation of sound in a turbulent fluid which is confined between two horizontal parallel plates, maintained at different temperatures. In the homogeneous fluid, Staroselsky et al. had predicted a divergent sound speed at large length scales. Here we find a divergent sound speed and a vanishing expansion coefficient at large length scales. Dispersion relation and the question of scale invariance at large distance scales lead to these results. - Highlights: • Turbulence in a stratified fluid has been studied in the Boussinesq approximation. • We extend this study to include density fluctuations due to pressure fluctuations. • For a homogeneous weakly compressible fluid the sound speed is known to become scale dependent. • For the stratified fluid we show that the expansion coefficient is also scale dependent. • Our results are based on general dynamic scaling arguments rather than detailed calculation.

  8. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...... of the parameter vector. A new design matrix free algorithm is proposed for computing the penalized maximum likelihood estimate for GLAMs, which, in particular, handles nondifferentiable penalty functions. The proposed algorithm is implemented and available via the R package glamlasso. It combines several ideas...
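
    The tensor-product structure that makes a "design matrix free" algorithm possible can be seen in the two-dimensional case, where the Kronecker-product design matrix is applied through the standard identity vec(X1 B X2^T) = (X2 kron X1) vec(B). The Python sketch below checks this identity on random matrices; it does not reproduce the penalized estimation in glamlasso.

      # Sketch of the "design matrix free" idea for a 2-D GLAM: the tensor product
      # design matrix X2 (kron) X1 is never formed; it is applied through the identity
      # vec(X1 @ B @ X2.T) = (X2 kron X1) @ vec(B). This illustrates only the
      # matrix-free multiplication, not the penalized estimation itself.
      import numpy as np

      rng = np.random.default_rng(2)
      n1, n2, p1, p2 = 40, 30, 8, 6
      X1 = rng.normal(size=(n1, p1))        # marginal design matrix, dimension 1
      X2 = rng.normal(size=(n2, p2))        # marginal design matrix, dimension 2
      B = rng.normal(size=(p1, p2))         # coefficient array

      # Matrix-free evaluation of the linear predictor array (n1 x n2).
      eta_array = X1 @ B @ X2.T

      # Check against the explicit Kronecker-product design matrix (small case only).
      eta_vec = np.kron(X2, X1) @ B.flatten(order="F")
      assert np.allclose(eta_array.flatten(order="F"), eta_vec)
      print("matrix-free and explicit Kronecker evaluations agree")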

  9. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network's emergency response capabilities, alleviate the damage caused by cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithms and technology of the system are the main focus. The planar visualization of the large-scale network system is realized based on a divide-and-conquer approach. First, the topology of the large-scale network is divided into small-scale networks by the MLkP/CR algorithm. Second, a subgraph planar visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a single topology by an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology visualization problem into a series of small-scale visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
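
    The divide-and-conquer layout strategy described above can be sketched with generic stand-ins: the Python example below uses greedy modularity communities in place of the MLkP/CR partitioning and a spring layout plus a coarse grid in place of the force-analysis distribution algorithm, so it illustrates the structure of the approach rather than the paper's algorithms.

      # Divide-and-conquer layout sketch: split a large topology into smaller
      # communities, lay each out separately, then place the pieces on a grid.
      # Community detection and spring layout are generic stand-ins here.
      import math
      import networkx as nx

      G = nx.barabasi_albert_graph(500, 2, seed=0)          # synthetic large topology

      # 1. Partition the topology into small-scale subnetworks.
      communities = list(nx.algorithms.community.greedy_modularity_communities(G))

      # 2. Lay out each subnetwork independently.
      positions = {}
      grid = math.ceil(math.sqrt(len(communities)))
      for i, nodes in enumerate(communities):
          sub_pos = nx.spring_layout(G.subgraph(nodes), seed=0)
          # 3. Place each small layout in its own cell of a coarse grid.
          offset_x, offset_y = 3.0 * (i % grid), 3.0 * (i // grid)
          for node, (x, y) in sub_pos.items():
              positions[node] = (x + offset_x, y + offset_y)

      print(f"{len(communities)} subnetworks laid out, {len(positions)} nodes placed")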

  10. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
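
    The random access bottleneck can be illustrated with a back-of-the-envelope contention simulation: devices pick preambles at random and any preamble chosen by more than one device collides. The Python sketch below uses illustrative numbers that are not taken from the article.

      # Contention-based random access, simplified: many IoT devices pick one of a
      # fixed number of preambles; a preamble chosen by more than one device
      # collides. All parameter values are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(3)
      n_preambles = 54                  # illustrative, LTE-like order of magnitude

      for n_devices in (10, 50, 200, 1000):
          successes = 0
          trials = 2000
          for _ in range(trials):
              choices = rng.integers(0, n_preambles, size=n_devices)
              counts = np.bincount(choices, minlength=n_preambles)
              successes += np.sum(counts == 1)   # preambles picked by exactly one device
          mean_success = successes / trials
          print(f"{n_devices:5d} devices -> {mean_success:6.1f} collision-free accesses per slot")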

  11. Privacy Preserving Large-Scale Rating Data Publishing

    Directory of Open Access Journals (Sweden)

    Xiaoxun Sun

    2013-02-01

    Large-scale rating data usually contains both ratings of sensitive and non-sensitive issues, and the ratings of sensitive issues belong to personal privacy. Even when survey participants do not reveal any of their ratings, their survey records are potentially identifiable using information from other public sources. In order to protect privacy in large-scale rating data, it is important to propose new privacy principles which consider the properties of the rating data. Moreover, given a privacy principle, how to efficiently determine whether the rating data satisfies the required principle is crucial as well. Furthermore, if the privacy principle is not satisfied, an efficient method is needed to securely publish the large-scale rating data. In this paper, all these problems are addressed.

  12. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-digital converter with recording on a laptop. The recorded results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases where permissible values were exceeded, recommendations were developed to reduce the level of seismic impact.

  13. Partitioning Large Scale Deep Belief Networks Using Dropout

    OpenAIRE

    Huang, Yanping; Zhang, Sai

    2015-01-01

    Deep learning methods have shown great promise in many practical applications, ranging from speech recognition, visual object recognition, to text processing. However, most of the current deep learning methods suffer from scalability problems for large-scale applications, forcing researchers or users to focus on small-scale problems with fewer parameters. In this paper, we consider a well-known machine learning model, deep belief networks (DBNs) that have yielded impressive classification per...

  14. Thermal System Analysis and Optimization of Large-Scale Compressed Air Energy Storage (CAES)

    Directory of Open Access Journals (Sweden)

    Zhongguang Fu

    2015-08-01

    As an important solution to issues regarding peak load and renewable energy resources on grids, large-scale compressed air energy storage (CAES) power generation technology has recently become a popular research topic in the area of large-scale industrial energy storage. At present, the combination of high-expansion ratio turbines with advanced gas turbine technology is an important breakthrough in energy storage technology. In this study, a new gas turbine power generation system is coupled with current CAES technology. Moreover, a thermodynamic cycle system is optimized by calculating the parameters of the thermodynamic system. Results show that the thermal efficiency of the new system increases by at least 5% over that of the existing system.
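
    The kind of cycle bookkeeping behind such a thermodynamic analysis can be sketched with an ideal-gas, isentropic-stage calculation; the Python example below uses assumed stage pressure ratios and turbine inlet temperature purely for illustration and does not reproduce the optimized parameters of the paper.

      # Minimal ideal-gas sketch of CAES-style cycle bookkeeping: isentropic
      # compression/expansion work per unit mass and a simple work ratio. All
      # numbers are illustrative assumptions, not the paper's optimized values.
      cp = 1005.0        # J/(kg K), specific heat of air at constant pressure
      gamma = 1.4        # heat capacity ratio of air
      T_ambient = 293.0  # K

      def isentropic_work(T_in, pressure_ratio):
          """Specific work (J/kg) and outlet temperature for one isentropic stage."""
          T_out = T_in * pressure_ratio ** ((gamma - 1.0) / gamma)
          return cp * (T_out - T_in), T_out

      # Two compression stages of pressure ratio 8 with intercooling to ambient.
      w_c1, _ = isentropic_work(T_ambient, 8.0)
      w_c2, _ = isentropic_work(T_ambient, 8.0)
      compression_work = w_c1 + w_c2

      # Two expansion stages of pressure ratio 1/8, each reheated to 900 K in a
      # gas-turbine-style combustor, so the total pressure ratio matches storage.
      w_e1, _ = isentropic_work(900.0, 1.0 / 8.0)
      w_e2, _ = isentropic_work(900.0, 1.0 / 8.0)
      expansion_work = -(w_e1 + w_e2)    # sign flipped: work delivered by the turbine

      print(f"compression work   : {compression_work / 1e3:6.1f} kJ/kg")
      print(f"expansion work     : {expansion_work / 1e3:6.1f} kJ/kg")
      # Ratio exceeds 1 because reheat fuel adds energy on the expansion side.
      print(f"work ratio (out/in): {expansion_work / compression_work:.2f}")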

  15. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on individual scale have been extensively studied due to the application potential on human behavior prediction and recommendation, and control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both websites browse and mobile towers visit. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of scale-free random walks known as Lévy flight. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and a high possibility for prediction. Furthermore, a scale-free featured mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
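
    The proposed model combines preferential return and exploration with a Gaussian-distributed exploration tendency; the Python sketch below implements a generic exploration-and-preferential-return walker along those lines, with all parameter values invented for illustration.

      # Sketch of an exploration-and-preferential-return mobility model: at each
      # step an individual either explores a new location or returns to a
      # previously visited one proportionally to past visit counts. The
      # exploration-tendency parameter is drawn from a Gaussian; all numbers here
      # are illustrative, not fitted values from the study.
      import numpy as np

      rng = np.random.default_rng(4)

      def simulate_individual(n_steps, rho, gamma=0.6):
          visits = {0: 1}                       # location id -> visit count
          next_location_id = 1
          for _ in range(n_steps):
              n_visited = len(visits)
              p_explore = rho * n_visited ** (-gamma)
              if rng.random() < p_explore:      # explore a brand-new location
                  visits[next_location_id] = 1
                  next_location_id += 1
              else:                             # preferential return
                  locs = list(visits)
                  weights = np.array([visits[l] for l in locs], dtype=float)
                  chosen = rng.choice(locs, p=weights / weights.sum())
                  visits[chosen] += 1
          return len(visits)

      # Heterogeneous population: exploration tendency rho ~ Gaussian, clipped to (0, 1].
      rhos = np.clip(rng.normal(loc=0.6, scale=0.15, size=200), 0.05, 1.0)
      distinct_locations = [simulate_individual(1000, rho) for rho in rhos]
      print("mean distinct locations visited:", np.mean(distinct_locations))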

  16. Large-scale liquid scintillation detectors for solar neutrinos

    Energy Technology Data Exchange (ETDEWEB)

    Benziger, Jay B.; Calaprice, Frank P. [Princeton University Princeton, Princeton, NJ (United States)

    2016-04-15

    Large-scale liquid scintillation detectors are capable of providing spectral yields of the low energy solar neutrinos. These detectors require > 100 tons of liquid scintillator with high optical and radiopurity. In this paper requirements for low-energy neutrino detection by liquid scintillation are specified and the procedures to achieve low backgrounds in large-scale liquid scintillation detectors for solar neutrinos are reviewed. The designs, operations and achievements of Borexino, KamLAND and SNO+ in measuring the low-energy solar neutrino fluxes are reviewed. (orig.)

  17. Large Scale Anomalies of the Cosmic Microwave Background with Planck

    DEFF Research Database (Denmark)

    Frejsel, Anne Mette

    This thesis focuses on the large scale anomalies of the Cosmic Microwave Background (CMB) and their possible origins. The investigations consist of two main parts. The first part is on statistical tests of the CMB, and the consistency of both maps and power spectrum. We find that the Planck data.... Here we find evidence that the Planck CMB maps contain residual radiation in the loop areas, which can be linked to some of the large scale CMB anomalies: the point-parity asymmetry, the alignment of quadrupole and octupole and the dipole modulation....

  18. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  19. The survey of large-scale query classification

    Science.gov (United States)

    Zhou, Sanduo; Cheng, Kefei; Men, Lijun

    2017-04-01

    In recent years, much research has been done on query classification. The paper reviews recent research on query classification in detail, covering the sources of query logs, category systems, feature extraction methods, classification methods, and evaluation methodology. It then discusses the issues of large-scale query classification and the methods used to address them in combination with big data analysis systems. The review shows that several problems and challenges remain, such as the lack of an authoritative classification system and evaluation methodology, the efficiency of feature extraction, uncertainty about performance on large-scale query logs, and further query classification on big data platforms.
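
    A minimal feature-extraction-plus-classifier pipeline of the kind the surveyed work builds on is sketched below in Python; the queries and category labels are invented for illustration.

      # Baseline query-classification sketch: bag-of-words (TF-IDF) features plus a
      # linear classifier. Queries and category labels are invented examples.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.linear_model import LogisticRegression
      from sklearn.pipeline import make_pipeline

      queries = ["cheap flights to paris", "python list comprehension",
                 "symptoms of flu", "java hashmap iterate",
                 "hotel deals rome", "fever and sore throat remedies"]
      labels = ["travel", "programming", "health", "programming", "travel", "health"]

      clf = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)),
                          LogisticRegression(max_iter=1000))
      clf.fit(queries, labels)
      print(clf.predict(["flight and hotel package barcelona",
                         "how to sort a dict in python"]))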

  20. The CLASSgal code for Relativistic Cosmological Large Scale Structure

    CERN Document Server

    Di Dio, Enea; Lesgourgues, Julien; Durrer, Ruth

    2013-01-01

    We present some accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum Cl(z1,z2) and the corresponding correlation function xi(theta,z1,z2) in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
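
    The power spectrum and the correlation function quoted above are related by the standard Legendre sum xi(theta) = sum_l (2l+1)/(4 pi) C_l P_l(cos theta); the Python sketch below evaluates this sum for a toy spectrum and does not call the CLASSgal code itself.

      # Evaluate the angular correlation function from an angular power spectrum
      # via the Legendre sum xi(theta) = sum_l (2l+1)/(4 pi) C_l P_l(cos theta).
      # The spectrum below is a toy example with an arbitrary amplitude.
      import numpy as np
      from numpy.polynomial import legendre

      ell = np.arange(2, 500)
      C_ell = 1.0e-9 / (ell * (ell + 1.0))          # toy spectrum

      def correlation_function(theta_deg):
          x = np.cos(np.radians(theta_deg))
          # Legendre series with coefficients (2l+1)/(4 pi) * C_l placed at degree l.
          coeffs = np.zeros(ell.max() + 1)
          coeffs[ell] = (2.0 * ell + 1.0) / (4.0 * np.pi) * C_ell
          return legendre.legval(x, coeffs)

      for theta in (0.5, 1.0, 5.0, 10.0):
          print(f"xi({theta:4.1f} deg) = {correlation_function(theta):.3e}")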

  1. [Issues of large scale tissue culture of medicinal plant].

    Science.gov (United States)

    Lv, Dong-Mei; Yuan, Yuan; Zhan, Zhi-Lai

    2014-09-01

    In order to increase the yield and quality of medicinal plants and enhance the competitiveness of the medicinal plant industry in our country, this paper analyzes the status, problems, and countermeasures of large-scale tissue culture of medicinal plants. Although biotechnology is one of the most efficient and promising means of producing medicinal plants, it still faces problems such as the stability of the material, the safety of transgenic medicinal plants, and the optimization of culture conditions. Establishing a sound evaluation system tailored to the characteristics of the medicinal plant is the key measure to ensure the sustainable development of large-scale tissue culture of medicinal plants.

  2. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach...... into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends initial...... and discuss three challenges to address when dealing with large-scale systems development....

  3. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    This paper studies top-flange fatigue damage in large-scale wind turbine generators. It establishes a finite element model of the top-flange connection system with the finite element analysis software MSC Marc/Mentat, analyzes its fatigue strain, simulates flange fatigue load cases with the Bladed software, derives the flange fatigue load spectrum with the rain-flow counting method, and finally performs the fatigue analysis of the top flange with the fatigue analysis software MSC Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide a new approach for flange fatigue analysis of large-scale wind turbine generators and have practical engineering value.

  4. Testing, development and demonstration of large scale solar district heating systems

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Perers, Bengt

    2015-01-01

    In 2013-2014 the project “Testing, development and demonstration of large scale solar district heating systems” was carried out within the Sino-Danish Renewable Energy Development Programme, the so-called RED programme jointly developed by the Chinese and Danish governments. In the project, Danish know-how on solar heating plants and solar heating test technology has been transferred from Denmark to China, large solar heating systems have been promoted in China, test capabilities on solar collectors and large scale solar heating systems have been improved in China, and Danish-Chinese cooperation......

  5. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtimes are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. The memory-shared architecture is used to construct the similarity matrix, and the distributed system is used for the affinity propagation algorithm because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene (microarray) data and detecting families in large protein superfamilies.
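
    The serial building block that the paper parallelizes can be sketched with scikit-learn's affinity propagation on a precomputed similarity matrix; the Python example below shows that building block only, not the memory-shared or distributed parallelization.

      # Building-block sketch: affinity propagation on a precomputed
      # negative-squared-distance similarity matrix. The parallelization described
      # in the paper (shared-memory similarity construction, distributed AP) is
      # not reproduced here; data and parameters are illustrative.
      import numpy as np
      from sklearn.cluster import AffinityPropagation
      from sklearn.metrics.pairwise import euclidean_distances

      rng = np.random.default_rng(5)
      X = np.vstack([rng.normal(loc=c, scale=0.3, size=(50, 4)) for c in (0.0, 3.0, 6.0)])

      # Similarity matrix construction is the step the paper accelerates with a
      # memory-shared architecture; here it is a plain dense computation.
      S = -euclidean_distances(X, squared=True)

      ap = AffinityPropagation(affinity="precomputed", random_state=0).fit(S)
      print("number of clusters found:", len(ap.cluster_centers_indices_))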

  6. Visualizing large-scale uncertainty in astrophysical data.

    Science.gov (United States)

    Li, Hongwei; Fu, Chi-Wing; Li, Yinggang; Hanson, Andrew

    2007-01-01

    Visualization of uncertainty or error in astrophysical data is seldom available in simulations of astronomical phenomena, and yet almost all rendered attributes possess some degree of uncertainty due to observational error. Uncertainties associated with spatial location typically vary significantly with scale and thus introduce further complexity in the interpretation of a given visualization. This paper introduces effective techniques for visualizing uncertainty in large-scale virtual astrophysical environments. Building upon our previous transparently scalable visualization architecture, we develop tools that enhance the perception and comprehension of uncertainty across wide scale ranges. Our methods include a unified color-coding scheme for representing log-scale distances and percentage errors, an ellipsoid model to represent positional uncertainty, an ellipsoid envelope model to expose trajectory uncertainty, and a magic-glass design supporting the selection of ranges of log-scale distance and uncertainty parameters, as well as an overview mode and a scalable WIM tool for exposing the magnitudes of spatial context and uncertainty.

  7. Technology requirements and readiness for very large vehicles

    Science.gov (United States)

    Conner, D. W.

    1979-01-01

    Common concerns of very large vehicles in the areas of economics, transportation system interfaces and operational problems were reviewed regarding their influence on vehicle configurations and technology. Fifty-four technology requirements were identified which are judged to be unique, or particularly critical, to very large vehicles. The requirements were about equally divided among the four general areas of aero/hydrodynamics, propulsion and acoustics, structures, and vehicle systems and operations. The state of technology readiness was judged to be poor to fair for slightly more than one half of the requirements. In the classic disciplinary areas, the state of technology readiness appears to be more advanced than for vehicle systems and operations.

  8. Design of large-scale enterprise interoperable value webs

    NARCIS (Netherlands)

    Hofman, W.J.

    2011-01-01

    Many enterprises are still faced with the issue of interoperability. Whereas large enterprises are able to implement the required technology, SMEs (Small and Medium sized Enterprises) face challenges as they lack knowledge and budget. Enterprises have defined their specific semantics and

  9. 78 FR 70076 - Large Scale Networking (LSN)-Joint Engineering Team (JET)

    Science.gov (United States)

    2013-11-22

    ... NATIONAL SCIENCE FOUNDATION Large Scale Networking (LSN)--Joint Engineering Team (JET) AGENCY: The Networking and Information Technology Research and Development (NITRD) National Coordination Office (NCO... Federal agencies and non-Federal participants with interest in high performance research networking and...

  10. Toward the Development and Deployment of Large-Scale Carbon Dioxide Capture and Conversion Processes

    DEFF Research Database (Denmark)

    Yuan, Zhihong; Eden, Mario R.; Gani, Rafiqul

    2016-01-01

    conversion processes are among the grand challenges faced by today's scientists and engineers. Very few of the reported CO2 capture and conversion technologies have been employed for industrial installations on a large scale, where high-efficiency, cost/energy-effectiveness, and environmental friendliness...

  11. Large-scale agent-based social simulation : A study on epidemic prediction and control

    NARCIS (Netherlands)

    Zhang, M.

    2016-01-01

    Large-scale agent-based social simulation is gradually proving to be a versatile methodological approach for studying human societies, which could make contributions from policy making in social science, to distributed artificial intelligence and agent technology in computer science, and to theory

  12. Psychology in an Interdisciplinary Setting: A Large-Scale Project to Improve University Teaching

    Science.gov (United States)

    Koch, Franziska D.; Vogt, Joachim

    2015-01-01

    At a German university of technology, a large-scale project was funded as a part of the "Quality Pact for Teaching", a programme launched by the German Federal Ministry of Education and Research to improve the quality of university teaching and study conditions. The project aims at intensifying interdisciplinary networking in teaching,…

  13. Incremental development of large-scale human-robot teamwork in disaster response environments

    NARCIS (Netherlands)

    Greeff, J. de; Smets, N.J.J.M.; Hindriks, K.; Neerincx, M.A.; Kruijff-Korbayová, I.

    2017-01-01

    We report on the latest large-scale disaster-response exercise conducted by our project, which involves a robotic system with both ground robots (UGVs) and aerial robots (UAVs). In particular, we focus on aspects related to human-robot teaming and the uptake of new technology by end-users.

  14. The Ecological Impacts of Large-Scale Agrofuel Monoculture Production Systems in the Americas

    Science.gov (United States)

    Altieri, Miguel A.

    2009-01-01

    This article examines the expansion of agrofuels in the Americas and the ecological impacts associated with the technologies used in the production of large-scale monocultures of corn and soybeans. In addition to deforestation and displacement of lands devoted to food crops due to expansion of agrofuels, the massive use of transgenic crops and…

  15. Power ramp limitation and frequency support in large scale PVPPs without storage

    DEFF Research Database (Denmark)

    Craciun, Bogdan-Ionut; Spataru, Sergiu; Kerekes, Tamas

    2013-01-01

    Photovoltaic (PV) power generation has started to become a mature technology, and large scale PV Power Plants (PVPPs) operating in Maximum Power Point Tracking (MPPT) are not a solution anymore. During changes in the meteorological conditions, the PVPPs' output is directly influenced, creating high power...

  16. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe impact on southern Taiwan awakened public awareness of large-scale landslide disasters. Large-scale landslide disasters produce large quantities of sediment, which negatively affect the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation and disaster prevention is necessary. Real-time data and numerous archives of engineering data, environmental information, photos, and video will not only help people make appropriate decisions, but also pose the biggest challenge in terms of processing and adding value. The study defined basic data formats and standards for the various types of data collected about these reservoirs and then provided a management platform based on these formats and standards. Meanwhile, to ensure practicality and convenience, the large-scale landslide disaster database system is built to both provide and receive information, so that users can access it on different types of devices. IT technology progresses extremely quickly, and even the most modern system may become outdated at any time. In order to provide long-term service, the system reserves the possibility of user-defined data formats/standards and a user-defined system structure. The system established by this study is based on the HTML5 standard and uses responsive web design technology, which allows users to easily operate and extend the database system.

  17. Success in large high-technology projects: What really works?

    Science.gov (United States)

    Crosby, P.

    2014-08-01

    Despite a plethora of tools, technologies and management systems, successful execution of big science and engineering projects remains problematic. The sheer scale of globally funded projects such as the Large Hadron Collider and the Square Kilometre Array telescope means that lack of project success can impact both national budgets and collaborative reputations. In this paper, I explore data from contemporary literature alongside field research from several current high-technology projects in Europe and Australia, and reveal common `pressure points' that are shown to be key influencers of project control and success. I discuss how mega-science projects sit between being merely complicated and chaotic, and explain the importance of understanding multiple dimensions of project complexity. Project manager/leader traits are briefly discussed, including capability to govern and control such enterprises. Project structures are examined, including the challenge of collaborations. I show that early attention to building project resilience, curbing optimism, and risk alertness can help prepare large high-tech projects against threats, and why project managers need to understand aspects of `the silent power of time'. Mission assurance is advanced as a critical success function, alongside the deployment of task forces and new combinations of contingency plans. I argue for increased project control through industrial-style project reviews, and show how post-project reviews are an under-used, yet invaluable avenue of personal and organisational improvement. Lastly, I discuss the avoidance of project amnesia through effective capture of project knowledge, and transfer of lessons-learned to subsequent programs and projects.

  18. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s^2 ≈ 10^-6 c^2 and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)^4. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc^-1.
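    For orientation, the structure this abstract describes can be summarized schematically (these are generic textbook-style expressions, not the authors' exact equations or conventions): the long-wavelength fluid obeys continuity and Euler equations with an effective stress, and the leading counterterm corrects the power spectrum as

        % Schematic only; coefficients, signs and conventions vary between papers.
        \begin{align}
          \partial_\tau \delta + \nabla\cdot[(1+\delta)\,\mathbf{v}] &= 0, \\
          \partial_\tau \mathbf{v} + \mathcal{H}\mathbf{v} + (\mathbf{v}\cdot\nabla)\mathbf{v}
            &= -\nabla\phi - \frac{1}{\rho}\,\nabla\cdot\boldsymbol{\tau}_{\rm eff},
          \qquad
          \boldsymbol{\tau}_{\rm eff} \supset \rho\Big(c_s^2\,\delta - \frac{c_{\rm vis}^2}{\mathcal{H}}\,\nabla\cdot\mathbf{v}\Big)\,\mathbb{1}, \\
          P_{\rm EFT}(k) &\simeq P_{\rm 1\text{-}loop}(k) - 2\,c_s^2\,\frac{k^2}{k_{\rm NL}^2}\,P_{11}(k),
        \end{align}

    with c_s^2 ≈ 10^-6 c^2 as quoted above.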

  19. Link between local scale BC emissions and large scale atmospheric solar absorption

    Science.gov (United States)

    Praveen, P. S.; Ahmed, T.; Kar, A.; Rehman, I. H.; Ramanathan, V.

    2011-07-01

    Project Surya has documented indoor and outdoor concentrations of black carbon (BC) from traditional biomass burning cook stoves in a rural village located in the Indo-Gangetic Plains (IGP) region of N. India from November 2009 to September 2010. In this paper, we systematically document the link between local scale aerosol properties and column averaged regional aerosol optical properties and atmospheric radiative forcing. We report observations from the first phase of Project Surya to estimate the source dependent (biomass and fossil fuels) aerosol optical properties from local to regional scale. Data were collected using surface based observations of BC, organic carbon (OC), aerosol light absorption, and scattering coefficient at the Surya village (SVI_1) located in the IGP region, and satellite and AERONET observations at the regional scale (IGP). The daily mean BC concentrations at SVI_1 showed a large increase of BC during the dry season (December to February) with values reaching 35 μg m^-3. Space based LIDAR data reveal how the biomass smoke is trapped within the first kilometre during the dry season and its extension to above 5 km during the pre-monsoon season. As a result, during the dry season the variance in the daily mean SSA and column aerosol optical properties at the local IGP site correlated well (with slopes in the range of 0.85 to 1.06 and R^2 > 0.4) with the "IGP_AERONET" (mean of six AERONET sites), suggesting that in-situ observations at a few locations can be used to infer spatial mean forcing. The atmospheric forcing due to BC and OC exceeded 20 W m^-2 during all months from November to May, leading to the deduction that elimination of cook stove smoke emissions through clean cooking technologies will likely have a major positive impact on health and the regional climate.

  20. Vibration amplitude rule study for rotor under large time scale

    International Nuclear Information System (INIS)

    Yang Xuan; Zuo Jianli; Duan Changcheng

    2014-01-01

    The rotor is an important part of the rotating machinery; its vibration performance is one of the important factors affecting the service life. This paper presents both theoretical analyses and experimental demonstrations of the vibration rule of the rotor under large time scales. The rule can be used for the service life estimation of the rotor. (authors)

  1. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai; Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of large-scale ERP implementation success. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  2. Fractals and the Large-Scale Structure in the Universe

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 7; Issue 2. Fractals and the Large-Scale Structure in the Universe - Introduction and Basic Concepts. A K Mittal T R Seshadri. General Article Volume 7 Issue 2 February 2002 pp 6-19 ...

  3. Chain Analysis for large-scale Communication systems

    NARCIS (Netherlands)

    Grijpink, Jan|info:eu-repo/dai/nl/095130861

    2010-01-01

    The chain concept is introduced to explain how large-scale information infrastructures so often fail and sometimes even backfire. Next, the assessment framework of the doctrine of Chain-computerisation and its chain analysis procedure are outlined. In this procedure chain description precedes

  4. Water Implications of Large-Scale Land Acquisitions in Ghana

    Directory of Open Access Journals (Sweden)

    Timothy Olalekan Williams

    2012-06-01

    The paper offers recommendations which can help the government to achieve its stated objective of developing a "policy framework and guidelines for large-scale land acquisitions by both local and foreign investors for biofuels that will protect the interests of investors and the welfare of Ghanaian farmers and landowners".

  5. Origin of large-scale cell structure in the universe

    International Nuclear Information System (INIS)

    Zel'dovich, Y.B.

    1982-01-01

    A qualitative explanation is offered for the characteristic global structure of the universe, wherein ''black'' regions devoid of galaxies are surrounded on all sides by closed, comparatively thin, ''bright'' layers populated by galaxies. The interpretation rests on some very general arguments regarding the growth of large-scale perturbations in a cold gas

  6. Newton Methods for Large Scale Problems in Machine Learning

    Science.gov (United States)

    Hansen, Samantha Leigh

    2014-01-01

    The focus of this thesis is on practical ways of designing optimization algorithms for minimizing large-scale nonlinear functions with applications in machine learning. Chapter 1 introduces the overarching ideas in the thesis. Chapters 2 and 3 are geared towards supervised machine learning applications that involve minimizing a sum of loss…

  7. Large-Scale Machine Learning for Classification and Search

    Science.gov (United States)

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  8. Fractals and the Large-Scale Structure in the Universe

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 7; Issue 4. Fractals and the Large-Scale Structure in the Universe - Is the Cosmological Principle Valid? A K Mittal T R Seshadri. General Article Volume 7 Issue 4 April 2002 pp 39-47 ...

  9. Evaluating Large-scale National Public Management Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Morten Balle

    This article explores differences and similarities between two evaluations of large-scale administrative reforms which were carried out in the 2000s: The evaluation of the Norwegian NAV reform (EVANAV) and the evaluation of the Danish Local Government Reform (LGR). We provide a comparative analysis...

  10. Invertebrates or iron: does large-scale opencast mining impact ...

    African Journals Online (AJOL)

    The results were, however, confounded by the fact that the resting eggs of pan inhabitants could remain dormant in the sediment for decades; suggesting that ... Similarly, the preservation of conservation areas and a landscape wide management system were proposed to ensure that large-scale ecological process are not ...

  11. Reconsidering Replication: New Perspectives on Large-Scale School Improvement

    Science.gov (United States)

    Peurach, Donald J.; Glazer, Joshua L.

    2012-01-01

    The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…

  12. The large scale microwave background anisotropy in decaying particle cosmology

    International Nuclear Information System (INIS)

    Panek, M.

    1987-06-01

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay z_d < 3-5. 12 refs., 2 figs

  13. Resolute large scale mining company contribution to health services of

    African Journals Online (AJOL)

    Introduction: In 1995 the Tanzanian Government reformed the mining industry and the new policy allowed the involvement of multinational companies, but the communities living near new large-scale gold mines were expected to benefit from the industry in terms of socio-economic development, health, education, employment, safe drinking ...

  14. Large Scale Magnetic Fields: Density Power Spectrum in Redshift ...

    Indian Academy of Sciences (India)

    …magnetic fields to have significant impact on the large scale structure at present. Magnetic fields of a more recent ... are produced at the time of inflation in the very early universe. Larger surveys like the on-going ... fields and their impact on redshift space power spectrum and give our main results. In section 4 we summarize our ...

  15. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

    Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation procedure with the dual variables.
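    To make the mechanism concrete, here is a minimal sketch of dual decomposition for a shared power-balance constraint (illustrative only; the quadratic cost model, step size, and all names are assumptions, not the authors' formulation): each unit solves a small local problem given a price signal, and a coordinator updates the price (the dual variable) by subgradient ascent until total consumption matches the target.

        import numpy as np

        # Illustrative assumption: each unit has a quadratic discomfort for deviating
        # from its preferred consumption, with bounds on what it can consume.
        def local_subproblem(price, preferred, lo, hi, weight=1.0):
            """Each unit minimises weight*(p - preferred)^2 + price*p over [lo, hi]."""
            p_unconstrained = preferred - price / (2.0 * weight)
            return np.clip(p_unconstrained, lo, hi)

        def dual_decomposition(target, preferred, lo, hi, steps=200, alpha=0.05):
            """Coordinate units so total consumption tracks `target` (power balance)."""
            price = 0.0
            for _ in range(steps):
                p = np.array([local_subproblem(price, pr, l, h)
                              for pr, l, h in zip(preferred, lo, hi)])
                imbalance = p.sum() - target          # subgradient of the dual function
                price += alpha * imbalance            # dual (price) update
            return p, price

        if __name__ == "__main__":
            consumption, clearing_price = dual_decomposition(
                target=10.0,
                preferred=np.array([4.0, 5.0, 6.0]),
                lo=np.zeros(3), hi=np.full(3, 6.0))
            print(consumption, consumption.sum(), clearing_price)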

  16. Temporal Variation of Large Scale Flows in the Solar Interior ...

    Indian Academy of Sciences (India)

    [Figure caption recovered from the article] Figure 2. Zonal and meridional components of the time-dependent residual velocity at a few selected depths, as marked above each panel, are plotted as contours of constant velocity in the longitude-latitude plane. The left panels show the zonal component, ...

  17. The Cosmology Large Angular Scale Surveyor (CLASS) Telescope Architecture

    Science.gov (United States)

    Chuss, David T.; Ali, Aamir; Amiri, Mandana; Appel, John W.; Araujo, Derek; Bennett, Charles L.; Boone, Fletcher; Chan, Manwei; Cho, Hsiao-Mei; Colazo, Felipe

    2014-01-01

    We describe the instrument architecture of the Johns Hopkins University-led CLASS instrument, a groundbased cosmic microwave background (CMB) polarimeter that will measure the large-scale polarization of the CMB in several frequency bands to search for evidence of inflation.

  18. Description of a Large-Scale Micro-Teaching Program.

    Science.gov (United States)

    Webb, Clark; And Others

    This report describes the implementation of a large-scale program at Brigham Young University to provide for at least one microteaching experience for each of 730 students enrolled in a beginning education course. A definition of microteaching (the creation of a miniature teaching situation under controlled conditions) and the elements which make…

  19. Small and large scale genomic DNA isolation protocol for chickpea ...

    African Journals Online (AJOL)

    Small and large scale genomic DNA isolation protocol for chickpea ( Cicer arietinum L.), suitable for molecular marker and transgenic analyses. ... Chickpea is an important food legume crop with high nutritional value. Lack of appropriate DNA isolation protocol is a limiting factor for any molecular studies of this crop.

  20. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  1. The interaction of large scale and mesoscale environment leading to ...

    Indian Academy of Sciences (India)

    Journal of Earth System Science; Volume 118; Issue 5. The interaction of large scale and mesoscale environment leading to formation of intense thunderstorms over Kolkata. Part I: Doppler radar and satellite observations. P Mukhopadhyay M Mahakur H A K Singh. Volume 118 Issue 5 October 2009 pp ...

  2. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    The objective of this project is to test whether the Food and Agriculture Organization's Voluntary Guidelines on the Responsible Governance of Tenure of Land, Fisheries and Forests in the Context of National Food Security can help increase accountability for large-scale land acquisitions in Mali, Nigeria, Uganda, and South ...

  3. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD...

  4. A Large-Scale Earth and Ocean Phenomenon

    Indian Academy of Sciences (India)

    Resonance – Journal of Science Education; Volume 10; Issue 2. Tsunamis - A Large-Scale Earth and Ocean Phenomenon. Satish R Shetye. General Article Volume 10 Issue 2 February 2005 pp 8-19.

  5. Large-Scale Innovation and Change in UK Higher Education

    Science.gov (United States)

    Brown, Stephen

    2013-01-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ…

  6. Factors Influencing Uptake of a Large Scale Curriculum Innovation.

    Science.gov (United States)

    Adey, Philip S.

    Educational research has all too often failed to be implemented on a large-scale basis. This paper describes the multiplier effect of a professional development program for teachers and for trainers in the United Kingdom, and how that program was developed, monitored, and evaluated. Cognitive Acceleration through Science Education (CASE) is a…

  7. Symmetry in stochasticity: Random walk models of large-scale ...

    Indian Academy of Sciences (India)

    This paper describes the insights gained from the excursion set approach, in which various questions about the phenomenology of large-scale structure formation can be mapped to problems associated with the first crossing distribution of appropriately defined barriers by random walks. Much of this is summarized in R K ...

  8. Solving large scale crew scheduling problems by using iterative partitioning

    NARCIS (Netherlands)

    E.J.W. Abbink (Erwin)

    2008-01-01

    This paper deals with large-scale crew scheduling problems arising at the Dutch railway operator, Netherlands Railways (NS). NS operates about 30,000 trains a week. All these trains need a driver and a certain number of conductors. No available crew scheduling algorithm can solve such

  9. Solving Large Scale Crew Scheduling Problems in Practice

    NARCIS (Netherlands)

    E.J.W. Abbink (Erwin); L. Albino; T.A.B. Dollevoet (Twan); D. Huisman (Dennis); J. Roussado; R.L. Saldanha

    2010-01-01

    This paper deals with large-scale crew scheduling problems arising at the Dutch railway operator, Netherlands Railways (NS). NS operates about 30,000 trains a week. All these trains need a driver and a certain number of guards. Some labor rules restrict the duties of a certain crew base

  10. The Large Scale Magnetic Field and Sunspot Cycles

    Indian Academy of Sciences (India)

    J. Astrophys. Astr. (2000) 21, 161-162. The Large Scale Magnetic Field and Sunspot Cycles. V. I. Makarov* & A. G. Tlatov, Kislovodsk Solar Station of the Pulkovo Observatory, Kislovodsk 357700, P.O. Box 145, Russia. *e-mail: makarov@gao.spb.ru. Key words: Sun: magnetic field—sunspots—solar cycle. Extended abstract.

  11. Large-Scale Networked Virtual Environments: Architecture and Applications

    Science.gov (United States)

    Lamotte, Wim; Quax, Peter; Flerackers, Eddy

    2008-01-01

    Purpose: Scalability is an important research topic in the context of networked virtual environments (NVEs). This paper aims to describe the ALVIC (Architecture for Large-scale Virtual Interactive Communities) approach to NVE scalability. Design/methodology/approach: The setup and results from two case studies are shown: a 3-D learning environment…

  12. Information Tailoring Enhancements for Large-Scale Social Data

    Science.gov (United States)

    2016-09-26

    …improved usability and navigation, (iii) improved the computational framework of Scraawl, (iv) enhanced Named Entity Recognition (NER), and (v) … Keywords: information tailoring, large-scale analysis, OSINT.

  13. Large-Scale Assessments and Educational Policies in Italy

    Science.gov (United States)

    Damiani, Valeria

    2016-01-01

    Despite Italy's extensive participation in most large-scale assessments, their actual influence on Italian educational policies is less easy to identify. The present contribution aims at highlighting and explaining reasons for the weak and often inconsistent relationship between international surveys and policy-making processes in Italy.…

  14. Large Scale Magnetic Fields: Density Power Spectrum in Redshift ...

    Indian Academy of Sciences (India)

    2016-01-27

    Our analysis shows that if these magnetic fields originated in the early universe then it is possible to construct models for which the shape of the power spectrum agrees with the large scale slope of the observed power spectrum. However requiring compatibility with observed CMBR anisotropies, the ...

  15. Fractals and the Large-Scale Structure in the Universe

    Indian Academy of Sciences (India)

    …on lien from Harish-Chandra Research Institute, Allahabad. Areas of his interest include cosmic microwave background radiation, large scale structures in the Universe and application of fractals in these. A K Mittal and T R Seshadri. During the last decade it has been argued by some investigators that the distribution of galaxies…

  16. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena

  17. Proceedings of the meeting on large scale computer simulation research

    International Nuclear Information System (INIS)

    2004-04-01

    The meeting to summarize the collaboration activities for FY2003 on the Large Scale Computer Simulation Research was held January 15-16, 2004 at Theory and Computer Simulation Research Center, National Institute for Fusion Science. Recent simulation results, methodologies and other related topics were presented. (author)

  18. Large scale synthesis and characterization of Ni nanoparticles by ...

    Indian Academy of Sciences (India)

    Bulletin of Materials Science; Volume 31; Issue 1. Large scale synthesis and characterization of Ni nanoparticles by solution reduction method. Huazhi Wang Xinli Kou Jie Zhang Jiangong Li. Nanomaterials Volume 31 Issue 1 February 2008 pp 97-100 ...

  19. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Vol. 44, No. 3 (2015), pp. 247-253. ISSN 1392-124X. Institutional support: RVO:67985556. Keywords: combinatorial linear matrix inequalities; large-scale system; decentralized control. Subject RIV: BC - Control Systems Theory. Impact factor: 0.633, year: 2015

  20. Firebrands and spotting ignition in large-scale fires

    Science.gov (United States)

    Eunmo Koo; Patrick J. Pagni; David R. Weise; John P. Woycheese

    2010-01-01

    Spotting ignition by lofted firebrands is a significant mechanism of fire spread, as observed in many large-scale fires. The role of firebrands in fire propagation and the important parameters involved in spot fire development are studied. Historical large-scale fires, including wind-driven urban and wildland conflagrations and post-earthquake fires, are given as...

  1. Scale to Measure Attitudes toward Information Technology

    Science.gov (United States)

    Gokhale, Anu A.; Paul E. Brauchle; Kenton F. Machina

    2013-01-01

    The current post-secondary graduation rates in computing disciplines suggest American universities are only training enough students to fill one third of the projected 1.4 million technology and computing jobs available (National Center for Women and Information Technology, 2011). Pursuit of information technology (IT) majors depends, to a great…

  2. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  3. The Large-scale Effect of Environment on Galactic Conformity

    Science.gov (United States)

    Sun, Shuangpeng; Guo, Qi; Wang, Lan; Wang, Jie; Gao, Liang; Lacey, Cedric G.; Pan, Jun

    2018-04-01

    We use a volume-limited galaxy sample from the SDSS Data Release 7 to explore the dependence of galactic conformity on the large-scale environment, measured on ∼4 Mpc scales. We find that the star formation activity of neighbour galaxies depends more strongly on the environment than on the activity of their primary galaxies. In under-dense regions most neighbour galaxies tend to be active, while in over-dense regions neighbour galaxies are mostly passive, regardless of the activity of their primary galaxies. At a given stellar mass, passive primary galaxies reside in higher density regions than active primary galaxies, leading to the apparently strong conformity signal. The dependence of the activity of neighbour galaxies on environment can be explained by the corresponding dependence of the fraction of satellite galaxies. Similar results are found for galaxies in a semi-analytical model, suggesting that no new physics is required to explain the observed large-scale conformity.

  4. Reliability Evaluation considering Structures of a Large Scale Wind Farm

    DEFF Research Database (Denmark)

    Shin, Je-Seok; Cha, Seung-Tae; Wu, Qiuwei

    2012-01-01

    Wind energy is one of the most widely used renewable energy resources. Wind power is connected to the grid through large-scale wind farms made up of dozens of wind turbines, and the scale of wind farms has recently increased further. Due to the intermittent and variable nature of the wind source, reliability evaluation of wind farms is required. Also, because a large-scale offshore wind farm has long repair times and high repair costs as well as a high investment cost, it is essential to take the economic aspect into account. One way to efficiently build and operate a wind farm is to construct a wind farm that is able to enhance its capability of delivering power instead of merely controlling an uncontrollable wind power output. Therefore, this paper introduces a method to evaluate reliability depending upon the structure of the wind farm and to reflect the result in the planning stage of the wind farm.

  5. Variability in large-scale wind power generation: Variability in large-scale wind power generation

    Energy Technology Data Exchange (ETDEWEB)

    Kiviluoma, Juha [VTT Technical Research Centre of Finland, Espoo Finland; Holttinen, Hannele [VTT Technical Research Centre of Finland, Espoo Finland; Weir, David [Energy Department, Norwegian Water Resources and Energy Directorate, Oslo Norway; Scharff, Richard [KTH Royal Institute of Technology, Electric Power Systems, Stockholm Sweden; Söder, Lennart [Royal Institute of Technology, Electric Power Systems, Stockholm Sweden; Menemenlis, Nickie [Institut de recherche Hydro-Québec, Montreal Canada; Cutululis, Nicolaos A. [DTU, Wind Energy, Roskilde Denmark; Danti Lopez, Irene [Electricity Research Centre, University College Dublin, Dublin Ireland; Lannoye, Eamonn [Electric Power Research Institute, Palo Alto California USA; Estanqueiro, Ana [LNEG, Laboratorio Nacional de Energia e Geologia, UESEO, Lisbon Spain; Gomez-Lazaro, Emilio [Renewable Energy Research Institute and DIEEAC/EDII-AB, Castilla-La Mancha University, Albacete Spain; Zhang, Qin [State Grid Corporation of China, Beijing China; Bai, Jianhua [State Grid Energy Research Institute Beijing, Beijing China; Wan, Yih-Huei [National Renewable Energy Laboratory, Transmission and Grid Integration Group, Golden Colorado USA; Milligan, Michael [National Renewable Energy Laboratory, Transmission and Grid Integration Group, Golden Colorado USA

    2015-10-25

    The paper demonstrates the characteristics of wind power variability and net load variability in multiple power systems based on real data from multiple years. Demonstrated characteristics include probability distribution for different ramp durations, seasonal and diurnal variability and low net load events. The comparison shows regions with low variability (Sweden, Spain and Germany), medium variability (Portugal, Ireland, Finland and Denmark) and regions with higher variability (Quebec, Bonneville Power Administration and Electric Reliability Council of Texas in North America; Gansu, Jilin and Liaoning in China; and Norway and offshore wind power in Denmark). For regions with low variability, the maximum 1 h wind ramps are below 10% of nominal capacity, and for regions with high variability, they may be close to 30%. Wind power variability is mainly explained by the extent of geographical spread, but also higher capacity factor causes higher variability. It was also shown how wind power ramps are autocorrelated and dependent on the operating output level. When wind power was concentrated in smaller area, there were outliers with high changes in wind output, which were not present in large areas with well-dispersed wind power.
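    As a small, hedged illustration of the ramp statistics discussed above (synthetic data standing in for a real generation series; this is not the paper's code or data), hourly ramps can be expressed as a percentage of nominal capacity and their distribution inspected:

        import numpy as np

        def ramp_distribution(power_mw, capacity_mw, ramp_hours=1):
            """Return ramps over `ramp_hours` as a percentage of nominal capacity."""
            p = np.asarray(power_mw, dtype=float)
            ramps = p[ramp_hours:] - p[:-ramp_hours]
            return 100.0 * ramps / capacity_mw

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            capacity = 1000.0
            # Synthetic stand-in for an hourly aggregated wind-power series.
            hours = np.arange(24 * 365)
            power = capacity * np.clip(
                0.35 + 0.2 * np.sin(2 * np.pi * hours / 24)
                + 0.1 * rng.standard_normal(hours.size), 0.0, 1.0)
            r1 = ramp_distribution(power, capacity, ramp_hours=1)
            print("max |1 h ramp| as % of capacity:", np.abs(r1).max())
            print("99th percentile:", np.percentile(np.abs(r1), 99))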

  6. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s^-1 (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  7. Random access in large-scale DNA data storage.

    Science.gov (United States)

    Organick, Lee; Ang, Siena Dumas; Chen, Yuan-Jyue; Lopez, Randolph; Yekhanin, Sergey; Makarychev, Konstantin; Racz, Miklos Z; Kamath, Govinda; Gopalan, Parikshit; Nguyen, Bichlien; Takahashi, Christopher N; Newman, Sharon; Parker, Hsing-Yeh; Rashtchian, Cyrus; Stewart, Kendall; Gupta, Gagan; Carlson, Robert; Mulligan, John; Carmean, Douglas; Seelig, Georg; Ceze, Luis; Strauss, Karin

    2018-03-01

    Synthetic DNA is durable and can encode digital data with high density, making it an attractive medium for data storage. However, recovering stored data on a large scale currently requires all the DNA in a pool to be sequenced, even if only a subset of the information needs to be extracted. Here, we encode and store 35 distinct files (over 200 MB of data), in more than 13 million DNA oligonucleotides, and show that we can recover each file individually and with no errors, using a random access approach. We design and validate a large library of primers that enable individual recovery of all files stored within the DNA. We also develop an algorithm that greatly reduces the sequencing read coverage required for error-free decoding by maximizing information from all sequence reads. These advances demonstrate a viable, large-scale system for DNA data storage and retrieval.
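    A toy sketch of the random-access idea (not the authors' pipeline; the primer sequences, sizes and 2-bit encoding below are invented, and real systems add error correction and biochemical constraints that are not modelled here): each file gets its own primer pair, every oligo carries those primers plus an index, and retrieving one file means selecting only the strands that carry its primers, much as PCR amplification would.

        # Toy model of primer-addressed random access in a DNA data pool.
        PRIMERS = {  # hypothetical file_id -> (forward primer, reverse primer)
            "photo.jpg": ("ACGTAC", "TTGCAA"),
            "notes.txt": ("GGATCC", "CCTAGG"),
        }

        def to_bases(data: bytes) -> str:
            """Map each pair of bits to a base (00->A, 01->C, 10->G, 11->T)."""
            lookup = "ACGT"
            bases = []
            for byte in data:
                for shift in (6, 4, 2, 0):
                    bases.append(lookup[(byte >> shift) & 0b11])
            return "".join(bases)

        def encode_file(file_id: str, data: bytes, payload_bases: int = 16):
            fwd, rev = PRIMERS[file_id]
            body = to_bases(data)
            chunks = [body[i:i + payload_bases] for i in range(0, len(body), payload_bases)]
            # Each oligo: forward primer + index (as bases) + payload + reverse primer.
            return [fwd + to_bases(idx.to_bytes(2, "big")) + chunk + rev
                    for idx, chunk in enumerate(chunks)]

        def random_access(pool, file_id: str):
            """Select only strands carrying this file's primers (PCR-like selection)."""
            fwd, rev = PRIMERS[file_id]
            return [s for s in pool if s.startswith(fwd) and s.endswith(rev)]

        if __name__ == "__main__":
            pool = encode_file("photo.jpg", b"cat picture") + encode_file("notes.txt", b"hello")
            print(len(random_access(pool, "notes.txt")), "strands selected for notes.txt")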

  8. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    Science.gov (United States)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

    Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over the redshift interval 0.4 < z < 1.7. Destiny will be used in its third year as a high resolution, wide-field imager to conduct a weak lensing survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  9. Solar Trees: First Large-Scale Demonstration of Fully Solution Coated, Semitransparent, Flexible Organic Photovoltaic Modules.

    Science.gov (United States)

    Berny, Stephane; Blouin, Nicolas; Distler, Andreas; Egelhaaf, Hans-Joachim; Krompiec, Michal; Lohr, Andreas; Lozman, Owen R; Morse, Graham E; Nanson, Lana; Pron, Agnieszka; Sauermann, Tobias; Seidler, Nico; Tierney, Steve; Tiwana, Priti; Wagner, Michael; Wilson, Henry

    2016-05-01

    The technology behind a large area array of flexible solar cells with a unique design and semitransparent blue appearance is presented. These modules are implemented in a solar tree installation at the German pavilion in the EXPO2015 in Milan/IT. The modules show power conversion efficiencies of 4.5% and are produced exclusively using standard printing techniques for large-scale production.

  10. Large-scale ocean connectivity and planktonic body size

    KAUST Repository

    Villarino, Ernesto

    2018-01-04

    Global patterns of planktonic diversity are mainly determined by the dispersal of propagules with ocean currents. However, the role that abundance and body size play in determining spatial patterns of diversity remains unclear. Here we analyse spatial community structure - β-diversity - for several planktonic and nektonic organisms from prokaryotes to small mesopelagic fishes collected during the Malaspina 2010 Expedition. β-diversity was compared to surface ocean transit times derived from a global circulation model, revealing a significant negative relationship that is stronger than environmental differences. Estimated dispersal scales for different groups show a negative correlation with body size, where less abundant large-bodied communities have significantly shorter dispersal scales and larger species spatial turnover rates than more abundant small-bodied plankton. Our results confirm that the dispersal scale of planktonic and micro-nektonic organisms is determined by local abundance, which scales with body size, ultimately setting global spatial patterns of diversity.

  11. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made

  12. Entirely renewable energy-based electricity supply system (small scale and large scale)

    Energy Technology Data Exchange (ETDEWEB)

    Zahedi, A. [Monash University, Caulfield (Australia). Division of Electrical and Computer Systems Engineering

    1996-09-01

    Our future energy needs will be supplied by a combination of many different sources, ranging from small wind turbines providing power for a single house to central power stations feeding power into the national grid on a very large scale. Computer control systems will integrate the performance of all these systems to make sure that as much power as possible comes from environmentally friendlier sources. As alternative sources become more widely available, small-scale systems meeting local needs may start to replace current large-scale central power stations. The author is investigating the feasibility of an entirely renewable energy-based electricity supply system. The developed system has many applications: it can be used as a small-scale power system for Remote Area Power Supply (wind energy/battery or solar energy/battery), as well as at large scale for interconnection with the national grid. (Author)

  13. Large-Scale Power Production Potential on U.S. Department of Energy Lands

    Energy Technology Data Exchange (ETDEWEB)

    Kandt, Alicen J. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Elgqvist, Emma M. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Gagne, Douglas A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Hillesheim, Michael B. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); Walker, H. A. [National Renewable Energy Laboratory (NREL), Golden, CO (United States); King, Jeff [Colorado School of Mines, Golden, CO (United States); Boak, Jeremy [Colorado School of Mines, Golden, CO (United States); Washington, Jeremy [Colorado School of Mines, Golden, CO (United States); Sharp, Cory [Colorado School of Mines, Golden, CO (United States)

    2017-11-03

    This report summarizes the potential for independent power producers to generate large-scale power on U.S. Department of Energy (DOE) lands and export that power into a larger power market, rather than serving on-site DOE loads. The report focuses primarily on the analysis of renewable energy (RE) technologies that are commercially viable at utility scale, including photovoltaics (PV), concentrating solar power (CSP), wind, biomass, landfill gas (LFG), waste to energy (WTE), and geothermal technologies. The report also summarizes the availability of fossil fuel, uranium, or thorium resources at 55 DOE sites.

  14. Selection of the surface water treatment technology - a full-scale technological investigation.

    Science.gov (United States)

    Pruss, Alina

    2015-01-01

    A technological investigation was carried out over a period of 2 years to evaluate surface water treatment technology. The study was performed in Poland, in three stages. From November 2011 to July 2012, for the first stage, flow tests with a capacity of 0.1-1.5 m³/h were performed simultaneously in three types of technical installations differing by coagulation modules. The outcome of the first stage was the choice of the technology for further investigation. The second stage was performed between September 2012 and March 2013 on a full-scale water treatment plant. Three large technical installations, operated in parallel, were analysed: coagulation with sludge flotation, micro-sand ballasted coagulation with sedimentation, coagulation with sedimentation and sludge recirculation. The capacity of the installations ranged from 10 to 40 m³/h. The third stage was also performed in a full-scale water treatment plant and was aimed at optimising the selected technology. This article presents the results of the second stage of the full-scale investigation. The critical treatment process, for the analysed water, was the coagulation in an acidic environment (6.5 < pH < 7.0) carried out in a system with rapid mixing, a flocculation chamber, preliminary separation of coagulation products, and removal of residual suspended solids through filtration.

  15. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutron sensor during three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given. The results obtained so far in that domain are presented. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a linear fluctuation channel with an automatic commutation of scales is described and the results of the tests are given. In this large scale channel, the method of data processing is analogue. - To become independent of the problems generated by the use of analogue processing of the fluctuation signal, a digital method of data processing is tested and its validity assessed. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr

  16. Primordial quantum nonequilibrium and large-scale cosmic anomalies

    Science.gov (United States)

    Colin, Samuel; Valentini, Antony

    2015-08-01

    We study incomplete relaxation to quantum equilibrium at long wavelengths, during a preinflationary phase, as a possible explanation for the reported large-scale anomalies in the cosmic microwave background. Our scenario makes use of the de Broglie-Bohm pilot-wave formulation of quantum theory, in which the Born probability rule has a dynamical origin. The large-scale power deficit could arise from incomplete relaxation for the amplitudes of the primordial perturbations. We show, by numerical simulations for a spectator scalar field, that if the preinflationary era is radiation dominated then the deficit in the emerging power spectrum will have a characteristic shape (an inverse-tangent dependence on wave number k , with oscillations). It is found that our scenario is able to produce a power deficit in the observed region and of the observed (approximate) magnitude for an appropriate choice of cosmological parameters. We also discuss the large-scale anisotropy, which might arise from incomplete relaxation for the phases of the primordial perturbations. We present numerical simulations for phase relaxation, and we show how to define characteristic scales for amplitude and phase nonequilibrium. The extent to which the data might support our scenario is left as a question for future work. Our results suggest that we have a potentially viable model that might explain two apparently independent cosmic anomalies by means of a single mechanism.

  17. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
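    A minimal sketch of the kind of user-weighted, multi-criteria scoring such a siting tool performs (the actual criteria, data layers and optimization algorithm of the NREL tool are not reproduced here; all rasters, weights and the exclusion mask below are invented for illustration):

        import numpy as np

        def suitability(criteria, weights, exclusions=None):
            """Weighted overlay of normalized criterion rasters (higher = more suitable)."""
            weights = np.asarray(weights, dtype=float)
            weights = weights / weights.sum()
            score = sum(w * c for w, c in zip(weights, criteria))
            if exclusions is not None:
                score = np.where(exclusions, np.nan, score)   # mask protected/unusable cells
            return score

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            shape = (100, 100)
            solar_resource = rng.random(shape)        # e.g. normalized solar resource
            grid_proximity = rng.random(shape)        # e.g. normalized distance-to-line score
            terrain = rng.random(shape)               # e.g. normalized slope suitability
            protected = rng.random(shape) < 0.1       # hypothetical exclusion mask
            score = suitability([solar_resource, grid_proximity, terrain],
                                weights=[0.5, 0.3, 0.2], exclusions=protected)
            best = np.unravel_index(np.nanargmax(score), shape)
            print("best cell:", best, "score:", score[best])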

  18. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    The concept of “large scale” obviously depends on the phenomenon we are interested in. For example, in the field of the foundation of Thermodynamics from microscopic dynamics, the relevant spatial and temporal large scales are of the order of fractions of millimetres and of microseconds, respectively, or less, and are defined relative to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics problems the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain a class of universal smooth “large scale” dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very involved calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in Foundation of Thermodynamics problems. However, in geophysical Fluid Dynamics, Biology, and most physical problems, the building-block fundamental equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).
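    For orientation, the projection machinery referred to here reduces, in its standard textbook form, to the Nakajima-Zwanzig equation for the projected density ρ with projector P (Q = 1 - P) and generator L; the paper's contribution is extending this kind of calculation to non-Hamiltonian (dissipative) generators, which is not reproduced here:

        % Standard Nakajima-Zwanzig form (schematic), not the paper's generalized result.
        \begin{equation}
          \frac{\partial}{\partial t}\,P\rho(t)
          = P\mathcal{L}P\rho(t)
          + \int_0^{t}\! d\tau\; P\mathcal{L}\,e^{Q\mathcal{L}\tau}\,Q\mathcal{L}\,P\rho(t-\tau)
          + P\mathcal{L}\,e^{Q\mathcal{L}t}\,Q\rho(0).
        \end{equation}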

  19. Solving large scale structure in ten easy steps with COLA

    International Nuclear Information System (INIS)

    Tassev, Svetlin; Zaldarriaga, Matias; Eisenstein, Daniel J.

    2013-01-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed
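    The core idea can be written schematically (the notation is illustrative and simplified, not the paper's exact equations; time variables and growth-factor scalings are omitted): the trajectory is split into an analytic LPT piece plus a residual, and the N-body integrator only has to evolve the residual, so large-scale dynamics stay accurate even with very few timesteps.

        % Schematic COLA split.
        \begin{align}
          \mathbf{x}(\tau) &= \mathbf{x}_{\rm LPT}(\tau) + \delta\mathbf{x}(\tau), \\
          \partial_\tau^2\,\delta\mathbf{x} &= -\nabla\Phi(\mathbf{x}) - \partial_\tau^2\,\mathbf{x}_{\rm LPT}.
        \end{align}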

  20. Robust regression for large-scale neuroimaging studies.

    Science.gov (United States)

    Fritsch, Virgile; Da Mota, Benoit; Loth, Eva; Varoquaux, Gaël; Banaschewski, Tobias; Barker, Gareth J; Bokde, Arun L W; Brühl, Rüdiger; Butzek, Brigitte; Conrod, Patricia; Flor, Herta; Garavan, Hugh; Lemaitre, Hervé; Mann, Karl; Nees, Frauke; Paus, Tomas; Schad, Daniel J; Schümann, Gunter; Frouin, Vincent; Poline, Jean-Baptiste; Thirion, Bertrand

    2015-05-01

    Multi-subject datasets used in neuroimaging group studies have a complex structure, as they exhibit non-stationary statistical properties across regions and display various artifacts. While studies with small sample sizes can rarely be shown to deviate from standard hypotheses (such as the normality of the residuals) due to the poor sensitivity of normality tests with low degrees of freedom, large-scale studies (e.g. >100 subjects) exhibit more obvious deviations from these hypotheses and call for more refined models for statistical inference. Here, we demonstrate the benefits of robust regression as a tool for analyzing large neuroimaging cohorts. First, we use an analytic test based on robust parameter estimates; based on simulations, this procedure is shown to provide an accurate statistical control without resorting to permutations. Second, we show that robust regression yields more detections than standard algorithms using as an example an imaging genetics study with 392 subjects. Third, we show that robust regression can avoid false positives in a large-scale analysis of brain-behavior relationships with over 1500 subjects. Finally we embed robust regression in the Randomized Parcellation Based Inference (RPBI) method and demonstrate that this combination further improves the sensitivity of tests carried out across the whole brain. Altogether, our results show that robust procedures provide important advantages in large-scale neuroimaging group studies. Copyright © 2015 Elsevier Inc. All rights reserved.
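    As a hedged illustration of what robust regression means operationally (a generic Huber M-estimator fit by iteratively reweighted least squares; this is not the pipeline, software or settings used in the study):

        import numpy as np

        def huber_irls(X, y, delta=1.345, iters=50):
            """Huber M-estimator via iteratively reweighted least squares."""
            X = np.column_stack([np.ones(len(y)), X])     # add intercept
            beta = np.linalg.lstsq(X, y, rcond=None)[0]   # ordinary least-squares start
            for _ in range(iters):
                r = y - X @ beta
                scale = 1.4826 * np.median(np.abs(r - np.median(r))) + 1e-12  # robust scale (MAD)
                u = r / scale
                w = np.where(np.abs(u) <= delta, 1.0, delta / np.abs(u))      # Huber weights
                W = np.sqrt(w)
                beta = np.linalg.lstsq(X * W[:, None], y * W, rcond=None)[0]  # weighted LS
            return beta

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            x = rng.normal(size=(200, 1))
            y = 2.0 + 3.0 * x[:, 0] + rng.normal(scale=0.5, size=200)
            y[:10] += 20.0                                # gross outliers
            print("robust fit:", huber_irls(x, y))        # close to [2, 3] despite outliers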

  1. Less is more: regularization perspectives on large scale machine learning

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Deep learning based techniques provide a possible solution at the expense of theoretical guidance and, especially, of computational requirements. It is then a key challenge for large scale machine learning to devise approaches guaranteed to be accurate and yet computationally efficient. In this talk, we will consider a regularization perspective on machine learning appealing to classical ideas in linear algebra and inverse problems to scale-up dramatically nonparametric methods such as kernel methods, often dismissed because of prohibitive costs. Our analysis derives optimal theoretical guarantees while providing experimental results at par or out-performing state of the art approaches.
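    One standard trick in this spirit, sketched here as an assumption about the kind of method meant rather than the talk's specific algorithm, is to approximate a Gaussian kernel with random Fourier features so that kernel ridge regression scales to large sample sizes:

        import numpy as np

        def random_fourier_features(X, n_features=200, gamma=1.0, seed=0):
            """Approximate the kernel k(x,z)=exp(-gamma*||x-z||^2) with random features."""
            rng = np.random.default_rng(seed)
            d = X.shape[1]
            W = rng.normal(scale=np.sqrt(2.0 * gamma), size=(d, n_features))
            b = rng.uniform(0.0, 2.0 * np.pi, size=n_features)
            return np.sqrt(2.0 / n_features) * np.cos(X @ W + b)

        def ridge_fit(Phi, y, lam=1e-2):
            """Ridge regression in the (small) feature space instead of the full kernel matrix."""
            n_feat = Phi.shape[1]
            return np.linalg.solve(Phi.T @ Phi + lam * np.eye(n_feat), Phi.T @ y)

        if __name__ == "__main__":
            rng = np.random.default_rng(1)
            X = rng.uniform(-3, 3, size=(2000, 1))
            y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=2000)
            Phi = random_fourier_features(X, n_features=300, gamma=0.5)
            w = ridge_fit(Phi, y)
            X_test = np.linspace(-3, 3, 5).reshape(-1, 1)
            print(random_fourier_features(X_test, 300, 0.5) @ w)   # approx. sin(X_test)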

  2. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
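    To make the "think like a vertex" model concrete, here is a tiny Pregel-style PageRank superstep loop in plain Python; it mirrors the vertex-centric model that Giraph implements but deliberately does not use Giraph's Java API (the function names and the toy graph are not Giraph's):

        # Minimal Pregel-style PageRank: vertices exchange messages between supersteps.
        def pagerank(adj, supersteps=30, damping=0.85):
            n = len(adj)
            value = {v: 1.0 / n for v in adj}
            messages = {v: [] for v in adj}
            for step in range(supersteps):
                outbox = {v: [] for v in adj}
                for v in adj:                               # "compute" called per vertex
                    if step > 0:                            # from superstep 1 on, absorb messages
                        value[v] = (1 - damping) / n + damping * sum(messages[v])
                    if adj[v]:
                        share = value[v] / len(adj[v])
                        for u in adj[v]:
                            outbox[u].append(share)         # send message along out-edge
                messages = outbox                           # synchronization barrier
            return value

        if __name__ == "__main__":
            graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}
            print(pagerank(graph))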

  3. Large-scale building integrated photovoltaics field trial. First technical report - installation phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of the first eighteen months of the Large-Scale Building Integrated Photovoltaic Field Trial focussing on technical aspects. The project aims included increasing awareness and application of the technology, raising the UK capabilities in application of the technology, and assessing the potential for building integrated photovoltaics (BIPV). Details are given of technology choices; project organisation, cost, and status; and the evaluation criteria. Installations of BIPV described include University buildings, commercial centres, and a sports stadium, wildlife park, church hall, and district council building. Lessons learnt are discussed, and a further report covering monitoring aspects is planned.

  4. Dimensionality reduction for large-scale neural recordings

    OpenAIRE

    Cunningham, John P; Yu, Byron M

    2014-01-01

    Most sensory, cognitive and motor functions depend on the interactions of many neurons. In recent years, there has been rapid development and increasing use of technologies for recording from large numbers of neurons, either sequentially or simultaneously. A key question is what scientific insight can be gained by studying a population of recorded neurons beyond studying each neuron individually. Here, we examine three important motivations for population studies: single-trial hypotheses requ...

  5. Design techniques for large scale linear measurement systems

    International Nuclear Information System (INIS)

    Candy, J.V.

    1979-03-01

    Techniques to design measurement schemes for systems modeled by large scale linear time invariant systems, i.e., physical systems modeled by a large number (> 5) of ordinary differential equations, are described. The techniques are based on transforming the physical system model to a coordinate system facilitating the design and then transforming back to the original coordinates. An example of a three-stage, four-species, extraction column used in the reprocessing of spent nuclear fuel elements is presented. The basic ideas are briefly discussed in the case of noisy measurements. An example using a plutonium nitrate storage vessel (reprocessing) with measurement uncertainty is also presented

  6. Segmentation by Large Scale Hypothesis Testing - Segmentation as Outlier Detection

    DEFF Research Database (Denmark)

    Darkner, Sune; Dahl, Anders Lindbjerg; Larsen, Rasmus

    2010-01-01

    We propose a novel and efficient way of performing local image segmentation. For many applications a threshold of pixel intensities is sufficient, but determining the appropriate threshold value can be difficult. In cases with large global intensity variation the threshold value has to be adapted...... locally. We propose a method based on large scale hypothesis testing with a consistent method for selecting an appropriate threshold for the given data. By estimating the background distribution we characterize the segment of interest as a set of outliers with a certain probability based on the estimated......
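
    A toy version of the idea, assuming NumPy and SciPy are available: estimate a background distribution from the image, convert each pixel intensity to a p-value, and keep as the "segment" those pixels that survive a large-scale (here Bonferroni-style) multiple-testing threshold. The paper's threshold-selection rule is not reproduced; the image and numbers are invented.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        image = rng.normal(100.0, 5.0, size=(256, 256))      # background intensities
        image[100:120, 100:140] += 40.0                      # bright segment of interest

        # Crude background model estimated from the image itself.
        mu, sigma = np.median(image), image.std()
        p_values = stats.norm.sf(image, loc=mu, scale=sigma)  # P(background pixel >= observed value)

        alpha = 0.01
        mask = p_values < alpha / image.size                 # one test per pixel, Bonferroni corrected
        print(int(mask.sum()), "pixels flagged as outliers (the segment)")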

  7. Geometric algorithms for electromagnetic modeling of large scale structures

    Science.gov (United States)

    Pingenot, James

    With the rapid increase in the speed and complexity of integrated circuit designs, 3D full wave and time domain simulation of chip, package, and board systems becomes more and more important for the engineering of modern designs. Much effort has been applied to the problem of electromagnetic (EM) simulation of such systems in recent years. Major advances in boundary element EM simulations have led to O(n log n) simulations using iterative methods and advanced Fast Fourier Transform (FFT), Multi-Level Fast Multi-pole Methods (MLFMM), and low-rank matrix compression techniques. These advances have been augmented with an explosion of multi-core and distributed computing technologies; however, realization of the full scale of these capabilities has been hindered by cumbersome and inefficient geometric processing. Anecdotal evidence from industry suggests that users may spend around 80% of turn-around time manipulating the geometric model and mesh. This dissertation addresses this problem by developing fast and efficient data structures and algorithms for 3D modeling of chips, packages, and boards. The methods proposed here harness the regular, layered 2D nature of the models (often referred to as "2.5D") to optimize these systems for large geometries. First, an architecture is developed for efficient storage and manipulation of 2.5D models. The architecture gives special attention to native representation of structures across various input models and special issues particular to 3D modeling. The 2.5D structure is then used to optimize the mesh systems. First, circuit/EM co-simulation techniques are extended to provide electrical connectivity between objects. This concept is used to connect independently meshed layers, allowing simple and efficient 2D mesh algorithms to be used in creating a 3D mesh. Here, adaptive meshing is used to ensure that the mesh accurately models the physical unknowns (current and charge). Utilizing the regularized nature of 2.5D objects and

  8. Large scale EMF in current sheets induced by tearing modes

    Science.gov (United States)

    Mizerski, Krzysztof A.

    2018-02-01

    An extension of the analysis of resistive instabilities of a sheet pinch from a famous work by Furth et al (1963 Phys. Fluids 6 459) is presented here, to study the mean electromotive force (EMF) generated by the developing instability. In a Cartesian configuration and in the presence of a current sheet, the boundary layer technique is first used to obtain global, matched asymptotic solutions for the velocity and magnetic field, and then the solutions are used to calculate the large-scale EMF in the system. It is reported that, in the bulk, the curl of the mean EMF is linear in j_0 · B_0, a simple pseudo-scalar quantity constructed from the large-scale quantities.

  9. Mathematical programming methods for large-scale topology optimization problems

    DEFF Research Database (Denmark)

    Rojas Labanda, Susana

    for the classical minimum compliance problem. Two of the state-of-the-art optimization algorithms are investigated and implemented for this structural topology optimization problem. A Sequential Quadratic Programming (TopSQP) and an interior point method (TopIP) are developed exploiting the specific mathematical...... structure of the problem. In both solvers, information of the exact Hessian is considered. A robust iterative method is implemented to efficiently solve large-scale linear systems. Both TopSQP and TopIP have successful results in terms of convergence, number of iterations, and objective function values....... Thanks to the use of the iterative method implemented, TopIP is able to solve large-scale problems with more than three millions degrees of freedom....

  10. Volume measurement study for large scale input accountancy tank

    International Nuclear Information System (INIS)

    Uchikoshi, Seiji; Watanabe, Yuichi; Tsujino, Takeshi

    1999-01-01

    The Large Scale Tank Calibration (LASTAC) facility, including an experimental tank which has the same volume and structure as the input accountancy tank of the Rokkasho Reprocessing Plant (RRP), was constructed at the Nuclear Material Control Center of Japan. Demonstration experiments have been carried out to evaluate the precision of solution volume measurement and to establish a procedure for highly accurate pressure measurement in a large scale tank with a dip-tube bubbler probe system, to be applied to the input accountancy tank of RRP. Solution volume in a tank is determined by substituting the measured solution level into a calibration function obtained in advance, which expresses the relation between the solution level and the volume in the tank. Therefore, precise solution volume measurement needs a precise calibration function that is determined carefully. The LASTAC calibration experiments using pure water showed good reproducibility. (J.P.N.)
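
    The level-to-volume substitution described above can be sketched directly: fit a calibration function from a calibration run, then evaluate it at a newly measured level. The numbers below are invented for illustration and have nothing to do with the actual RRP tank geometry or calibration data.

        import numpy as np

        # Calibration run: known volume increments vs. measured solution level (illustrative values).
        level_mm = np.array([120.0, 340.0, 560.0, 775.0, 990.0])
        volume_l = np.array([500.0, 1500.0, 2500.0, 3500.0, 4500.0])

        # Calibration function: volume as a low-order polynomial of level.
        calib = np.polynomial.Polynomial.fit(level_mm, volume_l, deg=2)

        # Accountancy measurement: substitute a newly measured level into the calibration function.
        measured_level = 612.0
        print("estimated volume [l]:", calib(measured_level))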

  11. Model for large scale circulation of nuclides in nature, 1

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Teruaki

    1988-12-01

    A model for the large scale circulation of nuclides was developed, and a computer code named COCAIN was written to simulate this circulation in a system-dynamics manner. The natural environment considered in the present paper consists of 2 atmospheres, 8 geospheres and 2 lithospheres. The biosphere is composed of 4 types of edible plants, 5 types of livestock and their products, 4 water biota and 16 human organs. The biosphere is assumed to receive nuclides from the natural environment mentioned above. With the use of COCAIN, two numerical case studies were carried out: one is a study of nuclear pollution in nature by the radioactive nuclides originating from past nuclear bomb tests, and the other is a study of the response of the environment and biota to a pulse injection of nuclides into one compartment. The former case study verified that this model can explain the observations well and properly simulate the large scale circulation of nuclides in nature.
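
    The compartment-model character of such codes can be illustrated with a much smaller system. The sketch below, assuming SciPy is available, integrates a hypothetical three-compartment chain with first-order transfers and radioactive decay after a pulse injection; the compartments and rate constants are invented and are not those of COCAIN.

        from scipy.integrate import solve_ivp

        # Hypothetical chain atmosphere -> soil -> biota, first-order transfer plus decay [1/yr].
        k_as, k_sb, lam = 0.5, 0.05, 0.01

        def rhs(t, n):
            atm, soil, biota = n
            return [-(k_as + lam) * atm,
                    k_as * atm - (k_sb + lam) * soil,
                    k_sb * soil - lam * biota]

        # Pulse injection of one unit of a nuclide into the atmosphere at t = 0.
        sol = solve_ivp(rhs, (0.0, 100.0), [1.0, 0.0, 0.0])
        print("compartment inventories after 100 yr:", sol.y[:, -1])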

  12. Performance of Grey Wolf Optimizer on large scale problems

    Science.gov (United States)

    Gupta, Shubham; Deep, Kusum

    2017-01-01

    Numerous nature-inspired optimization techniques have been proposed in the literature for solving nonlinear continuous optimization problems, and they can be applied to real-life problems where conventional techniques cannot be used. The Grey Wolf Optimizer is one such technique and has been gaining popularity over the last two years. The objective of this paper is to investigate the performance of the Grey Wolf Optimization Algorithm on large scale optimization problems. The algorithm is tested on 5 common scalable problems from the literature, namely the Sphere, Rosenbrock, Rastrigin, Ackley and Griewank functions. The dimensions of these problems are varied from 50 to 1000. The results indicate that the Grey Wolf Optimizer is a powerful nature-inspired optimization algorithm for large scale problems, except on Rosenbrock, which is a unimodal function.
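
    A compact NumPy sketch of the Grey Wolf Optimizer as commonly described (positions updated toward the three best wolves, with a control parameter decreasing from 2 to 0), run here on the Sphere function. Parameter settings are illustrative and details may differ from the paper's implementation.

        import numpy as np

        def gwo(f, dim=50, n_wolves=30, iters=500, lb=-100.0, ub=100.0, seed=0):
            rng = np.random.default_rng(seed)
            X = rng.uniform(lb, ub, size=(n_wolves, dim))
            best_pos, best_val = None, np.inf
            for t in range(iters):
                fitness = np.array([f(x) for x in X])
                order = np.argsort(fitness)
                alpha, beta, delta = X[order[0]], X[order[1]], X[order[2]]
                if fitness[order[0]] < best_val:
                    best_pos, best_val = alpha.copy(), fitness[order[0]]
                a = 2.0 * (1.0 - t / iters)                  # control parameter: 2 -> 0
                new_X = np.zeros_like(X)
                for leader in (alpha, beta, delta):
                    r1 = rng.random((n_wolves, dim))
                    r2 = rng.random((n_wolves, dim))
                    A = 2.0 * a * r1 - a
                    C = 2.0 * r2
                    D = np.abs(C * leader - X)               # distance to this leader
                    new_X += (leader - A * D) / 3.0          # average of the three leader-guided moves
                X = np.clip(new_X, lb, ub)
            return best_pos, best_val

        def sphere(x):
            return float(np.sum(x * x))

        pos, val = gwo(sphere, dim=50)
        print("best Sphere value found:", val)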

  13. Large-scale building energy efficiency retrofit: Concept, model and control

    International Nuclear Information System (INIS)

    Wu, Zhou; Wang, Bo; Xia, Xiaohua

    2016-01-01

    BEER (Building energy efficiency retrofit) projects are initiated in many nations and regions over the world. Existing studies of BEER focus on modeling and planning based on one building and one year period of retrofitting, which cannot be applied to certain large BEER projects with multiple buildings and multi-year retrofit. In this paper, the large-scale BEER problem is defined in a general TBT (time-building-technology) framework, which fits essential requirements of real-world projects. The large-scale BEER is newly studied in the control approach rather than the optimization approach commonly used before. Optimal control is proposed to design optimal retrofitting strategy in terms of maximal energy savings and maximal NPV (net present value). The designed strategy is dynamically changing on dimensions of time, building and technology. The TBT framework and the optimal control approach are verified in a large BEER project, and results indicate that promising performance of energy and cost savings can be achieved in the general TBT framework. - Highlights: • Energy efficiency retrofit of many buildings is studied. • A TBT (time-building-technology) framework is proposed. • The control system of the large-scale BEER is modeled. • The optimal retrofitting strategy is obtained.

  14. Investigation of the large scale regional hydrogeological situation at Ceberg

    International Nuclear Information System (INIS)

    Boghammar, A.; Grundfelt, B.; Hartley, L.

    1997-11-01

    The present study forms part of the large-scale groundwater flow studies within the SR 97 project. The site of interest is Ceberg. Within the present study two different regional-scale groundwater models have been constructed: one large regional model with an areal extent of about 300 km2 and one semi-regional model with an areal extent of about 50 km2. Different types of boundary conditions have been applied to the models: topography-driven pressures, constant infiltration rates, non-linear infiltration combined with specified-pressure boundary conditions, and transfer of groundwater pressures from the larger model to the semi-regional model. The present model has shown that:
    - Groundwater flow paths are mainly local. Large-scale groundwater flow paths are only seen below the depth of the hypothetical repository (below 500 meters) and are very slow.
    - Locations of recharge and discharge, to and from the site area, are in the close vicinity of the site.
    - The low contrast between major structures and the rock mass means that the factor having the major effect on the flow paths is the topography.
    - A model sufficiently large to incorporate the recharge and discharge areas of the local site is on the order of kilometres.
    - A uniform infiltration-rate boundary condition does not give a good representation of the groundwater movements in the model.
    - A local site model may be located to cover the site area and a few kilometres of the surrounding region. In order to incorporate all recharge and discharge areas within the site model, the model will be somewhat larger than site-scale models at other sites. This is caused by the fact that the discharge areas are divided into three distinct areas to the east, south and west of the site.
    - Boundary conditions may be supplied to the site model by means of transferring groundwater pressures obtained with the semi-regional model.

  15. System Recovery in Large-Scale Distributed Storage Systems

    OpenAIRE

    Aga, Svein

    2008-01-01

    This report aims to describe and improve a system recovery process in large-scale storage systems. Inevitably, a recovery process results in the system being loaded with internal replication of data, and will extensively utilize several storage nodes. Such internal load can be categorized and generalized into a maintenance workload class. Obviously, a storage system will have external clients which also introduce load into the system. This can be users altering their data, uploading new cont...

  16. Foundations of Large-Scale Multimedia Information Management and Retrieval

    CERN Document Server

    Chang, Edward Y

    2011-01-01

    "Foundations of Large-Scale Multimedia Information Management and Retrieval - Mathematics of Perception" covers knowledge representation and semantic analysis of multimedia data and scalability in signal extraction, data mining, and indexing. The book is divided into two parts: Part I - Knowledge Representation and Semantic Analysis focuses on the key components of mathematics of perception as it applies to data management and retrieval. These include feature selection/reduction, knowledge representation, semantic analysis, distance function formulation for measuring similarity, and

  17. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    Excerpt from the assessments of selected large-scale projects (common name: Orion). Project update: the President proposed cancellation of the Constellation Program, including the Orion ...fiscal year 2010. NASA remains poised to leverage Constellation assets to contribute to future exploration beyond low-Earth orbit. The report also assesses, among others, the Orbiting Carbon Observatory 2 (OCO-2), the Orion Crew Exploration Vehicle, the Radiation Belt Storm Probes (RBSP), and the Soil Moisture Active and Passive (SMAP) project (table-of-contents page numbers omitted).

  18. Large scale 2D spectral compressed sensing in continuous domain

    KAUST Repository

    Cai, Jian-Feng

    2017-06-20

    We consider the problem of spectral compressed sensing in continuous domain, which aims to recover a 2-dimensional spectrally sparse signal from partially observed time samples. The signal is assumed to be a superposition of s complex sinusoids. We propose a semidefinite program for the 2D signal recovery problem. Our model is able to handle large scale 2D signals of size 500 × 500, whereas traditional approaches only handle signals of size around 20 × 20.

  19. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

    Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive for collaborati...... with domestic universities or government laboratories. Policies conceiving LSRFs as “knowledge attractors” therefore should consider the complementarities between research at a LSRF and in its academic context at a regional or national level....

  20. Large-scale prediction of drug-target relationships

    DEFF Research Database (Denmark)

    Kuhn, Michael; Campillos, Mónica; González, Paula

    2008-01-01

    , but also provides a more global view on drug-target relations. Here we review recent attempts to apply large-scale computational analyses to predict novel interactions of drugs and targets from molecular and cellular features. In this context, we quantify the family-dependent probability of two proteins...... to bind the same ligand as function of their sequence similarity. We finally discuss how phenotypic data could help to expand our understanding of the complex mechanisms of drug action....

  1. Experimental simulation of microinteractions in large scale explosions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, X.; Luo, R.; Yuen, W.W.; Theofanous, T.G. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    This paper presents data and analysis of recent experiments conducted in the SIGMA-2000 facility to simulate microinteractions in large scale explosions. Specifically, the fragmentation behavior of a high temperature molten steel drop under high pressure (beyond critical) conditions is investigated. The current data demonstrate, for the first time, the effect of high pressure in suppressing the thermal effect of fragmentation under supercritical conditions. The results support the microinteractions idea, and the ESPROSE.m prediction of fragmentation rate. (author)

  2. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor has been selected as a promising concept of sodium-cooled large-scale reactor, which has a possibility to fulfill the design requirements of the F/S. In Phase 2, design improvement for further cost reduction and establishment of the plant concept has been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, which is the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics which were identified in the last fiscal year have been examined and the plant concept has been modified. Furthermore, fundamental specifications of main systems and components have been set and economy has been evaluated. In addition, as the interim evaluation of the candidate concept of the FBR fuel cycle is to be conducted, cost effectiveness and achievability for the development goal were evaluated and the data of the three large-scale reactor candidate concepts were prepared. As a result of this study, the plant concept of the sodium-cooled large-scale reactor has been constructed, which has a prospect to satisfy the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and to solve the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  3. Primordial large-scale electromagnetic fields from gravitoelectromagnetic inflation

    International Nuclear Information System (INIS)

    Membiela, Federico Agustin; Bellini, Mauricio

    2009-01-01

    We investigate the origin and evolution of primordial electric and magnetic fields in the early universe, when the expansion is governed by a cosmological constant Λ_0. Using the gravitoelectromagnetic inflationary formalism with A_0 = 0, we obtain the power spectra of large-scale magnetic fields and of the inflaton field fluctuations during inflation. A very important fact is that our formalism is naturally non-conformally invariant.

  4. Exploring the technical challenges of large-scale lifelogging

    OpenAIRE

    Gurrin, Cathal; Smeaton, Alan F.; Qiu, Zhengwei; Doherty, Aiden R.

    2013-01-01

    Ambiently and automatically maintaining a lifelog is an activity that may help individuals track their lifestyle, learning, health and productivity. In this paper we motivate and discuss the technical challenges of developing real-world lifelogging solutions, based on seven years of experience. The gathering, organisation, retrieval and presentation challenges of large-scale lifelogging are discussed and we show how this can be achieved and the benefits that may accrue.

  5. Accuracy control in ultra-large-scale electronic structure calculation

    OpenAIRE

    Hoshi, Takeo

    2007-01-01

    Numerical aspects are investigated in ultra-large-scale electronic structure calculation. The focus is on accuracy control methods in process (molecular-dynamics) calculations. Flexible control methods are proposed so as to control variational freedoms, automatically at each time step, within the framework of generalized Wannier state theory. The method is demonstrated in silicon cleavage simulation with 10^2-10^5 atoms. The idea is of general importance among process calculations and is also used...

  6. Large Scale Density Estimation of Blue and Fin Whales (LSD)

    Science.gov (United States)

    2015-09-30

    Large Scale Density Estimation of Blue and Fin Whales ...sensors, or both. The goal of this research is to develop and implement a new method for estimating blue and fin whale density that is effective over...develop and implement a density estimation methodology for quantifying blue and fin whale abundance from passive acoustic data recorded on sparse

  7. Domain nesting for multi-scale large eddy simulation

    Science.gov (United States)

    Fuka, Vladimir; Xie, Zheng-Tong

    2016-04-01

    The need to simulate city scale areas (O(10 km)) with high resolution within street canyons in certain areas of interest necessitates different grid resolutions in different parts of the simulated area. General purpose computational fluid dynamics codes typically employ unstructured refined grids while mesoscale meteorological models more often employ nesting of computational domains. ELMM is a large eddy simulation model for the atmospheric boundary layer. It employs orthogonal uniform grids and for this reason domain nesting was chosen as the approach for simulations in multiple scales. Domains are implemented as sets of MPI processes which communicate with each other as in a normal non-nested run, but also with processes from another (outer/inner) domain. It should be stressed that the duration of solution of time-steps in the outer and in the inner domain must be synchronized, so that the processes do not have to wait for the completion of their boundary conditions. This can be achieved by assigning an appropriate number of CPUs to each domain, so as to gain high efficiency. When nesting is applied for large eddy simulation, the inner domain receives inflow boundary conditions which lack turbulent motions not represented by the outer grid. ELMM remedies this by optionally adding turbulent fluctuations to the inflow using the efficient method of Xie and Castro (2008). The spatial scale of these fluctuations is in the subgrid-scale of the outer grid and their intensity will be estimated from the subgrid turbulent kinetic energy in the outer grid.

  8. Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development

    Science.gov (United States)

    Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dünner, R.; Essinger-Hileman, T.; Eimer, J.; Fluxa, P.; Gothe, D.; Halpern, M.; Harrington, K.; Hilton, G.; Hinshaw, G.; Hubmayr, J.; Iuliano, J.; Marriage, T. A.; Miller, N.; Moseley, S. H.; Mumby, G.; Petroff, M.; Reintsema, C.; Rostem, K.; U-Yen, K.; Watts, D.; Wagner, E.; Wollack, E. J.; Xu, Z.; Zeng, L.

    2016-08-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe ˜ 70 % of the sky. A variable-delay polarization modulator provides modulation of the polarization at ˜ 10 Hz to suppress the 1/ f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that spans both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.

  9. Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development

    Science.gov (United States)

    Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dunner, R.; Essinger-Hileman, T.; Eimer, J.

    2015-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe approx.70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at approx.10Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that spans both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.

  10. DEMNUni: massive neutrinos and the bispectrum of large scale structures

    Science.gov (United States)

    Ruggeri, Rossana; Castorina, Emanuele; Carbone, Carmelita; Sefusatti, Emiliano

    2018-03-01

    The main effect of massive neutrinos on the large-scale structure is a few percent suppression of matter perturbations on all scales below their free-streaming scale. Such an effect is of particular importance as it allows one to constrain the value of the sum of neutrino masses from measurements of the galaxy power spectrum. In this work, we present the first measurements of the next higher-order correlation function, the bispectrum, from N-body simulations that include massive neutrinos as particles. This is the simplest statistic characterising the non-Gaussian properties of the matter and dark matter halo distributions. We investigate, in the first place, the suppression due to massive neutrinos of the matter bispectrum, comparing our measurements with the simplest perturbation theory predictions, and find that the approximation of neutrinos contributing at quadratic order in perturbation theory provides a good fit to the measurements in the simulations. On the other hand, as expected, a linear approximation for neutrino perturbations would lead to O(f_ν) errors on the total matter bispectrum at large scales. We then attempt an extension of previous results on the universality of linear halo bias in neutrino cosmologies to non-linear and non-local corrections, finding results consistent with the power spectrum analysis.

  11. Ecohydrological modeling for large-scale environmental impact assessment.

    Science.gov (United States)

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, and regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach level accuracy similar to regional-scale models thereby allowing for impacts assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in development of adaptive neuro-fuzzy inference systems (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show that, we adopted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli being of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in human visual sensory system, revealed that visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity being present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  13. Learning Short Binary Codes for Large-scale Image Retrieval.

    Science.gov (United States)

    Liu, Li; Yu, Mengyang; Shao, Ling

    2017-03-01

    Large-scale visual information retrieval has become an active research area in this big data era. Recently, hashing/binary coding algorithms prove to be effective for scalable retrieval applications. Most existing hashing methods require relatively long binary codes (i.e., over hundreds of bits, sometimes even thousands of bits) to achieve reasonable retrieval accuracies. However, for some realistic and unique applications, such as on wearable or mobile devices, only short binary codes can be used for efficient image retrieval due to the limitation of computational resources or bandwidth on these devices. In this paper, we propose a novel unsupervised hashing approach called min-cost ranking (MCR) specifically for learning powerful short binary codes (i.e., usually the code length shorter than 100 b) for scalable image retrieval tasks. By exploring the discriminative ability of each dimension of data, MCR can generate one bit binary code for each dimension and simultaneously rank the discriminative separability of each bit according to the proposed cost function. Only top-ranked bits with minimum cost-values are then selected and grouped together to compose the final salient binary codes. Extensive experimental results on large-scale retrieval demonstrate that MCR can achieve comparative performance as the state-of-the-art hashing algorithms but with significantly shorter codes, leading to much faster large-scale retrieval.
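
    A simplified sketch of the general recipe (one candidate bit per data dimension, rank the bits, keep only the top-ranked ones) using NumPy. The separability proxy used here is not the MCR cost function from the paper; the data, code length, and retrieval step are illustrative only.

        import numpy as np

        rng = np.random.default_rng(0)
        features = rng.normal(size=(10000, 512))             # e.g. image descriptors

        # One candidate bit per dimension: threshold at the per-dimension median.
        thresholds = np.median(features, axis=0)
        bits = features > thresholds                         # boolean matrix, shape (10000, 512)

        # Rank dimensions by a simple separability proxy (mean distance from the threshold).
        score = np.abs(features - thresholds).mean(axis=0)
        top = np.argsort(-score)[:64]                        # keep the 64 top-ranked bits
        codes = np.packbits(bits[:, top], axis=1)            # 64-bit codes, 8 bytes per image

        # Retrieval: rank the database by Hamming distance to a query code.
        query = codes[0]
        hamming = np.unpackbits(codes ^ query, axis=1).sum(axis=1)
        print("nearest neighbours of item 0:", np.argsort(hamming)[:5])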

  14. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large-scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.

  15. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crucial in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, it is noted that large scale impact events continue to affect the local geology of the planets. 92 references

  16. Origin of the large scale structures of the universe

    International Nuclear Information System (INIS)

    Oaknin, David H.

    2004-01-01

    We revise the statistical properties of the primordial cosmological density anisotropies that, at the time of matter-radiation equality, seeded the gravitational development of large scale structures in the otherwise homogeneous and isotropic Friedmann-Robertson-Walker flat universe. Our analysis shows that random fluctuations of the density field at the same instant of equality and with comoving wavelength shorter than the causal horizon at that time can naturally account, when globally constrained to conserve the total mass (energy) of the system, for the observed scale invariance of the anisotropies over cosmologically large comoving volumes. Statistical systems with similar features are generically known as glasslike or latticelike. Obviously, these conclusions conflict with the widely accepted understanding of the primordial structures reported in the literature, which requires an epoch of inflationary cosmology to precede the standard expansion of the universe. The origin of the conflict must be found in the widespread, but unjustified, claim that scale invariant mass (energy) anisotropies at the instant of equality over comoving volumes of cosmological size, larger than the causal horizon at the time, must be generated by fluctuations in the density field with comparably large comoving wavelength

  17. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  18. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed using the GPU, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using algorithms written for the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were solved with the algorithm running on the GPU to test the performance of the package. A comparison of the calculation results between the solver executed on a single CPU and the one on the GPU showed that the GPU version is up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
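
    The semi-implicit Fourier update at the core of such packages can be sketched in a few lines. The NumPy version below integrates the Allen-Cahn equation on the CPU (a GPU implementation would replace the FFT calls with CUDA counterparts, e.g. via CuPy); the grid size, time step, and parameters are illustrative, and the package itself is not reproduced.

        import numpy as np

        N, dt, M, kappa = 256, 0.1, 1.0, 1.0
        phi = 0.01 * np.random.default_rng(0).standard_normal((N, N))   # initial noise

        k = 2.0 * np.pi * np.fft.fftfreq(N)          # assuming unit grid spacing
        k2 = k[:, None] ** 2 + k[None, :] ** 2

        for step in range(1000):
            fprime = phi ** 3 - phi                  # derivative of the double-well potential
            # nonlinear term explicit, Laplacian implicit (semi-implicit Fourier step)
            phi_hat = (np.fft.fft2(phi) - dt * M * np.fft.fft2(fprime)) / (1.0 + dt * M * kappa * k2)
            phi = np.fft.ifft2(phi_hat).real

        print("phase fractions:", float((phi > 0).mean()), float((phi < 0).mean()))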

  19. Large scale scenario analysis of future low carbon energy options

    International Nuclear Information System (INIS)

    Olaleye, Olaitan; Baker, Erin

    2015-01-01

    In this study, we use a multi-model framework to examine a set of possible future energy scenarios resulting from R&D investments in Solar, Nuclear, Carbon Capture and Storage (CCS), Bio-fuels, Bio-electricity, and Batteries for Electric Transportation. Based on a global scenario analysis, we examine the impact on the economy of advancement in energy technologies, considering both individual technologies and the interactions between pairs of technologies, with a focus on the role of uncertainty. Nuclear and CCS have the most impact on abatement costs, with CCS mostly important at high levels of abatement. We show that CCS and Bio-electricity are complements, while most of the other energy technology pairs are substitutes. We also examine for stochastic dominance between R&D portfolios: given the uncertainty in R&D outcomes, we examine which portfolios would be preferred by all decision-makers, regardless of their attitude toward risk. We observe that portfolios with CCS tend to stochastically dominate those without CCS; and portfolios lacking CCS and Nuclear tend to be stochastically dominated by others. We find that the dominance of CCS becomes even stronger as uncertainty in climate damages increases. Finally, we show that there is significant value in carefully choosing a portfolio, as relatively small portfolios can dominate large portfolios. - Highlights: • We examine future energy scenarios in the face of R&D and climate uncertainty. • We examine the impact of advancement in energy technologies and pairs of technologies. • CCS complements Bio-electricity while most technology pairs are substitutes. • R&D portfolios without CCS are stochastically dominated by portfolios with CCS. • Higher damage uncertainty favors R&D development of CCS and Bio-electricity
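
    The portfolio comparison described above can be illustrated with a simple dominance check. The sketch below tests first-order stochastic dominance between Monte Carlo payoff samples of two hypothetical portfolios ("with CCS" / "without CCS"); the numbers are invented, and the paper's actual dominance criterion and scenario model are not reproduced here.

        import numpy as np

        def first_order_dominates(payoffs_a, payoffs_b):
            """True if A's empirical CDF never lies above B's (and is lower somewhere),
            i.e. every decision-maker who prefers higher payoffs would prefer A."""
            grid = np.union1d(payoffs_a, payoffs_b)
            cdf_a = np.searchsorted(np.sort(payoffs_a), grid, side="right") / len(payoffs_a)
            cdf_b = np.searchsorted(np.sort(payoffs_b), grid, side="right") / len(payoffs_b)
            return bool(np.all(cdf_a <= cdf_b) and np.any(cdf_a < cdf_b))

        # Hypothetical Monte Carlo outcomes (payoff = negative abatement cost) for two portfolios.
        rng = np.random.default_rng(0)
        with_ccs = rng.normal(-80.0, 10.0, 5000)       # lower cost on average
        without_ccs = rng.normal(-100.0, 10.0, 5000)
        print(first_order_dominates(with_ccs, without_ccs))   # expected: True for these draws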

  20. MicroEcos: Micro-Scale Explorations of Large-Scale Late Pleistocene Ecosystems

    Science.gov (United States)

    Gellis, B. S.

    2017-12-01

    Pollen data can inform the reconstruction of early-floral environments by providing data for artistic representations of what early-terrestrial ecosystems looked like, and how existing terrestrial landscapes have evolved. For example, what did the Bighorn Basin look like when large ice sheets covered modern Canada, the Yellowstone Plateau had an ice cap, and the Bighorn Mountains were mantled with alpine glaciers? MicroEcos is an immersive, multimedia project that aims to strengthen human-nature connections through the understanding and appreciation of biological ecosystems. Collected pollen data elucidates flora that are visible in the fossil record - associated with the Late-Pleistocene - and have been illustrated and described in botanical literature. It aims to make scientific data accessible and interesting to all audiences through a series of interactive-digital sculptures, large-scale photography and field-based videography. While this project is driven by scientific data, it is rooted in deeply artistic and outreach-based practices, including digital design, illustration, photography, video and sound design. Using 3D modeling and printing technology, MicroEcos centers around a series of 3D-printed models of the Last Canyon rock shelter on the Wyoming and Montana border, the Little Windy Hill pond site in Wyoming's Medicine Bow National Forest, and the Natural Trap Cave site in Wyoming's Big Horn Basin. These digital, interactive 3D sculptures provide audiences with glimpses of three-dimensional Late-Pleistocene environments and help create dialogue about how grass-, sagebrush-, and spruce-based ecosystems form. To help audiences better contextualize how MicroEcos bridges notions of time, space, and place, modern photography and videography of the Last Canyon, Little Windy Hill and Natural Trap Cave sites surround these 3D-digital reconstructions.

  1. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, two-thirds of the test cohort were used, showing an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining one-third of the test cohort showed similar performance. Patients with a large artery occlusion on angiography and PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale showed performance equal to that of other published scales for predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
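
    A minimal sketch of the scoring rule as described in the abstract: one point for each of the three NIHSS-derived items that is abnormal, with PASS >= 2 flagging a possible emergent large vessel occlusion. The item coding below is simplified (any non-zero sub-score counts as abnormal) and is illustrative only, not a clinical implementation.

        def pass_score(nihss_loc_questions, nihss_gaze, nihss_arm_weakness):
            """Prehospital Acute Stroke Severity (PASS): one point per abnormal item."""
            score = 0
            score += 1 if nihss_loc_questions > 0 else 0   # cannot correctly state month / own age
            score += 1 if nihss_gaze > 0 else 0            # gaze palsy / deviation
            score += 1 if nihss_arm_weakness > 0 else 0    # arm weakness
            return score

        # Suspect ELVO when PASS >= 2.
        print(pass_score(1, 1, 0) >= 2)   # True  -> consider triage for possible thrombectomy
        print(pass_score(0, 0, 1) >= 2)   # False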

  2. Empirical research methods for technology validation: Scaling up to practice

    NARCIS (Netherlands)

    Wieringa, Roelf J.

    Before technology is transferred to the market, it must be validated empirically by simulating future practical use of the technology. Technology prototypes are first investigated in simplified contexts, and these simulations are scaled up to conditions of practice step by step as more becomes

  3. Large-scale stabilization control of input-constrained quadrotor

    Directory of Open Access Journals (Sweden)

    Jun Jiang

    2016-10-01

    Full Text Available The quadrotor has been the most popular aircraft in the last decade due to its excellent dynamics and continues to attract ever-increasing research interest. Delivering a quadrotor from a large fixed-wing aircraft is a promising application of quadrotors. In such an application, the quadrotor needs to switch from a highly unstable status, featured as large initial states, to a safe and stable flight status. This is the so-called large-scale stability control problem. In such an extreme scenario, the quadrotor is at risk of actuator saturation. This can cause the controller to update incorrectly and lead the quadrotor to spiral and crash. In this article, to safely control the quadrotor in such scenarios, the control input constraint is analyzed. The key states of a quadrotor dynamic model are selected, and a two-dimensional dynamic model is extracted based on a symmetrical body configuration. A generalized point-wise min-norm nonlinear control method is proposed based on the Lyapunov function, and large-scale stability control is hence achieved. An enhanced point-wise, min-norm control is further provided to improve the attitude control performance, with altitude performance degenerating slightly. Simulation results showed that the proposed control methods can stabilize the input-constrained quadrotor and the enhanced method can improve the performance of the quadrotor in critical states.

  4. Remote Sensing Image Classification With Large-Scale Gaussian Processes

    Science.gov (United States)

    Morales-Alvarez, Pablo; Perez-Suay, Adrian; Molina, Rafael; Camps-Valls, Gustau

    2018-02-01

    Current remote sensing image classification problems have to deal with an unprecedented amount of heterogeneous and complex data sources. Upcoming missions will soon provide large data streams that will make land cover/use classification difficult. Machine learning classifiers can help at this, and many methods are currently available. A popular kernel classifier is the Gaussian process classifier (GPC), since it approaches the classification problem with a solid probabilistic treatment, thus yielding confidence intervals for the predictions as well as very competitive results to state-of-the-art neural networks and support vector machines. However, its computational cost is prohibitive for large scale applications, and constitutes the main obstacle precluding wide adoption. This paper tackles this problem by introducing two novel efficient methodologies for Gaussian Process (GP) classification. We first include the standard random Fourier features approximation into GPC, which largely decreases its computational cost and permits large scale remote sensing image classification. In addition, we propose a model which avoids randomly sampling a number of Fourier frequencies, and alternatively learns the optimal ones within a variational Bayes approach. The performance of the proposed methods is illustrated in complex problems of cloud detection from multispectral imagery and infrared sounding data. Excellent empirical results support the proposal in both computational cost and accuracy.
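
    The first of the two ideas above (plugging random Fourier features into a kernel classifier so that training scales to large images) can be approximated in a few lines with scikit-learn. Logistic regression on random features is only a rough stand-in for a full Gaussian process classifier, and the variational learning of frequencies proposed in the paper is not shown; the dataset is synthetic.

        from sklearn.datasets import make_classification
        from sklearn.kernel_approximation import RBFSampler
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        X, y = make_classification(n_samples=20000, n_features=20, random_state=0)

        # Random Fourier features turn the RBF-kernel classifier into a linear model,
        # so the training cost grows linearly with the number of samples (pixels).
        clf = make_pipeline(
            RBFSampler(gamma=0.1, n_components=500, random_state=0),
            LogisticRegression(max_iter=1000),
        )
        clf.fit(X, y)
        print("training accuracy:", clf.score(X, y))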

  5. Some ecological guidelines for large-scale biomass plantations

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, W.; Cook, J.H.; Beyea, J. [National Audubon Society, Tavernier, FL (United States)

    1993-12-31

    The National Audubon Society sees biomass as an appropriate and necessary source of energy to help replace fossil fuels in the near future, but is concerned that large-scale biomass plantations could displace significant natural vegetation and wildlife habitat, and reduce national and global biodiversity. We support the development of an industry large enough to provide significant portions of our energy budget, but we see a critical need to ensure that plantations are designed and sited in ways that minimize ecological disruption, or even provide environmental benefits. We have been studying the habitat value of intensively managed short-rotation tree plantations. Our results show that these plantations support large populations of some birds, but not all of the species using the surrounding landscape, and indicate that their value as habitat can be increased greatly by including small areas of mature trees within them. We believe short-rotation plantations can benefit regional biodiversity if they can be deployed as buffers for natural forests, or as corridors connecting forest tracts. To realize these benefits, and to avoid habitat degradation, regional biomass plantation complexes (e.g., the plantations supplying all the fuel for a powerplant) need to be planned, sited, and developed as large-scale units in the context of the regional landscape mosaic.

  6. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  7. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
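
    A compact sketch of the Broyden idea discussed above, using the rank-1 "good Broyden" secant update so that no Jacobian evaluations are needed. The limited-memory variant and the large-scale fluid-flow and circuit applications from the report are not reproduced; the test system and starting point are illustrative.

        import numpy as np

        def broyden(F, x0, tol=1e-10, max_iter=100):
            """Solve F(x) = 0 with Broyden's method (rank-1 secant updates of an approximate Jacobian)."""
            x = np.asarray(x0, dtype=float)
            B = np.eye(x.size)                       # initial Jacobian approximation
            Fx = F(x)
            for _ in range(max_iter):
                dx = np.linalg.solve(B, -Fx)         # quasi-Newton step
                x_new = x + dx
                Fx_new = F(x_new)
                if np.linalg.norm(Fx_new) < tol:
                    return x_new
                dF = Fx_new - Fx
                B += np.outer(dF - B @ dx, dx) / (dx @ dx)   # "good Broyden" rank-1 update
                x, Fx = x_new, Fx_new
            return x

        # Example: a small nonlinear system, started reasonably close to the root.
        F = lambda x: np.array([x[0] ** 2 + x[1] ** 2 - 1.0, x[0] - x[1]])
        print(broyden(F, [0.8, 0.6]))                # approximately [0.7071, 0.7071]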

  8. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run in highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by a high degree of volatility of their components and by the need to provide efficient service management and to handle large amounts of data efficiently. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  9. Advances in Large-Scale Solar Heating and Long Term Storage in Denmark

    DEFF Research Database (Denmark)

    Heller, Alfred

    2000-01-01

According to information from the European Large-Scale Solar Heating Network (see http://www.hvac.chalmers.se/cshp/), the area of installed solar collectors for large-scale applications in Europe is approximately 8 million m2, corresponding to about 4000 MW of thermal power. 11 of the total 51 plants are equipped with long-term storage. In Denmark, 7 plants are installed, comprising approx. 18,000 m2 of collector area, with new plants planned. The development of these plants and the involved technologies is presented in this paper, with a focus on improvements for Danish Central Solar Heating Plants servicing District Heating and related developments in large-scale thermal storage. Central solar heating today is a mature and economically realistic solution for district heating based on a renewable source. The cost of solar collectors has decreased by nearly ¼ during...

  10. Soil organic carbon - a large scale paired catchment assessment

    Science.gov (United States)

    Kunkel, V.; Hancock, G. R.; Wells, T.

    2016-12-01

Soil organic carbon (SOC) concentration can vary both spatially and temporally, driven by differences in soil properties, topography and climate. However, most studies have focused on point-scale data sets, with a paucity of studies examining larger-scale catchments. Here we examine the spatial and temporal distribution of SOC for two large catchments, the Krui (575 km2) and Merriwa River (675 km2) catchments (New South Wales, Australia), which have similar shape, soils, topography and orientation. We show that the SOC distribution is very similar for both catchments and that elevation (and the associated increase in soil moisture) is a major influence on SOC. We also show that there is little change in SOC from the initial assessment in 2006 to 2015, despite a major drought from 2003 to 2010 and extreme rainfall events in 2007 and 2010; SOC concentration therefore appears robust. However, we found significant relationships between erosion and deposition patterns (as quantified using 137Cs) and SOC for both catchments, again demonstrating a strong geomorphic relationship. Vegetation across the catchments was assessed using remote sensing (Landsat and MODIS). Vegetation patterns were temporally consistent, with above-ground biomass increasing with elevation. SOC could be predicted using both low- and high-resolution remote sensing platforms. Results indicate that, although moderate resolution (250 m) allows for reasonable prediction of the spatial distribution of SOC, the higher resolution (30 m) improved the strength of the SOC-NDVI relationship. The relationship between SOC and 137Cs, as a surrogate for the erosion and deposition of SOC, suggested that sediment transport and deposition influence the distribution of SOC within the catchment. The findings demonstrate that, at the large catchment scale and at the decadal time scale, SOC is relatively constant and can largely be predicted by topography.

  11. Technology scale and supply chains in a secure, affordable and low carbon energy transition

    International Nuclear Information System (INIS)

    Hoggett, Richard

    2014-01-01

Highlights: • Energy systems need to decarbonise, provide security and remain affordable. • There is uncertainty over which technologies will best enable this to happen. • A strategy to deal with uncertainty is to assess a technology's ability to show resilience, flexibility and adaptability. • Scale is important, and smaller-scale technologies are likely to display the above characteristics. • Smaller-scale technologies are therefore more likely to enable a sustainable, secure, and affordable energy transition. - Abstract: This research explores the relationship between technology scale, energy security and decarbonisation within the UK energy system. There is considerable uncertainty about how best to deliver on these goals for energy policy, but a focus on supply chains and their resilience can provide useful insights into the problems uncertainty causes. Technology scale is central to this, and through an analysis of the supply chains of nuclear power and solar photovoltaics, it is suggested that smaller-scale technologies are more likely to support and enable a secure, low-carbon energy transition. This is because their supply chains are less complex, show more flexibility and adaptability, and can quickly respond to changes within an energy system; as such they are more resilient than large-scale technologies. These characteristics are likely to become increasingly important in a rapidly changing energy system, and prioritising those technologies that demonstrate resilience, flexibility and adaptability will better enable a transition that is rapid, sustainable, secure and affordable.

  12. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two coaxial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 × 10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Ro_t = -0.0909 to Ro_t = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, "over-damped" LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
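    For context, the static Smagorinsky closure referred to above builds an eddy viscosity from the constant c_s, the filter width Δ and the resolved strain-rate tensor; this is the standard textbook form, not an equation reproduced from the paper.

    ```latex
    \nu_t = (c_s \Delta)^2\,|\bar{S}|,\qquad
    |\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}},\qquad
    \bar{S}_{ij} = \tfrac{1}{2}\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right)
    ```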

  13. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues, along with the impact of sub-optimal integration technology, mean that the time required to deploy, integrate and stabilize large-scale systems may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  14. UAV Data Processing for Large Scale Topographical Mapping

    Directory of Open Access Journals (Sweden)

    W. Tampubolon

    2014-06-01

Full Text Available Large scale topographical mapping in third world countries is a prominent challenge in the geospatial industry nowadays. On one side the demand is significantly increasing, while on the other hand it is constrained by the limited budgets available for mapping projects. Since the advent of Act Nr. 4/yr. 2011 on Geospatial Information in Indonesia, large scale topographical mapping has been a high priority for supporting nationwide development, e.g. detailed spatial planning. Usually large scale topographical mapping relies on conventional aerial survey campaigns in order to provide high resolution 3D geospatial data sources. Growing widely as a leisure hobby, aero models in the form of the so-called Unmanned Aerial Vehicle (UAV) bring up alternative semi-photogrammetric aerial data acquisition possibilities suitable for a relatively small Area of Interest (AOI), i.e. <5,000 hectares. For detailed spatial planning purposes in Indonesia this area size can be used as a mapping unit, since planning usually concentrates on the sub-district (kecamatan) level. In this paper different camera and processing software systems are analyzed in order to identify the optimum UAV data acquisition campaign components in combination with the data processing scheme. The selected AOI covers the cultural heritage of Borobudur Temple, one of the Seven Wonders of the World. A detailed accuracy assessment concentrates in the first place on the object feature of the temple. Feature compilation involving planimetric objects (2D) and digital terrain models (3D) will be integrated in order to provide Digital Elevation Models (DEM) as the main interest of the topographic mapping activity. Incorporating the optimum number of GCPs in the UAV photo data processing will increase the accuracy along with its high resolution of 5 cm Ground Sampling Distance (GSD). Finally this result will be used as the benchmark for alternative
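    As a side note on the 5 cm ground sampling distance quoted above, GSD for a nadir frame camera follows from the usual photogrammetric relation between pixel pitch p, flying height H and focal length f; the numbers in the comments are hypothetical, not values from this project.

    ```latex
    \mathrm{GSD} = \frac{p \, H}{f}
    % e.g. p = 5\ \mu\mathrm{m},\ f = 16\ \mathrm{mm},\ H = 160\ \mathrm{m}
    % \Rightarrow \mathrm{GSD} = 5\times10^{-6}\cdot 160 / 0.016 = 0.05\ \mathrm{m} = 5\ \mathrm{cm}
    ```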

  15. The predictability of large-scale wind-driven flows

    Directory of Open Access Journals (Sweden)

    A. Mahadevan

    2001-01-01

    Full Text Available The singular values associated with optimally growing perturbations to stationary and time-dependent solutions for the general circulation in an ocean basin provide a measure of the rate at which solutions with nearby initial conditions begin to diverge, and hence, a measure of the predictability of the flow. In this paper, the singular vectors and singular values of stationary and evolving examples of wind-driven, double-gyre circulations in different flow regimes are explored. By changing the Reynolds number in simple quasi-geostrophic models of the wind-driven circulation, steady, weakly aperiodic and chaotic states may be examined. The singular vectors of the steady state reveal some of the physical mechanisms responsible for optimally growing perturbations. In time-dependent cases, the dominant singular values show significant variability in time, indicating strong variations in the predictability of the flow. When the underlying flow is weakly aperiodic, the dominant singular values co-vary with integral measures of the large-scale flow, such as the basin-integrated upper ocean kinetic energy and the transport in the western boundary current extension. Furthermore, in a reduced gravity quasi-geostrophic model of a weakly aperiodic, double-gyre flow, the behaviour of the dominant singular values may be used to predict a change in the large-scale flow, a feature not shared by an analogous two-layer model. When the circulation is in a strongly aperiodic state, the dominant singular values no longer vary coherently with integral measures of the flow. Instead, they fluctuate in a very aperiodic fashion on mesoscale time scales. The dominant singular vectors then depend strongly on the arrangement of mesoscale features in the flow and the evolved forms of the associated singular vectors have relatively short spatial scales. These results have several implications. In weakly aperiodic, periodic, and stationary regimes, the mesoscale energy
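    A minimal numerical illustration of the quantity being tracked, not the quasi-geostrophic model itself: for a linear propagator M that maps an initial perturbation to its evolved state, the optimal growth factor is the leading singular value of M and the optimal initial perturbation is the corresponding right singular vector. The matrix below is a random, hypothetical stand-in for the tangent-linear propagator.

    ```python
    import numpy as np

    # Hypothetical linear propagator of perturbations over one forecast interval
    # (in the paper this role is played by the tangent-linear QG dynamics).
    rng = np.random.default_rng(0)
    M = np.eye(50) + 0.2 * rng.standard_normal((50, 50))

    U, s, Vt = np.linalg.svd(M)
    print("optimal perturbation growth factor:", s[0])  # leading singular value
    print("optimal initial perturbation:", Vt[0])       # corresponding right singular vector
    ```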

  16. Systematic renormalization of the effective theory of Large Scale Structure

    International Nuclear Information System (INIS)

    Abolhasani, Ali Akbar; Mirbabayi, Mehrdad; Pajer, Enrico

    2016-01-01

A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contributions of short distance perturbations to the large scale density contrast δ and momentum density π(k) scale as k^2 and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.
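    For orientation, the k^2 scaling quoted above is what licenses the standard leading counterterm for the density power spectrum in the effective theory; schematically, with c_s^2 a free coefficient fixed by the renormalization conditions and the overall normalization convention-dependent:

    ```latex
    P_{\mathrm{ctr}}(k) \;\propto\; -\,c_s^{2}\, k^{2}\, P_{\mathrm{lin}}(k)
    ```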

  17. Quantitative approach to the topology of large-scale structure

    International Nuclear Information System (INIS)

    Gott, J.R. III; Weinberg, D.H.; Melott, A.L.; Kansas Univ., Lawrence)

    1987-01-01

A quantitative measure of the topology of large-scale structure, the genus of density contours in a smoothed density distribution, is described and applied. For random phase (Gaussian) density fields, the mean genus per unit volume exhibits a universal dependence on threshold density, with a normalizing factor that can be calculated from the power spectrum. If large-scale structure formed from the gravitational instability of small-amplitude density fluctuations, the topology observed today on suitable scales should follow the topology in the initial conditions. The technique is illustrated by applying it to simulations of galaxy clustering in a flat universe dominated by cold dark matter. The technique is also applied to a volume-limited sample of the CfA redshift survey and to a model in which galaxies reside on the surfaces of polyhedral bubbles. The topology of the evolved mass distribution and biased galaxy distribution in the cold dark matter models closely matches the topology of the density fluctuations in the initial conditions. The topology of the observational sample is consistent with the random phase, cold dark matter model. 22 references
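    The "universal dependence on threshold density" for random-phase (Gaussian) fields mentioned above is the standard result that the genus per unit volume, as a function of the threshold ν expressed in standard deviations of the smoothed field, takes the form below, with the amplitude A fixed by the power spectrum:

    ```latex
    g(\nu) \;=\; A\,\bigl(1-\nu^{2}\bigr)\, e^{-\nu^{2}/2}
    ```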

  18. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor-optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
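    A toy sketch of the tile low-rank idea behind HiCMA, not HiCMA's API (which is a distributed-memory C library): each data-sparse off-diagonal tile is replaced by truncated-SVD factors kept only to the accuracy the application requires. The kernel and tolerance below are hypothetical.

    ```python
    import numpy as np

    def compress_tile(tile, tol=1e-8):
        """Return factors U, V with tile ≈ U @ V, truncated at a relative tolerance."""
        U, s, Vt = np.linalg.svd(tile, full_matrices=False)
        rank = int(np.sum(s > tol * s[0]))            # numerical rank at tolerance tol
        return U[:, :rank] * s[:rank], Vt[:rank, :]   # absorb singular values into U

    # hypothetical data-sparse tile: a smooth kernel evaluated on a grid
    x = np.linspace(0.0, 1.0, 256)
    tile = 1.0 / (1.0 + np.abs(np.subtract.outer(x, x + 10.0)))
    U, V = compress_tile(tile)
    print(tile.shape, "compressed to rank", U.shape[1])
    ```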

  19. Large-Scale Industrial Production of Superconducting Cavities

    CERN Document Server

    Chiaveri, Enrico

    1996-01-01

Many laboratories around the world, notably CEBAF, CERN, DESY and KEK, after a period of research and development, are presently or have recently been involved in the industrial production of a large number of RF superconducting cavities. CERN, instead of using the standard bulk niobium technique, has developed a new Nb/Cu technology (niobium film deposited by magnetron sputtering on copper). The aim of this paper is to present the transfer of this technology to three European firms [Ansaldo, CERCA and Siemens (now ACCEL)]. Emphasis will be placed on the major challenges to industry of mastering the very complex procedure (which requires high quality control at every stage of the production) needed to achieve a very demanding final RF performance [Q(6 MV/m) = 3.4×10^9 at 4.5 K].

  20. The Effect of Large Scale Salinity Gradient on Langmuir Turbulence

    Science.gov (United States)

    Fan, Y.; Jarosz, E.; Yu, Z.; Jensen, T.; Sullivan, P. P.; Liang, J.

    2017-12-01

Langmuir circulation (LC) is believed to be one of the leading-order causes of turbulent mixing in the upper ocean. It is important for momentum and heat exchange across the mixed layer (ML) and directly impacts the dynamics and thermodynamics of the upper ocean and lower atmosphere, including the vertical distributions of chemical, biological, optical, and acoustic properties. Based on Craik and Leibovich (1976) theory, large eddy simulation (LES) models have been developed to simulate LC in the upper ocean, yielding new insights that could not be obtained from field observations and turbulence closure models. Due to its high computational cost, LES is usually limited to small domain sizes and cannot resolve large-scale flows. Furthermore, most LES models used in LC simulations use periodic boundary conditions in the horizontal direction, which assumes that the physical properties (i.e. temperature and salinity) and expected flow patterns in the area of interest are of a periodically repeating nature, so that the limited small LES domain is representative of the larger area. Using periodic boundary conditions can significantly reduce the computational effort, and it is a good assumption for isotropic shear turbulence. However, LC is anisotropic (McWilliams et al 1997) and was observed to be modulated by crosswind tidal currents (Kukulka et al 2011). Using symmetrical domains, idealized LES studies also indicate LC could interact with oceanic fronts (Hamlington et al 2014) and standing internal waves (Chini and Leibovich, 2005). The present study expands our previous LES modeling investigations of Langmuir turbulence to real ocean conditions with large-scale environmental motion that features fresh water inflow into the study region. Large-scale gradient forcing is introduced into the NCAR LES model through scale separation analysis. The model is applied to a field observation in the Gulf of Mexico in July 2016, when the measurement site was impacted by
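    For reference, the Craik-Leibovich (1976) formulation cited above augments the averaged momentum equation with a vortex force built from the Stokes drift u_s and the resolved vorticity ω; this is the schematic core of the theory, not the full LES equation set used in the study.

    ```latex
    \mathbf{F}_{\mathrm{CL}} \;=\; \mathbf{u}_s \times \boldsymbol{\omega}
    ```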

  1. FIELD DEMONSTRATION OF EMERGING PIPE WALL INTEGRITY ASSESSMENT TECHNOLOGIES FOR LARGE CAST IRON WATER MAINS

    Science.gov (United States)

    The U.S. Environmental Protection Agency sponsored a large-scale field demonstration of innovative leak detection/location and condition assessment technologies on a 76-year old, 2,500-ft long, cement-lined, 24-in. cast iron water main in Louisville, KY from July through Septembe...

  2. Field Demonstration of Emerging Pipe Wall Integrity Assessment Technologies for Large Cast Iron Water Mains - Paper

    Science.gov (United States)

    The U.S. Environmental Protection Agency (EPA) sponsored a large-scale field demonstration of innovative leak detection/location and condition assessment technologies on a 76-year old, 2,000-ft long, cement-lined, 24-in. cast-iron water main in Louisville, KY from July through Se...

  3. Periodic cells for large-scale problem initialization

    Directory of Open Access Journals (Sweden)

    Ciantia Matteo O.

    2017-01-01

Full Text Available In geotechnical applications the success of the discrete element method (DEM) in simulating fundamental aspects of soil behaviour has increased the interest in applications for the direct simulation of engineering-scale boundary value problems (BVPs). The main problem is that the method remains relatively expensive in terms of computational cost. A non-negligible part of that cost is related to specimen creation and initialization. As the response of soil is strongly dependent on its initial state (stress and porosity), attaining a specified initial state is a crucial part of a DEM model. Different procedures for controlled sample generation are available. However, applying the existing REV-oriented initialization procedures to such models is inefficient in terms of computational cost and challenging in terms of sample homogeneity. In this work a simple but efficient procedure to initialize large-scale DEM models is presented. Periodic cells are first generated with a sufficient number of particles matching a desired particle size distribution (PSD). The cells are then equilibrated at low-level isotropic stress at the target porosity. Once the cell is in equilibrium, it is replicated in space in order to fill the model domain. After the domain is thus filled, a few mechanical cycles are needed to re-equilibrate the large domain. The result is a large, homogeneous sample, equilibrated under the prescribed stress at the desired porosity. The method is applicable to both isotropic and anisotropic initial stress states, with stress magnitude varying in space.
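    A schematic of the replication step described above, written as illustrative Python rather than the authors' DEM code: an equilibrated periodic cell of particle centres is tiled across the model domain by translating copies of the cell, after which the assembly would be re-equilibrated. Cell size, particle count and tiling counts are hypothetical.

    ```python
    import numpy as np

    def replicate_cell(centres, cell_size, reps):
        """Tile an equilibrated periodic cell (N x 3 particle centres) reps = (nx, ny, nz) times."""
        copies = []
        for i in range(reps[0]):
            for j in range(reps[1]):
                for k in range(reps[2]):
                    shift = np.array([i, j, k], dtype=float) * cell_size
                    copies.append(centres + shift)   # translate the whole cell
        return np.vstack(copies)

    # hypothetical 100-particle cell of edge length 0.01 m, tiled 10 x 10 x 5 times
    cell = np.random.default_rng(1).uniform(0.0, 0.01, size=(100, 3))
    domain = replicate_cell(cell, 0.01, (10, 10, 5))
    print(domain.shape)   # (50000, 3)
    ```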

  4. Detecting differential protein expression in large-scale population proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Soyoung; Qian, Weijun; Camp, David G.; Smith, Richard D.; Tompkins, Ronald G.; Davis, Ronald W.; Xiao, Wenzhong

    2014-06-17

Mass spectrometry-based high-throughput quantitative proteomics shows great potential in clinical biomarker studies, identifying and quantifying thousands of proteins in biological samples. However, methods are needed to appropriately handle issues and challenges unique to mass spectrometry data in order to detect as many biomarker proteins as possible. One issue is that different mass spectrometry experiments generate quite different total numbers of quantified peptides, which can result in more missing peptide abundances in an experiment with a smaller total number of quantified peptides. Another issue is that the quantification of peptides is sometimes absent, especially for less abundant peptides, and such missing values carry information about the peptide abundance. Here, we propose a Significance Analysis for Large-scale Proteomics Studies (SALPS) that handles missing peptide intensity values caused by the two mechanisms mentioned above. Our model has robust performance on both simulated data and proteomics data from a large clinical study. Because varying patient sample quality and deviating instrument performance are not avoidable for clinical studies performed over the course of several years, we believe that our approach will be useful for analyzing large-scale clinical proteomics data.

  5. Large Scale Community Detection Using a Small World Model

    Directory of Open Access Journals (Sweden)

    Ranjan Kumar Behera

    2017-11-01

Full Text Available In a social network, small or large communities within the network play a major role in deciding the functionalities of the network. Despite diverse definitions, communities in a network may be defined as groups of nodes that are more densely connected to each other than to nodes outside the group. Revealing such hidden communities is one of the challenging research problems. A real-world social network follows the small-world phenomenon, which indicates that any two social entities can be reached in a small number of steps. In this paper, nodes are mapped into communities based on random walks in the network. However, uncovering communities in large-scale networks is a challenging task due to the unprecedented growth in the size of social networks. A good number of community detection algorithms based on random walks exist in the literature. In addition, when large-scale social networks are considered, these algorithms are observed to take considerably more time. In this work, with an objective to improve the efficiency of such algorithms, a parallel programming framework like Map-Reduce has been considered for uncovering the hidden communities in social networks. The proposed approach has been compared with some standard existing community detection algorithms on both synthetic and real-world datasets in order to examine its performance, and it is observed that the proposed algorithm is more efficient than the existing ones.
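    As a rough illustration of the random-walk intuition only, and not the Map-Reduce algorithm evaluated in the paper: nodes can be grouped by where short random walks started from them tend to end, since walks rarely cross sparse cuts between dense groups. The toy graph, walk length and walk count below are hypothetical.

    ```python
    import random
    from collections import Counter, defaultdict

    def walk_communities(adj, walk_len=5, n_walks=50, seed=0):
        """Label each node with the most frequent endpoint of short random walks (toy heuristic)."""
        rng = random.Random(seed)
        label = {}
        for node in list(adj):
            ends = Counter()
            for _ in range(n_walks):
                cur = node
                for _ in range(walk_len):
                    cur = rng.choice(adj[cur])
                ends[cur] += 1
            label[node] = ends.most_common(1)[0][0]   # representative endpoint acts as community id
        return label

    # hypothetical graph: two triangles joined by a single bridge edge
    adj = defaultdict(list)
    for a, b in [(0, 1), (1, 2), (2, 0), (3, 4), (4, 5), (5, 3), (2, 3)]:
        adj[a].append(b)
        adj[b].append(a)
    print(walk_communities(adj))
    ```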

  6. Punishment sustains large-scale cooperation in prestate warfare

    Science.gov (United States)

    Mathew, Sarah; Boyd, Robert

    2011-01-01

    Understanding cooperation and punishment in small-scale societies is crucial for explaining the origins of human cooperation. We studied warfare among the Turkana, a politically uncentralized, egalitarian, nomadic pastoral society in East Africa. Based on a representative sample of 88 recent raids, we show that the Turkana sustain costly cooperation in combat at a remarkably large scale, at least in part, through punishment of free-riders. Raiding parties comprised several hundred warriors and participants are not kin or day-to-day interactants. Warriors incur substantial risk of death and produce collective benefits. Cowardice and desertions occur, and are punished by community-imposed sanctions, including collective corporal punishment and fines. Furthermore, Turkana norms governing warfare benefit the ethnolinguistic group, a population of a half-million people, at the expense of smaller social groupings. These results challenge current views that punishment is unimportant in small-scale societies and that human cooperation evolved in small groups of kin and familiar individuals. Instead, these results suggest that cooperation at the larger scale of ethnolinguistic units enforced by third-party sanctions could have a deep evolutionary history in the human species. PMID:21670285

  7. Deep Feature Learning and Cascaded Classifier for Large Scale Data

    DEFF Research Database (Denmark)

    Prasoon, Adhish

This thesis focuses on voxel/pixel classification based approaches for image segmentation. The main application is segmentation of articular cartilage in knee MRIs. The first major contribution of the thesis deals with large-scale machine learning problems. Many medical imaging problems need huge amounts of training data to cover sufficient biological variability. Learning methods scaling badly with the number of training data points cannot be used in such scenarios. This may restrict the usage of many powerful classifiers having excellent generalization ability. We propose a cascaded classifier which... ...from data rather than having a predefined feature set. We explore the deep learning approach of convolutional neural networks (CNN) for segmenting three-dimensional medical images. We propose a novel system integrating three 2D CNNs, which have a one-to-one association with the xy, yz and zx planes of 3D...

  8. BigSUR: large-scale structured urban reconstruction

    KAUST Repository

    Kelly, Tom

    2017-11-22

    The creation of high-quality semantically parsed 3D models for dense metropolitan areas is a fundamental urban modeling problem. Although recent advances in acquisition techniques and processing algorithms have resulted in large-scale imagery or 3D polygonal reconstructions, such data-sources are typically noisy, and incomplete, with no semantic structure. In this paper, we present an automatic data fusion technique that produces high-quality structured models of city blocks. From coarse polygonal meshes, street-level imagery, and GIS footprints, we formulate a binary integer program that globally balances sources of error to produce semantically parsed mass models with associated facade elements. We demonstrate our system on four city regions of varying complexity; our examples typically contain densely built urban blocks spanning hundreds of buildings. In our largest example, we produce a structured model of 37 city blocks spanning a total of 1,011 buildings at a scale and quality previously impossible to achieve automatically.

  9. Neural Correlates of Unconsciousness in Large-Scale Brain Networks.

    Science.gov (United States)

    Mashour, George A; Hudetz, Anthony G

    2018-03-01

    The biological basis of consciousness is one of the most challenging and fundamental questions in 21st century science. A related pursuit aims to identify the neural correlates and causes of unconsciousness. We review current trends in the investigation of physiological, pharmacological, and pathological states of unconsciousness at the level of large-scale functional brain networks. We focus on the roles of brain connectivity, repertoire, graph-theoretical techniques, and neural dynamics in understanding the functional brain disconnections and reduced complexity that appear to characterize these states. Persistent questions in the field, such as distinguishing true correlates, linking neural scales, and understanding differential recovery patterns, are also addressed. Copyright © 2018 Elsevier Ltd. All rights reserved.

  10. On the Phenomenology of an Accelerated Large-Scale Universe

    Directory of Open Access Journals (Sweden)

    Martiros Khurshudyan

    2016-10-01

Full Text Available In this review paper, several new results towards the explanation of the accelerated expansion of the large-scale universe are discussed. On the other hand, inflation is the early-time accelerated era and the universe is symmetric in the sense of accelerated expansion. The accelerated expansion of the universe is one of the long-standing problems in modern cosmology, and physics in general. There are several well defined approaches to solve this problem. One of them is an assumption concerning the existence of dark energy in the recent universe. It is believed that dark energy is responsible for antigravity, while dark matter has gravitational nature and is responsible, in general, for structure formation. A different approach is an appropriate modification of general relativity, including, for instance, f(R) and f(T) theories of gravity. On the other hand, attempts to build theories of quantum gravity and assumptions about the existence of extra dimensions, possible variability of the gravitational constant and the speed of light (among others) provide interesting modifications of general relativity applicable to problems of modern cosmology, too. In particular, here two groups of cosmological models are discussed. In the first group the problem of the accelerated expansion of the large-scale universe is discussed involving a new idea, named the varying ghost dark energy. On the other hand, the second group contains cosmological models addressing the same problem involving either new parameterizations of the equation of state parameter of dark energy (like the varying polytropic gas), or nonlinear interactions between dark energy and dark matter. Moreover, for cosmological models involving varying ghost dark energy, massless particle creation in an appropriate radiation dominated universe (when the background dynamics is due to general relativity) is demonstrated as well. Exploring the nature of the accelerated expansion of the large-scale universe involving generalized

  11. Probing Inflation Using Galaxy Clustering On Ultra-Large Scales

    Science.gov (United States)

    Dalal, Roohi; de Putter, Roland; Dore, Olivier

    2018-01-01

A detailed understanding of curvature perturbations in the universe is necessary to constrain theories of inflation. In particular, measurements of the local non-Gaussianity parameter, f_NL^loc, enable us to distinguish between two broad classes of inflationary theories, single-field and multi-field inflation. While most single-field theories predict f_NL^loc ≈ -(5/12)(n_s - 1), in multi-field theories f_NL^loc is not constrained to this value and is allowed to be observably large. Achieving σ(f_NL^loc) = 1 would give us discovery potential for detecting multi-field inflation, while finding f_NL^loc = 0 would rule out a good fraction of interesting multi-field models. We study the use of galaxy clustering on ultra-large scales to achieve this level of constraint on f_NL^loc. Upcoming surveys such as Euclid and LSST will give us galaxy catalogs from which we can construct the galaxy power spectrum and hence infer a value of f_NL^loc. We consider two possible methods of determining the galaxy power spectrum from a catalog of galaxy positions: the traditional Feldman-Kaiser-Peacock (FKP) power spectrum estimator, and an Optimal Quadratic Estimator (OQE). We implemented and tested each method using mock galaxy catalogs, and compared the resulting constraints on f_NL^loc. We find that the FKP estimator can measure f_NL^loc in an unbiased way, but there remains room for improvement in its precision. We also find that the OQE is not computationally fast, but remains a promising option due to its ability to isolate the power spectrum at large scales. We plan to extend this research to study alternative methods, such as pixel-based likelihood functions. We also plan to study the impact of general relativistic effects at these scales on our ability to measure f_NL^loc.

  12. Large-eddy simulation of very-large-scale motions in atmospheric boundary-layer flows

    Science.gov (United States)

    Fang, Jiannong; Porté-Agel, Fernando

    2015-04-01

In the last few decades, laboratory experiments and direct numerical simulations of turbulent boundary layers, performed at low to moderate Reynolds numbers, have found very-large-scale motions (VLSMs) in the logarithmic and outer regions. The size of VLSMs was found to be 10-20 times as large as the boundary-layer thickness. Recently, a few studies based on field experiments examined the presence of VLSMs in neutral atmospheric boundary-layer flows, which are invariably at very high Reynolds numbers. Very large scale structures similar to those observed in laboratory-scale experiments have been found and characterized. However, it is known that field measurements are more challenging than laboratory-based measurements, and can lack resolution and statistical convergence. Such challenges have implications for the robustness of the analysis, which may be further adversely affected by the use of Taylor's hypothesis to convert time series to spatial data. We use large-eddy simulation (LES) to investigate VLSMs in atmospheric boundary-layer flows. In order to make sure that the largest flow structures are properly resolved, the horizontal domain size is chosen to be much larger than the standard domain size. It is shown that the contributions to the resolved turbulent kinetic energy and shear stress from VLSMs are significant. Therefore, the large computational domain adopted here is essential for the purpose of investigating VLSMs. The spatially coherent structures associated with VLSMs are characterized through flow visualization and statistical analysis. The instantaneous velocity fields in horizontal planes give evidence of streamwise-elongated flow structures of low-speed fluid with negative fluctuation of the streamwise velocity component, which are flanked on either side by similarly elongated high-speed structures. The pre-multiplied power spectra and two-point correlations indicate that the scales of these streak-like structures are very large. These features

  13. Onsite Greywater Treatment using Pilot Scale Technology

    Directory of Open Access Journals (Sweden)

    Manzoor-ul-Haq Rajput

    2015-04-01

Full Text Available The GROW technology for greywater treatment was installed at the MUET (Mehran University of Engineering & Technology) hostel and run under continuous-flow conditions with a hydraulic loading rate of 0.15 m d^-1. The monitoring and analysis of influent and effluent water were carried out during January-December 2010. Local plant species such as water hyacinth, Pennywort (duck weed), Mint and Cattail were used in the GROW rig in a mixed mode. Coarse gravel was used in the troughs as a medium. The collected samples were analyzed for BOD5 (Biochemical Oxygen Demand), COD (Chemical Oxygen Demand), TSS (Total Suspended Solids), pH, and DO (Dissolved Oxygen). Removal efficiencies of BOD5, COD and TSS were calculated as 83.0, 69.0 and 84.0% respectively. DO was found to increase from 0.6 to 3.5 mg dm^-3, while pH was observed between 6.5 and 7.8.

  14. FY results for the Los Alamos large scale demonstration and deployment project

    International Nuclear Information System (INIS)

    Stallings, E.; McFee, J.

    2000-01-01

The Los Alamos Large Scale Demonstration and Deployment Project (LSDDP), in support of the US Department of Energy (DOE) Deactivation and Decommissioning Focus Area (DDFA), is identifying and demonstrating technologies to reduce the cost and risk of management of transuranic element contaminated large metal objects, i.e. gloveboxes. DOE must dispose of hundreds of gloveboxes from Rocky Flats, Los Alamos and other DOE sites. Current practices for removal, decontamination and size reduction of large metal objects translate to a DOE system-wide cost in excess of $800 million, without disposal costs. In FY99 and FY00 the Los Alamos LSDDP performed several demonstrations of cost/risk-saving technologies. Commercial air pallets were demonstrated for movement and positioning of the oversized crates in neutron counting equipment. The air pallets are able to cost-effectively address the complete waste management inventory, whereas the baseline wheeled carts could address only 25% of the inventory, with higher manpower costs. A gamma interrogation radiography technology was demonstrated to support characterization of the crates. The technology was developed for radiography of trucks for identification of contraband. The radiographs were extremely useful in guiding the selection and method for opening very large crated metal objects. The cost of the radiography was small and the operating benefit is high. Another demonstration compared a Blade Cutting Plunger and a reciprocating saw for removal of glovebox legs and appurtenances. The cost comparison showed that the Blade Cutting Plunger costs were comparable, and a significant safety advantage was reported. A second radiography demonstration was conducted to evaluate a technology based on WIPP-type x-ray characterization of large boxes. This technology provides considerable detail of the contents of the crates. The technology identified details as small as the fasteners in the crates, an unpunctured aerosol can, and a vessel

  15. Large scale obscuration and related climate effects open literature bibliography

    International Nuclear Information System (INIS)

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980's. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest

  16. Large-scale biophysical evaluation of protein PEGylation effects

    DEFF Research Database (Denmark)

    Vernet, Erik; Popa, Gina; Pozdnyakova, Irina

    2016-01-01

PEGylation is the most widely used method to chemically modify protein biopharmaceuticals, but surprisingly limited public data is available on the biophysical effects of protein PEGylation. Here we report the first large-scale study, with site-specific mono-PEGylation of 15 different proteins and characterization of 61 entities in total using a common set of analytical methods. Predictions of molecular size were typically accurate in comparison with actual size determined by size-exclusion chromatography (SEC) or dynamic light scattering (DLS). In contrast, there was no universal trend regarding the effect...

  17. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob

    2013-01-01

...necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation was proposed in this paper. The first phase is the investment decision, while the second phase is the production optimization decision. A multi-objective PSO (MOPSO) algorithm was introduced to solve this optimization problem, which can accelerate the convergence and guarantee the diversity of the Pareto-optimal front set as well. The feasibility and effectiveness of the proposed bi-level planning approach and the MOPSO...

  18. Large scale obscuration and related climate effects open literature bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980's. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.

  19. Current status of large-scale cryogenic gravitational wave telescope

    International Nuclear Information System (INIS)

    Kuroda, K; Ohashi, M; Miyoki, S; Uchiyama, T; Ishitsuka, H; Yamamoto, K; Kasahara, K; Fujimoto, M-K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Nagano, S; Tsunesada, Y; Zhu, Zong-Hong; Shintomi, T; Yamamoto, A; Suzuki, T; Saito, Y; Haruyama, T; Sato, N; Higashi, Y; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Aso, Y; Ueda, K-I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Tagoshi, H; Nakamura, T; Sasaki, M; Tanaka, T; Oohara, K; Takahashi, H; Miyakawa, O; Tobar, M E

    2003-01-01

    The large-scale cryogenic gravitational wave telescope (LCGT) project is the proposed advancement of TAMA, which will be able to detect the coalescences of binary neutron stars occurring in our galaxy. LCGT intends to detect the coalescence events within about 240 Mpc, the rate of which is expected to be from 0.1 to several events in a year. LCGT has Fabry-Perot cavities of 3 km baseline and the mirrors are cooled down to a cryogenic temperature of 20 K. It is planned to be built in the underground of Kamioka mine. This paper overviews the revision of the design and the current status of the R and D

  20. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in its daily production. In the kitchen... ...not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results...

  1. Inflation in de Sitter spacetime and CMB large scale anomaly

    Science.gov (United States)

    Zhao, Dong; Li, Ming-Hua; Wang, Ping; Chang, Zhe

    2015-09-01

The influence of cosmological constant-type dark energy in the early universe is investigated. This is accommodated by a new dispersion relation in de Sitter spacetime. We perform a global fit to explore the cosmological parameter space by using the CosmoMC package with the recently released Planck TT and WMAP polarization datasets. Using the results from the global fit, we compute a new CMB temperature-temperature (TT) spectrum. The obtained TT spectrum has lower power compared with that based on the ΛCDM model at large scales. Supported by National Natural Science Foundation of China (11375203)

  2. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

...can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice...

  3. Learning a Large Scale of Ontology from Japanese Wikipedia

    Science.gov (United States)

    Tamagawa, Susumu; Sakurai, Shinya; Tejima, Takuya; Morita, Takeshi; Izumi, Noriaki; Yamaguchi, Takahira

Here we discuss how to learn a large-scale ontology from Japanese Wikipedia. The learned ontology includes the following properties: rdfs:subClassOf (IS-A relationship), rdf:type (class-instance relationship), owl:Object/DatatypeProperty (Infobox triple), rdfs:domain (property domain), and skos:altLabel (synonym). Experimental case studies show that the learned Japanese Wikipedia Ontology performs better than existing general linguistic ontologies, such as EDR and Japanese WordNet, in terms of building costs and the richness of structural information.

  4. Efficient Selection of Multiple Objects on a Large Scale

    DEFF Research Database (Denmark)

    Stenholt, Rasmus

    2012-01-01

The task of multiple object selection (MOS) in immersive virtual environments is important and still largely unexplored. The difficulty of efficient MOS increases with the number of objects to be selected. E.g. in small-scale MOS, only a few objects need to be simultaneously selected. This may... ...consuming. Instead, we have implemented and tested two of the existing approaches to 3-D MOS, a brush and a lasso, as well as a new technique, a magic wand, which automatically selects objects based on local proximity to other objects. In a formal user evaluation, we have studied how the performance
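    A minimal sketch of the proximity idea behind the magic wand technique as we read the abstract, not the published implementation: starting from a seed object, the selection floods outward to any object lying within a distance threshold of an already-selected one. Object positions, the seed and the radius below are hypothetical.

    ```python
    import numpy as np
    from collections import deque

    def magic_wand(positions, seed_idx, radius):
        """Select all objects reachable from the seed through chains of neighbours within `radius`."""
        selected = {seed_idx}
        queue = deque([seed_idx])
        while queue:
            i = queue.popleft()
            dists = np.linalg.norm(positions - positions[i], axis=1)
            for j in np.flatnonzero(dists < radius):
                if int(j) not in selected:
                    selected.add(int(j))
                    queue.append(int(j))
        return selected

    # hypothetical scene: one cluster near the origin, another far away
    pts = np.vstack([np.random.default_rng(2).normal(0.0, 0.2, (20, 3)),
                     np.random.default_rng(3).normal(5.0, 0.2, (20, 3))])
    print(sorted(magic_wand(pts, seed_idx=0, radius=0.8)))   # only the first cluster is selected
    ```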

  5. Infrastructure and interfaces for large-scale numerical software.

    Energy Technology Data Exchange (ETDEWEB)

    Freitag, L.; Gropp, W. D.; Hovland, P. D.; McInnes, L. C.; Smith, B. F.

    1999-06-10

    The complexity of large-scale scientific simulations often necessitates the combined use of multiple software packages developed by different groups in areas such as adaptive mesh manipulations, scalable algebraic solvers, and optimization. Historically, these packages have been combined by using custom code. This practice inhibits experimentation with and comparison of multiple tools that provide similar functionality through different implementations. The ALICE project, a collaborative effort among researchers at Argonne National Laboratory, is exploring the use of component-based software engineering to provide better interoperability among numerical toolkits. They discuss some initial experiences in developing an infrastructure and interfaces for high-performance numerical computing.

  6. A Modeling & Simulation Implementation Framework for Large-Scale Simulation

    Directory of Open Access Journals (Sweden)

    Song Xiao

    2012-10-01

Full Text Available Classical High Level Architecture (HLA) systems are facing development problems due to a lack of support for fine-grained component integration and interoperation in large-scale complex simulation applications. To provide efficient methods for addressing this issue, an extensible, reusable and composable simulation framework is proposed. To promote reusability from coarse-grained federates to fine-grained components, this paper proposes a modelling & simulation framework which consists of a component-based architecture, modelling methods, and simulation services to support and simplify the process of constructing complex simulation applications. Moreover, a standard process and simulation tools are developed to ensure the rapid and effective development of simulation applications.

  7. Large-Scale Experiments in a Sandy Aquifer in Denmark

    DEFF Research Database (Denmark)

    Jensen, Karsten Høgh; Bitsch, Karen Bue; Bjerg, Poul Løgstrup

    1993-01-01

A large-scale natural gradient dispersion experiment was carried out in a sandy aquifer in the western part of Denmark using tritium and chloride as tracers. For both plumes a marked spreading was observed in the longitudinal direction while the spreading in the transverse horizontal and transverse vertical directions was very small. The horizontal transport parameters of the advection-dispersion equation were investigated by applying an optimization model to observed breakthrough curves of tritium representing depth averaged concentrations. No clear trend in dispersion parameters with travel...
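    For reference, the depth-averaged transport parameters mentioned above are those of the standard two-dimensional advection-dispersion equation, written below in a generic textbook form (uniform velocity v, longitudinal and transverse dispersion coefficients D_L and D_T); this is not necessarily the exact model formulation used in the paper.

    ```latex
    \frac{\partial C}{\partial t}
    + v\,\frac{\partial C}{\partial x}
    = D_L\,\frac{\partial^{2} C}{\partial x^{2}}
    + D_T\,\frac{\partial^{2} C}{\partial y^{2}}
    ```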

  8. Large-scale sodium spray fire code validation (SOFICOV) test

    International Nuclear Information System (INIS)

    Jeppson, D.W.; Muhlestein, L.D.

    1985-01-01

A large-scale sodium spray fire code validation test was performed in the HEDL 850-m^3 Containment System Test Facility (CSTF) as part of the Sodium Spray Fire Code Validation (SOFICOV) program. Six hundred and fifty-eight kilograms of sodium was sprayed into an air atmosphere over a period of 2400 s. The sodium spray droplet sizes and spray pattern distribution were estimated. The containment atmosphere temperature and pressure response, containment wall temperature response and sodium reaction rate with oxygen were measured. These results are compared to post-test predictions using the SPRAY and NACOM computer codes.

  9. Reliability Calculation of Large-scale Complex Initiation Network

    Science.gov (United States)

    Li, Xinjian; Yang, Jun; Yan, Bingqiang; Zheng, Xiao

    2018-02-01

A method was proposed to calculate the reliability of the bundle-series compound initiation network widely used for large-scale demolition blasting in China. The network was defined as reliable only when all the 2nd-level Nonel detonator joints outside the blasting holes were initiated. Based on this definition, a series of equations was derived to calculate the reliability of the complex initiation network. A program was written in Matlab to solve the equations. The method showed good performance with much less computation than traditional approaches.
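    A small sketch of the series/parallel bookkeeping that such a calculation rests on, written in generic Python rather than the paper's Matlab program; the network layout and joint reliabilities below are hypothetical and much simpler than a real bundle-series compound initiation network.

    ```python
    import numpy as np

    def series(reliabilities):
        """Every element must fire: reliabilities multiply."""
        return float(np.prod(reliabilities))

    def parallel(reliabilities):
        """A bundle fires if at least one redundant path fires."""
        return 1.0 - float(np.prod([1.0 - r for r in reliabilities]))

    # hypothetical chain: each stage is a bundle of two redundant Nonel paths, ten stages in series
    stage = parallel([0.995, 0.995])
    network = series([stage] * 10)
    print(f"network reliability ≈ {network:.6f}")
    ```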

  10. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred

    2010-08-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia Geforce GTX 295 boards. The aim of this research is to enable large scale fluid dynamics simulations with up to one billion elements. We investigate communication protocols for the GPU cluster to compensate for the slow Gigabit Ethernet network between the GPU compute nodes and to maintain overall efficiency. A diesel engine intake-port and a nozzle, meshed in different resolutions, give good real world examples for the scalability tests on the GPU cluster. © 2010 IEEE.

  11. Rucio - The next generation large scale distributed system for ATLAS Data Management

    CERN Document Server

    Beermann, T; The ATLAS collaboration; Lassnig, M; Barisits, M; Vigne, R; Serfon, C; Stewart, G A; Goossens, L; Nairz, A; Molfetas, A

    2014-01-01

    Rucio is the next-generation Distributed Data Management (DDM) system benefiting from recent advances in cloud and "Big Data" computing to address the ATLAS experiment scaling requirements. Rucio is an evolution of the ATLAS DDM system Don Quijote 2 (DQ2), which has demonstrated very large scale data management capabilities with more than 150 petabytes spread worldwide across 130 sites, and accesses from 1,000 active users. However, DQ2 is reaching its limits in terms of scalability, requiring a large number of support staff to operate and being hard to extend with new technologies. Rucio will deal with these issues by relying on new technologies to ensure system scalability, address new user requirements and employ a new automation framework to reduce operational overheads.

  12. Recent developments in large-scale ozone generation with dielectric barrier discharges

    Science.gov (United States)

    Lopez, Jose L.

    2014-10-01

Large-scale ozone generation for industrial applications has been entirely based on the creation of microplasmas or microdischarges using dielectric barrier discharge (DBD) reactors. Although versions of DBD-generated ozone have been in continuous use for over a hundred years, especially in water treatment, recent changes in environmental awareness and sustainability have led to a surge of ozone generating facilities throughout the world. As a result of this enhanced global usage of this environmental cleaning application, various new discoveries have emerged in the science and technology of ozone generation. This presentation will describe some of the most recent breakthrough developments in large-scale ozone generation while further addressing some of the current scientific and engineering challenges of this technology.

  13. Large-Scale Sequencing: The Future of Genomic Sciences Colloquium

    Energy Technology Data Exchange (ETDEWEB)

    Margaret Riley; Merry Buckley

    2009-01-01

    Genetic sequencing and the various molecular techniques it has enabled have revolutionized the field of microbiology. Examining and comparing the genetic sequences borne by microbes - including bacteria, archaea, viruses, and microbial eukaryotes - provides researchers with insights into the processes microbes carry out, their pathogenic traits, and new ways to use microorganisms in medicine and manufacturing. Until recently, sequencing entire microbial genomes has been laborious and expensive, and the decision to sequence the genome of an organism was made on a case-by-case basis by individual researchers and funding agencies. Now, thanks to new technologies, the cost and effort of sequencing are within reach for even the smallest facilities, and sequencing the genomes of a significant fraction of microbial life may become possible. The availability of numerous microbial genomes will enable unprecedented insights into microbial evolution, function, and physiology. However, the current ad hoc approach to gathering sequence data has resulted in an unbalanced and highly biased sampling of microbial diversity. A well-coordinated, large-scale effort to target the breadth and depth of microbial diversity would have the greatest impact. The American Academy of Microbiology convened a colloquium to discuss the scientific benefits of engaging in a large-scale, taxonomically based sequencing project. A group of individuals with expertise in microbiology, genomics, informatics, ecology, and evolution deliberated on the issues inherent in such an effort and generated a set of specific recommendations for how best to proceed. The vast majority of microbes are presently uncultured and thus pose significant challenges to such a taxonomically based approach to sampling genome diversity. However, we have yet to even scratch the surface of the genomic diversity among cultured microbes. A coordinated sequencing effort of cultured organisms is an appropriate place to begin.

  14. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  15. Magnetic Properties of Large-Scale Nanostructured Graphene Systems

    DEFF Research Database (Denmark)

    Gregersen, Søren Schou

    The ongoing progress in two-dimensional (2D) materials and nanostructure fabrication motivates the study of altered and combined materials. Graphene, the most studied material of the 2D family, displays unique electronic and spintronic properties. Exceptionally high electron mobilities, which surpass those in conventional materials such as silicon, make graphene a very interesting material for high-speed electronics. Simultaneously, long spin-diffusion lengths and spin lifetimes make graphene an eligible spin-transport channel. In this thesis, we explore fundamental features of nanostructured graphene systems using large-scale modeling techniques. Graphene perforations, or antidots, have received substantial interest for their prospect of opening large band gaps in otherwise gapless graphene. Motivated by recent improvements in fabrication processes, such as forming graphene antidots and layer...

  16. A large-scale evaluation of computational protein function prediction.

    Science.gov (United States)

    Radivojac, Predrag; Clark, Wyatt T; Oron, Tal Ronnen; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kaßner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Boehm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas A; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-03-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based critical assessment of protein function annotation (CAFA) experiment. Fifty-four methods representing the state of the art for protein function prediction were evaluated on a target set of 866 proteins from 11 organisms. Two findings stand out: (i) today's best protein function prediction algorithms substantially outperform widely used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is considerable need for improvement of currently available tools.
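
    The abstract does not spell out the evaluation protocol; assessments of this kind often score predictions with a protein-centric maximum F-measure taken over prediction-score thresholds. A hedged sketch of such a computation on made-up data:

    ```python
    # Hedged sketch of a threshold-based F-measure evaluation for protein function
    # prediction (not the CAFA code). Predictions map each protein to function terms
    # with scores; true annotations are sets of terms. All data below are made up.

    import numpy as np

    predictions = {"P1": {"GO:1": 0.9, "GO:2": 0.4}, "P2": {"GO:1": 0.7, "GO:3": 0.2}}
    truth       = {"P1": {"GO:1"},                   "P2": {"GO:1", "GO:3"}}

    def f_max(predictions, truth, thresholds=np.linspace(0.01, 1.0, 100)):
        """Maximum harmonic mean of average precision and recall over thresholds."""
        best = 0.0
        for t in thresholds:
            precisions, recalls = [], []
            for prot, terms in truth.items():
                pred = {g for g, s in predictions.get(prot, {}).items() if s >= t}
                if pred:
                    precisions.append(len(pred & terms) / len(pred))
                recalls.append(len(pred & terms) / len(terms))
            if precisions:
                p, r = np.mean(precisions), np.mean(recalls)
                if p + r > 0:
                    best = max(best, 2 * p * r / (p + r))
        return best

    print(f"Fmax = {f_max(predictions, truth):.3f}")
    ```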

  17. Safeguarding of large scale reprocessing and MOX plants

    International Nuclear Information System (INIS)

    Howsley, R.; Burrows, B.; Longevialle, H. de; Kuroi, H.; Izumi, A.

    1997-01-01

    In May 1997, the IAEA Board of Governors approved the final measures of the "93+2" safeguards strengthening programme, thus improving the international non-proliferation regime by enhancing the effectiveness and efficiency of safeguards verification. These enhancements are not, however, a revolution in current practices, but rather an important step in the continuous evolution of the safeguards system. The principles embodied in 93+2, namely broader access to information and increased physical access, already apply, in a pragmatic way, to large-scale reprocessing and MOX fabrication plants. In these plants, qualitative measures and process monitoring play an important role, in addition to accountancy and material balance evaluations, in attaining the safeguards goals. This paper reflects on the safeguards approaches adopted for these large bulk-handling facilities and draws analogies, conclusions, and lessons for the forthcoming implementation of the 93+2 Programme. (author)

  18. Large-scale demonstration of waste solidification in saltstone

    International Nuclear Information System (INIS)

    McIntyre, P.F.; Oblath, S.B.; Wilhite, E.L.

    1988-05-01

    The saltstone lysimeters are a large-scale demonstration of a disposal concept for the decontaminated salt solution resulting from in-tank processing of defense waste. The lysimeter experiment has provided data on the leaching behavior of large saltstone monoliths under realistic field conditions. The results will also be used to compare the effect of capping the waste form on contaminant release. Biweekly monitoring of sump leachate from three lysimeters has continued on a routine basis for approximately 3 years. An uncapped lysimeter has shown the highest levels of nitrate and technetium-99 release. Gravel- and clay-capped lysimeters have shown levels equivalent to, or slightly higher than, background rainwater levels. Mathematical model predictions have been compared to lysimeter results. The models will be applied to predict the impact of saltstone disposal on groundwater quality. 9 refs., 5 figs., 3 tabs
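
    The abstract does not identify the mathematical models used; one simple model often compared against monolith leaching data is diffusion-controlled release, in which the cumulative fraction leached grows with the square root of time, F(t) = 2 (S/V) sqrt(De t / pi). A sketch with entirely hypothetical parameters:

    ```python
    # Illustrative sketch only: the abstract does not say which release model was used.
    # Diffusion-controlled release from a cementitious monolith is a common first
    # approximation; all parameter values below are made up.

    import numpy as np

    De  = 1e-13          # effective diffusivity, m^2/s (hypothetical)
    S_V = 8.0            # surface-area-to-volume ratio of the monolith, 1/m (hypothetical)

    t_years = np.linspace(0.1, 3.0, 30)
    t_sec   = t_years * 3.156e7
    frac_leached = S_V * 2.0 * np.sqrt(De * t_sec / np.pi)

    for ty, f in zip(t_years[::10], frac_leached[::10]):
        print(f"t = {ty:4.1f} yr   cumulative fraction leached = {f:.4f}")
    ```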

  19. Large-scale transport across narrow gaps in rod bundles

    Energy Technology Data Exchange (ETDEWEB)

    Guellouz, M.S.; Tavoularis, S. [Univ. of Ottawa (Canada)

    1995-09-01

    Flow visualization and hot-wire anemometry were used to investigate the velocity field in a rectangular channel containing a single cylindrical rod, which could be traversed on the centreplane to form gaps of different widths with the plane wall. The presence of large-scale, quasi-periodic structures in the vicinity of the gap has been demonstrated through flow visualization, spectral analysis, and space-time correlation measurements. These structures are seen to exist even for relatively large gaps, at least up to W/D = 1.350 (W is the sum of the rod diameter, D, and the gap width). The above measurements appear to be compatible with the field of a street of three-dimensional, counter-rotating vortices, whose detailed structure, however, remains to be determined. The convection speed and the streamwise spacing of these vortices have been determined as functions of the gap size.
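
    The abstract notes that convection speeds were obtained from space-time correlation measurements; a standard way to do this is to locate the peak of the cross-correlation between two probe signals separated by a known streamwise distance. An illustrative sketch on synthetic signals (not the authors' data or code):

    ```python
    # Sketch of estimating a convection speed from the space-time correlation of two
    # hot-wire traces a known distance apart. The "downstream" trace here is simply a
    # delayed copy of the "upstream" one; all numbers are hypothetical.

    import numpy as np

    fs, dx = 10_000.0, 0.05              # sampling rate [Hz], streamwise probe separation [m]
    u_c_true = 2.5                       # convection speed used to build the fake signals [m/s]
    n = 20_000
    d = int(round(dx / u_c_true * fs))   # delay in samples between the two probes

    rng = np.random.default_rng(0)
    base = rng.standard_normal(n + d)
    signal_up, signal_down = base[d:], base[:n]   # downstream trace lags the upstream one

    # Cross-correlate and locate the lag of the correlation peak.
    corr = np.correlate(signal_down - signal_down.mean(),
                        signal_up - signal_up.mean(), mode="full")
    lag_samples = np.argmax(corr) - (n - 1)
    print(f"estimated convection speed: {dx / (lag_samples / fs):.2f} m/s")
    ```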

  20. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Science.gov (United States)

    Xiao, Xiangyun; Zhang, Wei; Zou, Xiufen

    2015-01-01

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the increasing complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we propose a novel asynchronous parallel framework that improves accuracy and lowers the time complexity of large-scale GRN inference by combining a splitting technique with ordinary differential equation (ODE)-based optimization. The algorithm uses the sparsity and modularity of GRNs to split a whole large-scale GRN into many small-scale modular subnetworks. By optimizing all subnetworks in parallel with ODE-based methods and letting them communicate asynchronously, the parameters of the whole network are obtained. To test the performance of the proposed approach, we used well-known benchmark datasets from the Dialogue on Reverse Engineering Assessment and Methods (DREAM) challenge, the experimentally determined GRN of Escherichia coli, and one published dataset that contains more than 10,000 genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits clear superiority in inferring large-scale GRNs.
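
    The abstract outlines the approach only at a high level. The sketch below illustrates the divide-and-conquer idea with an invented modular decomposition and a simple linear dx/dt model standing in for the ODE-based optimization; it is not the authors' implementation, and the data are random.

    ```python
    # Hedged sketch: split a large network into modules, fit each module's dynamics
    # independently (here in parallel processes), then assemble the pieces.

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor

    rng = np.random.default_rng(1)
    n_genes, n_times = 120, 50
    expr = rng.random((n_times, n_genes))                 # fake expression time series
    dexpr = np.gradient(expr, axis=0)                     # finite-difference "derivatives"
    modules = np.array_split(np.arange(n_genes), 8)       # fake modular decomposition

    def fit_module(genes):
        """Least-squares fit of dx_i/dt ~ W x for every gene i in one module."""
        W, _, _, _ = np.linalg.lstsq(expr, dexpr[:, genes], rcond=None)
        return genes, W.T                                  # each row: regulators of one gene

    if __name__ == "__main__":
        weights = np.zeros((n_genes, n_genes))
        with ProcessPoolExecutor() as pool:
            for genes, W in pool.map(fit_module, modules):
                weights[genes, :] = W
        print("assembled interaction matrix:", weights.shape)
    ```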