WorldWideScience

Sample records for large scale project

  1. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province, who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890-foot-long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a powerhouse with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will employ a total of 2,400 workers. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion, of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  2. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
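
As a minimal sketch of the weighted multi-criteria evaluation such a siting tool performs (the criteria, weights, and cell values below are hypothetical illustrations, not NREL's actual data layers or algorithm):

```python
# Hypothetical criteria scores for four candidate cells, normalized to [0, 1]:
# (solar resource, grid proximity, low environmental sensitivity)
criteria = [
    (0.9, 0.6, 0.3),
    (0.8, 0.9, 0.7),
    (0.5, 0.4, 0.9),
    (0.7, 0.8, 0.8),
]
weights = (0.5, 0.3, 0.2)   # user-defined stakeholder weights, sum to 1

# Weighted-sum suitability score per candidate cell
scores = [sum(w * c for w, c in zip(weights, cell)) for cell in criteria]
best = max(range(len(scores)), key=scores.__getitem__)
print(scores, "best cell:", best)
```

Changing the weights re-ranks the cells, which is the sense in which such a tool is "user-driven": each stakeholder group can express its own priorities and see the resulting siting map.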

  3. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  4. Internationalization Measures in Large Scale Research Projects

    Science.gov (United States)

    Soeding, Emanuel; Smith, Nancy

    2017-04-01

    Large scale research projects (LSRP) often serve as flagships used by universities or research institutions to demonstrate their performance and capability to stakeholders and other interested parties. As global competition among universities for recruiting the brightest minds has increased, effective internationalization measures have become a hot topic for universities and LSRP alike. Nevertheless, most projects and universities have little experience in conducting these measures and making internationalization a cost-efficient and useful activity. Furthermore, such undertakings constantly have to be justified to the project PIs as important, valuable tools for improving the capacity of the project and the research location. There is a variety of measures suited to support universities in international recruitment, including institutional partnerships, research marketing, a welcome culture, support for science mobility, and an effective alumni strategy. These activities, although often conducted by different university entities, are interlocked and can be very powerful if interfaced effectively. On this poster we display a number of internationalization measures for various target groups and identify interfaces between project management, university administration, researchers, and international partners for working together, exchanging information, and improving processes, in order to recruit, support, and retain the brightest minds for a project.

  5. Evaluation model of project complexity for large-scale construction projects in Iran - A Fuzzy ANP approach

    Directory of Open Access Journals (Sweden)

    Aliyeh Kazemi

    2016-09-01

    Construction projects have always been complex, and as this complexity grows, implementing large-scale construction projects becomes harder. Hence, evaluating and understanding these complexities is critical: a correct evaluation of a project's complexity gives executives and managers a sound basis for decisions. Fuzzy analytic network process (ANP) is a logical and systematic approach to defining, evaluating, and ranking; it allows complex systems to be analyzed and their complexity to be determined. In this study, fuzzy ANP is used to determine and prioritize the indexes contributing to the complexity of large-scale construction projects in Iran. The results show that the socio-political, project system interdependency, and technological complexity indexes rank as the top three. Furthermore, in a comparison of three major projects (commercial-administrative, hospital, and skyscraper), the hospital project was evaluated as the most complex. This model is beneficial for professionals managing large-scale projects.
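
Full fuzzy ANP involves pairwise-comparison matrices and supermatrix limits; as a much-reduced sketch of the fuzzy scoring idea only, here is triangular-fuzzy-number aggregation with centroid defuzzification over complexity indexes (index names and judgment values are invented for illustration, not the paper's data):

```python
# Triangular fuzzy numbers (l, m, u) as two experts' judgments of how much
# each index contributes to project complexity; all values hypothetical.
indexes = {
    "socio-political":          [(0.6, 0.8, 1.0), (0.7, 0.9, 1.0)],
    "system interdependencies": [(0.5, 0.7, 0.9), (0.6, 0.8, 1.0)],
    "technological":            [(0.4, 0.6, 0.8), (0.5, 0.7, 0.9)],
    "environmental":            [(0.2, 0.4, 0.6), (0.3, 0.5, 0.7)],
}

def fuzzy_mean(tfns):
    """Average the experts' TFNs component-wise."""
    n = len(tfns)
    return tuple(sum(t[i] for t in tfns) / n for i in range(3))

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number: (l + m + u) / 3."""
    return sum(tfn) / 3

ranking = sorted(
    ((name, defuzzify(fuzzy_mean(js))) for name, js in indexes.items()),
    key=lambda kv: kv[1], reverse=True,
)
for name, score in ranking:
    print(f"{name}: {score:.3f}")
```

With these invented judgments the socio-political index comes out on top, mirroring the ordering the abstract reports; the real method additionally captures dependencies between indexes via the network structure of ANP.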

  6. Thermal power generation projects "Large Scale Solar Heating" (EU Thermie projects)

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large-Scale Solar Heating" programme for Europe-wide development of the technology. The demonstration programme developed from it was judged favourably by the reviewers but was not immediately (1996) accepted for funding. In November 1997 the EU Commission granted 1.5 million ECU, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved; it was applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves technology transfer. (orig.)

  7. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    This report presents assessments of selected large-scale NASA projects, ranging from probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the earth, to telescopes intended to explore the

  8. A Large Scale Problem Based Learning inter-European Student Satellite Construction Project

    DEFF Research Database (Denmark)

    Nielsen, Jens Frederik Dalsgaard; Alminde, Lars; Bisgaard, Morten

    2006-01-01

    This paper describes the pedagogical outcome of a large-scale PBL experiment. In January 2004 the ESA (European Space Agency) Education Office launched an ambitious project: let students from all over Europe build... The satellite was successfully launched on October 27th 2005 (http://www.express.space.aau.dk). It was a student-driven project with student project responsibility, adding a lot of international experience and project-management skills to the outcome of a more traditional one-semester, single-group... Electronic communication technology was vital within the project. Additionally, the SSETI EXPRESS project posed the following problems: it did not fit a standard semester (18 months for the satellite project compared to 5-6 months for a "normal" semester project); difficulties in integrating the tasks...

  9. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
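
The optimisation described here follows classical stratified-sampling logic and can be sketched with Cochran's Neyman-allocation formula; the strata shares, pilot standard deviations, and precision target below are invented for illustration, not the Norwegian pilot data:

```python
import math

# Hypothetical strata for an indoor dose-rate survey: housing-stock share W_h
# and pilot-estimated standard deviation S_h (nGy/h) per house category.
strata = {
    "wooden":   {"W": 0.5, "S": 20.0},
    "brick":    {"W": 0.3, "S": 35.0},
    "concrete": {"W": 0.2, "S": 50.0},
}

z = 1.96   # 95 % confidence
d = 2.0    # target half-width of the CI for the regional mean (nGy/h)

# Neyman allocation (ignoring finite-population correction):
# total n = (z/d)^2 * (sum_h W_h S_h)^2, with n_h proportional to W_h * S_h.
ws = sum(s["W"] * s["S"] for s in strata.values())
n_total = math.ceil((z / d) ** 2 * ws ** 2)
allocation = {name: math.ceil(n_total * s["W"] * s["S"] / ws)
              for name, s in strata.items()}
print(n_total, allocation)
```

The pilot project supplies the S_h estimates; tightening the precision target d quadratically inflates the required number of measurements, which is exactly the trade-off the survey design has to optimise.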

  10. Managing the risks of a large-scale infrastructure project : The case of Spoorzone Delft

    NARCIS (Netherlands)

    Priemus, H.

    2012-01-01

    Risk management in large-scale infrastructure projects is attracting the attention of academics and practitioners alike. After a brief summary of the theoretical background, this paper describes how risk analysis and risk management took shape in a current large-scale infrastructure project in

  11. What Will the Neighbors Think? Building Large-Scale Science Projects Around the World

    International Nuclear Information System (INIS)

    Jones, Craig; Mrotzek, Christian; Toge, Nobu; Sarno, Doug

    2007-01-01

    Public participation is an essential ingredient for turning the International Linear Collider into a reality. Wherever the proposed particle accelerator is sited in the world, its neighbors -- in any country -- will have something to say about hosting a 35-kilometer-long collider in their backyards. When it comes to building large-scale physics projects, almost every laboratory has a story to tell. Three case studies from Japan, Germany and the US will be presented to examine how community relations are handled in different parts of the world. How do particle physics laboratories interact with their local communities? How do neighbors react to building large-scale projects in each region? How can the lessons learned from past experiences help in building the next big project? These and other questions will be discussed to engage the audience in an active dialogue about how a large-scale project like the ILC can be a good neighbor.

  12. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  13. On a Game of Large-Scale Projects Competition

    Science.gov (United States)

    Nikonov, Oleg I.; Medvedeva, Marina A.

    2009-09-01

    The paper is devoted to game-theoretical control problems motivated by economic decision-making situations that arise in the realization of large-scale projects, such as designing and putting into operation new gas or oil pipelines. A non-cooperative two-player game is considered with payoff functions of a special type, for which standard existence theorems and algorithms for finding Nash equilibrium solutions are not applicable. The paper is based on and develops the results obtained in [1]-[5].
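
The paper's payoff class is precisely the non-standard case, but the underlying equilibrium concept can be illustrated on an ordinary finite game; below is a brute-force pure-strategy Nash search over a hypothetical "invest / wait" payoff pair for two competing pipeline projects (payoffs invented for illustration):

```python
# Payoff matrices for a two-player game: rows are player 1's strategies
# (invest, wait), columns are player 2's. Values are hypothetical.
A = [[2, 5],   # player 1's payoffs
     [4, 3]]
B = [[2, 4],   # player 2's payoffs
     [5, 3]]

def pure_nash(A, B):
    """Return all strategy pairs where neither player can gain by deviating."""
    eqs = []
    for i in range(len(A)):
        for j in range(len(A[0])):
            row_best = all(A[i][j] >= A[k][j] for k in range(len(A)))
            col_best = all(B[i][j] >= B[i][l] for l in range(len(A[0])))
            if row_best and col_best:
                eqs.append((i, j))
    return eqs

print(pure_nash(A, B))
```

Here the two equilibria are the anti-coordination outcomes (one project invests while the other waits), the typical structure when two large-scale projects compete for the same market.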

  14. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    Science.gov (United States)

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  16. Sustainability Risk Evaluation for Large-Scale Hydropower Projects with Hybrid Uncertainty

    Directory of Open Access Journals (Sweden)

    Weiyao Tang

    2018-01-01

    As large-scale hydropower projects are influenced by many factors, risk evaluation is complex. This paper treats a hydropower project as a complex system from the perspective of sustainability risk and divides it into three subsystems: the natural-environment subsystem, the eco-environment subsystem, and the socioeconomic subsystem. Risk-related factors and quantitative dimensions of each subsystem are comprehensively analyzed, with the uncertainty of some quantitative dimensions handled by hybrid uncertainty methods: fuzzy (e.g., national health degree, national happiness degree, protection of cultural heritage), random (e.g., groundwater levels, river width), and fuzzy-random (e.g., runoff volumes, precipitation). By calculating the sustainability-risk degree of each risk-related factor, a sustainability risk-evaluation model is built. Based on the calculation results, the critical sustainability risk factors are identified and targeted to reduce the losses they could cause the hydropower project. A case study of the under-construction Baihetan hydropower station demonstrates the viability of the risk-evaluation model and provides a reference for the sustainability risk evaluation of other large-scale hydropower projects.

  17. The Proposal of Scaling the Roles in Scrum of Scrums for Distributed Large Projects

    OpenAIRE

    Abeer M. AlMutairi; M. Rizwan Jameel Qureshi

    2015-01-01

    Scrum of scrums is an approach used to scale the traditional Scrum methodology to fit the development of complex and large projects. However, scaling the roles of Scrum members brings new challenges, especially in distributed and large software projects. This paper describes in detail the role of each Scrum member in a scrum of scrums and proposes a solution that uses a dedicated product owner for each team and introduces sub-backlogs. The main goal of the proposed solution i...

  18. Talking About The Smokes: a large-scale, community-based participatory research project.

    Science.gov (United States)

    Couzos, Sophia; Nicholson, Anna K; Hunt, Jennifer M; Davey, Maureen E; May, Josephine K; Bennet, Pele T; Westphal, Darren W; Thomas, David P

    2015-06-01

    To describe the Talking About The Smokes (TATS) project according to the World Health Organization guiding principles for conducting community-based participatory research (PR) involving indigenous peoples, to assist others planning large-scale PR projects. The TATS project was initiated in Australia in 2010 as part of the International Tobacco Control Policy Evaluation Project, and surveyed a representative sample of 2522 Aboriginal and Torres Strait Islander adults to assess the impact of tobacco control policies. The PR process of the TATS project, which aimed to build partnerships to create equitable conditions for knowledge production, was mapped and summarised onto a framework adapted from the WHO principles. Processes describing consultation and approval, partnerships and research agreements, communication, funding, ethics and consent, data and benefits of the research. The TATS project involved baseline and follow-up surveys conducted in 34 Aboriginal community-controlled health services and one Torres Strait community. Consistent with the WHO PR principles, the TATS project built on community priorities and strengths through strategic partnerships from project inception, and demonstrated the value of research agreements and trusting relationships to foster shared decision making, capacity building and a commitment to Indigenous data ownership. Community-based PR methodology, by definition, needs adaptation to local settings and priorities. The TATS project demonstrates that large-scale research can be participatory, with strong Indigenous community engagement and benefits.

  19. Data management strategies for multinational large-scale systems biology projects.

    Science.gov (United States)

    Wruck, Wasco; Peuker, Martin; Regenbrecht, Christian R A

    2014-01-01

    Good accessibility of publicly funded research data is essential to secure an open scientific system and is gradually becoming mandatory [Wellcome Trust will Penalise Scientists Who Don't Embrace Open Access. The Guardian 2012]. With the use of high-throughput methods in many research areas, from physics to systems biology, large data collections are increasingly important as raw material for research. Here, we present strategies worked out by international and national institutions targeting open access to publicly funded research data via incentives or obligations to share data. Funding organizations such as the British Wellcome Trust have therefore developed data-sharing policies and request a commitment to data management and sharing in grant applications. Increased citation rates are a strong argument for sharing publication data, and pre-publication sharing might be rewarded by a data-citation credit system via digital object identifiers (DOIs), which were initially introduced for data objects. Besides policies and incentives, good practice in data management is indispensable, yet appropriate data-management systems for large-scale projects, for example in systems biology, are hard to find. Here, we give an overview of a selection of open-source data management systems that have been employed successfully in large-scale projects.

  20. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4 % of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could run on 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the currently available electrolysis modules was compiled. A review of the large-scale electrolysis plants installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the hydrogen production cost by large-scale electrolysis was evaluated. (authors)
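
The sensitivity mentioned at the end (hydrogen cost versus electricity price) reduces to simple arithmetic; the sketch below uses assumed round numbers, not figures from the ALPHEA study:

```python
# Illustrative sensitivity of hydrogen production cost to electricity price
# for a large-scale electrolyser. All figures are rough assumptions.
SPECIFIC_ENERGY = 50.0   # kWh of electricity per kg H2, system level (assumed)
CAPEX_PER_KG = 0.8       # EUR/kg, amortized plant cost (assumed)
O_AND_M_PER_KG = 0.3     # EUR/kg, operation and maintenance (assumed)

def h2_cost(elec_price_eur_per_kwh):
    """Hydrogen production cost in EUR per kg."""
    return SPECIFIC_ENERGY * elec_price_eur_per_kwh + CAPEX_PER_KG + O_AND_M_PER_KG

for p in (0.02, 0.05, 0.08):
    print(f"{p:.2f} EUR/kWh -> {h2_cost(p):.2f} EUR/kg")
```

Because the electricity term dominates at this specific energy, the cost is nearly linear in the power price, which is why access to cheap 'clean power' is the decisive factor for large-scale electrolysis.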

  1. Large scale waste combustion projects. A study of financial structures and sensitivities

    International Nuclear Information System (INIS)

    Brandler, A.

    1993-01-01

    The principal objective of the study was to determine the key contractual and financial aspects of large scale energy-from-waste projects, and to provide the necessary background information on financing to appreciate the approach lenders take when they consider financing waste combustion projects. An integral part of the study has been the preparation of a detailed financial model, incorporating all major financing parameters, to assess the economic and financial viability of typical waste combustion projects. (author)
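
A toy version of the financial-model idea: discounted cash flows for a hypothetical energy-from-waste plant under varying gate fees. All cash-flow figures are invented for illustration; the study's actual model parameters are not reproduced here:

```python
# Sensitivity of project NPV to the waste gate fee for a hypothetical
# energy-from-waste plant. All figures are invented.
def npv(rate, cashflows):
    """Net present value; cashflows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

capex = -120.0                       # million GBP, year 0 (assumed)

def annual_cf(gate_fee):             # gate fee in GBP/tonne, 0.4 Mt/yr throughput
    revenue = gate_fee * 0.4 + 15.0  # waste revenue + electricity sales (assumed)
    return revenue - 18.0            # annual operating costs (assumed)

for fee in (40, 60, 80):
    flows = [capex] + [annual_cf(fee)] * 20   # 20-year concession (assumed)
    print(f"gate fee {fee}: NPV = {npv(0.08, flows):.1f} M GBP")
```

Lenders run exactly this kind of sweep over the key contractual parameters (gate fee, electricity price, availability) to see where the project stops servicing its debt.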

  2. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P [PA Energy, Malling (Denmark); Vedde, J [SiCon. Silicon and PV consulting, Birkeroed (Denmark)

    2011-04-15

    Large-scale PV (LPV) plants, i.e. plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large-scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The design data necessary for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers to LPV projects, but since no real LPV project has yet been processed, these findings must be regarded as preliminary. The fast-growing number of very large solar thermal plants for district-heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar elevation, PV installations on flat surfaces have to balance the requirements of physical space (and cost) against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark falls into three main categories: PV installations on the flat roofs of large commercial buildings, PV installations on other large-scale infrastructure such as noise barriers, and ground-mounted PV installations. The technical potential of all three categories is significant, in the range of 50-250 km2. In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10 % in converting the energy content of the light, compared to about 0.3 % for biomass. The theoretical ground area needed to produce Denmark's present annual electricity consumption of 33-35 TWh is about 300 km2. The Danish grid codes and electricity safety regulations say very little about PV and nothing about LPV plants; it is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and
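
The ~300 km2 land-area figure quoted in the abstract can be sanity-checked with one line of arithmetic; the irradiation value below is an assumed round number for Denmark, not taken from the report:

```python
# Ground area needed to supply Denmark's ~34 TWh/yr of electricity at the
# ~10 % overall light-to-electricity conversion stated in the abstract.
IRRADIATION = 1000.0   # kWh per m2 per year, global horizontal (assumed)
EFFICIENCY = 0.10      # overall conversion efficiency, per the abstract
DEMAND_TWH = 34.0      # annual electricity consumption, per the abstract

area_m2 = DEMAND_TWH * 1e9 / (IRRADIATION * EFFICIENCY)
print(f"{area_m2 / 1e6:.0f} km2")
```

With these round numbers the result is 340 km2, the same order of magnitude as the ~300 km2 quoted, which suggests the report's estimate assumes a slightly higher irradiation or efficiency.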

  3. Really Large Scale Computer Graphic Projection Using Lasers and Laser Substitutes

    Science.gov (United States)

    Rother, Paul

    1989-07-01

    This paper reflects on past laser projects that displayed vector-scanned computer graphic images on very large and irregular surfaces. Since the availability of microprocessors and high-powered visible lasers, very large scale computer graphics projection has become a reality. Because they need no focusing lens, lasers easily project onto distant and irregular surfaces and have been used for amusement parks, theatrical performances, concerts, industrial trade shows and dance clubs. Lasers have been used to project onto mountains, buildings, 360° globes, clouds of smoke and water. These methods have proven successful in installations at Epcot Theme Park in Florida, Stone Mountain Park in Georgia and the 1984 Olympics in Los Angeles, as well as hundreds of corporate trade shows and thousands of musical performances. With new ColorRay™ technology, the use of costly and fragile lasers is no longer necessary: using fiber-optic technology, the functionality of lasers can be duplicated for new and exciting projection possibilities. ColorRay™ technology has enjoyed worldwide recognition in conjunction with Pink Floyd's and George Michael's worldwide tours.

  4. Methods Dealing with Complexity in Selecting Joint Venture Contractors for Large-Scale Infrastructure Projects

    Directory of Open Access Journals (Sweden)

    Ru Liang

    2018-01-01

    The magnitude of business dynamics has increased rapidly with the growing complexity, uncertainty, and risk of large-scale infrastructure projects, making it increasingly difficult for a single contractor to "go it alone". As a consequence, joint-venture contractors with diverse strengths and weaknesses bid cooperatively. Understanding project complexity and deciding on the optimal joint-venture contractor is challenging. This paper studies how to select joint-venture contractors for large-scale infrastructure projects based on a multiattribute mathematical model. Two methods are developed to solve the problem: one based on ideal points and the other based on balanced ideal advantages. Both methods consider individual differences in expert judgment as well as contractor attributes. A case study of the Hong Kong-Zhuhai-Macao Bridge (HZMB) project in China demonstrates how to apply the two methods and shows their advantages.
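
The first ("ideal point") method can be sketched as a weighted distance-to-ideal ranking in the spirit of TOPSIS; candidate names, attribute scores, and weights below are hypothetical, not the HZMB case data:

```python
import math

# Normalized attribute scores per joint-venture candidate:
# (technical strength, financial capacity, experience). All values invented.
candidates = {
    "JV-A": [0.8, 0.6, 0.7],
    "JV-B": [0.6, 0.9, 0.5],
    "JV-C": [0.7, 0.7, 0.9],
}
weights = [0.4, 0.3, 0.3]

# The ideal point takes the best observed value on each attribute.
ideal = [max(v[i] for v in candidates.values()) for i in range(3)]

def dist_to_ideal(scores):
    """Weighted Euclidean distance to the ideal point (smaller is better)."""
    return math.sqrt(sum(w * (s - a) ** 2
                         for w, s, a in zip(weights, scores, ideal)))

ranked = sorted(candidates, key=lambda k: dist_to_ideal(candidates[k]))
print(ranked)
```

The paper's actual methods add expert-judgment aggregation on top of this; the sketch only shows the core ideal-point ranking step.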

  5. CSNI Project for Fracture Analyses of Large-Scale International Reference Experiments (Project FALSIRE)

    International Nuclear Information System (INIS)

    Bass, B.R.; Pugh, C.E.; Keeney-Walker, J.; Schulz, H.; Sievers, J.

    1993-06-01

    This report summarizes the recently completed Phase I of the Project for Fracture Analysis of Large-Scale International Reference Experiments (Project FALSIRE). Project FALSIRE was created by the Fracture Assessment Group (FAG) of Principal Working Group No. 3 (PWG/3) of the Organization for Economic Cooperation and Development (OECD)/Nuclear Energy Agency's (NEA's) Committee on the Safety of Nuclear Installations (CSNI). Motivation for the project was derived from recognition by the CSNI-PWG/3 that inconsistencies were being revealed in predictive capabilities of a variety of fracture assessment methods, especially in ductile fracture applications. As a consequence, the CSNI/FAG was formed to evaluate fracture prediction capabilities currently used in safety assessments of nuclear components. Members are from laboratories and research organizations in Western Europe, Japan, and the United States of America (USA). On behalf of the CSNI/FAG, the US Nuclear Regulatory Commission's (NRC's) Heavy-Section Steel Technology (HSST) Program at the Oak Ridge National Laboratory (ORNL) and the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), Koeln, Federal Republic of Germany (FRG) had responsibility for organization arrangements related to Project FALSIRE. The group is chaired by H. Schulz from GRS, Koeln, FRG.

  6. CSNI Project for Fracture Analyses of Large-Scale International Reference Experiments (Project FALSIRE)

    Energy Technology Data Exchange (ETDEWEB)

    Bass, B.R.; Pugh, C.E.; Keeney-Walker, J. [Oak Ridge National Lab., TN (United States); Schulz, H.; Sievers, J. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Koeln (Germany)

    1993-06-01

    This report summarizes the recently completed Phase I of the Project for Fracture Analysis of Large-Scale International Reference Experiments (Project FALSIRE). Project FALSIRE was created by the Fracture Assessment Group (FAG) of Principal Working Group No. 3 (PWG/3) of the Organization for Economic Cooperation and Development (OECD)/Nuclear Energy Agency's (NEA's) Committee on the Safety of Nuclear Installations (CSNI). Motivation for the project was derived from the recognition by the CSNI-PWG/3 that inconsistencies were being revealed in the predictive capabilities of a variety of fracture assessment methods, especially in ductile fracture applications. As a consequence, the CSNI/FAG was formed to evaluate the fracture prediction capabilities currently used in safety assessments of nuclear components. Members are from laboratories and research organizations in Western Europe, Japan, and the United States of America (USA). On behalf of the CSNI/FAG, the US Nuclear Regulatory Commission's (NRC's) Heavy-Section Steel Technology (HSST) Program at the Oak Ridge National Laboratory (ORNL) and the Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS), Koeln, Federal Republic of Germany (FRG) had responsibility for organizational arrangements related to Project FALSIRE. The group is chaired by H. Schulz from GRS, Koeln, FRG.

  7. Legal aspects of public participation in the planning/licensing of environmentally related large-scale projects

    International Nuclear Information System (INIS)

    Kurz, A.

    1991-01-01

    A variety of legal problems arise in the planning/licensing of environmentally related large-scale projects, associated with the control and evaluation of technical conditions and the ramifications in social and legal policy of the acceptance of, and resistance to, such projects. On the basis of a number of partial studies, e.g. of the licensing procedure of a nuclear power plant (Neckar-2 reactor), the author examines the legal aspects of public participation in the administrative procedures of licensing/plans approval. The dichotomy of law and technology is covered, and public participation in administrative procedures is derived legally from the basic constitutional rights and the principle of fair hearing. After an outline of specific administrative procedures, public participation as part of administrative procedures is placed in the broader legal framework of licensing/plans approval of environmentally related large-scale projects. The author concludes that public participation, within the framework of the basic decisions established by the legislature, is not a tool to be used in deciding basic political conflicts. Instead, public participation in the application of law serves to protect the rights of the individual by ensuring fair proceedings that pay attention to the subjective rights of the individual. As it is unable to decide political conflicts, it is also an unsuitable means of establishing a basic societal consensus, or of seeking acceptance of large-scale projects. (orig./HP) [de

  8. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built, and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the design stage. (orig.)

  9. Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS

    Science.gov (United States)

    Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.

    2015-12-01

    Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
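The abstract does not give the study's hydraulic formulation; as a hedged illustration of the underlying effect (why restoring perviousness with green infrastructure reduces runoff), the standard SCS curve-number method can be sketched. This is not necessarily the method used in the study, and the curve numbers and storm depth below are illustrative.

```python
def scs_runoff_depth(p_in, cn):
    """SCS Curve Number direct runoff depth (inches) for a storm of depth p_in."""
    s = 1000.0 / cn - 10.0   # potential maximum retention (inches)
    ia = 0.2 * s             # initial abstraction
    if p_in <= ia:
        return 0.0
    return (p_in - ia) ** 2 / (p_in + 0.8 * s)

def composite_cn(impervious_frac, cn_pervious=61.0, cn_impervious=98.0):
    """Area-weighted curve number for a partly impervious city block
    (illustrative component curve numbers)."""
    return impervious_frac * cn_impervious + (1 - impervious_frac) * cn_pervious

storm = 2.0  # inches of rainfall, illustrative design storm
before = scs_runoff_depth(storm, composite_cn(0.8))  # dense urban block
after = scs_runoff_depth(storm, composite_cn(0.5))   # after green infrastructure
print(before, after)  # the greener block generates less direct runoff
```

A city-scale tool like the one described would evaluate this kind of rainfall-runoff relation per grid cell or block and then route the resulting flows along GIS-derived pathways.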

  10. RISK MANAGEMENT IN A LARGE-SCALE NEW RAILWAY TRANSPORT SYSTEM PROJECT

    Directory of Open Access Journals (Sweden)

    Sunduck D. SUH, Ph.D., P.E.

    2000-01-01

    Full Text Available Risk management experiences of the Korean Seoul-Pusan high-speed railway (KTX) project since the planning stage are evaluated. One can clearly see the interplay of engineering and construction risks, financial risks and political risks in the development of the KTX project, which is characteristic of large-scale new railway system projects. A brief description of the evaluation methodology and an overview of the project are followed by detailed evaluations of key differences in risks between conventional and high-speed railway systems, social and political risks, engineering and construction risks, and financial risks. Risks involved in the system procurement process, such as proposal solicitation, evaluation, selection, and scope of solicitation, are separated out and evaluated in depth. Detailed events resulting from these issues are discussed along with their possible impact on system risk. Lessons learned and further possible refinements are also discussed.

  11. CSCW Challenges in Large-Scale Technical Projects - a case study

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1992-01-01

    This paper investigates CSCW aspects of large-scale technical projects based on a case study of a specific Danish engineering company and uncovers challenges to CSCW applications in this setting. The company is responsible for management and supervision of one of the world's largest tunnel… The initial qualitative analysis identified a number of bottlenecks in daily work where support for cooperation is needed. Examples of bottlenecks are: sharing materials, issuing tasks, and keeping track of task status. Grounded in the analysis, cooperative design workshops based on scenarios of future work…

  12. PubChemQC Project: A Large-Scale First-Principles Electronic Structure Database for Data-Driven Chemistry.

    Science.gov (United States)

    Nakata, Maho; Shimazaki, Tomomi

    2017-06-26

    Large-scale molecular databases play an essential role in the investigation of various subjects such as the development of organic materials, in silico drug design, and data-driven studies with machine learning. We have developed a large-scale quantum chemistry database based on first-principles methods. Our database currently contains the ground-state electronic structures of 3 million molecules based on density functional theory (DFT) at the B3LYP/6-31G* level, and we successively calculated 10 low-lying excited states of over 2 million molecules via time-dependent DFT with the B3LYP functional and the 6-31+G* basis set. To select the molecules calculated in our project, we referred to the PubChem Project, which served as the source of the molecular structures, encoded as short strings in the InChI and SMILES representations. Accordingly, we have named our quantum chemistry database project "PubChemQC" ( http://pubchemqc.riken.jp/ ) and placed it in the public domain. In this paper, we show the fundamental features of the PubChemQC database and discuss the techniques used to construct the data set for large-scale quantum chemistry calculations. We also present a machine learning approach to predict the electronic structure of molecules as an example to demonstrate the suitability of the large-scale quantum chemistry database.
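The abstract does not describe the machine learning model used; a minimal sketch of the general idea, fitting a closed-form ridge regression of a molecular property on numeric descriptors, is given below. The data here are synthetic stand-ins, not PubChemQC records, and real work would use descriptors and DFT targets drawn from the database.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in: each "molecule" is a vector of 4 numeric
# descriptors; the target mimics a property such as a HOMO-LUMO gap.
n, d = 500, 4
X = rng.normal(size=(n, d))
true_w = np.array([1.5, -0.7, 0.3, 0.0])  # assumed ground truth
y = X @ true_w + 0.1 * rng.normal(size=n)  # small observation noise

# Closed-form ridge regression: w = (X^T X + alpha I)^-1 X^T y
alpha = 1.0
w = np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

pred = X @ w
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(w.round(2), round(rmse, 3))
```

The appeal of a database at PubChemQC's scale is exactly that such models can be trained on millions of DFT-computed targets rather than toy samples.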

  13. Multimode Resource-Constrained Multiple Project Scheduling Problem under Fuzzy Random Environment and Its Application to a Large Scale Hydropower Construction Project

    Science.gov (United States)

    Xu, Jiuping

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method. PMID:24550708

  14. Multimode resource-constrained multiple project scheduling problem under fuzzy random environment and its application to a large scale hydropower construction project.

    Science.gov (United States)

    Xu, Jiuping; Feng, Cuiying

    2014-01-01

    This paper presents an extension of the multimode resource-constrained project scheduling problem for a large scale construction project where multiple parallel projects and a fuzzy random environment are considered. By taking into account the most typical goals in project management, a cost/weighted makespan/quality trade-off optimization model is constructed. To deal with the uncertainties, a hybrid crisp approach is used to transform the fuzzy random parameters into fuzzy variables that are subsequently defuzzified using an expected value operator with an optimistic-pessimistic index. Then a combinatorial-priority-based hybrid particle swarm optimization algorithm is developed to solve the proposed model, where the combinatorial particle swarm optimization and priority-based particle swarm optimization are designed to assign modes to activities and to schedule activities, respectively. Finally, the results and analysis of a practical example at a large scale hydropower construction project are presented to demonstrate the practicality and efficiency of the proposed model and optimization method.
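The abstract names an expected value operator with an optimistic-pessimistic index for defuzzifying the fuzzy parameters; one common formulation for a triangular fuzzy number is sketched below (the paper's exact operator may differ), with a hypothetical fuzzy activity duration.

```python
def expected_value_tfn(a, b, c, lam=0.5):
    """Expected value of a triangular fuzzy number (a, b, c), a <= b <= c,
    under an optimistic-pessimistic index lam in [0, 1]: lam = 0 is fully
    pessimistic (leans toward a), lam = 1 fully optimistic (toward c).
    One common formulation; not necessarily the paper's exact operator."""
    return (1 - lam) * (a + b) / 2.0 + lam * (b + c) / 2.0

# Hypothetical fuzzy activity duration in days:
# at least 10, most likely 12, at most 16.
dur = (10.0, 12.0, 16.0)
pessimistic = expected_value_tfn(*dur, lam=0.0)  # 11.0
neutral = expected_value_tfn(*dur, lam=0.5)      # 12.5
optimistic = expected_value_tfn(*dur, lam=1.0)   # 14.0
```

Once every fuzzy random duration and cost has been reduced to a crisp number this way, the remaining multimode scheduling problem can be handed to the particle swarm optimizer described in the abstract.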

  15. Legal aspects of public participation in the planning/licensing of environmentally related large-scale projects

    International Nuclear Information System (INIS)

    Kurz, A.

    1992-02-01

    A variety of legal problems arise in the planning/licensing of environmentally related large-scale projects, associated with the control and evaluation of technical conditions and the ramifications in social and legal policy of the acceptance of, and resistance to, such projects. On the basis of a number of partial studies, e.g. of the licensing procedure of a nuclear power plant (Neckar-2 reactor), the author examines the legal aspects of public participation in the administrative procedure of licensing/plans approval. The dichotomy of law and technology is covered, and public participation in administrative procedures is derived legally from the basic constitutional rights and the principle of fair hearing. After an outline of specific administrative procedures, public participation as part of administrative procedures is placed in the broader legal framework of licensing/plans approval of environmentally related large-scale projects. The author concludes that public participation, within the framework of the basic decisions established by the legislature, is not a tool to be used in deciding basic political conflicts. Instead, public participation in the application of law serves to protect the rights of the individual by ensuring fair proceedings that pay attention to the subjective rights of the individual. As it is unable to decide political conflicts, it is also an unsuitable means of establishing a basic societal consensus, or of seeking acceptance of large-scale projects. This is also reflected in studies of the legal functions of public participation, according to which the lawfulness of procedures is observed without, however, the legitimacy of the project being achieved. (orig./HP) [de

  16. Projection Effects of Large-scale Structures on Weak-lensing Peak Abundances

    Science.gov (United States)

    Yuan, Shuo; Liu, Xiangkun; Pan, Chuzhong; Wang, Qiao; Fan, Zuhui

    2018-04-01

    High peaks in weak lensing (WL) maps originate dominantly from the lensing effects of single massive halos. Their abundance is therefore closely related to the halo mass function and thus a powerful cosmological probe. However, besides individual massive halos, large-scale structures (LSS) along lines of sight also contribute to the peak signals. In this paper, with ray-tracing simulations, we investigate the LSS projection effects. We show that for current surveys with a large shape noise, the stochastic LSS effects are subdominant. For future WL surveys with source galaxies having a median redshift z med ∼ 1 or higher, however, they are significant. For the cosmological constraints derived from observed WL high-peak counts, severe biases can occur if the LSS effects are not taken into account properly. We extend the model of Fan et al. by incorporating the LSS projection effects into the theoretical considerations. By comparing with simulation results, we demonstrate the good performance of the improved model and its applicability in cosmological studies.

  17. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  18. PLANNING QUALITY ASSURANCE PROCESSES IN A LARGE SCALE GEOGRAPHICALLY SPREAD HYBRID SOFTWARE DEVELOPMENT PROJECT

    Directory of Open Access Journals (Sweden)

    Святослав Аркадійович МУРАВЕЦЬКИЙ

    2016-02-01

    Full Text Available Key aspects of operational activities in large-scale, geographically distributed software development projects are discussed, with a look at the required structure of quality assurance processes in such projects. Up-to-date methods for integrating quality assurance processes into software development processes are presented. Existing groups of software development methodologies are reviewed, namely sequential, agile, and PRINCE2-based approaches, with a condensed overview of the quality assurance processes in each group. Common challenges that sequential and agile models face in a large, geographically distributed hybrid software development project are reviewed, and recommendations are given for tackling those challenges. Conclusions are drawn about choosing the best methodology and applying it to a particular project.

  19. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    Science.gov (United States)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half a century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns on their negative impacts, such as high initial costs and damages to our ecosystems (e.g. river environment and species) and socio-economic fabric (e.g. relocation and socio-economic changes of affected people) have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates have inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. 
Then, it discusses some major challenges in future water planning

  20. The MIRAGE project: large scale radionuclide transport investigations and integral migration experiments

    International Nuclear Information System (INIS)

    Come, B.; Bidoglio, G.; Chapman, N.

    1986-01-01

    Predictions of radionuclide migration through the geosphere must be supported by large-scale, long-term investigations. Several research areas of the MIRAGE Project are devoted to acquiring reliable data for developing and validating models. Apart from man-made migration experiments in boreholes and/or underground galleries, attention is paid to natural geological migration systems which have been active for very long time spans. The potential role of microbial activity, either resident or introduced into the host media, is also considered. In order to clarify basic mechanisms, smaller-scale "integral" migration experiments under fully controlled laboratory conditions are also carried out using real waste forms and representative geological media. (author)

  1. Survey and analysis of selected jointly owned large-scale electric utility storage projects

    Energy Technology Data Exchange (ETDEWEB)

    1982-05-01

    The objective of this study was to examine and document the issues surrounding the curtailment in commercialization of large-scale electric storage projects. It was sensed that if these issues could be uncovered, then efforts might be directed toward clearing away these barriers and allowing these technologies to penetrate the market to their maximum potential. Joint ownership of these projects was seen as a possible solution to overcoming the major barriers, particularly economic barriers, of commercialization. Therefore, discussions with partners involved in four pumped storage projects took place to identify the difficulties and advantages of joint-ownership agreements. The four plants surveyed included Yards Creek (Public Service Electric and Gas and Jersey Central Power and Light); Seneca (Pennsylvania Electric and Cleveland Electric Illuminating Company); Ludington (Consumers Power and Detroit Edison); and Bath County (Virginia Electric Power Company and Allegheny Power System, Inc.). Also investigated were several pumped storage projects which were never completed. These included Blue Ridge (American Electric Power); Cornwall (Consolidated Edison); Davis (Allegheny Power System, Inc.); and Kittatinny Mountain (General Public Utilities). Institutional, regulatory, technical, environmental, economic, and special issues at each project were investigated, and the conclusions relative to each issue are presented. The major barriers preventing the growth of energy storage are the high cost of these systems in times of extremely high cost of capital, diminishing load growth, and regulatory influences which will not allow the building of large-scale storage systems due to environmental objections or other reasons. However, the future for energy storage looks viable despite difficult economic times for the utility industry. Joint ownership can ease some of the economic hardships for utilities which demonstrate a need for energy storage.

  2. Investigating and stimulating primary teachers’ attitudes towards science: Summary of a large-scale research project

    NARCIS (Netherlands)

    Walma van der Molen, Julie Henriëtte; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical

  3. HyLights: Preparation of the Large-Scale Demonstration Projects on Hydrogen for Transport in Europe

    International Nuclear Information System (INIS)

    Ulrich Bunger; Volker Blandow; Volker Jaensch; Harm Jeeninga; Cristina Morte Gomez

    2006-01-01

    The strategically important project HyLights has been launched by the European Commission in preparation of the large-scale demonstration projects for the transition to hydrogen as a fuel and a long-term renewable energy carrier. HyLights monitors concluded and ongoing demonstration projects and assists the planning of the next demonstration project phase, putting a clear focus on hydrogen in transport. HyLights is a coordination action that comprises 5 tasks to: 1) develop an assessment framework for concluded and ongoing demonstration projects, 2) analyse individual projects and establish a project database, 3) carry out a gap analysis and prepare a requirement profile for the next-stage projects, 4) assess and identify necessary financial and legal steps in preparation of the new projects, and 5) develop a European Initiative for the Growth of Hydrogen for Transport (EIGHT). (authors)

  4. Structural problems of public participation in large-scale projects with environmental impact

    International Nuclear Information System (INIS)

    Bechmann, G.

    1989-01-01

    Four items are discussed showing that the problems involved through participation of the public in large-scale projects with environmental impact cannot be solved satisfactorily without suitable modification of the existing legal framework. The problematic items are: the status of the electric utilities as a quasi public enterprise; informal preliminary negotiations; the penetration of scientific argumentation into administrative decisions; the procedural concept. The paper discusses the fundamental issue of the problem-adequate design of the procedure and develops suggestions for a cooperative participation design. (orig./HSCH) [de

  5. Report of preliminary investigations on the next-generation large-scale synchrotron radiation facility projects

    International Nuclear Information System (INIS)

    1990-01-01

    The Special Committee for Future Project of the Japanese Society for Synchrotron Radiation Research investigated the construction projects for large-scale synchrotron radiation facilities presently in progress in Japan. As a result, the following two projects are both considered highly valuable research projects that will drive the development of Japan's next-generation synchrotron radiation science: 1. the 8 GeV synchrotron radiation facility (SPring-8) to be constructed by the Japan Atomic Energy Research Institute and the Institute of Physical and Chemical Research under the sponsorship of the Science and Technology Agency at Harima Science Park City, Hyogo Prefecture, Japan; 2. the project to utilize the Tristan Main Ring (MR) of the National Laboratory for High Energy Physics as a radiation source. The two projects are unique in research theme and technological approach, and complement each other. It has therefore been concluded that both projects should be supported and ratified by the Society. (M.T.)

  6. The use of public participation and economic appraisal for public involvement in large-scale hydropower projects: Case study of the Nam Theun 2 Hydropower Project

    International Nuclear Information System (INIS)

    Mirumachi, Naho; Torriti, Jacopo

    2012-01-01

    Gaining public acceptance is one of the main issues with large-scale low-carbon projects such as hydropower development. The World Commission on Dams has recommended that, to gain public acceptance, public involvement is necessary in the decision-making process. As international financial institutions are financially significant actors in the planning and implementation of large-scale hydropower projects in developing country contexts, the paper examines the ways in which public involvement may be influenced by them. Using the case study of the Nam Theun 2 Hydropower Project in Laos, the paper analyses how public involvement facilitated by the Asian Development Bank had a bearing on procedural and distributional justice. The paper analyses the extent of public participation and the assessment of the full social and environmental costs of the project in the Cost-Benefit Analysis conducted during the project appraisal stage. It is argued that while efforts were made to involve the public, several factors influenced procedural and distributional justice: the late contribution of the Asian Development Bank in the project appraisal stage, and the issue of non-market values and the discount rate used to calculate the full social and environmental costs. - Highlights: ► Public acceptance in large-scale hydropower projects is examined. ► Both procedural and distributional justice are important for public acceptance. ► International Financial Institutions can influence the level of public involvement. ► Public involvement benefits consideration of non-market values and discount rates.

  7. Negotiating development narratives within large-scale oil palm projects on village lands in Sarawak, Malaysia

    DEFF Research Database (Denmark)

    Andersen, Astrid Oberborbeck; Bruun, Thilde Bech; Egay, Kelvin

    2016-01-01

    the narratives that suggest that large-scale land development projects ‘bring development to the people’, utilising ‘idle lands’ and ‘creating employment’ to lift them out of poverty, we argue that political and economic processes related to cultivation of oil palm intersect with local community differences...

  8. Investigating and Stimulating Primary Teachers' Attitudes Towards Science: Summary of a Large-Scale Research Project

    Science.gov (United States)

    Walma van der Molen, Juliette; van Aalderen-Smeets, Sandra

    2013-01-01

    Attention to the attitudes of primary teachers towards science is of fundamental importance to research on primary science education. The current article describes a large-scale research project that aims to overcome three main shortcomings in attitude research, i.e. lack of a strong theoretical concept of attitude, methodological flaws in…

  9. Transitioning a home telehealth project into a sustainable, large-scale service: a qualitative study.

    Science.gov (United States)

    Wade, Victoria A; Taylor, Alan D; Kidd, Michael R; Carati, Colin

    2016-05-16

    This study was a component of the Flinders Telehealth in the Home project, which tested adding home telehealth to existing rehabilitation, palliative care and geriatric outreach services. Due to the known difficulty of transitioning telehealth projects into routine services, a qualitative study was conducted to produce a preferred implementation approach for sustainable and large-scale operations, and a process model that offers practical advice for achieving this goal. Initially, semi-structured interviews were conducted with senior clinicians, health service managers and policy makers, and a thematic analysis of the interview transcripts was undertaken to identify the range of options for ongoing operations, plus the factors affecting sustainability. Subsequently, the interviewees and other decision makers attended a deliberative forum in which participants were asked to select a preferred model for future implementation. Finally, all data from the study were synthesised by the researchers to produce a process model. Nineteen interviews with senior clinicians, managers, and service development staff were conducted, finding strong support for home telehealth but a wide diversity of views on governance, models of clinical care, technical infrastructure operations, and data management. The deliberative forum worked through these options and recommended a collaborative consortium approach for large-scale implementation. The process model proposes that the key factor for large-scale implementation is leadership support, which is enabled by 1) showing solutions to the problems of service demand, budgetary pressure and the relationship between hospital and primary care, 2) demonstrating how home telehealth aligns with health service policies, and 3) achieving clinician acceptance through providing evidence of benefit and developing new models of clinical care. 
Two key actions to enable change were marketing telehealth to patients, clinicians and policy-makers, and building a community of

  10. Construction Claim Types and Causes for a Large-Scale Hydropower Project in Bhutan

    Directory of Open Access Journals (Sweden)

    Bonaventura H.W. Hadikusumo

    2015-01-01

    Full Text Available Hydropower construction projects are complex and uncertain, have long gestation periods and involve several parties. Furthermore, they require the integration of different components (civil, mechanical and electrical) to work together as a single unit. These projects require highly specialised designs, detailed plans and specifications, high-risk construction methods, effective management, skilful supervision and close coordination. Thus, claims are common in such projects. These claims are undesirable because they require significant time and resources to resolve and cause adversarial relationships among the parties involved. Therefore, it is in the common interest of all involved parties to prevent, minimise, or resolve claims as amicably as possible. Identifying common claim types and their causes is essential in devising techniques to minimise and avoid them in future projects. This report details a case study performed on a large-scale hydropower project in Bhutan. The findings of this case study indicate that differing site conditions are the major contributor to impact and change claims, and that 95% of total claims can be settled by negotiation, whereas 5% of claims can be settled by arbitration.

  11. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  12. Large Scale Analyses and Visualization of Adaptive Amino Acid Changes Projects.

    Science.gov (United States)

    Vázquez, Noé; Vieira, Cristina P; Amorim, Bárbara S R; Torres, André; López-Fernández, Hugo; Fdez-Riverola, Florentino; Sousa, José L R; Reboiro-Jato, Miguel; Vieira, Jorge

    2018-03-01

    When changes at few amino acid sites are the target of selection, adaptive amino acid changes in protein sequences can be identified using maximum-likelihood methods based on models of codon substitution (such as codeml). Although such methods have been employed numerous times using a variety of different organisms, the time needed to collect the data and prepare the input files means that tens or hundreds of coding regions are usually analyzed. Nevertheless, the recent availability of flexible and easy to use computer applications that collect relevant data (such as BDBM) and infer positively selected amino acid sites (such as ADOPS), means that the entire process is easier and quicker than before. However, the lack of a batch option in ADOPS, here reported, still precludes the analysis of hundreds or thousands of sequence files. Given the interest and possibility of running such large-scale projects, we have also developed a database where ADOPS projects can be stored. Therefore, this study also presents the B+ database, which is both a data repository and a convenient interface that looks at the information contained in ADOPS projects without the need to download and unzip the corresponding ADOPS project file. The ADOPS projects available at B+ can also be downloaded, unzipped, and opened using the ADOPS graphical interface. The availability of such a database ensures results repeatability, promotes data reuse with significant savings on the time needed for preparing datasets, and effortlessly allows further exploration of the data contained in ADOPS projects.

  13. Evaluating high risks in large-scale projects using an extended VIKOR method under a fuzzy environment

    Directory of Open Access Journals (Sweden)

    S. Ebrahimnejad

    2012-04-01

    Full Text Available The complexity of large-scale projects has led to numerous risks in their life cycle. This paper presents a new risk evaluation approach in order to rank the high risks in large-scale projects and improve the performance of these projects. It is based on fuzzy set theory, which is an effective tool for handling uncertainty, and on an extended VIKOR method, one of the well-known multiple criteria decision-making (MCDM) methods. The proposed decision-making approach integrates knowledge and experience acquired from professional experts, since they perform the risk identification and also the subjective judgments of the performance rating for high risks in terms of conflicting criteria, including probability, impact, quickness of reaction toward risk, event measure quantity and event capability criteria. The most notable difference between the proposed VIKOR method and its traditional version is the use of fuzzy decision-matrix data to calculate the ranking index without the need to ask the experts. Finally, the proposed approach is illustrated with a real-case study of an Iranian power plant project, and the associated results are compared with two well-known decision-making methods under a fuzzy environment.
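
    The VIKOR ranking index can be sketched as follows. This is a minimal crisp (non-fuzzy) version for illustration only; the ratings, weights and the assumption that larger values are better for every criterion are hypothetical, not taken from the paper's case study.

    ```python
    import numpy as np

    def vikor_rank(F, weights, v=0.5):
        """Rank alternatives with a crisp VIKOR method.

        F: (m alternatives x n criteria) matrix of performance ratings,
           where larger values are assumed better for every criterion.
        weights: criterion weights summing to 1.
        v: weight of the 'majority rule' strategy (typically 0.5).
        Returns Q, the compromise ranking index (smaller is better).
        """
        F = np.asarray(F, dtype=float)
        w = np.asarray(weights, dtype=float)
        f_best = F.max(axis=0)   # ideal value per criterion
        f_worst = F.min(axis=0)  # anti-ideal value per criterion
        d = (f_best - F) / (f_best - f_worst)  # normalised distance to ideal
        S = (w * d).sum(axis=1)  # group utility of each alternative
        R = (w * d).max(axis=1)  # individual regret of each alternative
        Q = v * (S - S.min()) / (S.max() - S.min()) \
            + (1 - v) * (R - R.min()) / (R.max() - R.min())
        return Q

    # Three alternatives rated on two criteria (illustrative scores).
    Q = vikor_rank([[7, 5], [9, 8], [4, 6]], weights=[0.6, 0.4])
    best = int(np.argmin(Q))  # alternative 1 dominates both criteria
    ```

    The fuzzy extension described in the paper replaces the crisp ratings with fuzzy numbers and defuzzifies before (or while) computing S, R and Q.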

  14. Pilot project wind power - Large scale wind power in northern Sweden; Pilotprojekt vindkraft - Storskalig vindkraft i norra Sverige

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2011-07-01

    On 2009-04-20 the Swedish Energy Agency granted Svevind AB financial aid to implement 'Pilot project wind power - Large scale wind power in northern Sweden'. The purpose of the aid is to implement pilot sub-projects in wind power in order to increase knowledge for larger establishments. The Energy Agency said in its decision that the projects Dragaliden and Gabriel Mountain are of 'great importance for future large-scale development of wind power in Sweden'. The special conditions prevailing in the project, forest environment and cold climate, give the possibility of studying the effects of wind turbines on birds, reindeer herding and hunting, as well as more technical aspects such as de-icing and obstacle lighting. The objectives of the project, in addition to the construction and operation of 32 wind turbines, have included evaluating the permit process, studying the social effects around the wind power, studying the impact on small game hunting, performing tests of the de-icing system, investigating impacts on reindeer herding and explaining the outcome of the project-generated rural funds. Some of the above sub-projects have been completed and are reported in this report. For the sub-projects still in progress, the report presents the results to date.

  15. Managing Large Scale Project Analysis Teams through a Web Accessible Database

    Science.gov (United States)

    O'Neil, Daniel A.

    2008-01-01

    Large scale space programs analyze thousands of requirements while mitigating safety, performance, schedule, and cost risks. These efforts involve a variety of roles with interdependent use cases and goals. For example, study managers and facilitators identify ground-rules and assumptions for a collection of studies required for a program or project milestone. Task leaders derive product requirements from the ground rules and assumptions and describe activities to produce needed analytical products. Disciplined specialists produce the specified products and load results into a file management system. Organizational and project managers provide the personnel and funds to conduct the tasks. Each role has responsibilities to establish information linkages and provide status reports to management. Projects conduct design and analysis cycles to refine designs to meet the requirements and implement risk mitigation plans. At the program level, integrated design and analysis cycles studies are conducted to eliminate every 'to-be-determined' and develop plans to mitigate every risk. At the agency level, strategic studies analyze different approaches to exploration architectures and campaigns. This paper describes a web-accessible database developed by NASA to coordinate and manage tasks at three organizational levels. Other topics in this paper cover integration technologies and techniques for process modeling and enterprise architectures.
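
    The roles and linkages described above suggest a simple relational layout: studies own tasks, tasks produce products, and status reports roll up across the levels. The following sketch uses SQLite with illustrative table, column and data names; the actual schema of the NASA database is not described in the record.

    ```python
    import sqlite3

    # Hypothetical, simplified schema: studies -> tasks -> products.
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
    CREATE TABLE study   (id INTEGER PRIMARY KEY, name TEXT, milestone TEXT);
    CREATE TABLE task    (id INTEGER PRIMARY KEY,
                          study_id INTEGER REFERENCES study(id),
                          leader TEXT, requirement TEXT,
                          status TEXT DEFAULT 'open');
    CREATE TABLE product (id INTEGER PRIMARY KEY,
                          task_id INTEGER REFERENCES task(id),
                          file_path TEXT, loaded_by TEXT);
    """)
    conn.execute("INSERT INTO study VALUES (1, 'Ascent loads', 'PDR')")
    conn.execute("INSERT INTO task (study_id, leader, requirement) "
                 "VALUES (1, 'Lee', 'Max-Q loads')")
    conn.execute("INSERT INTO product (task_id, file_path, loaded_by) "
                 "VALUES (1, '/vault/loads.xlsx', 'Kim')")

    # A status report for management: products delivered per study.
    row = conn.execute("""
        SELECT s.name, COUNT(p.id) FROM study s
        JOIN task t ON t.study_id = s.id
        JOIN product p ON p.task_id = t.id
        GROUP BY s.id
    """).fetchone()
    ```

    In a web-accessible deployment, each role (study manager, task leader, specialist) would read and write a different slice of these tables.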

  16. Falsire: CSNI project for fracture analyses of large-scale international reference experiments (Phase 1). Comparison report

    International Nuclear Information System (INIS)

    1994-01-01

    A summary of the recently completed Phase I of the Project for Fracture Analysis of Large-Scale International Reference Experiments (Project FALSIRE) is presented. Project FALSIRE was created by the Fracture Assessment Group (FAG) of Principal Working Group No. 3 (PWG/3) of the OECD/NEA Committee on the Safety of Nuclear Installations (CSNI), formed to evaluate fracture prediction capabilities currently used in safety assessments of nuclear vessel components. The aim of Project FALSIRE was to assess various fracture methodologies through interpretive analyses of selected large-scale fracture experiments. The six experiments used in Project FALSIRE (performed in the Federal Republic of Germany, Japan, the United Kingdom, and the U.S.A.) were designed to examine various aspects of crack growth in reactor pressure vessel (RPV) steels under pressurized-thermal-shock (PTS) loading conditions. The analysis techniques employed by the participants included engineering and finite-element methods, which were combined with J-R fracture methodology and the French local approach. For each experiment, analysis results provided estimates of variables such as crack growth, crack-mouth-opening displacement, temperature, stress, strain, and applied J and K values. A comparative assessment and discussion of the analysis results are presented; also, the current status of the entire results data base is summarized. Some conclusions concerning predictive capabilities of selected ductile fracture methodologies, as applied to RPVs subjected to PTS loading, are given, and recommendations for future development of fracture methodologies are made.

  17. Position Paper on Jatropha curcas. State of the Art Small and Large Scale Project Development

    Energy Technology Data Exchange (ETDEWEB)

    Daey Ouwens, K.; Franken, Y.J.; Rijssenbeek, W. [Fuels from Agriculture in Communal Technology FACT, Eindhoven (Netherlands); Francis, G. [University of Hohenheim, Hohenheim (Germany); Riedacker, A. [French National Institute for Agricultural Research INRA, Paris (France); Foidl, N.; Jongschaap, R.; Bindraban, P. [Plant Research International PRI, Wageningen (Netherlands)

    2007-06-15

    Much information was collected during the Seminar on Jatropha held in Wageningen, Netherlands, in March 2007, and is summarized in this paper. Much research is still necessary to improve yields, to allow the use of biological products such as oil cake as animal fodder, etc. Well-documented yield data are still scarce. Cooperation with research institutions is therefore recommended. At this stage it is particularly important to distinguish between reality, promises and dangerous extrapolations. To avoid spectacular and regretful failures, waste of money for investors and great disappointment among local populations, promoters of large-scale plantations are invited to adopt stepwise approaches: large-scale plantations should only be considered after some 4 to 5 years of obtaining experimental data (annual seed yield and oil yield, economic viability, etc.) from a sufficient number of small-scale experimental plots (about 1 ha) covering the whole range of soil and climatic conditions of such projects.

  18. Risk management in a large-scale CO2 geosequestration pilot project, Illinois, USA

    Science.gov (United States)

    Hnottavange-Telleen, K.; Chabora, E.; Finley, R.J.; Greenberg, S.E.; Marsteller, S.

    2011-01-01

    Like most large-scale infrastructure projects, carbon dioxide (CO2) geological sequestration (GS) projects have multiple success criteria and multiple stakeholders. In this context "risk evaluation" encompasses multiple scales. Yet a risk management program aims to maximize the chance of project success by assessing, monitoring and minimizing all risks in a consistent framework. The 150,000-km2 Illinois Basin underlies much of the state of Illinois, USA, and parts of adjacent Kentucky and Indiana. Its potential for CO2 storage is first-rate among basins in North America, an impression that has been strengthened by early testing of the injection well of the Midwest Geological Sequestration Consortium's (MGSC's) Phase III large-scale demonstration project, the Illinois Basin - Decatur Project (IBDP). The IBDP, funded by the U.S. Department of Energy's National Energy Technology Laboratory (NETL), represents a key trial of GS technologies and project-management techniques. Though risks are specific to each site and project, IBDP risk management methodologies provide valuable experience for future GS projects. IBDP views risk as the potential for negative impact to any of these five values: health and safety, environment, financial, advancing the viability and public acceptability of a GS industry, and research. Research goals include monitoring one million metric tonnes of injected CO2 in the subsurface. Risk management responds to the ways in which any of these values are at risk: for example, monitoring is designed to reduce uncertainties in parameter values that are important for research and system control, and is also designed to provide public assurance. Identified risks are the primary basis for risk-reduction measures: risks linked to uncertainty in geologic parameters guide further characterization work and guide simulations applied to performance evaluation. 
Formally, industry defines risk (more precisely, risk criticality) as the product L*S, the Likelihood of an event multiplied by its Severity.
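
    The L*S criticality scoring can be shown in a few lines; the risk names and the 1-5 likelihood and severity scores below are invented for illustration and are not taken from the IBDP risk register.

    ```python
    # Risk criticality = Likelihood x Severity, each scored on a 1-5 scale.
    # Entries are hypothetical examples, not actual IBDP risks.
    risks = {
        "well-seal degradation": (2, 5),      # (likelihood, severity)
        "injection-pump outage": (4, 2),
        "public-acceptance setback": (3, 3),
    }
    criticality = {name: L * S for name, (L, S) in risks.items()}

    # Risks with the highest criticality get monitoring and
    # mitigation priority first.
    ranked = sorted(criticality, key=criticality.get, reverse=True)
    ```

    A low-likelihood but high-severity risk can thus outrank a frequent but minor one, which is the point of scoring both dimensions.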

  19. Public participation in the licensing of large-scale projects with environmental impact

    International Nuclear Information System (INIS)

    1989-01-01

    As a rule, public participation in licensing and planning procedures for large-scale projects with environmental impact is as controversial as the projects themselves. Against this background, an interdisciplinary examination of several 'cases' of public participation in administrative procedures of varying legal form is being conducted in a joint research project run by the Department for Applied Systems Analysis (Karlsruhe Nuclear Research Centre) and the Research Institute for Public Administration at the College for Administrative Sciences in Speyer. The actual course of events and the practical experience of the groups involved (authorities, applicants, objectors, experts) are related to the applicable legal arrangements for public participation. The question to be answered is: which expected and unexpected functions do the procedures actually fulfill, and which do they not. Proposals for amending legal policy could then be developed on the basis of such a diagnosis. The report contains the results of the 'pilot study' on public participation in the licensing of the nuclear power plant GKN-2 in Neckarwestheim, and further contributions on the issue of 'public participation' presented within the framework of a research colloquium at the School for Nuclear Technology/Karlsruhe Nuclear Research Centre. (orig.) [de

  20. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of the VSC-based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  1. Adoption of innovative energy systems in social housing: Lessons from eight large-scale renovation projects in The Netherlands

    International Nuclear Information System (INIS)

    Hoppe, Thomas

    2012-01-01

    Thanks to new insights on the impacts that dwellings have throughout their life cycles, there has been increased attention to retrofitting innovative energy systems (IES) in existing housing. This paper uses an explorative case study design to gain more knowledge about the governance aspects of this under-researched topic. The central research question is: Which factors influence the adoption of innovative energy systems in social housing sites during renovation projects? To answer this question, eight large-scale renovation projects in The Netherlands were investigated. These case studies allowed the identification of barriers, enabling factors and perspectives from three main actors—housing associations, tenants and local authorities. It turns out that adopting IES encounters many barriers: lack of trust between project partners, delay in project progress, financial feasibility considerations, lack of support from tenants, lengthy legal permit procedures, over-ambitious project goals, poor experiences in previous projects, and IES ambitions that are not taken seriously by key decision-makers. Furthermore, IES were successfully fitted in only three of the eight projects. Moreover, ambitions were lowered as the projects progressed in all the cases investigated. The study calls for further systematic, in-depth comparison of fitting IES in large-scale renovation projects in social housing. - Highlights: ► Attention to adoption of innovative energy systems in social housing. ► Several non-technical factors influence adoption. ► In-depth analysis of eight local-level renovation projects. ► Ambitions are lowered as projects progress. ► Barriers: financial feasibility, over-ambitious goals, delay, lack of trust.

  2. Being affected by large-scale projects - considerations on the protection of rights in Atomic Energy Law

    International Nuclear Information System (INIS)

    Baumann, W.

    1982-01-01

    This is the final part of an article published in Bay. (VBL 1982, 257). The author comes to the following conclusions: proposals for reducing the protection of rights, framed against a certain background of interests and partly adopted in practice by the administrative courts, meet with a variety of dogmatic and legal-policy objections. This holds not only for the attempt to restrict the rights, under public and substantive law, of all those affected by construction and operation, but also for making it harder for complainants to pursue their rights, and especially for efforts intended to further limit the fact-finding of courts in the case of large-scale projects. The judiciary is called upon to review the individual licences issued for large-scale projects, especially because of the inherent symptomatic collisions of basic rights. The judge has a confidence-building advantage, namely his independence, which is gaining in importance in the struggle between hardened fronts and economic interests. (orig./HSCH) [de

  3. Understanding large social-ecological systems: introducing the SESMAD project

    Directory of Open Access Journals (Sweden)

    Michael Cox

    2014-08-01

    Full Text Available This article introduces the Social-ecological systems meta-analysis database (SESMAD) project, which is the project behind the case studies and synthetic articles contained in this special issue of the International Journal of the Commons. SESMAD is an internationally collaborative meta-analysis project that builds on previous seminal synthetic work on small-scale common-pool resource systems conducted at the Workshop in Political Theory and Policy Analysis at Indiana University. This project is guided by the following research question: can the variables found to be important in explaining outcomes in small-scale systems be scaled up to explain outcomes in large-scale environmental governance? In this special issue we report on our findings thus far through a set of case studies of large-scale environmental governance, a paper that describes our conceptual advances, and a paper that compares these five case studies to further examine our central research question.

  4. Organizational and technological genesis as a tool for strategic planning of large-scale real estate development projects

    Directory of Open Access Journals (Sweden)

    Gusakova Elena

    2018-01-01

    Full Text Available Conceptual planning and implementation of large-scale real estate development projects is one of the most difficult tasks in the organization of construction. In Russian practice, extensive experience with the development, complex reorganization and redevelopment of large areas has been accumulated. The methodological basis for solving such problems is organizational and technological genesis, which considers the development of the project over its full life cycle. An analysis of this experience points to the formation of new and effective approaches and methods within organizational and technological genesis. Among them, the most significant and universal approaches should be highlighted: the concept of real estate development, which explains the reasons and objective needs for project transformations during its life cycle, and serves to increase the adaptive capabilities of design decisions and the project's suitability for the most likely future changes; the development project as joint action, which is based on the balance of interests of the project participants; and master planning of the life cycle stages of the project and subprojects, based on a rethinking of the theory and methods of construction organization, which allows construction sites and related subprojects to be rationally localized while keeping the rest of the development area beyond the negative effects of construction, for comfortable living and work.

  5. Large-scale heat pumps in sustainable energy systems: System and project perspectives

    Directory of Open Access Journals (Sweden)

    Blarke Morten B.

    2007-01-01

    Full Text Available This paper shows that, in support of its ability to improve the overall economic cost-effectiveness and flexibility of the Danish energy system, the financially feasible integration of large-scale heat pumps (HP) with existing combined heat and power (CHP) plants is critically sensitive to the operational mode of the HP vis-à-vis the operational coefficient of performance, mainly given by the temperature level of the heat source. When using a ground source as the low-temperature heat source, heat production costs increase by about 10%, while partial use of condensed flue gases as the low-temperature heat source results in an 8% cost reduction. Furthermore, the analysis shows that when a large-scale HP is integrated with an existing CHP plant, the projected spot market situation in The Nordic Power Exchange (Nord Pool) towards 2025, which reflects a growing share of wind power and heat-supply-constrained power generation, further reduces the operational hours of the CHP unit over time, while increasing the operational hours of the HP unit. As a result, an HP unit at half the heat production capacity of the CHP unit, in combination with a heat-only boiler, represents a possibly financially feasible alternative to CHP operation, rather than a supplement to CHP unit operation. While such a revised operational strategy would have impacts on policies to promote co-generation, these results indicate that the integration of large-scale HP may jeopardize efforts to promote co-generation. Policy instruments should be designed to promote the integration of HP with lower than half of the heating capacity of the CHP unit. It is also found that CHP-HP plant designs should allow for the utilization of heat recovered from the CHP unit's flue gases for both concurrent (CHP unit and HP unit) and independent operation (HP unit only). For independent operation, the recovered heat is required to be stored.
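
    The dependence of heat production cost on the coefficient of performance (COP) can be illustrated with a one-line model; the electricity price and COP values below are hypothetical, chosen only to show why a warmer heat source (such as condensed flue gases) lowers the marginal heat cost relative to a colder ground source.

    ```python
    def heat_cost(el_price, cop):
        """Marginal cost of heat from an electric heat pump,
        in the same currency unit per MWh of heat.

        Illustrative only: real dispatch also involves taxes,
        grid tariffs and O&M costs.
        """
        return el_price / cop

    # A warmer source raises the COP and lowers the heat cost.
    cost_ground = heat_cost(el_price=50.0, cop=3.0)  # ground source
    cost_flue   = heat_cost(el_price=50.0, cop=4.5)  # flue-gas heat recovery
    ```

    Because cost scales with 1/COP, small COP gains from a warmer source translate directly into the percentage-level cost differences reported in the abstract.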

  6. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  7. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  8. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  9. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation with science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  10. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    Energy Technology Data Exchange (ETDEWEB)

    Raff, D. [U.S. Dept. of the Interior, Bureau of Reclamation, Denver, Colorado (United States)

    2008-07-01

    Large flood events, which influence regulatory guidelines as well as dam safety decisions, are likely to be affected by climate change. This talk will evaluate the use of climate projections downscaled and run through a rainfall-runoff model, and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows will be evaluated for changes in large events at look-ahead horizons of 2011-2040, 2041-2070, and 2071-2099. The sensitivity of results will be evaluated with respect to projection selection criteria and re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)
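
    A minimal sketch of the re-sampling and scaling idea, assuming the common scheme of re-sampling a historical daily pattern and scaling it so the days sum to the projected monthly total; the operational method used in the talk may differ in detail, and the data below are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def downscale_month(monthly_total, historical_daily):
        """Temporally downscale one projected monthly value to daily
        values: take a historical daily sequence as the sub-monthly
        pattern and scale it to match the projected monthly total."""
        pattern = np.asarray(historical_daily, dtype=float)
        return pattern * (monthly_total / pattern.sum())

    # Stand-in historical daily series for a 30-day month.
    hist = rng.gamma(shape=2.0, scale=3.0, size=30)
    daily = downscale_month(monthly_total=120.0, historical_daily=hist)
    ```

    The choice of which historical month to re-sample is itself part of the sensitivity analysis the abstract describes, since different patterns yield different peak daily (and hence 6-hour) flows for the same monthly total.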

  11. Projected large flood event sensitivity to projection selection and temporal downscaling methodology

    International Nuclear Information System (INIS)

    Raff, D.

    2008-01-01

    Large flood events, which influence regulatory guidelines as well as dam safety decisions, are likely to be affected by climate change. This talk will evaluate the use of climate projections downscaled and run through a rainfall-runoff model, and their influence on large flood events. The climate spatial downscaling is performed statistically, and a re-sampling and scaling methodology is used to temporally downscale from monthly to daily signals. The signals are run through a National Weather Service operational rainfall-runoff model to produce 6-hour flows. The flows will be evaluated for changes in large events at look-ahead horizons of 2011-2040, 2041-2070, and 2071-2099. The sensitivity of results will be evaluated with respect to projection selection criteria and re-sampling and scaling criteria for the Boise River in Idaho near Lucky Peak Dam. (author)

  12. Large scale afforestation projects mitigate degradation and increase the stability of the karst ecosystems in southwest China

    Science.gov (United States)

    Yue, Y.; Tong, X.; Wang, K.; Fensholt, R.; Brandt, M.

    2017-12-01

    With the aim of combating desertification and improving the ecological environment, mega-engineering afforestation projects were launched in the karst regions of southwest China around the turn of the new millennium. A positive impact of these projects on vegetation cover has been shown; however, it remains unclear whether conservation efforts have been able to effectively restore ecosystem properties and reduce the sensitivity of the karst ecosystem to climate variations at large scales. Here we use passive microwave and optical satellite time series data combined with the ecosystem model LPJ-GUESS and show a widespread increase in vegetation cover, with a clear demarcation at the Chinese national border contrasting with the conditions of neighboring countries. We apply a breakpoint detection to identify permanent changes in vegetation time series and assess the vegetation's sensitivity to climate before and after the breakpoints. A majority (74%) of the breakpoints were detected between 2001 and 2004 and are remarkably in line with the implementation and spatial extent of the Grain to Green project. We stratify the counties of the study area into four groups according to the extent of Grain to Green conservation areas and find distinct differences between the groups. Vegetation trends are similar prior to the afforestation activities (1982-2000), but clearly diverge at a later stage, following the spatial extent of conservation areas. Moreover, vegetation cover dynamics were increasingly decoupled from climatic influence in areas of high conservation effort. Whereas both vegetation resilience and resistance were considerably improved in areas with large conservation efforts, showing an increase in ecosystem stability, ongoing degradation and an amplified sensitivity to climate variability were found in areas with limited project implementation. Our study concludes that large scale conservation projects can regionally contribute to a greening Earth and are able to
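
    A minimal stand-in for the breakpoint detection step: locating a single mean shift in a series by least squares. Real analyses of satellite vegetation time series use more elaborate methods (trend plus seasonality, multiple breaks), and the series below is synthetic.

    ```python
    import numpy as np

    def single_breakpoint(y, min_seg=3):
        """Find the index that best splits a series into two
        constant-mean segments, by minimising total squared error."""
        y = np.asarray(y, dtype=float)
        best_k, best_sse = None, np.inf
        for k in range(min_seg, len(y) - min_seg + 1):
            left, right = y[:k], y[k:]
            sse = (((left - left.mean()) ** 2).sum()
                   + ((right - right.mean()) ** 2).sum())
            if sse < best_sse:
                best_k, best_sse = k, sse
        return best_k

    # A vegetation-cover-like series whose mean jumps upward at index 10,
    # mimicking greening after conservation measures take effect.
    series = [0.2] * 10 + [0.6] * 10
    k = single_breakpoint(series)
    ```

    Comparing the series' sensitivity to a climate variable before index k and after it is the same before/after contrast the abstract describes.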

  13. Large-scale building energy efficiency retrofit: Concept, model and control

    International Nuclear Information System (INIS)

    Wu, Zhou; Wang, Bo; Xia, Xiaohua

    2016-01-01

    BEER (Building energy efficiency retrofit) projects are initiated in many nations and regions over the world. Existing studies of BEER focus on modeling and planning based on one building and a one-year retrofitting period, which cannot be applied to large BEER projects with multiple buildings and multi-year retrofits. In this paper, the large-scale BEER problem is defined in a general TBT (time-building-technology) framework, which fits the essential requirements of real-world projects. The large-scale BEER is studied here using a control approach rather than the optimization approach commonly used before. Optimal control is proposed to design an optimal retrofitting strategy in terms of maximal energy savings and maximal NPV (net present value). The designed strategy changes dynamically across the dimensions of time, building and technology. The TBT framework and the optimal control approach are verified in a large BEER project, and the results indicate that promising energy and cost savings can be achieved in the general TBT framework. - Highlights: • Energy efficiency retrofit of many buildings is studied. • A TBT (time-building-technology) framework is proposed. • The control system of the large-scale BEER is modeled. • The optimal retrofitting strategy is obtained.
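
    The NPV objective can be illustrated with a textbook discounted-cash-flow sketch; the savings, horizon, discount rate and capital costs below are hypothetical, and the paper's actual optimal-control formulation over time, buildings and technologies is considerably richer.

    ```python
    def npv(annual_saving, years, rate, capex):
        """Net present value of one retrofit measure: discounted
        energy-cost savings minus up-front capital cost."""
        pv = sum(annual_saving / (1 + rate) ** t for t in range(1, years + 1))
        return pv - capex

    # Two candidate measures with identical savings but different capex;
    # only measures with positive NPV are worth scheduling.
    lighting = npv(annual_saving=12_000, years=10, rate=0.08, capex=50_000)
    glazing  = npv(annual_saving=12_000, years=10, rate=0.08, capex=90_000)
    ```

    In a TBT-style plan, such evaluations are made per technology and per building, with the timing of each retrofit as a decision variable.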

  14. Community response to large-scale federal projects: the case of the MX

    International Nuclear Information System (INIS)

    Albrecht, S.L.

    1983-01-01

    An analysis of community response to large-scale defense projects, such as the proposals to site MX missiles in Utah and Nevada, is one way to identify those factors likely to be important in determining community response to nuclear waste repository siting. This chapter gives a brief overview of the MX system's characteristics and the potential impacts it would have had on the rural areas, describes the patterns of community mobilization that occurred in Utah and Nevada, and suggests where this response may parallel community concerns about a repository siting. Three lessons from the MX experience are that local residents, asked to assume a disproportionate share of the negative impacts, should be involved in the siting process; that local residents should be treated as equals; and that compensation should be offered when local residents suffer from political expediency.

  15. Architectures of adaptive integration in large collaborative projects

    Directory of Open Access Journals (Sweden)

    Lois Wright Morton

    2015-12-01

    Collaborations to address complex societal problems associated with managing human-natural systems often require large teams comprised of scientists from multiple disciplines. For many such problems, large-scale, transdisciplinary projects whose members include scientists, stakeholders, and other professionals are necessary. The success of very large, transdisciplinary projects can be facilitated by attending to the diversity of types of collaboration that inevitably occur within them. As projects progress and evolve, the resulting dynamic collaborative heterogeneity within them constitutes architectures of adaptive integration (AAI). Management that acknowledges this dynamic and fosters and promotes awareness of it within a project can better facilitate the creativity and innovation required to address problems from a systems perspective. In successful large projects, AAI (1) functionally meets objectives and goals, (2) uses disciplinary expertise and concurrently bridges many disciplines, (3) has mechanisms to enable connection, (4) delineates boundaries to keep focus but retain flexibility, (5) continuously monitors and adapts, and (6) encourages project-wide awareness. These principles are illustrated using as case studies three large climate change and agriculture projects funded by the U.S. Department of Agriculture-National Institute of Food and Agriculture.

  16. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    The concept of "large scale" obviously depends on the phenomenon of interest. For example, in founding thermodynamics on microscopic dynamics, the relevant spatial and time scales are on the order of fractions of a millimetre and of microseconds, respectively, or less, and are defined relative to the spatial and time scales of the microscopic systems. In large-scale oceanography or global climate dynamics, the spatial scales of interest are on the order of thousands of kilometres and the time scales span many years, compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all cases, a Zwanzig projection approach is, at least in principle, an effective tool to obtain a class of universal smooth "large-scale" dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in the foundations of thermodynamics. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue applying the useful projection approach in these cases, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).
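
For reference, the standard Nakajima-Zwanzig projected evolution equation (the textbook Hamiltonian-case form, not the paper's generalized dissipative-Lie-algebra version) reads:

```latex
\frac{\partial}{\partial t}\,P\rho(t)
  = P\mathcal{L}P\rho(t)
  + \int_{0}^{t} P\mathcal{L}\,e^{sQ\mathcal{L}}\,Q\mathcal{L}\,P\rho(t-s)\,\mathrm{d}s
  + P\mathcal{L}\,e^{tQ\mathcal{L}}\,Q\rho(0),
```

where P projects onto the few slow variables of interest, Q = 1 - P, and L is the Liouville operator; the three terms are the Markovian drift, the memory kernel, and the dependence on the unresolved initial condition.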

  17. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach is the IBA Emscher Park in the Ruhr area in Germany. Over a 10-year period (1988-1998), more than 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines... for setting the design brief in a large-scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects...

  18. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT; Marzouk, Youssef [MIT

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
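
The "reduce then sample" idea can be sketched in a few lines: fit a cheap surrogate to the expensive forward model once, then run Metropolis sampling against the surrogate instead of the full model. Everything below (the toy forward map, noise level, polynomial surrogate) is an illustrative stand-in, not the SAGUARO project's actual reduced-order models:

```python
import numpy as np

rng = np.random.default_rng(1)

def forward_full(theta):
    """Stand-in for an expensive forward simulation (monotone toy map)."""
    return theta + 0.1 * theta ** 3

# Synthetic observation from a "true" parameter with Gaussian noise
theta_true, sigma = 1.0, 0.1
y_obs = forward_full(theta_true) + rng.normal(0, sigma)

# "Reduce": fit a cheap polynomial surrogate to the forward map, once
grid = np.linspace(-2, 3, 30)
surrogate = np.polynomial.Polynomial.fit(grid, forward_full(grid), deg=3)

def log_post(theta, model):
    # Gaussian likelihood, flat prior
    return -0.5 * ((y_obs - model(theta)) / sigma) ** 2

# "Then sample": random-walk Metropolis using only the surrogate
theta, samples = 0.0, []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.3)
    if np.log(rng.uniform()) < log_post(prop, surrogate) - log_post(theta, surrogate):
        theta = prop
    samples.append(theta)

print(np.mean(samples[1000:]))  # posterior mean estimate
```

The posterior mean recovered from the surrogate lands close to the true parameter, illustrating the abstract's point that sampling the reduced model can leave the computed posterior largely unaffected while every likelihood evaluation becomes cheap.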

  19. Efficient management vital to large, long-term engineering projects

    International Nuclear Information System (INIS)

    Wolfe, P.L.

    1989-01-01

    This article describes the ways in which firms manage large hazardous waste mitigation projects efficiently. Staffing concerns, control systems and reporting mechanisms critical to the effective and timely management of these large-scale programs are explored.

  20. Testing, development and demonstration of large scale solar district heating systems

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Perers, Bengt

    2015-01-01

    In 2013-2014 the project “Testing, development and demonstration of large scale solar district heating systems” was carried out within the Sino-Danish Renewable Energy Development Programme, the so-called RED programme jointly developed by the Chinese and Danish governments. In the project, Danish know-how on solar heating plants and solar heating test technology was transferred from Denmark to China, large solar heating systems were promoted in China, test capabilities on solar collectors and large-scale solar heating systems were improved in China, and Danish-Chinese cooperation...

  1. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects, such as the construction of roads, railways and other civil engineering (water) works, are tendered differently today than a decade ago. The traditional workflow requested quotes from construction companies for construction works where the works to be

  2. Overview of large scale experiments performed within the LBB project in the Czech Republic

    Energy Technology Data Exchange (ETDEWEB)

    Kadecka, P.; Lauerova, D. [Nuclear Research Institute, Rez (Czechoslovakia)

    1997-04-01

    Over several recent years, NRI Rez has been performing LBB analyses of safety-significant primary circuit piping of NPPs in the Czech and Slovak Republics. The analyses covered NPPs with WWER 440 Type 230 and 213 and WWER 1000 Type 320 reactors. Within the LBB projects undertaken to prove that the LBB requirements are fulfilled, a series of large-scale experiments was performed. The goal of these experiments was to verify the properties of the selected components and to prove the quality and/or conservatism of the assessments used in the LBB analyses. In this poster, a brief overview of the experiments performed in the Czech Republic under the guidance of NRI Rez is presented.

  3. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of 1000s of processors, to be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  4. Demonstrating a new framework for the comparison of environmental impacts from small- and large-scale hydropower and wind power projects.

    Science.gov (United States)

    Bakken, Tor Haakon; Aase, Anne Guri; Hagen, Dagmar; Sundt, Håkon; Barton, David N; Lujala, Päivi

    2014-07-01

    Climate change and the needed reductions in the use of fossil fuels call for the development of renewable energy sources. However, renewable energy production, such as hydropower (both small- and large-scale) and wind power, has adverse impacts on the local environment by causing reductions in biodiversity and loss of habitats and species. This paper compares the environmental impacts of many small-scale hydropower plants with a few large-scale hydropower projects and one wind power farm, based on the same set of environmental parameters: land occupation, reduction in wilderness areas (INON), visibility, and impacts on red-listed species. Our basis for comparison was similar energy volumes produced, without considering the quality of the energy services provided. The results show that small-scale hydropower performs less favourably on all parameters except land occupation. The land occupation of large hydropower and wind power is in the range of 45-50 m(2)/MWh, more than two times larger than that of small-scale hydropower; the large land occupation for large hydropower is explained by the extent of the reservoirs. On all three other parameters, small-scale hydropower performs more than two times worse than both large hydropower and wind power. Wind power compares similarly to large-scale hydropower regarding land occupation, much better on the reduction in INON areas, and in the same range regarding red-listed species. Our results demonstrate that the selected four parameters provide a basis for further development of a fair and consistent comparison of impacts between the analysed renewable technologies. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
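
As a minimal sketch of the per-energy normalization behind such comparisons, one can divide occupied area by annual generation. The project numbers below are hypothetical, chosen only so that the large-hydro and wind figures fall in the 45-50 m2/MWh range reported in the abstract:

```python
def land_per_mwh(area_m2, annual_mwh):
    """Land occupation normalized by annual energy production (m2/MWh)."""
    return area_m2 / annual_mwh

projects = {
    # name: (occupied area in m2, annual generation in MWh) -- hypothetical
    "large hydro (incl. reservoir)": (9_000_000, 200_000),
    "wind farm": (9_500_000, 200_000),
    "small hydro (sum of many plants)": (4_000_000, 200_000),
}

for name, (area, mwh) in projects.items():
    print(f"{name}: {land_per_mwh(area, mwh):.1f} m2/MWh")
```

Normalizing by energy produced rather than by installed capacity is what makes technologies with very different capacity factors comparable on the same axis.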

  5. Studies on combined model based on functional objectives of large scale complex engineering

    Science.gov (United States)

    Yuting, Wang; Jingchun, Feng; Jiabao, Sun

    2018-03-01

    Large-scale complex engineering includes various functions, and each function is realized through the completion of one or more projects, so the combined projects affecting each function should be identified. Based on the types of project portfolio, the relationship between projects and their functional objectives was analyzed. On that premise, portfolio-of-projects techniques based on functional objectives were introduced, and we then studied and formulated the principles of these techniques. In addition, the processes for combining projects were also constructed. With the help of portfolio techniques based on the functional objectives of projects, our research findings lay a good foundation for the portfolio management of large-scale complex engineering.

  6. Large-scale demonstration and deployment project at Los Alamos National Laboratory

    International Nuclear Information System (INIS)

    Brown, S.; McFee, J.; Broom, C.; Dugger, H.; Stallings, E.

    1999-01-01

    Established by the US Department of Energy (DOE) Environmental Management program through its Office of Science and Technology, the Deactivation and Decommissioning Focus Area is developing answers to the technological problems that hinder Environmental Management's extensive cleanup efforts. The optimized application of technologies to ongoing nuclear facility decontamination and dismantlement is critical in meeting the challenge of decommissioning approximately 9,000 buildings and structures within the DOE complex. The significant technical and economic concerns in this area underscore a national imperative for the qualification and timely delivery of cost-reduction technologies and management approaches to meet federal and private needs. At Los Alamos National Laboratory (LANL), a Large-Scale Demonstration and Deployment Project (LSDDP) has been established to facilitate demonstration and deployment of technologies for the characterization, decontamination, and volume reduction of oversized metallic waste, mostly in the form of gloveboxes contaminated with transuranic radionuclides. The LANL LSDDP is being managed by an integrated contractor team (ICT) consisting of IT Corporation, ICF Incorporated, and Florida International University, and includes representation from LANL's Environmental Management Program Office. The ICT published in the Commerce Business Daily a solicitation for interest in innovative technologies capable of improving the cost and performance of the baseline process. Each expression of interest was evaluated, and demonstration contract negotiations are under way for those technologies expected to be capable of meeting the project objectives. This paper discusses the management organization and approach, the results of the technology search, the technology selection methodology, the results of the selection process, and future plans for the program.

  7. Large-scale projects between regional planning and environmental protection

    International Nuclear Information System (INIS)

    Schmidt, G.

    1984-01-01

    The first part of the work discusses the current law of land-use planning, municipal and technical construction planning, and licensing under the atomic energy law and the federal law on immission protection. In the second part, some theses suggesting modifications are submitted. In land-use planning, substantial contributions to the protection of the environment can only be expected from programs and plans (aims). For the environmental conflicts likely to arise from large-scale projects (nuclear power plants, fossil-fuel power plants), this holds good above all for site selection plans. They bear on environmental protection in that they presuppose a thorough examination of the facts, help to recognize possible conflicts at an early date, and provide a frame for solving those problems. Municipal construction planning is guided by the following principles: environmental protection is an equivalent planning target; environmental data and facts, and their methodical processing, play a fundamental part as they constitute the basis of evaluation. Under section 5, number 2 of the federal law on immission protection - prevention of nuisances - operators are obliged to take preventive care against risks. That section is not concerned with planning or distribution. Neither does the licensing of nuclear plants have planning character. Insofar as the legal preconditions for licensing are fulfilled, the scope for rejecting an application under section 7, subsection 2 of the atomic energy law in view of site selection and the need for a plant hardly carries any practical weight. (orig./HP) [de

  8. Third generation participatory design in health informatics--making user participation applicable to large-scale information system projects.

    Science.gov (United States)

    Pilemalm, Sofie; Timpka, Toomas

    2008-04-01

    Participatory Design (PD) methods in the field of health informatics have mainly been applied to the development of small-scale systems with homogeneous user groups in local settings. Meanwhile, health service organizations are becoming increasingly large and complex in character, making it necessary to extend the scope of the systems that are used for managing data, information and knowledge. This study reports participatory action research on the development of a PD framework for large-scale system design. The research was conducted in a public health informatics project aimed at developing a system for 175,000 users. A renewed PD framework was developed in response to six major limitations identified in the existing methods. The resulting framework preserves the theoretical grounding, but extends the toolbox to suit applications in networked health service organizations. Future research should involve evaluations of the framework in other health service settings where comprehensive HISs are developed.

  9. A review of large-scale solar heating systems in Europe

    International Nuclear Information System (INIS)

    Fisch, M.N.; Guigas, M.; Dalenback, J.O.

    1998-01-01

    Large-scale solar applications benefit from the effect of scale. Compared to small solar domestic hot water (DHW) systems for single-family houses, the solar heat cost can be cut by at least a third. The most interesting projects for replacing fossil fuels and reducing CO2 emissions are solar systems with seasonal storage in combination with gas or biomass boilers. In the framework of the EU-APAS project Large-scale Solar Heating Systems, thirteen existing plants in six European countries have been evaluated. The yearly solar gains of the systems are between 300 and 550 kWh per m2 of collector area. The investment cost of solar plants with short-term storage varies from 300 up to 600 ECU per m2. Systems with seasonal storage show investment costs twice as high. Results of studies concerning the market potential for solar heating plants, taking new collector concepts and industrial production into account, are presented. Site-specific studies and predesign of large-scale solar heating plants for housing developments in six European countries show a 50% cost reduction compared to existing projects. The cost-benefit ratio for the planned systems with long-term storage is between 0.7 and 1.5 ECU per kWh per year. (author)

  10. Legal control of technical large-scale projects

    International Nuclear Information System (INIS)

    Kuhnt, D.

    1981-01-01

    The principle derived from experience that large projects require approval by the courts may no longer be valid. On the contrary, the courts are only entitled to genuine legal control according to the principle of the division of powers. If imprecisely defined legal terms cannot be dispensed with, the administration has to set the frame for review by the courts through technical standards laid down in statutory ordinances, administrative provisions and administrative instructions. The average duration of administrative proceedings has to be shortened considerably. The plaintiff as well as the beneficiaries of the act of licensing have a right to a prompt decision. The immediate execution of a decision can, in principle, also not be waived in future. More than up to now, the careful weighing of interests, and not an anticipated judgement on the main issue, has to be the subject of legal examination according to section 80, subsection 5 of the German code of administrative procedure (Verwaltungsgerichtsordnung). (orig./HP) [de

  11. Analysis of environmental impact assessment for large-scale X-ray medical equipments

    International Nuclear Information System (INIS)

    Fu Jin; Pei Chengkai

    2011-01-01

    Based on an Environmental Impact Assessment (EIA) project, this paper elaborates the basic analysis essentials of EIA for the sales project of large-scale X-ray medical equipment, and provides the procedure for analyzing environmental impact and the dose estimation method under normal and accident conditions. The key points of EIA for the sales project of large-scale X-ray medical equipment include determining the pollution factors and management limit values according to the project's actual situation, and using various methods of assessment and prediction, such as analogy, actual measurement and calculation, to analyze, monitor, calculate and predict the pollution under normal and accident conditions. (authors)

  12. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

    Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars in the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today's challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today's and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for system development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  13. Output Control Technologies for a Large-scale PV System Considering Impacts on a Power Grid

    Science.gov (United States)

    Kuwayama, Akira

    The mega-solar demonstration project named “Verification of Grid Stabilization with Large-scale PV Power Generation systems” was completed in March 2011 at Wakkanai, the northernmost city of Japan. The major objectives of this project were to evaluate the adverse impacts of large-scale PV power generation systems connected to the power grid and to develop output control technologies with an integrated battery storage system. This paper describes the outline and results of this project. The results show the effectiveness of the battery storage system and of the proposed output control methods for a large-scale PV system in ensuring stable operation of power grids. NEDO, the New Energy and Industrial Technology Development Organization of Japan, conducted this project, and HEPCO, Hokkaido Electric Power Co., Inc., managed the overall project.

  14. The SCALE-UP Project

    Science.gov (United States)

    Beichner, Robert

    2015-03-01

    The Student Centered Active Learning Environment with Upside-down Pedagogies (SCALE-UP) project was developed nearly 20 years ago as an economical way to provide collaborative, interactive instruction even for large-enrollment classes. Nearly all research-based pedagogies have been designed with fairly high faculty-student ratios. The economics of introductory courses at large universities often precludes that situation, so SCALE-UP was created as a way to facilitate highly collaborative active learning with large numbers of students served by only a few faculty and assistants. It enables those students to learn and succeed not only in acquiring content, but also in practicing important 21st century skills like problem solving, communication, and teamwork. The approach was initially targeted at undergraduate science and engineering students taking introductory physics courses in large-enrollment sections. It has since expanded to multiple content areas, including chemistry, math, engineering, biology, business, nursing, and even the humanities. Class sizes range from 24 to over 600. Data collected from multiple sites around the world indicate highly successful implementation at more than 250 institutions. NSF support was critical for initial development and dissemination efforts. Generously supported by NSF (9752313, 9981107) and FIPSE (P116B971905, P116B000659).

  15. Large scale nuclear structure studies

    International Nuclear Information System (INIS)

    Faessler, A.

    1985-01-01

    Results of large-scale nuclear structure studies are reported. The starting point is the Hartree-Fock-Bogoliubov solution with angular momentum and proton and neutron number projection after variation. This model for number- and spin-projected two-quasiparticle excitations with realistic forces yields results in sd-shell nuclei as good as the 'exact' shell-model calculations. Here the authors present results for the pf-shell nucleus 46Ti and for the A=130 mass region, where they studied 58 different nuclei with the same single-particle energies and the same effective force derived from a meson-exchange potential. They carried out a Hartree-Fock-Bogoliubov variation after mean-field projection in realistic model spaces. In this way, they determine for each yrast state the optimal mean Hartree-Fock-Bogoliubov field. They apply this method to 130Ce and 128Ba using the same effective nucleon-nucleon interaction. (Auth.)

  16. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  17. Participation of the public in the licensing of large-scale projects with environmental impact - theoretical framework and project conceptual outline

    International Nuclear Information System (INIS)

    Bechmann, G.; Gloede, F.

    1989-01-01

    The paper discusses the dimensions and environmental impacts of large-scale projects, and public participation as an element of institutionalized problem management. The legal protection function and the administrative and socio-political functions are subsumed under a concept of function which defines a function as a synthesis of a variety of possibilities, so that different 'achievements' are expected from public participation depending on the reference system chosen for comparison. The paper explains the purpose of the examination of public participation and its functions; the design of the analytical procedure, which covers the three main stages of 'input, legal procedure, output'; and instruments and aspects such as interdisciplinarity, representativity, comparison, and the relation between empirical research and the principles of the law. (HSCH) [de

  18. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  19. Academic Training Lecture Regular Programme: How Large-Scale Civil Engineering Projects Realise the Potential of a City

    CERN Multimedia

    2012-01-01

    How Large-Scale Civil Engineering Projects Realise the Potential of a City (1/3), by Bill Hanway (Executive Director of Operations, AECOM Europe).   Wednesday, June 6, 2012 from 11:00 to 12:00 (Europe/Zurich) at CERN ( 80-1-001 - Globe 1st Floor ) In this series of three special lectures, leading experts from AECOM explore the impact of a trio of major projects on a single city. In common with every metropolis, London has run-down districts and infrastructure in need of upgrading. The lectures cover three of the biggest challenges: regenerating run-down areas; reducing congestion and transporting people more efficiently; and improving water and wastewater systems. Each project contributes to a collective public aim - to realise the potential of a growing city, and ensure its healthy, sustainable and competitive future. Lecture 1: Intro to the lecture series, and The London 2012 Olympic Games. Most cities share a group of common complex challenges – growing populations, agei...

  20. The (in)effectiveness of Global Land Policies on Large-Scale Land Acquisition

    NARCIS (Netherlands)

    Verhoog, S.M.

    2014-01-01

    Due to current crises, large-scale land acquisition (LSLA) is becoming a topic of growing concern. Public data from the ‘Land Matrix Global Observatory’ project (Land Matrix 2014a) demonstrates that since 2000, 1,664 large-scale land transactions in low- and middle-income countries were reported,

  1. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  2. Large-scale seismic test for soil-structure interaction research in Hualien, Taiwan

    International Nuclear Information System (INIS)

    Ueshima, T.; Kokusho, T.; Okamoto, T.

    1995-01-01

    It is important to evaluate dynamic soil-structure interaction more accurately in the aseismic design of important facilities such as nuclear power plants. A large-scale model structure, about one quarter the size of a commercial nuclear power plant, was constructed on gravelly layers in seismically active Hualien, Taiwan. This international joint project is called 'the Hualien LSST Project', where 'LSST' is short for Large-Scale Seismic Test. In this paper, research tasks and responsibilities, the progress of the construction work and research tasks along the timeline, and the main results obtained so far are described. (J.P.N.)

  3. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main components. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
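
    The fault-detection step described above — flagging when a subsystem deviates from its expected behaviour — can be sketched with a simple rolling baseline. This is an illustrative sketch only; the window size, threshold, and metric are invented assumptions, not part of the actual PHM framework.

```python
from collections import deque

def make_fault_detector(window=20, sigma_factor=3.0):
    """Flag a performance fault when a sample deviates more than
    sigma_factor standard deviations from a rolling baseline.
    (Illustrative; not the PHM project's actual diagnostic.)"""
    history = deque(maxlen=window)

    def check(sample):
        if len(history) < window:
            history.append(sample)
            return False  # still building the baseline
        mean = sum(history) / len(history)
        var = sum((x - mean) ** 2 for x in history) / len(history)
        std = var ** 0.5
        is_fault = std > 0 and abs(sample - mean) > sigma_factor * std
        if not is_fault:
            history.append(sample)  # only healthy samples extend the baseline
        return is_fault

    return check

# Hypothetical I/O-throughput samples from one subsystem.
detect = make_fault_detector(window=5)
for s in [100, 101, 99, 100, 102]:
    detect(s)                 # warm-up: builds the baseline
print(detect(100))  # small deviation -> False
print(detect(40))   # degraded subsystem -> True
```

    A real monitor would also have to separate genuine anomalies from contention with co-scheduled jobs, which is exactly the distinction the PHM framework draws.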

  4. Quantitative analysis on the environmental impact of large-scale water transfer project on water resource area in a changing environment

    Directory of Open Access Journals (Sweden)

    D. H. Yan

    2012-08-01

    The interbasin long-distance water transfer project provides key support for the rational allocation of water resources over a large-scale area, optimizing the spatio-temporal distribution of water resources to secure the amount of water available. Large-scale water transfer projects have a profound influence on ecosystems; moreover, global climate change adds uncertainty to, and compounds, their environmental impact. How to assess the ecological and environmental impact of such megaprojects in both the construction and operation phases has therefore attracted considerable attention. The water-output area of the western route of China's South-North Water Transfer Project was taken as the study area of the present article. According to relevant evaluation principles and on the basis of background analysis, we identified the influencing factors and established a diagnostic index system. A coupled climate-hydrology-ecology simulation model was used to simulate and predict the ecological and environmental responses of the water resource area in a changing environment. The evaluation focused on reservoir construction and operation scheduling, representative river corridors and wetlands, natural reserves and the water environment below the dam sites. Finally, an overall evaluation of the comprehensive influence of the project was conducted. The results were as follows: the environmental impacts of the western route project in the water resource area were concentrated in two aspects: the permanent destruction of vegetation during the phase of dam construction and river impoundment, and the significant influence on the hydrological regime of the natural river corridor after the implementation of water extraction. The impact on local climate, vegetation ecology, typical wetlands, natural reserves and the water environment of river basins below the dam sites was small.

  5. Quantitative analysis on the environmental impact of large-scale water transfer project on water resource area in a changing environment

    Science.gov (United States)

    Yan, D. H.; Wang, H.; Li, H. H.; Wang, G.; Qin, T. L.; Wang, D. Y.; Wang, L. H.

    2012-08-01

    The interbasin long-distance water transfer project provides key support for the rational allocation of water resources over a large-scale area, optimizing the spatio-temporal distribution of water resources to secure the amount of water available. Large-scale water transfer projects have a profound influence on ecosystems; moreover, global climate change adds uncertainty to, and compounds, their environmental impact. How to assess the ecological and environmental impact of such megaprojects in both the construction and operation phases has therefore attracted considerable attention. The water-output area of the western route of China's South-North Water Transfer Project was taken as the study area of the present article. According to relevant evaluation principles and on the basis of background analysis, we identified the influencing factors and established a diagnostic index system. A coupled climate-hydrology-ecology simulation model was used to simulate and predict the ecological and environmental responses of the water resource area in a changing environment. The evaluation focused on reservoir construction and operation scheduling, representative river corridors and wetlands, natural reserves and the water environment below the dam sites. Finally, an overall evaluation of the comprehensive influence of the project was conducted. The results were as follows: the environmental impacts of the western route project in the water resource area were concentrated in two aspects: the permanent destruction of vegetation during the phase of dam construction and river impoundment, and the significant influence on the hydrological regime of the natural river corridor after the implementation of water extraction. The impact on local climate, vegetation ecology, typical wetlands, natural reserves and the water environment of river basins below the dam sites was small.

  6. Large-Scale Urban Projects, Production of Space and Neo-liberal Hegemony: A Comparative Study of Izmir

    Directory of Open Access Journals (Sweden)

    Mehmet PENPECİOĞLU

    2013-04-01

    With the rise of neo-liberalism, large-scale urban projects (LDPs) have become a powerful mechanism of urban policy. Creating spaces of neo-liberal urbanization such as central business districts, tourism centers, gated residences and shopping malls, LDPs play a role not only in the reproduction of capital accumulation relations but also in the shift of urban political priorities towards the construction of neo-liberal hegemony. The construction of neo-liberal hegemony, and the role played by LDPs in this process, cannot be investigated through the analysis of capital accumulation alone. Such an investigation must also examine the role of state and civil society actors in LDPs, their collaborative and conflictual relationships, and their functions in hegemony. In the case of Izmir’s two LDPs, namely the New City Center (NCC) and Inciraltı Tourism Center (ITC) projects, this study analyzes the relationship between the production of space and neo-liberal hegemony. In the NCC project, local governments, investors, local capital organizations and professional chambers collaborated and disseminated a hegemonic discourse, which provided social support for the project. Through these relationships and discourses, the NCC project has become a hegemonic project for producing space and constructed neo-liberal hegemony over urban political priorities. In contrast, the ITC project saw no collaboration between state and organized civil society actors. The social opposition against the ITC project, initiated by professional chambers, has brought legal action against the ITC development plans in order to prevent their implementation. As a result, the ITC project did not acquire the consent of organized social groups and failed to become a hegemonic project for producing space.

  7. Choice of large projects

    Energy Technology Data Exchange (ETDEWEB)

    Harris, R

    1978-08-01

    Conventional cost/benefit or project analysis has generally not taken into account circumstances in which the project under consideration is large enough that its introduction to the economy would have significant general equilibrium effects. In this paper, rules are examined that would indicate whether such large projects should be accepted or rejected. The rules utilize information yielded by before-project and after-project equilibrium prices and production data. Rules are developed for the undistorted 'first-best' case, the case in which the fixed costs of the project are covered by distortionary taxation, and the case of projects producing public goods. 34 references.

  8. Large Scale eHealth Deployment in Europe: Insights from Concurrent Use of Standards.

    Science.gov (United States)

    Eichelberg, Marco; Chronaki, Catherine

    2016-01-01

    Large-scale eHealth deployment projects face a major challenge when called to select the right set of standards and tools to achieve sustainable interoperability in an ecosystem including both legacy systems and new systems reflecting technological trends and progress. There is not a single standard that would cover all needs of an eHealth project, and there is a multitude of overlapping and perhaps competing standards that can be employed to define document formats, terminology, communication protocols mirroring alternative technical approaches and schools of thought. eHealth projects need to respond to the important question of how alternative or inconsistently implemented standards and specifications can be used to ensure practical interoperability and long-term sustainability in large scale eHealth deployment. In the eStandards project, 19 European case studies reporting from R&D and large-scale eHealth deployment and policy projects were analyzed. Although this study is not exhaustive, reflecting on the concepts, standards, and tools for concurrent use and the successes, failures, and lessons learned, this paper offers practical insights on how eHealth deployment projects can make the most of the available eHealth standards and tools and how standards and profile developing organizations can serve the users embracing sustainability and technical innovation.

  9. Implementing Projects in Calculus on a Large Scale at the University of South Florida

    Science.gov (United States)

    Fox, Gordon A.; Campbell, Scott; Grinshpan, Arcadii; Xu, Xiaoying; Holcomb, John; Bénéteau, Catherine; Lewis, Jennifer E.; Ramachandran, Kandethody

    2017-01-01

    This paper describes the development of a program of project-based learning in Calculus courses at a large urban research university. In this program, students developed research projects in consultation with a faculty advisor in their major, and supervised by their calculus instructors. Students wrote up their projects in a prescribed format…

  10. Large-scale decontamination and decommissioning technology demonstration project at a former uranium metal production facility

    International Nuclear Information System (INIS)

    Martineit, R.A.; Borgman, T.D.; Peters, M.S.; Stebbins, L.L.

    1997-01-01

    The Department of Energy's (DOE) Office of Science and Technology Decontamination and Decommissioning (D&D) Focus Area, led by the Federal Energy Technology Center, has been charged with improving upon baseline D&D technologies, with the goal of demonstrating and validating more cost-effective and safer technologies to characterize, deactivate, survey, decontaminate, dismantle, and dispose of surplus structures, buildings, and their contents at DOE sites. The D&D Focus Area's approach to verifying the benefits of the improved D&D technologies is to use them in large-scale technology demonstration (LSTD) projects at several DOE sites. The Fernald Environmental Management Project (FEMP) was selected to host one of the first three LSTDs awarded by the D&D Focus Area. The FEMP is a DOE facility near Cincinnati, Ohio, that was formerly engaged in the production of high-quality uranium metal. The FEMP is a Superfund site which has completed its RI/FS process and is currently undergoing environmental restoration. With its selection to host an LSTD, the FEMP was immediately faced with some challenges. The primary challenge was that this LSTD was to be integrated into the FEMP's Plant 1 D&D Project, an ongoing D&D project for which a firm fixed-price contract had been issued to the D&D contractor. Thus, interference with the baseline D&D project could have significant financial implications. Other challenges included defining and selecting meaningful technology demonstrations, finding and selecting technology providers, and integrating the technology into the baseline D&D project. To date, twelve technologies have been selected, and six have been demonstrated. The technology demonstrations have yielded a high proportion of 'winners'. All demonstrated technologies will be evaluated for incorporation into the FEMP's baseline D&D

  11. The MedAustron project: an example of large-scale technology transfer

    CERN Multimedia

    Antonella Del Rosso & Michael Benedikt

    2013-01-01

    In January this year, CERN’s Director-General Rolf Heuer handed over the first ion source to the MedAustron therapy centre in the town of Wiener Neustadt in the presence of the Austrian authorities. This milestone marks the beginning of the transition from the development and design phase to the commissioning of the new facility.   Handover of the ion source to MedAustron on 11 January, 2013. From left to right: Michael Benedikt (Project Leader MedAustron at CERN), Karlheinz Töchterle (Austrian Federal Minister of Science and Research), Erwin Pröll (Governor of Lower Austria), Rolf Heuer (Director-General CERN), Klaus Schneeberger (Lower Austrian State Parliament, Head of EBG MedAustron Council). The goal of the MedAustron project is the construction of an ion-therapy and research centre, based on a synchrotron accelerator complex, in Austria (for more about the technical part of the MedAustron project, click here). “MedAustron will be the first large-sca...

  12. Evaluation of scheduling problems for the project planning of large-scale projects using the example of nuclear facility dismantling; Evaluation von Schedulingproblemen fuer die Projektplanung von Grossprojekten am Beispiel des kerntechnischen Rueckbaus

    Energy Technology Data Exchange (ETDEWEB)

    Huebner, Felix; Schellenbaum, Uli; Stuerck, Christian; Gerhards, Patrick; Schultmann, Frank

    2017-05-15

    In terms of deconstruction costs and project duration, the magnitude of widespread nuclear decommissioning and dismantling exceeds even that of most prominent large-scale projects. The deconstruction costs of one reactor are estimated at several hundred million Euros and the dismantling period at more than a decade. The nuclear power plants built in the 1970s are approaching the end of their planned operating lifespans. Therefore, the decommissioning and dismantling of nuclear facilities, which poses a multitude of challenges to planning and implementation, is becoming more and more relevant. This study describes planning methods for large-scale projects. The goal of this paper is to formulate a project planning problem that appropriately copes with the specific challenges of nuclear deconstruction projects. For this purpose, the requirements for appropriate scheduling methods are presented. Furthermore, a variety of possible scheduling problems are introduced and compared by their specifications and their behaviour. A set of particular scheduling problems, including possible extensions and generalisations, is assessed in detail. Based on the introduced problems and extensions, a Multi-mode Resource Investment Problem with Tardiness Penalty is chosen to fit the requirements of nuclear facility dismantling. This scheduling problem is then customised and adjusted according to the specific challenges of nuclear deconstruction projects. It can be called a Multi-mode Resource Investment Problem under the consideration of generalized precedence constraints and post-operational costs.
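
    To make the chosen problem class concrete, the sketch below evaluates the objective of a heavily simplified multi-mode resource investment problem with tardiness penalty: each activity runs in one of several modes (duration, resource demand), the investment cost is driven by the peak resource level required, and finishing after the deadline is penalised. All activities, modes, and cost figures are invented for illustration and are not taken from the paper.

```python
def schedule_cost(activities, mode_choice, start, resource_rate, deadline, penalty_per_day):
    """Objective of a simplified multi-mode resource investment problem
    with tardiness penalty: pay for the peak resource level ever needed,
    plus a penalty for each day the project finishes past the deadline."""
    finish = 0
    usage = {}  # day -> resource units in use that day
    for name, modes in activities.items():
        duration, demand = modes[mode_choice[name]]
        s = start[name]
        for day in range(s, s + duration):
            usage[day] = usage.get(day, 0) + demand
        finish = max(finish, s + duration)
    peak = max(usage.values())
    tardiness = max(0, finish - deadline)
    return peak * resource_rate + tardiness * penalty_per_day

# Two dismantling activities, each with a fast/expensive and a slow/cheap mode.
activities = {
    "decontaminate": [(4, 3), (6, 2)],   # (duration in days, crew size)
    "dismantle":     [(5, 4), (8, 2)],
}
cost = schedule_cost(
    activities,
    mode_choice={"decontaminate": 1, "dismantle": 1},  # choose the slow modes
    start={"decontaminate": 0, "dismantle": 6},        # run them sequentially
    resource_rate=1000, deadline=12, penalty_per_day=5000,
)
print(cost)  # peak crew of 2 * 1000 + 2 late days * 5000 = 12000
```

    A solver for the full problem searches over mode choices and start times under precedence constraints; this sketch only shows the trade-off being optimised, i.e. cheaper slow modes versus tardiness.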

  13. Research Proposal: Methodology for Assessment Frameworks in Large-scale Infrastructural Water Projects

    NARCIS (Netherlands)

    Hommes, Saskia

    2005-01-01

    Water management is a central and ongoing issue in the Netherlands. Large infrastructural projects are being carried out and planned in a number of water systems. These initiatives operate within a complex web of interactions, between short- and long-term, economic costs and benefits, technical

  14. Selecting Large Portfolios of Social Projects in Public Organizations

    Directory of Open Access Journals (Sweden)

    Igor Litvinchev

    2014-01-01

    We address the portfolio selection of social projects in public organizations considering interdependencies (synergies) affecting project funding requirements and tasks. A mixed integer linear programming model is proposed incorporating the most relevant aspects of the problem found in the literature. The model supports both complete (all or nothing) and partial (a certain amount from a given interval) funding resource allocation policies. Numerical results for large-scale problem instances are presented.
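
    The selection problem can be illustrated with a brute-force sketch; the paper itself formulates a mixed integer linear program and handles far larger instances. The project data, the pairwise synergy model, and the all-or-nothing funding policy here are all simplifying assumptions.

```python
from itertools import combinations

def best_portfolio(projects, synergies, budget):
    """Exhaustively pick the benefit-maximising set of projects whose total
    cost fits the budget. A synergy (i, j, saving) reduces the joint cost
    when both projects are funded. All-or-nothing funding only."""
    names = list(projects)
    best = (0, frozenset())
    for r in range(len(names) + 1):
        for subset in combinations(names, r):
            chosen = set(subset)
            cost = sum(projects[p][0] for p in chosen)
            benefit = sum(projects[p][1] for p in chosen)
            for i, j, saving in synergies:
                if i in chosen and j in chosen:
                    cost -= saving  # shared infrastructure, staff, etc.
            if cost <= budget and benefit > best[0]:
                best = (benefit, frozenset(chosen))
    return best

projects = {  # invented data: name -> (cost, social benefit score)
    "clinic": (60, 90), "school": (50, 80), "road": (40, 50),
}
synergies = [("clinic", "school", 20)]  # they can share a building
benefit, chosen = best_portfolio(projects, synergies, budget=100)
print(benefit, sorted(chosen))  # 170 ['clinic', 'school']
```

    The synergy makes the clinic-school pair affordable within the budget, which is exactly the interdependency effect the MILP formulation captures; an exhaustive search like this is only feasible for a handful of projects.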

  15. Final Report: Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Ghattas, Omar [The University of Texas at Austin

    2013-10-15

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focuses on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. Our research is directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. Our efforts are integrated in the context of a challenging testbed problem that considers subsurface reacting flow and transport. The MIT component of the SAGUARO Project addresses the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to achieve their speedups.

  16. A Nonlinear Multiobjective Bilevel Model for Minimum Cost Network Flow Problem in a Large-Scale Construction Project

    Directory of Open Access Journals (Sweden)

    Jiuping Xu

    2012-01-01

    Full Text Available The aim of this study is to deal with a minimum cost network flow problem (MCNFP in a large-scale construction project using a nonlinear multiobjective bilevel model with birandom variables. The main target of the upper level is to minimize both direct and transportation time costs. The target of the lower level is to minimize transportation costs. After an analysis of the birandom variables, an expectation multiobjective bilevel programming model with chance constraints is formulated to incorporate decision makers’ preferences. To solve the identified special conditions, an equivalent crisp model is proposed with an additional multiobjective bilevel particle swarm optimization (MOBLPSO developed to solve the model. The Shuibuya Hydropower Project is used as a real-world example to verify the proposed approach. Results and analysis are presented to highlight the performances of the MOBLPSO, which is very effective and efficient compared to a genetic algorithm and a simulated annealing algorithm.

  17. Open TG-GATEs: a large-scale toxicogenomics database

    Science.gov (United States)

    Igarashi, Yoshinobu; Nakatsu, Noriyuki; Yamashita, Tomoya; Ono, Atsushi; Ohno, Yasuo; Urushidani, Tetsuro; Yamada, Hiroshi

    2015-01-01

    Toxicogenomics focuses on assessing the safety of compounds using gene expression profiles. Gene expression signatures from large toxicogenomics databases are expected to perform better than small databases in identifying biomarkers for the prediction and evaluation of drug safety based on a compound's toxicological mechanisms in animal target organs. Over the past 10 years, the Japanese Toxicogenomics Project consortium (TGP) has been developing a large-scale toxicogenomics database consisting of data from 170 compounds (mostly drugs) with the aim of improving and enhancing drug safety assessment. Most of the data generated by the project (e.g. gene expression, pathology, lot number) are freely available to the public via Open TG-GATEs (Toxicogenomics Project-Genomics Assisted Toxicity Evaluation System). Here, we provide a comprehensive overview of the database, including both gene expression data and metadata, with a description of experimental conditions and procedures used to generate the database. Open TG-GATEs is available from http://toxico.nibio.go.jp/english/index.html. PMID:25313160

  18. Modern Sorters for Soil Segregation on Large Scale Remediation Projects

    International Nuclear Information System (INIS)

    Shonka, J.J.; Kelley, J.E.; O'Brien, J.M.

    2008-01-01

    volume is 50 kilograms or less. Smaller sorting volumes can be obtained with lower throughput or by re-sorting the diverted material. This equipment can also handle large objects. The use of spectroscopy systems allows several regions of interest to be set. Super-Sorters can bring waste processing charges down to less than $30/metric ton on smaller jobs and can save hundreds of dollars per metric ton in disposal charges. The largest effect on the overall project cost occurs during planning and implementation. The overall goal is reduction of the length of the project, which dictates the most efficient soil processing. With all sorting systems, the parameters that need to be accounted for are matrix type, soil feed rate, soil pre-processing, site conditions, and regulatory issues. The soil matrix and its ability to flow are crucial to operations. It is also important to consider that as conditions change (i.e., moisture), the flowability of the soil matrix will change. Many soil parameters have to be considered: cohesive strength, internal and wall friction, permeability, and bulk density as a function of consolidating pressure. Clay-bearing soils have very low permeability and high cohesive strength, which makes them difficult to process, especially when wet. Soil feed speed depends on the equipment present and the ability to move the soil in the Super-Sorter processing area. When a Super-Sorter is running at 400 metric tons per hour it is difficult to feed the system. As an example, front-end loaders with large buckets move approximately 5-10 metric tons of material per load, so attaining 400 metric tons per hour would require 50-100 bucket-loads per hour. Because the flowability of the soil matrix is important, poor material is often pre-processed before it is added to the feed hopper of the 'survey' conveyor. This pre-processing can consist of a 'grizzly' to remove large objects from the soil matrix, followed by a screening plant to prepare the soil so that it

  19. Large Scale Glazed Concrete Panels

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    Today, there is a lot of focus on the aesthetic potential of concrete surfaces, both globally and locally. World-famous architects such as Herzog & de Meuron, Zaha Hadid, Richard Meier and David Chipperfield challenge the exposure of concrete in their architecture. At home, this trend can be seen in the crinkly façade of DR-Byen (the domicile of the Danish Broadcasting Company) by architect Jean Nouvel and in the black curved smooth concrete surfaces of Zaha Hadid’s Ordrupgård. Furthermore, one can point to initiatives such as “Synlig beton” (visible concrete), presented on the website www.synligbeton.dk, and spæncom’s aesthetic relief effects by the designer Line Kramhøft (www.spaencom.com). It is my hope that the research and development project “Lasting large scale glazed concrete formwork”, which I am working on at DTU’s Department of Architectural Engineering, will be able to complement these. It is a project where I

  20. Direct Satellite Data Acquisition and its Application for Large-Scale Monitoring Projects in Russia

    Science.gov (United States)

    Gershenzon, O.

    2011-12-01

    ScanEx RDC created an infrastructure (a ground station network) to acquire and process remote sensing data from different satellites: Terra, Aqua, Landsat, IRS-P5/P6, SPOT 4/5, FORMOSAT-2, EROS A/B, RADARSAT-1/2, ENVISAT-1. It owns image archives from these satellites as well as from SPOT-2 and CARTOSAT-2. ScanEx RDC builds and delivers remote sensing ground stations (working with up to 15 satellites) and owns the ground station network that acquires data for Russia and the surrounding territory. ScanEx stations are the basic component in departmental remote sensing data acquisition networks for different state authorities (Roshydromet, the Ministry of Natural Resources, Emercom) and in university-based remote sensing data acquisition and processing centers in Russia and abroad. ScanEx performs large-scale projects in collaboration with government agencies to monitor forests, floods, fires, sea surface pollution, and the ice situation in Northern Russia. During 2010-2011 ScanEx conducted daily monitoring of wildfires in Russia, detecting and registering thermal anomalies using data from the Terra, Aqua, Landsat and SPOT satellites. Detailed SPOT 4/5 data is used to analyze burnt areas and to assess damage caused by fire. Satellite data, along with other information about the fire situation in Russia, was updated daily and published via a free-access Internet geoportal. ScanEx has also conducted several projects together with environmental NGOs. The project "Satellite monitoring of Especially Protected Natural Areas of Russia", with its results visualized on a geoportal, was conducted in cooperation with the NGO "Transparent World". The project's goal was to observe natural phenomena and economic activity, including illegal activity, by means of Earth remote sensing data. Monitoring is based on multi-temporal optical space imagery of different spatial resolutions. Project results include the detection of anthropogenic objects that appeared in the vicinity of, or even within the borders of, natural territories, that have never been

  1. Success Factors of Large Scale ERP Implementation in Thailand

    OpenAIRE

    Rotchanakitumnuai; Siriluck

    2010-01-01

    The objective of the study is to examine the determinants of success in large-scale ERP implementation. The results indicate that large-scale ERP implementation success consists of eight factors: project management competence, knowledge sharing, ERP system quality, understanding, user involvement, business process re-engineering, top management support, and organization readiness.

  2. Large-scale visualization projects for teaching software engineering.

    Science.gov (United States)

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments.

  3. Reliability, availability and maintenance aspects of large-scale offshore wind farms, a concepts study

    NARCIS (Netherlands)

    Van Bussel, G.J.W.; Zaayer, M.B.

    2001-01-01

    The DOWEC project aims at the implementation of large wind turbines in large-scale wind farms. As part of the DOWEC project, a concepts study was performed regarding the achievable reliability and availability levels. A reduction by a factor of 2 with regard to the present state of the art seems fairly

  4. Large scale phenotyping and data analysis of pepper genotypes in the EU-SPICY project

    NARCIS (Netherlands)

    Dieleman, J.A.; Magán, J.J.; Wubs, A.M.; Palloix, A.; Lenk, S.; Glasbey, C.; Eeuwijk, van F.A.

    2012-01-01

In breeding the best genotypes for diverse conditions, the breeder should ideally test all crossings under all these conditions. Especially with complex physiological traits like yield, which exhibit large variation, this would require many expensive and large field trials. The EU project “Smart

  5. Small-scale bioenergy projects in rural China: Lessons to be learnt

    NARCIS (Netherlands)

    Han, Jingyi; Mol, A.P.J.; Lu, Y.; Zhang, L.

    2008-01-01

A large number of small-scale bioenergy projects have been carried out in China's rural areas in light of its national renewable energy policies. These projects applied pyrolysis gasification as the main technology, which turns biomass waste into biogas at low cost. This paper selects seven bioenergy

  6. Automated Bug Assignment: Ensemble-based Machine Learning in Large Scale Industrial Contexts

    OpenAIRE

    Jonsson, Leif; Borg, Markus; Broman, David; Sandahl, Kristian; Eldh, Sigrid; Runeson, Per

    2016-01-01

    Bug report assignment is an important part of software maintenance. In particular, incorrect assignments of bug reports to development teams can be very expensive in large software development projects. Several studies propose automating bug assignment techniques using machine learning in open source software contexts, but no study exists for large-scale proprietary projects in industry. The goal of this study is to evaluate automated bug assignment techniques that are based on machine learni...
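The ensemble idea in the abstract can be illustrated with a toy majority-vote assigner. The paper uses stacked generalization over trained learners; the keyword rules and team names below are invented purely for illustration.

```python
from collections import Counter

class KeywordClassifier:
    """Toy base learner: assigns a team when one of its trigger words
    appears in the bug report text (hypothetical rules)."""
    def __init__(self, rules):
        self.rules = rules  # word -> team name

    def predict(self, report):
        text = report.lower()
        for word, team in self.rules.items():
            if word in text:
                return team
        return "triage"  # fallback when no rule matches

def ensemble_assign(classifiers, report):
    """Combine the individual assignments by majority vote."""
    votes = [c.predict(report) for c in classifiers]
    return Counter(votes).most_common(1)[0][0]

learners = [
    KeywordClassifier({"crash": "kernel-team"}),
    KeywordClassifier({"crash": "kernel-team", "button": "ui-team"}),
    KeywordClassifier({"timeout": "network-team"}),
]
print(ensemble_assign(learners, "Crash on startup"))  # kernel-team
```

In the industrial setting studied, each base learner would be a trained classifier over historical bug reports rather than a keyword rule, but the combination step has the same shape.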

  7. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024^3 grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  8. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated in the power systems, it becomes important to study the effects of EV integration on the power systems, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of electric vehicles are introduced. The possible impacts of large scale integration of electric vehicles on the power systems, especially the advantages for the integration of renewable energies, are discussed. Finally, the research projects related to the large scale integration of electric vehicles into the power systems are introduced; they will provide reference for large scale integration of electric vehicles into power grids.

  9. Energy performance strategies for the large scale introduction of geothermal energy in residential and industrial buildings: The GEO.POWER project

    International Nuclear Information System (INIS)

    Giambastiani, B.M.S.; Tinti, F.; Mendrinos, D.; Mastrocicco, M.

    2014-01-01

Use of shallow geothermal energy, in terms of ground coupled heat pumps (GCHP) for heating and cooling purposes, is an environmentally-friendly and cost-effective alternative with potential to replace fossil fuels and help mitigate global warming. Focusing on the recent results of the GEO.POWER project, this paper aims at examining the energy performance strategies and the future regional and national financial instruments for large scale introduction of geothermal energy and GCHP systems in both residential and industrial buildings. After a transferability assessment to evaluate the reproducibility of some outstanding examples of systems currently existing in Europe for the utilisation of shallow geothermal energy, a set of regulatory, economic and technical actions is proposed to encourage GCHP market development and support geothermal energy investments in the frame of the existing European normative platforms. This analysis shows that many European markets are changing from a new GCHP market to a growth market. However, some interventions are still required, such as incentives, a regulatory framework, certification schemes and training activities, in order to accelerate the market uptake and achieve the main European energy and climate targets. - Highlights: • Potentiality of geothermal applications for heating and cooling in buildings. • Description of the GEO.POWER project and its results. • Local strategies for the large scale introduction of GCHPs

  10. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  11. Numerical Analysis of Soil Settlement Prediction and Its Application In Large-Scale Marine Reclamation Artificial Island Project

    Directory of Open Access Journals (Sweden)

    Zhao Jie

    2017-11-01

In an artificial island construction project based on large-scale marine reclamation, soil settlement is a key factor affecting the long-term safe operation of the whole field. To analyze the factors of soil settlement in a marine reclamation project, the SEM method of soil micro-structural analysis is used to test and study six soil samples, including the representative silt, mucky silty clay, silty clay and clay in the area. The structural characteristics that affect soil settlement are obtained by observing the SEM charts at different depths. By combining the numerical calculation methods of Terzaghi’s one-dimensional and Biot’s two-dimensional consolidation theory, one-dimensional and two-dimensional creep models are established and the numerical calculation results of the two consolidation theories are compared in order to predict the maximum settlement of the soils 100 years after completion. The analysis results indicate that the micro-structural characteristics are the essential factor affecting settlement in this area. Based on the numerical analysis of one-dimensional and two-dimensional settlement, the settlement laws and trends obtained by the two numerical analysis methods are similar. The analysis in this paper can provide reference and guidance for projects related to marine reclamation land.
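Terzaghi's one-dimensional consolidation theory referenced above yields settlement-versus-time predictions from a classical Fourier-series solution; a minimal sketch, with soil parameters that are illustrative assumptions rather than values from the paper:

```python
import math

def degree_of_consolidation(Tv, n_terms=100):
    """Average degree of consolidation U for dimensionless time factor Tv,
    from the Fourier-series solution of Terzaghi's 1-D theory."""
    U = 1.0
    for m in range(n_terms):
        M = math.pi * (2 * m + 1) / 2.0
        U -= (2.0 / M**2) * math.exp(-M**2 * Tv)
    return U

def settlement(t_years, cv, H, s_final, **kw):
    """Settlement at time t for consolidation coefficient cv (m^2/yr),
    drainage path length H (m) and final primary settlement s_final (m)."""
    Tv = cv * t_years / H**2
    return s_final * degree_of_consolidation(Tv, **kw)

# Illustrative: cv = 0.5 m^2/yr, 10 m drainage path, 2.0 m final settlement
print(round(settlement(100.0, 0.5, 10.0, 2.0), 3))  # about 1.53 m after 100 yr
```

The two-dimensional Biot analysis used in the paper couples pore pressure and deformation and has no such closed-form series, which is why the authors compare the two numerically.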

  12. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    Science.gov (United States)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

    Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional- to local-scales. While global climate models are generally capable of simulating mean climate at global-to-regional scales with reasonable skill, resiliency and adaptation decisions are made at local-scales where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local-scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluation of the ability of climate models to capture key patterns associated with extreme precipitation over Portland and to better interpret projections of future climate at impact-relevant scales.
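Composite analysis as described above amounts to averaging a meteorological field over the days whose precipitation exceeds a chosen percentile threshold; a stdlib-only sketch with invented toy numbers (a 2x2 "pressure anomaly" grid flattened to 4 values per day):

```python
import math

# Hypothetical toy data: daily precipitation at a station and a gridded
# anomaly field (4 grid points) for the same eight days.
precip = [1.0, 0.0, 25.0, 3.0, 40.0, 0.5, 30.0, 2.0]
field  = [[1, 0, 0, 1], [0, 0, 0, 0], [5, 4, 3, 6], [1, 1, 0, 0],
          [6, 5, 4, 7], [0, 1, 0, 0], [4, 4, 2, 5], [1, 0, 1, 1]]

def percentile_threshold(values, q):
    """q-th percentile by the simple nearest-rank definition."""
    s = sorted(values)
    idx = max(0, math.ceil(q / 100.0 * len(s)) - 1)
    return s[idx]

def composite(field, precip, q=75):
    """Mean field over days whose precipitation exceeds the q-th percentile."""
    thr = percentile_threshold(precip, q)
    days = [f for f, p in zip(field, precip) if p > thr]
    n = len(days)
    return [sum(col) / n for col in zip(*days)]

print(composite(field, precip))  # mean pattern on extreme-precipitation days
```

The self-organizing maps step in the abstract then clusters the individual extreme-day fields to expose the within-composite variability that a single mean pattern hides.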

  13. Concurrent Programming Using Actors: Exploiting Large-Scale Parallelism,

    Science.gov (United States)

    1985-10-07

[OCR residue of the report documentation page; no abstract recoverable. Performing organization: MIT Artificial Intelligence Laboratory, 545 Technology Square, Cambridge, MA; G. Agha et al.]

  14. Estimation of Truck Trips on Large-Scale Irrigation Project: A Combinatory Input-Output Commodity-Based Approach

    Directory of Open Access Journals (Sweden)

    Ackchai Sirikijpanichkul

    2015-01-01

For agriculture-based countries, the requirements on transportation infrastructure should not be limited to accommodating general traffic but should also cover the transportation of crops and agricultural products during the harvest seasons. Most past research focuses on the development of truck trip estimation techniques for urban, statewide, or nationwide freight movement but neglects the importance of rural freight movement, which contributes to pavement deterioration on rural roads, especially during harvest seasons. Recently, the Thai Government initiated a plan to construct a network of reservoirs within the northeastern region, aiming at improving the existing irrigation system, particularly in areas where a more effective irrigation system is needed. It is expected to bring new opportunities for expanding the cultivation areas, increasing economies of scale and enlarging the extent of the area's market. As a consequence, its effects on truck trip generation need to be investigated to assure the service quality of the related transportation infrastructure. This paper proposes a combinatory input-output commodity-based approach to estimate truck trips on a rural highway infrastructure network. The large-scale irrigation project for northeastern Thailand is demonstrated as a case study.
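The core of a commodity-based truck trip estimate is converting zonal crop tonnage into loaded (and empty return) truck trips distributed to destination zones. The payload, consumption shares, and zone names below are illustrative assumptions, not values from the study:

```python
import math

def truck_trips(production_tonnes, consumption_shares, payload_tonnes,
                empty_trip_factor=1.0):
    """Distribute a zone's crop output to destination zones by consumption
    share, then convert tonnage to truck trips. empty_trip_factor adds
    empty return trips per loaded trip."""
    trips = {}
    for dest, share in consumption_shares.items():
        loaded = math.ceil(production_tonnes * share / payload_tonnes)
        trips[dest] = round(loaded * (1 + empty_trip_factor))
    return trips

# 10,000 t of crop split 60/40 between two market zones, 15 t per truck,
# one empty return trip per loaded trip
print(truck_trips(10000, {"zone_A": 0.6, "zone_B": 0.4}, 15))
```

In the full input-output approach, the consumption shares would come from inter-zonal commodity flow tables rather than being specified by hand.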

  15. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  16. Properties of large-scale methane/hydrogen jet fires

    Energy Technology Data Exchange (ETDEWEB)

    Studer, E. [CEA Saclay, DEN, LTMF Heat Transfer and Fluid Mech Lab, 91 - Gif-sur-Yvette (France); Jamois, D.; Leroy, G.; Hebrard, J. [INERIS, F-60150 Verneuil En Halatte (France); Jallais, S. [Air Liquide, F-78350 Jouy En Josas (France); Blanchetiere, V. [GDF SUEZ, 93 - La Plaine St Denis (France)

    2009-12-15

A future economy based on reduction of carbon-based fuels for power generation and transportation may consider hydrogen as a possible energy carrier. Extensive and widespread use of hydrogen might require a pipeline network. The alternatives might be the use of the existing natural gas network or the design of a dedicated network. Whatever the solution, mixing hydrogen with natural gas will substantially modify the consequences of accidents. The French National Research Agency (ANR) funded project called HYDROMEL focuses on these critical questions. Within this project, large-scale jet fires have been studied experimentally and numerically. The main characteristics of these flames, including visible length, radiation fluxes and blowout, have been assessed. (authors)

  17. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  18. Scale-Up: Improving Large Enrollment Physics Courses

    Science.gov (United States)

    Beichner, Robert

    1999-11-01

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.

  19. CSNI Project for Fracture Analyses of Large-Scale International Reference Experiments (FALSIRE II)

    Energy Technology Data Exchange (ETDEWEB)

Bass, B.R.; Pugh, C.E.; Keeney, J. [Oak Ridge National Lab., TN (United States); Schulz, H.; Sievers, J. [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) mbH, Koeln (Germany)

    1996-11-01

A summary of Phase II of the Project for FALSIRE is presented. FALSIRE was created by the Fracture Assessment Group (FAG) of the OECD/NEA's Committee on the Safety of Nuclear Installations (CSNI) Principal Working Group No. 3. FALSIRE I in 1988 assessed fracture methods through interpretive analyses of 6 large-scale fracture experiments in reactor pressure vessel (RPV) steels under pressurized-thermal-shock (PTS) loading. In FALSIRE II, experiments examined cleavage fracture in RPV steels for a wide range of materials, crack geometries, and constraint and loading conditions. The cracks were relatively shallow, in the transition temperature region. Included were cracks showing either unstable extension or two stages of extension under transient thermal and mechanical loads. Crack initiation was also investigated in connection with clad surfaces and with biaxial load. Within FALSIRE II, comparative assessments were performed for 7 reference fracture experiments based on 45 analyses received from 22 organizations representing 12 countries. Temperature distributions in thermal shock loaded samples were approximated with high accuracy and small scatter bands. Structural response was predicted reasonably well; discrepancies could usually be traced to the assumed material models and approximated material properties. Almost all participants elected to use the finite element method.

  20. CSNI Project for Fracture Analyses of Large-Scale International Reference Experiments (FALSIRE II)

    International Nuclear Information System (INIS)

    Bass, B.R.; Pugh, C.E.; Keeney, J.; Schulz, H.; Sievers, J.

    1996-11-01

A summary of Phase II of the Project for FALSIRE is presented. FALSIRE was created by the Fracture Assessment Group (FAG) of the OECD/NEA's Committee on the Safety of Nuclear Installations (CSNI) Principal Working Group No. 3. FALSIRE I in 1988 assessed fracture methods through interpretive analyses of 6 large-scale fracture experiments in reactor pressure vessel (RPV) steels under pressurized-thermal-shock (PTS) loading. In FALSIRE II, experiments examined cleavage fracture in RPV steels for a wide range of materials, crack geometries, and constraint and loading conditions. The cracks were relatively shallow, in the transition temperature region. Included were cracks showing either unstable extension or two stages of extension under transient thermal and mechanical loads. Crack initiation was also investigated in connection with clad surfaces and with biaxial load. Within FALSIRE II, comparative assessments were performed for 7 reference fracture experiments based on 45 analyses received from 22 organizations representing 12 countries. Temperature distributions in thermal shock loaded samples were approximated with high accuracy and small scatter bands. Structural response was predicted reasonably well; discrepancies could usually be traced to the assumed material models and approximated material properties. Almost all participants elected to use the finite element method

  1. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    Science.gov (United States)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over a redshift interval beginning at z = 0.4. Destiny will be used in its third year as a high resolution, wide-field imager to conduct a weak lensing survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  2. Infrastructure for large-scale quality-improvement projects: early lessons from North Carolina Improving Performance in Practice.

    Science.gov (United States)

    Newton, Warren P; Lefebvre, Ann; Donahue, Katrina E; Bacon, Thomas; Dobson, Allen

    2010-01-01

    Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3) quality-improvement consultants (QICs), or practice facilitators; (4) learning networks; and (5) alignment of incentives. We emphasized a community-based strategy and developing a statewide infrastructure. Results are reported from the first 2 years of the North Carolina Improving Performance in Practice (IPIP) project. A coalition was formed to include professional societies, North Carolina AHEC, Community Care of North Carolina, insurers, and other organizations. Wave One started with 18 practices in 2 of 9 regions of the state. Quality-improvement consultants recruited practices. Over 80 percent of practices attended all quarterly regional meetings. In 9 months, almost all diabetes measures improved, and a bundled asthma measure improved from 33 to 58 percent. Overall, the magnitude of improvement was clinically and statistically significant (P = .001). Quality improvements were maintained on review 1 year later. Wave Two has spread to 103 practices in all 9 regions of the state, with 42 additional practices beginning the enrollment process. Large-scale health care quality improvement is feasible, when broadly supported by statewide leadership and community infrastructure. Practice-collected data and lack of a control group are limitations of the study design. Future priorities include maintaining improved sustainability for practices and communities. Our long-term goal is to transform all 2000 primary-care practices in our state.

  3. The large hadron collider project

    International Nuclear Information System (INIS)

    Maiani, L.

    1999-01-01

Knowledge of the fundamental constituents of matter has greatly advanced over the last decades. The standard theory of fundamental interactions presents us with a theoretically sound picture, which describes with great accuracy known physical phenomena on most diverse energy and distance scales. These range from 10^-16 cm, inside the nucleons, up to large-scale astrophysical bodies, including the early Universe at some nanosecond after the Big-Bang and temperatures of the order of 10^2 GeV. The picture is not yet complete, however, as we lack the observation of the Higgs boson, predicted in the 100-500 GeV range - a particle associated with the generation of particle masses and with the quantum fluctuations in the primordial Universe. In addition, the standard theory is expected to undergo a change of regime in the 10^3 GeV region, with the appearance of new families of particles, most likely associated with the onset of a new symmetry (supersymmetry). In 1994, the CERN Council approved the construction of the large hadron collider (LHC), a proton-proton collider of a new design to be installed in the existing LEP tunnel, with an energy of 7 TeV per beam and an extremely large luminosity of about 10^34 cm^-2 s^-1. Construction was started in 1996, with the additional support of the US, Japan, Russia, Canada and other European countries, making the LHC a truly global project, the first one in particle physics. After a short review of the physics scenario, I report on the present status of the LHC construction. Special attention is given to technological problems such as the realization of the super-conducting dipoles, following an extensive R and D program with European industries. The construction of the large LHC detectors has required a vast R and D program by a large international community, to overcome the problems posed by the complexity of the collisions and by the large luminosity of the machine. (orig.)

  4. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  5. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  6. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models which are related to weather forecast and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products along with ground observations provide another avenue for investigating how the precipitation uncertainty would affect the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North America gridded products including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, results provide an assessment of possible applications for various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing performance of spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
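Verification across temporal scales, as described above, reduces to aggregating both series to the target resolution and scoring standard criteria such as bias and RMSE; a minimal stdlib sketch with made-up values (not data from any of the products named):

```python
import math

def aggregate(series, window):
    """Sum consecutive non-overlapping blocks (e.g. daily -> 5-day totals)."""
    return [sum(series[i:i + window])
            for i in range(0, len(series) - window + 1, window)]

def bias_rmse(product, gauge):
    """Mean error and root-mean-square error of a gridded product
    against gauge observations at matching times."""
    n = len(gauge)
    bias = sum(p - g for p, g in zip(product, gauge)) / n
    rmse = math.sqrt(sum((p - g) ** 2 for p, g in zip(product, gauge)) / n)
    return bias, rmse

gauge   = [1.0, 2.0, 3.0, 4.0]   # toy gauge series
product = [2.0, 2.0, 2.0, 4.0]   # toy gridded-product series
print(bias_rmse(product, gauge))                              # native step
print(bias_rmse(aggregate(product, 2), aggregate(gauge, 2)))  # 2-step totals
```

Scoring the same pair at several aggregation windows shows how a product's apparent skill changes with temporal scale, which is the multiresolution comparison the abstract describes.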

  7. The linac control system for the large-scale synchrotron radiation facility (SPring-8)

    Energy Technology Data Exchange (ETDEWEB)

    Sakaki, Hironao; Yoshikawa, Hiroshi [Japan Atomic Energy Research Inst., Tokai, Ibaraki (Japan). Tokai Research Establishment; Itoh, Yuichi [Atomic Energy General Services Corporation, Tokai, Ibaraki (Japan); Terashima, Yasushi [Information Technology System Co., Ltd. (ITECS), Tokyo (Japan)

    2000-09-01

The linac for the large-scale synchrotron radiation facility has been operated since August 1996 and has dealt with user requests without any big troubles. In this report, the control system development policy, details, and the operation of the linac are presented. These experiences are also described so that they can be used for the control system of the large-scale proton accelerators to be developed in the High Intensity Proton Accelerator Project. (author)

  8. Large-scale correlations in gas traced by Mg II absorbers around low-mass galaxies

    Science.gov (United States)

    Kauffmann, Guinevere

    2018-03-01

The physical origin of the large-scale conformity in the colours and specific star formation rates of isolated low-mass central galaxies and their neighbours on scales in excess of 1 Mpc is still under debate. One possible scenario is that gas is heated over large scales by feedback from active galactic nuclei (AGNs), leading to coherent modulation of cooling and star formation between well-separated galaxies. In this Letter, the metal line absorption catalogue of Zhu & Ménard is used to probe gas out to large projected radii around a sample of a million galaxies with stellar masses ˜10^10 M⊙ and photometric redshifts in a range beginning at 0.4, drawn from Sloan Digital Sky Survey imaging data. This galaxy sample covers an effective volume of 2.2 Gpc^3. A statistically significant excess of Mg II absorbers is present around the red low-mass galaxies compared to their blue counterparts out to projected radii of 10 Mpc. In addition, the equivalent width distribution function of Mg II absorbers around low-mass galaxies is shown to be strongly affected by the presence of a nearby (Rp < 2 Mpc) radio-loud AGN out to projected radii of 5 Mpc.

  9. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    Science.gov (United States)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
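The dissipative eddy-viscosity part that the proposed model augments can be sketched with the classic Smagorinsky closure; the constant and filter width below are typical illustrative values, not those used in the paper, and the nondissipative nonlinear term is omitted:

```python
import math

def smagorinsky_stress(S, Cs=0.17, delta=0.1):
    """Dissipative SGS stress from a 3x3 resolved strain-rate tensor S:
    tau_ij = -2 (Cs*delta)^2 |S| S_ij, with |S| = sqrt(2 S_ij S_ij)."""
    mag = math.sqrt(2.0 * sum(S[i][j] ** 2 for i in range(3) for j in range(3)))
    nu_t = (Cs * delta) ** 2 * mag  # eddy viscosity
    return [[-2.0 * nu_t * S[i][j] for j in range(3)] for i in range(3)]

# Pure plane strain: stretching along x, compression along y
S = [[1.0, 0.0, 0.0], [0.0, -1.0, 0.0], [0.0, 0.0, 0.0]]
tau = smagorinsky_stress(S)
```

Because this stress is aligned with the strain rate, it can only drain energy from the resolved scales; the nonlinear term the authors add is constructed precisely to represent the non-aligned, transport-type contributions that rotation introduces.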

  10. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Kenison, LaVesta [URS, Pittsburgh, PA (United States); Flanigan, Thomas [URS, Pittsburgh, PA (United States); Hagerty, Gregg [URS, Pittsburgh, PA (United States); Gorrie, James [Air Liquide, Kennesaw, GA (United States); Leclerc, Mathieu [Air Liquide, Kennesaw, GA (United States); Lockwood, Frederick [Air Liquide, Kennesaw, GA (United States); Falla, Lyle [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Macinnis, Jim [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Fedak, Mathew [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Yakle, Jeff [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Williford, Mark [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States); Wood, Paul [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States)

    2016-04-01

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168 MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant, thereby resulting in near-zero emissions of all commonly regulated air pollutants, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store and monitor the CO2 captured by the Oxy-combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives, inclusive of front-end engineering and design, and the advanced design required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage project. Ultimately the project did not proceed to construction due to insufficient time to complete the necessary EPC contract negotiations and commercial financing prior to expiration of federal co-funding, which triggered a DOE decision to close out its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned. This project has significantly advanced the development of near-zero emission technology and will

  11. Architecture for large-scale automatic web accessibility evaluation based on the UWEM methodology

    DEFF Research Database (Denmark)

    Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.

    2008-01-01

    The European Internet Accessibility project (EIAO) has developed an Observatory for performing large scale automatic web accessibility evaluations of public sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget… of web pages have been identified or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation and the evaluation results are stored in a Resource Description Format (RDF) database that is later loaded… challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements….

  12. A large-scale soil-structure interaction experiment: Design and construction

    International Nuclear Information System (INIS)

    Tang, H.T.; Tang, Y.K.; Stepp, J.C.; Wall, I.B.; Lin, E.; Cheng, S.C.; Lee, S.K.

    1989-01-01

    This paper describes the design and construction phase of the Large-Scale Soil-Structure Interaction Experiment project jointly sponsored by EPRI and Taipower. The project has two objectives: 1. to obtain an earthquake database which can be used to substantiate soil-structure interaction (SSI) models and analysis methods; and 2. to quantify nuclear power plant reactor containment and internal component seismic margins based on earthquake experience data. These objectives were accomplished by recording and analyzing data from two instrumented, scaled-down, reinforced concrete containment structures during seismic events. The two model structures are sited in a high-seismicity region in Taiwan (SMART-1). A strong-motion seismic array network is located at the site. The containment models (1/4- and 1/12-scale) were constructed and instrumented specially for this experiment. Construction was completed and data recording began in September 1985. By November 1986, 18 strong-motion earthquakes ranging from Richter magnitude 4.5 to 7.0 had been recorded. (orig./HP)

  13. Large-scale laboratory observations of beach morphodynamics and turbulence beneath shoaling and breaking waves

    NARCIS (Netherlands)

    Winter, W. de; Wesselman, D.; Grasso, F.R.; Ruessink, B.G.

    2013-01-01

    In 2012, large-scale laboratory experiments were carried out in the Deltagoot in the framework of the Hydralab IV-funded BARDEXII project. The overall project aims were to examine the effect of swash/groundwater interactions on sand transport and morphological development in the swash zone and,

  14. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy, and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background which are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it be judged successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three points, the thesis that this is a new form of institutionalization of science: 1) external control, 2) the organizational form, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de

  15. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  16. Numerical Analysis of Consolidation Settlement and Creep Deformation of Artificial Island Revetment Structure in a Large-Scale Marine Reclamation Land Project

    Directory of Open Access Journals (Sweden)

    Jie Zhao

    2015-09-01

    Full Text Available In order to analyze the influential factors of soft foundation settlement in a marine reclamation land project, the consolidation settlement and pore pressure dissipation of the entire area are numerically simulated using the Soft-Soil-Creep Model, in which the PLAXIS finite element software for professional geotechnical engineering is applied and empirical data from Japan's Kansai airport project are used. Moreover, the settlement and pore pressure results for the different basic periods are plotted, and the corresponding analysis conclusions are obtained based on the comparison among the results from the computational parameters of depth. In addition, the influence of the various parameters on the settlement results is established through a parameter sensitivity analysis in the Soft-Soil-Creep Model, and the experience and conclusions can serve as a reference in the design and construction of similar large-scale marine reclamation land projects. The empirical value method for the creep index has not yet been applied widely; further research needs to be done.

  17. Large-scale simulations with distributed computing: Asymptotic scaling of ballistic deposition

    International Nuclear Information System (INIS)

    Farnudi, Bahman; Vvedensky, Dimitri D

    2011-01-01

    Extensive kinetic Monte Carlo simulations are reported for ballistic deposition (BD) in (1 + 1) dimensions. The large system size observed for the onset of asymptotic scaling (L ≅ 2¹²) explains the widespread discrepancies in previous reports for exponents of BD in one and likely in higher dimensions. The exponents obtained directly from our simulations, α = 0.499 ± 0.004 and β = 0.336 ± 0.004, capture the exact values α = 1/2 and β = 1/3 for the one-dimensional Kardar-Parisi-Zhang equation. An analysis of our simulations suggests a criterion for identifying the onset of true asymptotic scaling, which enables a more informed evaluation of exponents for BD in higher dimensions. These simulations were made possible by the Simulation through Social Networking project at the Institute for Advanced Studies in Basic Sciences in 2007, which was re-launched in November 2010.
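The ballistic deposition rule is simple to reproduce; the following is a minimal (1+1)-dimensional sketch (the system size, deposit count, and recording cadence are illustrative choices, far below the paper's L ≅ 2¹² requirement for true asymptotic scaling):

```python
import numpy as np

def ballistic_deposition(L, n_deposits, rng):
    """(1+1)-dimensional ballistic deposition with periodic boundaries."""
    h = np.zeros(L, dtype=np.int64)
    widths = []
    for t in range(n_deposits):
        i = rng.integers(L)
        # the falling particle sticks at first contact: atop column i or beside a neighbour
        h[i] = max(h[(i - 1) % L], h[i] + 1, h[(i + 1) % L])
        if (t + 1) % L == 0:  # record the interface width once per deposited monolayer
            widths.append(h.std())
    return np.array(widths)

rng = np.random.default_rng(0)
w = ballistic_deposition(L=256, n_deposits=256 * 200, rng=rng)
```

Fitting log w against log t over the growth regime gives a rough estimate of the growth exponent β; recovering the asymptotic values quoted above requires much larger systems and ensemble averaging.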

  18. Alignment between galaxies and large-scale structure

    International Nuclear Information System (INIS)

    Faltenbacher, A.; Li Cheng; White, Simon D. M.; Jing, Yi-Peng; Mao Shude; Wang Jie

    2009-01-01

    Based on the Sloan Digital Sky Survey DR6 (SDSS) and the Millennium Simulation (MS), we investigate the alignment between galaxies and large-scale structure. For this purpose, we develop two new statistical tools, namely the alignment correlation function and the cos(2θ)-statistic. The former is a two-dimensional extension of the traditional two-point correlation function and the latter is related to the ellipticity correlation function used for cosmic shear measurements. Both are based on the cross-correlation between a sample of galaxies with orientations and a reference sample which represents the large-scale structure. We apply the new statistics to the SDSS galaxy catalog. The alignment correlation function reveals an overabundance of reference galaxies along the major axes of red, luminous (L ∼ L*) galaxies out to projected separations of 60 h⁻¹ Mpc. The signal increases with central galaxy luminosity. No alignment signal is detected for blue galaxies. The cos(2θ)-statistic yields very similar results. Starting from a MS semi-analytic galaxy catalog, we assign an orientation to each red, luminous, central galaxy, based on that of the central region of the host halo (with size similar to that of the stellar galaxy). As an alternative, we use the orientation of the host halo itself. We find a mean projected misalignment between a halo and its central region of ∼25 deg. The misalignment decreases slightly with increasing luminosity of the central galaxy. Using the orientations and luminosities of the semi-analytic galaxies, we repeat our alignment analysis on mock surveys of the MS. Agreement with the SDSS results is good if the central orientations are used. Predictions using the halo orientations as proxies for central galaxy orientations overestimate the observed alignment by more than a factor of 2. Finally, the large volume of the MS allows us to generate a two-dimensional map of the alignment correlation function, which shows the reference
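The cos(2θ)-statistic lends itself to a compact sketch. Assuming projected positions, one position angle per galaxy, and a reference sample (the function name and call signature below are illustrative, not the authors' code):

```python
import numpy as np

def cos2theta_statistic(pos, phis, ref_pos, rmax):
    """Mean cos(2*theta) over galaxy-reference pairs, where theta is the angle
    between a galaxy's major axis (position angle phi) and the projected
    direction to each reference galaxy within separation rmax."""
    vals = []
    for (x, y), phi in zip(pos, phis):
        d = ref_pos - np.array([x, y])
        r = np.hypot(d[:, 0], d[:, 1])
        sel = (r > 0) & (r < rmax)
        theta = np.arctan2(d[sel, 1], d[sel, 0]) - phi
        vals.append(np.cos(2.0 * theta))
    return np.concatenate(vals).mean()  # 0 for isotropy, positive for alignment

# toy check: references placed exactly along the major axis give the maximum signal
aligned = cos2theta_statistic(np.array([[0.0, 0.0]]), np.array([0.0]),
                              np.array([[1.0, 0.0], [-1.0, 0.0]]), rmax=5.0)
```

A positive value means reference galaxies preferentially lie along major axes, the sense of the overabundance reported for red, luminous centrals.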

  19. IoT European Large-Scale Pilots – Integration, Experimentation and Testing

    OpenAIRE

    Guillén, Sergio Gustavo; Sala, Pilar; Fico, Giuseppe; Arredondo, Maria Teresa; Cano, Alicia; Posada, Jorge; Gutierrez, Germán; Palau, Carlos; Votis, Konstantinos; Verdouw, Cor N.; Wolfert, Sjaak; Beers, George; Sundmaeker, Harald; Chatzikostas, Grigoris; Ziegler, Sébastien

    2017-01-01

    The IoT European Large-Scale Pilots Programme includes the innovation consortia that are collaborating to foster the deployment of IoT solutions in Europe through the integration of advanced IoT technologies across the value chain, demonstration of multiple IoT applications at scale and in a usage context, and as close as possible to operational conditions. The programme projects are targeted, goal-driven initiatives that propose IoT approaches to specific real-life industrial/societal challe...

  20. Power monitoring and control for large scale projects: SKA, a case study

    Science.gov (United States)

    Barbosa, Domingos; Barraca, João. Paulo; Maia, Dalmiro; Carvalho, Bruno; Vieira, Jorge; Swart, Paul; Le Roux, Gerhard; Natarajan, Swaminathan; van Ardenne, Arnold; Seca, Luis

    2016-07-01

    Large sensor-based science infrastructures for radio astronomy like the SKA will be among the most intensive data-driven projects in the world, facing very demanding computation, storage, management and, above all, power requirements. The geographically wide distribution of the SKA and its associated processing requirements, in the form of tailored High Performance Computing (HPC) facilities, require a greener approach to the Information and Communications Technologies (ICT) adopted for the data processing, to enable operational compliance with potentially strict power budgets. Reducing electricity costs and improving system-level power monitoring, generation and management are paramount to avoid future inefficiencies and higher costs and to enable fulfilment of the Key Science Cases. Here we outline major characteristics and innovation approaches to address power efficiency and long-term power sustainability for radio astronomy projects, focusing on green ICT for science and smart power monitoring and control.

  1. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...
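The decomposition described can be illustrated with a minimal price-based (dual decomposition) sketch: an aggregator broadcasts a price, each unit responds with its locally optimal power, and the price is adjusted on the remaining imbalance (quadratic local costs; all names and numbers are illustrative, not the paper's formulation):

```python
import numpy as np

def decentralized_balance(a, target, alpha=0.5, iters=300):
    """Dual decomposition: the aggregator broadcasts a price lam, each unit
    answers with its locally optimal power, and the price moves on the imbalance."""
    lam = 0.0
    for _ in range(iters):
        p = lam / (2.0 * a)                # unit i solves: min a_i * p**2 - lam * p
        lam += alpha * (target - p.sum())  # raise the price while short of target
    return lam / (2.0 * a), lam

a = np.array([1.0, 2.0, 4.0])  # local cost curvatures; cheaper units have smaller a_i
p, lam = decentralized_balance(a, target=10.0)
```

Each unit only needs the price, not the other units' data, which is what makes the scheme faster and distributable for large systems.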

  2. Cluster galaxy dynamics and the effects of large-scale environment

    Science.gov (United States)

    White, Martin; Cohn, J. D.; Smit, Renske

    2010-11-01

    Advances in observational capabilities have ushered in a new era of multi-wavelength, multi-physics probes of galaxy clusters and ambitious surveys are compiling large samples of cluster candidates selected in different ways. We use a high-resolution N-body simulation to study how the influence of large-scale structure in and around clusters causes correlated signals in different physical probes and discuss some implications this has for multi-physics probes of clusters (e.g. richness, lensing, Compton distortion and velocity dispersion). We pay particular attention to velocity dispersions, matching galaxies to subhaloes which are explicitly tracked in the simulation. We find that not only do haloes persist as subhaloes when they fall into a larger host, but groups of subhaloes retain their identity for long periods within larger host haloes. The highly anisotropic nature of infall into massive clusters, and their triaxiality, translates into an anisotropic velocity ellipsoid: line-of-sight galaxy velocity dispersions for any individual halo show large variance depending on viewing angle. The orientation of the velocity ellipsoid is correlated with the large-scale structure, and thus velocity outliers correlate with outliers caused by projection in other probes. We quantify this orientation uncertainty and give illustrative examples. Such a large variance suggests that velocity dispersion estimators will work better in an ensemble sense than for any individual cluster, which may inform strategies for obtaining redshifts of cluster members. We similarly find that the ability of substructure indicators to find kinematic substructures is highly viewing angle dependent. While groups of subhaloes which merge with a larger host halo can retain their identity for many Gyr, they are only sporadically picked up by substructure indicators. We discuss the effects of correlated scatter on scaling relations estimated through stacking, both analytically and in the simulations

  3. Theme II Joint Work Plan -2017 Collaboration and Knowledge Sharing on Large-scale Demonstration Projects

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Xiaoliang [World Resources Inst. (WRI), Washington, DC (United States); Stauffer, Philip H. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-09-25

    This effort is designed to expedite learnings from existing and planned large demonstration projects and their associated research through effective knowledge sharing among participants in the US and China.

  4. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  5. Macroweather Predictions and Climate Projections using Scaling and Historical Observations

    Science.gov (United States)

    Hébert, R.; Lovejoy, S.; Del Rio Amador, L.

    2017-12-01

    There are two fundamental time scales that are pertinent to decadal forecasts and multidecadal projections. The first is the lifetime of planetary-scale structures, about 10 days (equal to the deterministic predictability limit), and the second is, in the anthropocene, the scale at which the forced anthropogenic variability exceeds the internal variability (around 16-18 years). These two time scales define three regimes of variability: weather, macroweather and climate, characterized respectively by increasing, decreasing and then again increasing variability with scale. We discuss how macroweather temperature variability can be skilfully predicted to its theoretical stochastic predictability limits by exploiting its long-range memory with the Stochastic Seasonal and Interannual Prediction System (StocSIPS). At multi-decadal timescales, the temperature response to forcing is approximately linear, and this can be exploited to make projections with a Green's function, or Climate Response Function (CRF). To make the problem tractable, we exploit the temporal scaling symmetry and restrict our attention to global mean forcing and temperature response, using a scaling CRF characterized by the scaling exponent H and an inner scale of linearity τ. An aerosol linear scaling factor α and a nonlinear volcanic damping exponent ν were introduced to account for the large uncertainty in these forcings. We estimate the model and forcing parameters by Bayesian inference using historical data; these allow us to analytically calculate a median (and likely 66% range) for the transient climate response and for the equilibrium climate sensitivity: 1.6 K ([1.5, 1.8] K) and 2.4 K ([1.9, 3.4] K), respectively. Aerosol forcing typically has large uncertainty, and we find a modern (2005) forcing very likely range (90%) of [-1.0, -0.3] W m⁻² with median at -0.7 W m⁻². Projecting to 2100, we find that to keep the warming below 1.5 K, future emissions must undergo cuts similar to Representative
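The projection step can be illustrated with a toy convolution: a power-law response kernel with exponent H and inner scale τ, normalized and convolved with a forcing history (the kernel shape and names below are assumptions for illustration, not the authors' exact CRF):

```python
import numpy as np

def scaling_response(forcing, H, tau):
    """Convolve a forcing series with a normalized power-law (scaling)
    response kernel; H is the scaling exponent, tau the inner scale of linearity."""
    t = np.arange(1, len(forcing) + 1)
    G = (1.0 + t / tau) ** (H - 1.0)  # decays for H < 1; regular below tau
    G /= G.sum()                      # normalize to unit total response
    return np.convolve(forcing, G)[: len(forcing)]

# a step forcing produces a monotonic approach to the equilibrium response
response = scaling_response(np.ones(500), H=0.5, tau=2.0)
```

With the kernel normalized, the ratio of the response at a decadal horizon to the equilibrium response plays the role of the transient-to-equilibrium sensitivity ratio.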

  6. A Guide to Bundling Small-scale CDM Projects

    International Nuclear Information System (INIS)

    Mariyappan, J.; Bhardwaj, N.; De Coninck, H.; Van der Linden, N.

    2005-07-01

    Small-scale renewable energy and energy efficiency projects that fit the development needs of many developing countries, can potentially be supported via the Clean Development Mechanism (CDM), one of the Kyoto Protocol's flexible mechanisms for tackling climate change. However, there is concern that due to high transaction costs, as well as many existing barriers, very few investments will be made in small-scale projects, which are often the most suitable development option in countries such as India. In view of this, the 'bundling' together of appropriate small-scale projects on a regional basis has been proposed as a way in which funding can be leveraged from international sources and transaction costs reduced. IT Power, IT Power India and the Energy research Centre of the Netherlands (ECN) are carrying out a 2-year project to establish the capacity within India to enable individual small scale projects to be bundled as a single CDM project. Overall objectives are to develop the necessary institutional capabilities to formulate and implement small scale CDM projects in India; to provide a guide on how to bundle small scale projects under the CDM in developing countries; and to raise the awareness of the potential for investment in small scale energy projects which can gain funding through the CDM

  7. Combining offshore wind energy and large-scale mussel farming: background & technical, ecological and economic considerations

    NARCIS (Netherlands)

    Lagerveld, S.; Rockmann, C.; Scholl, M.M.; Bartelings, H.; Burg, van den S.W.K.; Jak, R.G.; Jansen, H.M.; Klijnstra, J.; Leopold, M.F.; Poelman, M.; Smith, S.R.; Stavenuiter, J.; Veenstra, F.A.; Veltman, C.; Westra, C.

    2014-01-01

    This Blauwdruk project report presents background and technical, ecological and economic considerations of the potential combination of offshore wind energy production and large-scale mussel farming in offshore areas in the North Sea. The main objective of the Blauwdruk project was to study the

  8. Climate change impact on streamflow in large-scale river basins: projections and their uncertainties sourced from GCMs and RCP scenarios

    Science.gov (United States)

    Nasonova, Olga N.; Gusev, Yeugeniy M.; Kovalev, Evgeny E.; Ayzel, Georgy V.

    2018-06-01

    Climate change impact on river runoff was investigated within the framework of the second phase of the Inter-Sectoral Impact Model Intercomparison Project (ISI-MIP2) using the physically based land surface model Soil Water - Atmosphere - Plants (SWAP) (developed at the Institute of Water Problems of the Russian Academy of Sciences) and meteorological projections (for 2006-2099) simulated by five General Circulation Models (GCMs) (GFDL-ESM2M, HadGEM2-ES, IPSL-CM5A-LR, MIROC-ESM-CHEM, and NorESM1-M) for each of four Representative Concentration Pathway (RCP) scenarios (RCP2.6, RCP4.5, RCP6.0, and RCP8.5). Eleven large-scale river basins were used in this study. First, SWAP was calibrated and validated against monthly values of measured river runoff making use of forcing data from the WATCH data set, and all GCM projections were bias-corrected to the WATCH data. Then, for each basin, 20 projections of possible changes in river runoff during the 21st century were simulated by SWAP. Analysis of the obtained hydrological projections allowed us to estimate the uncertainties resulting from the application of different GCMs and RCP scenarios. On average, the contribution of different GCMs to the uncertainty of the projected river runoff is nearly twice as large as the contribution of RCP scenarios. At the same time, the contribution of GCMs slightly decreases with time.
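Bias correction of GCM output to a reference data set is commonly done by quantile mapping; a minimal empirical version is sketched below (the function and variable names are illustrative and this is not necessarily the ISI-MIP2 procedure, which uses a trend-preserving variant):

```python
import numpy as np

def quantile_map(model_hist, obs, model_fut):
    """Map each future model value to the observed value at the same
    empirical quantile of the historical model distribution."""
    q = np.searchsorted(np.sort(model_hist), model_fut) / len(model_hist)
    return np.quantile(obs, np.clip(q, 0.0, 1.0))

obs = np.arange(100.0)   # toy 'observed' series
hist = obs + 2.0         # model with a constant +2 bias
corrected = quantile_map(hist, obs, hist)  # correction should shrink the bias
```

Applying the same mapping to future model output transfers the projected change onto the observed distribution.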

  9. The Expanded Very Large Array Project

    Science.gov (United States)

    Perley, Rick

    2012-10-01

    The Expanded Very Large Array Project, begun in 2001, is now completed -- on time, on budget, and ``on spec.'' The major goal of the project was to multiply the key observational capabilities of the Very Large Array -- the world's most powerful, versatile, and productive radio telescope -- by at least an order of magnitude through the implementation of modern technologies. The result of the project is an operationally new telescope -- renamed the Jansky Very Large Array -- which is capable of new science unimagined in 1980, when the original VLA was completed. In this talk, I will review the goals of the project and describe the new capabilities of the Jansky Very Large Array. Already, the array is providing fabulous new insights into the physics of the universe, and I will spend the majority of the time describing examples of new results.

  10. Technology for the large-scale production of multi-crystalline silicon solar cells and modules

    International Nuclear Information System (INIS)

    Weeber, A.W.; De Moor, H.H.C.

    1997-06-01

    In cooperation with Shell Solar Energy (formerly R and S Renewable Energy Systems) and the Research Institute for Materials of the Catholic University Nijmegen, the Netherlands Energy Research Foundation (ECN) plans to develop a competitive technology for the large-scale manufacturing of solar cells and solar modules on the basis of multi-crystalline silicon. The project will be carried out within the framework of the Economy, Ecology and Technology (EET) program of the Dutch ministry of Economic Affairs and the Dutch ministry of Education, Culture and Sciences. The aim of the EET project is to reduce the costs of a solar module by 50%, both by increasing the conversion efficiency and by developing cheap processes for large-scale production

  11. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes an evaluation method for the reliability (faultless function) of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article contains a comparative analysis of the factors that determine the reliability of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI circuits. The main part describes a proposed algorithm and program for analyzing the fault rate in LSI and VLSI circuits.

  12. Towards a negotiated solution to conflicts between large-scale and small-scale miners? The Acupan contract mining project in the Philippines

    NARCIS (Netherlands)

    Verbrugge, B.L.P.

    2017-01-01

    Recent decades have witnessed a simultaneous expansion of large-scale mining and artisanal and small-scale mining (ASM). In an increasing number of instances, ASM-operators and mining companies are competing or even clashing over access to mineral deposits. A growing number of experts are now

  13. Reflections on the political economy of large-scale technology using the example of German fast-breeder development

    International Nuclear Information System (INIS)

    Keck, O.

    1981-01-01

    Proceeding from Anglo-Saxon views which, from a liberal standpoint, criticize the German practice of research policy (state centres for large-scale research and state subsidies for industrial research and development) as inefficient, the author empirically tested these claims using the German fast-breeder project as an example. If the case of the German fast breeder can be generalized, this has consequences for research-policy practice and for other technologies. Supporters as well as opponents of large-scale technology today assume that almost any technology can be made commercially viable if enough money and people are applied. This is a power that owes its existence to the technical success of great projects in non-commercial fields. The German fast-breeder project confirms the view that the recipes for success of these non-commercial projects cannot be transferred to the field of commercial technology. The results of this study suggest that the practice and theory of technology policy can be misdirected if they are uncritically oriented towards the forms of state intervention hitherto used in large-scale technology. (orig./HSCH) [de

  14. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
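The Detrended Fluctuation Analysis used here is a standard algorithm; a compact first-order (linear-detrending) sketch, with illustrative window sizes:

```python
import numpy as np

def dfa(x, scales):
    """Detrended Fluctuation Analysis: fluctuation F(s) for each window size s;
    the slope of log F versus log s estimates the scaling exponent alpha."""
    y = np.cumsum(x - x.mean())  # integrated profile of the series
    F = []
    for s in scales:
        n = len(y) // s
        segs = y[: n * s].reshape(n, s)
        t = np.arange(s)
        f2 = []
        for seg in segs:
            coef = np.polyfit(t, seg, 1)  # linear detrending per window
            f2.append(np.mean((seg - np.polyval(coef, t)) ** 2))
        F.append(np.sqrt(np.mean(f2)))
    return np.array(F)

rng = np.random.default_rng(1)
scales = np.array([8, 16, 32, 64, 128])
F = dfa(rng.standard_normal(8192), scales)
alpha = np.polyfit(np.log(scales), np.log(F), 1)[0]  # should be near 0.5 for white noise
```

Local variations of alpha along a chromosome, rather than a single global fit, are what reveal the isochore-like patchiness discussed above.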

  15. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Based on experience in operating and developing a large-scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases

  16. IP over optical multicasting for large-scale video delivery

    Science.gov (United States)

    Jin, Yaohui; Hu, Weisheng; Sun, Weiqiang; Guo, Wei

    2007-11-01

    In IPTV systems, multicasting will play a crucial role in the delivery of high-quality video services, as it can significantly improve bandwidth efficiency. However, the scalability and signal quality of current IPTV can barely compete with existing broadcast digital TV systems, since it is difficult to implement large-scale multicasting with end-to-end guaranteed quality of service (QoS) in a packet-switched IP network. The China 3TNet project aimed to build a high-performance broadband trial network to support large-scale concurrent streaming media and interactive multimedia services. The innovative idea of 3TNet is that an automatically switched optical network (ASON) with the capability of dynamic point-to-multipoint (P2MP) connections replaces the conventional IP multicasting network in the transport core, while the edge remains an IP multicasting network. In this paper, we will introduce the network architecture and discuss challenges in such IP over optical multicasting for video delivery.

  17. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    Science.gov (United States)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  18. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  19. Macro optical projection tomography for large scale 3D imaging of plant structures and gene activity.

    Science.gov (United States)

    Lee, Karen J I; Calder, Grant M; Hindle, Christopher R; Newman, Jacob L; Robinson, Simon N; Avondo, Jerome J H Y; Coen, Enrico S

    2017-01-01

    Optical projection tomography (OPT) is a well-established method for visualising gene activity in plants and animals. However, a limitation of conventional OPT is that the specimen upper size limit precludes its application to larger structures. To address this problem we constructed a macro version called Macro OPT (M-OPT). We apply M-OPT to 3D live imaging of gene activity in growing whole plants and to visualise structural morphology in large optically cleared plant and insect specimens up to 60 mm tall and 45 mm deep. We also show how M-OPT can be used to image gene expression domains in 3D within fixed tissue and to visualise gene activity in 3D in clones of growing young whole Arabidopsis plants. A further application of M-OPT is to visualise plant-insect interactions. Thus M-OPT provides an effective 3D imaging platform that allows the study of gene activity, internal plant structures and plant-insect interactions at a macroscopic scale. © The Author 2016. Published by Oxford University Press on behalf of the Society for Experimental Biology.

  20. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the manager carries a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes in a large-scale cluster system easily fall into disorder; with thousands of nodes housed in large rooms, managers can easily confuse machines. How can accurate management be carried out effectively in a large-scale cluster system? This article introduces ELFms in the large-scale cluster system and proposes a way to realize automatic management of such systems. (authors)

  1. MANAGING LARGE INVESTMENT PROJECTS IN GORJ

    Directory of Open Access Journals (Sweden)

    CÎRNU DORU

    2016-12-01

    Full Text Available Currently, the concept of project management is considered to be the best concept for the efficient management of a project, so it is used all over the world, and most recently in our country. This paper presents the project management concept and the organization of project management for capital projects. The concept retains all the general characteristics of project management, but is adapted to the conditions of large investment projects. The concept also includes the project team and the project manager, the person authorized and responsible for achieving the objectives planned in the project. To manage a project effectively, it is necessary to ensure a proper composition of the project team: a team of people who, in collaboration with the project manager, work directly on managing the project. It is a particularly good method of achieving the planned objectives of projects, meaning a project with a certain required level of performance, delivered in a planned time and at planned costs.

  2. Inference of functional properties from large-scale analysis of enzyme superfamilies.

    Science.gov (United States)

    Brown, Shoshana D; Babbitt, Patricia C

    2012-01-02

    As increasingly large amounts of data from genome and other sequencing projects become available, new approaches are needed to determine the functions of the proteins these genes encode. We show how large-scale computational analysis can help to address this challenge by linking functional information to sequence and structural similarities using protein similarity networks. Network analyses using three functionally diverse enzyme superfamilies illustrate the use of these approaches for facile updating and comparison of available structures for a large superfamily, for creation of functional hypotheses for metagenomic sequences, and to summarize the limits of our functional knowledge about even well studied superfamilies.
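The record above describes linking functional information to sequence similarity through protein similarity networks. A minimal sketch of the underlying idea, using made-up sequence identifiers, similarity scores, and an arbitrary threshold (not the authors' actual pipeline): nodes are sequences, edges connect pairs whose similarity exceeds the threshold, and connected components serve as putative functional clusters.

```python
# Sketch of a protein similarity network: threshold pairwise similarity
# scores and group sequences into connected components (union-find).
# All identifiers, scores, and the threshold are illustrative only.

def similarity_clusters(scores, threshold):
    """scores: dict mapping (seq_a, seq_b) -> similarity score.
    Returns a list of clusters (sets of sequence ids)."""
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]   # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    for (a, b), s in scores.items():
        find(a); find(b)                    # register both nodes
        if s >= threshold:
            union(a, b)

    clusters = {}
    for node in parent:
        clusters.setdefault(find(node), set()).add(node)
    return list(clusters.values())
```

For example, with scores {("e1","e2"): 0.9, ("e2","e3"): 0.85, ("e3","e4"): 0.2} and threshold 0.8, the sequences split into the clusters {e1, e2, e3} and {e4}.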

  3. FALSIRE Phase II. CSNI project for Fracture Analyses of Large-Scale International Reference Experiments (Phase II). Comparison report

    International Nuclear Information System (INIS)

    Sievers, J.; Schulz, H.; Bass, R.; Pugh, C.; Keeney, J.

    1996-11-01

    A summary of Phase II of the Project for Fracture Analysis of Large-Scale International Reference Experiments (FALSIRE) is presented. A FALSIRE II Workshop focused on analyses of reference fracture experiments. More than 30 participants representing 22 organizations from 12 countries took part in the workshop. Final results for 45 analyses of the reference experiments were received from the participating analysts. For each experiment, analysis results provided estimates of variables that include temperature, crack-mouth-opening displacement, stress, strain, and applied K and J values. The data were sent electronically to the Organizing Committee, who assembled the results into a comparative data base using a special-purpose computer program. A comparative assessment and discussion of the analysis results are presented in the report. Generally, structural responses of the test specimens were predicted with tolerable scatter bands. (orig./DG)

  4. Large-scale demonstration of D ampersand D technologies

    International Nuclear Information System (INIS)

    Bhattacharyya, S.K.; Black, D.B.; Rose, R.W.

    1997-01-01

    It is becoming increasingly evident that new technologies will need to be utilized for decontamination and decommissioning (D&D) activities in order to assure safe and cost-effective operations. The magnitude of the international D&D problem is sufficiently large in anticipated cost (hundreds of billions of dollars) and in elapsed time (decades) that the utilization of new technologies should lead to substantial improvements in cost and safety performance. Adoption of new technologies in the generally highly contaminated D&D environments requires assurances that the technology will perform as advertised. Such assurances can be obtained from demonstrations of the technology in environments that are similar to the actual environments without being quite as contaminated and hazardous. The Large Scale Demonstration Project (LSDP) concept was designed to provide such a function. The first LSDP funded by the U.S. Department of Energy's Environmental Management Office (EM) was on the Chicago Pile 5 (CP-5) Reactor at Argonne National Laboratory. The project, conducted by a Strategic Alliance for Environmental Restoration, has completed demonstrations of 10 D&D technologies and is in the process of comparing the performance to baseline technologies. At the conclusion of the project, a catalog of performance comparisons of these technologies will be developed that will be suitable for use by future D&D planners.

  5. Large Scale Reduction of Graphite Oxide Project

    Science.gov (United States)

    Calle, Carlos; Mackey, Paul; Falker, John; Zeitlin, Nancy

    2015-01-01

    This project seeks to develop an optical method to reduce graphite oxide into graphene efficiently and in larger formats than currently available. Current reduction methods are expensive, time-consuming or restricted to small, limited formats. Graphene has potential uses in ultracapacitors, energy storage, solar cells, flexible and light-weight circuits, touch screens, and chemical sensors. In addition, graphite oxide is a sustainable material that can be produced from any form of carbon, making this method environmentally friendly and adaptable for in-situ reduction.

  6. Studies of Sub-Synchronous Oscillations in Large-Scale Wind Farm Integrated System

    Science.gov (United States)

    Yue, Liu; Hang, Mend

    2018-01-01

    With the rapid development and construction of large-scale wind farms and their grid-connected operation, series-compensated AC transmission of wind power is gradually becoming the main way to improve wind power availability and grid stability, but the integration of wind farms changes the SSO (sub-synchronous oscillation) damping characteristics of the synchronous generator system. Regarding the SSO problems caused by the integration of large-scale wind farms, this paper focuses on doubly fed induction generator (DFIG) based wind farms and summarizes the SSO mechanisms in large-scale wind power integrated systems with series compensation, which can be classified into three types: sub-synchronous control interaction (SSCI), sub-synchronous torsional interaction (SSTI), and sub-synchronous resonance (SSR). Then, SSO modelling and analysis methods are categorized and compared by their applicable areas. Furthermore, this paper summarizes the suppression measures of actual SSO projects based on different control objectives. Finally, the research prospects in this field are explored.

  7. The relationship between large-scale and convective states in the tropics - Towards an improved representation of convection in large-scale models

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, Christian [Monash Univ., Melbourne, VIC (Australia)

    2015-02-26

    This report summarises an investigation into the relationship of tropical thunderstorms to the atmospheric conditions they are embedded in. The study is based on the use of radar observations at the Atmospheric Radiation Measurement site in Darwin run under the auspices of the DOE Atmospheric Systems Research program. Linking the larger scales of the atmosphere with the smaller scales of thunderstorms is crucial for the development of the representation of thunderstorms in weather and climate models, which is carried out by a process termed parametrisation. Through the analysis of radar and wind profiler observations the project made several fundamental discoveries about tropical storms and quantified the relationship of the occurrence and intensity of these storms to the large-scale atmosphere. We were able to show that the rainfall averaged over an area the size of a typical climate model grid-box is largely controlled by the number of storms in the area, and less so by the storm intensity. This allows us to completely rethink the way we represent such storms in climate models. We also found that storms occur in three distinct categories based on their depth and that the transition between these categories is strongly related to the larger scale dynamical features of the atmosphere more so than its thermodynamic state. Finally, we used our observational findings to test and refine a new approach to cumulus parametrisation which relies on the stochastic modelling of the area covered by different convective cloud types.

  8. Utility-Scale Solar 2013: An empirical analysis of project cost, performance, and pricing trends in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Weaver, Samantha [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2014-09-17

    Other than the SEGS I-IX parabolic trough projects built in the 1980s, virtually no large-scale or "utility-scale" solar projects (defined here to include any ground-mounted photovoltaic ("PV"), concentrating photovoltaic ("CPV"), or concentrating solar power ("CSP" or solar thermal) project larger than 5 MW-AC) existed in the United States prior to 2007.

  9. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issues…

  10. The genetic etiology of Tourette Syndrome: Large-scale collaborative efforts on the precipice of discovery

    Directory of Open Access Journals (Sweden)

    Marianthi Georgitsi

    2016-08-01

    Full Text Available Gilles de la Tourette Syndrome (TS is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive; however, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us on the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG has focused on large families, parent-proband trios and cases for large case-control designs such as genomewide association studies (GWAS, copy number variation (CNV scans and exome/genome sequencing. TIC Genetics targets rare, large effect size mutations in simplex trios and multigenerational families. The European Multicentre Tics in Children Study (EMTICS seeks to elucidate gene-environment interactions including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light into the complex pathophysiology of this disorder.

  11. The Genetic Etiology of Tourette Syndrome: Large-Scale Collaborative Efforts on the Precipice of Discovery

    Science.gov (United States)

    Georgitsi, Marianthi; Willsey, A. Jeremy; Mathews, Carol A.; State, Matthew; Scharf, Jeremiah M.; Paschou, Peristera

    2016-01-01

    Gilles de la Tourette Syndrome (TS) is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive. However, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us on the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG) has focused on large families, parent-proband trios and cases for large case-control designs such as genomewide association studies (GWAS), copy number variation (CNV) scans, and exome/genome sequencing. TIC Genetics targets rare, large effect size mutations in simplex trios, and multigenerational families. The European Multicentre Tics in Children Study (EMTICS) seeks to elucidate gene-environment interactions including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light into the complex pathophysiology of this disorder. PMID:27536211

  12. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  13. Development of Large-Scale Spacecraft Fire Safety Experiments

    DEFF Research Database (Denmark)

    Ruff, Gary A.; Urban, David L.; Fernandez-Pello, A. Carlos

    2013-01-01

    exploration missions outside of low-earth orbit and accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low...... of the spacecraft fire safety risk. The activity of this project is supported by an international topical team of fire experts from other space agencies who conduct research that is integrated into the overall experiment design. The large-scale space flight experiment will be conducted in an Orbital Sciences...

  14. Large scale gas chromatographic demonstration system for hydrogen isotope separation

    International Nuclear Information System (INIS)

    Cheh, C.H.

    1988-01-01

    A large scale demonstration system was designed for a throughput of 3 mol/day of an equimolar mixture of H, D, and T. The demonstration system was assembled and an experimental program carried out. This project was funded by Kernforschungszentrum Karlsruhe, Canadian Fusion Fuel Technology Projects and Ontario Hydro Research Division. Several major design innovations were successfully implemented in the demonstration system and are discussed in detail. Many experiments were carried out in the demonstration system to study its performance in separating hydrogen isotopes at high throughput. Various temperature programming schemes were tested, heart-cutting operation was evaluated, and very large (up to 138 NL/injection) samples were separated in the system. The results of the experiments showed that the specially designed column performed well as a chromatographic column and good separation could be achieved even when a 138 NL sample was injected.

  15. Overview of a large-scale bioremediation soil treatment project

    International Nuclear Information System (INIS)

    Stechmann, R.

    1991-01-01

    How long does it take to remediate 290,000 yd³ of impacted soil containing an average total petroleum hydrocarbon concentration of 3,000 ppm? Approximately 15 months from start to end of treatment using bioremediation. Mittelhauser was retained by the seller of the property (a major oil company) as technical manager to supervise remediation of a 45-ac parcel in the Los Angeles basin. Mittelhauser completed site characterization, negotiated clean-up levels with the regulatory agencies, and prepared the remedial action plan (RAP) with which the treatment approach was approved and permitted. The RAP outlined the excavation, treatment, and recompaction procedures for the impacted soil resulting from leakage of bunker fuel oil from a large surface impoundment. The impacted soil was treated on site in unlined Land Treatment Units (LTUs) in 18-in.-thick lifts. Due to space constraints, multiple lifts were treated at the site. The native microbial population was cultivated using soil stabilization mixing equipment with the application of water and agricultural-grade fertilizers. Costs on this multimillion-dollar project are broken down as follows: general contractor cost (47%), bioremediation subcontractor cost (35%), site characterization (10%), technical management (7%), analytical services (3%), RAP preparation and permitting (1%), and civil engineering subcontractor cost (1%). Start-up of field work could have been severely impacted by the existence of red fox habitation. The foxes were successfully relocated prior to the start of field work.

  16. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, in addition to the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has been investigated.
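The amplitude modulation discussed in this record is commonly quantified by correlating the large-scale part of a velocity signal with the envelope of its small-scale residual. A minimal single-point sketch (a generic illustration of that metric, not the authors' code; the cutoff frequency and test signal are arbitrary choices): split the signal with a Fourier low-pass filter, take the envelope of the residual via the analytic signal, low-pass the envelope, and correlate.

```python
import numpy as np

def envelope(x):
    """Envelope via an FFT-based Hilbert transform
    (magnitude of the analytic signal)."""
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

def amplitude_modulation_corr(u, fs, cutoff):
    """Correlation coefficient between the large-scale part of signal u
    (sampled at fs) and the low-passed envelope of its small-scale
    residual; positive values indicate large-scale amplitude modulation."""
    n = len(u)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    keep = freqs <= cutoff
    u_large = np.fft.irfft(np.where(keep, np.fft.rfft(u), 0.0), n)
    u_small = u - u_large                     # small-scale residual
    env_large = np.fft.irfft(np.where(keep, np.fft.rfft(envelope(u_small)), 0.0), n)
    a = u_large - u_large.mean()
    b = env_large - env_large.mean()
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
```

On a synthetic signal whose high-frequency carrier is modulated in phase with a low-frequency wave, this coefficient approaches +1; with anti-phase modulation it approaches -1.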

  17. Small-scale bioenergy projects in rural China: Lessons to be learnt

    International Nuclear Information System (INIS)

    Han Jingyi; Mol, Arthur P.J.; Lu Yonglong; Zhang Lei

    2008-01-01

    A large number of small-scale bioenergy projects were carried out in China's rural areas in light of its national renewable energy policies. These projects applied pyrolysis gasification as the main technology, which converts biomass waste into biogas at low cost. This paper selects seven bioenergy projects in Shandong Province as a case and assesses these projects in terms of economy, technological performance and effectiveness. Results show that these projects have not achieved satisfactory performance after 10 years of experience, and many have been discontinued. This failure is attributed to a complex of shortcomings in institutional structure, technical level, financial support and social factors. For a more successful future development of bioenergy in rural areas, China should reform its institutional structure, establish a renewable energy market and enhance the technological level of bioenergy projects.

  18. Implementing Large Projects in Software Engineering Courses

    Science.gov (United States)

    Coppit, David

    2006-01-01

    In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that…

  19. Restoring Eelgrass (Zostera marina) from Seed: A Comparison of Planting Methods for Large-Scale Projects

    National Research Council Canada - National Science Library

    Orth, Robert; Marion, Scott; Granger, Steven; Traber, Michael

    2008-01-01

    Eelgrass (Zostera marina) seeds are being used in a variety of both small- and large-scale restoration activities and have been successfully used to initiate recovery of eelgrass in the Virginia seaside coastal lagoons...

  20. Results of research and development in large-scale research centers as an innovation source for firms

    International Nuclear Information System (INIS)

    Theenhaus, R.

    1978-01-01

    The twelve large-scale research centres of the Federal Republic of Germany, with their 16,000 employees, represent a considerable scientific and technical potential. Cooperation with industry on large-scale projects has already become very close, and the associated know-how flow and contributions to innovation are largely established. The first successful steps towards utilizing the results of basic research, of spin-off, and of research and development, as well as the provision of services, are encouraging. However, there are a number of detailed problems that can only be solved jointly by all parties concerned, in particular between industry and the large-scale research centres. (orig./RW) [de

  1. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon that galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  2. Inference of Functional Properties from Large-scale Analysis of Enzyme Superfamilies*

    Science.gov (United States)

    Brown, Shoshana D.; Babbitt, Patricia C.

    2012-01-01

    As increasingly large amounts of data from genome and other sequencing projects become available, new approaches are needed to determine the functions of the proteins these genes encode. We show how large-scale computational analysis can help to address this challenge by linking functional information to sequence and structural similarities using protein similarity networks. Network analyses using three functionally diverse enzyme superfamilies illustrate the use of these approaches for facile updating and comparison of available structures for a large superfamily, for creation of functional hypotheses for metagenomic sequences, and to summarize the limits of our functional knowledge about even well studied superfamilies. PMID:22069325

  3. Locale-Specific Categorization of IT Projects for Proper Project Management

    Directory of Open Access Journals (Sweden)

    Muhammad Nasir

    2015-12-01

    Full Text Available In this paper we aim to show not only that project management techniques are important for small-scale projects, but that following proper project management techniques is one of their most important requirements. However, slight variations arise in implementing project management techniques in small projects as compared with larger projects. Therefore, proper categorization of projects holds the key to success in many situations. Our paper shows that the categorization of a project as small-scale or large-scale should always be locale-specific. In this regard, we develop criteria and evaluate how projects are categorized as small-scale and large-scale in different parts of the world, with a focus on IT projects. This research concludes that some methods cannot be used for projects of different scales.

  4. Solving Large Scale Nonlinear Eigenvalue Problem in Next-Generation Accelerator Design

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Ben-Shan; Bai, Zhaojun; /UC, Davis; Lee, Lie-Quan; Ko, Kwok; /SLAC

    2006-09-28

    A number of numerical methods, including inverse iteration, the method of successive linear problems and the nonlinear Arnoldi algorithm, are studied in this paper to solve a large scale nonlinear eigenvalue problem arising from the finite element analysis of resonant frequencies and external Q_e values of a waveguide-loaded cavity in the next-generation accelerator design. The authors present a nonlinear Rayleigh-Ritz iterative projection algorithm, NRRIT for short, and demonstrate that it is the most promising approach for a model-scale cavity design. The NRRIT algorithm is an extension of the nonlinear Arnoldi algorithm due to Voss. Computational challenges of solving such a nonlinear eigenvalue problem for a full-scale cavity design are outlined.
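The method of successive linear problems mentioned in this record repeatedly linearizes T(λ) about the current estimate and solves a linear eigenproblem for the correction. A small dense-matrix sketch of that idea (a generic textbook illustration, not the NRRIT code from the record, and far from the iterative projection methods needed at full scale):

```python
import numpy as np

def mslp(T, Tprime, lam0, tol=1e-10, maxit=50):
    """Method of successive linear problems for T(lam) x = 0.

    Each step linearizes T(lam + mu) ~ T(lam) + mu * T'(lam) and solves
    the linearized eigenproblem T(lam) x = mu * (-T'(lam)) x, keeping
    the smallest correction mu until it falls below tol."""
    lam = complex(lam0)
    for _ in range(maxit):
        # Eigenvalues of (-T')^{-1} T(lam) are the candidate corrections mu.
        mus = np.linalg.eigvals(np.linalg.solve(-Tprime(lam), T(lam)))
        mu = mus[np.argmin(np.abs(mus))]
        lam += mu
        if abs(mu) < tol:
            break
    return lam
```

For the linear special case T(λ) = A − λI (so T'(λ) = −I), a single step jumps to the eigenvalue of A nearest the initial guess, which makes the iteration easy to sanity-check before applying it to a genuinely nonlinear T(λ).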

  5. Developing A Large-Scale, Collaborative, Productive Geoscience Education Network

    Science.gov (United States)

    Manduca, C. A.; Bralower, T. J.; Egger, A. E.; Fox, S.; Ledley, T. S.; Macdonald, H.; Mcconnell, D. A.; Mogk, D. W.; Tewksbury, B. J.

    2012-12-01

    Over the past 15 years, the geoscience education community has grown substantially and developed broad and deep capacity for collaboration and dissemination of ideas. While this community is best viewed as emergent from complex interactions among changing educational needs and opportunities, we highlight the role of several large projects in the development of a network within this community. In the 1990s, three NSF projects came together to build a robust web infrastructure to support the production and dissemination of on-line resources: On The Cutting Edge (OTCE), Earth Exploration Toolbook, and Starting Point: Teaching Introductory Geoscience. Along with the contemporaneous Digital Library for Earth System Education, these projects engaged geoscience educators nationwide in exploring professional development experiences that produced lasting on-line resources, collaborative authoring of resources, and models for web-based support for geoscience teaching. As a result, a culture developed in the 2000s in which geoscience educators anticipated that resources for geoscience teaching would be shared broadly and that collaborative authoring would be productive and engaging. By this time, a diverse set of examples demonstrated the power of the web infrastructure in supporting collaboration, dissemination and professional development. Building on this foundation, more recent work has expanded both the size of the network and the scope of its work. Many large research projects initiated collaborations to disseminate resources supporting educational use of their data. Research results from the rapidly expanding geoscience education research community were integrated into the Pedagogies in Action website and OTCE. Projects engaged faculty across the nation in large-scale data collection and educational research. The Climate Literacy and Energy Awareness Network and OTCE engaged community members in reviewing the expanding body of on-line resources. Building Strong

  6. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  7. Large-Scale Participation: A Case Study of a Participatory Approach to Developing a New Public Library

    DEFF Research Database (Denmark)

    Dalsgaard, Peter; Eriksson, Eva

    2013-01-01

    In this paper, we present a case study of a participatory project that focuses on interaction in large-scale design, namely, the development of the new Urban Mediaspace Aarhus. This project, which has been under way for ten years, embodies a series of issues that arise when participatory design...

  8. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  9. Biopolitics problems of large-scale hydraulic engineering construction

    International Nuclear Information System (INIS)

    Romanenko, V.D.

    1997-01-01

    The 20th century, which will enter history as a century of large-scale hydraulic engineering construction, is coming to a close. On the European continent alone, 517 large reservoirs (detaining more than 1000 million km3 of water) were constructed between 1901 and 1985. In the Danube basin, a great many reservoirs, power stations, navigation sluices and other hydraulic engineering structures have been constructed; among them, more than 40 especially large structures are located along the main bed of the river. A number of hydro-complexes, such as the Dnieper-Danube and Gabcikovo, Danube-Oder-Labe (project), Danube-Tissa, Danube-Adriatic Sea (project), Danube-Aegean Sea and Danube-Black Sea schemes, have entered operation or are at the design stage. Hydraulic engineering construction has been especially intensive in Ukraine, where several large reservoirs on the Dnieper and Yuzhny Bug have profoundly changed the hydrological regime of those rivers. Summarizing the results of river-system regulation in Ukraine: more than 27 thousand ponds (3 km3 per year), 1098 reservoirs with a total volume of 55 km3, and 11 large channels with a total length of more than 2000 km and a capacity of 1000 m3/s have been created. Hydraulic engineering construction has played an important role in the development of industry and agriculture, in the water supply of cities and settlements, in environmental effects, and in the maintenance of safe navigation on the Danube, Dnieper and other rivers. The final part of the paper discusses the environmental changes in the Aral Sea region of Central Asia that followed construction of the Karakum Channel.

  10. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
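The scale-separation procedure described in this abstract (low-pass filtering to isolate the large scales, then correlating a small-scale activity measure against the large-scale signal) can be illustrated on a synthetic 1-D signal. This is a sketch only; the paper's analysis uses 3-D DNS velocity fields and vorticity, not the toy data below.

```python
import numpy as np

# Sketch of the scale-separation analysis: low-pass filter a 1-D
# "velocity" signal with a moving average to get the large scales,
# take the residual as the small scales, and correlate a local
# small-scale activity measure with the large-scale signal.
# The signal is synthetic and purely illustrative.

rng = np.random.default_rng(0)
n = 4096
x = np.linspace(0, 8 * np.pi, n)
u = np.sin(x) + 0.3 * rng.standard_normal(n)    # large-scale wave + noise

def moving_average(sig, width):
    kernel = np.ones(width) / width
    return np.convolve(sig, kernel, mode="same")

u_large = moving_average(u, 64)                  # large-scale part
u_small = u - u_large                            # small-scale residual
activity = moving_average(u_small**2, 64)        # local small-scale "r.m.s."

r = np.corrcoef(u_large, activity)[0, 1]
print(f"correlation(large-scale u, small-scale activity) = {r:+.3f}")
```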

  11. Utility-Scale Solar 2014. An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States); Seel, Joachim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)

    2015-09-01

    Other than the nine Solar Energy Generation Systems (“SEGS”) parabolic trough projects built in the 1980s, virtually no large-scale or “utility-scale” solar projects – defined here to include any ground-mounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar thermal power (“CSP”) project larger than 5 MWAC – existed in the United States prior to 2007. By 2012 – just five years later – utility-scale had become the largest sector of the overall PV market in the United States, a distinction that was repeated in both 2013 and 2014 and that is expected to continue for at least the next few years. Over this same short period, CSP also experienced a bit of a renaissance in the United States, with a number of large new parabolic trough and power tower systems – some including thermal storage – achieving commercial operation. With this critical mass of new utility-scale projects now online and in some cases having operated for a number of years (generating not only electricity, but also empirical data that can be mined), the rapidly growing utility-scale sector is ripe for analysis. This report, the third edition in an ongoing annual series, meets this need through in-depth, annually updated, data-driven analysis of not just installed project costs or prices – i.e., the traditional realm of solar economics analyses – but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects in the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are presented where appropriate.
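One of the report's core metrics, the net capacity factor, is simple to compute: energy actually delivered over a period divided by the energy the plant would deliver at its full AC rating. The plant size and generation figures below are invented for illustration.

```python
# Net AC capacity factor of a utility-scale PV project, as used in
# analyses like the one described above. The plant size and generation
# figures below are hypothetical, not data from the report.

def capacity_factor(energy_mwh, capacity_mw_ac, hours):
    """Energy delivered divided by energy at continuous full AC output."""
    return energy_mwh / (capacity_mw_ac * hours)

annual_mwh = 61_000          # hypothetical year of metered generation
plant_mw_ac = 25.0           # hypothetical nameplate AC capacity
cf = capacity_factor(annual_mwh, plant_mw_ac, hours=8760)
print(f"net AC capacity factor: {cf:.1%}")
```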

  12. Assessment of clean development mechanism potential of large-scale energy efficiency measures in heavy industries

    International Nuclear Information System (INIS)

    Hayashi, Daisuke; Krey, Matthias

    2007-01-01

    This paper assesses clean development mechanism (CDM) potential of large-scale energy efficiency measures in selected heavy industries (iron and steel, cement, aluminium, pulp and paper, and ammonia) taking India and Brazil as examples of CDM project host countries. We have chosen two criteria for identification of the CDM potential of each energy efficiency measure: (i) emission reductions volume (in CO2e) that can be expected from the measure and (ii) likelihood of the measure passing the additionality test of the CDM Executive Board (EB) when submitted as a proposed CDM project activity. The paper shows that the CDM potential of large-scale energy efficiency measures strongly depends on the project-specific and country-specific context. In particular, technologies for the iron and steel industry (coke dry quenching (CDQ), top pressure recovery turbine (TRT), and basic oxygen furnace (BOF) gas recovery), the aluminium industry (point feeder prebake (PFPB) smelter), and the pulp and paper industry (continuous digester technology) offer promising CDM potential.
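Criterion (i), the expected emission-reduction volume, is often first approximated as energy saved multiplied by a grid emission factor. The figures below are invented, and real CDM baselines follow approved methodologies rather than this shortcut.

```python
# First-cut estimate of emission-reduction volume (criterion (i) above)
# for an energy-efficiency measure: electricity saved times the grid
# emission factor. All figures are invented for illustration; actual
# CDM baselines follow approved methodologies, not this shortcut.

def annual_reductions_tco2e(mwh_saved_per_year, grid_factor_tco2_per_mwh):
    return mwh_saved_per_year * grid_factor_tco2_per_mwh

saved = 120_000          # hypothetical MWh/yr saved by a TRT installation
factor = 0.9             # hypothetical grid emission factor, tCO2/MWh
print(f"{annual_reductions_tco2e(saved, factor):,.0f} tCO2e per year")
```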

  13. Projecting water resources changes in potential large-scale agricultural investment areas of the Kafue River Basin in Zambia

    Science.gov (United States)

    Kim, Y.; Trainor, A. M.; Baker, T. J.

    2017-12-01

    Climate change impacts regional water availability through the spatial and temporal redistribution of available water resources. This study focuses on understanding possible response of water resources to climate change in regions where potentials for large-scale agricultural investments are planned in the upper and middle Kafue River Basin in Zambia. We used historical and projected precipitation and temperature to assess changes in water yield, using the Soil and Water Assessment Tool (SWAT) hydrological model. Some of the Coupled Model Intercomparison Project Phase 5 (CMIP5) climate model outputs for the Representative Concentration Pathway (RCP) 4.5 and 8.5 scenarios project a temperature warming range from 1.8 - 5.7 °C over the region from 2020 to 2095. Precipitation projection patterns vary monthly but tend toward drier dry seasons with a slight increase in precipitation during the rainy season as compared to the historical time series. The best five calibrated parameter sets generated for the historical record (1965 - 2005) were applied for two future periods, 2020 - 2060 and 2055 - 2095, to project water yield change. Simulations projected that the 90th percentile water yield would be exceeded across most of the study area by up to 800% under the medium-low (RCP4.5) CO2 emission scenario, whereas the high (RCP8.5) CO2 emission scenario resulted in a more spatially varied pattern mixed with increasing (up to 500%) and decreasing (up to -54%) trends. The 10th percentile water yield indicated spatially varied pattern across the basin, increasing by as much as 500% though decreasing in some areas by 66%, with the greatest decreases during the dry season under RCP8.5. Overall, available water resources in the study area are projected to trend toward increased floods (i.e. water yields far exceeding 90th percentile) as well as increasing drought (i.e. water yield far below 10th percentile) vulnerability. 
Because surface water is a primary source for agriculture
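The percentile-based change diagnostic used in this study can be sketched as follows; the series below are synthetic stand-ins for SWAT output, not Kafue Basin results.

```python
import numpy as np

# Sketch of the percentile-change diagnostic described above: compare
# the 10th and 90th percentile annual water yield between a historical
# and a projected series, expressed as a percent change. The series are
# synthetic stand-ins for SWAT output, not Kafue Basin results.

rng = np.random.default_rng(42)
historical = rng.gamma(shape=2.0, scale=150.0, size=41)   # 1965-2005, mm/yr
projected = historical * rng.uniform(0.5, 3.0, size=41)   # altered regime

def percentile_change(hist, proj, q):
    h, p = np.percentile(hist, q), np.percentile(proj, q)
    return 100.0 * (p - h) / h

for q in (10, 90):
    change = percentile_change(historical, projected, q)
    print(f"{q}th percentile water yield change: {change:+.0f}%")
```

Large positive changes at the 90th percentile signal growing flood vulnerability, while large negative changes at the 10th percentile signal growing drought vulnerability, mirroring the study's framing.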

  14. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package "ATLAS" has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations, routines for solving general eigenvalue problems, and utility routines. The subroutines are useful in large-scale plasma-fluid simulations. (auth.)

  15. Large Scale Beam-beam Simulations for the CERN LHC using Distributed Computing

    CERN Document Server

    Herr, Werner; McIntosh, E; Schmidt, F

    2006-01-01

    We report on a large scale simulation of beam-beam effects for the CERN Large Hadron Collider (LHC). The stability of particles which experience head-on and long-range beam-beam effects was investigated for different optical configurations and machine imperfections. To cover the interesting parameter space required computing resources not available at CERN. The necessary resources were available in the LHC@home project, based on the BOINC platform. At present, this project makes more than 60000 hosts available for distributed computing. We shall discuss our experience using this system during a simulation campaign of more than six months and describe the tools and procedures necessary to ensure consistent results. The results from this extended study are presented and future plans are discussed.

  16. Moving Beyond Indignation: Stakeholder Tactics, Legal Tools and Community Benefits in Large-Scale Redevelopment Projects

    Directory of Open Access Journals (Sweden)

    Lisa Bornstein

    2015-03-01

    Full Text Available Government and accompanying business interests often favour large-scale urban projects to promote urban growth, attract revenues, and place the city on the world stage. Such projects are primarily oriented towards consumption and spectacle, serving regional, if not global, clientele. Negative impacts – from traffic to displacement – are felt most heavily in the immediately adjacent areas, and developments often contribute to increases in socio-spatial polarization. This paper examines two redevelopment projects, one in South San Francisco, one in Montréal, to assess the tactics and legal tools employed by municipal authorities and local organisations to harness development for social and environmental ends. Associated legal tools include public consultation requirements, citizen ballot propositions, Community Benefits Agreements and Development Agreements. The paper concludes with recommended principles to underpin future development and cautionary notes about the limitations of these tools.

  17. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  18. Large-scale commercial applications of the in situ vitrification remediation technology

    International Nuclear Information System (INIS)

    Campbell, B.E.; Hansen, J.E.; McElroy, J.L.; Thompson, L.E.; Timmerman, C.L.

    1994-01-01

    The first large-scale commercial application of the innovative In Situ Vitrification (ISV) remediation technology was completed at the Parsons Chemical/ETM Enterprises Superfund site in Michigan in mid-1994. This project involved treating 4,800 tons of pesticide- and mercury-contaminated soil, and also included performance of the USEPA SITE Program demonstration test for the ISV technology. The Parsons project involved consolidation and staging of contaminated soil from widespread locations on and near the site. This paper presents a brief description of the ISV technology along with case-study information on these two sites and the performance of the ISV technology on them. The paper also reviews other remediation projects where ISV has been identified as a preferred remedy and where ISV is currently planned for use. These sites include soils contaminated with pesticides, dioxin, PCP, paint wastes, and a variety of heavy metals. This review of additional sites also includes a description of a planned radioactive mixed-waste remediation project in Australia involving large amounts of plutonium, uranium, lead, beryllium, and metallic and other debris buried in limestone and dolomitic soil burial pits. Initial test work has been completed on this application, and preparations are now under way for pilot testing in Australia. This project will demonstrate the applicability of the ISV technology to the challenging application of buried mixed wastes.

  19. A convex optimization approach for solving large scale linear systems

    Directory of Open Access Journals (Sweden)

    Debora Cores

    2017-01-01

    Full Text Available The well-known Conjugate Gradient (CG) method minimizes a strictly convex quadratic function for solving large-scale linear systems of equations when the coefficient matrix is symmetric and positive definite. In this work we present and analyze a non-quadratic convex function for solving any large-scale linear system of equations, regardless of the characteristics of the coefficient matrix. For finding the global minimizers of this new convex function, any low-cost iterative optimization technique could be applied. In particular, we propose to use the low-cost, globally convergent Spectral Projected Gradient (SPG) method, which allows us to extend this optimization approach to consistent square and rectangular linear systems, as well as to linear feasibility problems, with and without convex constraints and with and without preconditioning strategies. Our numerical results indicate that the new scheme outperforms state-of-the-art iterative techniques for solving linear systems when the symmetric part of the coefficient matrix is indefinite, and also for solving linear feasibility problems.
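The spectral (Barzilai-Borwein) gradient iteration at the heart of the SPG approach can be sketched on the familiar least-squares function f(x) = 0.5*||Ax - b||^2. Note this is a simplification: the paper's specific non-quadratic convex function and its projection onto constraint sets are not reproduced here, so the "projection" step is the identity.

```python
import numpy as np

# Sketch of the spectral-gradient idea behind SPG, applied for
# simplicity to the quadratic least-squares function
#   f(x) = 0.5 * ||Ax - b||^2
# rather than the paper's non-quadratic convex function. The problem is
# unconstrained, so the projection step reduces to the identity.

def spectral_gradient(A, b, x0, max_iter=500, tol=1e-10):
    x = x0.astype(float)
    g = A.T @ (A @ x - b)            # gradient of f
    alpha = 1.0                      # initial step length
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g
        g_new = A.T @ (A @ x_new - b)
        s, y = x_new - x, g_new - g
        sy = s @ y
        alpha = (s @ s) / sy if sy > 1e-16 else 1.0   # BB1 step length
        x, g = x_new, g_new
    return x

A = np.array([[3.0, 1.0], [1.0, 2.0], [0.0, 1.0]])   # rectangular system
b = A @ np.array([1.0, -2.0])                         # consistent rhs
x = spectral_gradient(A, b, np.zeros(2))
print(x)   # close to [1, -2]
```

The Barzilai-Borwein step length makes the iteration nonmonotone but typically far faster than classical steepest descent, which is what makes SPG attractive at large scale.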

  20. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows, so far, that approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  1. Developing Routines in Large Inter-organisational Projects: A Case Study of an Infrastructure Megaproject

    Directory of Open Access Journals (Sweden)

    Therese Eriksson

    2015-08-01

    Full Text Available General management research has increasingly recognised the significance of routines in organisational performance. Among organisational tasks, megaprojects depend more on routines selected and created within the project than standard, small-scale projects do, owing largely to their size, duration, and uniqueness. Within this context, the present paper investigates how project routines were established and developed during the early design phase of an inter-organisational megaproject. A case study of a large public infrastructure project was conducted, in which data were collected during observations, semi-structured interviews, and project document studies over the course of three years. Results of analysis revealed that the client exerted the greatest impact on choice of routines and that the temporary nature of tasks limited efforts to fine-tune routines. Changes in routines were primarily reactive to new knowledge concerning project needs. The findings suggest that meta-routines to consciously review routines should be used to a greater extent and designed to capture supplier experiences as well.

  2. Large rainfall changes consistently projected over substantial areas of tropical land

    Science.gov (United States)

    Chadwick, Robin; Good, Peter; Martin, Gill; Rowell, David P.

    2016-02-01

    Many tropical countries are exceptionally vulnerable to changes in rainfall patterns, with floods or droughts often severely affecting human life and health, food and water supplies, ecosystems and infrastructure. There is widespread disagreement among climate model projections of how and where rainfall will change over tropical land at the regional scales relevant to impacts, with different models predicting the position of current tropical wet and dry regions to shift in different ways. Here we show that despite uncertainty in the location of future rainfall shifts, climate models consistently project that large rainfall changes will occur for a considerable proportion of tropical land over the twenty-first century. The area of semi-arid land affected by large changes under a higher emissions scenario is likely to be greater than during even the most extreme regional wet or dry periods of the twentieth century, such as the Sahel drought of the late 1960s to 1990s. Substantial changes are projected to occur by mid-century--earlier than previously expected--and to intensify in line with global temperature rise. Therefore, current climate projections contain quantitative, decision-relevant information on future regional rainfall changes, particularly with regard to climate change mitigation policy.

  3. COMBINED EFFECTS OF GALAXY INTERACTIONS AND LARGE-SCALE ENVIRONMENT ON GALAXY PROPERTIES

    International Nuclear Information System (INIS)

    Park, Changbom; Choi, Yun-Young

    2009-01-01

    We inspect the coupled dependence of physical parameters of the Sloan Digital Sky Survey galaxies on the small-scale (distance to and morphology of the nearest neighbor galaxy) and the large-scale (background density smoothed over 20 nearby galaxies) environments. The impacts of interaction on galaxy properties are detected at least out to the neighbor separation corresponding to the virial radius of galaxies, which is typically between 200 and 400 h^-1 kpc for the galaxies in our sample. To detect these long-range interaction effects, it is crucial to divide galaxy interactions into four cases dividing the morphology of target and neighbor galaxies into early and late types. We show that there are two characteristic neighbor-separation scales where the galaxy interactions cause abrupt changes in the properties of galaxies. The first scale is the virial radius of the nearest neighbor galaxy r_vir,nei. Many physical parameters start to deviate from those of extremely isolated galaxies at the projected neighbor separation r_p of about r_vir,nei. The second scale is at r_p ∼ 0.05 r_vir,nei = 10-20 h^-1 kpc, and is the scale at which the galaxies in pairs start to merge. We find that late-type neighbors enhance the star formation activity of galaxies while early-type neighbors reduce it, and that these effects occur within r_vir,nei. The hot halo gas and cold disk gas must be participating in the interactions at separations less than the virial radius of the galaxy plus dark halo system. Our results also show that the role of the large-scale density in determining galaxy properties is minimal once luminosity and morphology are fixed. We propose that the weak residual dependence of galaxy properties on the large-scale density is due to the dependence of the halo gas property on the large-scale density.

  4. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  5. A family of conjugate gradient methods for large-scale nonlinear equations.

    Science.gov (United States)

    Feng, Dexiang; Sun, Min; Wang, Xueyong

    2017-01-01

    In this paper, we present a family of conjugate gradient projection methods for solving large-scale nonlinear equations. At each iteration, the method needs only low storage, and its subproblem can be solved easily. Compared with existing solution methods for the problem, its global convergence is established without requiring Lipschitz continuity of the underlying mapping. Preliminary numerical results are reported to show the efficiency of the proposed method.
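A minimal sketch of a method of this family for monotone equations F(x) = 0 combines a conjugate-gradient-type search direction, a derivative-free line search, and a hyperplane projection step in the style of Solodov and Svaiter. The mapping F, the PRP-type beta, and all parameters below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np

# Sketch of a derivative-free CG projection method for monotone
# nonlinear equations F(x) = 0: a PRP-type conjugate direction with a
# safeguard reset, a backtracking line search, and a projection of the
# iterate onto the separating hyperplane (Solodov-Svaiter style).
# The monotone test mapping F(x) = x + sin(x) is illustrative only.

def F(x):
    return x + np.sin(x)

def cg_projection_solve(x, tol=1e-8, sigma=1e-4, max_iter=500):
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        # Line search: find t with  -F(x + t d) . d >= sigma t ||d||^2
        t = 1.0
        while -F(x + t * d) @ d < sigma * t * (d @ d):
            t *= 0.5
            if t < 1e-12:
                break
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            return z
        # Hyperplane projection step
        x = x - (Fz @ (x - z)) / (Fz @ Fz) * Fz
        F_old, Fx = Fx, F(x)
        # PRP-type conjugate direction with a safeguard reset
        beta = Fx @ (Fx - F_old) / (F_old @ F_old)
        d = -Fx + beta * d
        if Fx @ d > -1e-12:
            d = -Fx   # fall back to steepest-descent-like direction
    return x

x = cg_projection_solve(np.array([1.0, -2.0, 3.0]))
print(np.linalg.norm(F(x)))   # residual near zero
```

Because the projection step only uses values of F, no Jacobian or Lipschitz constant is needed, which is the feature the abstract emphasizes.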

  6. The role of ethics in data governance of large neuro-ICT projects.

    Science.gov (United States)

    Stahl, Bernd Carsten; Rainey, Stephen; Harris, Emma; Fothergill, B Tyr

    2018-05-14

    We describe current practices of ethics-related data governance in large neuro-ICT projects, identify gaps in current practice, and put forward recommendations on how to collaborate ethically in complex regulatory and normative contexts. We undertake a survey of published principles of data governance of large neuro-ICT projects. This grounds an approach to a normative analysis of current data governance approaches. Several ethical issues are well covered in the data governance policies of neuro-ICT projects, notably data protection and attribution of work. Projects use a set of similar policies to ensure users behave appropriately. However, many ethical issues are not covered at all. Implementation and enforcement of policies remain vague. The data governance policies we investigated indicate that the neuro-ICT research community is currently close-knit and that shared assumptions are reflected in infrastructural aspects. This explains why many ethical issues are not explicitly included in data governance policies at present. With neuro-ICT research growing in scale, scope, and international involvement, these shared assumptions should be made explicit and reflected in data governance.

  7. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are described from testing the material resistance to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed with and without surface defects (15, 30 and 45 mm deep). During cyclic tests, the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  8. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  9. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  10. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    Full Text Available Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is: how well do large-scale models simulate the propagation from meteorological to hydrological drought? To answer this question, we evaluated the simulation of drought propagation in an ensemble mean of ten large-scale models, both land-surface models and global hydrological models, that participated in the model intercomparison project of WATCH (WaterMIP). For a selection of case study areas, we studied drought characteristics (number of droughts, duration, severity), drought propagation features (pooling, attenuation, lag, lengthening), and hydrological drought typology (classical rainfall deficit drought, rain-to-snow-season drought, wet-to-dry-season drought, cold snow season drought, warm snow season drought, composite drought).

    Drought characteristics simulated by large-scale models clearly reflected drought propagation; i.e. drought events became fewer and longer when moving through the hydrological cycle. However, more differentiation was expected between fast and slowly responding systems, with slowly responding systems having fewer and longer droughts in runoff than fast responding systems. This was not found using large-scale models. Drought propagation features were poorly reproduced by the large-scale models, because runoff reacted immediately to precipitation in all case study areas. This fast reaction to precipitation, even in cold climates in winter and in semi-arid climates in summer, also greatly influenced the hydrological drought typology as identified by the large-scale models. In general, the large-scale models had the correct representation of drought types, but the percentages of occurrence had some important mismatches, e.g. an overestimation of classical rainfall deficit droughts, and an
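    The drought propagation features discussed in this record (events becoming fewer and longer when moving through the hydrological cycle) can be made concrete with the standard threshold-level method for extracting drought events from a time series. The sketch below is illustrative only and not taken from the paper: it uses synthetic anomaly series, a per-series 20th-percentile threshold, and a moving average as a crude stand-in for the slower runoff response.

```python
import numpy as np

def drought_events(series, threshold):
    """Identify drought events as uninterrupted runs below a threshold.

    Returns a list of (start_index, duration, severity) tuples, where
    severity is the accumulated deficit below the threshold.
    """
    events = []
    start, deficit = None, 0.0
    for i, v in enumerate(series):
        if v < threshold:
            if start is None:
                start, deficit = i, 0.0
            deficit += threshold - v
        elif start is not None:
            events.append((start, i - start, deficit))
            start = None
    if start is not None:
        events.append((start, len(series) - start, deficit))
    return events

# Synthetic "precipitation" anomaly, and a smoothed version standing in
# for "runoff": smoothing merges short dry spells into fewer, longer
# droughts -- the pooling and lengthening effect described above.
rng = np.random.default_rng(0)
precip_anomaly = rng.normal(0.0, 1.0, 240)
runoff_anomaly = np.convolve(precip_anomaly, np.ones(6) / 6, mode="same")

# Variable (percentile) thresholds keep the fraction of dry time steps equal
# for both series, so differences in event count come from pooling alone.
p_thr = np.percentile(precip_anomaly, 20)
r_thr = np.percentile(runoff_anomaly, 20)
p_events = drought_events(precip_anomaly, p_thr)
r_events = drought_events(runoff_anomaly, r_thr)
```

With both series spending the same fraction of time below their thresholds, the autocorrelated "runoff" series groups those steps into fewer, longer events, which is the propagation signature the evaluation above looks for.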

  11. Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker

    Science.gov (United States)

    Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong

    2017-10-01

    Large-scale components are widely used in advanced manufacturing, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for establishing the end coordinate system. Based on this method, a virtual robot model is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. Validation experiments verify the proposed algorithms: first, the hand-eye transformation matrix is solved; then a car body rear is measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear is reconstructed successfully.
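    The global data fusion step described in this record amounts to chaining homogeneous transforms: scanner frame to robot flange (the hand-eye calibration), flange to robot base (the robot pose), and base to world (from the laser tracker). A minimal sketch, with made-up matrices standing in for the calibrated ones:

```python
import numpy as np

def homogeneous(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation about the z-axis by theta radians."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical calibration results (stand-ins for the matrices the paper solves for):
T_flange_scanner = homogeneous(rot_z(0.1), [0.05, 0.0, 0.12])  # hand-eye
T_base_flange = homogeneous(rot_z(1.2), [0.8, 0.3, 0.5])       # robot pose
T_world_base = homogeneous(rot_z(-0.4), [2.0, 1.0, 0.0])       # laser tracker

# A point measured in the scanner frame is mapped into the world frame by
# chaining the transforms; this is how per-view point clouds are fused.
p_scanner = np.array([0.1, 0.2, 0.3, 1.0])  # homogeneous coordinates
p_world = T_world_base @ T_base_flange @ T_flange_scanner @ p_scanner
```

Every point cloud captured from a different robot pose goes through the same chain with its own `T_base_flange`, so all views land in one consistent world frame.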

  12. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  13. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  14. Analysis on the Critical Rainfall Value For Predicting Large Scale Landslides Caused by Heavy Rainfall In Taiwan.

    Science.gov (United States)

    Tsai, Kuang-Jung; Chiang, Jie-Lun; Lee, Ming-Hsi; Chen, Yie-Ruey

    2017-04-01

    Kuang-Jung Tsai (1), Jie-Lun Chiang (2), Ming-Hsi Lee (2), Yie-Ruey Chen (1); (1) Department of Land Management and Development, Chang Jung Christian University, Tainan, Taiwan; (2) Department of Soil and Water Conservation, National Pingtung University of Science and Technology, Pingtung, Taiwan. ABSTRACT: Accumulated rainfall of more than 2,900 mm was recorded within 3 consecutive days during typhoon Morakot in August 2009. Very serious landslides and sediment-related disasters were induced by this heavy rainfall event. The satellite image analysis project conducted by the Soil and Water Conservation Bureau after the Morakot event identified more than 10,904 landslide sites with a total sliding area of 18,113 ha. At the same time, all severe sediment-related disaster areas were characterized by disaster type, scale, topography, major bedrock formations and geologic structures during the period of extremely heavy rainfall events in southern Taiwan. Characteristics and mechanisms of large-scale landslides are collected on the basis of field investigation integrated with GPS/GIS/RS techniques. In order to decrease the risk of large-scale landslides on slope land, a slope-land conservation strategy and a critical rainfall database should be set up and executed as soon as possible. Meanwhile, establishing the critical rainfall value used for predicting large-scale landslides induced by heavy rainfall has become an important issue of serious concern to the government and all people living in Taiwan. The mechanism of large-scale landslides, rainfall frequency analysis, sediment budget estimation and river hydraulic analysis under extreme climate change during the past 10 years are seriously concerned and recognized as required issues by this

  15. A family of conjugate gradient methods for large-scale nonlinear equations

    Directory of Open Access Journals (Sweden)

    Dexiang Feng

    2017-09-01

    Full Text Available In this paper, we present a family of conjugate gradient projection methods for solving large-scale nonlinear equations. At each iteration, the method needs only low storage, and the subproblem can be easily solved. Compared with existing solution methods for the problem, its global convergence is established without requiring Lipschitz continuity of the underlying mapping. Preliminary numerical results are reported to show the efficiency of the proposed method.
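    The general scheme behind such methods combines a conjugate-gradient-type direction, a derivative-free backtracking line search, and a hyperplane projection step. The sketch below is a generic Solodov-Svaiter-style projection method with a PRP-type parameter, not the specific family proposed in the paper; `cg_projection_solve` and its default constants are illustrative assumptions.

```python
import numpy as np

def cg_projection_solve(F, x0, tol=1e-8, max_iter=2000, sigma=1e-4, rho=0.5):
    """Derivative-free CG projection method for monotone equations F(x) = 0.

    Uses only function values: a CG-type search direction, a backtracking
    line search, and a projection onto the hyperplane separating the
    current iterate from the solution set.
    """
    x = np.asarray(x0, dtype=float)
    Fx = F(x)
    d = -Fx
    for _ in range(max_iter):
        if np.linalg.norm(Fx) < tol:
            break
        # Line search: find t with -F(x + t d)^T d >= sigma * t * ||d||^2
        t = 1.0
        while -(F(x + t * d) @ d) < sigma * t * (d @ d):
            t *= rho
            if t < 1e-12:
                break
        z = x + t * d
        Fz = F(z)
        if np.linalg.norm(Fz) < tol:
            x, Fx = z, Fz
            break
        # Project x onto the hyperplane {y : F(z)^T (y - z) = 0}
        x = x - ((Fz @ (x - z)) / (Fz @ Fz)) * Fz
        Fx_new = F(x)
        # PRP-type conjugate parameter (one possible choice; members of the
        # family in the paper differ exactly here)
        beta = (Fx_new @ (Fx_new - Fx)) / (Fx @ Fx)
        d = -Fx_new + beta * d
        # Restart with steepest descent if d stops being a search direction
        if -(Fx_new @ d) <= 1e-12:
            d = -Fx_new
        Fx = Fx_new
    return x

# Monotone test problem: F(x) = x + sin(x), with unique root at the origin.
root = cg_projection_solve(lambda x: x + np.sin(x), np.array([1.0, -2.0, 3.0]))
```

The storage is a handful of vectors regardless of dimension, which is the "low storage" property the abstract highlights; the projection step is what removes the need for a Lipschitz assumption in the convergence theory.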

  16. Collaboration and Virtualization in Large Information Systems Projects

    Directory of Open Access Journals (Sweden)

    Stefan Ioan NITCHI

    2009-01-01

    Full Text Available A project evolves through different phases, from idea and conception to experiments, implementation and maintenance. Globalization, the Internet, the Web and mobile computing have changed many human activities, including the realization of Information System (IS) projects. Projects are growing, teams are geographically distributed, and users are heterogeneous. Consequently, the realization of large Information Technology (IT) projects requires collaborative technologies. The distribution of the team, the heterogeneity of the users and the complexity of the project drive virtualization. This paper is an overview of these aspects for large IT projects. It briefly presents a general framework developed by the authors for collaborative systems in general, adapted here to collaborative project management. The general considerations are illustrated with the case of a large IT project in which the authors were involved.

  17. Policy and planning for large infrastructure projects

    DEFF Research Database (Denmark)

    Flyvbjerg, Bent

    2005-01-01

    This paper focuses on problems and their causes and cures in policy and planning for large infrastructure projects. First, it identifies as the main problem in major infrastructure development pervasive misinformation about the costs, benefits, and risks involved. A consequence of misinformation ... the likelihood that it is their projects, and not the competition's, that gain approval and funding. This results in the "survival of the unfittest," where often it is not the best projects that are built, but the most misrepresented ones. Finally, the paper presents measures for reforming policy and planning for large infrastructure projects, with a focus on better planning methods and changed governance structures, the latter being more important.

  18. Investigation of the large scale regional hydrogeological situation at Ceberg

    International Nuclear Information System (INIS)

    Boghammar, A.; Grundfelt, B.; Hartley, L.

    1997-11-01

    The present study forms part of the large-scale groundwater flow studies within the SR 97 project. The site of interest is Ceberg. Within the present study, two regional-scale groundwater models have been constructed: one large regional model with an areal extent of about 300 km² and one semi-regional model with an areal extent of about 50 km². Different types of boundary conditions have been applied to the models: topography-driven pressures, constant infiltration rates, non-linear infiltration combined with specified-pressure boundary conditions, and transfer of groundwater pressures from the larger model to the semi-regional model. The present model has shown the following. Groundwater flow paths are mainly local; large-scale groundwater flow paths are only seen below the depth of the hypothetical repository (below 500 metres) and are very slow. Locations of recharge and discharge to and from the site area are in the close vicinity of the site. The low contrast between major structures and the rock mass means that the factor with the major effect on the flow paths is the topography. A model sufficiently large to incorporate the recharge and discharge areas of the local site is on the order of kilometres. A uniform infiltration-rate boundary condition does not give a good representation of the groundwater movements in the model. A local site model may be located to cover the site area and a few kilometres of the surrounding region; in order to incorporate all recharge and discharge areas within the site model, the model will be somewhat larger than site-scale models at other sites, because the discharge areas are divided into three distinct areas to the east, south and west of the site. Boundary conditions may be supplied to the site model by transferring groundwater pressures obtained with the semi-regional model.

  19. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size (e.g. in number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results; such results may not, however, accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  20. HSTDEK: Developing a methodology for construction of large-scale, multi-use knowledge bases

    Science.gov (United States)

    Freeman, Michael S.

    1987-01-01

    The primary research objectives of the Hubble Space Telescope Design/Engineering Knowledgebase (HSTDEK) are to develop a methodology for constructing and maintaining large-scale knowledge bases which can be used to support multiple applications. To ensure the validity of its results, this research is being pursued in the context of a real-world system, the Hubble Space Telescope. The HSTDEK objectives are described in detail. The history and motivation of the project are briefly described. The technical challenges faced by the project are outlined.

  1. Modeling the impact of large-scale energy conversion systems on global climate

    International Nuclear Information System (INIS)

    Williams, J.

    There are three energy options which could satisfy a projected energy requirement of about 30 TW: the solar, nuclear and (to a lesser extent) coal options. Climate models can be used to assess the impact of large-scale deployment of these options. The impact of waste heat has been assessed using energy balance models and general circulation models (GCMs). Results suggest that the impacts are significant when the heat input is very high, and studies of more realistic scenarios are required. Energy balance models, radiative-convective models and a GCM have been used to study the impact of doubling the atmospheric CO₂ concentration. State-of-the-art models estimate a surface temperature increase of 1.5-3.0 °C with large amplification near the poles, but much uncertainty remains. Very few model studies have been made of the impact of particles on global climate; more information on the characteristics of particle input is required. The impact of large-scale deployment of solar energy conversion systems has received little attention, but model studies suggest that large-scale changes in surface characteristics associated with such systems (surface heat balance, roughness, hydrological characteristics and ocean surface temperature) could have significant global climatic effects. (Auth.)
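    The CO₂-doubling estimates cited can be illustrated with a zero-dimensional energy balance calculation. The logarithmic forcing fit below is a later, standard simplification (Myhre-style) rather than anything from this paper, and the feedback parameters are illustrative assumptions:

```python
import numpy as np

def co2_forcing(c, c0):
    """Simplified radiative forcing (W/m^2) for a CO2 concentration change."""
    return 5.35 * np.log(c / c0)

# Equilibrium warming dT = F / lambda for an assumed climate feedback
# parameter lambda. ~3.2 W/m^2/K corresponds to the no-feedback (Planck)
# response; net feedbacks reduce lambda, raising the warming toward the
# 1.5-3.0 C range quoted in the abstract.
f2x = co2_forcing(2.0, 1.0)        # forcing from a CO2 doubling, ~3.7 W/m^2
dt_no_feedback = f2x / 3.2         # roughly 1.2 K
dt_with_feedbacks = f2x / 1.3      # roughly 2.9 K, illustrative lambda
```

The spread between the two results shows why feedback uncertainty, not the forcing itself, dominates the 1.5-3.0 °C range cited above.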

  2. Large-scale building integrated photovoltaics field trial. First technical report - installation phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the results of the first eighteen months of the Large-Scale Building Integrated Photovoltaic Field Trial, focussing on technical aspects. The project aims included increasing awareness and application of the technology, raising UK capabilities in applying the technology, and assessing the potential for building-integrated photovoltaics (BIPV). Details are given of technology choices; project organisation, cost, and status; and the evaluation criteria. Installations of BIPV described include university buildings, commercial centres, a sports stadium, a wildlife park, a church hall, and a district council building. Lessons learnt are discussed, and a further report covering monitoring aspects is planned.

  3. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; et al.

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on a 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame-spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), nine smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests.

  4. Economics of Large Helium Cryogenic Systems: Experience from Recent Projects at CERN

    CERN Document Server

    Claudet, S; Lebrun, P; Tavian, L; Wagner, U

    1999-01-01

    Large projects based on applied superconductivity, such as particle accelerators, tokamaks or SMES, require powerful and complex helium cryogenic systems, the cost of which represents a significant, if not dominant fraction of the total capital and operational expenditure. It is therefore important to establish guidelines and scaling laws for costing such systems, based on synthetic estimators of their size and performance. Although such data has already been published for many years, the experience recently gathered at CERN with the LEP and LHC projects, which have de facto turned the laboratory into a major world cryogenic center, can be exploited to update this information and broaden the range of application of the scaling laws. We report on the economics of 4.5 K and 1.8 K refrigeration, cryogen distribution and storage systems, and indicate paths towards their cost-to-performance optimisation.
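    Scaling laws of the kind referred to here are usually power laws in installed refrigeration capacity. The coefficient and exponent below are placeholders chosen only to show the shape of such an estimator, not the published CERN fit:

```python
def refrigerator_cost_meur(capacity_kw_at_4p5k, a=5.0, b=0.7):
    """Power-law cost estimator, cost = a * (P / P0)**b, the shape commonly
    fitted to helium refrigerator cost data. Coefficient a (MEUR at the
    10 kW reference capacity) and exponent b are illustrative placeholders.
    """
    return a * (capacity_kw_at_4p5k / 10.0) ** b

# An exponent b < 1 encodes economies of scale: the cost per installed
# watt of refrigeration falls as plants get larger.
small = refrigerator_cost_meur(6.0) / 6.0    # specific cost of a small plant
large = refrigerator_cost_meur(18.0) / 18.0  # specific cost of a large plant
```

Fitting `a` and `b` to a set of procured plants is what turns synthetic size/performance estimators into the costing guidelines the abstract describes.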

  5. Strategic Environmental Assessment and Environmental Auditing in Large-scale Public Infrastructure Construction: the case of Qinghai-Tibet Railway

    NARCIS (Netherlands)

    He, G.; Zhang, L.; Lu, Y.

    2009-01-01

    Large-scale public infrastructure projects have featured in China’s modernization course since the early 1980s. During the early stages of China’s rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however,

  6. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  7. The Large-Scale Biosphere-Atmosphere Experiment in Amazonia: Analyzing Regional Land Use Change Effects.

    Science.gov (United States)

    Michael Keller; Maria Assunção Silva-Dias; Daniel C. Nepstad; Meinrat O. Andreae

    2004-01-01

    The Large-Scale Biosphere-Atmosphere Experiment in Amazonia (LBA) is a multi-disciplinary, multinational scientific project led by Brazil. LBA researchers seek to understand Amazonia in its global context especially with regard to regional and global climate. Current development activities in Amazonia including deforestation, logging, cattle ranching, and agriculture...

  8. 78 FR 70076 - Large Scale Networking (LSN)-Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2013-11-22

    Large Scale Networking (LSN)-Middleware and Grid Interagency Coordination (MAGIC) Team. AGENCY: The Networking and Information Technology Research and Development (NITRD) ... projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group (CG). Public ... MAGIC Team meetings are held on the first Wednesday of each month, 2:00-4:00 p.m., at the National ...

  9. Large scale parallel FEM computations of far/near stress field changes in rocks

    Czech Academy of Sciences Publication Activity Database

    Blaheta, Radim; Byczanski, Petr; Jakl, Ondřej; Kohut, Roman; Kolcun, Alexej; Krečmer, Karel; Starý, Jiří

    2006-01-01

    Roč. 22, č. 4 (2006), s. 449-459 ISSN 0167-739X R&D Projects: GA ČR(CZ) GA105/02/0492; GA AV ČR(CZ) 1ET400300415 Institutional research plan: CEZ:AV0Z30860518 Keywords: large-scale finite element analysis Subject RIV: BA - General Mathematics Impact factor: 0.722, year: 2006

  10. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields, including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions for additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  11. Setting up fuel supply strategies for large-scale bio-energy projects using agricultural and forest residues. A methodology for developing countries

    International Nuclear Information System (INIS)

    Junginger, M.

    2000-08-01

    The objective of this paper is to develop a coherent methodology for setting up fuel supply strategies for large-scale biomass-conversion units. The method explicitly takes into account risks and uncertainties regarding availability and costs in relation to time. This paper aims at providing general guidelines that are not country-specific; such guidelines cannot provide 'perfect fit' solutions, but aim to give general help in overcoming barriers and setting up supply strategies. It mainly focuses on residues from the agricultural and forestry sectors. This study focuses on electricity production or combined heat and power (CHP) production, with plant scales between 10 and 40 MWe. This range is chosen because of economies of scale: in large-scale plants the benefits of increased efficiency outweigh increased transportation costs, allowing a lower price per kWh, which in turn may allow higher biomass costs. However, fuel-supply risks tend to increase with plant size, which makes it more important to assess them for larger conversion plants. Although the methodology does not focus on a specific conversion technology, it should be stressed that the technology must be able to handle a wide variety of biomass fuels with different characteristics, because many biomass residues are not available year-round and various fuels are needed for a constant supply. The methodology allows for comparing different technologies (with known investment and operation-and-maintenance costs from the literature) and for evaluating different fuel supply scenarios. In order to demonstrate the methodology, a case study was carried out for the north-eastern part of Thailand (Isaan), an agricultural region. The research was conducted in collaboration with the Regional Wood Energy Development Programme in Asia (RWEDP), a project of the UN Food and Agricultural Organization (FAO) in Bangkok, Thailand. In Section 2 of this paper the methodology is presented.
In Section 3 the economic
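    The economies-of-scale argument in this abstract (capital cost per kW falls with plant size, while the average fuel haul distance grows roughly with the square root of capacity, since the collection area scales with fuel demand) can be sketched with a toy levelized-cost model. Every number below is an assumption for illustration, not data from the paper:

```python
import numpy as np

def cost_per_mwh(capacity_mw,
                 base_capex=4000.0,    # $/kW at the 10 MW reference (assumed)
                 scale_exp=0.7,        # economies-of-scale exponent (assumed)
                 transport_coeff=0.5): # $/MWh per km of average haul (assumed)
    """Toy levelized-cost model: capex per kW falls with scale, while the
    average biomass haul distance grows with the square root of capacity.
    """
    capex_per_kw = base_capex * (capacity_mw / 10.0) ** (scale_exp - 1.0)
    # Annualized capital cost in $/MWh: 10%/yr charge, 7000 full-load h/yr.
    annualized = capex_per_kw * 1000.0 * 0.1 / 7000.0
    haul_km = 10.0 * np.sqrt(capacity_mw / 10.0)  # average haul distance
    return annualized + transport_coeff * haul_km

sizes = np.array([10.0, 20.0, 40.0])  # the plant-scale range discussed
costs = cost_per_mwh(sizes)
```

With these illustrative parameters, cost per MWh still falls across the 10-40 MWe range, matching the abstract's claim that efficiency gains outweigh transport costs there; steeper transport coefficients or flatter scaling exponents would move the optimum downward.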

  12. Large-scale renewable energy project barriers: Environmental impact assessment streamlining efforts in Japan and the EU

    International Nuclear Information System (INIS)

    Schumacher, Kim

    2017-01-01

    Environmental Impact Assessment (EIA) procedures have been identified as a major barrier to renewable energy (RE) development with regard to large-scale projects (LS-RE). However, EIA laws have also been neglected by many decision-makers, who have been underestimating their impact on RE development and the stifling potential they possess. As a consequence, apart from acknowledging the shortcomings of the systems currently in place, few governments currently have concrete plans to reform their EIA laws. By looking at recent EIA streamlining efforts in two industrialized regions that underwent major transformations in their energy sectors, this paper attempts to assess how such reform efforts can act as a means to balance environmental protection and climate change mitigation with socio-economic challenges. The paper fills this intellectual void by identifying the strengths and weaknesses of the Japanese EIA law, contrasting it with the recently revised EIA Directive of the European Union (EU). This enables identification of the regulatory provisions that impact RE development the most, and determination of how structured EIA law reforms would affect domestic RE project development. The main focus lies on the evaluation of regulatory streamlining efforts in the Japanese and EU contexts through a mixed-methods approach consisting of in-depth literary and legal reviews, followed by a comparative analysis and a series of semi-structured interviews. Highlighting several legal inconsistencies, in combination with the views of EIA professionals, academics and law- and policymakers, allowed for a more comprehensive assessment of which streamlining elements of the reformed EU EIA Directive and the proposed Japanese EIA framework modifications could either promote or stifle further RE deployment. - Highlights: •Performs an in-depth review of EIA reforms in OECD territories •First paper to compare Japan and the European

  13. Managing sensitive phenotypic data and biomaterial in large-scale collaborative psychiatric genetic research projects: practical considerations.

    Science.gov (United States)

    Demiroglu, S Y; Skrowny, D; Quade, M; Schwanke, J; Budde, M; Gullatz, V; Reich-Erkelenz, D; Jakob, J J; Falkai, P; Rienhoff, O; Helbing, K; Heilbronner, U; Schulze, T G

    2012-12-01

    Large-scale collaborative research will be a hallmark of future psychiatric genetic research. Ideally, both academic and non-academic institutions should be able to participate in such collaborations to allow for the establishment of very large samples in a straightforward manner. Any such endeavor requires an easy-to-implement information technology (IT) framework. Here we present the requirements for a centralized framework and describe how they can be met through a modular IT toolbox.

  14. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, undertaken with a view to ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, with entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical-core testing apparatus for examining the overall system effect, and one using a plate-core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  15. Survey and research for the enhancement of large-scale technology development 2. How large-scale technology development should be in the future; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 2. Kongo no ogata gijutsu kaihatsu no arikata

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    A survey was conducted on the subject by interviewing people engaged in industrial technology development at the entrusted businesses participating in the large-scale industrial technology development system, and people of relevant experience or academic background involved in the project enhancement effort. Several needs for improvement are pointed out: that the competition principle, based for example on parallel development, be introduced; that research-on-research be practiced for effective task selection; that midway evaluation be strengthened, since prior evaluation is difficult; that efforts be made to organize new industries utilizing the fruits of large-scale industrial technology for the creation of markets, without inducing economic conflicts; and that the transfer of technologies from the private sector to the public sector be enhanced. Studies are made of the review of research-conducting systems, the utilization of private-sector research and development capabilities, education about industrial property rights, and the diffusion of large-scale project systems. In this connection, problems are pointed out, requests are submitted, and remedial measures and suggestions are presented. (NEDO)

  16. Large-scale demonstration of reliability centered maintenance at two nuclear generating stations

    International Nuclear Information System (INIS)

    Gaertner, J.P.; Edgar, C.; Rodin, M.E.

    1989-01-01

    This paper reports that, after successful single-system pilot applications of Reliability Centered Maintenance (RCM) at various utilities, EPRI, together with Rochester Gas and Electric and Southern California Edison, is undertaking multiple-system applications of RCM at the two utilities' respective nuclear plants. The objective is to demonstrate the feasibility and cost-effectiveness of large-scale RCM application; in addition, each utility has plant-specific objectives to improve maintenance and plant availability. Each project has selected a prioritized list of some 15-20 systems on which to perform RCM. The projects employ somewhat different RCM analysis methods, both of which conform to a global RCM definition applicable to all EPRI RCM work to date. Each project has developed important insights for improving the cost and value of future analyses. Both projects will have applied the RCM process, including implementation, on several plant systems by April 1989.

  17. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels around the world. It is now widely recognized that keeping these established geospatial databases up to date is critical to their value, so more and more effort has been devoted to their continuous updating. Currently, there are two main types of methods for geospatial database updating: direct updating with remote sensing images or field survey materials, and indirect updating with other newly updated data, such as a larger-scale database. The former is fundamental, because the update data sources of both methods ultimately derive from field surveying and remote sensing; the latter is often more economical and faster. Therefore, after a larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the multi-scale geospatial databases consistent. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. This is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in which databases at different scales are produced and maintained separately by organizations at different levels, as in China. This paper focuses on applying digital map generalization to the updating of geospatial databases from larger-scale data in a collaborative updating environment for SDI. The requirements for applying map generalization to spatial database updating are analyzed first, and a brief review of geospatial data updating based on digital map generalization is then given.
Based on the requirements analysis and review, we analyze the key factors for implementing the updating of geospatial data from large scale including technical

  18. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is strongly constrained by cosmological observations for matter; for dark energy, however, there are no significant observational constraints on Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  19. 77 FR 58416 - Large Scale Networking (LSN); Middleware and Grid Interagency Coordination (MAGIC) Team

    Science.gov (United States)

    2012-09-20

    ..., Grid, and cloud projects. The MAGIC Team reports to the Large Scale Networking (LSN) Coordinating Group... Coordination (MAGIC) Team AGENCY: The Networking and Information Technology Research and Development (NITRD.... Dates/Location: The MAGIC Team meetings are held on the first Wednesday of each month, 2:00-4:00pm, at...

  20. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  1. Intercomparison of regional-scale hydrological models and climate change impacts projected for 12 large river basins worldwide—a synthesis

    Science.gov (United States)

    Krysanova, Valentina; Vetter, Tobias; Eisner, Stephanie; Huang, Shaochun; Pechlivanidis, Ilias; Strauch, Michael; Gelfan, Alexander; Kumar, Rohini; Aich, Valentin; Arheimer, Berit; Chamorro, Alejandro; van Griensven, Ann; Kundu, Dipangkar; Lobanova, Anastasia; Mishra, Vimal; Plötner, Stefan; Reinhardt, Julia; Seidou, Ousmane; Wang, Xiaoyan; Wortmann, Michel; Zeng, Xiaofan; Hattermann, Fred F.

    2017-10-01

    An intercomparison of climate change impacts projected by nine regional-scale hydrological models for 12 large river basins on all continents was performed, and sources of uncertainty were quantified, in the framework of the ISIMIP project. The models ECOMAG, HBV, HYMOD, HYPE, mHM, SWAT, SWIM, VIC and WaterGAP3 were applied in the following basins: the Rhine and Tagus in Europe, the Niger and Blue Nile in Africa, the Ganges, Lena, Upper Yellow and Upper Yangtze in Asia, the Upper Mississippi, MacKenzie and Upper Amazon in America, and the Darling in Australia. Model calibration and validation used WATCH climate data for the period 1971-2000. The results, evaluated with 14 criteria, are mostly satisfactory, except for low flow. Climate change impacts were analyzed using projections from five global climate models under four representative concentration pathways. Trends in the period 2070-2099, relative to the reference period 1975-2004, were evaluated for three variables: the long-term mean annual flow and the high- and low-flow percentiles Q10 and Q90, as well as for flows in the three-month high- and low-flow periods, denoted HF and LF. For three river basins (the Lena, MacKenzie and Tagus), strong trends were found in all five variables (except for Q10 in the MacKenzie); trends with moderate certainty for three to five variables were confirmed for the Rhine, Ganges and Upper Mississippi; and increases in HF and LF were found for the Upper Amazon, Upper Yangtze and Upper Yellow. The analysis of projected streamflow seasonality demonstrated increasing streamflow volumes during the high-flow period in four basins influenced by monsoonal precipitation (the Ganges, Upper Amazon, Upper Yangtze and Upper Yellow), an amplification of the snowmelt flood peaks in the Lena and MacKenzie, and a substantial decrease of discharge in the Tagus (in all months).
The overall average fractions of uncertainty for the annual mean flow projections in the multi-model ensemble applied for all basins
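    The indicator-and-trend evaluation described in this record can be sketched in a few lines. Everything below is a hedged illustration: the synthetic series, the uniform 15% scaling and the exceedance convention for Q10/Q90 (Q10 taken as the flow exceeded 10% of the time) are assumptions for demonstration, not data or definitions from the study.

```python
import numpy as np

def flow_indicators(daily_flow):
    """Mean flow plus the high/low-flow percentiles Q10 and Q90.

    Uses the common hydrological exceedance convention (an assumption
    here): Q10 is the flow exceeded 10% of the time, i.e. the 90th
    percentile of the flow distribution; Q90 is exceeded 90% of the time.
    """
    q = np.asarray(daily_flow, dtype=float)
    return {
        "mean": q.mean(),
        "Q10": np.percentile(q, 90),  # high-flow indicator
        "Q90": np.percentile(q, 10),  # low-flow indicator
    }

def relative_change_percent(scenario, reference):
    """Percent change of an indicator between scenario and reference."""
    return 100.0 * (scenario - reference) / reference

# Hypothetical synthetic daily flows standing in for the reference
# period (1975-2004) and the scenario period (2070-2099).
rng = np.random.default_rng(0)
reference = rng.gamma(shape=2.0, scale=50.0, size=30 * 365)
scenario = reference * 1.15  # a uniform 15% increase, illustration only

ref_ind = flow_indicators(reference)
scen_ind = flow_indicators(scenario)
change = {k: relative_change_percent(scen_ind[k], ref_ind[k]) for k in ref_ind}
```

    Because the scenario series here is an exact 15% scaling of the reference, all three indicators shift by exactly 15%; with real projections the three changes would generally differ, which is what the per-variable trend analysis exploits.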

  2. Opportunities and Challenges of Large Investment Projects in the New Economy: the Port of Ust-Luga

    Directory of Open Access Journals (Sweden)

    Popodko Galina

    2015-09-01

    Full Text Available The aim of this study is to search for a mechanism for implementing large investment projects of crucial economic importance under modern economic conditions characterized by the sanctions policy of foreign states, limited public investment, and a mass exodus of foreign investors. An example of a large-scale investment project is the construction of a multipurpose multimodal complex, the commercial seaport of Ust-Luga, one of the most recent large projects in seaport infrastructure development. This article estimates the project's significance for the development of the Baltic region and presents a competitive analysis of the seaport's position in comparison with the largest European ports. The authors analyze the strengths of the seaport construction project, namely the favorable natural environment and climate, the advantageous geographical position, and the strong political will demonstrated by the federal and regional authorities. The article also considers the challenges the project faces: an unfavorable geopolitical situation, growing competition from other seaports, and a lack of investment. Based on the analysis of these challenges, it is concluded that there are significant risks, associated predominantly with lack of investment. In these conditions, a large investment project requires the enhancement of public-private partnership, which will ensure the timely implementation of such projects.

  3. Large-scale straw supplies to existing coal-fired power stations

    International Nuclear Information System (INIS)

    Gylling, M.; Parsby, M.; Thellesen, H.Z.; Keller, P.

    1992-08-01

    It is considered that large-scale supply of straw to power stations and decentralized cogeneration plants could open up new economical systems and methods of organizing straw supply in Denmark. This thesis is elucidated and the constraints involved are pointed out. The aim is to describe to what extent large-scale straw supply is interesting with regard to monetary savings and available resources. Analyses of the models, systems and techniques described in a preceding project are carried out. The annual total amount of surplus straw in Denmark is reckoned to be 3.6 million tons. At present, non-agricultural use of straw is limited to district heating plants with an annual consumption of 2-12 thousand tons. A prerequisite for a significant increase in the use of straw is an annual consumption by power and cogeneration plants of more than 100,000 tons. All aspects of straw management are examined in detail, also in relation to two actual Danish coal-fired plants, and the reliability of straw supply is considered. It is concluded that very significant resources of straw are available in Denmark, but a number of constraints remain; price competitiveness must be considered in relation to other fuels. It is suggested that using corn harvests with whole stems attached (handled as large bales or in the same way as sliced straw alone) as fuel would yield significant monetary savings, especially in transport and storage. Equal status for whole-harvested corn with other forms of biomass fuels, with corresponding changes in taxes and subsidies, could possibly reduce the constraints on large-scale straw fuel supply. (AB) (13 refs.)

  4. Current status of large-scale cryogenic gravitational wave telescope

    International Nuclear Information System (INIS)

    Kuroda, K; Ohashi, M; Miyoki, S; Uchiyama, T; Ishitsuka, H; Yamamoto, K; Kasahara, K; Fujimoto, M-K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Nagano, S; Tsunesada, Y; Zhu, Zong-Hong; Shintomi, T; Yamamoto, A; Suzuki, T; Saito, Y; Haruyama, T; Sato, N; Higashi, Y; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Aso, Y; Ueda, K-I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Tagoshi, H; Nakamura, T; Sasaki, M; Tanaka, T; Oohara, K; Takahashi, H; Miyakawa, O; Tobar, M E

    2003-01-01

    The large-scale cryogenic gravitational wave telescope (LCGT) project is the proposed advancement of TAMA, which will be able to detect the coalescences of binary neutron stars occurring in our galaxy. LCGT intends to detect the coalescence events within about 240 Mpc, the rate of which is expected to be from 0.1 to several events in a year. LCGT has Fabry-Perot cavities of 3 km baseline and the mirrors are cooled down to a cryogenic temperature of 20 K. It is planned to be built in the underground of Kamioka mine. This paper overviews the revision of the design and the current status of the R and D

  5. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) A large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully manufactured using a large-scale hollow capsule; this mother tube has a high degree of dimensional accuracy. (2) The chemical composition and microstructure of the manufactured mother tube are similar to those of existing mother tubes manufactured with small-scale cans, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long-length cladding was successfully manufactured from the large-scale mother tube made using the large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  6. Policy support for large scale demonstration projects for hydrogen use in transport. Deliverable D 5.1 (Part B)

    International Nuclear Information System (INIS)

    Ros, M.E.; Jeeninga, H.; Godfroij, P.

    2007-06-01

    This research addresses possible policy support mechanisms for hydrogen use in transport, to answer the question of which policy support mechanism is potentially most effective for stimulating hydrogen in transport, and especially for large-scale demonstrations. This is done through two approaches: first, by investigating possible policy support mechanisms for energy innovations; second, by relating these to the different technology development stages (R and D, early market and mass market) and reviewing their effect on different parts of the hydrogen energy chain (production, distribution and end-use). Additionally, a comparison is made between the policy support mechanisms currently used in Europe (at EU level) and in the United States (at national and state level). The analysis shows that in principle various policy support mechanisms can be used to stimulate hydrogen. The choice of a policy support mechanism should depend on the need to reduce the investment cost (euros/MW) or production/use cost (euros/GJ) of a technology, or to increase its performance (euros/kg CO2 avoided), during its development. Careful thought has to be put into the design and choice of a policy support mechanism, because it can have effects on other parts of the hydrogen energy chain, most notably on how hydrogen is produced. The effectiveness of a policy support mechanism greatly depends on its ability to adapt to the development of the technology and the changing requirements that come with technological progress; over time, different policy support mechanisms have to be applied. For demonstration projects there is currently a tendency to apply R and D subsidies in Europe, while the United States applies a variety of policy support mechanisms. The United States not only has higher and broader support for demonstration projects but also has stronger incentives to prepare early market demand (for instance, requiring public procurement and sales obligations). In order to re-establish the level playing field, Europe may

  7. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of large-scale magnetic field. For this purpose, we perform nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at scale 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to the energy transfers from the velocity field at the forcing scales.

  8. Biotechnological lignite conversion - a large-scale concept

    Energy Technology Data Exchange (ETDEWEB)

    Reich-Walber, M.; Meyrahn, H.; Felgener, G.W. [Rheinbraun AG, Koeln (Germany). Fuel Technology and Lab. Dept.

    1997-12-31

    Concerning research on biotechnological lignite upgrading, Rheinbraun's overall objective is the large-scale production of liquid and gaseous products for the energy and chemical/refinery sectors. The presentation outlines Rheinbraun's technical concept for electricity production on the basis of biotechnologically solubilized lignite. A first rough cost estimate, based on the assumptions described in detail in the paper and compared with the latest power plant generation, shows the general cost efficiency of this technology despite the additional costs of coal solubilization. The main reasons are low-cost process techniques for coal conversion on the one hand, and cost reductions mainly in power plant technology (more efficient combustion processes and simplified gas clean-up) but also in coal transport (easy fuel handling) on the other. Moreover, it is hoped that an extended range of products will make it possible to widen the fields of lignite application. The presentation also points out that there is still a huge gap between this scenario and reality, owing to limited microbiological knowledge. To close this gap, Rheinbraun started a research project supported by the North Rhine-Westphalian government in 1995. Several leading biotechnological companies and institutes in Germany and the United States are involved in the project, and its latest results will be presented in the paper. This includes fundamental research activities in the field of microbial coal conversion as well as investigations into bioreactor design and product treatment (dewatering, deashing and desulphurization). (orig.)

  9. Evaluation of energy saving in large scale projects in domestic lighting; Evaluacion del ahorro de energia en proyectos de gran escala en alumbrado domestico

    Energy Technology Data Exchange (ETDEWEB)

    Valera Negrete, Adrian [Comision Federal de Electricidad, Mexico, D.F. (Mexico)

    2001-07-01

    The present work presents the methodology and indicates the parameters necessary to evaluate the energy savings and the reduction in power demand obtained by large-scale projects substituting incandescent lamps with compact, circular and globe-type fluorescent lamps in the domestic sector. [Spanish] El presente trabajo muestra la metodologia y se indican los parametros necesarios para evaluar el ahorro de energia y reduccion de la demanda de potencia obtenidos, por los proyectos de gran escala de sustitucion de focos incandescentes por lamparas fluorescentes compactas, circulares y tipo globo en el sector domestico.
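    The evaluation this record describes comes down to arithmetic on lamp counts, wattage differences and hours of use. The sketch below is illustrative only: the function, its parameters and the example figures (100,000 lamps, 60 W replaced by 15 W, 3 h/day of use, 40% on-peak coincidence) are assumptions for demonstration, not values from the actual projects.

```python
def lighting_savings(n_lamps, watts_old, watts_new, hours_per_day,
                     coincidence_factor=1.0):
    """Annual energy saving (kWh) and peak-demand reduction (kW) from
    substituting n_lamps lamps of watts_old with lamps of watts_new.

    coincidence_factor is the assumed fraction of the replaced lamps
    that are switched on at the time of system peak demand.
    """
    delta_kw = n_lamps * (watts_old - watts_new) / 1000.0
    energy_kwh_per_year = delta_kw * hours_per_day * 365.0
    demand_reduction_kw = delta_kw * coincidence_factor
    return energy_kwh_per_year, demand_reduction_kw

# Assumed example: 100,000 incandescent 60 W lamps replaced by 15 W
# compact fluorescent lamps, 3 h/day of use, 40% on-peak coincidence.
energy, demand = lighting_savings(100_000, 60, 15, 3.0, coincidence_factor=0.4)
```

    For these assumed figures the saving works out to 4,927,500 kWh per year and a 1,800 kW reduction in peak demand; a real evaluation would also correct for lamp lifetimes, replacement rates and measured usage hours.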

  10. Large-Scale Spacecraft Fire Safety Experiments in ISS Resupply Vehicles

    Science.gov (United States)

    Ruff, Gary A.; Urban, David

    2013-01-01

    Our understanding of the fire safety risk in manned spacecraft has been limited by the small scale of the testing we have been able to conduct in low-gravity. Fire growth and spread cannot be expected to scale linearly with sample size so we cannot make accurate predictions of the behavior of realistic scale fires in spacecraft based on the limited low-g testing to date. As a result, spacecraft fire safety protocols are necessarily very conservative and costly. Future crewed missions are expected to be longer in duration than previous exploration missions outside of low-earth orbit and accordingly, more complex in terms of operations, logistics, and safety. This will increase the challenge of ensuring a fire-safe environment for the crew throughout the mission. Based on our fundamental uncertainty of the behavior of fires in low-gravity, the need for realistic scale testing at reduced gravity has been demonstrated. To address this concern, a spacecraft fire safety research project is underway to reduce the uncertainty and risk in the design of spacecraft fire safety systems by testing at nearly full scale in low-gravity. This project is supported by the NASA Advanced Exploration Systems Program Office in the Human Exploration and Operations Mission Directorate. The activity of this project is supported by an international topical team of fire experts from other space agencies to maximize the utility of the data and to ensure the widest possible scrutiny of the concept. The large-scale space flight experiment will be conducted on three missions; each in an Orbital Sciences Corporation Cygnus vehicle after it has deberthed from the ISS. Although the experiment will need to meet rigorous safety requirements to ensure the carrier vehicle does not sustain damage, the absence of a crew allows the fire products to be released into the cabin. The tests will be fully automated with the data downlinked at the conclusion of the test before the Cygnus vehicle reenters the

  11. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global change, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability, and provide future scenarios of water resources. For a better understanding of hydrological changes, it is of crucial importance to determine how, and to what extent, the trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed and of the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrology/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant across time-scales (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrates discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictands: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step.
This approach
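    The discrete wavelet multiresolution analysis invoked above can be illustrated with a minimal Haar decomposition, in which each component isolates variability at one dyadic time-scale and the components sum back to the original signal. This is a sketch under simplifying assumptions (Haar wavelet, signal length divisible by 2**levels); a real study would use a dedicated wavelet toolbox rather than this hand-rolled version.

```python
import numpy as np

def haar_mra(x, levels):
    """Additive Haar multiresolution decomposition of a 1-D signal.

    Returns a list of detail components (one per level, coarse scales
    last) plus the final approximation, each upsampled to len(x) so
    that the components sum exactly back to x.
    """
    x = np.asarray(x, dtype=float)
    components = []
    approx = x
    for _ in range(levels):
        new_approx = approx.reshape(-1, 2).mean(axis=1)   # coarser average
        detail = approx - np.repeat(new_approx, 2)        # scale-local detail
        components.append(np.repeat(detail, x.size // detail.size))
        approx = new_approx
    components.append(np.repeat(approx, x.size // approx.size))
    return components

# A slow sine plus a weak fast oscillation, decomposed over 4 levels.
n = 64
sig = (np.sin(np.linspace(0.0, 4.0 * np.pi, n))
       + 0.1 * np.cos(np.linspace(0.0, 40.0 * np.pi, n)))
parts = haar_mra(sig, levels=4)
recon = np.sum(parts, axis=0)  # reconstructs the original signal
```

    Each returned component plays the role of one temporal wavelength at which a separate large-scale/local-scale relationship can then be fitted.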

  12. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  13. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
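    The decomposition referred to above (splitting a fluctuating velocity signal into a large-scale part and a small-scale residual, then asking whether the large scales modulate the small-scale energy) can be sketched as follows. This is a hedged illustration on a synthetic signal: the moving-average low-pass filter and the rectified-and-smoothed envelope are crude stand-ins for the spectral filtering and Hilbert-transform envelope used in the actual analysis, and all parameters are arbitrary.

```python
import numpy as np

def modulation_coefficient(u, window):
    """Correlation between the large-scale part of a signal and the
    smoothed envelope of its small-scale residual.

    A moving average of length `window` acts as a crude low-pass
    filter; the rectified residual smoothed with the same window
    stands in for a Hilbert-transform envelope.
    """
    kernel = np.ones(window) / window
    large = np.convolve(u, kernel, mode="same")   # large-scale component
    small = u - large                             # small-scale residual
    envelope = np.convolve(np.abs(small), kernel, mode="same")
    return np.corrcoef(large, envelope)[0, 1]

# Synthetic test signal: a slow wave whose local amplitude gates fast
# Gaussian noise, i.e. amplitude modulation built in by construction.
rng = np.random.default_rng(1)
t = np.linspace(0.0, 100.0, 20_000)
slow = np.sin(2.0 * np.pi * 0.05 * t)
fast = (1.0 + 0.8 * slow) * rng.standard_normal(t.size)
r = modulation_coefficient(slow + fast, window=401)
```

    A clearly positive correlation coefficient, which this synthetic example produces by construction, is the signature of amplitude modulation of the small scales by the large scales.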

  14. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    multiple jurisdictions. The SaskRB has therefore been developed as a large scale observatory, now a Regional Hydroclimate Project of the World Climate Research Programme's GEWEX project, and is available to contribute to the emerging North American Water Program. State-of-the-art hydro-ecological experimental sites have been developed for the key biomes, and a river and lake biogeochemical research facility, focussed on impacts of nutrients and exotic chemicals. Data are integrated at SaskRB scale to support the development of improved large scale climate and hydrological modelling products, the development of DSS systems for local, provincial and basin-scale management, and the development of related social science research, engaging stakeholders in the research and exploring their values and priorities for water security. The observatory provides multiple scales of observation and modelling required to develop: a) new climate, hydrological and ecological science and modelling tools to address environmental change in key environments, and their integrated effects and feedbacks at large catchment scale, b) new tools needed to support river basin management under uncertainty, including anthropogenic controls on land and water management and c) the place-based focus for the development of new transdisciplinary science.

  15. The status of US Teraflops-scale projects

    International Nuclear Information System (INIS)

    Mawhinney, R.D.

    1995-01-01

    The current status of United States projects pursuing Teraflops-scale computing resources for lattice field theory is discussed. Two projects are in existence at this time: the Multidisciplinary Teraflops Project, incorporating the physicists of the QCD Teraflops Collaboration, and a smaller project, centered at Columbia, involving the design and construction of a 0.8 Teraflops computer primarily for QCD. (orig.)

  16. Contracting Economics of Large Engineering and Construction Projects

    NARCIS (Netherlands)

    Berends, T.C.

    2007-01-01

    Large Engineering and Construction Projects (LECPs) form an important area of economic activity, covering a range of different artefacts. These projects have in common that they are massive undertakings, spanning long time periods and they involve large capital investments. Uncertainty and risk are

  17. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years many PKI (Public Key Infrastructure) schemes have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKI infrastructures face a plethora of challenges, especially those deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure facilitates handling of the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.

  18. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  19. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  20. Active self-testing noise measurement sensors for large-scale environmental sensor networks.

    Science.gov (United States)

    Domínguez, Federico; Cuong, Nguyen The; Reinoso, Felipe; Touhafi, Abdellah; Steenhaut, Kris

    2013-12-13

    Large-scale noise pollution sensor networks consist of hundreds of spatially distributed microphones that measure environmental noise. These networks provide historical and real-time environmental data to citizens and decision makers and are therefore a key technology for steering environmental policy. However, the high cost of certified environmental microphone sensors renders large-scale environmental networks prohibitively expensive. Several environmental network projects have started using off-the-shelf low-cost microphone sensors to reduce their costs, but these sensors have higher failure rates and produce lower-quality data. To offset this disadvantage, we developed a low-cost noise sensor that actively checks its condition and, indirectly, the integrity of the data it produces. The main design concept is to embed a 13 mm speaker in the noise sensor casing and, by regularly scheduling a frequency sweep, estimate the evolution of the microphone's frequency response over time. This paper presents our noise sensor's hardware and software design together with the results of a test deployment in a large-scale environmental network in Belgium. Our middle-range-value sensor (around €50) effectively detected all experienced malfunctions, in laboratory tests and outdoor deployments, with a few false positives. Future improvements could further lower the cost of our sensor below €10.
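    The self-test described in this abstract can be sketched as a small routine: compare the band-limited magnitude response of each scheduled sweep against a stored healthy baseline and flag the sensor when any band drifts beyond a tolerance. The band edges, tolerance, and function names below are illustrative assumptions, not details from the paper.

```python
import numpy as np

def band_magnitudes(signal, fs, bands):
    """Mean FFT magnitude (in dB) of `signal` within each (lo, hi) frequency band."""
    spec = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1 / fs)
    out = []
    for lo, hi in bands:
        sel = (freqs >= lo) & (freqs < hi)
        out.append(20 * np.log10(spec[sel].mean() + 1e-12))
    return np.array(out)

def sweep_self_test(recorded, baseline, fs, bands, tol_db=6.0):
    """Compare a recorded sweep response against the stored healthy baseline.

    Returns True when every band stays within tol_db of the baseline, i.e. the
    microphone's frequency response has not drifted beyond tolerance."""
    drift = band_magnitudes(recorded, fs, bands) - baseline
    return bool(np.all(np.abs(drift) < tol_db))
```

    In a deployment, the baseline would be captured at installation time and the drift per band logged with each scheduled sweep, so that gradual microphone degradation is visible before the sensor fails outright.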

  1. Large-scale coral reef restoration could assist natural recovery in Seychelles, Indian Ocean

    Directory of Open Access Journals (Sweden)

    Phanor Hernando Montoya Maya

    2016-11-01

    Full Text Available The aim of ecological restoration is to establish self-sustaining and resilient systems. In coral reef restoration, transplantation of nursery-grown corals is seen as a potential method to mitigate reef degradation and enhance recovery. The transplanted reef should be capable of recruiting new juvenile corals to ensure long-term resilience. Here, we quantified how coral transplantation influenced natural coral recruitment at a large-scale coral reef restoration site in Seychelles, Indian Ocean. Between November 2011 and June 2014, a total of 24,431 nursery-grown coral colonies from 10 different coral species were transplanted onto 5,225 m2 (0.52 ha) of degraded reef at the no-take marine reserve of Cousin Island Special Reserve in an attempt to assist natural reef recovery. We present the results of research and monitoring conducted before and after coral transplantation to evaluate the positive effect that the project had on coral recruitment and reef recovery at the restored site. We quantified the density of coral recruits (spat, <1 cm) and juveniles (colonies 1-5 cm) at the transplanted site, a degraded control site and a healthy control site at the marine reserve. We used ceramic tiles to estimate coral settlement and visual surveys with 1 m2 quadrats to estimate coral recruitment. Six months after tile deployment, total spat density at the transplanted site (123.4 ± 13.3 spat m-2) was 1.8 times that at the healthy site (68.4 ± 7.8 spat m-2) and 1.6 times that at the degraded site (78.2 ± 7.17 spat m-2). Two years after the first transplantation, the total recruit density was highest at the healthy site (4.8 ± 0.4 recruits m-2), intermediate at the transplanted site (2.7 ± 0.4 recruits m-2), and lowest at the degraded site (1.7 ± 0.3 recruits m-2). The results suggest that large-scale coral restoration may have a positive influence on coral recruitment and juveniles. The effect of key project techniques on the results is discussed. This study supports
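    As a quick arithmetic check, the ratios quoted in the abstract follow directly from the reported mean spat densities:

```python
# Mean spat densities (spat per m^2) six months after tile deployment,
# as reported in the abstract.
transplanted, healthy, degraded = 123.4, 68.4, 78.2

# The transplanted site's density relative to the two control sites.
assert round(transplanted / healthy, 1) == 1.8   # vs. healthy control site
assert round(transplanted / degraded, 1) == 1.6  # vs. degraded control site
```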

  2. Large Pilot-Scale Carbon Dioxide (CO2) Capture Project Using Aminosilicone Solvent. Final Scientific/Technical Report

    Energy Technology Data Exchange (ETDEWEB)

    Hancu, Dan [General Electric Company, Niskayuna, NY (United States)

    2017-12-21

    GE Global Research has developed, over the last 8 years, a platform of cost-effective CO2 capture technologies based on a non-aqueous aminosilicone solvent (GAP-1m). As demonstrated in previous DOE-funded projects (DE-FE0007502 and DE-FE0013755), the GAP-1m solvent has a higher CO2 working capacity and lower volatility and corrosivity than the benchmark aqueous amine technology. Performance of the GAP-1m solvent was recently demonstrated in a 0.5 MWe pilot at the National Carbon Capture Center, AL, with real flue gas for over 500 hours of operation using a Steam Stripper Column (SSC). The pilot-scale PSTU engineering data were used to (i) update the techno-economic analysis and EH&S assessment, (ii) perform a technology gap analysis, and (iii) conduct the solvent manufacturability and scale-up study.

  3. Large-scale sulfolane-impacted soil remediation at a gas plant

    Energy Technology Data Exchange (ETDEWEB)

    Lavoie, G.; Rockwell, K. [Biogenie Inc., Calgary, AB (Canada)

    2006-07-01

    A large-scale sulfolane-impacted soil remediation project at a gas plant in central Alberta was discussed. The plant has been operational from the 1960s to the present, and its former operation involved the Sulfinol process, which resulted in groundwater contamination. In 2005, the client wanted to address the source area. The Sulfinol process has been used since the 1960s to remove hydrogen sulfide and other corrosive gases from natural gas streams. Sulfinol uses sulfolane and diisopropanolamine. Sulfolane is toxic, non-volatile, and water soluble. The presentation also addressed the remediation objectives and an additional site assessment that was conducted to better delineate the sulfolane and sulphur plume, as well as metals. The findings of the ESA and site-specific challenges were presented. These challenges included: plant operation concerns; numerous overhead, surface, and underground structures; a large volume of impacted material; limited space available on site; several types of contaminants; and the time required to perform the overall work. Next, the sulfolane remediation strategy was discussed, including advantages and results of the investigation. Last, the results of the project were presented. It was found that there were no recordable safety incidents and that all remedial objectives were achieved. tabs., figs.

  4. Large-scale road safety programmes in low- and middle-income countries: an opportunity to generate evidence.

    Science.gov (United States)

    Hyder, Adnan A; Allen, Katharine A; Peters, David H; Chandran, Aruna; Bishai, David

    2013-01-01

    The growing burden of road traffic injuries, which kill over 1.2 million people yearly, falls mostly on low- and middle-income countries (LMICs). Despite this, evidence generation on the effectiveness of road safety interventions in LMIC settings remains scarce. This paper explores a scientific approach for evaluating road safety programmes in LMICs and introduces such a road safety multi-country initiative, the Road Safety in 10 Countries Project (RS-10). By building on existing evaluation frameworks, we develop a scientific approach for evaluating large-scale road safety programmes in LMIC settings. This also draws on '13 lessons' of large-scale programme evaluation: defining the evaluation scope; selecting study sites; maintaining objectivity; developing an impact model; utilising multiple data sources; using multiple analytic techniques; maximising external validity; ensuring an appropriate time frame; the importance of flexibility and a stepwise approach; continuous monitoring; providing feedback to implementers, policy-makers; promoting the uptake of evaluation results; and understanding evaluation costs. The use of relatively new approaches for evaluation of real-world programmes allows for the production of relevant knowledge. The RS-10 project affords an important opportunity to scientifically test these approaches for a real-world, large-scale road safety evaluation and generate new knowledge for the field of road safety.

  5. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  6. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  7. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  8. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. (The remainder of the record is garbled report front matter; a surviving fragment concerns reducing the spread in the LSGT 50% gap value by using the worst charges, such as those with the highest or lowest densities.)

  9. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  10. Large-scale wind energy and birds in Friesland: A survey of the possible hindrance for birds of the project Windstreek: Environmental Effect Report Windstreek, Base report A

    International Nuclear Information System (INIS)

    Van den Bergh, L.M.J.; Spaans, A.L.

    1994-01-01

    For several alternative sites in the Dutch province of Friesland, where it is planned to install 500 kW and 1 MW wind turbines on a large scale (the Windstreek project), the possible hindrance for breeding birds, resting birds and migratory birds has been investigated. First an overview is given of basic data from the literature and experts in the field with regard to the possible hindrances for birds and the situation of the birds in the areas considered. Next, several criteria are used to determine which locations are the most suitable sites for placing the wind turbines. For the 500 kW wind turbines (worst case) the suitability differs strongly per site. The variation is the largest for the option of decentralized sites. Regarding the effects on birds, realization of 200 MW of large-scale wind power by means of decentralized sites is the least unfavourable option. The super-structure option Afsluitdijk-Waddenkust and the Project Memorandum alternative are the least favourable options. The combined alternative and the super-structure Middelzeetrechter take an in-between position. The 1 MW variants of the alternatives score better than the similar 500 kW options. The hindrance for birds can be reduced further by applying two-bladed rotors instead of three-bladed rotors and by adjusting the siting. 20 figs., 9 tabs., 2 appendices, 42 refs

  11. Large Scale Simulation Platform for NODES Validation Study

    Energy Technology Data Exchange (ETDEWEB)

    Sotorrio, P. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Qin, Y. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Min, L. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2017-04-27

    This report summarizes the Large Scale (LS) simulation platform created for the Eaton NODES project. The simulation environment consists of both a wholesale market simulator and a distribution simulator, and includes the CAISO wholesale market model and a PG&E footprint of 25-75 feeders to validate scalability under a scenario of 33% RPS in California with an additional 17% of DERs coming from distribution and customers. The simulator can generate hourly unit commitment, 5-minute economic dispatch, and 4-second AGC regulation signals. The simulator is also capable of simulating greater than 10k individual controllable devices. Simulated DERs include water heaters, EVs, residential and light commercial HVAC/buildings, and residential-level battery storage. Feeder-level voltage regulators and capacitor banks are also simulated for feeder-level real and reactive power management and Volt/VAR control.

  12. MicroEcos: Micro-Scale Explorations of Large-Scale Late Pleistocene Ecosystems

    Science.gov (United States)

    Gellis, B. S.

    2017-12-01

    Pollen data can inform the reconstruction of early floral environments by providing data for artistic representations of what early terrestrial ecosystems looked like and how existing terrestrial landscapes have evolved. For example, what did the Bighorn Basin look like when large ice sheets covered modern Canada, the Yellowstone Plateau had an ice cap, and the Bighorn Mountains were mantled with alpine glaciers? MicroEcos is an immersive, multimedia project that aims to strengthen human-nature connections through the understanding and appreciation of biological ecosystems. Collected pollen data elucidate flora that are visible in the fossil record - associated with the Late Pleistocene - and have been illustrated and described in botanical literature. The project aims to make scientific data accessible and interesting to all audiences through a series of interactive digital sculptures, large-scale photography and field-based videography. While this project is driven by scientific data, it is rooted in deeply artistic and outreach-based practices, including digital design, illustration, photography, video and sound design. Using 3D modeling and printing technology, MicroEcos centers on a series of 3D-printed models of the Last Canyon rock shelter on the Wyoming and Montana border, the Little Windy Hill pond site in Wyoming's Medicine Bow National Forest, and the Natural Trap Cave site in Wyoming's Big Horn Basin. These digital, interactive 3D sculptures provide audiences with glimpses of three-dimensional Late Pleistocene environments and help create a dialogue about how grass-, sagebrush-, and spruce-based ecosystems form. To help audiences better contextualize how MicroEcos bridges notions of time, space, and place, modern photography and videography of the Last Canyon, Little Windy Hill and Natural Trap Cave sites surround these 3D digital reconstructions.

  13. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  14. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  15. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  16. Infrastructure for Large-Scale Quality-Improvement Projects: Early Lessons from North Carolina Improving Performance in Practice

    Science.gov (United States)

    Newton, Warren P.; Lefebvre, Ann; Donahue, Katrina E.; Bacon, Thomas; Dobson, Allen

    2010-01-01

    Introduction: Little is known regarding how to accomplish large-scale health care improvement. Our goal is to improve the quality of chronic disease care in all primary care practices throughout North Carolina. Methods: Methods for improvement include (1) common quality measures and shared data system; (2) rapid cycle improvement principles; (3)…

  17. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained by machine learning techniques, particularly supervised kernel principal component analysis. In this reduced-dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability at which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large-scale).
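    The dimension-reduction step described above can be illustrated with a minimal, self-contained kernel PCA: build an RBF kernel over the (flattened) moisture-flux fields, double-center it, and project onto the leading eigenvectors. This sketch is plain unsupervised numpy code standing in for the supervised variant the authors use; the function name and parameters are illustrative assumptions.

```python
import numpy as np

def rbf_kernel_pca(X, gamma=1.0, n_components=2):
    """Project the rows of X onto the leading components of centered RBF kernel PCA.

    X: (n_samples, n_features) array, e.g. flattened moisture-flux fields.
    Returns an (n_samples, n_components) array of low-dimensional coordinates."""
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2 * X @ X.T       # pairwise squared distances
    K = np.exp(-gamma * d2)                            # RBF (Gaussian) kernel matrix
    n = K.shape[0]
    one = np.ones((n, n)) / n
    Kc = K - one @ K - K @ one + one @ K @ one         # double-center the kernel
    vals, vecs = np.linalg.eigh(Kc)                    # eigenvalues in ascending order
    idx = np.argsort(vals)[::-1][:n_components]        # pick the largest ones
    return vecs[:, idx] * np.sqrt(np.maximum(vals[idx], 0))
```

    Clustering (e.g. k-means) would then be run on the returned coordinates to group flood events by their dominant moisture pathway.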

  18. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  19. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  20. Scaling the quality of clinical audit projects: a pilot study.

    Science.gov (United States)

    Millard, A D

    1999-06-01

    To pilot the development of a scale measuring the quality of audit projects through audit project reports. Statements about clinical audit projects were selected from existing instruments assessing the quality of clinical audit projects, to form a Likert scale. The audit facilitators were based in Scottish health boards and trusts. The participants were audit facilitators known to have over 2 years' experience of supporting clinical audit. The response at the first test was 11 out of 14 and at the second test it was 27 out of 46. The draft scale was tested by 27 audit facilitators who expressed their strength of agreement or disagreement with each statement for three reports. Validity was assessed by test-retest, item-total, and total-global indicator correlations. Of the 20 statements, 15 had satisfactory correlations with scale totals. Scale totals had good correlations with global indicators. Test-retest correlation was modest. The wide range of responses means further research is needed to measure the consistency of audit facilitators' interpretations, perhaps comparing a trained group with an untrained group. There may be a need for a separate scale for re-audits. Educational impact is distinct from project impact generally. It may be more meaningful to treat the selection of projects and aims, methodology and impact separately as subscales and take a project-profiling approach rather than attempting to produce a global quality index.
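    The item-total screening used above (keeping statements whose correlation with the scale total is satisfactory) can be sketched as follows. The corrected form, correlating each item with the total excluding that item, is the usual variant; the function name and any thresholds are illustrative, not taken from the study.

```python
import numpy as np

def corrected_item_total(responses):
    """Corrected item-total correlations for a Likert-scale instrument.

    responses: (n_respondents, n_items) array of item scores.
    Each item is correlated with the scale total *excluding* that item,
    the usual screen for statements that fit poorly with the rest."""
    responses = np.asarray(responses, dtype=float)
    total = responses.sum(axis=1)
    return np.array([
        np.corrcoef(responses[:, j], total - responses[:, j])[0, 1]
        for j in range(responses.shape[1])
    ])
```

    Statements whose corrected correlation falls below a chosen cut-off (0.3 is a common rule of thumb) would be candidates for removal from the draft scale.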

  1. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  2. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    12 pages, 2 figures. This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  3. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  4. Why are small scale demonstration projects important for the future of CCS?

    Science.gov (United States)

    Leetaru, H. E.; Bauer, R. A.; McBride, J. H.; Freiburg, J. T.; Greenberg, S. E.

    2017-12-01

    Carbon Capture and Storage (CCS) is moving toward large-scale commercial projects, and the U.S. Department of Energy is supporting a new CarbonSAFE initiative to assist in the development of a 50 million tonne geologic storage project. This type of large commercial CCS project will rely on lessons learned from smaller DOE CCS projects such as the Illinois Basin-Decatur Project (IBDP) and the Illinois Industrial Carbon Capture and Storage (IL-ICCS) Project located one mile north of IBDP. Over a three-year period ending in 2014, the IBDP injected almost one million tonnes of CO2 into the Mt. Simon Sandstone, and the IL-ICCS project, which commenced injection in 2017, will inject another four million tonnes over a four-year period. The IBDP has recorded microseismic events within the study area through continuous downhole seismic monitoring before, during, and after injection. Monitoring shows that microseismicity increased during injection and originates not only in the Cambrian Mt. Simon Sandstone (the target reservoir), but also in the underlying Argenta clastics and deeper Precambrian igneous rocks, as SW-NE elongate clusters aligned in strike with the maximum in situ stress direction. An interpretation of site 3D seismic reflection data suggests that much of the microseismicity is proximal to interpreted faults that extend from the basement up into the lowermost Mt. Simon strata. The faults proximally associated with microseismic activity are oriented parallel to the maximum stress direction. The seismic monitoring of the IBDP indicates that the assessment of induced seismic potential associated with commercial-scale CCS requires not only identification of a suitable reservoir and its petrophysical characteristics, but also the extent and orientation of existing faults and their relation to the regional stress orientation. Assessment of regional fault orientation using 3D seismic reflection data can be extremely useful for understanding the risks of induced seismicity.

  5. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    Science.gov (United States)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

    With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey, rather than hypothesis-driven, mode of observation to generate comprehensive, high quality data on a large-scale for exploration and discovery; making data freely and openly available to any investigator from the very onset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. Finally, community

  6. LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY

    International Nuclear Information System (INIS)

    SCHULTZ, K.R.; BROWN, L.C.; BESENBRUCH, G.E.; HAMILTON, C.J.

    2003-01-01

    OAK B202 LARGE-SCALE PRODUCTION OF HYDROGEN BY NUCLEAR ENERGY FOR THE HYDROGEN ECONOMY. The ''Hydrogen Economy'' will reduce petroleum imports and greenhouse gas emissions. However, current commercial hydrogen production processes use fossil fuels and release carbon dioxide. Hydrogen produced from nuclear energy could avoid these concerns. The authors have recently completed a three-year project for the US Department of Energy whose objective was to ''define an economically feasible concept for production of hydrogen, by nuclear means, using an advanced high-temperature nuclear reactor as the energy source''. Thermochemical water-splitting, a chemical process that accomplishes the decomposition of water into hydrogen and oxygen, met this objective. The goal of the first phase of this study was to evaluate thermochemical processes which offer the potential for efficient, cost-effective, large-scale production of hydrogen and to select one for further detailed consideration. The authors selected the Sulfur-Iodine cycle. In the second phase, they reviewed all the basic reactor types for suitability to provide the high-temperature heat needed by the selected thermochemical water-splitting cycle and chose the helium gas-cooled reactor. In the third phase they designed the chemical flowsheet for the thermochemical process and estimated the efficiency and cost of the process and the projected cost of producing hydrogen. These results are summarized in this paper.

  7. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In future power systems with additional wind power capacity there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  8. Ecogrid EU: a large scale smart grids demonstration of real time market-based integration of numerous small der and DR

    NARCIS (Netherlands)

    Ding, Y.; Nyeng, P.; Ostergaard, J.; Trong, M.D.; Pineda, S.; Kok, K.; Huitema, G.B.; Grande, O.S.

    2012-01-01

    This paper provides an overview of the Ecogrid EU project, which is a large-scale demonstration project on the Danish island Bornholm. It provides Europe with a fast-track evolution towards smart grid dissemination and deployment in the distribution network. The objective of Ecogrid EU is to illustrate that

  9. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at larger times before reaching a stable state, where Heaps' law still holds while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of disease. Employing the United States domestic air transportation and demographic data to construct a metapopulation model for simulating the pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease.
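
    The interplay between the two laws described above can be illustrated with a minimal, standard-library sketch (the distribution parameters are invented for illustration): events are drawn from a Zipf-like rank-frequency distribution, and the number of distinct events seen over time, the quantity governed by Heaps' law, grows sublinearly.

```python
import random

def zipf_sample(n_items, s, size, seed=0):
    """Draw `size` events whose rank-frequency follows a Zipf-like power law."""
    rng = random.Random(seed)
    weights = [1.0 / (rank ** s) for rank in range(1, n_items + 1)]
    return rng.choices(range(1, n_items + 1), weights=weights, k=size)

def heaps_curve(events):
    """Heaps' law quantity: number of distinct events after each draw."""
    seen, curve = set(), []
    for e in events:
        seen.add(e)
        curve.append(len(seen))
    return curve

events = zipf_sample(n_items=10_000, s=1.1, size=50_000)
curve = heaps_curve(events)
# Sublinear growth: doubling the number of draws far less than
# doubles the number of distinct events observed.
growth = curve[-1] / curve[len(curve) // 2 - 1]
```

    Fitting `curve` against the draw index on log-log axes would recover a Heaps exponent; the sketch only demonstrates the qualitative sublinearity.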

  10. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  11. EU-Great! European guide and recommendations for the combined funding of large-scale RDI initiatives

    NARCIS (Netherlands)

    Heide, M.J.L. de; Butter, M.

    2016-01-01

    This report results from the H2020 project entitled “EU-GREAT!”.1 The objective of the report is to assess how to finance the set-up and/or continuation of what is defined as Large-scale RDI initiatives (LSIs, see Box 1). The report analyses the potential use of different public and private funding

  12. Large-scale tropospheric transport in the Chemistry-Climate Model Initiative (CCMI) simulations

    Science.gov (United States)

    Orbe, Clara; Yang, Huang; Waugh, Darryn W.; Zeng, Guang; Morgenstern, Olaf; Kinnison, Douglas E.; Lamarque, Jean-Francois; Tilmes, Simone; Plummer, David A.; Scinocca, John F.; Josse, Beatrice; Marecal, Virginie; Jöckel, Patrick; Oman, Luke D.; Strahan, Susan E.; Deushi, Makoto; Tanaka, Taichu Y.; Yoshida, Kohei; Akiyoshi, Hideharu; Yamashita, Yousuke; Stenke, Andreas; Revell, Laura; Sukhodolov, Timofei; Rozanov, Eugene; Pitari, Giovanni; Visioni, Daniele; Stone, Kane A.; Schofield, Robyn; Banerjee, Antara

    2018-05-01

    Understanding and modeling the large-scale transport of trace gases and aerosols is important for interpreting past (and projecting future) changes in atmospheric composition. Here we show that there are large differences in the global-scale atmospheric transport properties among the models participating in the IGAC SPARC Chemistry-Climate Model Initiative (CCMI). Specifically, we find up to 40 % differences in the transport timescales connecting the Northern Hemisphere (NH) midlatitude surface to the Arctic and to Southern Hemisphere high latitudes, where the mean age ranges between 1.7 and 2.6 years. We show that these differences are related to large differences in vertical transport among the simulations, in particular to differences in parameterized convection over the oceans. While stronger convection over NH midlatitudes is associated with slower transport to the Arctic, stronger convection in the tropics and subtropics is associated with faster interhemispheric transport. We also show that the differences among simulations constrained with fields derived from the same reanalysis products are as large as (and in some cases larger than) the differences among free-running simulations, most likely due to larger differences in parameterized convection. Our results indicate that care must be taken when using simulations constrained with analyzed winds to interpret the influence of meteorology on tropospheric composition.

  13. Large-scale tropospheric transport in the Chemistry–Climate Model Initiative (CCMI) simulations

    Directory of Open Access Journals (Sweden)

    C. Orbe

    2018-05-01

    Full Text Available Understanding and modeling the large-scale transport of trace gases and aerosols is important for interpreting past (and projecting future) changes in atmospheric composition. Here we show that there are large differences in the global-scale atmospheric transport properties among the models participating in the IGAC SPARC Chemistry–Climate Model Initiative (CCMI). Specifically, we find up to 40 % differences in the transport timescales connecting the Northern Hemisphere (NH) midlatitude surface to the Arctic and to Southern Hemisphere high latitudes, where the mean age ranges between 1.7 and 2.6 years. We show that these differences are related to large differences in vertical transport among the simulations, in particular to differences in parameterized convection over the oceans. While stronger convection over NH midlatitudes is associated with slower transport to the Arctic, stronger convection in the tropics and subtropics is associated with faster interhemispheric transport. We also show that the differences among simulations constrained with fields derived from the same reanalysis products are as large as (and in some cases larger than) the differences among free-running simulations, most likely due to larger differences in parameterized convection. Our results indicate that care must be taken when using simulations constrained with analyzed winds to interpret the influence of meteorology on tropospheric composition.

  14. LACOMERA - large scale experiments on core degradation, melt retention and coolability at the Forschungszentrum Karlsruhe

    International Nuclear Information System (INIS)

    Miassoedov, A.; Alsmeyer, H.; Meyer, L.

    2003-01-01

    The LACOMERA project at the Forschungszentrum Karlsruhe is a 3-year shared-cost action within the Fifth Framework Programme which started in September 2002. The overall objectives of the LACOMERA project are to provide research institutions from the EU member countries and associated states access to large-scale experimental facilities at the Forschungszentrum Karlsruhe, which shall be used to increase the knowledge of the quenching of a degraded core and regaining melt coolability in the reactor pressure vessel, of possible melt dispersion to the cavity, of molten core concrete interaction and of ex-vessel melt coolability. One major aspect is to understand how these events affect the safety of European reactors so as to lead to soundly-based accident management procedures. The project will bring together interested partners of different European member states in the area of severe accident analysis and control, with the goal to increase public confidence in the use of nuclear energy. Moreover, partners from the newly associated states should be included as far as possible, and therefore the needs of Eastern, as well as Western, reactors will be considered in the LACOMERA project. The project offers a unique opportunity to get involved in the networks and activities supporting VVER safety, and for Eastern experts to get access to large-scale experimental facilities in a Western research organisation to improve understanding of material properties and core behaviour under severe accident conditions. As a result of the first call for proposals, a project on air ingress tests in the QUENCH facility has been selected. A second call for proposals is open with a deadline of 31 December 2003. (author)

  15. Geophysical mapping of complex glaciogenic large-scale structures

    DEFF Research Database (Denmark)

    Høyer, Anne-Sophie

    2013-01-01

    This thesis presents the main results of a four year PhD study concerning the use of geophysical data in geological mapping. The study is related to the Geocenter project, “KOMPLEKS”, which focuses on the mapping of complex, large-scale geological structures. The study area is approximately 100 km2...... data types and co-interpret them in order to improve our geological understanding. However, in order to perform this successfully, methodological considerations are necessary. For instance, a structure indicated by a reflection in the seismic data is not always apparent in the resistivity data...... information) can be collected. The geophysical data are used together with geological analyses from boreholes and pits to interpret the geological history of the hill-island. The geophysical data reveal that the glaciotectonic structures truncate at the surface. The directions of the structures were mapped...

  16. Managing the continuum certainty, uncertainty, unpredictability in large engineering projects

    CERN Document Server

    Caron, Franco

    2013-01-01

    The brief will describe how to develop a risk analysis applied to a project, through a sequence of steps: risk management planning, risk identification, risk classification, risk assessment, risk quantification, risk response planning, risk monitoring and control, process close-out and lessons learned. The project risk analysis and management process will be applied to large engineering projects, in particular related to the oil and gas industry. The brief will address the overall range of possible events affecting the project, moving from certainty (project issues) through uncertainty (project risks) to unpredictability (unforeseeable events), considering both negative and positive events. Some quantitative techniques (simulation, event tree, Bayesian inference, etc.) will be used to develop risk quantification. The brief addresses a typical subject in the area of project management, with reference to large engineering projects concerning the realization of large plants and infrastructures. These projects a...

  17. Comprehensive Evaluation of Large Infrastructure Project Plan with ANP

    Institute of Scientific and Technical Information of China (English)

    HAN Chuan-feng; CHEN Jian-ye

    2005-01-01

    Analytic Network Process (ANP) was used in the comprehensive evaluation of large infrastructure project plans. A model including social economy, ecological environment, and resources was established with the ANP method. The evaluation pattern of hierarchy structure and a comprehensive evaluation method for quantity and quality of large infrastructure projects were put forward, which provides an effective way to evaluate large infrastructure project plans. Quantitative analysis indicated that the internal dependence relation of the hierarchy structure has influence on the ranking results of plans. It is suggested that considering the internal relation can help managers make effective decisions.
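
    The core numerical step behind ANP, alluded to above, is raising a column-stochastic "supermatrix" of interdependence weights to successive powers until it converges, at which point every column holds the global priorities. A minimal sketch with an invented 3-criterion matrix (the weights are illustrative, not from the paper):

```python
def limit_supermatrix(W, iters=200):
    """Raise a column-stochastic supermatrix W to a high power.

    In ANP, the columns of the resulting limit matrix give the global
    priorities that account for inner dependence among criteria.
    """
    n = len(W)
    M = [row[:] for row in W]
    for _ in range(iters):
        M = [[sum(M[i][k] * W[k][j] for k in range(n)) for j in range(n)]
             for i in range(n)]
    return M

# Hypothetical interdependence weights among three criteria
# (social economy, ecological environment, resources); columns sum to 1.
W = [[0.5, 0.3, 0.2],
     [0.3, 0.4, 0.5],
     [0.2, 0.3, 0.3]]
limit = limit_supermatrix(W)
priorities = [limit[i][0] for i in range(3)]  # any column of the limit matrix
```

    The abstract's observation that internal dependence changes the ranking corresponds to the fact that `priorities` generally differs from the ranking implied by any single column of `W`.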

  18. Simulation research on the process of large scale ship plane segmentation intelligent workshop

    Science.gov (United States)

    Xu, Peng; Liao, Liangchuang; Zhou, Chao; Xue, Rui; Fu, Wei

    2017-04-01

    The large-scale ship plane segmentation intelligent workshop is a new development, and there is no prior research in related fields at home or abroad. The mode of production should be transformed from the existing Industry 2.0, or partial Industry 3.0, level: from "human brain analysis and judgment + machine manufacturing" to "machine analysis and judgment + machine manufacturing". In this transformation a great many tasks need to be settled on the management and technology sides, such as workshop structure evolution, development of intelligent equipment and changes in the business model. Along with them comes the reformation of the whole workshop. Process simulation in this project would verify the general layout and process flow of the large-scale ship plane section intelligent workshop, and would analyze the intelligent workshop's working efficiency, which is significant for the next step of the transformation of the plane segmentation intelligent workshop.

  19. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  20. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises some questions about the reliability and precision of such calculations. These problems become more pronounced in cases of elastic-plastic conditions of loading and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradient changes through material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and the requirements for good transfer of the results to an actual vessel. At the same time, an analysis of the possibilities of small-scale model experiments is also shown, mostly in connection with the application of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing in SKODA is used as an example to support this analysis. 1 fig

  1. The XChemExplorer graphical workflow tool for routine or large-scale protein-ligand structure determination.

    Science.gov (United States)

    Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Collins, Patrick; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian; von Delft, Frank

    2017-03-01

    XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein-ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235-242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213-221] have entrenched the paradigm that a `project' is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement up until data dissemination. It provides multiple entry points depending on the need of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system, and the only dependency is to have the latest version of CCP4 installed. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects.
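
    The abstract's pattern of recording metadata, progress and annotations for a thousand-dataset batch in an SQLite database can be sketched as follows. The table and column names are illustrative assumptions, not XCE's actual schema, and the upsert syntax requires SQLite 3.24 or newer:

```python
import sqlite3

# Illustrative schema only: one row per crystal data set, with the
# processing stage and a free-text annotation tracked in one place.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE datasets (
        crystal_id TEXT PRIMARY KEY,
        stage      TEXT NOT NULL,   -- e.g. 'processed', 'refined', 'deposited'
        annotation TEXT
    )""")

def set_stage(crystal_id, stage, note=None):
    """Upsert one data set's progress; keep the old annotation if none given."""
    conn.execute(
        "INSERT INTO datasets (crystal_id, stage, annotation) VALUES (?, ?, ?) "
        "ON CONFLICT(crystal_id) DO UPDATE SET stage = excluded.stage, "
        "annotation = COALESCE(excluded.annotation, annotation)",
        (crystal_id, stage, note))

for i in range(1000):                     # batch-register a fragment campaign
    set_stage(f"x{i:04d}", "processed")
set_stage("x0007", "refined", "ligand bound in active site")

n_refined = conn.execute(
    "SELECT COUNT(*) FROM datasets WHERE stage = 'refined'").fetchone()[0]
```

    Because every state change is an upsert keyed on the crystal identifier, progress queries such as the `COUNT(*)` above stay trivial even for campaigns of 1000 data sets or more.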

  2. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    ..., which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  3. An establishment on the hazard mitigation system of large scale landslides for Zengwen reservoir watershed management in Taiwan

    Science.gov (United States)

    Tsai, Kuang-Jung; Lee, Ming-Hsi; Chen, Yie-Ruey; Huang, Meng-Hsuan; Yu, Chia-Ching

    2016-04-01

    Extremely heavy rainfall, with an accumulated amount of more than 2900 mm within a continuous 3-day event, occurred in southern Taiwan and has been recognized as a serious natural hazard caused by typhoon Morakot in August 2009. Very destructive large-scale landslides and debris flows were induced by this heavy rainfall event. A satellite image processing and monitoring project was conducted by the Soil & Water Conservation Bureau after typhoon Morakot. More than 10904 landslide sites with a total sliding area of 18113 ha were identified by this project. Field investigations of all landslide areas were also executed by this research on the basis of disaster type, scale and location in relation to the topographic conditions, colluvium soil characteristics, bedrock formation and geological structure after the Morakot hazard. The mechanism, characteristics and behavior of these large-scale landslides combined with debris flow disasters are analyzed and investigated to work out the interaction of the factors concerned above and to identify the disaster extent of rainfall-induced landslides during the period of this study. In order to reduce the disaster risk of large-scale landslides and debris flows, an adaptation strategy for the hazard mitigation system should be set up as soon as possible, taking into consideration slope land conservation, landslide control countermeasure planning, disaster database establishment, environmental impact analysis and disaster risk assessment. As a result, this 3-year research has focused on field investigation using GPS/GIS/RS integration, study of the mechanism and behavior of rainfall-induced landslide occurrence, and establishment of the disaster database and hazard mitigation system. This project has become an important issue of serious concern to the government and the people of Taiwan. Hopefully, all results from this research can be used as guidance for the disaster prevention and

  4. Lessons from a large-scale assessment: Results from conceptual inventories

    Directory of Open Access Journals (Sweden)

    Beth Thacker

    2014-07-01

    Full Text Available We report conceptual inventory results of a large-scale assessment project at a large university. We studied the introduction of materials and instructional methods informed by physics education research (PER) into a department where most instruction has previously been traditional and a significant number of faculty are hesitant, ambivalent, or even resistant to the introduction of such reforms. Data were collected in all of the sections of both the large algebra- and calculus-based introductory courses for a number of years, employing commonly used conceptual inventories. Results from a small PER-informed, inquiry-based, laboratory-based class are also reported. Results suggest that when PER-informed materials are introduced in the labs and recitations, independent of the lecture style, there is an increase in students' conceptual inventory gains. There is also an increase in the results on conceptual inventories if PER-informed instruction is used in the lecture. The highest conceptual inventory gains were achieved by the combination of PER-informed lectures and laboratories in large class settings and by the hands-on, laboratory-based, inquiry-based course taught in a small class setting.

  5. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from...... small and large scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model no overtopping was measured for wave heights below Hs = 0.5 m as the water sank into the voids between the stones on the crest. For low overtopping scale effects...

  6. How uncertainty in socio-economic variables affects large-scale transport model forecasts

    DEFF Research Database (Denmark)

    Manzo, Stefano; Nielsen, Otto Anker; Prato, Carlo Giacomo

    2015-01-01

    A strategic task assigned to large-scale transport models is to forecast the demand for transport over long periods of time to assess transport projects. However, as models of complex systems, transport models have an inherent uncertainty which increases over time. As a consequence, the longer...... the period forecasted the less reliable is the forecasted model output. Describing uncertainty propagation patterns over time is therefore important in order to provide complete information to the decision makers. Among the existing literature only a few studies analyze uncertainty propagation patterns over...
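
    The growth of output uncertainty with forecast horizon can be made concrete with a toy Monte Carlo sketch. The compound-growth demand model and every parameter in it are invented for illustration, not taken from the study:

```python
import random
import statistics

def forecast_demand(base, years, mean_growth, growth_sd, rng):
    """Compound a demand level under an uncertain annual growth rate."""
    demand = base
    for _ in range(years):
        demand *= 1.0 + rng.gauss(mean_growth, growth_sd)
    return demand

rng = random.Random(42)
runs = 2000
short = [forecast_demand(100.0, 5, 0.02, 0.01, rng) for _ in range(runs)]
long_ = [forecast_demand(100.0, 30, 0.02, 0.01, rng) for _ in range(runs)]

# Coefficient of variation as a relative measure of output spread.
spread_short = statistics.stdev(short) / statistics.mean(short)
spread_long = statistics.stdev(long_) / statistics.mean(long_)
```

    Comparing the two coefficients of variation shows the propagation pattern the abstract describes: the same input uncertainty produces a wider relative spread at the 30-year horizon than at the 5-year one.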

  7. Plans for Embedding ICTs into Teaching and Learning through a Large-Scale Secondary Education Reform in the Country of Georgia

    Science.gov (United States)

    Richardson, Jayson W.; Sales, Gregory; Sentocnik, Sonja

    2015-01-01

    Integrating ICTs into international development projects is common. However, focusing on how ICTs support leading, teaching, and learning is often overlooked. This article describes a team's approach to technology integration into the design of a large-scale, five year, teacher and leader professional development project in the country of Georgia.…

  8. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  9. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered on a popular level. Described are the cell structure of galaxy distribution in the Universe and principles of mathematical modelling of galaxy distribution. The images of cell structures obtained after reprocessing with the computer are given. Three hypotheses are discussed - vortical, entropic and adiabatic - suggesting various processes of the origin of galaxies and galaxy clusters. A considerable advantage of the adiabatic hypothesis is recognized. Relict radiation, as a method of directly studying the processes taking place in the Universe, is considered. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to studying the hot gas contained in galaxy clusters and the interactions within galaxy clusters and with the inter-galaxy medium is recognized to be a notable contribution to the development of theoretical and observational cosmology

  10. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of the Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-digital converter with recording on a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.

  11. CERN's Large Hadron Collider project

    Science.gov (United States)

    Fearnley, Tom A.

    1997-03-01

    The paper gives a brief overview of CERN's Large Hadron Collider (LHC) project. After an outline of the physics motivation, we describe the LHC machine, interaction rates, experimental challenges, and some important physics channels to be studied. Finally we discuss the four experiments planned at the LHC: ATLAS, CMS, ALICE and LHC-B.

  12. CERN's Large Hadron Collider project

    International Nuclear Information System (INIS)

    Fearnley, Tom A.

    1997-01-01

    The paper gives a brief overview of CERN's Large Hadron Collider (LHC) project. After an outline of the physics motivation, we describe the LHC machine, interaction rates, experimental challenges, and some important physics channels to be studied. Finally we discuss the four experiments planned at the LHC: ATLAS, CMS, ALICE and LHC-B

  13. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
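
    The per-pixel linked list underpinning this method is typically stored on the GPU as a head-pointer buffer plus flat node arrays; below is a CPU-side sketch of the same layout (class name and payloads are illustrative, not from the thesis):

```python
# Flat-array per-pixel linked list: heads[pixel] holds the index of the
# newest node for that pixel, and each node stores a payload plus the
# index of the next node (-1 marks the end of the chain).
class PixelLinkedLists:
    def __init__(self, width, height):
        self.width = width
        self.heads = [-1] * (width * height)  # one head pointer per pixel
        self.payload = []                     # e.g. pathline-segment data
        self.next_node = []

    def insert(self, x, y, segment):
        node = len(self.payload)              # append-only node pool
        self.payload.append(segment)
        pixel = y * self.width + x
        self.next_node.append(self.heads[pixel])  # link to previous head
        self.heads[pixel] = node

    def segments_at(self, x, y):
        """Walk one pixel's chain; used for filtering and color-coding."""
        node, out = self.heads[y * self.width + x], []
        while node != -1:
            out.append(self.payload[node])
            node = self.next_node[node]
        return out

ppll = PixelLinkedLists(4, 4)
ppll.insert(1, 2, "segA")
ppll.insert(1, 2, "segB")
ppll.insert(3, 0, "segC")
```

    Because nodes are only ever appended, construction needs no per-pixel allocation, and filtering or color-coding reduces to walking one pixel's chain, which is what makes view-dependent exploration feasible in real time.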

  14. Can we combine symptom scales for collaborative research projects?

    LENUS (Irish Health Repository)

    Lyne, John P

    2012-02-01

    Collaborative research projects have the potential to answer important research questions which might otherwise require huge resources, funding, and time to complete. There are several scales for measuring psychotic symptoms in schizophrenia and other psychotic disorders, with the Scale for the Assessment of Positive Symptoms (SAPS), Scale for the Assessment of Negative Symptoms (SANS), Positive and Negative Syndrome Scale (PANSS), and the Brief Psychiatric Rating Scale (BPRS) being among the most commonly used. High-quality research efforts have used these scales in different projects, and in order to merge study efforts, some means of combining data from them may be necessary. We reviewed correlations in published studies and found the scales to be highly correlated; on comparison, however, there were considerable clinical differences between them. The paper discusses potential methods for combining the scales in collaborative research, including use of the recently developed standardised remission criteria for schizophrenia.

  15. Modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program

    International Nuclear Information System (INIS)

    Moskowitz, B.S.

    2000-01-01

    This paper describes the modular, object-oriented redesign of a large-scale Monte Carlo neutron transport program. This effort represents a complete 'white sheet of paper' rewrite of the code. In this paper, the motivation driving this project, the design objectives for the new version of the program, and the design choices and their consequences will be discussed. The design itself will also be described, including the important subsystems as well as the key classes within those subsystems

  16. Nutrient removal from Chinese coastal waters by large-scale seaweed aquaculture

    KAUST Repository

    Xiao, Xi

    2017-04-21

    China is facing intense coastal eutrophication. Large-scale seaweed aquaculture in China is popular, now accounting for over two-thirds of global production. Here, we estimate the nutrient removal capability of large-scale Chinese seaweed farms to determine its significance in mitigating eutrophication. We combined estimates of yield and nutrient concentration of Chinese seaweed aquaculture to quantify that one hectare of seaweed aquaculture removes the nutrient inputs entering 17.8 ha (nitrogen) and 126.7 ha (phosphorus) of Chinese coastal waters. Chinese seaweed aquaculture annually removes approximately 75,000 t nitrogen and 9,500 t phosphorus. Whereas removal of the total N inputs to Chinese coastal waters would require a seaweed farming area 17 times larger than the extant one, one and a half times more seaweed area would remove close to 100% of the P inputs. At the current growth rate of seaweed aquaculture, we project this industry will remove 100% of the current phosphorus inputs to Chinese coastal waters by 2026. Hence, seaweed aquaculture already plays a hitherto unrealized role in mitigating coastal eutrophication, a role that may be greatly expanded with future growth of seaweed aquaculture.
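
The abstract's headline numbers can be reproduced with back-of-envelope arithmetic. The 8%/yr growth rate below is a hypothetical illustration inserted to make the 2026 projection concrete; it is not a figure quoted in the abstract.

```python
import math

# Figures quoted in the abstract.
n_removed_t = 75_000    # t N removed per year by the current seaweed farming area
p_removed_t = 9_500     # t P removed per year

# The abstract says removing ALL N inputs would need 17x the extant farm area,
# while 2.5x the extant area (i.e. "one and a half times more") would remove
# close to 100% of the P inputs.
n_inputs_t = 17 * n_removed_t        # implied total N input: ~1.275 Mt/yr
p_inputs_t = 2.5 * p_removed_t       # implied total P input: ~23,750 t/yr

# Years until the farmed area grows 2.5-fold under a constant relative growth
# rate g (assumed value for illustration only).
g = 0.08
years_to_full_p_removal = math.log(2.5) / math.log(1 + g)
```

With roughly 8%/yr growth the 2.5-fold expansion takes about a dozen years, which is the kind of arithmetic behind the "by 2026" projection.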

  17. Nutrient removal from Chinese coastal waters by large-scale seaweed aquaculture

    KAUST Repository

    Xiao, Xi; Agusti, Susana; Lin, Fang; Li, Ke; Pan, Yaoru; Yu, Yan; Zheng, Yuhan; Wu, Jiaping; Duarte, Carlos M.

    2017-01-01

    China is facing intense coastal eutrophication. Large-scale seaweed aquaculture in China is popular, now accounting for over two-thirds of global production. Here, we estimate the nutrient removal capability of large-scale Chinese seaweed farms to determine its significance in mitigating eutrophication. We combined estimates of yield and nutrient concentration of Chinese seaweed aquaculture to quantify that one hectare of seaweed aquaculture removes the nutrient inputs entering 17.8 ha (nitrogen) and 126.7 ha (phosphorus) of Chinese coastal waters. Chinese seaweed aquaculture annually removes approximately 75,000 t nitrogen and 9,500 t phosphorus. Whereas removal of the total N inputs to Chinese coastal waters would require a seaweed farming area 17 times larger than the extant one, one and a half times more seaweed area would remove close to 100% of the P inputs. At the current growth rate of seaweed aquaculture, we project this industry will remove 100% of the current phosphorus inputs to Chinese coastal waters by 2026. Hence, seaweed aquaculture already plays a hitherto unrealized role in mitigating coastal eutrophication, a role that may be greatly expanded with future growth of seaweed aquaculture.

  18. Investigations on efficiency of the emergency cooling by means of large-scale tests

    International Nuclear Information System (INIS)

    Hicken, E.F.

    1982-01-01

    The RSK guidelines contain the maximum permissible loads (max. cladding tube temperature 1200 °C, max. Zr/H2O reaction of 1% Zr). Their observance implies that only a small number of fuel rods fail. Safety research has to produce the evidence that these limiting loads are not exceeded. The analytical investigations of emergency cooling behaviour could so far only be verified in scaled-down test facilities. After about 100 tests in four different large-scale test facilities, the experimental investigations of the blowdown phase for large breaks are essentially complete. For the refill and reflooding phases the system behaviour can be studied in scaled-down test stands; the multidimensional conditions in the reactor pressure vessel, however, can only be simulated at the original scale. More experiments are planned as part of the 2D/3D project (CCTF, SCTF, UPTF) and as part of the PKL tests, so that more than 200 tests in seven plants will then be available. As for small breaks, the physical phenomena are known; the current investigations serve to increase the reliability of the statements. After their completion, approximately 300 tests in seven plants will be available. (orig./HP) [de

  19. SCALES: SEVIRI and GERB CaL/VaL area for large-scale field experiments

    Science.gov (United States)

    Lopez-Baeza, Ernesto; Belda, Fernando; Bodas, Alejandro; Crommelynck, Dominique; Dewitte, Steven; Domenech, Carlos; Gimeno, Jaume F.; Harries, John E.; Jorge Sanchez, Joan; Pineda, Nicolau; Pino, David; Rius, Antonio; Saleh, Kauzar; Tarruella, Ramon; Velazquez, Almudena

    2004-02-01

    The main objective of the SCALES Project is to exploit the unique opportunity offered by the recent launch of the first European METEOSAT Second Generation geostationary satellite (MSG-1) to generate and validate new radiation budget and cloud products provided by the GERB (Geostationary Earth Radiation Budget) instrument. SCALES' specific objectives are: (i) definition and characterization of a large, reasonably homogeneous area compatible with the GERB pixel size (around 50 x 50 km2), (ii) validation of GERB TOA radiances and fluxes derived by means of angular distribution models, (iii) development of algorithms to estimate surface net radiation from GERB TOA measurements, and (iv) development of accurate methodologies to measure radiation flux divergence and analyze its influence on the thermal regime and dynamics of the atmosphere, also using GERB data. SCALES is highly innovative: it focuses on a new and unique space instrument and develops a new specific validation methodology for low-resolution sensors, based on the use of a robust reference meteorological station (Valencia Anchor Station) around which 3D high-resolution meteorological fields are obtained from the MM5 meteorological model. During the 1st GERB Ground Validation Campaign (18th-24th June, 2003), CERES instruments on Aqua and Terra provided additional radiance measurements to support validation efforts; the CERES instruments operated in PAPS mode (Programmable Azimuth Plane Scanning), focused on the station. Ground measurements were taken by lidar, sun photometer, GPS precipitable water content, radiosonde ascents, Anchor Station operational meteorological measurements at 2 m and 15 m, 4 radiation components at 2 m, and mobile stations to characterize the large area. In addition, measurements during LANDSAT overpasses on June 14th and 30th were also performed. These activities were carried out within the GIST (GERB International Science Team) framework, during the GERB Commissioning Period.

  20. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
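
The distinction the abstract draws can be stated compactly (this is a sketch of one common formulation, not the paper's exact notation): ecological diffusion places the motility coefficient inside both derivatives, unlike Fickian diffusion,

```latex
% Fickian diffusion: flux follows gradients of density
u_t = \nabla \cdot \big( \mu(\mathbf{x}) \, \nabla u \big)
% Ecological diffusion: movement depends only on local habitat
u_t = \Delta \big( \mu(\mathbf{x}) \, u \big)
```

and homogenization replaces the rapidly varying motility by an averaged coefficient on the large scale. In the ecological case the leading-order density takes the form $u \approx c/\mu$, with $c$ solving

```latex
c_t = \bar{\mu} \, \Delta c, \qquad
\bar{\mu} = \big\langle \mu^{-1} \big\rangle^{-1}
\quad \text{(harmonic average over the small scale)}
```

so that populations accumulate where motility is low. The harmonic average reflects my reading of this construction; the paper gives the precise multi-scale derivation and corrector terms.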

  1. Sampling design in large-scale vegetation studies: Do not sacrifice ecological thinking to statistical purism!

    Czech Academy of Sciences Publication Activity Database

    Roleček, J.; Chytrý, M.; Hájek, Michal; Lvončík, S.; Tichý, L.

    2007-01-01

    Roč. 42, - (2007), s. 199-208 ISSN 1211-9520 R&D Projects: GA AV ČR IAA6163303; GA ČR(CZ) GA206/05/0020 Grant - others:GA AV ČR(CZ) KJB601630504 Institutional research plan: CEZ:AV0Z60050516 Keywords : Ecological methodology * Large-scale vegetation patterns * Macroecology Subject RIV: EF - Botanics Impact factor: 1.133, year: 2007

  2. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. (ed.)

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  3. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  4. Strategic Management of Large Projects

    Institute of Scientific and Technical Information of China (English)

    Wang Yingluo; Liu Yi; Li Yuan

    2004-01-01

    The strategic management of large projects is both theoretically and practically important. Some scholars in China have advanced flexible strategy theory. The difference between strategic flexibility and flexible strategy is pointed out. The supporting system and characteristics of flexible strategy are analyzed. The changes of flexible strategy and the integration of strategic management are discussed.

  5. Modified scaling function projective synchronization of chaotic systems

    International Nuclear Information System (INIS)

    Xu Yu-Hua; Zhou Wu-Neng; Fang Jian-An

    2011-01-01

    This paper investigates a kind of modified scaling function projective synchronization of uncertain chaotic systems using an adaptive controller. The given scaling function in the new method can be an equilibrium point, a periodic orbit, or even a chaotic attractor in the phase space. Based on LaSalle's invariance set principle, the adaptive control law is derived to bring the states of the two chaotic systems into function projective synchronization. Some numerical examples are also given to show the effectiveness of the proposed method. (general)
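
A minimal numerical sketch of function projective synchronization: the response state y is driven toward α(t)·x, where α(t) is the scaling function (here a periodic orbit, one of the cases the abstract mentions). For simplicity this sketch assumes known parameters and an active controller rather than the paper's adaptive law for uncertain systems; the Lorenz system and all gains are illustrative.

```python
import numpy as np

def lorenz(s):
    # Classic Lorenz system, used here as both drive and response dynamics.
    x, y, z = s
    return np.array([10.0 * (y - x), x * (28.0 - z) - y, x * y - 8.0 / 3.0 * z])

def alpha(t):       # scaling FUNCTION: a periodic orbit rather than a constant
    return 2.0 + np.sin(t)

def dalpha(t):
    return np.cos(t)

dt, steps, k = 1e-4, 100_000, 5.0
x = np.array([1.0, 1.0, 1.0])        # drive state
y = np.array([-3.0, 4.0, 2.0])       # response state
t = 0.0
e0 = np.linalg.norm(y - alpha(0.0) * x)

for _ in range(steps):
    e = y - alpha(t) * x
    # Active control: cancel the response dynamics, track d/dt[alpha(t) x],
    # and add linear feedback so that the error obeys de/dt = -k e.
    u = dalpha(t) * x + alpha(t) * lorenz(x) - lorenz(y) - k * e
    y = y + dt * (lorenz(y) + u)
    x = x + dt * lorenz(x)
    t += dt

e_final = np.linalg.norm(y - alpha(t) * x)
```

Because the controller makes the error dynamics exactly ė = −ke, the error decays regardless of the chaotic drive; the adaptive version in the paper additionally estimates the unknown parameters online.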

  6. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's, an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5KW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  7. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
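
The latent semantic analysis step the abstract mentions can be illustrated on a toy "visual word" count matrix via truncated SVD. The data and the two-category split below are fabricated purely for illustration; the paper works with learned image features, not this matrix.

```python
import numpy as np

# Toy visual-word x image count matrix: rows = visual words, cols = images.
# First three columns are aeroplane-like images (sky/wing words dominate),
# last three are car-like images (road/wheel words dominate).
X = np.array([
    [9, 10, 8, 1, 0, 1],   # "sky" word
    [10, 9, 9, 0, 1, 0],   # "wing" word
    [1, 0, 1, 5, 4, 5],    # "road" word
    [0, 1, 0, 4, 5, 4],    # "wheel" word
], dtype=float)

# Latent semantic analysis = truncated SVD of the count matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 2                                 # number of latent categories kept
topics = U[:, :k]                     # word-to-category loadings
doc_rep = np.diag(s[:k]) @ Vt[:k]     # latent-category representation per image

# Each latent direction groups co-occurring words (object, part, or background
# like "sky"); these groupings are what a category-selection step then scores.
recon = topics @ doc_rep
err = np.linalg.norm(X - recon) / np.linalg.norm(X)
```

The dominant latent direction here loads on the co-occurring "sky"/"wing" words, showing how background context ends up captured as its own latent category.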

  8. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  9. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Price, C.R.; Elmer, D.C.

    1995-01-01

    Typically, oil field production operations have only been automated at fields with long term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large scale automation program for rapid-decline primary recovery Austin Chalk wells where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper will describe: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; the research and development of the new electronic custody transfer method

  10. Large scale solar thermal power for the European Union!

    Energy Technology Data Exchange (ETDEWEB)

    Anon.

    1997-06-01

    Southern Europe, on the edge of the sunbelt, represents the ideal location for solar thermal generated power. Last year, SAWIE reported on the THESEUS project, a proposed 50 MWe solar thermal power plant for Frangokastello, southern Crete, which was submitted for support under the European Union's THERMIE Programme. Funding was approved for the design phase of this innovative power plant, the first large-scale SEGS-style plant on European soil, at the end of last year. However, the THERMIE Programme also provided support for another Southern European plant, proposed by Colon Solar for Huelva in Southern Spain. Whilst hurdles remain to be overcome before both plants are built and commissioned, there is an excellent chance that by the start of the new Millennium, the solar collectors from these two plants could be generating over half a million MWh of energy a year. SAWIE compares the two projects. (author)

  11. Research into condensed matter using large-scale apparatus. Physics, chemistry, biology. Progress report 1992-1995. Summarizing reports

    International Nuclear Information System (INIS)

    1996-01-01

    Activities for research into condensed matter were supported by the German BMBF with approximately 102 million Deutschmarks in the years 1992 through 1995. These funds were distributed among 314 research projects in the fields of physics, chemistry, biology, materials science, and other fields, all of which rely on the intensive utilization of photon and particle beams generated in large-scale apparatus of institutions for basic research. The volume in hand first gives general information and statistical data on the distribution of funds among a number of priority research projects. The project reports summarize the progress achieved in the various projects. (CB) [de

  12. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have presented a detailed picture of the large-scale communication architecture in the IOT. In fact, the non-uniform technology between IPv6 and access points has meant that broad principles for large-scale communication architectures are lacking. This paper therefore presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  13. Large scale applicability of a Fully Adaptive Non-Intrusive Spectral Projection technique: Sensitivity and uncertainty analysis of a transient

    International Nuclear Information System (INIS)

    Perkó, Zoltán; Lathouwers, Danny; Kloosterman, Jan Leen; Hagen, Tim van der

    2014-01-01

    Highlights: • Grid and basis adaptive Polynomial Chaos techniques are presented for S and U analysis. • Dimensionality reduction and incremental polynomial order reduce computational costs. • An unprotected loss of flow transient is investigated in a Gas Cooled Fast Reactor. • S and U analysis is performed with MC and adaptive PC methods, for 42 input parameters. • PC accurately estimates means, variances, PDFs, sensitivities and uncertainties. - Abstract: Since the early years of reactor physics the most prominent sensitivity and uncertainty (S and U) analysis methods in the nuclear community have been adjoint based techniques. While these are very effective for pure neutronics problems due to the linearity of the transport equation, they become complicated when coupled non-linear systems are involved. With the continuous increase in computational power such complicated multi-physics problems are becoming progressively tractable, hence affordable and easily applicable S and U analysis tools also have to be developed in parallel. For reactor physics problems for which adjoint methods are prohibitive Polynomial Chaos (PC) techniques offer an attractive alternative to traditional random sampling based approaches. At TU Delft such PC methods have been studied for a number of years and this paper presents a large scale application of our Fully Adaptive Non-Intrusive Spectral Projection (FANISP) algorithm for performing the sensitivity and uncertainty analysis of a Gas Cooled Fast Reactor (GFR) Unprotected Loss Of Flow (ULOF) transient. The transient was simulated using the Cathare 2 code system and a fully detailed model of the GFR2400 reactor design that was investigated in the European FP7 GoFastR project. Several sources of uncertainty were taken into account amounting to an unusually high number of stochastic input parameters (42) and numerous output quantities were investigated. The results show consistently good performance of the applied adaptive PC
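
Stripped of the adaptivity and the 42-dimensional input space, the non-intrusive spectral projection idea reduces in one dimension to projecting the response onto orthogonal polynomials by quadrature. A toy sketch for a response f(ξ) = exp(ξ) with ξ ~ N(0,1), using probabilists' Hermite polynomials (the toy response and truncation order are my choices, not the paper's):

```python
import math
import numpy as np
from numpy.polynomial import hermite_e as H

f = np.exp                  # toy model response as a function of xi ~ N(0,1)
order = 8                   # PC truncation order

# Gauss-Hermite_e quadrature has weight exp(-x^2/2); dividing by sqrt(2*pi)
# turns quadrature sums into expectations over the standard normal.
nodes, weights = H.hermegauss(30)
weights = weights / np.sqrt(2.0 * np.pi)

# Non-intrusive spectral projection: c_n = E[f(xi) He_n(xi)] / n!
coeffs = []
for n in range(order + 1):
    He_n = H.hermeval(nodes, [0.0] * n + [1.0])   # He_n at the quadrature nodes
    coeffs.append(np.sum(weights * f(nodes) * He_n) / math.factorial(n))

# Statistics follow directly from the coefficients: the mean is c_0 and the
# variance is sum_{n>=1} c_n^2 * n! (orthogonality of the He_n).
mean = coeffs[0]
var = sum(c ** 2 * math.factorial(n) for n, c in enumerate(coeffs) if n > 0)
```

This "statistics for free" property of the PC coefficients is why the approach is attractive against random sampling; the adaptive grid/basis machinery in the paper is about making the projection affordable in many dimensions.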

  14. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutronic sensor during three different running modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given, together with the results obtained so far in this domain. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a linear fluctuation channel with automatic scale switching is described and the test results are given. In this large scale channel, the data processing method is analogue. - To become independent of the problems generated by analogue processing of the fluctuation signal, a digital data processing method is tested and its validity is demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr

  15. Implementing ergonomics in large-scale engineering design. Communicating and negotiating requirements in an organizational context

    Energy Technology Data Exchange (ETDEWEB)

    Wulff, Ingrid Anette

    1997-12-31

    This thesis investigates under what conditions ergonomic criteria are being adhered to in engineering design. Specifically, the thesis discusses (1) the ergonomic criteria implementation process, (2) designer recognition of ergonomic requirements and the organization of ergonomics, (3) issues important for the implementation of ergonomic requirements, (4) how different means for experience transfer in design and operation are evaluated by the designers, (5) how designers ensure usability of offshore work places, and (6) how project members experience and cope with the large amount of documentation in large-scale engineering. 84 refs., 11 figs., 18 tabs.

  16. Large-scale Health Information Database and Privacy Protection.

    Science.gov (United States)

    Yamamoto, Ryuichi

    2016-09-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients' medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy

  17. Large scale facilities for synchrotron radiation and neutrons. New possibilities for Denmark

    International Nuclear Information System (INIS)

    Feidenhans'l, R.

    2003-02-01

    New large-scale facilities for investigating the structure and dynamics of matter and biological systems are currently under construction or are being planned at many locations around Europe. These facilities are likely to have a large impact on the science landscape in Europe, and more locally, as some will be located in the immediate neighbourhood of Denmark. The facilities will enable new analytical capabilities of matter, which will push the frontiers of science and technology in many areas of research of importance for Denmark. This report provides an overview of the new facilities (including very rough funding estimates) of importance for Danish science, describes possible ways of engaging in the various projects and identifies potential user groups. The report also includes a summary of the status of the current use of existing facilities as a benchmark. This is done by showing different cross sections through this multiple parameter space of activities. The intention is that the report should serve as guideline for making a long-term national strategy for the exploitation of large-scale facilities in order to help to optimise their impact on science, education and industry within Denmark, and to safeguard the prominent internationally leading role that Denmark has in this area. (LN)

  18. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to its measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase ... A detailed discussion of the original LSTF features and capabilities can be ...

  19. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from individual-scale complexity have been extensively studied owing to their application potential in human behavior prediction, recommendation systems, and the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, covering both website browsing and mobile-tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, preferential return and exploration, and a Gaussian distribution assumption on the exploration-tendency parameter is proposed; it outperforms existing human mobility models under scenarios of large geographical scales.
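
The two model ingredients named in the abstract can be sketched as a toy simulation. All parameter names and values below (the exploration decay exponent, the Gaussian parameters) are illustrative assumptions, not the paper's fitted values.

```python
import random

def simulate_mobility(steps, rho_mean=0.6, rho_sd=0.1, seed=42):
    """Toy exploration / preferential-return walk (illustrative only).

    At each step the agent explores a brand-new location with probability
    p = rho * S**(-gamma), where S is the number of distinct locations
    visited so far; otherwise it returns to a previously visited location
    chosen proportionally to visit frequency.  The exploration tendency rho
    is drawn from a Gaussian, per the paper's stated modelling assumption.
    """
    rng = random.Random(seed)
    rho = min(max(rng.gauss(rho_mean, rho_sd), 0.01), 1.0)
    gamma = 0.21                 # exploration decay exponent (assumed value)
    visits = {0: 1}              # location id -> visit count; start at loc 0
    next_id = 1
    trajectory = [0]
    for _ in range(steps):
        s = len(visits)
        if rng.random() < rho * s ** (-gamma):
            loc = next_id        # explore a new location
            next_id += 1
        else:                    # preferential return, weighted by frequency
            locs = list(visits)
            weights = [visits[l] for l in locs]
            loc = rng.choices(locs, weights=weights, k=1)[0]
        visits[loc] = visits.get(loc, 0) + 1
        trajectory.append(loc)
    return trajectory, visits
```

A longer run of such a walk concentrates visits on a few locations while the number of distinct locations grows sublinearly, which is the qualitative behaviour the model is meant to capture.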

  20. Hydrologic test plans for large-scale, multiple-well tests in support of site characterization at Hanford, Washington

    International Nuclear Information System (INIS)

    Rogers, P.M.; Stone, R.; Lu, A.H.

    1985-01-01

    The Basalt Waste Isolation Project is preparing plans for tests, and has begun work on some of them, that will provide the data necessary for the hydrogeologic characterization of a site located on a United States government reservation at Hanford, Washington. This site is being considered for the Nation's first geologic repository of high-level nuclear waste. Hydrogeologic characterization of the site requires several lines of investigation, including surface-based small-scale tests, testing performed at depth from an exploratory shaft, geochemistry investigations, regional studies, and site-specific investigations using large-scale, multiple-well hydraulic tests. The large-scale multiple-well tests are planned for several locations in and around the site. These tests are being designed to provide estimates of the hydraulic parameter values of the geologic media, the chemical properties of the groundwater, and the hydrogeologic boundary conditions at a scale appropriate for evaluating repository performance with respect to potential radionuclide transport.

  1. Data management in large-scale collaborative toxicity studies: how to file experimental data for automated statistical analysis.

    Science.gov (United States)

    Stanzel, Sven; Weimer, Marc; Kopp-Schneider, Annette

    2013-06-01

    High-throughput screening approaches are carried out for the toxicity assessment of a large number of chemical compounds. In such large-scale in vitro toxicity studies several hundred or thousand concentration-response experiments are conducted. The automated evaluation of concentration-response data using statistical analysis scripts saves time and yields more consistent results in comparison to data analysis performed by the use of menu-driven statistical software. Automated statistical analysis requires that concentration-response data are available in a standardised data format across all compounds. To obtain consistent data formats, a standardised data management workflow must be established, including guidelines for data storage, data handling and data extraction. In this paper two procedures for data management within large-scale toxicological projects are proposed. Both procedures are based on Microsoft Excel files as the researcher's primary data format and use a computer programme to automate the handling of data files. The first procedure assumes that data collection has not yet started whereas the second procedure can be used when data files already exist. Successful implementation of the two approaches into the European project ACuteTox is illustrated. Copyright © 2012 Elsevier Ltd. All rights reserved.
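
The key requirement described above, that every compound's file follow one standardised layout so that analysis scripts can process them alike, lends itself to an automated validation pass. The sketch below uses CSV rather than the Excel files the paper proposes (to stay dependency-free), and the column names are assumptions for illustration, not the ACuteTox format.

```python
import csv
import io

REQUIRED_COLUMNS = ("compound", "concentration", "response")  # assumed layout

def validate_records(csv_text):
    """Check that concentration-response data follow one standardised format,
    so that downstream statistical scripts can treat every compound alike.
    Returns a list of human-readable problems (empty list = file is valid)."""
    reader = csv.DictReader(io.StringIO(csv_text))
    missing = [c for c in REQUIRED_COLUMNS if c not in (reader.fieldnames or [])]
    if missing:
        return ["missing column(s): " + ", ".join(missing)]
    problems = []
    for i, row in enumerate(reader, start=2):   # row 1 is the header
        for col in ("concentration", "response"):
            try:
                float(row[col])
            except (TypeError, ValueError):
                problems.append(f"row {i}: non-numeric {col!r}: {row[col]!r}")
    return problems
```

Running such a check over every file before the batch analysis starts is what turns a folder of researcher-maintained spreadsheets into input a statistics pipeline can trust.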

  2. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.
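
As a concrete illustration of the kind of input Quantum Espresso consumes, below is a minimal pw.x input for a silicon SCF calculation, adapted from the standard examples shipped with the distribution; the pseudopotential filename and directory paths are placeholders that must match a local installation.

```
&control
    calculation = 'scf'
    prefix      = 'si'
    pseudo_dir  = './pseudo'
    outdir      = './tmp'
/
&system
    ibrav = 2
    celldm(1) = 10.2
    nat  = 2
    ntyp = 1
    ecutwfc = 30.0
/
&electrons
    conv_thr = 1.0d-8
/
ATOMIC_SPECIES
 Si 28.086 Si.pz-vbc.UPF
ATOMIC_POSITIONS alat
 Si 0.00 0.00 0.00
 Si 0.25 0.25 0.25
K_POINTS automatic
 4 4 4 0 0 0
```

For the large-scale parallel runs the paper discusses, the same input would typically be launched under MPI, e.g. `mpirun -np 64 pw.x -in si.scf.in`.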

  3. Environmental impact assessment and environmental audit in large-scale public infrastructure construction: the case of the Qinghai-Tibet Railway.

    Science.gov (United States)

    He, Guizhen; Zhang, Lei; Lu, Yonglong

    2009-09-01

    Large-scale public infrastructure projects have featured in China's modernization course since the early 1980s. During the early stages of China's rapid economic development, public attention focused on the economic and social impact of high-profile construction projects. In recent years, however, we have seen a shift in public concern toward the environmental and ecological effects of such projects, and today governments are required to provide valid environmental impact assessments prior to allowing large-scale construction. The official requirement for the monitoring of environmental conditions has led to an increased number of debates in recent years regarding the effectiveness of Environmental Impact Assessments (EIAs) and Governmental Environmental Audits (GEAs) as environmental safeguards in instances of large-scale construction. Although EIA and GEA are conducted by different institutions and have different goals and enforcement potential, these two practices can be closely related in terms of methodology. This article cites the construction of the Qinghai-Tibet Railway as an instance in which EIA and GEA offer complementary approaches to environmental impact management. This study concludes that the GEA approach can serve as an effective follow-up to the EIA and establishes that the EIA lays a base for conducting future GEAs. The relationship that emerges through a study of the Railway's construction calls for more deliberate institutional arrangements and cooperation if the two practices are to be used in concert to optimal effect.

  4. Coordinated Multi-layer Multi-domain Optical Network (COMMON) for Large-Scale Science Applications

    Energy Technology Data Exchange (ETDEWEB)

    Vokkarane, Vinod [University of Massachusetts

    2013-09-01

    We intend to implement a Coordinated Multi-layer Multi-domain Optical Network (COMMON) framework for large-scale science applications. In the COMMON project, the specific problems to be addressed include 1) anycast/multicast/manycast request provisioning, 2) deployable OSCARS enhancements, 3) multi-layer, multi-domain quality of service (QoS), and 4) multi-layer, multi-domain path survivability. In what follows, we outline the progress in the above categories (Year 1, 2, and 3 deliverables).

  5. Embedded Electro-Optic Sensor Network for the On-Site Calibration and Real-Time Performance Monitoring of Large-Scale Phased Arrays

    National Research Council Canada - National Science Library

    Yang, Kyoung

    2005-01-01

    This final report summarizes the progress during the Phase I SBIR project entitled "Embedded Electro-Optic Sensor Network for the On-Site Calibration and Real-Time Performance Monitoring of Large-Scale Phased Arrays...

  6. Regenesys utility scale energy storage. Project summary

    International Nuclear Information System (INIS)

    2004-01-01

    This report summarises the work to date, the current situation and the future direction of a project carried out by Regenesys Technology Ltd. (RGN) to investigate the benefits of electrochemical energy storage for power generators using renewable energy sources, focussing on wind energy. The background to the study is traced, covering the progress of the Regenesys energy storage technology and the milestones achieved and lessons learnt. Details are given of the planned renewable-store-market interface intended to allow renewable generators to optimise revenue under the New Electricity Trading Arrangements (NETA) and to help connect renewable energy to the electricity grid. The four integrated work programmes of the project are described: a system study examining market penetration of renewable generators, a technical study into the connection of renewable generators and energy storage, a small-scale demonstration, and a pilot-scale energy storage plant at Little Barton in Cambridgeshire. Problems leading to the closure of the project are discussed.

  7. Regenesys utility scale energy storage. Project summary

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2004-07-01

    This report summarises the work to date, the current situation and the future direction of a project carried out by Regenesys Technology Ltd. (RGN) to investigate the benefits of electrochemical energy storage for power generators using renewable energy sources, focussing on wind energy. The background to the study is traced, covering the progress of the Regenesys energy storage technology and the milestones achieved and lessons learnt. Details are given of the planned renewable-store-market interface intended to allow renewable generators to optimise revenue under the New Electricity Trading Arrangements (NETA) and to help connect renewable energy to the electricity grid. The four integrated work programmes of the project are described: a system study examining market penetration of renewable generators, a technical study into the connection of renewable generators and energy storage, a small-scale demonstration, and a pilot-scale energy storage plant at Little Barton in Cambridgeshire. Problems leading to the closure of the project are discussed.

  8. Chemically intuited, large-scale screening of MOFs by machine learning techniques

    Science.gov (United States)

    Borboudakis, Giorgos; Stergiannakos, Taxiarchis; Frysali, Maria; Klontzas, Emmanuel; Tsamardinos, Ioannis; Froudakis, George E.

    2017-10-01

    A novel computational methodology for the large-scale screening of MOFs is applied to gas storage with the use of machine learning technologies. This approach is a promising trade-off between the accuracy of ab initio methods and the speed of classical approaches, strategically combined with chemical intuition. The results demonstrate that the chemical properties of MOFs are indeed predictable (stochastically, not deterministically) using machine learning methods and automated analysis protocols, with the accuracy of predictions increasing with sample size. Our initial results indicate that this methodology is promising not only for gas storage in MOFs but also for many other materials science projects.
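
The core idea, predicting a MOF property from precomputed descriptors, can be sketched with a minimal k-nearest-neighbour regressor. The paper used other learners and real chemical descriptors; the features, target values and function names below are purely illustrative.

```python
import math

def knn_predict(train_X, train_y, x, k=3):
    """Predict a property (e.g. gas uptake) for a descriptor vector x by
    averaging the targets of its k nearest neighbours in descriptor space,
    a minimal stand-in for the paper's machine-learning screening step."""
    dists = sorted(
        (math.dist(x, xi), yi) for xi, yi in zip(train_X, train_y)
    )
    return sum(y for _, y in dists[:k]) / k
```

In a screening setting, the same call would be repeated over thousands of candidate structures, with predictions improving as the training sample grows, mirroring the sample-size effect the abstract reports.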

  9. Large-scale ash recycling in Central Sweden; Storskalig askhantering i mellansverige

    Energy Technology Data Exchange (ETDEWEB)

    Hansson, Mats [Stora Skog (Sweden)

    1998-08-01

    When logging residues (tops, branches) are withdrawn from the forest, most of the nutrient content of the trees is also lost. Some of the nutrient content of the soil is restored by weathering, but not all. When biomass is burnt as fuel, most of the nutrients end up in the ash. By recycling wood ash, in amounts similar to what was withdrawn with the biomass, it is possible to compensate for the nutrient losses. This project was initiated to study how rational recycling of wood ash could be performed under conditions valid for Stora, a large forest company in the middle of Sweden. A second aim was to give guiding principles for Stora's own ash recycling while awaiting instructions from the authorities. The project comprised both theoretical studies and practical field studies. The areas studied were the production of a stabilised ash product and different systems for transport and spreading of the ash product. The costs and results of spreading were also monitored. The project showed that spreading of the ash can normally only take place when there is no snow. If production or transport is carried out during another time of the year, the ash has to be stored, either at the industry, in an intermediate storage, or in the forest. One important conclusion from the test period was that the result of the spreading depends heavily on the quality of the ash. Some of the ashes hardened in the spreading equipment, causing a complete stop of the spreading. It also caused problems if the ash was too wet. Plate spreaders led to unequal quality of spreading, where some areas got more ash and some got less. Granulated ash was the easiest to spread. The recommended system for spreading ash is: granulated ash transported unpacked in separate transports, with lorries with exchangeable platforms; a large forest tractor spreads the ash in clearings in the summer. The project has shown that large-scale ash recycling is possible to realize. 22 figs, 5 tabs, 13 appendices

  10. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML, in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA, and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.
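
The downstream step VESPA automates, interpreting codeML results, can be illustrated with a toy parser. The line format handled below is a simplified stand-in for real codeML output, and the likelihood-ratio helper is generic; neither is VESPA's actual code.

```python
import re

def extract_lnl(codeml_output):
    """Pull the log-likelihood from (simplified) codeML output text.
    Real codeML output contains a line roughly of the form
    'lnL(ntime: 7  np: 9): -2345.678901 +0.000000'; this regex is an
    illustrative parser for that shape, not a complete one."""
    m = re.search(r"lnL\([^)]*\):\s*(-?\d+\.\d+)", codeml_output)
    if not m:
        raise ValueError("no lnL line found")
    return float(m.group(1))

def lrt_statistic(lnl_null, lnl_alt):
    """Likelihood-ratio test statistic 2*(lnL_alt - lnL_null); the downstream
    analysis compares it to a chi-squared critical value."""
    return 2.0 * (lnl_alt - lnl_null)
```

Automating exactly this kind of extraction across hundreds of gene families, and summarising the resulting test statistics, is the sort of bookkeeping the abstract describes.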

  11. Large-scale, multi-compartment tests in PANDA for LWR-containment analysis and code validation

    International Nuclear Information System (INIS)

    Paladino, Domenico; Auban, Olivier; Zboray, Robert

    2006-01-01

    The large-scale thermal-hydraulic PANDA facility has been used in recent years for investigating passive decay-heat removal systems and related containment phenomena relevant for next-generation and current light water reactors. As part of the 5th EURATOM Framework Programme project TEMPEST, a series of tests was performed in PANDA to experimentally investigate the distribution of hydrogen inside the containment and its effect on the performance of the Passive Containment Cooling System (PCCS) designed for the Economic Simplified Boiling Water Reactor (ESBWR). In a postulated severe accident, a large amount of hydrogen could be released in the Reactor Pressure Vessel (RPV) as a consequence of the cladding metal-water (M-W) reaction and discharged together with steam to the Drywell (DW) compartment. In the PANDA tests, hydrogen was simulated by using helium. This paper illustrates the results of a TEMPEST test performed in PANDA, designated Test T1.2. In Test T1.2, the gas stratification (steam-helium) patterns forming in the large-scale multi-compartment PANDA DW, and the effect of non-condensable gas (helium) on the overall behaviour of the PCCS, were identified. Gas mixing and stratification in a large-scale multi-compartment system are currently being further investigated in PANDA in the frame of the OECD project SETH. The testing philosophy in this new PANDA programme is to produce data for code validation in relation to specific phenomena, such as gas stratification in the containment, gas transport between containment compartments, and wall condensation. These types of phenomena are driven by buoyant high-momentum injections (jets) and/or low-momentum injections (plumes), depending on the transient scenario. In this context, the new SETH tests in PANDA are particularly valuable for producing an experimental database for code assessment. This paper also presents an overview of the PANDA SETH tests and the major improvements in instrumentation carried out in the PANDA

  12. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinklers with a total area of 6400 ha were installed in the Poznan province; the average size of a sprinkler system was 95 ha. In 1989 there were 98 sprinklers, covering more than 10,130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares, over the period 1986-1998. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinklers underwent significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinklers to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs; there was also a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers: limitations of the system solutions, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A survey of the local area documented the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  13. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    ...structure and chemical purity of 99.1%, as determined by inductively coupled plasma optical emission spectroscopy, on a large scale. Keywords: sol-gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction: Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  14. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property in the event of accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
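
The reduced-scale burner tests relate flame height to a nondimensional heat release rate. The report's own fitted correlations are not reproduced here; instead, the widely used Heskestad correlation from the general fire-science literature shows the form such a relation takes. The sketch assumes SI-consistent inputs (Q in kW, D in m, cp in kJ/(kg K)).

```python
import math

def flame_height_ratio(Q_kw, D_m, T_inf=293.0, rho_inf=1.2, cp=1.0, g=9.81):
    """Flame height / pool diameter from the standard Heskestad correlation
    L/D = 3.7 * Qstar**(2/5) - 1.02, where the nondimensional heat release
    rate is Qstar = Q / (rho * cp * T * sqrt(g) * D**2.5).  This is the
    generic literature correlation, not the fit reported for the Phoenix
    tests."""
    q_star = Q_kw / (rho_inf * cp * T_inf * math.sqrt(g) * D_m ** 2.5)
    return 3.7 * q_star ** 0.4 - 1.02
```

The scale dependence the report worries about is visible directly in the formula: for fixed heat release per unit area, Q* falls as the pool diameter grows, so flame geometry measured at small scale cannot simply be extrapolated to 100 m class fires.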

  15. Distributed constraint satisfaction for coordinating and integrating a large-scale, heterogeneous enterprise

    CERN Document Server

    Eisenberg, C

    2003-01-01

    Market forces are continuously driving public and private organisations towards higher productivity, shorter process and production times, and fewer labour hours. To cope with these changes, organisations are adopting new organisational models of coordination and cooperation that increase their flexibility, consistency, efficiency, productivity and profit margins. In this thesis an organisational model of coordination and cooperation is examined using a real-life example: the technical integration of a distributed large-scale project of an international physics collaboration. The distributed resource-constrained project scheduling problem is modelled and solved with the methods of distributed constraint satisfaction. A distributed local search method, the distributed breakout algorithm (DisBO), is used as the basis for the coordination scheme. The efficiency of the local search method is improved by extending it with an incremental problem solving scheme with variable ordering. The scheme is implemented as cen...

  16. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)
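
The review's working definition of a peculiar motion can be made concrete: subtract the pure Hubble-flow velocity from the observed recession velocity. A minimal sketch, assuming an illustrative Hubble constant of 75 km/s/Mpc and a redshift-independent distance estimate (e.g. from a Tully-Fisher-type indicator):

```python
H0 = 75.0  # Hubble constant in km/s/Mpc (value assumed for illustration)

def peculiar_velocity(cz_km_s, distance_mpc, h0=H0):
    """Radial peculiar velocity: the observed recession velocity cz minus
    the pure Hubble-flow expectation h0 * d.  A positive result means the
    galaxy is moving away faster than uniform expansion alone predicts."""
    return cz_km_s - h0 * distance_mpc
```

For example, a galaxy with a distance estimate of 40 Mpc observed at cz = 3200 km/s would have a radial peculiar velocity of 200 km/s; the systematic errors the review discusses enter chiefly through the distance estimate.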

  17. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  18. The XChemExplorer graphical workflow tool for routine or large-scale protein–ligand structure determination

    Science.gov (United States)

    Krojer, Tobias; Talon, Romain; Pearce, Nicholas; Douangamath, Alice; Brandao-Neto, Jose; Dias, Alexandre; Marsden, Brian

    2017-01-01

    XChemExplorer (XCE) is a data-management and workflow tool to support large-scale simultaneous analysis of protein–ligand complexes during structure-based ligand discovery (SBLD). The user interfaces of established crystallographic software packages such as CCP4 [Winn et al. (2011), Acta Cryst. D67, 235–242] or PHENIX [Adams et al. (2010), Acta Cryst. D66, 213–221] have entrenched the paradigm that a 'project' is concerned with solving one structure. This does not hold for SBLD, where many almost identical structures need to be solved and analysed quickly in one batch of work. Functionality to track progress and annotate structures is essential. XCE provides an intuitive graphical user interface which guides the user from data processing, initial map calculation, ligand identification and refinement through to data dissemination. It provides multiple entry points depending on the needs of each project, enables batch processing of multiple data sets and records metadata, progress and annotations in an SQLite database. XCE is freely available and works on any Linux and Mac OS X system; the only dependency is the latest version of CCP4. The design and usage of this tool are described here, and its usefulness is demonstrated in the context of fragment-screening campaigns at the Diamond Light Source. It is routinely used to analyse projects comprising 1000 data sets or more, and therefore scales well to even very large ligand-design projects. PMID:28291762
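
The SQLite-backed bookkeeping described above can be sketched with Python's standard sqlite3 module. The table and column names here are illustrative inventions, not XCE's actual schema, which is considerably richer.

```python
import sqlite3

def init_db(path=":memory:"):
    """Create a minimal progress-tracking table in the spirit of XCE's
    SQLite backend: one row per data set, with a pipeline stage and a
    free-text annotation (columns are illustrative, not XCE's schema)."""
    conn = sqlite3.connect(path)
    conn.execute(
        """CREATE TABLE IF NOT EXISTS datasets (
               crystal_id TEXT PRIMARY KEY,
               stage      TEXT,   -- e.g. processed / refined / deposited
               annotation TEXT
           )"""
    )
    return conn

def set_stage(conn, crystal_id, stage, annotation=""):
    """Record (or update) the pipeline stage and annotation for one data set."""
    conn.execute(
        "INSERT INTO datasets (crystal_id, stage, annotation) VALUES (?, ?, ?) "
        "ON CONFLICT(crystal_id) DO UPDATE SET stage=excluded.stage, "
        "annotation=excluded.annotation",
        (crystal_id, stage, annotation),
    )
    conn.commit()
```

With per-data-set rows like these, batch queries ("which of the 1000 data sets are still unrefined?") become one SQL statement, which is what makes the approach scale to large campaigns.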

  19. Integral large scale experiments on hydrogen combustion for severe accident code validation-HYCOM

    International Nuclear Information System (INIS)

    Breitung, W.; Dorofeev, S.; Kotchourko, A.; Redlinger, R.; Scholtyssek, W.; Bentaib, A.; L'Heriteau, J.-P.; Pailhories, P.; Eyink, J.; Movahed, M.; Petzold, K.-G.; Heitsch, M.; Alekseev, V.; Denkevits, A.; Kuznetsov, M.; Efimenko, A.; Okun, M.V.; Huld, T.; Baraldi, D.

    2005-01-01

    A joint research project concerning hydrogen risk in nuclear power plants was carried out in the EU Fifth Framework Programme. The goals were: firstly, to create a new database of results on hydrogen combustion experiments in the slow to turbulent combustion regimes; secondly, to validate the partners' CFD and lumped-parameter codes on the experimental data, and to evaluate suitable parameter sets for application calculations; thirdly, to conduct a benchmark exercise by applying the codes to the full-scale analysis of a postulated hydrogen combustion scenario in a light water reactor containment after a core melt accident. The paper describes the work programme of the project and the partners' activities. Significant progress has been made in the experimental area, where test series in medium- and large-scale facilities have been carried out with the focus on specific effects of scale, multi-compartment geometry, heat losses and venting. The data were used for the validation of the partners' CFD and lumped-parameter codes, which included blind predictive calculations and pre- and post-test intercomparison exercises. Finally, a benchmark exercise was conducted by applying the codes to the full-scale analysis of a hydrogen combustion scenario. The comparison and assessment of the results of the validation phase and of the challenging containment calculation exercise allow a deep insight into the quality, capabilities and limits of the CFD and lumped-parameter tools currently in use at various research laboratories.

  20. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo=Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio=Ti⊥/Ti||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each lateral length is 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and enables even quicker triggering when the LHDI effects set in. Furthermore the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  1. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  2. Comparison of Multi-Scale Digital Elevation Models for Defining Waterways and Catchments Over Large Areas

    Science.gov (United States)

    Harris, B.; McDougall, K.; Barry, M.

    2012-07-01

    Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for analysis of larger data sets and also provide a consistent tool for the creation and analysis of waterways over extensive areas. However, waterways are rarely delineated over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km² to establish the optimal DEM scale required for waterway delineation in large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the 543 km² Wivenhoe catchment and a detailed 13 km² area within it), including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
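
Waterway delineation of the kind benchmarked here typically begins with a flow-direction pass over the DEM grid, most commonly the D8 rule (each cell drains to its steepest downslope neighbour). The following is a hedged, minimal sketch of that step only, not the authors' GIS toolchain; `d8_directions` is a hypothetical helper and real tools also handle pits, flats, and edge cells.

```python
import numpy as np

# The eight neighbour offsets used by the D8 rule.
OFFSETS = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
           (0, 1), (1, -1), (1, 0), (1, 1)]

def d8_directions(dem):
    """For each interior cell, return the offset of the steepest downslope
    neighbour, or None if the cell is a local minimum (a pit)."""
    rows, cols = dem.shape
    flow = {}
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            drops = []
            for dr, dc in OFFSETS:
                dist = np.hypot(dr, dc)  # diagonal neighbours are farther away
                drops.append(((dem[r, c] - dem[r + dr, c + dc]) / dist, (dr, dc)))
            best_drop, best_off = max(drops)
            flow[(r, c)] = best_off if best_drop > 0 else None
    return flow

# A tiny 4x4 DEM sloping toward the bottom-right corner.
dem = np.array([[4., 4., 4., 4.],
                [4., 3., 2., 2.],
                [4., 2., 1., 1.],
                [4., 2., 1., 0.]])
flow = d8_directions(dem)
```

Accumulating flow along these directions and thresholding the result is what yields the stream network whose sensitivity to DEM scale the paper evaluates.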

  3. How big is too big or how many partners are needed to build a large project which still can be managed successfully?

    Science.gov (United States)

    Henkel, Daniela; Eisenhauer, Anton

    2017-04-01

    During the last decades, the number of large research projects has increased, and with it the requirement for multidisciplinary, multisectoral collaboration. Such complex, large-scale projects demand new competencies for forming, managing, and using large, diverse teams as a competitive advantage. For complex projects the effort is magnified because multiple large international research consortia involving academic and non-academic partners, including big industries, NGOs, and private and public bodies, all with cultural differences, individually discrepant expectations of teamwork, and differences in the collaboration between national and multi-national administrations and research organisations, challenge the organisation and management of such multi-partner research consortia. How many partners are needed to establish and conduct a collaboration with a multidisciplinary and multisectoral approach? How much personnel effort, and what kinds of management techniques, are required for such projects? This presentation identifies advantages and challenges of large research projects based on the experience gained in an Innovative Training Network (ITN) project within the Marie Skłodowska-Curie Actions of the European HORIZON 2020 programme. Possible strategies are discussed to avoid conflicts from the very beginning of a project.

  4. Large-scale CO2 injection demos for the development of monitoring and verification technology and guidelines (CO2ReMoVe)

    Energy Technology Data Exchange (ETDEWEB)

    Wildenborg, T.; David, P. [TNO Built Environment and Geosciences, Princetonlaan 6, 3584 CB Utrecht (Netherlands); Bentham, M.; Chadwick, A.; Kirk, K. [British Geological Survey, Kingsley Dunham Centre, Keyworth, Nottingham NG12 5GG (United Kingdom); Dillen, M. [SINTEF Petroleum Research, Trondheim (Norway); Groenenberg, H. [Unit Policy Studies, Energy Research Centre of the Netherlands ECN, Amsterdam (Netherlands); Deflandre, J.P.; Le Gallo, J. [Institut Francais du Petrole, Rueil-Malmaison (France)

    2009-04-15

    The objectives of the EU project CO2ReMoVe are to undertake the research and development necessary to establish scientifically based standards for monitoring future CCS operations and to develop the performance assessment methodologies necessary to demonstrate the long-term reliability of geological storage of CO2. This could in turn lead to guidelines for the certification of sites suitable for CCS on a wide scale. Crucial to the project portfolio are the continuing large-scale CO2 injection operation at Sleipner, the injection operation at In Salah (Algeria) and the recently started injection project at Snoehvit (Norway). Two pilot sites are also currently in the project portfolio, Ketzin in Germany and K12-B in the offshore continental shelf of the Netherlands.

  5. Rainbow: a tool for large-scale whole-genome sequencing data analysis using cloud computing.

    Science.gov (United States)

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance; Messina, Thomas; Fan, Hongtao; Jaeger, Edward; Stephens, Susan

    2013-06-27

    Technical improvements have decreased sequencing costs and, as a result, the size and number of genomic datasets have increased rapidly. Because of the lower cost, large amounts of sequence data are now being produced by small to midsize research groups. Crossbow is a software tool that can detect single nucleotide polymorphisms (SNPs) in whole-genome sequencing (WGS) data from a single subject; however, Crossbow has a number of limitations when applied to multiple subjects from large-scale WGS projects. The data storage and CPU resources that are required for large-scale whole genome sequencing data analyses are too large for many core facilities and individual laboratories to provide. To help meet these challenges, we have developed Rainbow, a cloud-based software package that can assist in the automation of large-scale WGS data analyses. Here, we evaluated the performance of Rainbow by analyzing 44 different whole-genome-sequenced subjects. Rainbow has the capacity to process genomic data from more than 500 subjects in two weeks using cloud computing provided by the Amazon Web Service. The time includes the import and export of the data using Amazon Import/Export service. The average cost of processing a single sample in the cloud was less than 120 US dollars. Compared with Crossbow, the main improvements incorporated into Rainbow include the ability: (1) to handle BAM as well as FASTQ input files; (2) to split large sequence files for better load balance downstream; (3) to log the running metrics in data processing and monitoring multiple Amazon Elastic Compute Cloud (EC2) instances; and (4) to merge SOAPsnp outputs for multiple individuals into a single file to facilitate downstream genome-wide association studies. Rainbow is a scalable, cost-effective, and open-source tool for large-scale WGS data analysis. For human WGS data sequenced by either the Illumina HiSeq 2000 or HiSeq 2500 platforms, Rainbow can be used straight out of the box. 
Rainbow is available
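
Improvement (2), splitting large sequence files for better downstream load balance, can be sketched as follows. This is an illustrative reading of the idea, not Rainbow's actual implementation; `split_fastq` is a hypothetical helper that deals FASTQ records (four lines each) round-robin across shards so each alignment job receives a comparable number of reads.

```python
def split_fastq(lines, n_shards):
    """Distribute FASTQ records (4 lines each) round-robin across n_shards
    so that downstream jobs receive a balanced share of reads."""
    assert len(lines) % 4 == 0, "truncated FASTQ input"
    records = [lines[i:i + 4] for i in range(0, len(lines), 4)]
    shards = [[] for _ in range(n_shards)]
    for i, rec in enumerate(records):
        shards[i % n_shards].extend(rec)  # record i goes to shard i mod n
    return shards

# Two toy reads split across two shards.
fastq = ["@read1", "ACGT", "+", "IIII",
         "@read2", "TTGA", "+", "JJJJ"]
shards = split_fastq(fastq, 2)
```

Chunking at record boundaries like this keeps every shard a valid FASTQ file, which is what allows each EC2 instance to process its shard independently.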

  6. Bayesian Inversion for Large Scale Antarctic Ice Sheet Flow

    KAUST Repository

    Ghattas, Omar

    2015-01-07

    The flow of ice from the interior of polar ice sheets is the primary contributor to projected sea level rise. One of the main difficulties faced in modeling ice sheet flow is the uncertain spatially-varying Robin boundary condition that describes the resistance to sliding at the base of the ice. Satellite observations of the surface ice flow velocity, along with a model of ice as a creeping incompressible shear-thinning fluid, can be used to infer this uncertain basal boundary condition. We cast this ill-posed inverse problem in the framework of Bayesian inference, which allows us to infer not only the basal sliding parameters, but also the associated uncertainty. To overcome the prohibitive nature of Bayesian methods for large-scale inverse problems, we exploit the fact that, despite the large size of observational data, they typically provide only sparse information on model parameters. We show results for Bayesian inversion of the basal sliding parameter field for the full Antarctic continent, and demonstrate that the work required to solve the inverse problem, measured in number of forward (and adjoint) ice sheet model solves, is independent of the parameter and data dimensions.
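
The property being exploited, that sparse observations inform only a low-dimensional subspace of a high-dimensional parameter field, can be seen in a toy linear-Gaussian inversion. This sketch assumes a generic random linear forward operator and is in no way the paper's ice-sheet solver; all names and sizes here are illustrative.

```python
import numpy as np

# Toy linear Bayesian inversion: y = G m + noise, with a Gaussian prior on m.
rng = np.random.default_rng(0)
n_param, n_obs = 50, 5                 # many parameters, few observations
G = rng.standard_normal((n_obs, n_param))
sigma_obs, sigma_prior = 0.1, 1.0

m_true = rng.standard_normal(n_param)
y = G @ m_true + sigma_obs * rng.standard_normal(n_obs)

# Gaussian posterior: cov = (G^T G / s_obs^2 + I / s_prior^2)^-1.
H = G.T @ G / sigma_obs**2 + np.eye(n_param) / sigma_prior**2
cov_post = np.linalg.inv(H)
m_post = cov_post @ (G.T @ y / sigma_obs**2)

# The data reduce uncertainty in at most n_obs directions: the prior-to-
# posterior variance reduction has numerical rank n_obs, not n_param.
reduction = sigma_prior**2 * np.eye(n_param) - cov_post
rank = np.linalg.matrix_rank(reduction, tol=1e-8)
```

That the reduction has rank equal to the data dimension, rather than the parameter dimension, is the low-rank structure that makes large-scale Bayesian inversion tractable.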

  8. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  9. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC.Marc/Mentat, and its fatigue strain is analysed. Load simulation of the flange fatigue working condition is carried out with the Bladed software, and the flange fatigue load spectrum is obtained with the rain-flow counting method. Finally, fatigue analysis of the top flange is performed with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis of large-scale wind turbine generators and possess practical engineering value.

  10. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is applied to large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
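
One common ingredient of such robustness and efficiency measures is a CFL-limited adaptive time step that excludes dry cells from the wave-speed scan. The following is a hedged sketch of that ingredient under those assumptions, not the authors' actual scheme; `adaptive_dt` and the dry-depth threshold are hypothetical.

```python
import numpy as np

G = 9.81      # gravitational acceleration, m/s^2
H_DRY = 1e-6  # depth threshold below which a cell is treated as dry

def adaptive_dt(h, u, dx, cfl=0.9):
    """Largest stable step: dt = cfl * dx / max(|u| + sqrt(g*h)) over wet cells."""
    wet = h > H_DRY
    if not np.any(wet):
        return float("inf")  # nothing to evolve
    speed = np.abs(u[wet]) + np.sqrt(G * h[wet])  # shallow-water wave speed
    return cfl * dx / speed.max()

h = np.array([0.0, 1.0, 4.0, 0.0])  # depths (m), two dry cells
u = np.array([0.0, 2.0, 1.0, 0.0])  # velocities (m/s)
dt = adaptive_dt(h, u, dx=10.0)
```

Recomputing the step each iteration lets the solver take long steps when the flow is slow and shallow, which is one simple route to the running efficiency the abstract emphasises.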

  11. PANDA: a Large Scale Multi-Purpose Test Facility for LWR Safety Research

    Energy Technology Data Exchange (ETDEWEB)

    Dreier, Joerg; Paladino, Domenico; Huggenberger, Max; Andreani, Michele [Laboratory for Thermal-Hydraulics, Nuclear Energy and Safety Research Department, Paul Scherrer Institut (PSI), CH-5232 Villigen PSI (Switzerland); Yadigaroglu, George [ETH Zuerich, Technoparkstrasse 1, Einstein 22- CH-8005 Zuerich (Switzerland)

    2008-07-01

    PANDA is a large-scale, multi-purpose thermal-hydraulics test facility built and operated by PSI. Owing to its modular structure, PANDA provides flexibility for a variety of applications, ranging from integral containment system investigations and primary system tests to component experiments and large-scale separate-effects tests. For many applications the experimental results are used directly, for example for concept demonstrations or for the characterisation of phenomena or components, but all the experimental data generated in the various test campaigns are unique and have been, and will continue to be, widely used for the validation and improvement of a variety of computer codes for reactor safety analysis, including codes with 3D capabilities. The paper provides an overview of the completed and on-going research programs performed in the PANDA facility in the different areas of application, including the main results and conclusions of the investigations. In particular, the investigations of the advanced passive containment cooling system concepts of the SBWR, ESBWR and SWR1000 are presented and the main findings are summarised. Finally, the goals, planned investigations and expected results of the on-going OECD project SETH-2 are presented. (authors)

  13. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend toward large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  14. Size matters: the ethical, legal, and social issues surrounding large-scale genetic biobank initiatives

    Directory of Open Access Journals (Sweden)

    Klaus Lindgaard Hoeyer

    2012-04-01

    Full Text Available During the past ten years the complex ethical, legal and social issues (ELSI typically surrounding large-scale genetic biobank research initiatives have been intensely debated in academic circles. In many ways genetic epidemiology has undergone a set of changes resembling what in physics has been called a transition into Big Science. This article outlines consequences of this transition and suggests that the change in scale implies challenges to the roles of scientists and public alike. An overview of key issues is presented, and it is argued that biobanks represent not just scientific endeavors with purely epistemic objectives, but also political projects with social implications. As such, they demand clever maneuvering among social interests to succeed.

  15. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind, and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest-order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small length scales), this system of equations converges to the usual incompressible equations, and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale linearly with Mach number, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of the Mach number. 
Inhomogeneous nearly

  16. Developing a Massively Parallel Forward Projection Radiography Model for Large-Scale Industrial Applications

    Energy Technology Data Exchange (ETDEWEB)

    Bauerle, Matthew [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-08-01

    This project utilizes Graphics Processing Units (GPUs) to compute radiograph simulations for arbitrary objects. The generation of radiographs, also known as the forward projection imaging model, is computationally intensive and not widely utilized. The goal of this research is to develop a massively parallel algorithm that can compute forward projections for objects with a trillion voxels (3D pixels). To achieve this end, the data are divided into blocks that can each fit into GPU memory. The forward projected image is also divided into segments to allow for future parallelization and to avoid needless computations.
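
The blocking strategy can be illustrated with the simplest forward projection: an axis-aligned parallel-beam radiograph, where each detector pixel is a ray sum through the volume and the volume is processed one slab at a time with partial sums accumulated. This is a minimal CPU sketch of the idea, not the project's GPU code; `project_blocked` is a hypothetical helper.

```python
import numpy as np

def project_blocked(volume, axis=0, block=2):
    """Axis-aligned parallel-beam projection, accumulated slab by slab so
    that no more than `block` slices need be resident at once."""
    image = np.zeros([s for i, s in enumerate(volume.shape) if i != axis])
    n = volume.shape[axis]
    for start in range(0, n, block):                 # one slab at a time
        slab = np.take(volume, range(start, min(start + block, n)), axis=axis)
        image += slab.sum(axis=axis)                 # accumulate partial ray sums
    return image

vol = np.arange(8.0).reshape(2, 2, 2)                # tiny 2x2x2 "object"
img = project_blocked(vol, axis=0, block=1)
```

Because the ray sum is associative, the slab-wise result equals the whole-volume projection, which is what lets a trillion-voxel object be streamed through GPU memory block by block.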

  17. Naming Conventions for the Large Hadron Collider Project

    CERN Document Server

    Faugeras, Paul E

    1997-01-01

    This report gives the procedures for defining standard abbreviations for the various machine components of the Large Hadron Collider (LHC) Project, as well as for the surface buildings and the underground civil engineering works of the LHC. The contents of this report have been approved by the LHC Project Leader and are published in the form of a Project Report in order to allow their immediate implementation. They will be incorporated later into the Quality Assurance Plan of the LHC Project, which is under preparation.

  18. Large-scale network dynamics of beta-band oscillations underlie auditory perceptual decision-making

    Directory of Open Access Journals (Sweden)

    Mohsen Alavash

    2017-06-01

    Full Text Available Perceptual decisions vary in the speed at which we make them. Evidence suggests that translating sensory information into perceptual decisions relies on distributed interacting neural populations, with decision speed hinging on power modulations of the neural oscillations. Yet the dependence of perceptual decisions on the large-scale network organization of coupled neural oscillations has remained elusive. We measured magnetoencephalographic signals in human listeners who judged acoustic stimuli composed of carefully titrated clouds of tone sweeps. These stimuli were used in two task contexts, in which the participants judged the overall pitch or direction of the tone sweeps. We traced the large-scale network dynamics of the source-projected neural oscillations on a trial-by-trial basis using power-envelope correlations and graph-theoretical network discovery. In both tasks, faster decisions were predicted by higher segregation and lower integration of coupled beta-band (∼16–28 Hz) oscillations. We also uncovered the brain network states that promoted faster decisions in either lower-order auditory or higher-order control brain areas. Specifically, decision speed in judging the tone sweep direction critically relied on the nodal network configurations of anterior temporal, cingulate, and middle frontal cortices. Our findings suggest that global network communication during perceptual decision-making is implemented in the human brain by large-scale couplings between beta-band neural oscillations. The speed at which we make perceptual decisions varies. This translation of sensory information into perceptual decisions hinges on dynamic changes in neural oscillatory activity. However, the large-scale neural-network embodiment supporting perceptual decision-making is unclear. We addressed this question in two auditory perceptual decision-making experiments. Using graph-theoretical network discovery, we traced the large-scale network
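
The power-envelope correlation used here as the coupling measure before graph construction can be sketched with an FFT-based analytic signal. This is illustrative only, on synthetic amplitude-modulated "beta" carriers; the study's actual pipeline additionally involves source projection and, typically, signal orthogonalisation.

```python
import numpy as np

def envelope(x):
    """Amplitude envelope via an FFT-based analytic signal (even-length x)."""
    n = x.size
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = h[n // 2] = 1.0   # keep DC and Nyquist bins
    h[1:n // 2] = 2.0        # double the positive frequencies
    return np.abs(np.fft.ifft(X * h))

t = np.linspace(0, 1, 1024, endpoint=False)
shared = 1.0 + 0.5 * np.sin(2 * np.pi * 2 * t)   # slow common power modulation
x = shared * np.sin(2 * np.pi * 20 * t)          # two carriers in the beta range
y = shared * np.sin(2 * np.pi * 23 * t)
r = np.corrcoef(envelope(x), envelope(y))[0, 1]  # power-envelope coupling
```

Even though the carriers at 20 and 23 Hz are themselves nearly uncorrelated, their shared slow power modulation yields a high envelope correlation, and it is a matrix of such values over brain regions that feeds the graph-theoretical analysis.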

  19. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    Science.gov (United States)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be highly influenced by if and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations and using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. 
Where model skill in reproducing these patterns is high, it can be inferred that extremes are
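
The composite-analysis step described above can be sketched by averaging circulation anomaly maps over the days on which local temperature exceeds its 95th percentile. All data here are synthetic and the "ridge" pattern is a hypothetical illustration, not an observed field.

```python
import numpy as np

rng = np.random.default_rng(1)
n_days, ny, nx = 500, 4, 5
z500 = rng.standard_normal((n_days, ny, nx))  # synthetic height-anomaly maps
ridge = np.ones((ny, nx))                     # idealised "ridge" pattern

# Make hot days co-occur with the ridge pattern.
temp = rng.standard_normal(n_days)
hot = temp > np.percentile(temp, 95)
z500[hot] += 2.0 * ridge

composite = z500[hot].mean(axis=0)            # mean map on extreme days only
climatology = z500.mean(axis=0)               # mean map over all days
```

The composite isolates the anomaly pattern tied to the extremes while the all-day climatology stays near zero, and comparing model composites against observed ones is exactly the kind of evaluation target the abstract proposes.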

  20. The permissibility of power plants and other large-scale projects outside densely populated areas from the point of view of planning legislation

    International Nuclear Information System (INIS)

    Hoppe, W.

    1978-01-01

    For the licensing of power plants, nuclear power plants and other large industrial projects, the question of admissibility under building law is gaining ever more importance in the licensing procedure. According to Section 6 of the Federal Immission Protection Act, it is not sufficient that the immission protection duties of the operator of plants subject to licensing are fulfilled; the construction and operation of the plant must also comply with other provisions of public law. In this connection, the admissibility of the project under planning law must also be examined. This also applies to nuclear power plants. In licensing practice, such plants are treated in different ways as far as building planning is concerned: some licensing authorities regard them as privileged projects outside built-up areas, while others demand a zoning plan from the community. The author argues that such large projects should be licensed only on the basis of a zoning plan drawn up by the community. (orig.)

  1. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  2. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted to a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves, and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^(-3) scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^(-3) power law, and its flux is directed to small scales. For moderate stratification, there is no VSHF, and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^(-11/5) form at large scales to a steeper approximate k^(-3) scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^(-1.64) scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^(-2.5) scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^(-1.6) power law. For all stratification strengths, the total energy always flows from large to small scales, and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.
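
Spectral indices such as those quoted above are typically estimated by fitting a power law to the computed spectrum over an inertial range. A minimal sketch of that diagnostic, using a synthetic k^(-3) spectrum rather than the simulation data:

```python
import numpy as np

def spectral_slope(k, E):
    """Fit E(k) ~ k^slope by least squares on log E versus log k."""
    slope, _ = np.polyfit(np.log(k), np.log(E), 1)
    return slope

k = np.arange(1.0, 101.0)     # wavenumbers in the fitted range
E = 7.3 * k**(-3.0)           # idealised enstrophy-cascade spectrum
slope = spectral_slope(k, E)
```

In practice the fit is restricted to the scaling range (away from the forcing and dissipation scales), which is why the abstract qualifies its exponents as approximate.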

  4. First Joint Workshop on Energy Management for Large-Scale Research Infrastructures

    CERN Document Server

    2011-01-01

    CERN, ERF (European Association of National Research Facilities) and ESS (European Spallation Source) announce the first Joint Workshop on Energy Management for Large-Scale Research Infrastructures. The event will take place on 13-14 October 2011 at the ESS office in Sparta - Lund, Sweden. The workshop will bring together international experts on energy and representatives from laboratories and future projects all over the world in order to identify challenges and best practices in energy efficiency and optimization, solutions and their implementation, and to review the challenges posed by potential future technical solutions and the tools for effective collaboration. Further information at: http://ess-scandinavia.eu/general-information

  5. Large-scale applications of superconductivity in the United States: an overview. Metallurgy, fabrication, and applications

    International Nuclear Information System (INIS)

    Hein, R.A.; Gubser, D.U.

    1981-01-01

    This report presents an overview of ongoing development efforts in the USA concerned with large-scale applications of superconductivity. These applications are grouped by magnetic field regime: low, intermediate and high. In the low-field regime, two diverse areas of large-scale application are identified: superconducting power transmission lines for electric utilities, and RF cavities for particle accelerators used in high-energy physics research. Activity in the intermediate-field regime has increased significantly owing to Fermilab's Energy Doubler (Tevatron) project and BNL's ISABELLE project. Rotating electrical machines, such as DC acyclic (homopolar) motors and generators, as well as energy storage magnets, are also studied. In the high-field regime, magnetohydrodynamics (MHD) and magnetically confined fusion in tokamaks are examined. In each regime, all current work is summarized according to key person, research topic, type of program, funding, status, and future outlook.

  6. Utility-Scale Solar 2015: An Empirical Analysis of Project Cost, Performance, and Pricing Trends in the United States

    Energy Technology Data Exchange (ETDEWEB)

    Bolinger, Mark [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division; Seel, Joachim [Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States). Environmental Energy Technologies Division

    2016-08-17

    The utility-scale solar sector—defined here to include any ground-mounted photovoltaic (“PV”), concentrating photovoltaic (“CPV”), or concentrating solar power (“CSP”) project that is larger than 5 MWAC in capacity—has led the overall U.S. solar market in terms of installed capacity since 2012. It is expected to maintain its market-leading position for at least another five years, driven in part by December 2015’s three-year extension of the 30% federal investment tax credit (“ITC”) through 2019 (coupled with a favorable switch to a “start construction” rather than a “placed in service” eligibility requirement, and a gradual phase down of the credit to 10% by 2022). In fact, in 2016 alone, the utility-scale sector is projected to install more than twice as much new capacity as it ever has previously in a single year. This unprecedented boom makes it difficult, yet more important than ever, to stay abreast of the latest utility-scale market developments and trends. This report—the fourth edition in an ongoing annual series—is intended to help meet this need, by providing in-depth, annually updated, data-driven analysis of the utility-scale solar project fleet in the United States. Drawing on empirical project-level data from a wide range of sources, this report analyzes not just installed project costs or prices—i.e., the traditional realm of most solar economic analyses—but also operating costs, capacity factors, and power purchase agreement (“PPA”) prices from a large sample of utility-scale solar projects throughout the United States. Given its current dominance in the market, utility-scale PV also dominates much of this report, though data from CPV and CSP projects are also presented where appropriate.

  7. South Louisiana Enhanced Oil Recovery/Sequestration R&D Project Small Scale Field Tests of Geologic Reservoir Classes for Geologic Storage

    Energy Technology Data Exchange (ETDEWEB)

    Hite, Roger [Blackhorse Energy LLC, Houston, TX (United States)

    2016-10-01

    The project site is located in Livingston Parish, Louisiana, approximately 26 miles due east of Baton Rouge. This project proposed to evaluate an early Eocene-aged Wilcox oil reservoir for permanent storage of CO2. Blackhorse Energy, LLC planned to conduct a parallel CO2 oil recovery project in the First Wilcox Sand. The primary focus of this project was to examine and prove the suitability of South Louisiana geologic formations for large-scale geologic sequestration of CO2 in association with enhanced oil recovery applications. This was to be accomplished through the focused demonstration of small-scale, permanent storage of CO2 in the First Wilcox Sand. The project was terminated at the request of Blackhorse Energy LLC on October 22, 2014.

  8. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two coaxial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10^4, and the radius ratio η = ri/ro is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = −0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of cs = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, "over-damped" LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential of using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
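
The static Smagorinsky closure mentioned above builds a subgrid eddy viscosity from the resolved strain rate, ν_t = (c_s Δ)² |S|. A minimal sketch with the quoted constant c_s = 0.1 follows; the function name and the reduction to a single 2D velocity-gradient sample are illustrative assumptions, not the paper's (three-dimensional) code:

```python
import numpy as np

def smagorinsky_nu_t(dudx, dudy, dvdx, dvdy, delta, cs=0.1):
    """Static-Smagorinsky eddy viscosity nu_t = (cs*Delta)^2 * |S|.

    |S| = sqrt(2 S_ij S_ij), with S_ij the resolved strain-rate tensor,
    here assembled from a 2D slice of velocity gradients. cs=0.1 matches
    the constant quoted in the abstract; everything else is illustrative.
    """
    s11 = dudx
    s22 = dvdy
    s12 = 0.5 * (dudy + dvdx)          # symmetric off-diagonal component
    s_mag = np.sqrt(2.0 * (s11**2 + s22**2 + 2.0 * s12**2))
    return (cs * delta)**2 * s_mag

# Pure shear du/dy = 1 on a filter width of 0.1: |S| = 1, nu_t = 1e-4
nu = smagorinsky_nu_t(0.0, 1.0, 0.0, 0.0, delta=0.1)
```

The dynamic model replaces the fixed c_s with a coefficient computed on the fly from a test filter, which is why it can adapt somewhat better as the fluctuations become anisotropic.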

  9. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analyses focused on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international electricity market by locating exports in hours of high prices are included in the analyses. This paper focuses on results which are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. This ability is expressed by three factors: the first is the degree of excess electricity production caused by fluctuations in wind and CHP heat demands; the second is the ability to utilise wind power to reduce CO2 emissions in the system; and the third is the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed for a wind power input ranging from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced by CHP, a number of future energy systems with CO2 reduction potentials are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants in the regulation task of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. The potential of energy management has also been analysed. The results of the analyses make it possible to compare short-term and long-term potentials of different strategies for large-scale integration of wind power.
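
The first of the three factors, excess electricity production, can be illustrated with a deliberately minimal calculation. This stand-in ignores CHP units, grid-stability constraints and market exchange, all of which the study's full system models include, and the function name is an assumption:

```python
import numpy as np

def excess_share(wind, demand):
    """Fraction of total demand that appears as excess electricity when
    wind output exceeding instantaneous demand can be neither exported
    nor stored. A toy stand-in for the full energy-system models used
    in the study; inputs are hourly time series of equal length."""
    excess = np.maximum(wind - demand, 0.0)   # surplus in each hour
    return excess.sum() / demand.sum()

# Two hours: wind covers hour 1 twice over and hour 2 not at all,
# so half of total demand shows up as unusable excess production.
share = excess_share(np.array([2.0, 0.0]), np.array([1.0, 1.0]))
```

Regulation strategies such as heat pumps and heat storage act, in this picture, by reshaping the demand series so that less of the wind series spills over.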

  10. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g−1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit electrochemical performance superior to that of graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote graphitization. ► The HGCNSs exhibit electrochemical performance superior to that of graphite.

  11. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using algorithms specific to the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interactions were each solved with the algorithm running on the GPU to test the performance of the package. A comparison of the results from the single-CPU solver and the GPU solver showed that the GPU version runs about 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
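
As an illustration of the semi-implicit Fourier method named above, here is a minimal single time step for the Allen-Cahn equation, with the stiff Laplacian term treated implicitly in Fourier space and the nonlinear term explicitly. This is a plain NumPy CPU sketch with an assumed double-well potential f(φ) = (φ² − 1)²/4, not the CUDA package itself:

```python
import numpy as np

def allen_cahn_step(phi, dt=0.1, M=1.0, kappa=1.0):
    """One semi-implicit Fourier step of d(phi)/dt = M*(kappa*lap(phi) - f'(phi)),
    with f'(phi) = phi**3 - phi. The linear Laplacian is implicit in Fourier
    space, the nonlinearity explicit; grid, unit box and parameters are
    illustrative assumptions."""
    n = phi.shape[0]                                   # square periodic grid
    kx = 2.0 * np.pi * np.fft.fftfreq(n, d=1.0 / n)    # angular wavenumbers
    k2 = kx[:, None]**2 + kx[None, :]**2
    phi_hat = np.fft.fft2(phi)
    nonlin_hat = np.fft.fft2(phi**3 - phi)             # explicit bulk term
    # implicit treatment of the -M*kappa*k^2 stiff term:
    phi_hat_new = (phi_hat - dt * M * nonlin_hat) / (1.0 + dt * M * kappa * k2)
    return np.real(np.fft.ifft2(phi_hat_new))
```

The uniform equilibria φ = ±1 are exact fixed points of this scheme, which is a quick sanity check; the GPU gain reported in the abstract comes from batching exactly these FFTs and pointwise updates in CUDA.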

  12. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is manifested not only by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first-mile candidate to accommodate the data tsunami to be generated by the IoT. However, in the cellular paradigm IoT devices are required to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
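
The random access bottleneck described above can be illustrated with a textbook collision model: if each of n active devices independently picks one of m contention preambles, a given device succeeds only when no other device picks the same preamble. The function and parameter names are illustrative; real cellular procedures add backoff, retries and power ramping:

```python
def preamble_success_prob(n_devices, n_preambles):
    """Probability that a tagged device's random-access attempt is
    collision-free when each of n_devices picks uniformly at random
    among n_preambles (slotted-ALOHA-style toy model, used only to
    illustrate the bottleneck the article describes)."""
    return (1.0 - 1.0 / n_preambles) ** (n_devices - 1)

# With a fixed preamble pool, success probability decays rapidly as the
# number of simultaneously active devices grows past the pool size.
p_light = preamble_success_prob(10, 54)
p_heavy = preamble_success_prob(500, 54)
```

The collapse of this probability under massive simultaneous activation is exactly the large-scale IoT problem that motivates revisiting the procedure.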

  13. Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Kishimoto, Yasuaki; Sugahara, Akihiro; Li, J.Q.

    2008-01-01

    Large-scale simulation using super-computers, which generally requires long CPU time and produces large amounts of data, has been extensively studied as a third pillar in various advanced science fields, in parallel to theory and experiment. Such simulations are expected to lead to new scientific discoveries through the elucidation of complex phenomena that are hardly identified by conventional theoretical and experimental approaches alone. In order to assist such large simulation studies, in which many collaborators working at geographically different places participate and contribute, we have developed a unique remote collaboration system, referred to as SIMON (simulation monitoring system), which is based on client-server control and introduces the idea of up-date processing, in contrast to the widely used post-processing. As a key ingredient, we have developed a trigger method, which transmits requests for up-date processing from the simulation (client) running on a super-computer to a workstation (server). That is, the simulation running on the super-computer actively controls the timing of up-date processing. The server, having received requests from the ongoing simulation such as data transfer, data analyses, and visualizations, starts operations according to those requests during the simulation. The server makes the latest results available to web browsers, so that collaborators can monitor the results at any place and time in the world. By applying the system to a specific simulation project on laser-matter interaction, we have confirmed that the system works well and plays an important role as a collaboration platform on which many collaborators work with one another.

  14. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    Science.gov (United States)

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and, ultimately, routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity, and has recently been implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research are discussed, along with lived evaluation challenges, responses to overcome these, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD according to the described evaluation framework are presented for the purpose of informing the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges encountered included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting. Constructs

  15. Advanced Grid-Friendly Controls Demonstration Project for Utility-Scale PV Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Gevorgian, Vahan; O'Neill, Barbara

    2016-01-21

    A typical photovoltaic (PV) power plant consists of multiple power electronic inverters and can contribute to grid stability and reliability through sophisticated 'grid-friendly' controls. The availability and dissemination of actual test data showing the viability of advanced utility-scale PV controls among all industry stakeholders can elevate PV from being simply an energy resource to a provider of additional ancillary services that range from variability smoothing and frequency regulation to power quality. Strategically partnering with a selected utility and/or PV power plant operator is a key condition for a successful demonstration project. The U.S. Department of Energy's (DOE's) Solar Energy Technologies Office selected the National Renewable Energy Laboratory (NREL) to be a principal investigator in a two-year project with goals to (1) identify potential partners, (2) develop a detailed scope of work and test plan for a field project to demonstrate the grid-friendly capabilities of utility-scale PV power plants, (3) facilitate conducting actual demonstration tests, and (4) disseminate test results among industry stakeholders via a joint NREL/DOE publication and participation in relevant technical conferences. The project implementation took place in FY 2014 and FY 2015. In FY14, NREL established collaborations with AES and First Solar Electric, LLC, to conduct demonstration testing on their utility-scale PV power plants in Puerto Rico and Texas, respectively, and developed test plans for each partner. Both the Puerto Rico Electric Power Authority and the Electric Reliability Council of Texas expressed interest in this project because of the importance of such advanced controls for the reliable operation of their power systems under high penetration levels of variable renewable generation. During FY15, testing was completed on both plants, and a large amount of test data was produced and analyzed that demonstrates the ability of

  16. MEMPHYS: A large scale water Cherenkov detector at Frejus

    International Nuclear Information System (INIS)

    Bellefon, A. de; Dolbeau, J.; Gorodetzky, P.; Katsanevas, S.; Patzak, T.; Salin, P.; Tonazzo, A.; Bouchez, J.; Busto, J.; Campagne, J.E.; Cavata, C.; Mosca, L.; Dumarchez, J.; Mezzetto, M.; Volpe, C.

    2006-07-01

    A water Cherenkov detector project of megaton scale, to be installed in the Frejus underground site and dedicated to nucleon decay, neutrinos from supernovae, solar and atmospheric neutrinos, as well as neutrinos from a super-beam and/or a beta-beam coming from CERN, is presented and compared with competitor projects in Japan and in the USA. The performance of the European project is discussed, including the possibility to measure the mixing angle θ_13 and the CP-violating phase δ. (authors)

  17. Large-scale fortification of condiments and seasonings as a public health strategy: equity considerations for implementation.

    Science.gov (United States)

    Zamora, Gerardo; Flores-Urrutia, Mónica Crissel; Mayén, Ana-Lucia

    2016-09-01

    Fortification of staple foods with vitamins and minerals is an effective approach to increase micronutrient intake and improve nutritional status. The specific use of condiments and seasonings as vehicles in large-scale fortification programs is a relatively new public health strategy. This paper underscores equity considerations for the implementation of large-scale fortification of condiments and seasonings as a public health strategy by examining nonexhaustive examples of programmatic experiences and pilot projects in various settings. An overview of conceptual elements in implementation research and equity is presented, followed by an examination of equity considerations for five implementation strategies: (1) enhancing the capabilities of the public sector, (2) improving the performance of implementing agencies, (3) strengthening the capabilities and performance of frontline workers, (4) empowering communities and individuals, and (5) supporting multiple stakeholders engaged in improving health. Finally, specific considerations related to intersectoral action are discussed. Large-scale fortification of condiments and seasonings cannot be a standalone strategy and needs to be implemented alongside concurrent and coordinated public health strategies, which should be informed by a health equity lens. © 2016 New York Academy of Sciences.

  18. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, covering a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
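
The pipeline stages listed above (feature representation, feature indexing, searching) can be sketched in a few lines. Real systems use learned descriptors and approximate nearest-neighbour indexes rather than the brute-force cosine search shown here; all names and the toy data are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_feature(image):
    """Stand-in feature extractor: flatten and L2-normalise the pixels.
    A real pipeline would use a learned representation instead."""
    v = image.ravel().astype(float)
    return v / (np.linalg.norm(v) + 1e-12)

# Stage 1-2: represent every database image and stack into an "index".
database = [rng.random((8, 8)) for _ in range(100)]        # toy images
index = np.stack([extract_feature(im) for im in database])

# Stage 3: search by similarity against the index.
def search(query, k=5):
    """Return indices of the k database images most similar to the query
    (exact cosine similarity; large systems approximate this step)."""
    scores = index @ extract_feature(query)
    return np.argsort(-scores)[:k]

hits = search(database[42])   # querying with a database image
```

Querying with an image already in the database should return that image first, which is the usual smoke test for such a pipeline.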

  19. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of these large-scale environments is therefore imperative for the success of such applications, since it increases the immersive experience of the users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments remains time-consuming and largely manual. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures which, unlike existing techniques, can recover missing or occluded texture information by integrating multiple observations captured by different optical sensors (ground, aerial, and satellite).

  20. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The prototypes are chosen according to two important criteria: they not only perform an effective low-rank approximation of the kernel matrix, but also span a model that suffers minimal information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
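
The low-rank role of the prototypes can be illustrated with a Nyström-style construction: a small set of landmark points approximates the full kernel matrix. This is a generic sketch under assumed data; the PVM's actual prototype selection optimises the two criteria described above rather than sampling at random:

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))            # "large" data set (toy scale)
P = X[rng.choice(200, 20, replace=False)]    # 20 prototype vectors (sampled here)

# Nystrom-style low-rank approximation: K ~= K_xp @ pinv(K_pp) @ K_xp.T,
# so only an n x m and an m x m block are ever formed, not the full n x n.
K_xp = rbf(X, P)
K_pp = rbf(P, P)
K_approx = K_xp @ np.linalg.pinv(K_pp) @ K_xp.T

K_exact = rbf(X, X)
rel_err = np.linalg.norm(K_exact - K_approx) / np.linalg.norm(K_exact)
```

With n data points and m prototypes, storage and solve costs scale with n·m and m² instead of n², which is the source of the scalability the abstract claims.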

  1. What to be implemented at the early stage of a large-scale project

    CERN Document Server

    Bachy, Gérard; Bachy, Gerard; Hameri, Ari Pekka

    1997-01-01

    This paper addresses the importance of the actions to be taken before the project planning phases begin. The approach taken stems from the production planning paradigm, with emphasis on the product, rather than on the process. It is argued that a complete part list or product breakdown structure (PBS) is the absolute prerequisite for the design of a successful work breakdown structure (WBS) for a project. This process requires the definition of the design and configuration disciplines during the engineering phase. These critical issues of concurrent engineering and product development are also emphasized in the paper. The WBS is, in turn, needed to establish a suitable organizational breakdown structure (OBS or organigram) for the project. Finally, the assembly sequence and the related assembly breakdown structure (ABS) of the end product is required before commencing the project planning phase, which provides the schedules, resource allocation, progress control, and the like for the project management. Detai...
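
The dependency chain the paper argues for (a complete PBS first, a WBS derived from it, then an OBS assigning ownership) can be made concrete with a toy data structure; all item and group names here are invented for illustration:

```python
# Product breakdown structure (PBS): what the end product consists of.
pbs = {"detector": ["magnet", "tracker"], "magnet": [], "tracker": []}

# Work breakdown structure (WBS): one work package per product item,
# derived from the PBS rather than invented independently.
wbs = {f"design-{item}": item for item in pbs}

# Organizational breakdown structure (OBS): an owner per work package.
obs = {work_package: "engineering-group" for work_package in wbs}

# Consistency checks mirroring the paper's argument: every work package
# traces back to a product item, and every work package has an owner.
assert set(wbs.values()) == set(pbs)
assert set(obs) == set(wbs)
```

Only after these structures exist (plus an assembly sequence) does scheduling and resource allocation have something well-defined to operate on, which is the paper's central claim.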

  2. The Large Scale Distribution of Water Ice in the Polar Regions of the Moon

    Science.gov (United States)

    Jordan, A.; Wilson, J. K.; Schwadron, N.; Spence, H. E.

    2017-12-01

    For in situ resource utilization, one must know where water ice is on the Moon. Many datasets have revealed both surface deposits of water ice and subsurface deposits of hydrogen near the lunar poles, but it has proved difficult to reconcile the locations of these deposits. Despite these datasets disagreeing on how deposits are distributed on small scales, we show that most of them do agree on the large-scale distribution of water ice. We present data from the Cosmic Ray Telescope for the Effects of Radiation (CRaTER) on the Lunar Reconnaissance Orbiter (LRO), LRO's Lunar Exploration Neutron Detector (LEND), the Neutron Spectrometer on Lunar Prospector (LPNS), LRO's Lyman Alpha Mapping Project (LAMP), LRO's Lunar Orbiter Laser Altimeter (LOLA), and Chandrayaan-1's Moon Mineralogy Mapper (M3). All, including those that show clear evidence for water ice, reveal surprisingly similar trends with latitude, suggesting that both surface and subsurface datasets are measuring ice. All show that water ice increases towards the poles, and most demonstrate that its signature appears at about ±70° latitude and increases poleward. This is consistent with simulations of how surface and subsurface cold traps are distributed with latitude. This large-scale agreement constrains the origin of the ice, suggesting that an ancient cometary impact (or impacts) created a large-scale deposit that has been rendered locally heterogeneous by subsequent impacts. Furthermore, it shows that water ice may be available down to ±70°, latitudes that are more accessible than the poles for landing.

  3. Estimating the electricity prices, generation costs and CO_2 emissions of large scale wind energy exports from Ireland to Great Britain

    International Nuclear Information System (INIS)

    Cleary, Brendan; Duffy, Aidan; Bach, Bjarne; Vitina, Aisma; O’Connor, Alan; Conlon, Michael

    2016-01-01

    The share of wind generation in the Irish and British electricity markets is set to increase by 2020 due to renewable energy (RE) targets. The United Kingdom (UK) and Ireland have set ambitious targets which require 30% and 40% of electricity demand, respectively, to come from RE, mainly wind, by 2020. Ireland has sufficient indigenous onshore wind energy resources to exceed its RE target, while the UK faces uncertainty in achieving its target. A possible solution for the UK is to import RE directly from large-scale onshore and offshore wind energy projects in Ireland; this possibility has recently been explored by both governments but is currently on hold. Thus, the aim of this paper is to estimate the effects of large-scale wind energy in the Irish and British electricity markets in terms of wholesale system marginal prices, total generation costs and CO_2 emissions. The results indicate that when the large-scale Irish-based wind energy projects are connected directly to the UK, the Irish and British wholesale system marginal prices decrease by 0.6% and 2%, respectively, under the UK National Grid slow progression scenario. - Highlights: • Modelling the Irish and British electricity markets. • Investigating the impacts of large-scale wind energy within the markets. • Results indicate a reduction in wholesale system marginal prices in both markets. • Decrease in total generation costs and CO_2 emissions in both markets.

  4. Towards Agent-Based Simulation of Emerging and Large-Scale Social Networks. Examples of the Migrant Crisis and MMORPGs

    Directory of Open Access Journals (Sweden)

    Schatten, Markus

    2016-10-01

    Full Text Available Large-scale agent-based simulation of social networks is described in the context of the migrant crisis in Syria and the EU, as well as massively multiplayer online role-playing games (MMORPGs). The recipeWorld system by Terna and Fontana is proposed as a possible solution for simulating large-scale social networks. The initial system has been re-implemented using the Smart Python multi-Agent Development Environment (SPADE), and Pyinteractive was used for visualization. We present initial simulation models that we plan to develop further in future studies. This paper thus reports research in progress that will hopefully establish a novel agent-based modelling system in the context of the ModelMMORPG project.

  5. COMPARISON OF MULTI-SCALE DIGITAL ELEVATION MODELS FOR DEFINING WATERWAYS AND CATCHMENTS OVER LARGE AREAS

    Directory of Open Access Journals (Sweden)

    B. Harris

    2012-07-01

    Full Text Available Digital Elevation Models (DEMs) allow for the efficient and consistent creation of waterways and catchment boundaries over large areas. Studies of waterway delineation from DEMs are usually undertaken over small or single-catchment areas due to the nature of the problems being investigated. Improvements in Geographic Information Systems (GIS) techniques, software, hardware and data allow for the analysis of larger data sets and also provide a consistent tool for the creation and analysis of waterways over extensive areas. However, they are rarely developed over large regional areas because of the lack of available raw data sets and the amount of work required to create the underlying DEMs. This paper examines the definition of waterways and catchments over an area of approximately 25,000 km2 to establish the optimal DEM scale required for waterway delineation over large regional projects. The comparative study analysed multi-scale DEMs over two test areas (the Wivenhoe catchment, 543 km2, and a detailed 13 km2 area within the Wivenhoe catchment), including various data types, scales, quality, and variable catchment input parameters. Historic and available DEM data were compared to high-resolution Lidar-based DEMs to assess variations in the formation of stream networks. The results identified that, particularly in areas of high elevation change, DEMs at 20 m cell size created from broad-scale 1:25,000 data (combined with more detailed data or manual delineation in flat areas) are adequate for the creation of waterways and catchments at a regional scale.
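
The waterway-delineation step that the study performs at scale is conventionally based on flow routing over the DEM. Here is a toy D8 flow-accumulation sketch; it omits depression filling, flat-area handling and map projections, all of which production GIS tools require, and every name is illustrative:

```python
import numpy as np

def d8_flow_accumulation(dem):
    """Toy D8 flow accumulation on a small DEM grid.

    Each cell sends its accumulated contributing area to its steepest
    downhill 8-connected neighbour. Cells are processed from highest to
    lowest so that upstream contributions arrive before a cell drains.
    """
    nrows, ncols = dem.shape
    acc = np.ones_like(dem, dtype=float)        # each cell contributes itself
    order = np.argsort(dem, axis=None)[::-1]    # high -> low elevation
    nbrs = [(-1, -1), (-1, 0), (-1, 1), (0, -1),
            (0, 1), (1, -1), (1, 0), (1, 1)]
    for flat in order:
        r, c = divmod(flat, ncols)
        best, drop = None, 0.0
        for dr, dc in nbrs:
            rr, cc = r + dr, c + dc
            if 0 <= rr < nrows and 0 <= cc < ncols:
                d = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                if d > drop:                     # steepest strictly-downhill slope
                    best, drop = (rr, cc), d
        if best is not None:
            acc[best] += acc[r, c]
    return acc

# Tilted plane (elevation = row index): every column drains to row 0,
# so the bottom row accumulates all four cells of its column.
dem = np.add.outer(np.arange(4.0), np.zeros(4))
acc = d8_flow_accumulation(dem)
```

Streams are then delineated by thresholding the accumulation grid, which is where the DEM cell size studied in the paper directly controls the result.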

  6. Economic viability of large-scale fusion systems

    Energy Technology Data Exchange (ETDEWEB)

    Helsley, Charles E., E-mail: cehelsley@fusionpowercorporation.com; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities, for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is needed for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, which can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific to the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each, with up to 10 to be serviced by one accelerator. At first this seems expensive, but that impression has to be tempered by the energy output, which is equal to that of 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix, but synthetic fuels and water may also need to be incorporated to make the large system
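The "cost per kWh is actually low" claim can be checked with a back-of-envelope levelized capital-cost calculation using the abstract's figures. The capacity factor and plant lifetime below are illustrative assumptions (not from the abstract), and discounting and operating costs are ignored:

```python
def capital_cost_per_kwh(accelerator_usd, n_chambers, chamber_usd, plants_equiv,
                         plant_gwe=1.0, capacity_factor=0.9, lifetime_years=30):
    """Naive levelized capital cost: total capital spread over lifetime output.

    capacity_factor and lifetime_years are illustrative assumptions; discounting
    and operating costs are ignored.
    """
    capital_usd = accelerator_usd + n_chambers * chamber_usd
    # 1 GWe = 1e6 kW; 8760 hours per year
    lifetime_kwh = (plants_equiv * plant_gwe * 1e6 * 8760
                    * capacity_factor * lifetime_years)
    return capital_usd / lifetime_kwh

# $20B accelerator plus 10 chambers at $1.5B each; output of 35 conventional plants
cost = capital_cost_per_kwh(20e9, 10, 1.5e9, plants_equiv=35)
```

Under these assumptions the capital component comes to well under one US cent per kWh, consistent with the abstract's qualitative claim; a real levelized cost would add discounting, operations and fuel handling.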

  7. Economic viability of large-scale fusion systems

    International Nuclear Information System (INIS)

    Helsley, Charles E.; Burke, Robert J.

    2014-01-01

    A typical modern power generation facility has a capacity of about 1 GWe (gigawatt electric) per unit. This works well for fossil fuel plants and for most fission facilities, for it is large enough to support the sophisticated generation infrastructure but still small enough to be accommodated by most utility grid systems. The size of potential fusion power systems may demand a different viewpoint. The compression and heating of the fusion fuel for ignition requires a large driver, even if it is needed for only a few microseconds or nanoseconds per energy pulse. The economics of large systems, which can effectively use more of the driver capacity, need to be examined. The assumptions used in this model are specific to the Fusion Power Corporation (FPC) SPRFD process but could be generalized for any system. We assume that the accelerator is the most expensive element of the facility and estimate its cost to be $20 billion. Ignition chambers and fuel handling facilities are projected to cost $1.5 billion each, with up to 10 to be serviced by one accelerator. At first this seems expensive, but that impression has to be tempered by the energy output, which is equal to that of 35 conventional nuclear plants. This means the cost per kWh is actually low. Using the above assumptions and industry data for generators and heat exchange systems, we conclude that a fully utilized fusion system will produce marketable energy at roughly one half the cost of our current means of generating an equivalent amount of energy from conventional fossil fuel and/or fission systems. Even fractionally utilized systems, i.e. systems used at 25% of capacity, can be cost effective in many cases. In conclusion, SPRFD systems can be scaled to a size and configuration that can be economically viable and very competitive in today's energy market. Electricity will be a significant element in the product mix, but synthetic fuels and water may also need to be incorporated to make the large system economically

  8. Large scale photovoltaic field trials. Second technical report: monitoring phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This report provides an update on the Large-Scale Building Integrated Photovoltaic Field Trials (LS-BIPV FT) programme commissioned by the Department of Trade and Industry (later the Department for Business, Enterprise and Regulatory Reform; BERR). It provides detailed profiles of the 12 projects making up this programme, which is part of the UK programme on photovoltaics and has run in parallel with the Domestic Field Trial. These field trials aim to record experience and use the lessons learnt to raise awareness of, and confidence in, the technology, and to increase UK capabilities. The projects involved: the visitor centre at the Gaia Energy Centre in Cornwall; a community church hall in London; council offices in West Oxfordshire; a sports science centre at Gloucester University; the visitor centre at Cotswold Water Park; the headquarters of the Insolvency Service; a Welsh Development Agency building; an athletics centre in Birmingham; a research facility at the University of East Anglia; a primary school in Belfast; and Barnstable civic centre in Devon. The report describes the aims of the field trials, monitoring issues, performance, observations and trends, lessons learnt, and the results of occupancy surveys.

  9. Integrated project support environments the ASPECT project

    CERN Document Server

    Brown, Alan W

    1991-01-01

    A major part of software engineering development involves the use of computing tools that facilitate the management, maintenance, security, and building of large-scale software engineering projects. Consequently, there has been a proliferation of CASE tools and IPSEs. This book looks at IPSEs in general and the ASPECT project in particular, providing design and implementation details, as well as locating ASPECT within IPSE developments. It offers a survey of integrated project support environments for more efficient software engineering, a description of a large-scale IPSE (ASPECT), and an evaluation of formal methods in

  10. In situ vitrification large-scale operational acceptance test analysis

    International Nuclear Information System (INIS)

    Buelt, J.L.; Carter, J.G.

    1986-05-01

    A thermal treatment process is currently under study to provide possible enhancement of in-place stabilization of transuranic and chemically contaminated soil sites. The process is known as in situ vitrification (ISV). In situ vitrification is a remedial action process that destroys solid and liquid organic contaminants and incorporates radionuclides into a glass-like material that renders contaminants substantially less mobile and less likely to impact the environment. A large-scale operational acceptance test (LSOAT) was recently completed in which more than 180 t of vitrified soil were produced in each of three adjacent settings. The LSOAT demonstrated that the process conforms to the functional design criteria necessary for the large-scale radioactive test (LSRT) to be conducted following verification of the performance capabilities of the process. The energy requirements and vitrified block size, shape, and mass are sufficiently equivalent to those predicted by the ISV mathematical model to confirm its usefulness as a predictive tool. The LSOAT demonstrated an electrode replacement technique, which can be used if an electrode fails, and techniques have been identified to minimize air oxidation, thereby extending electrode life. A statistical analysis was employed during the LSOAT to identify graphite collars and an insulative surface as successful cold cap subsidence techniques. The LSOAT also showed that even under worst-case conditions, the off-gas system exceeds the flow requirements necessary to maintain a negative pressure on the hood covering the area being vitrified. The retention of simulated radionuclides and chemicals in the soil and off-gas system exceeds requirements so that projected emissions are one to two orders of magnitude below the maximum permissible concentrations of contaminants at the stack

  11. A successful waste stream analysis on a large construction project in a radiologically controlled area

    International Nuclear Information System (INIS)

    Kennicott, M.; Richardson, D.; Starke, T.P.

    1997-01-01

    The Los Alamos National Laboratory (the Laboratory) Chemistry and Metallurgy Research (CMR) Facility, constructed in 1952, is currently undergoing a major, multi-year demolition and construction project. Many of the operations required under this project (i.e., design, demolition, decontamination, construction, and waste management) mimic the processes required of a large-scale decontamination and decommissioning (D&D) job and are identical to the requirements of several upgrade projects anticipated for the Laboratory and other Department of Energy (DOE) sites. For these reasons the CMR Upgrades Project is seen as an ideal model facility in which to test the application, and measure the success, of waste minimization techniques that could be implemented for any similar project. This paper discusses the successful completion of a waste stream analysis. The analysis measured the potential impact on waste generation, in terms of volume and cost, of a reconfiguration option being considered to change the approach and execution of the original project

  12. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Full Text Available Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical applications, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the abundant training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result, two approaches for different types of large-scale datasets are made available on Spark.
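The Discrete AdaBoost scheme that DAB-RVM builds on can be sketched independently of the weak-learner type. The skeleton below is a generic Discrete AdaBoost loop, not the paper's Spark implementation; `train_weak` is a placeholder that, in the authors' setting, would return a trained RVM classifier:

```python
import math

def adaboost(X, y, train_weak, rounds):
    """Discrete AdaBoost skeleton; the weak learner is pluggable (e.g. an RVM).

    X: list of samples; y: labels in {-1, +1};
    train_weak(X, y, w) -> callable mapping a sample to -1 or +1.
    """
    n = len(y)
    w = [1.0 / n] * n                    # uniform initial sample weights
    ensemble = []
    for _ in range(rounds):
        h = train_weak(X, y, w)
        err = sum(wi for wi, xi, yi in zip(w, X, y) if h(xi) != yi)
        err = min(max(err, 1e-10), 1 - 1e-10)   # guard against log(0)
        alpha = 0.5 * math.log((1 - err) / err)
        ensemble.append((alpha, h))
        # Re-weight: mistakes gain weight, correct samples lose it
        w = [wi * math.exp(-alpha * yi * h(xi)) for wi, xi, yi in zip(w, X, y)]
        s = sum(w)
        w = [wi / s for wi in w]
    def predict(x):
        return 1 if sum(a * h(x) for a, h in ensemble) >= 0 else -1
    return predict
```

AFB-RVM, as described, changes how the weak classifiers are obtained rather than this outer loop.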

  13. Large-scale Health Information Database and Privacy Protection*1

    Science.gov (United States)

    YAMAMOTO, Ryuichi

    2016-01-01

    Japan was once progressive in the digitalization of healthcare fields but unfortunately has fallen behind in terms of the secondary use of data for public interest. There has recently been a trend to establish large-scale health databases in the nation, and a conflict between data use for public interest and privacy protection has surfaced as this trend has progressed. Databases for health insurance claims or for specific health checkups and guidance services were created according to the law that aims to ensure healthcare for the elderly; however, there is no mention in the act about using these databases for public interest in general. Thus, an initiative for such use must proceed carefully and attentively. The PMDA*2 projects that collect a large amount of medical record information from large hospitals and the health database development project that the Ministry of Health, Labour and Welfare (MHLW) is working on will soon begin to operate according to a general consensus; however, the validity of this consensus can be questioned if issues of anonymity arise. The likelihood that researchers conducting a study for public interest would intentionally invade the privacy of their subjects is slim. However, patients could develop a sense of distrust about their data being used since legal requirements are ambiguous. Nevertheless, without using patients’ medical records for public interest, progress in medicine will grind to a halt. Proper legislation that is clear for both researchers and patients will therefore be highly desirable. A revision of the Act on the Protection of Personal Information is currently in progress. In reality, however, privacy is not something that laws alone can protect; it will also require guidelines and self-discipline. We now live in an information capitalization age. I will introduce the trends in legal reform regarding healthcare information and discuss some basics to help people properly face the issue of health big data and privacy

  14. Global Wildfire Forecasts Using Large Scale Climate Indices

    Science.gov (United States)

    Shen, Huizhong; Tao, Shu

    2016-04-01

    Using weather readings, fire early warning systems can provide forecasts 4-6 hours in advance to minimize fire losses. The benefit would be dramatically enhanced if a relatively accurate long-term projection could also be provided. Here we present a novel method for predicting global fire season severity (FSS) at least three months in advance using multiple large-scale climate indices (CIs). The predictive ability proves effective across geographic locations and resolutions. Globally, as well as in most continents, the El Niño Southern Oscillation (ENSO) is the dominant driving force controlling interannual FSS variability, whereas other CIs also play indispensable roles. We found that a moderate El Niño event is responsible for the release of 465 (272-658, interquartile range) Tg of carbon and an annual increase of 29,500 (24,500-34,800) deaths from inhalation exposure to air pollutants, with Southeast Asia accounting for half of the deaths. Both the intercorrelation and the interaction of WPs and CIs are revealed, suggesting possible climate-induced modification of fire responses to weather conditions. Our models can benefit fire management in response to climate change.

  15. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to give rise to high variance and low accuracy due to overfitting. We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework and introduce dependency between covariance parameters. We demonstrate the advantages of our approach over traditional approaches using simulations and OMICS data analysis.
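The stabilizing effect of such priors is closely related to shrinkage: noisy off-diagonal entries of the sample covariance are pulled toward a structured target. The sketch below is a minimal pure-Python illustration of diagonal-target shrinkage, not the paper's hierarchical model:

```python
def sample_cov(data):
    """Unbiased sample covariance of row-major data (list of observations)."""
    n, p = len(data), len(data[0])
    means = [sum(row[j] for row in data) / n for j in range(p)]
    return [[sum((row[i] - means[i]) * (row[j] - means[j]) for row in data) / (n - 1)
             for j in range(p)] for i in range(p)]

def shrink_to_diagonal(cov, alpha):
    """Blend the sample covariance with its own diagonal.

    alpha in [0, 1]: 0 returns cov unchanged, 1 keeps only the diagonal.
    """
    p = len(cov)
    return [[cov[i][j] if i == j else (1 - alpha) * cov[i][j]
             for j in range(p)] for i in range(p)]
```

With few observations relative to dimensions, even this crude regularization trades a little bias for a large variance reduction; the hierarchical prior manages the same trade-off in a principled, data-driven way.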

  16. Large-scale fuel cycle centres

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The US Nuclear Regulatory Commission (NRC) considered the nuclear energy centre concept for fuel cycle plants in the Nuclear Energy Centre Site Survey 1975 (NECSS-75), Rep. No. NUREG-0001, an important study mandated by the US Congress in the Energy Reorganization Act of 1974, which created the NRC. For this study, the NRC defined fuel cycle centres as consisting of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle centre sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000 MW(e). The types of fuel cycle facilities located at a fuel cycle centre permit an assessment of the role of fuel cycle centres in enhancing the safeguarding of strategic special nuclear materials: plutonium and mixed oxides. Siting fuel cycle centres presents a smaller problem than siting reactors. A single reprocessing plant of the scale projected for use in the USA (1500-2000 t/a) can reprocess fuel from reactors producing 50,000-65,000 MW(e). Only two or three fuel cycle centres of the upper-limit size considered in the NECSS-75 would be required in the USA by the year 2000. The NECSS-75 fuel cycle centre evaluation showed that large-scale fuel cycle centres present no real technical siting difficulties from a radiological effluent and safety standpoint. Some construction economies may be achievable with fuel cycle centres, which offer opportunities to improve waste-management systems. Combined centres consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centres can eliminate shipment not only of Pu but also of mixed-oxide fuel. Increased fuel cycle costs result from implementation of combined centres unless the fuel reprocessing plants are of commercial size. Development of Pu-burning reactors could reduce any economic penalties of combined centres. The need for effective fissile

  17. Ad interim legal remedy in case of large projects

    International Nuclear Information System (INIS)

    Limberger, J.

    1985-01-01

    Actions for ad interim judicial relief against large projects approved by the authorities very frequently take several years before a decision is reached by the court. Experience has shown that this applies in particular to large projects such as the construction of nuclear power plants, large industrial plants, or airports. It has become regular practice for the authorities concerned to issue an order for immediate execution upon the plan-approval decision and the granting of a licence for such projects, in accordance with section 80, sub-section (2), no. 4 of the VwGO. The problems thus created with regard to the interim legal remedies sought by the parties concerned are of great significance. The book at hand discusses whether this practice of the administrative authorities and the courts satisfies the requirement, embodied in the law, to provide efficient legal protection. (orig./HSCH) [de

  18. Automatic tools for enhancing the collaborative experience in large projects

    International Nuclear Information System (INIS)

    Bourilkov, D; Rodriquez, J L

    2014-01-01

    With the explosion of big data in many fields, the efficient management of knowledge about all aspects of data analysis gains in importance. A key feature of collaboration in large-scale projects is keeping a log of what is being done and how - for private use and reuse, and for sharing selected parts with collaborators and peers, often distributed geographically on an increasingly global scale. Better still, the log can be created automatically on the fly while the scientist or software developer is working in a habitual way, without any extra effort. This saves time and enables a team to do more with the same resources. The CODESH (COllaborative DEvelopment SHell) and CAVES (Collaborative Analysis Versioning Environment System) projects address this problem in a novel way. They build on the concepts of virtual states and transitions to enhance the collaborative experience by providing automatic persistent virtual logbooks. CAVES is designed for sessions of distributed data analysis using the popular ROOT framework, while CODESH generalizes the approach to any type of work on the command line in typical UNIX shells like bash or tcsh. Repositories of sessions can be configured dynamically to record and make available the knowledge accumulated in the course of a scientific or software endeavor. Access can be controlled to define logbooks of private sessions or sessions shared within or between collaborating groups. A typical use case is building working scalable systems for the analysis of petascale volumes of data as encountered in the LHC experiments. Our approach is general enough to find applications in many fields.

  19. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  20. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  1. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly on the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities which lead to the formation of large eddies is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well suited to representation of the turbulent motions, but a number of challenges remain in their practical application. Massively parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  2. A Simple Sampling Method for Estimating the Accuracy of Large Scale Record Linkage Projects.

    Science.gov (United States)

    Boyd, James H; Guiver, Tenniel; Randall, Sean M; Ferrante, Anna M; Semmens, James B; Anderson, Phil; Dickinson, Teresa

    2016-05-17

    Record linkage techniques allow different data collections to be brought together to provide a wider picture of the health status of individuals. Ensuring high linkage quality is important to guarantee the quality and integrity of research. Current methods for measuring linkage quality typically focus on precision (the proportion of accepted links that are correct), given the difficulty of measuring the proportion of false negatives. The aim of this work is to introduce and evaluate a sampling-based method to estimate both precision and recall following record linkage. In the sampling-based method, record pairs from each threshold (including those below the identified cut-off for acceptance) are sampled and clerically reviewed. These results are then applied to the entire set of record pairs, providing estimates of false positives and false negatives. This method was evaluated on a synthetically generated dataset, where the true match status (which records belonged to the same person) was known. The sampled estimates of linkage quality were relatively close to the actual linkage quality metrics calculated for the whole synthetic dataset. The precision and recall measures for seven reviewers were very consistent, with little variation in the clerical assessment results (overall agreement using the Fleiss kappa statistic was 0.601). This method presents a possible means of accurately estimating matching quality and refining linkages in population-level linkage studies. The sampling approach is especially important for large project linkages where the number of record pairs produced may be very large, often running into millions.
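The estimator described can be sketched directly: sample record pairs from each threshold bin (including bins below the cut-off), review them clerically, and scale the observed match rates up to the full bin counts. A minimal sketch, with the bin structure and the `review` callable as assumed interfaces:

```python
import random

def estimate_linkage_quality(bins, cutoff, sample_size, review, seed=0):
    """Estimate precision and recall by clerically reviewing a sample per bin.

    bins:        dict mapping a similarity threshold to its list of record pairs
    cutoff:      pairs in bins at or above this threshold are accepted as links
    sample_size: number of pairs to review per bin
    review:      callable(pair) -> True if the pair is a true match
    """
    rng = random.Random(seed)
    tp = fp = fn = 0.0
    for threshold, pairs in bins.items():
        sample = rng.sample(pairs, min(sample_size, len(pairs)))
        match_rate = sum(review(p) for p in sample) / len(sample)
        est_matches = match_rate * len(pairs)   # scale the sample up to the bin
        if threshold >= cutoff:
            tp += est_matches                   # estimated correct links accepted
            fp += len(pairs) - est_matches      # estimated false links accepted
        else:
            fn += est_matches                   # estimated true matches rejected
    return tp / (tp + fp), tp / (tp + fn)
```

Sampling every bin, not just the accepted ones, is what makes recall estimable: the matches lost below the cut-off are counted instead of being invisible.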

  3. FEASIBILITY OF LARGE-SCALE OCEAN CO2 SEQUESTRATION

    Energy Technology Data Exchange (ETDEWEB)

    Dr. Peter Brewer; Dr. James Barry

    2002-09-30

    We have continued to carry out creative small-scale experiments in the deep ocean to investigate the science underlying questions of possible future large-scale deep-ocean CO₂ sequestration as a means of ameliorating greenhouse gas growth rates in the atmosphere. This project is closely linked to additional research funded by the DoE Office of Science, and to support from the Monterey Bay Aquarium Research Institute. The listing of project achievements here over the past year reflects these combined resources. Within the last project year we have: (1) published a significant workshop report (58 pages) entitled "Direct Ocean Sequestration Expert's Workshop", based upon a meeting held at MBARI in 2001; the report is available both in hard copy and on the NETL web site; (2) carried out three major deep-ocean (3600 m) cruises to examine the physical chemistry, and biological consequences, of several-liter quantities of CO₂ released on the ocean floor; (3) carried out two successful short cruises in collaboration with Dr. Izuo Aya and colleagues (NMRI, Osaka, Japan) to examine the fate of cold (-55 °C) CO₂ released at relatively shallow ocean depth; (4) carried out two short cruises in collaboration with Dr. Costas Tsouris, ORNL, to field test an injection nozzle designed to transform liquid CO₂ into a hydrate slurry at ≈1000 m depth; (5) in collaboration with Prof. Jill Pasteris (Washington University), successfully accomplished the first field test of a deep-ocean laser Raman spectrometer for probing in situ the physical chemistry of the CO₂ system; (6) submitted the first major paper on biological impacts as determined from our field studies; (7) submitted a paper on our measurements of the fate of a rising stream of liquid CO₂ droplets to Environmental Science & Technology; (8) have had accepted for publication in Eos the first brief account of the laser Raman spectrometer success; (9) have had two

  4. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  5. Carbon offsets as an economic alternative to large-scale logging: a case study in Guyana

    Energy Technology Data Exchange (ETDEWEB)

    Osborne, T. [Energy and Resources Group, University of California Berkeley, 310 Barrows Hall, Berkeley CA 94720 (United States); Kiker, C. [Food and Resource Economics Department, University of Florida, PO Box 110240, Gainesville, FL 32611 (United States)

    2005-03-01

    The objective of this study is to analyze the economic viability of carbon-offset projects that avoid logging in Guyana's forests. The results of this case study illustrate the cost-effectiveness of alternative land-use options that reduce deforestation and the associated greenhouse gas (GHG) emissions. The analysis demonstrates that using Guyana's rainforests for climate change mitigation can generate revenue equivalent to that of conventional large-scale logging without the detrimental environmental impacts. At a 12% discount rate, the break-even price for carbon is estimated to be about US$ 0.20/tC. This estimate falls toward the low end of carbon prices for existing carbon-offset projects that avoid deforestation.
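A break-even figure of this kind can be sketched as a ratio of net present values: the carbon price at which discounted offset revenue exactly matches discounted logging profit. The revenue and tonnage streams below are illustrative placeholders, not Guyana data:

```python
def npv(stream, rate):
    """Net present value of a stream of yearly amounts, first entry one year out."""
    return sum(x / (1 + rate) ** t for t, x in enumerate(stream, start=1))

def break_even_carbon_price(logging_profit_usd, carbon_avoided_tC, rate=0.12):
    """Carbon price (US$/tC) equating discounted carbon revenue and logging profit."""
    return npv(logging_profit_usd, rate) / npv(carbon_avoided_tC, rate)

# Hypothetical flat streams over 25 years: $20/yr logging profit per unit area,
# 100 tC/yr of avoided emissions on the same area.
price = break_even_carbon_price([20.0] * 25, [100.0] * 25)
```

For flat streams over the same horizon, the discount factors cancel and the price is simply the profit-to-tonnage ratio; with uneven streams (logging revenue front-loaded, sequestration gradual), the discount rate materially shifts the break-even price.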

  6. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  7. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Full Text Available Scaled-down models are widely used for experimental investigations of large structures due to limitations in the capacity of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e. geometry, loading and properties) between a model and a large structural element of the kind present in huge existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate it to the large structural element. Three independent scale factors, representing the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. A numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is made, and the results are compared with those obtained from numerical computations on the full-scale structure. The effect of the scaled-down model's size and material on the accuracy of the modeling technique is thoroughly examined.
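Once the three fundamental scale factors are chosen, every derived factor follows by dimensional analysis. A minimal sketch (illustrative, not the paper's actual factors): the model-to-prototype ratios for mass, length and time fix the ratios for velocity, force, stress and density:

```python
def derived_scales(mass, length, time):
    """Derived similitude scale factors from the three fundamental ones.

    Each argument is the model-to-prototype ratio of that fundamental dimension.
    """
    return {
        "velocity": length / time,              # L T^-1
        "acceleration": length / time ** 2,     # L T^-2
        "force": mass * length / time ** 2,     # M L T^-2
        "stress": mass / (length * time ** 2),  # M L^-1 T^-2 (force per area)
        "density": mass / length ** 3,          # M L^-3
    }
```

For a 1:10 geometrically scaled model of the same material (density and stress ratios of 1), the mass ratio is 10⁻³ and the consistent time ratio is 10⁻¹, giving a force ratio of 10⁻² but an acceleration ratio of 10 — an illustration of the well-known difficulty of simultaneously scaling gravity-driven loads.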

  8. Risk Management and Uncertainty in Large Complex Public Projects

    DEFF Research Database (Denmark)

    Neerup Themsen, Tim; Harty, Chris; Tryggestad, Kjell

    Governmental actors worldwide are promoting risk management as a rational approach to manage uncertainty and improve the ability to deliver large complex projects according to budget, time plans, and pre-set project specifications: But what do we know about the effects of risk management...... on the ability to meet such objectives? Using Callon’s (1998) twin notions of framing and overflowing we examine the implementation of risk management within the Danish public sector and the effects this generated for the management of two large complex projects. We show how the rational framing of risk...... management has generated unexpected costly outcomes such as: the undermining of the longer-term value and societal relevance of the built asset, the negligence of the wider range of uncertainties emerging during project processes, and constraining forms of knowledge. We also show how expert accountants play...

  9. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit electrochemical performance superior to graphite. Highlights: • Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. • The preparation is simple, effective and eco-friendly. • The in situ yielded MgO nanocrystals promote the graphitization. • The HGCNSs exhibit electrochemical performance superior to graphite.

  10. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, it is noted that large-scale impact events continue to affect the local geology of the planets. 92 references

  11. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  12. Large-scale dynamic compaction demonstration using WIPP salt: Fielding and preliminary results

    International Nuclear Information System (INIS)

    Ahrens, E.H.; Hansen, F.D.

    1995-10-01

    Reconsolidation of crushed rock salt is a phenomenon of great interest to programs studying isolation of hazardous materials in natural salt geologic settings. Of particular interest is the potential for disaggregated salt to be restored to nearly an impermeable state. For example, reconsolidated crushed salt is proposed as a major shaft seal component for the Waste Isolation Pilot Plant (WIPP) Project. The concept for a permanent shaft seal component of the WIPP repository is to densely compact crushed salt in the four shafts; an effective seal will then be developed as the surrounding salt creeps into the shafts, further consolidating the crushed salt. Fundamental information on placement density and permeability is required to ensure attainment of the design function. The work reported here is the first large-scale compaction demonstration to provide information on initial salt properties applicable to design, construction, and performance expectations. The shaft seals must function for 10,000 years. Over this period a crushed salt mass will become less permeable as it is compressed by creep closure of salt surrounding the shaft. These facts preclude the possibility of conducting a full-scale, real-time field test. Because permanent seals taking advantage of salt reconsolidation have never been constructed, performance measurements have not been made on an appropriately large scale. An understanding of potential construction methods, achievable initial density and permeability, and performance of reconsolidated salt over time is required for seal design and performance assessment. This report discusses fielding and operations of a nearly full-scale dynamic compaction of mine-run WIPP salt, and presents preliminary density and in situ (in place) gas permeability results

  13. Possible implications of large scale radiation processing of food

    International Nuclear Information System (INIS)

    Zagorski, Z.P.

    1990-01-01

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular. (author)

  14. Possible implications of large scale radiation processing of food

    Science.gov (United States)

    Zagórski, Z. P.

    Large scale irradiation has been discussed in terms of the participation of processing cost in the final value of the improved product. Another factor has been taken into account and that is the saturation of the market with the new product. In the case of successful projects the participation of irradiation cost is low, and the demand for the better product is covered. A limited availability of sources makes the modest saturation of the market difficult with all food subjected to correct radiation treatment. The implementation of the preservation of food needs a decided selection of these kinds of food which comply to all conditions i.e. of acceptance by regulatory bodies, real improvement of quality and economy. The last condition prefers the possibility of use of electron beams of low energy. The best fulfilment of conditions for successful processing is observed in the group of dry food, in expensive spices in particular.

  15. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, treatment and procedure algorithms have proven successful. For evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  16. INTEGRATED DESIGN AND ENGINEERING USING BUILDING INFORMATION MODELLING: A PILOT PROJECT OF SMALL-SCALE HOUSING DEVELOPMENT IN THE NETHERLANDS

    Directory of Open Access Journals (Sweden)

    Rizal Sebastian

    2010-11-01

    Full Text Available During the design phase, decisions are made that affect, on average, 70% of the life-cycle cost of a building. Therefore, collaborative design relying on multidisciplinary knowledge of the building life cycle is essential. Building information modelling (BIM makes it possible to integrate knowledge from various project participants that traditionally work in different phases of the building process. BIM has been applied in a number of large-scale projects in the industrial real estate and infrastructure sectors in different countries, including The Netherlands. The projects in the housing sector, however, are predominantly small scale and carried out by small and medium enterprises (SMEs. These SMEs are looking for practical and affordable BIM solutions for housing projects. This article reports a pilot project of small-scale housing development using BIM in the province of Zeeland, The Netherlands. The conceptual knowledge derived from European and national research projects is disseminated to the SMEs through a series of experimental working sessions. Action learning protocols within a pilot project are developed to ensure direct impacts in terms of cost reduction and quality improvement. The project shows that BIM can be applied without radical changes to the SMEs' information and communication technology systems or to their business organizations. DOI: 10.3763/aedm.2010.0116 Source: Architectural Engineering and Design Management, Volume 6, Number 2, 2010 , pp. 103-110(8

  17. Large-scale Exploration of Neuronal Morphologies Using Deep Learning and Augmented Reality.

    Science.gov (United States)

    Li, Zhongyu; Butler, Erik; Li, Kang; Lu, Aidong; Ji, Shuiwang; Zhang, Shaoting

    2018-02-12

    Recently released large-scale neuron morphological data has greatly facilitated research in neuroinformatics. However, the sheer volume and complexity of these data pose significant challenges for efficient and accurate neuron exploration. In this paper, we propose an effective retrieval framework to address these problems, based on frontier techniques of deep learning and binary coding. For the first time, we develop a deep learning based feature representation method for neuron morphological data, where the 3D neurons are first projected into binary images and features are then learned using an unsupervised deep neural network, i.e., stacked convolutional autoencoders (SCAEs). The deep features are subsequently fused with hand-crafted features for more accurate representation. Considering that exhaustive search is usually very time-consuming in large-scale databases, we employ a novel binary coding method to compress feature vectors into short binary codes. Our framework is validated on a public data set including 58,000 neurons, showing promising retrieval precision and efficiency compared with state-of-the-art methods. In addition, we develop a novel neuron visualization program based on augmented reality (AR) techniques, which helps users explore neuron morphologies in depth in an interactive and immersive manner.
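
The retrieval step that binary coding enables can be sketched briefly: once features are compressed to short binary codes, nearest-neighbour search reduces to an XOR followed by a popcount (Hamming distance). The 8-bit codes and neuron ids below are invented for illustration; the paper's codes are learned over a 58,000-neuron database.

```python
# Hamming-distance retrieval over binary codes packed into integers.

def hamming(a, b):
    """Hamming distance between two binary codes stored as Python ints."""
    return bin(a ^ b).count("1")

def retrieve(query_code, database, k=3):
    """Return the k database ids whose codes are closest to the query."""
    ranked = sorted(database.items(), key=lambda kv: hamming(query_code, kv[1]))
    return [item_id for item_id, _ in ranked[:k]]

db = {                      # neuron id -> 8-bit binary code (hypothetical)
    "n001": 0b10110010,
    "n002": 0b10110011,     # differs from the query in 1 bit
    "n003": 0b01001101,
    "n004": 0b10100010,     # differs from the query in 1 bit
}
print(retrieve(0b10110010, db, k=2))   # ['n001', 'n002']
```

Real systems compare millions of codes per second this way because XOR and popcount are single machine instructions on packed words, which is precisely why short binary codes make exhaustive search over large databases affordable.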

  18. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large scale structures is considered within a model with string on toroidal space-time. Firstly, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large scale distribution and with the theory of a Cantorian space-time.

  19. Re-thinking china's densified biomass fuel policies: Large or small scale?

    International Nuclear Information System (INIS)

    Shan, Ming; Li, Dingkai; Jiang, Yi; Yang, Xudong

    2016-01-01

    Current policies and strategies related to the utilization of densified biomass fuel (DBF) in China are mainly focused on medium- or large-scale manufacturing modes, which cannot provide feasible solutions to solve the household energy problems in China's rural areas. To simplify commercial processes related to the collection of DBF feedstock and the production and utilization of fuel, a novel village-scale DBF approach is proposed. Pilot demonstration projects have shown the feasibility and flexibility of this new approach in realizing sustainable development in rural China. Effective utilization of DBF in rural China will lead to gains for global, regional, and local energy savings, environmental protection, sustainable development, and related social benefits. It could also benefit other developing countries for better utilization of biomass as a viable household energy source. This proposal therefore delivers the possibility of reciprocal gains, and as such deserves the attention of policy makers and various stakeholders. - Highlights: •A field survey of Chinese densified biomass fuel (DBF) development is conducted. •The current situation and problems related to China's DBF industry are analyzed. •A novel and viable village-scale DBF utilization mode is proposed. •Further actions are suggested to boost the utilization of DBF in rural China.

  20. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165 Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria. e-mail: schroll@solobskh.ac.at.

  1. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  2. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1,125,000 elements in 2D and 128,000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  3. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four-telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic-variance-limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  4. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used, showing an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on the remaining 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale performed on par with other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
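
As described, the PASS score is simply a count of abnormal findings on the three NIHSS-derived items, with a cut point of ≥2 flagging suspected ELVO. A minimal sketch (the function and parameter names are mine; the items and cut point follow the abstract):

```python
# Prehospital Acute Stroke Severity (PASS) score as described in the abstract:
# three items, each scored abnormal (1) or normal (0), total 0-3.

def pass_score(consciousness_abnormal, gaze_palsy, arm_weakness):
    """Count of abnormal items: level of consciousness (month/age),
    gaze palsy/deviation, and arm weakness."""
    return sum([bool(consciousness_abnormal), bool(gaze_palsy), bool(arm_weakness)])

def suspect_elvo(score, cutoff=2):
    """Apply the study's optimal cut point: >= 2 abnormal items."""
    return score >= cutoff

score = pass_score(consciousness_abnormal=True, gaze_palsy=True, arm_weakness=False)
print(score, suspect_elvo(score))   # 2 True
```

The three-item count is what makes the scale usable in the field: no weighting, no lookup table, just whether two or more of the items are abnormal.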

  5. Assessment of the technology required to develop photovoltaic power system for large scale national energy applications

    Science.gov (United States)

    Lutwack, R.

    1974-01-01

    A technical assessment of a program to develop photovoltaic power system technology for large-scale national energy applications was made by analyzing and judging the alternative candidate photovoltaic systems and development tasks. A program plan was constructed based on achieving the 10 year objective of a program to establish the practicability of large-scale terrestrial power installations using photovoltaic conversion arrays costing less than $0.50/peak W. Guidelines for the tasks of a 5 year program were derived from a set of 5 year objectives deduced from the 10 year objective. This report indicates the need for an early emphasis on the development of the single-crystal Si photovoltaic system for commercial utilization; a production goal of 5 × 10^8 peak W/year of $0.50 cells was projected for the year 1985. The developments of other photovoltaic conversion systems were assigned to longer range development roles. The status of the technology developments and the applicability of solar arrays in particular power installations, ranging from houses to central power plants, was scheduled to be verified in a series of demonstration projects. The budget recommended for the first 5 year phase of the program is $268.5M.

  6. Behavioral responses of birds of prey to large scale energy development in southcentral Washington

    International Nuclear Information System (INIS)

    Fitzner, R.E.

    1985-02-01

    The types of raptorial and semi-raptorial birds that use the Hanford environs are discussed along with the impacts of past operations and the recent WPPSS project on their populations. These findings add to our understanding of the population dynamics of the birds of prey community at the Hanford Site and the expected impacts of the WPPSS energy facilities. The results may have implications toward other large scale energy facilities, and may aid us in management of bird of prey communities throughout the grasslands of the western United States. 110 refs., 5 figs., 4 tabs

  7. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Full Text Available Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  8. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue in large-scale flood simulation for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
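
As a toy illustration of the Godunov-type finite-volume update the abstract describes, here is a 1D shallow-water step with a Rusanov (local Lax-Friedrichs) numerical flux. The real model is 2D, unstructured and GPU-parallel; all numbers below (grid size, dam-break state, time step) are invented for the sketch.

```python
# 1D shallow-water equations (h, hu): explicit Godunov-type finite-volume step.

G = 9.81  # gravitational acceleration (m/s^2)

def flux(h, hu):
    """Physical flux of the 1D shallow-water equations for state (h, hu)."""
    u = hu / h
    return hu, hu * u + 0.5 * G * h * h

def rusanov(hl, hul, hr, hur):
    """Rusanov numerical flux at an interface between left and right states."""
    fl, fr = flux(hl, hul), flux(hr, hur)
    s = max(abs(hul / hl) + (G * hl) ** 0.5,
            abs(hur / hr) + (G * hr) ** 0.5)   # max wave speed |u| + sqrt(g h)
    ql, qr = (hl, hul), (hr, hur)
    return tuple(0.5 * (a + b) - 0.5 * s * (y - x)
                 for a, b, x, y in zip(fl, fr, ql, qr))

def step(h, hu, dx, dt):
    """One explicit finite-volume update with outflow (copy) ghost cells."""
    n = len(h)
    H, HU = [h[0]] + h + [h[-1]], [hu[0]] + hu + [hu[-1]]
    F = [rusanov(H[i], HU[i], H[i + 1], HU[i + 1]) for i in range(n + 1)]
    h_new = [h[i] - dt / dx * (F[i + 1][0] - F[i][0]) for i in range(n)]
    hu_new = [hu[i] - dt / dx * (F[i + 1][1] - F[i][1]) for i in range(n)]
    return h_new, hu_new

# Dam break: 2 m of water on the left half, 1 m on the right, initially at rest.
n = 20
h, hu = [2.0] * (n // 2) + [1.0] * (n // 2), [0.0] * n
h, hu = step(h, hu, dx=1.0, dt=0.05)
print(round(sum(h), 6))   # water volume is conserved: 30.0
```

Because every cell is updated independently from its two interface fluxes, this per-cell loop is exactly the pattern that maps one-thread-per-cell onto a GPU in the paper's approach.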

  9. Rock sealing - large scale field test and accessory investigations

    International Nuclear Information System (INIS)

    Pusch, R.

    1988-03-01

    The experience from the pilot field test and the basic knowledge extracted from the lab experiments have formed the basis of the planning of a Large Scale Field Test. The intention is to find out how the 'instrument of rock sealing' can be applied to a number of practical cases, where cutting-off and redirection of groundwater flow in repositories are called for. Five field subtests, which are integrated mutually or with other Stripa projects (3D), are proposed. One of them concerns 'near-field' sealing, i.e. sealing of tunnel floors hosting deposition holes, while two involve sealing of 'disturbed' rock around tunnels. The fourth concerns sealing of a natural fracture zone in the 3D area, and this latter test has the expected spin-off effect of obtaining additional information on the general flow pattern around the northeastern wing of the 3D cross. The fifth test is an option of sealing structures in the Validation Drift. The longevity of major grout types is focussed on as the most important part of the 'Accessory Investigations', and detailed plans have been worked out for that purpose. It is foreseen that the continuation of the project, as outlined in this report, will yield suitable methods and grouts for effective and long-lasting sealing of rock for use at strategic points in repositories. (author)

  10. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    Recent explosion of biological data brings a great challenge for the traditional clustering algorithms. With increasing scale of data sets, much larger memory and longer runtime are required for the cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. The memory-shared architecture is used to construct the similarity matrix, and the distributed system is taken for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
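
The row-partitioned similarity-matrix construction that the abstract parallelises can be sketched with the standard library alone: each task computes one row of s(i, k) = -||x_i - x_k||^2, the negative squared Euclidean distance commonly used as the similarity in affinity propagation. The tiny point set is made up; a real run would partition row blocks of numpy arrays across many more workers.

```python
# Parallel construction of an affinity-propagation-style similarity matrix,
# one row per task, using only the standard library.

from concurrent.futures import ProcessPoolExecutor

def similarity_row(args):
    """Row i of the matrix: negative squared Euclidean distance to every point."""
    i, points = args
    xi = points[i]
    return [-sum((a - b) ** 2 for a, b in zip(xi, xk)) for xk in points]

def similarity_matrix(points, workers=2):
    """Build the full matrix by mapping one row per task across processes."""
    with ProcessPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(similarity_row, ((i, points) for i in range(len(points)))))

if __name__ == "__main__":
    pts = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0)]
    S = similarity_matrix(pts)
    print(S[1][0])   # -1.0: points 0 and 1 are distance 1 apart
```

Row-wise partitioning needs no inter-task communication, which is why the construction step parallelises so well; the message-passing iterations of affinity propagation itself are the harder part, handled in the paper on a distributed system.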

  11. Proceedings of the Joint IAEA/CSNI Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing

    International Nuclear Information System (INIS)

    1993-10-01

    This report provides the proceedings of a Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing that was held in Oak Ridge, Tennessee, on October 23-25, 1992. The meeting was jointly sponsored by the International Atomic Energy Agency (IAEA) and the Nuclear Energy Agency (NEA) of the Organization for Economic Cooperation and Development. In particular, the International Working Group (IWG) on Life Management of Nuclear Power Plants (LMNPP) was the IAEA sponsor, and the Principal Working Group 3 (PWG-3) (Primary System Component Integrity) of the Committee for the Safety of Nuclear Installations (CSNI) was the NEA's sponsor. This meeting was preceded by two prior international activities that were designed to examine the state-of-the-art in fracture analysis capabilities and emphasized applications to the safety evaluation of nuclear power facilities. The first of those two activities was an IAEA Specialists' Meeting on Fracture Mechanics Verification by Large-Scale Testing that was held at the Staatliche Materialprufungsanstalt (MPA) in Stuttgart, Germany, on May 25-27, 1988; the proceedings of that meeting were published in 1991. The second activity was the CSNI/PWG-3's Fracture Assessment Group's Project FALSIRE (Fracture Analyses of Large-Scale International Reference Experiments). The proceedings of the FALSIRE workshop that was held in Boston, Massachusetts, U.S.A., on May 8-10, 1990, was recently published by the Oak Ridge National Laboratory (ORNL). Those previous activities identified capabilities and shortcomings of various fracture analysis methods based on analyses of six available large-scale experiments. Different modes of fracture behavior, which ranged from brittle to ductile, were considered, as were geometry, size, constraint and multiaxial effects. While generally good predictive capabilities were demonstrated for brittle fracture, issues were identified relative to predicting fracture behavior at higher

  12. THE LARGE-SCALE COSMIC-RAY ANISOTROPY AS OBSERVED WITH MILAGRO

    International Nuclear Information System (INIS)

    Abdo, A. A.; Allen, B. T.; Chen, C.; Aune, T.; Berley, D.; Goodman, J. A.; Hopper, B.; Lansdell, C. P.; Casanova, S.; Dingus, B. L.; Hoffman, C. M.; Huentemeyer, P. H.; Ellsworth, R. W.; Fleysher, L.; Fleysher, R.; Kolterman, B. E.; Mincer, A. I.; Gonzalez, M. M.; Linnemann, J. T.; McEnery, J. E.

    2009-01-01

Results are presented of a harmonic analysis of the large-scale cosmic-ray (CR) anisotropy as observed by the Milagro observatory. We show a two-dimensional display of the sidereal anisotropy projections in right ascension (R.A.) generated by the fitting of three harmonics to 18 separate declination bands. The Milagro observatory is a water Cherenkov detector located in the Jemez mountains near Los Alamos, New Mexico. With a high duty cycle and large field of view, Milagro is an excellent instrument for measuring this anisotropy with high sensitivity at TeV energies. The analysis is conducted using a seven-year data sample consisting of more than 95 billion events, the largest such data set in existence. We observe an anisotropy with a magnitude around 0.1% for CRs with a median energy of 6 TeV. The dominant feature is a deficit region of depth (2.49 ± 0.02 stat. ± 0.09 sys.) × 10⁻³ in the direction of the Galactic north pole centered at 189 deg R.A. We observe a steady increase in the magnitude of the signal over seven years.

  13. Tension Awareness of Stakeholders in Large Technology Projects : A Duality Perspective

    NARCIS (Netherlands)

    Boonstra, Albert; van Offenbeek, Marjolein; Vos, Janita F.J.

    2017-01-01

    This article analyzes the tensions evolving from project management dilemmas and how they relate to stakeholders in large technology projects. The study addresses an organization-wide electronic health record implementation in a large hospital. It adopts a duality lens in exploring whether and how

  14. Large Scale Software Building with CMake in ATLAS

    Science.gov (United States)

    Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration

    2017-10-01

    The offline software of the ATLAS experiment at the Large Hadron Collider (LHC) serves as the platform for detector data reconstruction, simulation and analysis. It is also used in the detector’s trigger system to select LHC collision events during data taking. The ATLAS offline software consists of several million lines of C++ and Python code organized in a modular design of more than 2000 specialized packages. Because of different workflows, many stable numbered releases are in parallel production use. To accommodate specific workflow requests, software patches with modified libraries are distributed on top of existing software releases on a daily basis. The different ATLAS software applications also require a flexible build system that strongly supports unit and integration tests. Within the last year this build system was migrated to CMake. A CMake configuration has been developed that allows one to easily set up and build the above mentioned software packages. This also makes it possible to develop and test new and modified packages on top of existing releases. The system also allows one to detect and execute partial rebuilds of the release based on single package changes. The build system makes use of CPack for building RPM packages out of the software releases, and CTest for running unit and integration tests. We report on the migration and integration of the ATLAS software to CMake and show working examples of this large scale project in production.
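One capability the abstract highlights is detecting and executing partial rebuilds of a release from single package changes. The abstract does not give the algorithm, but the underlying idea, rebuilding a changed package plus everything that transitively depends on it, can be sketched in Python (the package names and the mapping format are illustrative assumptions, not ATLAS code):

```python
from collections import deque

def packages_to_rebuild(dependencies, changed):
    """Given a mapping package -> list of packages it depends on,
    return the set of packages needing a rebuild when the packages in
    `changed` are modified: the changed packages themselves plus every
    package that transitively depends on one of them."""
    # Invert the edges: for each package, record who depends on it.
    dependents = {}
    for pkg, deps in dependencies.items():
        for dep in deps:
            dependents.setdefault(dep, []).append(pkg)
    rebuild = set(changed)
    queue = deque(changed)
    while queue:
        pkg = queue.popleft()
        for parent in dependents.get(pkg, []):
            if parent not in rebuild:
                rebuild.add(parent)
                queue.append(parent)
    return rebuild

# Hypothetical toy graph: Reco depends on Core, Analysis on Reco, Sim on Core.
deps = {"Core": [], "Reco": ["Core"], "Analysis": ["Reco"], "Sim": ["Core"]}
print(sorted(packages_to_rebuild(deps, {"Reco"})))  # ['Analysis', 'Reco']
```

Changing a leaf package such as `Reco` triggers only its dependents, while changing `Core` would force a rebuild of all four packages.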

  15. ORNL Pre-test Analyses of A Large-scale Experiment in STYLE

    International Nuclear Information System (INIS)

    Williams, Paul T.; Yin, Shengjun; Klasky, Hilda B.; Bass, Bennett Richard

    2011-01-01

Oak Ridge National Laboratory (ORNL) is conducting a series of numerical analyses to simulate a large-scale mock-up experiment planned within the European Network for Structural Integrity for Lifetime Management non-RPV Components (STYLE). STYLE is a European cooperative effort to assess the structural integrity of (non-reactor pressure vessel) reactor coolant pressure boundary components relevant to ageing and lifetime management and to integrate the knowledge created in the project into mainstream nuclear industry assessment codes. ORNL contributes work-in-kind support to STYLE Work Package 2 (Numerical Analysis/Advanced Tools) and Work Package 3 (Engineering Assessment Methods/LBB Analyses). This paper summarizes the current status of ORNL analyses of the STYLE Mock-Up3 large-scale experiment to simulate and evaluate crack growth in a cladded ferritic pipe. The analyses are being performed in two parts. In the first part, advanced fracture mechanics models are being developed and applied to evaluate several experiment designs, taking into account the capabilities of the test facility while satisfying the test objectives. These advanced fracture mechanics models will then be utilized to simulate crack growth in the large-scale mock-up test. For the second part, the recently developed ORNL SIAM-PFM open-source, cross-platform, probabilistic computational tool will be used to generate an alternative assessment for comparison with the advanced fracture mechanics model results. The SIAM-PFM probabilistic analysis of the Mock-Up3 experiment will utilize fracture modules that are installed into a general probabilistic framework. The probabilistic results of the Mock-Up3 experiment obtained from SIAM-PFM will be compared to those generated using the deterministic 3D nonlinear finite-element modeling approach. The objective of the probabilistic analysis is to provide uncertainty bounds that will assist in assessing the more detailed 3D finite

  16. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationships between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
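The core idea, choosing a drawing style per area from its nodal density, can be illustrated with a short sketch. This is not the authors' algorithm; the grid bucketing, the two style names, and the threshold are assumptions made for illustration:

```python
from collections import defaultdict

def pick_styles(positions, cell_size, max_nodes_per_cell):
    """Bucket node positions into square grid cells and choose a drawing
    style per cell: individual node/link primitives where the cell is
    sparse, a single aggregate glyph where it is dense."""
    cells = defaultdict(list)
    for x, y in positions:
        cells[(int(x // cell_size), int(y // cell_size))].append((x, y))
    return {cell: ("detail" if len(pts) <= max_nodes_per_cell else "aggregate")
            for cell, pts in cells.items()}

# A crowded corner and one isolated node.
pts = [(0.1, 0.1), (0.2, 0.3), (0.4, 0.2), (5.5, 5.5)]
styles = pick_styles(pts, cell_size=1.0, max_nodes_per_cell=2)
print(styles[(0, 0)], styles[(5, 5)])  # aggregate detail
```

The dense cell collapses to a summary glyph while the sparse cell keeps its full primitives, which is the trade-off the abstract describes.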

  17. Implementation of highly parallel and large scale GW calculations within the OpenAtom software

    Science.gov (United States)

    Ismail-Beigi, Sohrab

The need to describe electronic excitations with better accuracy than provided by band structures produced by Density Functional Theory (DFT) has been a long-term enterprise for the computational condensed matter and materials theory communities. In some cases, appropriate theoretical frameworks have existed for some time but have been difficult to apply widely due to computational cost. For example, the GW approximation incorporates a great deal of important non-local and dynamical electronic interaction effects but has been too computationally expensive for routine use in large materials simulations. OpenAtom is an open source massively parallel ab initio density functional software package based on plane waves and pseudopotentials (http://charm.cs.uiuc.edu/OpenAtom/) that takes advantage of the Charm++ parallel framework. At present, it is developed via a three-way collaboration, funded by an NSF SI2-SSI grant (ACI-1339804), between Yale (Ismail-Beigi), IBM T. J. Watson (Glenn Martyna) and the University of Illinois at Urbana-Champaign (Laxmikant Kale). We will describe the project and our current approach towards implementing large scale GW calculations with OpenAtom. Potential applications of large scale parallel GW software for problems involving electronic excitations in semiconductor and/or metal oxide systems will also be pointed out.

  18. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2, design improvements aimed at further cost reduction and at establishing the plant concept have been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics that were identified in the previous fiscal year were examined and the plant concept was modified. Furthermore, fundamental specifications of the main systems and components were set and the economics were evaluated. In addition, for the interim evaluation of the candidate concepts of the FBR fuel cycle, cost effectiveness and achievability of the development goal were evaluated and data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed that shows promise of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  19. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop type reactor. Through the design study, it is intended to construct a plant concept that can demonstrate its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed that shows promise of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  20. Assessment of online public opinions on large infrastructure projects: A case study of the Three Gorges Project in China

    International Nuclear Information System (INIS)

    Jiang, Hanchen; Qiang, Maoshan; Lin, Peng

    2016-01-01

Public opinion is becoming increasingly salient in the ex post evaluation stage of large infrastructure projects, which have significant impacts on the environment and society. However, traditional survey methods are inefficient for collecting and assessing public opinion because of its large quantity and diversity. Recently, social media platforms have provided a rich data source for monitoring and assessing public opinion on controversial infrastructure projects. This paper proposes an assessment framework that transforms unstructured online public opinions on large infrastructure projects into sentimental and topical indicators for enhancing practices of ex post evaluation and public participation. The framework uses web crawlers to collect online comments related to a large infrastructure project and employs two natural language processing technologies, sentiment analysis and topic modeling, together with spatio-temporal analysis, to transform these comments into indicators for assessing online public opinion on the project. Based on the framework, we investigate the online public opinion of the Three Gorges Project on China's largest microblogging site, Weibo. Assessment results present spatio-temporal distributions of post intensity and sentiment polarity, reveal major topics with different sentiments and summarize managerial implications for ex post evaluation of the world's largest hydropower project. The proposed assessment framework is expected to be widely applicable as a methodological strategy for assessing public opinion in the ex post evaluation stage of large infrastructure projects. - Highlights: • We developed a framework to assess online public opinion on large infrastructure projects with environmental impacts. • Indicators were built to assess post intensity, sentiment polarity and major topics of the public opinion. • We took the Three Gorges Project (TGP) as an example to demonstrate the effectiveness of the proposed framework.
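The paper's indicators (post intensity and sentiment polarity per time bucket) can be sketched in a few lines. The lexicon, the scoring rule and the sample comments below are toy assumptions; the actual framework uses trained sentiment analysis and topic modeling on Weibo data:

```python
POSITIVE = {"benefit", "clean", "proud", "support"}   # toy lexicon
NEGATIVE = {"flood", "displaced", "pollution", "against"}

def polarity(comment):
    """Crude lexicon polarity in [-1, 1]: (pos - neg) / (pos + neg),
    0.0 when the comment contains no lexicon word."""
    words = comment.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    return 0.0 if pos + neg == 0 else (pos - neg) / (pos + neg)

def opinion_indicators(comments_by_day):
    """Per-day post intensity (comment count) and mean sentiment polarity."""
    return {day: (len(cs), sum(polarity(c) for c in cs) / len(cs))
            for day, cs in comments_by_day.items() if cs}

sample = {"2016-01-01": ["proud of the clean energy benefit",
                         "villages displaced by the flood"]}
print(opinion_indicators(sample))  # {'2016-01-01': (2, 0.0)}
```

One strongly positive and one strongly negative comment average out to a neutral daily polarity, which is exactly the kind of aggregate signal the framework tracks over time and space.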

  1. Assessment of online public opinions on large infrastructure projects: A case study of the Three Gorges Project in China

    Energy Technology Data Exchange (ETDEWEB)

    Jiang, Hanchen, E-mail: jhc13@mails.tsinghua.edu.cn; Qiang, Maoshan, E-mail: qiangms@tsinghua.edu.cn; Lin, Peng, E-mail: celinpe@mail.tsinghua.edu.cn

    2016-11-15

Public opinion is becoming increasingly salient in the ex post evaluation stage of large infrastructure projects, which have significant impacts on the environment and society. However, traditional survey methods are inefficient for collecting and assessing public opinion because of its large quantity and diversity. Recently, social media platforms have provided a rich data source for monitoring and assessing public opinion on controversial infrastructure projects. This paper proposes an assessment framework that transforms unstructured online public opinions on large infrastructure projects into sentimental and topical indicators for enhancing practices of ex post evaluation and public participation. The framework uses web crawlers to collect online comments related to a large infrastructure project and employs two natural language processing technologies, sentiment analysis and topic modeling, together with spatio-temporal analysis, to transform these comments into indicators for assessing online public opinion on the project. Based on the framework, we investigate the online public opinion of the Three Gorges Project on China's largest microblogging site, Weibo. Assessment results present spatio-temporal distributions of post intensity and sentiment polarity, reveal major topics with different sentiments and summarize managerial implications for ex post evaluation of the world's largest hydropower project. The proposed assessment framework is expected to be widely applicable as a methodological strategy for assessing public opinion in the ex post evaluation stage of large infrastructure projects. - Highlights: • We developed a framework to assess online public opinion on large infrastructure projects with environmental impacts. • Indicators were built to assess post intensity, sentiment polarity and major topics of the public opinion. • We took the Three Gorges Project (TGP) as an example to demonstrate the effectiveness of the proposed framework.

  2. Large scale CMB anomalies from thawing cosmic strings

    Energy Technology Data Exchange (ETDEWEB)

Ringeval, Christophe [Centre for Cosmology, Particle Physics and Phenomenology, Institute of Mathematics and Physics, Louvain University, 2 Chemin du Cyclotron, 1348 Louvain-la-Neuve (Belgium); Yamauchi, Daisuke; Yokoyama, Jun'ichi [Research Center for the Early Universe (RESCEU), Graduate School of Science, The University of Tokyo, Tokyo 113-0033 (Japan); Bouchet, François R., E-mail: christophe.ringeval@uclouvain.be, E-mail: yamauchi@resceu.s.u-tokyo.ac.jp, E-mail: yokoyama@resceu.s.u-tokyo.ac.jp, E-mail: bouchet@iap.fr [Institut d'Astrophysique de Paris, UMR 7095-CNRS, Université Pierre et Marie Curie, 98bis boulevard Arago, 75014 Paris (France)

    2016-02-01

Cosmic strings formed during inflation are expected to be either diluted over super-Hubble distances, i.e., invisible today, or to have crossed our past light cone very recently. We discuss the latter situation, in which a few strings imprint their signature in the Cosmic Microwave Background (CMB) anisotropies after recombination. Being almost frozen in the Hubble flow, these strings are quasi-static and evade almost all of the previously derived constraints on their tension while still being able to source large scale anisotropies in the CMB sky. Using a local variance estimator on thousands of numerically simulated Nambu-Goto all-sky maps, we compute the expected signal and show that it can mimic a dipole modulation at large angular scales while being negligible at small angles. Interestingly, such a scenario generically produces one cold spot from the thawing of a cosmic string loop. Mixed with anisotropies of inflationary origin, we find that a few strings of tension GU = O(1) × 10⁻⁶ match the amplitude of the dipole modulation reported in the Planck satellite measurements and could be at the origin of other large scale anomalies.

  3. Large urban projects and social actors : Forces supporting and opposing the production

    NARCIS (Netherlands)

    Cuenya, B.E.

    2006-01-01

This research studies, by means of the analysis of a paradigmatic large urban project in Buenos Aires, the production process of a large urban project furthered by the State and directed at creating a new centrality. The analysis focuses on the forces supporting and opposing the project that were

  4. On the scaling features of high-latitude geomagnetic field fluctuations during a large geomagnetic storm

    Science.gov (United States)

    De Michelis, Paola; Federica Marcucci, Maria; Consolini, Giuseppe

    2015-04-01

Recently we have investigated the spatial distribution of the scaling features of short-time-scale magnetic field fluctuations using measurements from several ground-based geomagnetic observatories distributed in the northern hemisphere. We have found that the scaling features of fluctuations of the horizontal magnetic field component at time scales below 100 minutes are correlated with the geomagnetic activity level and with changes in the currents flowing in the ionosphere. Here, we present a detailed analysis of the dynamical changes of the magnetic field scaling features as a function of the geomagnetic activity level during the well-known large geomagnetic storm that occurred on July 15, 2000 (the Bastille event). The observed dynamical changes are discussed in relation to the changes of the overall ionospheric polar convection and potential structure as reconstructed using SuperDARN data. This work is supported by the Italian National Program for Antarctic Research (PNRA) - Research Project 2013/AC3.08 and by the European Community's Seventh Framework Programme ([FP7/2007-2013]) under Grant no. 313038/STORM and

  5. The Hualien Large-Scale Seismic Test for soil-structure interaction research

    International Nuclear Information System (INIS)

    Tang, H.T.; Stepp, J.C.; Cheng, Y.H.

    1991-01-01

A Large-Scale Seismic Test (LSST) Program at Hualien, Taiwan, has been initiated with the primary objective of obtaining earthquake-induced SSI data at a stiff soil site having soil conditions similar to those of a prototypical nuclear power plant. Preliminary soil boring, geophysical testing and ambient and earthquake-induced ground motion monitoring have been conducted to understand the experiment site conditions. More refined field and laboratory tests will be conducted, such as the state-of-the-art freezing sampling technique and the large penetration test (LPT) method, to characterize the soil constitutive behavior. The test model to be constructed will be similar to the Lotung model. The instrumentation layout will be designed to provide data for studies of SSI, spatial incoherence, soil stability, foundation uplifting, ground motion wave field and structural response. A consortium consisting of EPRI, Taipower, CRIEPI, TEPCO, CEA, EdF and Framatome has been established to carry out the project. It is envisaged that the Hualien SSI array will be ready to record earthquakes by the middle of 1992. The duration of the recording is scheduled for five years. (author)

  6. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10⁶ cores and sustained performance over ∼2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios. (paper)

  7. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of the power system balance and thus cause deviations in the power system frequency in small or islanded power systems, or in tie-line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has to be analysed with improved analytical tools and techniques. This paper proposes techniques for active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  8. Network theory-based analysis of risk interactions in large engineering projects

    International Nuclear Information System (INIS)

    Fang, Chao; Marle, Franck; Zio, Enrico; Bocquet, Jean-Claude

    2012-01-01

This paper presents an approach based on network theory to deal with risk interactions in large engineering projects. Indeed, such projects are exposed to numerous and interdependent risks of various kinds, which makes their management more difficult. In this paper, a topological analysis based on network theory is presented, which aims at identifying key elements in the structure of interrelated risks potentially affecting a large engineering project. This analysis serves as a powerful complement to classical project risk analysis. Its originality lies in the application of network theory indicators to the project risk management field. The construction of the risk network requires the involvement of the project manager and other team members assigned to the risk management process. Its interpretation improves their understanding of risks and their potential interactions. The outcomes of the analysis provide support for decision-making regarding project risk management. An example of application to a real large engineering project is presented. The conclusion is that new insights can be gained about risks, their interactions and the global potential behavior of the project. - Highlights: ► The method addresses the modeling of complexity in project risk analysis. ► Network theory indicators highlight risks beyond those identified by classical criticality analysis. ► This topological analysis improves the project manager's understanding of risks and risk interactions. ► It helps the project manager make decisions that take a risk's position in the network into account. ► An application to a real tramway implementation project in a city is provided.
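The paper does not list its indicators here, but the simplest network-theory measures on a directed risk interaction graph are in- and out-degree: how many risks a given risk can trigger, and by how many it can be triggered. A minimal sketch, with a hypothetical three-risk example (not the paper's tramway case study):

```python
from collections import defaultdict

def degree_centralities(edges):
    """edges: iterable of (cause_risk, affected_risk) pairs.
    Returns {risk: (out_degree, in_degree)}; a high out-degree flags a
    risk that propagates widely, a high in-degree one that accumulates
    the effects of many others."""
    out_d, in_d = defaultdict(int), defaultdict(int)
    risks = set()
    for cause, effect in edges:
        out_d[cause] += 1
        in_d[effect] += 1
        risks.update((cause, effect))
    return {r: (out_d[r], in_d[r]) for r in risks}

# Hypothetical interactions: a schedule delay feeds cost and quality risks,
# and a cost overrun also degrades quality.
edges = [("delay", "cost"), ("delay", "quality"), ("cost", "quality")]
cent = degree_centralities(edges)
print(cent["delay"], cent["quality"])  # (2, 0) (0, 2)
```

Here "delay" is a pure propagator and "quality" a pure accumulator, which is the kind of structural insight the topological analysis adds on top of per-risk criticality scores.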

  9. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
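Giraph itself is a Java system, but the vertex-centric, superstep-based model the book teaches can be illustrated in plain Python. The sketch below is a sequential stand-in, not the Giraph API: synchronous label propagation for connected components, one of the standard examples of the "think like a vertex" paradigm:

```python
def connected_components(graph):
    """Synchronous vertex-centric label propagation: every vertex starts
    with its own id as label and, superstep after superstep, adopts the
    minimum label among itself and its neighbours until a superstep
    produces no change (the equivalent of all vertices voting to halt)."""
    labels = {v: v for v in graph}
    changed = True
    while changed:
        changed = False
        new_labels = {}
        for v, nbrs in graph.items():          # one superstep
            best = min([labels[v]] + [labels[u] for u in nbrs])
            new_labels[v] = best
            changed |= best != labels[v]
        labels = new_labels                    # barrier between supersteps
    return labels

# Two components: {1, 2, 3} and the isolated vertex {4}.
g = {1: [2], 2: [1, 3], 3: [2], 4: []}
print(connected_components(g))  # {1: 1, 2: 1, 3: 1, 4: 4}
```

In Giraph each vertex's update would run in parallel across workers with messages carrying the neighbour labels; the superstep barrier is what makes the two formulations equivalent.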

  10. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  11. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult to achieve real-time display of and interaction with large-scale 3D models in common 3D display software such as MeshLab. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance and then use an out-of-core, view-dependent, multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with over millions of 3D points or triangular meshes on a regular PC with only 4 GB of RAM.
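The heart of a view-dependent LOD scheme is choosing, per chunk of geometry, the coarsest level whose projected error is still below a screen-space budget. The abstract does not give the selection rule, so the following is an illustrative sketch with assumed parameters (error doubling per level, a simple error/distance projection):

```python
def select_lod(distance, base_error, pixel_budget, num_levels):
    """Pick the coarsest LOD level whose projected geometric error still
    fits the screen-space error budget. Level 0 is the full-resolution
    mesh; each coarser level is assumed to roughly double the geometric
    error, and the projected error shrinks with viewing distance."""
    chosen = 0
    for level in range(num_levels):
        projected = base_error * (2 ** level) / distance
        if projected <= pixel_budget:
            chosen = level
        else:
            break
    return chosen

# Far-away geometry tolerates a much coarser level than nearby geometry.
print(select_lod(distance=100.0, base_error=0.01, pixel_budget=0.01, num_levels=8))  # 6
print(select_lod(distance=1.0, base_error=0.01, pixel_budget=0.01, num_levels=8))    # 0
```

Coupled with out-of-core paging, only the chunks whose selected level is resident need to stay in RAM, which is how such systems keep memory use bounded on a 4 GB machine.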

  12. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff therefore provides a foundation for approaching European hydrology with respect to observed patterns on large scales and the ability of models to capture them. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also help detect shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. 
The ability of a multi model ensemble of nine large-scale

  13. Minnesota wood energy scale-up project 1994 establishment cost data

    Energy Technology Data Exchange (ETDEWEB)

    Downing, M. [Oak Ridge National Lab., TN (United States); Pierce, R. [Champion International, Alexandria, MN (United States); Kroll, T. [Minnesota Department of Natural Resources-Forestry, St. Cloud, MN (United States)

    1996-03-18

The Minnesota Wood Energy Scale-up Project began in late 1993, with the first trees planted in the spring of 1994. The purpose of the project is to track and monitor the economic costs of planting, maintaining and monitoring larger-scale commercial plantings. For 15 years, smaller-scale research plantings of hybrid poplar have been used to screen for promising, high-yielding poplar clones. In this project, 1000 acres of hybrid poplar trees were planted on Conservation Reserve Program (CRP) land near Alexandria, Minnesota in 1994. The fourteen landowners involved re-contracted with the CRP for five-year extensions of their existing 10-year contracts. These extended contracts will expire in 2001, when the plantings are 7 years old. The end use for the trees planted in the Minnesota Wood Energy Scale-up Project is undetermined. They will belong to the owner of the land on which they are planted. There are no current contracts in place for the wood these trees are projected to supply. The structure of the wood industry in Minnesota has changed drastically over the past 5 years. Stumpage values for fiber have risen to more than $20 per cord in some areas, raising the possibility that these trees could be used for fiber rather than energy. Several legislative mandates have forced the State of Minnesota to pursue renewable energy, including biomass energy. These mandates, a potential need for an additional 1700 MW of power by 2008 by Northern States Power, and agricultural policies will all affect development of energy markets for wood produced much like agricultural crops. There has been a tremendous amount of local and international interest in the project. Contractual negotiations between area landowners, the CRP, a local Resource Conservation and Development District, the Minnesota Department of Natural Resources and others are currently underway for the additional planting of 1000 acres in spring 1995.

  14. High-Energy Physics Strategies and Future Large-Scale Projects

    CERN Document Server

    Zimmermann, F

    2015-01-01

    We sketch the current European and international strategies and possible future facilities. In the near term the High Energy Physics (HEP) community will fully exploit the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). Post-LHC options include a linear e+e- collider in Japan (ILC) or at CERN (CLIC), as well as circular lepton or hadron colliders in China (CepC/SppC) and Europe (FCC). We conclude with linear and circular acceleration approaches based on crystals, and some perspectives for the far future of accelerator-based particle physics.

  15. Large-scale perturbations from the waterfall field in hybrid inflation

    International Nuclear Information System (INIS)

    Fonseca, José; Wands, David; Sasaki, Misao

    2010-01-01

    We estimate large-scale curvature perturbations from isocurvature fluctuations in the waterfall field during hybrid inflation, in addition to the usual inflaton field perturbations. The tachyonic instability at the end of inflation leads to an explosive growth of super-Hubble scale perturbations, but they retain the steep blue spectrum characteristic of vacuum fluctuations in a massive field during inflation. The power spectrum thus peaks around the Hubble-horizon scale at the end of inflation. We extend the usual δN formalism to include the essential role of these small fluctuations when estimating the large-scale curvature perturbation. The resulting curvature perturbation due to fluctuations in the waterfall field is second-order and the spectrum is expected to be of order 10⁻⁵⁴ on cosmological scales.
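    The extended δN calculation summarized above can be sketched schematically (the notation here is illustrative rather than the paper's own; φ denotes the inflaton and χ the waterfall field). Because fluctuations in the waterfall field average to zero on large scales, their first nonvanishing contribution to the curvature perturbation enters at second order:

    ```latex
    % Schematic \delta N expansion: inflaton term at first order,
    % waterfall term only at second order (its linear term vanishes).
    \zeta = \delta N
      \simeq \frac{\partial N}{\partial \phi}\,\delta\phi
      + \frac{1}{2}\,\frac{\partial^{2} N}{\partial \chi^{2}}
        \left( \delta\chi^{2} - \langle \delta\chi^{2} \rangle \right)
      + \cdots
    ```

    The quadratic form of the waterfall term is what makes its contribution so strongly suppressed on cosmological scales.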

  16. Decoupling local mechanics from large-scale structure in modular metamaterials

    Science.gov (United States)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures, we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  17. Evaluation of the regional climate response in Australia to large-scale climate modes in the historical NARCliM simulations

    Science.gov (United States)

    Fita, L.; Evans, J. P.; Argüeso, D.; King, A.; Liu, Y.

    2017-10-01

    NARCliM (New South Wales (NSW)/Australian Capital Territory (ACT) Regional Climate Modelling project) is a regional climate modeling project for the Australian region. It is providing a comprehensive dynamically downscaled climate dataset for the CORDEX-AustralAsia region at 50-km resolution, and for south-east Australia at a resolution of 10 km. The first phase of NARCliM produced 60-year-long reanalysis-driven regional simulations to allow evaluation of the regional model performance. This long control period (1950-2009) was used so that the model's ability to capture the impact of large-scale climate modes on Australian climate could be examined. Simulations are evaluated using a gridded observational dataset. Results show that using model independence as a criterion for choosing atmospheric model configurations from different possible sets of parameterizations may contribute to the regional climate models having different overall biases. The regional models generally capture the regional climate response to large-scale modes better than the driving reanalysis, though no regional model improves on all aspects of the simulated climate.

  18. A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines

    Science.gov (United States)

    Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian

    2016-09-01

    Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast and more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from a high levelized cost of energy (LCOE) and in particular high balance-of-system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges associated with offshore wind, Sandia National Laboratories is researching large-scale (MW-class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs between rotor and platform configurations within the floating VAWT design space.
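    The LCOE framing in this record can be made concrete with a minimal sketch using the standard capital-recovery formulation; every numeric input below is an illustrative assumption, not a figure from the study:

    ```python
    def capital_recovery_factor(rate: float, years: int) -> float:
        """Annualize an up-front capital cost over the project lifetime."""
        return rate * (1 + rate) ** years / ((1 + rate) ** years - 1)

    def lcoe(capex: float, opex_per_year: float, aep_mwh: float,
             rate: float = 0.07, years: int = 20) -> float:
        """Levelized cost of energy in $/MWh: annualized CAPEX plus
        annual OPEX, divided by annual energy production."""
        annualized_capex = capex * capital_recovery_factor(rate, years)
        return (annualized_capex + opex_per_year) / aep_mwh

    # Illustrative inputs: a 5 MW floating turbine at 45% capacity factor.
    aep = 5 * 8760 * 0.45  # MWh per year
    print(round(lcoe(capex=25e6, opex_per_year=1e6, aep_mwh=aep), 1))
    ```

    High BoS costs enter this picture through `capex`, which is why the record singles them out as a lever on offshore LCOE.
    
    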

  19. Large-scale fuel cycle centers

    International Nuclear Information System (INIS)

    Smiley, S.H.; Black, K.M.

    1977-01-01

    The United States Nuclear Regulatory Commission (NRC) has considered the nuclear energy center concept for fuel cycle plants in the Nuclear Energy Center Site Survey - 1975 (NECSS-75), an important study mandated by the U.S. Congress in the Energy Reorganization Act of 1974, which created the NRC. For the study, NRC defined fuel cycle centers to consist of fuel reprocessing and mixed-oxide fuel fabrication plants, and optional high-level waste and transuranic waste management facilities. A range of fuel cycle center sizes corresponded to the fuel throughput of power plants with a total capacity of 50,000-300,000 MWe. The types of fuel cycle facilities located at the fuel cycle center permit the assessment of the role of fuel cycle centers in enhancing the safeguarding of strategic special nuclear materials, namely plutonium and mixed oxides. Siting of fuel cycle centers presents a considerably smaller problem than the siting of reactors. A single reprocessing plant of the scale projected for use in the United States (1500-2000 MT/yr) can reprocess the fuel from reactors producing 50,000-65,000 MWe. Only two or three fuel cycle centers of the upper-limit size considered in the NECSS-75 would be required in the United States by the year 2000. The NECSS-75 fuel cycle center evaluations showed that large-scale fuel cycle centers present no real technical difficulties in siting from a radiological effluent and safety standpoint. Some construction economies may be attainable with fuel cycle centers; such centers offer opportunities for improved waste management systems. Combined centers consisting of reactors and fuel reprocessing and mixed-oxide fuel fabrication plants were also studied in the NECSS. Such centers can eliminate not only shipment of plutonium, but also of mixed oxide fuel. Increased fuel cycle costs result from implementation of combined centers unless the fuel reprocessing plants are commercial-sized. Development of plutonium-burning reactors could reduce any
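    The sizing arithmetic in this record can be checked with a short sketch. The per-plant and per-center figures come from the text above; the 600-900 GWe fleet projection is an inference from "two or three centers", not a figure stated in the source:

    ```python
    import math

    # From the record: a 1500-2000 MT/yr reprocessing plant handles the
    # discharge of reactors totalling 50,000-65,000 MWe, and the largest
    # center size studied in NECSS-75 corresponds to 300,000 MWe.
    center_capacity_gwe = 300  # upper-limit NECSS-75 center, in GWe

    def centers_needed(projected_fleet_gwe: float) -> int:
        """Number of upper-limit fuel cycle centers a given fleet requires."""
        return math.ceil(projected_fleet_gwe / center_capacity_gwe)

    # "Two or three centers by the year 2000" is consistent with a
    # projected fleet of roughly 600-900 GWe (assumed, for illustration).
    print(centers_needed(600), centers_needed(900))  # → 2 3
    ```

    The same division explains the plant count within a center: a 300 GWe center needs five to six reprocessing plants at 50-65 GWe served per plant.
    
    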

  20. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)