WorldWideScience

Sample records for canada involving large-scale

  1. Feasibility of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    International Nuclear Information System (INIS)

    A large-scale hydrogen production system is proposed that uses solid fuels and is designed to increase the sustainability of alternative energy forms in Canada; the technical and economic aspects of the system within the Canadian energy market are examined. The work investigates the feasibility of, and constraints on, implementing such a system within the energy infrastructure of Canada. The proposed multi-conversion, single-function system produces hydrogen in large quantities using energy from solid fuels such as coal, tar sands, biomass, municipal solid waste (MSW) and agricultural/forest/industrial residues. The proposed system involves significant technology integration, with various energy conversion processes (such as gasification, chemical looping combustion, anaerobic digestion, combustion power cycles with electrolysis, and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as is feasible within cost, environmental and other constraints. The analysis involves quantitative and qualitative assessments based on (i) energy resource availability and demand for hydrogen, (ii) commercial viability of primary energy conversion technologies, (iii) academia, industry and government participation, (iv) sustainability and (v) economics. (author)

  2. Sustainability of an energy conversion system in Canada involving large-scale integrated hydrogen production using solid fuels

    Directory of Open Access Journals (Sweden)

    Nirmal V. Gnanapragasam, Bale V. Reddy, Marc A. Rosen

    2011-01-01

    The sustainability of a large-scale hydrogen production system is assessed qualitatively. The system uses solid fuels and aims to increase the sustainability of the energy system in Canada through the use of alternative energy forms. The system involves significant technology integration, with various energy conversion processes (e.g., gasification, chemical looping combustion, anaerobic digestion, combustion power cycles with electrolysis, and solar-thermal converters) interconnected to increase the utilization of solid fuels as much as feasible in a sustainable manner within cost, environmental and other constraints. The qualitative analysis involves ten different indicators for each of the three dimensions of sustainability (ecology, sociology and technology), applied to each process in the system and assessed on a ten-point quality scale. The results indicate that biomasses have better sustainability than coals, while newer secondary processes are essential for primary conversion to be sustainable, especially when using coals. Also, new developments in CO2 use (for algae-to-oil and commercial applications) and storage will in time help improve sustainability.
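
    The scoring scheme described above lends itself to a simple matrix computation. The Python sketch below illustrates the kind of assessment the abstract describes (ten indicators per dimension, three dimensions, ten-point scale); the aggregation rule and all scores are hypothetical placeholders, not values from the study.

        import numpy as np

        # The three dimensions of sustainability named in the abstract.
        DIMENSIONS = ["ecology", "sociology", "technology"]

        def assess_process(scores: np.ndarray) -> dict:
            """Aggregate a 3 x 10 matrix of indicator scores (ten indicators
            per dimension, each on a ten-point scale) into a mean score per
            dimension."""
            assert scores.shape == (3, 10)
            assert ((1 <= scores) & (scores <= 10)).all()
            return dict(zip(DIMENSIONS, scores.mean(axis=1).round(2)))

        # Hypothetical scores for two primary-conversion processes.
        rng = np.random.default_rng(0)
        for process in ["biomass gasification", "coal gasification"]:
            print(process, assess_process(rng.integers(1, 11, size=(3, 10))))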

  3. Abnormal binding and disruption in large scale networks involved in human partial seizures

    OpenAIRE

    Bartolomei, Fabrice; Guye, Maxime; Wendling, Fabrice

    2013-01-01

    There has been a marked increase in the amount of electrophysiological and neuroimaging work dealing with the study of large-scale brain connectivity in the epileptic brain. Our view of the epileptogenic process in the brain has evolved considerably over the last twenty years, from the historical concept of an "epileptic focus" to a more complex description of "epileptogenic networks" involved in the genesis and "propagation" of epileptic activities. In particular, a large number ...

  4. Commercializing Canada's emerging energies : capitalising on large-scale power project opportunities from wind and hydro power, to biomass and clean coal

    International Nuclear Information System (INIS)

    The Canada Institute conference on Commercialising Canada's Emerging Energies was held in Calgary, Alberta, Canada on May 28-29, 2007. This publication provides cutting-edge project updates and best practices on how to take advantage of new business opportunities, while both identifying and mitigating the risks associated with future large-scale projects. Emerging energies - wind, hydro, biomass and clean coal - are no longer the future, they are today. Are you ready to take advantage of Canada's next generation of clean and green power opportunities? Canada's electricity industry is changing dramatically. Power projects are becoming less centralized. Governments are shifting their focus to clean and green sources of energy. The cost-effectiveness of applying emerging energy technologies for large-scale (5 MW+) power projects has significantly improved, especially with new regulatory incentives. However, many challenges still need to be addressed to bring many of these projects into the mainstream market. Ensuring adequate supply, system reliability and transmission capacity are among the key technical issues. Improvements to community consultation practice, project planning and implementation skills, and government incentives are also expected to improve emerging energy economics and deliverability.

  5. Financial return for government support of large-scale thin-film solar photovoltaic manufacturing in Canada

    International Nuclear Information System (INIS)

    As the Ontario government has recognized that solar photovoltaic (PV) energy conversion is a solution for satisfying energy demands while reducing the adverse anthropogenic impacts on the global environment that compromise social welfare, it has begun to generate policy that provides financial incentives for PV. This paper provides a financial analysis of investment in a 1 GW per year turnkey amorphous silicon PV manufacturing plant. The financial benefits for both the provincial and federal governments were quantified for: (i) a full construction subsidy, (ii) a construction subsidy and sale, (iii) a partial construction subsidy, (iv) a publicly owned plant, (v) a loan guarantee for construction, and (vi) an income tax holiday. Revenues for the governments are derived from taxation (personal, corporate, and sales), sales of panels in Ontario, and saved health, environmental and economic costs associated with offsetting coal-fired electricity. Both governments enjoyed positive cash flows from these investments in less than 12 years, and in many of the scenarios both governments earned well over 8% on investments ranging from hundreds of millions of dollars to $2.4 billion. The results showed that it is in the financial best interest of both the Ontario and Canadian federal governments to implement aggressive fiscal policy to support large-scale PV manufacturing.
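
    The payback logic behind these results can be made concrete with a small discounted-cash-flow sketch. All figures below are hypothetical placeholders (not the paper's inputs); they only show how a payback year on a construction subsidy would be computed.

        def payback_year(cash_flows, discount_rate=0.05):
            """Return the first year in which the cumulative discounted cash
            flow turns non-negative, or None if it never does.
            cash_flows[0] is the up-front subsidy (negative)."""
            npv = 0.0
            for year, cf in enumerate(cash_flows):
                npv += cf / (1 + discount_rate) ** year
                if year > 0 and npv >= 0:
                    return year
            return None

        # Hypothetical: a $500M construction subsidy recovered through annual
        # tax revenue plus avoided health/environmental costs of coal power.
        flows = [-500e6] + [65e6] * 20
        print(payback_year(flows))  # -> 10 with these placeholder numbers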

  6. Barriers to renewable energy development: A case study of large-scale wind energy in Saskatchewan, Canada

    International Nuclear Information System (INIS)

    Renewable energy is receiving increased attention as a viable alternative to non-renewable electrical generation; however, meeting global energy demands will require a more ambitious renewable energy program than is currently the case. There have been several reviews of potential technological, economic, social, or public barriers and solutions to renewable energy investment. Although important, there is also a need for multi-dimensional analyses of these barriers and identification of the most significant underlying barriers if viable solutions are to be developed. In this paper we apply a theoretical framework to examine stakeholders' perceptions and understanding of the barriers to wind energy development in Saskatchewan, Canada. We identify and examine the most significant underlying barriers to investment in renewable energy and the interactions between those barriers. Results show a number of perceived barriers to wind energy investment; however, these barriers can be explained in large part by knowledge barriers, if not disagreement over whether the current level of investment in wind energy is sufficient. We show that barriers to renewable energy cannot be explained solely by technological, social, political, or economic factors in isolation, and that a multi-dimensional approach, identifying and explaining the underlying sources of these barriers, is necessary to develop viable solutions. - Highlights: ► Meeting future wind energy objectives requires an ambitious investment program. ► A framework is applied to identify and explain perceived barriers to wind energy. ► Stakeholders perceived technological and political barriers as the most significant. ► These could be explained by knowledge barriers and complacency with the status quo. ► Even with additional investment these underlying barriers will constrain progress.

  7. The use of public participation and economic appraisal for public involvement in large-scale hydropower projects: Case study of the Nam Theun 2 Hydropower Project

    International Nuclear Information System (INIS)

    Gaining public acceptance is one of the main issues with large-scale low-carbon projects such as hydropower development. The World Commission on Dams has recommended that, to gain public acceptance, public involvement is necessary in the decision-making process. Because international financial institutions are financially significant actors in the planning and implementation of large-scale hydropower projects in developing countries, the paper examines the ways in which they may influence public involvement. Using the case study of the Nam Theun 2 Hydropower Project in Laos, the paper analyses how public involvement facilitated by the Asian Development Bank had a bearing on procedural and distributional justice. The paper analyses the extent of public participation and the assessment of the full social and environmental costs of the project in the cost-benefit analysis conducted during the project appraisal stage. It is argued that while efforts were made to involve the public, two factors in particular influenced procedural and distributional justice: the late involvement of the Asian Development Bank in the project appraisal stage, and the treatment of non-market values and the discount rate used to calculate the full social and environmental costs. - Highlights: ► Public acceptance in large-scale hydropower projects is examined. ► Both procedural and distributional justice are important for public acceptance. ► International Financial Institutions can influence the level of public involvement. ► Public involvement benefits consideration of non-market values and discount rates.
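
    The abstract's point about discount rates is easy to see numerically: long-lived environmental costs nearly vanish from a cost-benefit analysis at high discount rates. The figures in this sketch are hypothetical, chosen only to illustrate the sensitivity.

        def present_value(annual_cost, years, rate):
            """Discounted present value of a constant annual cost stream."""
            return sum(annual_cost / (1 + rate) ** t for t in range(1, years + 1))

        # Hypothetical: $1M/year of environmental damages over 50 years.
        for rate in (0.02, 0.10):
            print(f"rate {rate:.0%}: PV = ${present_value(1e6, 50, rate) / 1e6:.1f}M")
        # rate 2%  -> PV ~ $31.4M; rate 10% -> PV ~ $9.9M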

  8. The importance of large-scale floods over regular sedimentation in delta development: A case study involving Wax Lake Delta

    Science.gov (United States)

    Ullah, M. S.; Kolker, A.; Li, C.

    2011-12-01

    It has been widely hypothesized that catastrophic floods can play an important role in the development of a young and growing delta. This study examines the nature and rate of sediment deposition in the Wax Lake Delta in the Atchafalaya Basin of Louisiana during the Mississippi River flood of April to June 2011. We hypothesize that the deposition rate that results from large-scale floods at the Mississippi/Atchafalaya river outlets has a greater impact on Wax Lake Delta development than the deposition rates that result from typical yearly sedimentation. Preliminary results from cosmogenic 7Be counts in sediment collected from the delta show distinct regional and local sediment deposition patterns during the flood, and rates that are significantly higher than the sedimentation rates of the last few decades. This application of cosmogenic 7Be to distinguish the sediment deposition rate provides a first-order understanding of deltaic evolution and stratigraphic sequence development in a high-discharge setting.
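
    For readers unfamiliar with the tracer, a common way short-lived 7Be (half-life about 53.3 days) constrains deposition is through a steady-state model in which activity decays exponentially with depth; the sedimentation rate then follows from the slope of ln(activity) versus depth. The profile below is illustrative, not data from this study.

        import numpy as np

        BE7_HALF_LIFE_DAYS = 53.3
        DECAY = np.log(2) / BE7_HALF_LIFE_DAYS  # decay constant, 1/day

        def sedimentation_rate_cm_per_day(depths_cm, activities):
            """Steady-state model A(z) = A0 * exp(-lambda * z / s):
            fitting ln(A) against depth gives slope = -lambda / s."""
            slope, _intercept = np.polyfit(depths_cm, np.log(activities), 1)
            return -DECAY / slope

        # Illustrative core profile (depth in cm, 7Be activity in Bq/kg).
        z = np.array([0.5, 1.5, 2.5, 3.5])
        a = np.array([40.0, 22.0, 12.0, 6.5])
        print(f"{sedimentation_rate_cm_per_day(z, a):.3f} cm/day")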

  9. Comparison of home-composting and large-scale composting for organic waste management in Québec, Canada.

    OpenAIRE

    Joly, Elsa

    2011-01-01

    The management of the organic fraction of municipal solid waste has become a major issue lately in the province of Québec, Canada. Most of it is landfilled today, which increases the burden on landfills and is environmentally unsound. In order to comply with new government guidelines, municipalities have to develop solutions to recover and recycle organic waste. In this context, this study examines two solutions for treating organic waste: home-composting and a separate biodegradable waste co...

  10. Wing-dimorphism and population expansion of Pterostichus melanarius (Illiger, 1798) at small and large scales in central Alberta, Canada (Coleoptera, Carabidae, Pterostichini)

    Directory of Open Access Journals (Sweden)

    Stephane Bourassa

    2011-11-01

    A study spanning ten years revealed changes in wing-morph ratios corroborating the hypothesis that the introduced wing-dimorphic carabid, Pterostichus melanarius Ill., is spreading through flight from the city of Edmonton, Canada, and establishing populations in natural aspen forest of more rural areas 45-50 km to the east. Comparison of wing-morph ratios between P. melanarius and the native wing-dimorphic species Agonum retractum LeConte suggests that the spatial variation in ratios for P. melanarius does not reflect underlying environmental variation, but instead the action of selective forces on this wing-dimorphic species. About ten years after its earliest detection in some rural sites, the frequency of macropterous individuals in P. melanarius has decreased c. five-fold, but it is still above the level seen in European populations in which the two wing-morphs are thought to exist in equilibrium. P. melanarius is expanding its range in native aspen forest much faster than three other introduced species also encountered in this study: Carabus granulatus L., Carabus nemoralis O.F. Müller and Clivina fossor L. The two Carabus species are flightless, but C. fossor can be dimorphic. Although these four non-native ground beetle species comprise >85% of the carabids collected at sites in urban Edmonton, activity-density of native carabids was similar across the urban-rural gradient, suggesting little direct impact of introduced species on the local abundance of native species. In a second study, conducted at a smaller scale near George Lake, Alberta, macropterous individuals of P. melanarius have penetrated furthest and most rapidly into native aspen forest. Furthermore, the percentage of micropterous individuals has increased markedly in areas first colonized a decade previously. Overall, these studies support the idea that macropterous beetles in wing-dimorphic species are important vanguards for early colonization of unexploited territory, but ...

  12. LARGE SCALE GLAZED

    DEFF Research Database (Denmark)

    Bache, Anja Margrethe

    2010-01-01

    World-famous architects today challenge the exposure of concrete in their architecture, and it is my hope to be able to complement these. I try to develop new aesthetic potentials for concrete and ceramics, at large scales that have not been seen before in the ceramic area. This is expected to result in new types of large-scale and very thin, glazed concrete façades in building. If such are introduced in an architectural context, they will have a distinctive impact on the visual expression of the building. The question is what kind. That I will attempt to answer in this article through observation of selected existing buildings in and around Copenhagen covered with mosaic tiles, unglazed or glazed clay tiles: buildings which have qualities that I would like applied, perhaps transformed or, most preferably, interpreted anew, for the large glazed concrete panels I am developing. Keywords: color, light and texture; glazed and unglazed; building facades.

  13. Large-Scale Disasters

    Science.gov (United States)

    Gad-El-Hak, Mohamed

    "Extreme" events - including climatic events, such as hurricanes, tornadoes, and drought - can cause massive disruption to society, including large death tolls and property damage in the billions of dollars. Events in recent years have shown the importance of being prepared and that countries need to work together to help alleviate the resulting pain and suffering. This volume presents a review of the broad research field of large-scale disasters. It establishes a common framework for predicting, controlling and managing both manmade and natural disasters. There is a particular focus on events caused by weather and climate change. Other topics include air pollution, tsunamis, disaster modeling, the use of remote sensing and the logistics of disaster management. It will appeal to scientists, engineers, first responders and health-care professionals, in addition to graduate students and researchers who have an interest in the prediction, prevention or mitigation of large-scale disasters.

  14. Conference on Large Scale Optimization

    CERN Document Server

    Hearn, D; Pardalos, P

    1994-01-01

    On February 15-17, 1993, a conference on Large Scale Optimization, hosted by the Center for Applied Optimization, was held at the University of Florida. The conference was supported by the National Science Foundation, the U.S. Army Research Office, and the University of Florida, with endorsements from SIAM, MPS, ORSA and IMACS. Forty-one invited speakers presented papers on mathematical programming and optimal control topics with an emphasis on algorithm development, real world applications and numerical results. Participants from Canada, Japan, Sweden, The Netherlands, Germany, Belgium, Greece, and Denmark gave the meeting an important international component. Attendees also included representatives from IBM, American Airlines, US Air, United Parcel Service, AT&T Bell Labs, Thinking Machines, Army High Performance Computing Research Center, and Argonne National Laboratory. In addition, the NSF sponsored attendance of thirteen graduate students from universities in the United States and abroad...

  15. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out ... A model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors). Simulation programs are proposed as a control supporting tool for daily operation and performance prediction of central solar heating plants. Finally, the CSHP technology is put into perspective with respect to alternatives, and a short discussion of the barriers to and breakthrough of the technology is given.

  16. Large scale traffic simulations

    Energy Technology Data Exchange (ETDEWEB)

    Nagel, K.; Barrett, C.L. [Los Alamos National Lab., NM (United States)]|[Santa Fe Institute, NM (United States); Rickert, M. [Los Alamos National Lab., NM (United States)]|[Universitaet zu Koeln (Germany)

    1997-04-01

    Large scale microscopic (i.e. vehicle-based) traffic simulations pose high demands on computational speed in at least two application areas: (i) real-time traffic forecasting, and (ii) long-term planning applications (where repeated "looping" between the microsimulation and the simulated planning of individual person's behavior is necessary). As a rough number, a real-time simulation of an area such as Los Angeles (ca. 1 million travellers) will need a computational speed much higher than 1 million "particle" (= vehicle) updates per second. This paper reviews how this problem is approached in different projects and how these approaches depend both on the specific questions and on the prospective user community. The approaches reach from highly parallel and vectorizable, single-bit implementations on parallel supercomputers for Statistical Physics questions, via more realistic implementations on coupled workstations, to more complicated driving dynamics implemented again on parallel supercomputers. 45 refs., 9 figs., 1 tab.
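
    The "single-bit" cellular-automaton style of microsimulation mentioned here is exemplified by the Nagel-Schreckenberg rule (accelerate, brake, random slowdown, move). The sketch below is a minimal single-lane version with arbitrary parameters, meant only to show why one vehicle update is cheap enough that millions of updates per second is a sensible target; production codes are parallel and far more elaborate.

        import random

        def nasch_step(pos, vel, road_len, vmax=5, p_slow=0.3):
            """One parallel Nagel-Schreckenberg update on a circular
            single-lane road. pos: sorted cell indices; vel: velocities."""
            n = len(pos)
            new_vel = []
            for i in range(n):
                gap = (pos[(i + 1) % n] - pos[i] - 1) % road_len  # free cells ahead
                v = min(vel[i] + 1, vmax)          # 1) accelerate
                v = min(v, gap)                    # 2) brake to avoid collision
                if v > 0 and random.random() < p_slow:
                    v -= 1                         # 3) random slowdown
                new_vel.append(v)
            new_pos = [(x + v) % road_len for x, v in zip(pos, new_vel)]  # 4) move
            order = sorted(range(n), key=new_pos.__getitem__)
            return [new_pos[i] for i in order], [new_vel[i] for i in order]

        # 100 vehicles on a 1000-cell ring road; each step updates every vehicle.
        pos, vel = list(range(0, 1000, 10)), [0] * 100
        for _ in range(10):
            pos, vel = nasch_step(pos, vel, 1000)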

  17. Large scale tracking algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    Hansen, Ross L.; Love, Joshua Alan; Melgaard, David Kennett; Karelitz, David B.; Pitts, Todd Alan; Zollweg, Joshua David; Anderson, Dylan Z.; Nandy, Prabal; Whitlow, Gary L.; Bender, Daniel A.; Byrne, Raymond Harry

    2015-01-01

    Low signal-to-noise data processing algorithms for improved detection, tracking, discrimination and situational threat assessment are a key research challenge. As sensor technologies progress, the number of pixels will increase significantly. This will result in increased resolution, which could improve object discrimination, but unfortunately, will also result in a significant increase in the number of potential targets to track. Many tracking techniques, like multi-hypothesis trackers, suffer from a combinatorial explosion as the number of potential targets increases. As the resolution increases, the phenomenology applied towards detection algorithms also changes. For low resolution sensors, "blob" tracking is the norm. For higher resolution data, additional information may be employed in the detection and classification steps. The most challenging scenarios are those where the targets cannot be fully resolved, yet must be tracked and distinguished from neighboring closely spaced objects. Tracking vehicles in an urban environment is an example of such a challenging scenario. This report evaluates several potential tracking algorithms for large-scale tracking in an urban environment.
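
    For contrast with the multi-hypothesis trackers discussed above, the simplest large-scale-friendly alternative is global nearest-neighbor association, which solves one optimal assignment problem per frame in polynomial time instead of enumerating hypotheses. A minimal sketch (positions and the gate threshold are hypothetical):

        import numpy as np
        from scipy.optimize import linear_sum_assignment

        def associate(tracks, detections, gate=5.0):
            """Global nearest-neighbor data association: one optimal
            track-to-detection assignment per frame via the Hungarian
            algorithm, discarding pairs beyond the gating distance."""
            cost = np.linalg.norm(tracks[:, None, :] - detections[None, :, :], axis=2)
            rows, cols = linear_sum_assignment(cost)
            return [(r, c) for r, c in zip(rows, cols) if cost[r, c] <= gate]

        tracks = np.array([[0.0, 0.0], [10.0, 10.0]])           # predicted positions
        detections = np.array([[0.5, -0.2], [9.0, 11.0], [40.0, 40.0]])
        print(associate(tracks, detections))                    # -> [(0, 0), (1, 1)]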

  18. Large-scale RNAi screen of G protein-coupled receptors involved in larval growth, molting and metamorphosis in the red flour beetle

    Directory of Open Access Journals (Sweden)

    Shah Kapil

    2011-08-01

    Background: G protein-coupled receptors (GPCRs) belong to the largest superfamily of integral cell membrane proteins and play crucial roles in physiological processes including behavior, development and reproduction. Because of their broad and diverse roles in cellular signaling, GPCRs are therapeutic targets for many prescription drugs. However, there is no commercial pesticide targeting insect GPCRs. In this study, we employed functional genomics methods and used the red flour beetle, Tribolium castaneum, as a model system to study the physiological roles of GPCRs during larval growth, molting and metamorphosis. Results: A total of 111 non-sensory GPCRs were identified in the T. castaneum genome. Thirty-nine of them were not reported previously. A large-scale RNA interference (RNAi) screen was used to study the function of all these GPCRs during immature stages. Double-stranded RNA (dsRNA)-mediated knockdown in the expression of genes coding for eight GPCRs caused severe developmental arrest and ecdysis failure (with more than 90% mortality after dsRNA injection). These GPCRs include a dopamine-2 like receptor (TC007490/D2R) and a latrophilin receptor (TC001872/Cirl). The majority of larvae injected with TC007490/D2R dsRNA died during the larval stage prior to entering the pupal stage, suggesting that this GPCR is essential for larval growth and development. Conclusions: The results from our study revealed the physiological roles of some GPCRs in T. castaneum. These findings could help in the development of novel pesticides targeting these GPCRs.

  19. Canada

    International Nuclear Information System (INIS)

    Canada ranks ninth in the world in anthropogenic carbon dioxide emissions. Canada currently releases 1.9 percent of global carbon dioxide emissions from combustion of fossil fuels, production and treatment of oil and natural gas, and cement manufacturing. Canadian carbon dioxide emissions doubled between 1956 and 1986, growing at an average rate of about 2.4 percent per year. They reached a peak in 1979 and have since fluctuated within 10 percent of that amount. Despite the significant increase in emissions, Canada's share of global carbon dioxide emissions has declined slightly since 1956. On a per capita basis, Canada ranks fourth among nations in carbon dioxide emissions, with about 4.1 tons in 1986, compared with the U.S. average of 5 tons; per capita emissions have grown by about 25 percent since 1956. Canada, though a relatively small source of greenhouse gases, has been deeply involved in finding ways to reduce the risk of climatic change. The nation fortunately possesses opportunities to reduce this risk by saving energy, developing nonfossil energy resources, and facilitating international cooperation for capturing these opportunities worldwide

  20. Large-scale river regulation

    International Nuclear Information System (INIS)

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  1. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large-scale networks is approached from the perspective of the network planner. An analysis of the long-term planning problems is presented, with the main focus on the changing requirements for large-scale networks and the potential problems in meeting these requirements ... Here the general perception of the nature and role in society of large-scale networks as a fundamental infrastructure is analysed. This analysis focuses on the effects of the technical DDN projects and on the perception of network infrastructure as expressed by key decision makers ... Fundamental technological resources in network technologies are analysed for scalability; several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large-scale networks given the growth of user requirements and the technological ...

  2. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  3. Reviving large-scale projects

    International Nuclear Information System (INIS)

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province, who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890-foot-long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a powerhouse with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion, of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy and is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both ...

  4. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data dissemination ...

  5. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. The objectives were to improve the performance and reduce the costs of a large-scale solar heating system. As a result of the project, the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the design stage. (orig.)

  7. Large scale biomimetic membrane arrays

    DEFF Research Database (Denmark)

    Hansen, Jesper Søndergaard; Perry, Mark; Vogel, Jörg;

    2009-01-01

    To establish planar biomimetic membranes across large-scale partition aperture arrays, we created a disposable single-use horizontal chamber design that supports combined optical-electrical measurements. Functional lipid bilayers could easily and efficiently be established across CO2 laser micro... peptides and proteins. Next, we tested the scalability of the biomimetic membrane design by establishing lipid bilayers in rectangular 24 x 24 and hexagonal 24 x 27 aperture arrays, respectively. The results presented show that the design is suitable for further developments of sensitive biosensor assays ... and furthermore demonstrate that the design can conveniently be scaled up to support planar lipid bilayers in large square-centimeter partition arrays.

  8. Japanese large-scale interferometers

    International Nuclear Information System (INIS)

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D

  9. Models of large scale structure

    International Nuclear Information System (INIS)

    The ingredients required to construct models of the cosmic large scale structure are discussed. Input from particle physics leads to a considerable simplification by offering concrete proposals for the geometry of the universe, the nature of the dark matter and the primordial fluctuations that seed the growth of structure. The remaining ingredient is the physical interaction that governs dynamical evolution. Empirical evidence provided by an analysis of a redshift survey of IRAS galaxies suggests that gravity is the main agent shaping the large-scale structure. In addition, this survey implies large values of the mean cosmic density, Ω ≳ 0.5, and is consistent with a flat geometry if IRAS galaxies are somewhat more clustered than the underlying mass. Together with current limits on the density of baryons from Big Bang nucleosynthesis, this lends support to the idea of a universe dominated by non-baryonic dark matter. Results from cosmological N-body simulations evolved from a variety of initial conditions are reviewed. In particular, neutrino dominated and cold dark matter dominated universes are discussed in detail. Finally, it is shown that apparent periodicities in the redshift distributions in pencil-beam surveys arise frequently from distributions which have no intrinsic periodicity but are clustered on small scales. (orig.)

  10. Testing gravity on Large Scales

    Directory of Open Access Journals (Sweden)

    Raccanelli Alvise

    2013-09-01

    We show how it is possible to test general relativity and different models of gravity via Redshift-Space Distortions using forthcoming cosmological galaxy surveys. However, the theoretical models currently used to interpret the data often rely on simplifications that make them not accurate enough for precise measurements. We will discuss improvements to the theoretical modeling at very large scales, including wide-angle and general relativistic corrections; we then show that for wide and deep surveys those corrections need to be taken into account if we want to measure the growth of structures at a few percent level, and so perform tests on gravity, without introducing systematic errors. Finally, we report the results of some recent cosmological model tests carried out using those precise models.

  11. Large scale cluster computing workshop

    Energy Technology Data Exchange (ETDEWEB)

    Dane Skow; Alan Silverman

    2002-12-23

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters at the scale of thousands of processors, to be used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  12. Handbook of Large-Scale Random Networks

    CERN Document Server

    Bollobas, Bela; Miklos, Dezso

    2008-01-01

    Covers various aspects of large-scale networks, including mathematical foundations and rigorous results of random graph theory, modeling and computational aspects of large-scale networks, as well as areas in physics, biology, neuroscience, sociology and technical areas

  13. Large-Scale Information Systems

    Energy Technology Data Exchange (ETDEWEB)

    D. M. Nicol; H. R. Ammerlahn; M. E. Goldsby; M. M. Johnson; D. E. Rhodes; A. S. Yoshimura

    2000-12-01

    Large enterprises are ever more dependent on their Large-Scale Information Systems (LSIS), computer systems that are distinguished architecturally by distributed components--data sources, networks, computing engines, simulations, human-in-the-loop control and remote access stations. These systems provide such capabilities as workflow, data fusion and distributed database access. The Nuclear Weapons Complex (NWC) contains many examples of LSIS components, a fact that motivates this research. However, most LSIS in use grew up from collections of separate subsystems that were not designed to be components of an integrated system. For this reason, they are often difficult to analyze and control. The problem is made more difficult by the size of a typical system, its diversity of information sources, and the institutional complexities associated with its geographic distribution across the enterprise. Moreover, there is no integrated approach for analyzing or managing such systems. Indeed, integrated development of LSIS is an active area of academic research. This work developed such an approach by simulating the various components of the LSIS and allowing the simulated components to interact with real LSIS subsystems. This research demonstrated two benefits. First, applying it to a particular LSIS provided a thorough understanding of the interfaces between the system's components. Second, it demonstrated how more rapid and detailed answers could be obtained to questions significant to the enterprise by interacting with the relevant LSIS subsystems through simulated components designed with those questions in mind. In a final, added phase of the project, investigations were made on extending this research to wireless communication networks in support of telemetry applications.

  14. Large Scale Nanolaminate Deformable Mirror

    Energy Technology Data Exchange (ETDEWEB)

    Papavasiliou, A; Olivier, S; Barbee, T; Miles, R; Chang, K

    2005-11-30

    This work concerns the development of a technology that uses Nanolaminate foils to form light-weight, deformable mirrors that are scalable over a wide range of mirror sizes. While MEMS-based deformable mirrors and spatial light modulators have considerably reduced the cost and increased the capabilities of adaptive optic systems, there has not been a way to utilize the advantages of lithography and batch-fabrication to produce large-scale deformable mirrors. This technology is made scalable by using fabrication techniques and lithography that are not limited to the sizes of conventional MEMS devices. Like many MEMS devices, these mirrors use parallel plate electrostatic actuators. This technology replicates that functionality by suspending a horizontal piece of nanolaminate foil over an electrode by electroplated nickel posts. This actuator is attached, with another post, to another nanolaminate foil that acts as the mirror surface. Most MEMS devices are produced with integrated circuit lithography techniques that are capable of very small line widths, but are not scalable to large sizes. This technology is very tolerant of lithography errors and can use coarser, printed circuit board lithography techniques that can be scaled to very large sizes. These mirrors use small, lithographically defined actuators and thin nanolaminate foils allowing them to produce deformations over a large area while minimizing weight. This paper will describe a staged program to develop this technology. First-principles models were developed to determine design parameters. Three stages of fabrication will be described starting with a 3 x 3 device using conventional metal foils and epoxy to a 10-across all-metal device with nanolaminate mirror surfaces.
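
    The parallel-plate electrostatic actuators described here follow textbook relations: the attractive force scales with voltage squared over gap squared, and a spring-suspended plate snaps down ("pull-in") once deflected a third of the initial gap. The sketch below evaluates those standard formulas for hypothetical actuator dimensions, not values from the report.

        import math

        EPS0 = 8.854e-12  # vacuum permittivity, F/m

        def plate_force(voltage, area_m2, gap_m):
            """Parallel-plate electrostatic force: F = eps0 * A * V^2 / (2 * g^2)."""
            return EPS0 * area_m2 * voltage ** 2 / (2 * gap_m ** 2)

        def pull_in_voltage(k_spring, area_m2, gap0_m):
            """Classic pull-in limit (instability at deflection g0 / 3):
            V_pi = sqrt(8 * k * g0^3 / (27 * eps0 * A))."""
            return math.sqrt(8 * k_spring * gap0_m ** 3 / (27 * EPS0 * area_m2))

        # Hypothetical actuator: 1 mm^2 pad, 10 um gap, 20 N/m stiffness.
        print(plate_force(50.0, 1e-6, 10e-6))      # force in newtons at 50 V
        print(pull_in_voltage(20.0, 1e-6, 10e-6))  # pull-in voltage, ~26 V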

  15. Architecture of Large-Scale Systems

    OpenAIRE

    Koschel, Arne; Astrova, Irina; Deutschkämer, Elena; Ester, Jacob; Feldmann, Johannes

    2013-01-01

    In this paper, various techniques relating to large-scale systems are presented. First, an explanation of large-scale systems and their differences from traditional systems is given. Next, possible specifications and requirements for hardware and software are listed. Finally, examples of large-scale systems are presented.

  16. Agri-Environmental Resource Management by Large-Scale Collective Action: Determining KEY Success Factors

    Science.gov (United States)

    Uetake, Tetsuya

    2015-01-01

    Purpose: Large-scale collective action is necessary when managing agricultural natural resources such as biodiversity and water quality. This paper determines the key factors to the success of such action. Design/Methodology/Approach: This paper analyses four large-scale collective actions used to manage agri-environmental resources in Canada and…

  17. Canada

    International Nuclear Information System (INIS)

    Nuclear research and development in Canada started in the 1940s as a responsibility of the federal government. An engineering design team was established at Chalk River, Ontario, to carry out research on heavy water moderated lattices. A zero-energy heavy water moderated research reactor, ZEEP, was built and achieved criticality in September 1945; it was in fact the first human-made operating reactor outside the USA. In 1947, the 20 MW heavy water moderated national research experimental reactor (NRX) started up. It served as one of the most valuable research reactors in the world, and provided the basis for Canada's development of the very successful CANDU series of pressurised heavy water reactors (PHWR) for power generation. Atomic Energy of Canada Limited (AECL) was established in 1952 as a federal Crown Corporation. It has both a public and a commercial mandate. AECL has overall responsibility for Canada's nuclear research and development programme (its public mandate) as well as for the Canadian reactor design (CANDU), engineering and marketing programme (its commercial mandate). Nuclear energy in Canada is a $5 billion per-year industry, representing about 150 firms, 21 000 direct jobs and 10 000 indirect jobs, and ∼$1.2 billion in exports - the value to the country's economy is much higher than the research and development funding provided by the federal government. The CANDU nuclear reactor system was developed by AECL in close collaboration with the Canadian nuclear industry, and in particular with Ontario Hydro (now Ontario Power Generation). Currently, Canada operates 17 CANDU reactors, which contribute 16% of the country's current electricity consumption. There are also 12 CANDU reactors operating abroad (in Argentina, China, India, the Republic of Korea, Pakistan and Romania). AECL is now developing the 'third generation plus' Advanced CANDU Reactor (ACR-1000), and also has the leading role internationally in developing the Generation IV

  19. Introducing Large-Scale Innovation in Schools

    Science.gov (United States)

    Sotiriou, Sofoklis; Riviou, Katherina; Cherouvis, Stephanos; Chelioti, Eleni; Bogner, Franz X.

    2016-08-01

    Education reform initiatives tend to promise higher effectiveness in classrooms especially when emphasis is given to e-learning and digital resources. Practical changes in classroom realities or school organization, however, are lacking. A major European initiative entitled Open Discovery Space (ODS) examined the challenge of modernizing school education via a large-scale implementation of an open-scale methodology in using technology-supported innovation. The present paper describes this innovation scheme which involved schools and teachers all over Europe, embedded technology-enhanced learning into wider school environments and provided training to teachers. Our implementation scheme consisted of three phases: (1) stimulating interest, (2) incorporating the innovation into school settings and (3) accelerating the implementation of the innovation. The scheme's impact was monitored for a school year using five indicators: leadership and vision building, ICT in the curriculum, development of ICT culture, professional development support, and school resources and infrastructure. Based on about 400 schools, our study produced four results: (1) The growth in digital maturity was substantial, even for previously high scoring schools. This was even more important for indicators such as "vision and leadership" and "professional development." (2) The evolution of networking is presented graphically, showing the gradual growth of connections achieved. (3) These communities became core nodes, involving numerous teachers in sharing educational content and experiences: One out of three registered users (36 %) has shared his/her educational resources in at least one community. (4) Satisfaction scores ranged from 76 % (offer of useful support through teacher academies) to 87 % (good environment to exchange best practices). Initiatives such as ODS add substantial value to schools on a large scale.

  1. Automating large-scale reactor systems

    Energy Technology Data Exchange (ETDEWEB)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig.

  3. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.;

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  4. Canada

    International Nuclear Information System (INIS)

    This paper reports that the Canadian oil and natural gas sector is in for another grim year in 1992. Further streamlining to enhance operating efficiencies and control costs is the first order of the day. About $4 billion worth of producing properties remains on the market, as corporate focus continues to shift to core properties. New management structures put in place in the last two years will be severely tested to improve the sector's financial performance. Massive write-downs in 1990 and 1991 have put balance sheets in much better shape for improved financial performance in the future. Although new long-term debt exceeded redemptions in 1991, largely because of debt-financing of major capital projects, individually most companies are in better shape through significant debt repayment or restructuring. The substantial reductions in interest rates will also help to enhance discretionary cash flow. At this stage, everything appears to be in place to expect that 1992 will represent the bottom of the down-cycle for Canada

  5. Network robustness under large-scale attacks

    CERN Document Server

    Zhou, Qing; Liu, Ruifang

    2012-01-01

    Network Robustness under Large-Scale Attacks provides the analysis of network robustness under attacks, with a focus on large-scale correlated physical attacks. The book begins with a thorough overview of the latest research and techniques to analyze the network responses to different types of attacks over various network topologies and connection models. It then introduces a new large-scale physical attack model coined as area attack, under which a new network robustness measure is introduced and applied to study the network responses. With this book, readers will learn the necessary tools to
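
    The book's area-attack model is specialized, but the underlying robustness measurement is easy to illustrate with a standard simpler setup: remove a fraction of nodes (randomly, or highest-degree first) and track the surviving giant component. The graph model and sizes below are arbitrary illustrative choices, not the book's model.

        import random
        import networkx as nx

        N = 1000  # nodes in the illustrative graph

        def giant_fraction(g):
            """Fraction of the original N nodes in the largest component."""
            if g.number_of_nodes() == 0:
                return 0.0
            return len(max(nx.connected_components(g), key=len)) / N

        def attack(g, fraction, targeted):
            """Remove a fraction of nodes (highest-degree first if targeted,
            uniformly at random otherwise) and measure what survives."""
            g = g.copy()
            order = (sorted(g, key=g.degree, reverse=True) if targeted
                     else random.sample(list(g), g.number_of_nodes()))
            g.remove_nodes_from(order[: int(fraction * N)])
            return giant_fraction(g)

        g = nx.erdos_renyi_graph(N, 0.004, seed=1)
        for f in (0.1, 0.3, 0.5):
            print(f, round(attack(g, f, False), 2), round(attack(g, f, True), 2))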

  6. Synthesis of Small and Large scale Dynamos

    CERN Document Server

    Subramanian, K

    2000-01-01

    Using a closure model for the evolution of magnetic correlations, we uncover an interesting plausible saturated state of the small-scale fluctuation dynamo (SSD) and a novel analogy between quantum mechanical tunneling and the generation of large-scale fields. Large-scale fields develop via the $\alpha$-effect, but as magnetic helicity can only change on a resistive timescale, the time it takes to organize the field into large scales increases with magnetic Reynolds number. This is very similar to the results obtained from simulations using full MHD.

  7. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  8. Curvature constraints from large scale structure

    Science.gov (United States)

    Di Dio, Enea; Montanari, Francesco; Raccanelli, Alvise; Durrer, Ruth; Kamionkowski, Marc; Lesgourgues, Julien

    2016-06-01

    We modified the CLASS code in order to include relativistic galaxy number counts in spatially curved geometries; we present the formalism and study the effect of relativistic corrections on spatial curvature. The new version of the code is now publicly available. Using a Fisher matrix analysis, we investigate how measurements of the spatial curvature parameter ΩK with future galaxy surveys are affected by relativistic effects, which influence observations of the large scale galaxy distribution. These effects include contributions from cosmic magnification, Doppler terms and terms involving the gravitational potential. As an application, we consider angle and redshift dependent power spectra, which are especially well suited for model independent cosmological constraints. We compute our results for a representative deep, wide and spectroscopic survey, and our results show the impact of relativistic corrections on spatial curvature parameter estimation. We show that constraints on the curvature parameter may be strongly biased if, in particular, cosmic magnification is not included in the analysis. Other relativistic effects turn out to be subdominant in the studied configuration. We analyze how the shift in the estimated best-fit value for the curvature and other cosmological parameters depends on the magnification bias parameter, and find that significant biases are to be expected if this term is not properly considered in the analysis.
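
    For readers unfamiliar with the forecasting technique, a Fisher matrix built from derivatives of the observables with respect to the parameters yields the expected parameter errors. The sketch below shows the generic Gaussian-likelihood recipe on a toy two-parameter problem with random placeholder inputs; it is not the paper's CLASS-based computation.

        import numpy as np

        def fisher_matrix(derivs, cov):
            """F_ij = d_i^T C^{-1} d_j for a Gaussian likelihood with a
            parameter-independent data covariance C.
            derivs: array of shape (n_params, n_data)."""
            return derivs @ np.linalg.inv(cov) @ derivs.T

        def marginalized_errors(fisher):
            """1-sigma errors with all other parameters marginalized over."""
            return np.sqrt(np.diag(np.linalg.inv(fisher)))

        # Toy problem: 50 data points, two parameters (think Omega_K and a bias).
        rng = np.random.default_rng(2)
        derivs = rng.normal(size=(2, 50))              # placeholder derivatives
        cov = np.diag(rng.uniform(0.5, 2.0, size=50))  # placeholder covariance
        print(marginalized_errors(fisher_matrix(derivs, cov)))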

  9. Reliable control of large scale flexible structures

    Czech Academy of Sciences Publication Activity Database

    Bakule, Lubomír

    Gdansk: IFAC - GUT, 2007, pp. 1-6. ISSN 1367-5788. [IFAC/IFORS/IMACS/IFIP Symposium on Large Scale Complex Systems: Theory and Applications /11./, Gdansk (PL), 23.07.2007-25.07.2007] R&D Projects: GA AV ČR IAA2075304; GA MŠk(CZ) LA 282. Institutional research plan: CEZ:AV0Z10750506. Keywords: decentralized control * large scale systems * decomposition * reliability * flexible structures * redundancy. Subject RIV: BC - Control Systems Theory

  10. Hierarchical, Hybrid Control Of Large Scale Systems

    OpenAIRE

    Lygeros, John

    1996-01-01

    This dissertation presents a hierarchical, hybrid point of view on the control of large-scale systems. The analysis is based on a new hybrid dynamical system formulation that allows for the modelling of large scale systems in a modular fashion. Three problems are addressed: controller design, closed loop performance verification and the extension of system autonomy. A control scheme based on semi-autonomous agent operation is first proposed. An algorithm, using ideas from game theory, is pres...

  11. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this very position, large-scale research and policy consulting lack the institutional guarantees and rational background assurances that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it judge successfully by the critical standards of established social science, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, on three points, the thesis that this is a new form of institutionalization of science. These are: 1) external control, 2) the organizational form, 3) the theoretical conception of large-scale research and policy consulting. (orig.)

  12. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    Directory of Open Access Journals (Sweden)

    Anil Rao Pimplapure

    2013-06-01

    Full Text Available In recent years, the number, complexity and size of large scale networks have increased. The best example of a large scale network is the Internet; more recent ones are the data centers of cloud environments. Management tasks such as traffic monitoring, security and performance optimization place a heavy burden on network administrators. This research report studies different protocols, i.e. conventional protocols like the Simple Network Management Protocol and newer gossip-based protocols for distributed monitoring and resource management, that are suitable for large-scale networked systems. Results of our simulation studies indicate that, regardless of the system size and failure rates in the monitored system, gossip protocols incur a significantly larger overhead than tree-based protocols for achieving the same monitoring quality, i.e., estimation accuracy or detection delay.
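
    For orientation, the sketch below shows push-sum gossip averaging, a generic gossip aggregation primitive of the kind such monitoring protocols build on (it is an illustration, not one of the specific protocols compared in the report):

        import random

        def push_sum_average(values, rounds=30):
            # Each node keeps a (sum, weight) pair, halves it every round and
            # sends one half to a random peer; sum/weight at every node
            # converges to the global average of the initial values.
            n = len(values)
            s = list(map(float, values))
            w = [1.0] * n
            for _ in range(rounds):
                inbox = [[] for _ in range(n)]
                for i in range(n):
                    peer = random.randrange(n)
                    half_s, half_w = s[i] / 2, w[i] / 2
                    inbox[i].append((half_s, half_w))     # keep one half
                    inbox[peer].append((half_s, half_w))  # gossip the other
                for i in range(n):
                    s[i] = sum(m[0] for m in inbox[i])
                    w[i] = sum(m[1] for m in inbox[i])
            return [s[i] / w[i] for i in range(n)]

        print(push_sum_average([10, 20, 30, 40]))  # every estimate -> 25.0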

  13. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October. [Photo caption: Participants at the energy management for large scale scientific infrastructures workshop.] The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  14. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    ...actor-network-theory (ANT). Transposed to large scale design, it allows us to conceive site and design in terms of active relationships between multiple heterogeneous actors. An actor can be any thing, idea or person that has an effect on the site; from the topography of the landscape over current development plans to... for setting the design brief in a large scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects of possible usages of an actor-network approach for design education as well as design practice on a large scale....

  15. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Experience in operating and developing a large scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases

  16. Large scale numerical simulation for superfluid turbulence

    International Nuclear Information System (INIS)

    Large scale numerical simulation of quantum turbulence is performed by using the 3-D time-dependent Gross-Pitaevskii equation. An energy spectrum obeying the Kolmogorov law and a large scale self-similar structure of the quantum vortex tangle are found in a fully developed damped turbulent state. We confirm that the inertial range of the energy spectrum becomes larger as the system size of the simulation grows, which is consistent with results for normal fluid turbulence. On the other hand, a bottleneck effect near the coherence length prevents the inertial range from extending to smaller scales. (author)

  17. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad;

    2013-01-01

    A power balancing strategy based on Douglas-Rachford splitting is proposed as a control method for large-scale integration of flexible consumers in a Smart Grid. The total power consumption is controlled through a negotiation procedure between all units and a coordinating system level. The balancing problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  18. Modified Newtonian Dynamics of Large Scale Structure

    OpenAIRE

    Nusser, Adi

    2001-01-01

    We examine the implications of Modified Newtonian Dynamics (MOND) on the large scale structure in a Friedmann-Robertson-Walker universe. We employ a ``Jeans swindle'' to write a MOND-type relationship between the fluctuations in the density and the gravitational force, $\vg$. In linear Newtonian theory, $|\vg|$ decreases with time and eventually becomes ...

  19. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  20. Hierarchical Control for Large-Scale Systems

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    A class of large-scale systems, where the overall objective function is a nonlinear function of the performance index of each subsystem, is investigated in this paper. This type of large-scale control problem is non-separable in the sense of conventional hierarchical control. Hierarchical control is extended in the paper to large-scale non-separable control problems, where multiobjective optimization is used as the separation strategy. The large-scale non-separable control problem is embedded, under certain conditions, into a family of weighted Lagrangian formulations. The weighted Lagrangian formulation is separable with respect to subsystems and can be effectively solved using the interaction balance approach at the two lower levels in the proposed three-level solution structure. At the third level, the weighting vector for the weighted Lagrangian formulation is adjusted iteratively to search for the optimal weighting vector with which the optimum of the original large-scale non-separable control problem is obtained. The theoretical basis of the algorithm is established. Simulation shows that the algorithm is effective.
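
    The interaction balance approach used at the two lower levels is a price-coordination scheme; the toy sketch below illustrates the idea on an invented two-subsystem problem (the quadratic costs, the coupling constraint x1 + x2 = 10 and the step size are assumptions, not the paper's formulation):

        # Each subsystem minimizes a_i*(x_i - 2)^2 + lam*x_i on its own; the
        # coordinator adjusts the multiplier lam (dual ascent) until the
        # interaction constraint x1 + x2 = 10 balances.
        def local_min(a, lam):
            return 2.0 - lam / (2.0 * a)  # closed-form local minimizer

        lam, step = 0.0, 0.4
        for _ in range(200):
            x1, x2 = local_min(1.0, lam), local_min(3.0, lam)
            gap = x1 + x2 - 10.0          # interaction balance residual
            lam += step * gap             # coordinator price update
        print(round(x1, 3), round(x2, 3), round(gap, 6))  # gap -> 0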

  1. Inflation, large scale structure and particle physics

    Indian Academy of Sciences (India)

    S F King

    2004-02-01

    We review experimental and theoretical developments in inflation and its application to structure formation, including the curvaton idea. We then discuss a particle physics model of supersymmetric hybrid inflation at the intermediate scale in which the Higgs scalar field is responsible for large scale structure, and show how such a theory is completely natural in the framework of extra dimensions with an intermediate string scale.

  2. Large scale topic modeling made practical

    DEFF Research Database (Denmark)

    Wahlgreen, Bjarne Ørum; Hansen, Lars Kai

    2011-01-01

    Topic models are of broad interest. They can be used for query expansion and result structuring in information retrieval and as an important component in services such as recommender systems and user adaptive advertising. In large scale applications both the size of the database (number of ...) ... topics at par with a much larger case specific vocabulary....

  3. Likelihood analysis of large-scale flows

    CERN Document Server

    Jaffe, A; Jaffe, Andrew; Kaiser, Nick

    1994-01-01

    We apply a likelihood analysis to the data of Lauer & Postman 1994. With P(k) parametrized by (\\sigma_8, \\Gamma), the likelihood function peaks at \\sigma_8\\simeq0.9, \\Gamma\\simeq0.05, indicating at face value very strong large-scale power, though at a level incompatible with COBE. There is, however, a ridge of likelihood such that more conventional power spectra do not seem strongly disfavored. The likelihood calculated using as data only the components of the bulk flow solution peaks at higher \\sigma_8, as suggested by other analyses, but is rather broad. The likelihood incorporating both bulk flow and shear gives a different picture. The components of the shear are all low, and this pulls the peak to lower amplitudes as a compromise. The velocity data alone are therefore {\\em consistent} with models with very strong large scale power which generates a large bulk flow, but the small shear (which also probes fairly large scales) requires that the power would have to be at {\\em very} large scales, which is...

  4. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    ...which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach. It is...

  5. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; Toth, Balazs; Legros, Guillaume; Eigenbrod, Christian; Smirnov, Nickolay; Fujita, Osamu; Jomaas, Grunde

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low-gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests.

  6. Neuroscience research involving older persons in Canada: some legal and neuroethical concerns.

    Science.gov (United States)

    Onyemelukwe, Cheluchi

    2013-09-01

    In this paper, I will examine some legal and ethical issues that arise in relation to neuroscience research involving older persons in Canada. Such research includes research relating to dementia, including Alzheimer's disease. Dementias such as Alzheimer's are organic diseases affecting mainly older persons. They adversely affect mental acuity. There is still much that is unknown about such illnesses, making research on these diseases particularly necessary. In this paper, I will identify and focus on particular concerns with respect to the participation of older persons in research. These concerns are: Inclusion in and Access to Research; Informed Consent; Incidental Findings; and Advance Directives. I will discuss each of these concerns in the context of Canadian research ethics policy and law. The aim of this paper is not to provide an exhaustive discussion of these issues, each of which may rightfully demand a full paper. The aim of this paper is to identify and paint a canvas of these particularly relevant issues, discuss the policy and law on them, identify any existing gaps and propose some solutions to remedy these gaps and protect older persons who participate in research. PMID:24340488

  7. Combining p-values in large scale genomics experiments

    OpenAIRE

    Dmitri V Zaykin; Zhivotovsky, Lev A.; Czika, Wendy; Shao, Susan; Wolfinger, Russell D.

    2007-01-01

    In large-scale genomics experiments involving thousands of statistical tests, such as association scans and microarray expression experiments, a key question is: Which of the L tests represent true associations (TAs)? The traditional way to control false findings is via individual adjustments. In the presence of multiple TAs, p-value combination methods offer certain advantages. Both Fisher’s and Lancaster’s combination methods use an inverse gamma transformation. We identify the relation of ...
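
    For reference, the Fisher combination mentioned above reduces to a chi-square tail probability; a minimal sketch of the textbook method (not the authors' extension) follows:

        import numpy as np
        from scipy import stats

        def fisher_combined_pvalue(pvalues):
            # X = -2 * sum(ln p_i) follows a chi-square distribution with
            # 2L degrees of freedom under the global null, for L independent tests.
            p = np.asarray(pvalues, dtype=float)
            statistic = -2.0 * np.sum(np.log(p))
            return stats.chi2.sf(statistic, df=2 * len(p))

        # Five tests, one of which resembles a true association.
        print(fisher_combined_pvalue([0.8, 0.3, 0.4, 0.0004, 0.6]))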

  8. Role of optometry school in single day large scale school vision testing

    OpenAIRE

    Anuradha, N.; Krishnakumar Ramani

    2015-01-01

    Background: School vision testing aims at identification and management of refractive errors. Large-scale school vision testing using conventional methods is time-consuming and demands a lot of chair time from the eye care professionals. A new strategy involving a school of optometry in single day large scale school vision testing is discussed. Aim: The aim was to describe a new approach of performing vision testing of school children on a large scale in a single day. Materials and Meth...

  9. Quantum Signature of Cosmological Large Scale Structures

    CERN Document Server

    Capozziello, S; De Siena, S; Illuminati, F; Capozziello, Salvatore; Martino, Salvatore De; Siena, Silvio De; Illuminati, Fabrizio

    1998-01-01

    We demonstrate that with every large scale cosmological structure for which gravitation is the only overall relevant interaction assembling the system (e.g. galaxies), there is associated a characteristic unit of action per particle whose order of magnitude coincides with the Planck action constant $h$. This result extends the class of physical systems for which quantum coherence can act on macroscopic scales (as e.g. in superconductivity) and agrees with the absence of screening mechanisms for the gravitational forces, as predicted by some renormalizable quantum field theories of gravity. It also seems to support those lines of thought according to which large scale structures in the Universe should be connected to quantum primordial perturbations, as required by inflation, that the Newton constant should vary with time and distance and, finally, that gravity should be considered as an effective interaction induced by quantization.

  10. Gravitational Wilson Loop and Large Scale Curvature

    OpenAIRE

    Hamber, H.; Williams, R.

    2007-01-01

    In a quantum theory of gravity the gravitational Wilson loop, defined as a suitable quantum average of a parallel transport operator around a large near-planar loop, provides important information about the large-scale curvature properties of the geometry. Here we show that such properties can be systematically computed in the strong coupling limit of lattice regularized quantum gravity, by performing local averages over loop bivectors, and over lattice rotations, using an assumed near-unifo...

  11. Large-scale instabilities of helical flows

    OpenAIRE

    Cameron, Alexandre; Alexakis, Alexandros; Brachet, Marc-Étienne

    2016-01-01

    Large-scale hydrodynamic instabilities of periodic helical flows are investigated using $3$D Floquet numerical computations. A minimal three-mode analytical model that reproduces and explains some of the full Floquet results is derived. The growth-rate $\sigma$ of the most unstable modes (at small scale, low Reynolds number $Re$ and small wavenumber $q$) is found to scale differently in the presence or absence of the anisotropic kinetic alpha (\AKA{}) effect. When an $AKA$ effect is present the s...

  12. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  13. Large Scale Research Project, Daidalos Evaluation Framework

    OpenAIRE

    Cleary, Frances; Ponce de Leon, Miguel; GARCÍA MORENO, Marta; ROMERO VICENTE, Antonio; Roddy, Mark

    2007-01-01

    For large scale research projects operational over a phased timeframe of 2 years or more, taking a step back to evaluate their stance and direction is an important activity for providing relevant feedback and recommendations to guide the project towards success in its consecutive phase. The identification of measurable goals and evaluation profile procedures to work effectively towards a useful evaluation of the project was one of the main aims of the Evaluation taskforce. As part o...

  14. Coordination in Large-Scale Agile Development

    OpenAIRE

    Morken, Ragnar Alexander T

    2014-01-01

    In the last decade agile software development methods have become one of the most popular topics within software engineering. Agile software development is well accepted in small projects among the practitioner community and in recent years there have also been several large-scale projects adopting agile methodologies, but there is little understanding of how such projects achieve effective coordination, which is known to be a critical factor in software engineering. This thesis describes an explorator...

  15. Cedar-a large scale multiprocessor

    Energy Technology Data Exchange (ETDEWEB)

    Gajski, D.; Kuck, D.; Lawrie, D.; Sameh, A.

    1983-01-01

    This paper presents an overview of Cedar, a large scale multiprocessor being designed at the University of Illinois. This machine is designed to accommodate several thousand high performance processors which are capable of working together on a single job, or they can be partitioned into groups of processors where each group of one or more processors can work on separate jobs. Various aspects of the machine are described including the control methodology, communication network, optimizing compiler and plans for construction. 13 references.

  16. Relationships in Large-Scale Graph Computing

    OpenAIRE

    Petrovic, Dan

    2012-01-01

    In 2009 Grzegorz Czajkowski from Google's system infrastructure team published an article which didn't get much attention in the SEO community at the time. It was titled "Large-scale graph computing at Google" and gave an excellent insight into the future of Google's search. This article highlights some of the little-known facts which led to the transformation of Google's algorithm in the last two years.

  17. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    The compatibility of cosmological principles and possible large scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle still compatible with reasonable inhomogeneities is a full conformal symmetry in the 3-space defined by the cosmological velocity field; but even in such a case, the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  18. Revisiting Large Scale Distributed Machine Learning

    OpenAIRE

    Ionescu, Radu Cristian

    2015-01-01

    Nowadays, with the widespread use of smartphones and other portable gadgets equipped with a variety of sensors, data is ubiquitously available, and the focus of machine learning has shifted from being able to infer from small training samples to dealing with large scale high-dimensional data. In domains such as personal healthcare applications, which motivate this survey, distributed machine learning is a promising line of research, both for scaling up learning algorithms, but mostly for dealing wi...

  19. Financing Large-scale EU Infrastructure Projects

    OpenAIRE

    Latreille, Thierry; Sterdyniak, Henri; Veroni, Paola

    2000-01-01

    This study discusses the proposal which consists of financing large-scale infrastructure investments by issuing EU bonds, first introduced by President Delors in his White Paper on “Growth, Competitiveness and Employment” in 1993. This proposal has been partly implemented insofar as the European Commission and the EIB have financed a number of large projects. But it is not considered as a major tool of European economic policy. Monetary policy is the major instrument of short a...

  20. Large-scale magnetic fields in cosmology

    International Nuclear Information System (INIS)

    Despite the widespread presence of magnetic fields, their origin, evolution and role are still not well understood. Primordial magnetism sounds appealing but is not problem free. The magnetic implications for the large-scale structure of the universe still remain an open issue. This paper outlines the advantages and shortcomings of early-time magnetogenesis and the typical role of B-fields in linear structure-formation scenarios.

  1. Large-scale instabilities of helical flows

    CERN Document Server

    Cameron, Alexandre; Brachet, Marc-Étienne

    2016-01-01

    Large-scale hydrodynamic instabilities of periodic helical flows are investigated using $3$D Floquet numerical computations. A minimal three-mode analytical model that reproduces and explains some of the full Floquet results is derived. The growth-rate $\sigma$ of the most unstable modes (at small scale, low Reynolds number $Re$ and small wavenumber $q$) is found to scale differently in the presence or absence of the anisotropic kinetic alpha (\AKA{}) effect. When an $AKA$ effect is present the scaling $\sigma \propto q\; Re\,$ predicted by the $AKA$ effect theory [U. Frisch, Z. S. She, and P. L. Sulem, Physica D: Nonlinear Phenomena 28, 382 (1987)] is recovered for $Re\ll 1$ as expected (with most of the energy of the unstable mode concentrated in the large scales). However, as $Re$ increases, the growth-rate is found to saturate and most of the energy is found at small scales. In the absence of an \AKA{} effect, it is found that flows can still have large-scale instabilities, but with a negative eddy-viscosity sca...

  2. The consistency problems of large scale structure

    International Nuclear Information System (INIS)

    The combined problems of large scale structure, the need for non-baryonic dark matter if Ω = 1, and the need to make galaxies early in the history of the universe seem to be placing severe constraints on cosmological models. In addition, it is shown that the bulk of the baryonic matter is also dark and must be accounted for as well. The nucleosynthesis arguments are now strongly supported by high energy collider experiments as well as astronomical abundance data. The arguments for dark matter are reviewed and it is shown that observational dynamical arguments and nucleosynthesis are all still consistent at Ω ≈ 0.1. However, the inflation paradigm requires Ω = 1, thus the need for non-baryonic dark matter. A non-zero cosmological constant is argued to be an inappropriate solution. Dark matter candidates fall into two categories, hot (neutrino-like) and cold (axion or massive photino-like). New observations of large scale structure in the universe (voids, foam, and large scale velocity fields) seem to be most easily understood if the dominant matter of the universe is in the form of low mass (9 eV ≤ m_ν ≤ 35 eV) neutrinos. Cold dark matter, even with biasing, seems unable to duplicate the combination of these observations (of particular significance here are the large velocity fields, if real). However, galaxy formation is difficult with hot matter. The potentially fatal problems of galaxy formation with neutrinos may be remedied by combining them with either cosmic strings or explosive galaxy formation. The former naturally gives the scale-free correlation function for galaxies, clusters, and superclusters

  3. Nonthermal Components in the Large Scale Structure

    Science.gov (United States)

    Miniati, Francesco

    2004-12-01

    I address the issue of nonthermal processes in the large scale structure of the universe. After reviewing the properties of cosmic shocks and their role as particle accelerators, I discuss the main observational results, from radio to γ-ray, and describe the processes that are thought to be responsible for the observed nonthermal emissions. Finally, I emphasize the important role of γ-ray astronomy for progress in the field. Non-detections at these photon energies have already allowed us to draw important conclusions. Future observations will tell us more about the physics of the intracluster medium, shock dissipation and CR acceleration.

  4. Large-scale planar lightwave circuits

    Science.gov (United States)

    Bidnyk, Serge; Zhang, Hua; Pearson, Matt; Balakrishnan, Ashok

    2011-01-01

    By leveraging advanced wafer processing and flip-chip bonding techniques, we have succeeded in hybrid integrating a myriad of active optical components, including photodetectors and laser diodes, with our planar lightwave circuit (PLC) platform. We have combined hybrid integration of active components with monolithic integration of other critical functions, such as diffraction gratings, on-chip mirrors, mode-converters, and thermo-optic elements. Further process development has led to the integration of polarization controlling functionality. Most recently, all these technological advancements have been combined to create large-scale planar lightwave circuits that comprise hundreds of optical elements integrated on chips less than a square inch in size.

  5. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)

  6. Large scale phononic metamaterials for seismic isolation

    International Nuclear Information System (INIS)

    In this work, we numerically examine structures that could be characterized as large scale phononic metamaterials. These novel structures could have band gaps in the frequency spectrum of seismic waves when their dimensions are chosen appropriately, thus raising the prospect that they could be serious candidates for seismic isolation structures. Different, easy-to-fabricate structures made from construction materials such as concrete and steel were examined. The well-known finite difference time domain method is used in our calculations in order to compute the band structures of the proposed metamaterials

  7. Large scale breeder reactor pump dynamic analyses

    International Nuclear Information System (INIS)

    The lateral natural frequency and vibration response analyses of the Large Scale Breeder Reactor (LSBR) primary pump were performed as part of the total dynamic analysis effort to obtain the fabrication release. The special features of pump modeling are outlined in this paper. The analysis clearly demonstrates the method of increasing the system natural frequency by reducing the generalized mass without significantly changing the generalized stiffness of the structure. Also, a method of computing the maximum relative and absolute steady state responses and associated phase angles at given locations is provided. This type of information is very helpful in generating response versus frequency and phase angle versus frequency plots

  8. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we have now securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  9. Large scale integration of photovoltaics in cities

    International Nuclear Information System (INIS)

    Highlights: ► We implement photovoltaics on a large scale. ► We use three-dimensional modelling for accurate photovoltaic simulations. ► We consider the shadowing effect in the photovoltaic simulation. ► We validate the simulated results using detailed hourly measured data. - Abstract: For a large scale implementation of photovoltaics (PV) in the urban environment, building integration is a major issue. This includes installations on roof or facade surfaces with orientations that are not ideal for maximum energy production. To evaluate the performance of PV systems in urban settings and compare it with the building user’s electricity consumption, three-dimensional geometry modelling was combined with photovoltaic system simulations. As an example, the modern residential district of Scharnhauser Park (SHP) near Stuttgart, Germany was used to calculate the potential of photovoltaic energy and to evaluate the local own-consumption of the energy produced. For most buildings of the district only annual electrical consumption data was available and only selected buildings have electronic metering equipment. The available roof area for one of these multi-family case study buildings was used for a detailed hourly simulation of the PV power production, which was then compared to the hourly measured electricity consumption. The results were extrapolated to all buildings of the analyzed area by normalizing them to the annual consumption data. The PV systems can produce 35% of the quarter’s total electricity consumption and half of this generated electricity is directly used within the buildings.

  10. Large-Scale Clustering in Bubble Models

    CERN Document Server

    Borgani, S

    1993-01-01

    We analyze the statistical properties of bubble models for the large-scale distribution of galaxies. To this aim, we realize static simulations, in which galaxies are mostly randomly arranged in the regions surrounding bubbles. As a first test, we realize simulations of the Lick map, by suitably projecting the three-dimensional simulations. In this way, we are able to safely compare the angular correlation function implied by a bubbly geometry to that of the APM sample. We find that several bubble models provide an adequate amount of large-scale correlation, which nicely fits that of APM galaxies. Further, we apply the statistics of the count-in-cell moments to the three-dimensional distribution and compare them with available observational data on variance, skewness and kurtosis. Based on our purely geometrical constructions, we find a well defined hierarchical scaling of higher order moments up to scales $\sim 70\hm$. The overall emerging picture is that the bubbly geometry is well suited to reproduce ...
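
    The count-in-cell moments used above are straightforward to compute operationally; a minimal sketch on an invented (uniform, rather than bubbly) point set, purely to show the mechanics:

        import numpy as np

        # Drop points into cubic cells, then compute variance, skewness and
        # kurtosis of the normalized counts (density contrast delta).
        rng = np.random.default_rng(1)
        points = rng.random((100_000, 3))   # toy 3D point distribution
        ncells = 10
        idx = np.floor(points * ncells).astype(int)
        flat = np.ravel_multi_index(idx.T, (ncells,) * 3)
        counts = np.bincount(flat, minlength=ncells**3).astype(float)

        delta = counts / counts.mean() - 1.0
        var = np.mean(delta**2)
        skew = np.mean(delta**3) / var**1.5
        kurt = np.mean(delta**4) / var**2 - 3.0
        print(var, skew, kurt)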

  11. Study on the large scale dynamo transition

    CERN Document Server

    Nigro, Giuseppina

    2010-01-01

    Using the magnetohydrodynamic (MHD) description, we develop a nonlinear dynamo model that couples the evolution of the large scale magnetic field with the turbulent dynamics of the plasma at small scale via the electromotive force (e.m.f.) in the induction equation at large scale. The nonlinear behavior of the plasma at small scale is described by using an MHD shell model for the velocity field and magnetic field fluctuations. The shell model allows us to study this problem in a large parameter regime which characterizes the dynamo phenomenon in many natural systems and which is beyond the power of supercomputers today. Under specific conditions of the plasma turbulent state, the field fluctuations at small scales are able to trigger the dynamo instability. We study this transition considering the stability curve, which shows a strong decrease in the critical magnetic Reynolds number for increasing inverse magnetic Prandtl number $\textrm{Pm}^{-1}$ in the range $[10^{-6},1]$ and an increase in the range $[1,10^{8}]$. We...

  12. Large-scale baseload wind power in China

    International Nuclear Information System (INIS)

    This paper presents a novel strategy for developing wind power in large-scale (multi-GW) wind farms in China. It involves combining oversized wind farms, large-scale electrical storage and long-distance transmission lines to deliver 'baseload wind power' to distant electricity demand centers. Baseload wind power is typically more valuable to the electric utility than intermittent wind power, so that storage can be economically attractive even in instances where the cost per kWh is somewhat higher than without storage. The prospective costs for this approach to developing wind power are illustrated by modifying an oversized wind farm at Huitengxile, Inner Mongolia. The site has an average power density of 580 W/m2 at 50 m hub heights and is located 500 km north of Beijing. Using locally mass-produced wind turbines there are good prospects that wind power would be cost-competitive with coal power, on a lifecycle cost basis, while providing substantial net environmental benefits. Finally, the institutional challenges related to the prospect of large-scale wind energy development are addressed. Especially important are policies aimed at developing the capacity for mass production of as much of this technology in China as is feasible. Promising instruments for speeding up the introduction of this technology include: (i) international joint ventures between foreign vendors and developers and Chinese manufacturers; and (ii) wind resource development concessions. (author)

  13. Multivariate Clustering of Large-Scale Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Critchlow, T

    2003-03-04

    Simulations of complex scientific phenomena involve the execution of massively parallel computer programs. These simulation programs generate large-scale data sets over the spatiotemporal space. Modeling such massive data sets is an essential step in helping scientists discover new information from their computer simulations. In this paper, we present a simple but effective multivariate clustering algorithm for large-scale scientific simulation data sets. Our algorithm utilizes the cosine similarity measure to cluster the field variables in a data set. Field variables include all variables except the spatial (x, y, z) and temporal (time) variables. The exclusion of the spatial space is important since 'similar' characteristics could be located (spatially) far from each other. To scale our multivariate clustering algorithm for large-scale data sets, we take advantage of the geometrical properties of the cosine similarity measure. This allows us to reduce the modeling time from O(n^2) to O(n x g(f(u))), where n is the number of data points, f(u) is a function of the user-defined clustering threshold, and g(f(u)) is the number of data points satisfying the threshold f(u). We show that on average g(f(u)) is much less than n. Finally, even though spatial variables do not play a role in building a cluster, it is desirable to associate each cluster with its correct spatial space. To achieve this, we present a linking algorithm for connecting each cluster to the appropriate nodes of the data set's topology tree (where the spatial information of the data set is stored). Our experimental evaluations on two large-scale simulation data sets illustrate the value of our multivariate clustering and linking algorithms.
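
    The core operation described above is thresholded grouping of field-variable vectors by cosine similarity; a minimal greedy sketch follows (single-pass assignment and the threshold value are assumptions made here; the paper's O(n x g(f(u))) optimization and topology-tree linking are not reproduced):

        import numpy as np

        def cosine_cluster(vectors, threshold=0.99):
            # Each vector joins the first cluster whose representative it
            # matches with cosine similarity >= threshold, else it starts a
            # new cluster.
            reps, clusters = [], []
            for v in np.asarray(vectors, dtype=float):
                u = v / np.linalg.norm(v)
                for i, r in enumerate(reps):
                    if np.dot(u, r) >= threshold:
                        clusters[i].append(v)
                        break
                else:
                    reps.append(u)
                    clusters.append([v])
            return clusters

        # Field variables sampled at 4 spatial points each.
        data = [[1, 2, 3, 4], [2, 4, 6, 8], [4, 3, 2, 1]]
        print([len(c) for c in cosine_cluster(data)])  # -> [2, 1]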

  14. Multivariate Clustering of Large-Scale Scientific Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Critchlow, T

    2003-06-13

    Simulations of complex scientific phenomena involve the execution of massively parallel computer programs. These simulation programs generate large-scale data sets over the spatio-temporal space. Modeling such massive data sets is an essential step in helping scientists discover new information from their computer simulations. In this paper, we present a simple but effective multivariate clustering algorithm for large-scale scientific simulation data sets. Our algorithm utilizes the cosine similarity measure to cluster the field variables in a data set. Field variables include all variables except the spatial (x, y, z) and temporal (time) variables. The exclusion of the spatial dimensions is important since 'similar' characteristics could be located (spatially) far from each other. To scale our multivariate clustering algorithm for large-scale data sets, we take advantage of the geometrical properties of the cosine similarity measure. This allows us to reduce the modeling time from O(n^2) to O(n x g(f(u))), where n is the number of data points, f(u) is a function of the user-defined clustering threshold, and g(f(u)) is the number of data points satisfying f(u). We show that on average g(f(u)) is much less than n. Finally, even though spatial variables do not play a role in building clusters, it is desirable to associate each cluster with its correct spatial region. To achieve this, we present a linking algorithm for connecting each cluster to the appropriate nodes of the data set's topology tree (where the spatial information of the data set is stored). Our experimental evaluations on two large-scale simulation data sets illustrate the value of our multivariate clustering and linking algorithms.

  15. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some of... ...problems on general form, and a new extension to the methods has been developed for this purpose. The L-curve method is one among several parameter choice methods that can be used in connection with the solution of inverse problems. A part of the work has resulted in a new heuristic for the localization of the corner of a discrete L-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe...

  16. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    Full Text Available In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority, considering the minimum area condition in cartographic generalization in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
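
    The generalization step is a focal majority filter; a minimal sketch on an invented classified raster, with scipy's generic_filter standing in for the GIS Focal Statistics (Majority) tool:

        import numpy as np
        from scipy import ndimage

        def majority(window):
            # Most frequent landform class in the neighborhood.
            vals, counts = np.unique(window.astype(int), return_counts=True)
            return vals[np.argmax(counts)]

        classes = np.random.default_rng(0).integers(0, 4, size=(8, 8))
        generalized = ndimage.generic_filter(classes, majority, size=3)
        print(generalized)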

  17. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets in the order of several Petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  18. Radiations: large scale monitoring in Japan

    International Nuclear Information System (INIS)

    As the consequences of radioactive leaks on their health are a matter of concern for Japanese people, a large scale epidemiological study has been launched by Fukushima Medical University. It concerns the two million inhabitants of the Fukushima Prefecture. At the national level and with the support of public funds, medical care and follow-up, as well as systematic controls, are foreseen, notably to check the thyroids of 360,000 young people under 18 years old and of 20,000 pregnant women in the Fukushima Prefecture. Some measurements have already been performed on young children. Despite the sometimes rather low measured levels, and because they know that some parts of the area are at least as contaminated as the area around Chernobyl was, some people are reluctant to go back home

  19. Clumps in large scale relativistic jets

    CERN Document Server

    Tavecchio, F; Celotti, A

    2003-01-01

    The relatively intense X-ray emission from large scale (tens to hundreds of kpc) jets discovered with Chandra likely implies that jets (at least in powerful quasars) are still relativistic at those distances from the active nucleus. In this case the emission is due to Compton scattering off seed photons provided by the Cosmic Microwave Background; on one hand this permits magnetic fields close to equipartition with the emitting particles, and on the other hand it minimizes the requirements on the total power carried by the jet. The emission comes from compact (kpc scale) knots, and here we investigate what we can predict about the possible emission between the bright knots. This is motivated by the fact that bulk relativistic motion makes Compton scattering off the CMB photons efficient even when electrons are cold or mildly relativistic in the comoving frame. This implies relatively long cooling times, dominated by adiabatic losses. Therefore the relativistically moving plasma can emit, by Compton sc...

  20. Large scale water lens for solar concentration.

    Science.gov (United States)

    Mondol, A S; Vogel, B; Bastian, G

    2015-06-01

    Properties of large scale water lenses for solar concentration were investigated. These lenses were built from readily available materials, normal tap water and hyper-elastic linear low density polyethylene foil. Exposed to sunlight, the focal lengths and light intensities in the focal spot were measured and calculated. Their optical properties were modeled with a raytracing software based on the lens shape. We have achieved a good match of experimental and theoretical data by considering wavelength dependent concentration factor, absorption and focal length. The change in light concentration as a function of water volume was examined via the resulting load on the foil and the corresponding change of shape. The latter was extracted from images and modeled by a finite element simulation. PMID:26072893
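
    For orientation, the focal length of such a lens can be estimated with the thin plano-convex lens formula; the numbers below are assumed for illustration (the paper itself models the measured foil shape with raytracing and a finite element simulation):

        # Thin plano-convex lens: f = R / (n - 1), with n the refractive index
        # of water and R the radius of curvature of the curved face.
        n_water = 1.33  # visible-light refractive index of water
        R = 0.5         # assumed radius of curvature in metres
        print(f"focal length ~ {R / (n_water - 1.0):.2f} m")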

  1. Supporting large-scale computational science

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R., LLNL

    1998-02-19

    Business needs have driven the development of commercial database systems since their inception. As a result, there has been a strong focus on supporting many users, minimizing the potential corruption or loss of data, and maximizing performance metrics like transactions per second, or TPC-C and TPC-D results. It turns out that these optimizations have little to do with the needs of the scientific community, and in particular have little impact on improving the management and use of large-scale high-dimensional data. At the same time, there is an unanswered need in the scientific community for many of the benefits offered by a robust DBMS. For example, tying an ad-hoc query language such as SQL together with a visualization toolkit would be a powerful enhancement to current capabilities. Unfortunately, there has been little emphasis or discussion in the VLDB community on this mismatch over the last decade. The goal of the paper is to identify the specific issues that need to be resolved before large-scale scientific applications can make use of DBMS products. This topic is addressed in the context of an evaluation of commercial DBMS technology applied to the exploration of data generated by the Department of Energy's Accelerated Strategic Computing Initiative (ASCI). The paper describes the data being generated for ASCI as well as current capabilities for interacting with and exploring this data. The attraction of applying standard DBMS technology to this domain is discussed, as well as the technical and business issues that currently make this an infeasible solution.

  2. Supporting large-scale computational science

    Energy Technology Data Exchange (ETDEWEB)

    Musick, R

    1998-10-01

    A study has been carried out to determine the feasibility of using commercial database management systems (DBMSs) to support large-scale computational science. Conventional wisdom in the past has been that DBMSs are too slow for such data. Several events over the past few years have muddied the clarity of this mindset: 1. Several commercial DBMS systems have demonstrated storage and ad-hoc query access to Terabyte data sets. 2. Several large-scale science teams, such as EOSDIS [NAS91], high energy physics [MM97] and human genome [Kin93] have adopted (or make frequent use of) commercial DBMS systems as the central part of their data management scheme. 3. Several major DBMS vendors have introduced their first object-relational products (ORDBMSs), which have the potential to support large, array-oriented data. 4. In some cases, performance is a moot issue. This is true in particular if the performance of legacy applications is not reduced while new, albeit slow, capabilities are added to the system. The basic assessment is still that DBMSs do not scale to large computational data. However, many of the reasons have changed, and there is an expiration date attached to that prognosis. This document expands on this conclusion, identifies the advantages and disadvantages of various commercial approaches, and describes the studies carried out in exploring this area. The document is meant to be brief, technical and informative, rather than a motivational pitch. The conclusions within are very likely to become outdated within the next 5-7 years, as market forces will have a significant impact on the state of the art in scientific data management over the next decade.

  3. Technologies and challenges in large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Engholm-Keller, Kasper; Larsen, Martin Røssel

    2013-01-01

    ...apoptosis, rely on phosphorylation. This PTM is thus involved in many diseases, rendering localization and assessment of extent of phosphorylation of major scientific interest. MS-based phosphoproteomics, which aims at describing all phosphorylation sites in a specific type of cell, tissue, or organism, has become the main technique for discovery and characterization of phosphoproteins in a nonhypothesis driven fashion. In this review, we describe methods for state-of-the-art MS-based analysis of protein phosphorylation as well as the strategies employed in large-scale phosphoproteomic experiments with...

  4. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    ...likelihood that a change in the activity of one neuronal population affects the activity in another. We argue that these measures access the inherently probabilistic nature of causal influences in the brain, and are thus better suited for large-scale brain network analysis than are DC-based measures. Our work is consistent with recent advances in the philosophical study of probabilistic causality, which originated from inherent conceptual problems with deterministic regularity theories. It also resonates with concepts of stochasticity that were involved in establishing modern physics. In summary, we argue that probabilistic causality is a conceptually appropriate foundation for describing neural causality in the brain.

  5. Analysis and Management of Large-Scale Activities Based on Interface

    Science.gov (United States)

    Yang, Shaofan; Ji, Jingwei; Lu, Ligang; Wang, Zhiyi

    Based on the concepts of system safety engineering, life cycle and interface that come from the American system safety standard MIL-STD-882E, we apply them to the risk analysis and management of large-scale activities. We identify the personnel, departments, funds and other elements involved throughout the life cycle of large-scale activities. We recognize and classify the ultimate risk sources (people, objects and environment) of large-scale activities from the perspective of interfaces. We put forward an accident cause analysis model based on accidents at previous large-scale activities, combined with analysis of the risk source interfaces. We analyze the risks at each interface and summarize the various types of risks that large-scale activities face. Finally, we propose improvements in risk management awareness, policies and regulations, risk control, and supervision departments.

  6. Large scale probabilistic available bandwidth estimation

    CERN Document Server

    Thouin, Frederic; Rabbat, Michael

    2010-01-01

    The common utilization-based definition of available bandwidth and many of the existing tools to estimate it suffer from several important weaknesses: i) most tools report a point estimate of average available bandwidth over a measurement interval and do not provide a confidence interval; ii) the commonly adopted models used to relate the available bandwidth metric to the measured data are invalid in almost all practical scenarios; iii) existing tools do not scale well and are not suited to the task of multi-path estimation in large-scale networks; iv) almost all tools use ad-hoc techniques to address measurement noise; and v) tools do not provide enough flexibility in terms of accuracy, overhead, latency and reliability to adapt to the requirements of various applications. In this paper we propose a new definition for available bandwidth and a novel framework that addresses these issues. We define probabilistic available bandwidth (PAB) as the largest input rate at which we can send a traffic flow along a pa...
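
    As a hedged reading of the (truncated) definition above, the probabilistic quantity can be written as the largest sending rate whose probability of being sustained along the path stays above a target confidence level; the notation below (rate r, confidence 1 - ε) is our reconstruction for illustration, not the paper's own:

        \[
          \mathrm{PAB} \;=\; \max\bigl\{\, r \;:\; \Pr\bigl[\text{a flow sent at rate } r \text{ traverses the path uncongested}\bigr] \,\ge\, 1 - \varepsilon \,\bigr\}
        \]

    Framing the estimate this way is what allows a tool to report a confidence interval rather than a point estimate, and lets applications trade accuracy against probing overhead by choosing ε.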

  7. Food appropriation through large scale land acquisitions

    International Nuclear Information System (INIS)

    The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions (LSLAs) for commercial farming will bring the technology required to close the existing crop yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with LSLAs. We show how up to 300–550 million people could be fed by crops grown in the acquired land, should these investments in agriculture improve crop production and close the yield gap. In contrast, about 190–370 million people could be supported by this land without closing the yield gap. These numbers raise some concern because the food produced in the acquired land is typically exported to other regions, while the target countries exhibit high levels of malnourishment. Conversely, if used for domestic consumption, the crops harvested in the acquired land could ensure food security to the local populations. (letter)

  8. Large-Scale Clustering of Cosmic Voids

    CERN Document Server

    Chan, Kwan Chuen; Desjacques, Vincent

    2014-01-01

    We study the clustering of voids using $N$-body simulations and simple theoretical models. The excursion-set formalism describes fairly well the abundance of voids identified with the watershed algorithm, although the void formation threshold required is quite different from the spherical collapse value. The void cross bias $b_{\\rm c} $ is measured and its large-scale value is found to be consistent with the peak background split results. A simple fitting formula for $b_{\\rm c} $ is found. We model the void auto-power spectrum taking into account the void biasing and exclusion effect. A good fit to the simulation data is obtained for voids with radii $\\gtrsim$ 30 Mpc/$h$, especially when the void biasing model is extended to 1-loop order. However, the best fit bias parameters do not agree well with the peak-background split results. Being able to fit the void auto-power spectrum is particularly important not only because it is the direct observable in galaxy surveys; our method enables us to treat the bias pa...

  9. Large-scale wind turbine structures

    Science.gov (United States)

    Spera, David A.

    1988-05-01

    The purpose of this presentation is to show how structural technology was applied in the design of modern wind turbines, which were recently brought to an advanced stage of development as sources of renewable power. Wind turbine structures present many difficult problems because they are relatively slender and flexible; subject to vibration and aeroelastic instabilities; acted upon by loads which are often nondeterministic; operated continuously with little maintenance in all weather; and dominated by life-cycle cost considerations. Progress in horizontal-axis wind turbine (HAWT) development was paced by progress in the understanding of structural loads, modeling of structural dynamic response, and design of innovative structures. During the past 15 years a series of large HAWTs was developed. This has culminated in the recent completion of the world's largest operating wind turbine, the 3.2 MW Mod-5B power plant installed on the island of Oahu, Hawaii. Some of the applications of structures technology to wind turbines will be illustrated by referring to the Mod-5B design. First, a video overview will be presented to provide familiarization with the Mod-5B project and the important components of the wind turbine system. Next, the structural requirements for large-scale wind turbines will be discussed, emphasizing the difficult fatigue-life requirements. Finally, the procedures used to design the structure will be presented, including the use of the fracture mechanics approach for determining allowable fatigue stresses.

  10. Large scale production of tungsten-188

    International Nuclear Information System (INIS)

    Tungsten-188 is produced in a fission nuclear reactor by double neutron capture on 186W. The authors have explored the large-scale production yield (100-200 mCi) of 188W from the ORNL High Flux Isotope Reactor (HFIR) and compared these data with the experimental data available from other reactors and with theoretical calculations. The experimental yield of 188W at EOB from the HFIR operating at 85 MWt power, for a one-cycle irradiation (~21 days) at a thermal neutron flux of 2x10^15 n s^-1 cm^-2, is 4 mCi/mg of 186W. This value is lower than the theoretical value by almost a factor of five. However, for a one-day irradiation at the Brookhaven High Flux Beam Reactor, the yield of 188W is lower than the theoretical value by a factor of two. Factors responsible for these low production yields, and the yields of the 187W intermediate radionuclide from several targets, are discussed
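
    To make the double-capture arithmetic concrete, the sketch below integrates a simplified 186W -> 187W -> 188W capture/decay chain at the quoted HFIR flux. The cross sections and half-lives are textbook-order assumptions supplied for illustration, not values from the record, and burn-up of the 188W product is ignored:

        import numpy as np

        # Illustrative nuclear data (assumed, not taken from the record).
        PHI = 2e15                              # flux from the record, n s^-1 cm^-2
        SIGMA_186 = 38e-24                      # 186W(n,gamma), cm^2 (~38 b, assumed)
        SIGMA_187 = 64e-24                      # 187W(n,gamma), cm^2 (~64 b, assumed)
        LAMBDA_187 = np.log(2) / (23.7 * 3600)  # 187W decay, T1/2 ~ 23.7 h
        LAMBDA_188 = np.log(2) / (69.8 * 86400) # 188W decay, T1/2 ~ 69.8 d

        def irradiate(days, dt=60.0):
            """Euler integration of 186W -> 187W -> 188W; returns the fraction
            of the initial 186W atoms sitting in 188W at end of bombardment."""
            n186, n187, n188 = 1.0, 0.0, 0.0
            for _ in range(int(days * 86400 / dt)):
                d186 = -SIGMA_186 * PHI * n186
                d187 = SIGMA_186 * PHI * n186 - (SIGMA_187 * PHI + LAMBDA_187) * n187
                d188 = SIGMA_187 * PHI * n187 - LAMBDA_188 * n188
                n186 += d186 * dt
                n187 += d187 * dt
                n188 += d188 * dt
            return n188

        print(irradiate(21))   # one ~21-day HFIR cycle at the quoted flux

    The point of the sketch is only that the product grows roughly quadratically in flux and irradiation time, which is why double-capture yields are so sensitive to reactor conditions, and why effects ignored here (product burn-up, resonance captures) can pull real yields well below such estimates.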

  11. Large Scale Computer Simulation of Erythrocyte Membranes

    Science.gov (United States)

    Harvey, Cameron; Revalee, Joel; Laradji, Mohamed

    2007-11-01

    The cell membrane is crucial to the life of the cell. Apart from partitioning the inner and outer environment of the cell, it also acts as a support for complex and specialized molecular machinery, important both for the mechanical integrity of the cell and for its multitude of physiological functions. Due to its relative simplicity, the red blood cell has been a favorite experimental prototype for investigations of the structural and functional properties of the cell membrane. The erythrocyte membrane is a composite quasi-two-dimensional structure composed essentially of a self-assembled fluid lipid bilayer and a polymerized protein meshwork, referred to as the cytoskeleton or membrane skeleton. In the case of the erythrocyte, the polymer meshwork is mainly composed of spectrin, anchored to the bilayer through specialized proteins. Using a coarse-grained model of self-assembled lipid membranes with implicit solvent and soft-core potentials, recently developed by us, we simulated large-scale red-blood-cell bilayers with dimensions of ~10^-1 μm^2, with explicit cytoskeleton. Our aim is to investigate the renormalization of the elastic properties of the bilayer due to the underlying spectrin meshwork.

  12. Developing Large-Scale Bayesian Networks by Composition

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, we investigate the use of Bayesian networks to construct large-scale diagnostic systems. In particular, we consider the development of large-scale...

  13. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology. Enables practitioners to study distributed large scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  14. Finding New Malicious Domains Using Variational Bayes on Large-Scale Computer Network Data

    Czech Academy of Sciences Publication Activity Database

    Létal, V.; Pevný, T.; Šmídl, Václav; Somol, Petr

    Montréal, Canada: NIPS, 2015, pp. 1-10. [NIPS workshop: Advances in Approximate Bayesian Inference. Montreal (CA), 11.12.2015] Other grants: GA ČR(CZ) GA15-08916S Institutional support: RVO:67985556 Keywords: variational bayes * malicious domain detection * large scale network Subject RIV: BD - Theory of Information http://library.utia.cas.cz/separaty/2016/AS/smidl-0455622.pdf

  15. Sensitivity technologies for large scale simulation

    International Nuclear Information System (INIS)

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity type analysis into existing code and equally important, the work was focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first
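
    The split between the direct and the adjoint route that this abstract relies on can be made concrete on a toy steady-state problem. The sketch below is a minimal illustration under an assumed linear model f(x, p) = A x - b(p) = 0 with objective J = g.x; it is not the report's PDE-scale machinery:

        import numpy as np

        # Toy steady state A x = b(p): A plays the Jacobian df/dx, and b
        # depends on two parameters p. Objective J(x) = g . x.
        A = np.array([[4.0, 1.0],
                      [1.0, 3.0]])
        def b(p):
            return np.array([p[0], p[0] * p[1]])
        def db_dp(p):                         # d b_i / d p_j
            return np.array([[1.0, 0.0],
                             [p[1], p[0]]])

        p = np.array([2.0, 0.5])
        x = np.linalg.solve(A, b(p))          # the state at this design point
        g = np.array([1.0, 1.0])              # dJ/dx

        # Direct method: one linear solve per parameter (columns of dx/dp).
        dx_dp = np.linalg.solve(A, db_dp(p))  # A (dx/dp) = db/dp
        dJ_dp_direct = g @ dx_dp

        # Adjoint method: a single transposed solve, independent of the
        # number of parameters.
        lam = np.linalg.solve(A.T, g)         # A^T lambda = dJ/dx
        dJ_dp_adjoint = lam @ db_dp(p)

        assert np.allclose(dJ_dp_direct, dJ_dp_adjoint)
        print(x, dJ_dp_adjoint)

    The punchline mirrors the report's motivation: the direct method needs one solve per parameter, while the adjoint method needs one transposed solve regardless of how many parameters there are.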

  16. Sensitivity technologies for large scale simulation.

    Energy Technology Data Exchange (ETDEWEB)

    Collis, Samuel Scott; Bartlett, Roscoe Ainsworth; Smith, Thomas Michael; Heinkenschloss, Matthias (Rice University, Houston, TX); Wilcox, Lucas C. (Brown University, Providence, RI); Hill, Judith C. (Carnegie Mellon University, Pittsburgh, PA); Ghattas, Omar (Carnegie Mellon University, Pittsburgh, PA); Berggren, Martin Olof (University of UppSala, Sweden); Akcelik, Volkan (Carnegie Mellon University, Pittsburgh, PA); Ober, Curtis Curry; van Bloemen Waanders, Bart Gustaaf; Keiter, Eric Richard

    2005-01-01

    Sensitivity analysis is critically important to numerous analysis algorithms, including large scale optimization, uncertainty quantification, reduced order modeling, and error estimation. Our research focused on developing tools, algorithms and standard interfaces to facilitate the implementation of sensitivity type analysis into existing code and equally important, the work was focused on ways to increase the visibility of sensitivity analysis. We attempt to accomplish the first objective through the development of hybrid automatic differentiation tools, standard linear algebra interfaces for numerical algorithms, time domain decomposition algorithms and two level Newton methods. We attempt to accomplish the second goal by presenting the results of several case studies in which direct sensitivities and adjoint methods have been effectively applied, in addition to an investigation of h-p adaptivity using adjoint based a posteriori error estimation. A mathematical overview is provided of direct sensitivities and adjoint methods for both steady state and transient simulations. Two case studies are presented to demonstrate the utility of these methods. A direct sensitivity method is implemented to solve a source inversion problem for steady state internal flows subject to convection diffusion. Real time performance is achieved using novel decomposition into offline and online calculations. Adjoint methods are used to reconstruct initial conditions of a contamination event in an external flow. We demonstrate an adjoint based transient solution. In addition, we investigated time domain decomposition algorithms in an attempt to improve the efficiency of transient simulations. Because derivative calculations are at the root of sensitivity calculations, we have developed hybrid automatic differentiation methods and implemented this approach for shape optimization for gas dynamics using the Euler equations. The hybrid automatic differentiation method was applied to a first

  17. Very Large-Scale Integrated Processor

    Directory of Open Access Journals (Sweden)

    Shigeyuki Takano

    2013-01-01

    Full Text Available In the near future, improvements in semiconductor technology will allow thousands of resources to be implementable on chip. However, a limitation remains for both single large-scale processors and many-core processors. For single processors, this limitation arises from their design complexity; for many-core processors, an application must be partitioned into several tasks and these partitioned tasks mapped onto the cores. In this article, we propose a dynamic chip multiprocessor (CMP) model that consists of simple modules (realizing a low design complexity) and does not require application partitioning, since the scale of the processor is dynamically variable, scaling up or down on demand. This model is based on prior work on adaptive processors that can gather and release resources on chip to dynamically form a processor. The adaptive processor takes a linear topology that realizes locality-based placement and replacement using the processing elements themselves, through a stack shift of information on the linear topology of the processing-element array. For the scaling of the processor, the linear topology of the interconnection network therefore has to support the stack shift before and after the up- or down-scaling. We thus propose an interconnection network architecture called a dynamic channel segmentation distribution (dynamic CSD) network. In addition, the linear topology must be folded on-chip into a two-dimensional plane. We also propose a new conceptual topology and its cluster, which is a unit of the new topology replicated across the chip. We analyzed the cost in terms of the available number of clusters (adaptive processors with a minimum scale), the delay in Manhattan distance across the chip, and the peak Giga-Operations per Second (GOPS) across process technology scaling.

  18. Large Scale Flame Spread Environmental Characterization Testing

    Science.gov (United States)

    Clayman, Lauren K.; Olson, Sandra L.; Gokoghi, Suleyman A.; Brooker, John E.; Ferkul, Paul V.; Kacher, Henry F.

    2013-01-01

    Under the Advanced Exploration Systems (AES) Spacecraft Fire Safety Demonstration Project (SFSDP), as a risk-mitigation activity in support of the development of a large-scale fire demonstration experiment in microgravity, flame-spread tests were conducted in normal gravity on thin, cellulose-based fuels in a sealed chamber. The primary objective of the tests was to measure pressure rise in the chamber as sample material, burning direction (upward/downward), total heat release, heat release rate, and heat-loss mechanisms were varied between tests. A Design of Experiments (DOE) method was used to produce an array of tests from a fixed set of constraints, and a coupled response model was developed. Supplementary tests were run outside the experimental design to vary select parameters such as initial chamber pressure. The starting chamber pressure for each test was set below atmospheric to prevent chamber overpressure. Bottom ignition, or upward-propagating burns, produced rapid, acceleratory, turbulent flame spread. Pressure rise in the chamber increases as the amount of fuel burned increases, mainly because of the larger amount of heat generation and, to a much smaller extent, due to the increase in the number of gaseous moles. Top ignition, or downward-propagating burns, produced a steady flame spread with a very small flat flame across the burning edge. Steady-state pressure is achieved during downward flame spread as the pressure rises and plateaus. This indicates that the heat generation by the flame matches the heat loss to the surroundings during the longer, slower downward burns. One heat-loss mechanism involved mounting a heat exchanger directly above the burning sample, in the path of the plume, to act as a heat sink and more efficiently dissipate the heat from the combustion event. This proved an effective means of chamber overpressure mitigation for the tests producing the most total heat release and thus was determined to be a feasible mitigation
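
    The reported pressure behaviour is consistent with a constant-volume ideal-gas balance. As a hedged back-of-envelope note (not a relation given in the abstract), with chamber pressure P, mean gas temperature T and mole count n,

        \[
          \frac{\Delta P}{P_0} \;\approx\; \frac{\Delta T}{T_0} \;+\; \frac{\Delta n}{n_0}
        \]

    so the rise is dominated by heating (the ΔT/T₀ term) with a much smaller contribution from the change in gaseous moles, and a plateau appears once heat generation balances heat loss and ΔT stops growing.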

  19. GPU-based large-scale visualization

    KAUST Repository

    Hadwiger, Markus

    2013-11-19

    Recent advances in image and volume acquisition as well as computational advances in simulation have led to an explosion of the amount of data that must be visualized and analyzed. Modern techniques combine the parallel processing power of GPUs with out-of-core methods and data streaming to enable the interactive visualization of giga- and terabytes of image and volume data. A major enabler for interactivity is making both the computational and the visualization effort proportional to the amount of data that is actually visible on screen, decoupling it from the full data size. This leads to powerful display-aware multi-resolution techniques that enable the visualization of data of almost arbitrary size. The course consists of two major parts: An introductory part that progresses from fundamentals to modern techniques, and a more advanced part that discusses details of ray-guided volume rendering, novel data structures for display-aware visualization and processing, and the remote visualization of large online data collections. You will learn how to develop efficient GPU data structures and large-scale visualizations, implement out-of-core strategies and concepts such as virtual texturing that have only been employed recently, as well as how to use modern multi-resolution representations. These approaches reduce the GPU memory requirements of extremely large data to a working set size that fits into current GPUs. You will learn how to perform ray-casting of volume data of almost arbitrary size and how to render and process gigapixel images using scalable, display-aware techniques. We will describe custom virtual texturing architectures as well as recent hardware developments in this area. We will also describe client/server systems for distributed visualization, on-demand data processing and streaming, and remote visualization. We will describe implementations using OpenGL as well as CUDA, exploiting parallelism on GPUs combined with additional asynchronous
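
    The display-aware principle, making work proportional to what is visible rather than to data size, can be illustrated by how a renderer might pick a multi-resolution level from the projected size of a voxel on screen. The function below is our illustrative sketch, not code from the course:

        import math

        def required_level(voxel_extent_px, finest_level=0, coarsest_level=10):
            """Pick the multi-resolution level at which one voxel of the
            fetched brick projects to roughly one screen pixel.
            voxel_extent_px: projected screen-space size, in pixels, of a
            single finest-level voxel."""
            if voxel_extent_px >= 1.0:       # already >= one pixel per voxel:
                return finest_level          # full resolution is warranted
            # Each coarser level doubles a voxel's footprint, hence the log2.
            level = int(math.ceil(-math.log2(voxel_extent_px)))
            return min(coarsest_level, level)

        # A distant brick whose voxels cover 1/8 pixel only needs level 3 data.
        print(required_level(0.125))         # -> 3

    Fetching bricks at the level this kind of test selects is what caps the GPU working set at roughly the number of visible pixels, independent of the full data size.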

  20. Lightweight computational steering of very large scale molecular dynamics simulations

    International Nuclear Information System (INIS)

    We present a computational steering approach for controlling, analyzing, and visualizing very large scale molecular dynamics simulations involving tens to hundreds of millions of atoms. Our approach relies on extensible scripting languages and an easy to use tool for building extensions and modules. The system is extremely easy to modify, works with existing C code, is memory efficient, and can be used from inexpensive workstations and networks. We demonstrate how we have used this system to manipulate data from production MD simulations involving as many as 104 million atoms running on the CM-5 and Cray T3D. We also show how this approach can be used to build systems that integrate common scripting languages (including Tcl/Tk, Perl, and Python), simulation code, user extensions, and commercial data analysis packages
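
    A minimal sketch of the steering pattern described here, assuming the compiled MD engine periodically hands control to an embedded scripting layer; the hook names and the state dictionary are illustrative, not the actual system's API (which wrapped existing C code as script-language extensions):

        # Illustrative steering layer: callbacks registered from the scripting
        # side run every N steps and may inspect or modify the simulation.
        class SteeringHooks:
            def __init__(self):
                self._hooks = []  # list of (every_n_steps, callback) pairs

            def register(self, every_n_steps, callback):
                self._hooks.append((every_n_steps, callback))

            def fire(self, step, state):
                for n, cb in self._hooks:
                    if step % n == 0:
                        cb(step, state)  # callbacks may read or mutate state

        hooks = SteeringHooks()
        hooks.register(100, lambda s, st: print(f"step {s}: T = {st['temperature']:.1f}"))
        hooks.register(1000, lambda s, st: st.update(thermostat=300.0))  # steer the run

        state = {"temperature": 305.2}
        for step in range(2001):             # stand-in for the compiled MD loop
            hooks.fire(step, state)

    Keeping the hooks in the interpreted layer is what makes the system "lightweight": the expensive physics stays in compiled code, while analysis and control remain editable mid-run from an inexpensive workstation.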

  1. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design: problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full scale team involving the development and operations sides of the company-two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp

  2. Power suppression at large scales in string inflation

    International Nuclear Information System (INIS)

    We study a possible origin of the anomalous suppression of the power spectrum at large angular scales in the cosmic microwave background within the framework of explicit string inflationary models where inflation is driven by a closed string modulus parameterizing the size of the extra dimensions. In this class of models the apparent power loss at large scales is caused by the background dynamics which involves a sharp transition from a fast-roll power law phase to a period of Starobinsky-like slow-roll inflation. An interesting feature of this class of string inflationary models is that the number of e-foldings of inflation is inversely proportional to the string coupling to a positive power. Therefore once the string coupling is tuned to small values in order to trust string perturbation theory, enough e-foldings of inflation are automatically obtained without the need of extra tuning. Moreover, in the less tuned cases the sharp transition responsible for the power loss takes place just before the last 50-60 e-foldings of inflation. We illustrate these general claims in the case of Fibre Inflation where we study the strength of this transition in terms of the attractor dynamics, finding that it induces a pivot from a blue to a redshifted power spectrum which can explain the apparent large scale power loss. We compute the effects of this pivot for example cases and demonstrate how magnitude and duration of this effect depend on model parameters

  3. Power suppression at large scales in string inflation

    Energy Technology Data Exchange (ETDEWEB)

    Cicoli, Michele [Dipartimento di Fisica ed Astronomia, Università di Bologna, via Irnerio 46, Bologna, 40126 (Italy); Downes, Sean; Dutta, Bhaskar, E-mail: mcicoli@ictp.it, E-mail: sddownes@physics.tamu.edu, E-mail: dutta@physics.tamu.edu [Mitchell Institute for Fundamental Physics and Astronomy, Department of Physics and Astronomy, Texas A and M University, College Station, TX, 77843-4242 (United States)

    2013-12-01

    We study a possible origin of the anomalous suppression of the power spectrum at large angular scales in the cosmic microwave background within the framework of explicit string inflationary models where inflation is driven by a closed string modulus parameterizing the size of the extra dimensions. In this class of models the apparent power loss at large scales is caused by the background dynamics which involves a sharp transition from a fast-roll power law phase to a period of Starobinsky-like slow-roll inflation. An interesting feature of this class of string inflationary models is that the number of e-foldings of inflation is inversely proportional to the string coupling to a positive power. Therefore once the string coupling is tuned to small values in order to trust string perturbation theory, enough e-foldings of inflation are automatically obtained without the need of extra tuning. Moreover, in the less tuned cases the sharp transition responsible for the power loss takes place just before the last 50-60 e-foldings of inflation. We illustrate these general claims in the case of Fibre Inflation where we study the strength of this transition in terms of the attractor dynamics, finding that it induces a pivot from a blue to a redshifted power spectrum which can explain the apparent large scale power loss. We compute the effects of this pivot for example cases and demonstrate how magnitude and duration of this effect depend on model parameters.

  4. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Vol. 36, No. 3 (2009), 16-1-16-29. ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  5. Understanding large scale groundwater flow to aid in repository siting

    International Nuclear Information System (INIS)

    Atomic Energy of Canada Limited (AECL), with support from Ontario Hydro, has developed a concept for the safe disposal of Canada's nuclear fuel waste in a deep (500 to 1000 m) mined repository in plutonic rocks of the Canadian Shield. The disposal concept involves the use of multiple engineered and natural barriers to ensure long-term safety. The geosphere, comprised of the enclosing rock mass and the groundwater which occurs in cracks and pores in the rock, is expected to serve as an important natural barrier to the release and migration of wastes from the engineered repository. Although knowledge of the physical and chemical characteristics of the groundwater in the rock at potential repository sites is needed to help design the engineered barriers of the repository, it can also be used to aid in repository siting, to take greater advantage of natural conditions in the geosphere to enhance its role as a barrier in the overall disposal system

  6. Simultaneous analysis of large-scale RNAi screens for pathogen entry

    OpenAIRE

    Rämö, Pauli; Drewek, Anna; Arrieumerlou, Cécile; Beerenwinkel, Niko; Ben-Tekaya, Houchaima; Cardel, Bettina; Casanova, Alain; Conde-Alvarez, Raquel; Cossart, Pascale; Csúcs, Gábor; Eicher, Simone; Emmenlauer, Mario; Greber, Urs; Hardt, Wolf-Dietrich; Helenius, Ari

    2014-01-01

    Background Large-scale RNAi screening has become an important technology for identifying genes involved in biological processes of interest. However, the quality of large-scale RNAi screening is often degraded by off-target effects. In order to find statistically significant effector genes for pathogen entry, we systematically analyzed entry pathways in human host cells for eight pathogens using image-based kinome-wide siRNA screens with siRNAs from three vendors. We propose a Parallel M...

  7. Large-Scale Brain Systems in ADHD: Beyond the Prefrontal-Striatal Model

    OpenAIRE

    Castellanos, F. Xavier; Proal, Erika

    2011-01-01

    Attention-deficit/hyperactivity disorder (ADHD) has long been thought to reflect dysfunction of prefrontal-striatal circuitry, with involvement of other circuits largely ignored. Recent advances in systems neuroscience-based approaches to brain dysfunction enable the development of models of ADHD pathophysiology that encompass a number of different large-scale “resting state” networks. Here we review progress in delineating large-scale neural systems and illustrate their relevance to ADHD. We...

  8. Large-scale assembly of colloidal particles

    Science.gov (United States)

    Yang, Hongta

    This study reports a simple, roll-to-roll compatible coating technology for producing three-dimensional highly ordered colloidal crystal-polymer composites, colloidal crystals, and macroporous polymer membranes. A vertically beveled doctor blade is utilized to shear-align silica microsphere-monomer suspensions to form large-area composites in a single step. The polymer matrix and the silica microspheres can be selectively removed to create colloidal crystals and self-standing macroporous polymer membranes. The thickness of the shear-aligned crystal is correlated with the viscosity of the colloidal suspension and the coating speed, and the correlations can be qualitatively explained by adapting the mechanisms developed for conventional doctor blade coating. Five important research topics related to the application of large-scale three-dimensional highly ordered macroporous films by doctor blade coating are covered in this study. The first topic describes an invention in large-area and low-cost color reflective displays. This invention is inspired by heat pipe technology. The self-standing macroporous polymer films exhibit brilliant colors which originate from the Bragg diffraction of visible light from the three-dimensional highly ordered air cavities. The colors can be easily changed by tuning the size of the air cavities to cover the whole visible spectrum. When the air cavities are filled with a solvent which has the same refractive index as that of the polymer, the macroporous polymer films become completely transparent due to the index matching. When the solvent trapped in the cavities is evaporated by in-situ heating, the sample color changes back to the brilliant color. This process is highly reversible and reproducible for thousands of cycles. The second topic reports the achievement of rapid and reversible vapor detection by using 3-D macroporous photonic crystals. Capillary condensation of a condensable vapor in the interconnected macropores leads to the
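
    The structural colour mechanism described here is usually summarized by the Bragg-Snell condition. As a hedged note (the relation is standard for colloidal photonic crystals, not quoted from this study), the reflected peak wavelength for (111)-oriented layers of spacing d is

        \[
          \lambda_{\max} \;=\; 2\, d_{111} \sqrt{\, n_{\mathrm{eff}}^{2} - \sin^{2}\theta \,}
        \]

    which shows directly why enlarging the air cavities red-shifts the colour, and why filling the cavities with an index-matched solvent (n_eff uniform throughout the film) extinguishes the reflection entirely.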

  9. Ideal and actual involvement of community pharmacists in health promotion and prevention: a cross-sectional study in Quebec, Canada

    Directory of Open Access Journals (Sweden)

    Laliberté Marie-Claude

    2012-03-01

    Full Text Available Abstract Background An increased interest is observed in broadening community pharmacists' role in public health. To date, little information has been gathered in Canada on community pharmacists' perceptions of their role in health promotion and prevention; however, such data are essential to the development of public-health programs in community pharmacy. A cross-sectional study was therefore conducted to explore the perceptions of community pharmacists in urban and semi-urban areas regarding their ideal and actual levels of involvement in providing health-promotion and prevention services and the barriers to such involvement. Methods Using a five-step modified Dillman's tailored design method, a questionnaire with 28 multiple-choice or open-ended questions (11 pages plus a cover letter) was mailed to a random sample of 1,250 pharmacists out of 1,887 community pharmacists practicing in Montreal (Quebec, Canada) and surrounding areas. It included questions on pharmacists' ideal level of involvement in providing health-promotion and preventive services; which services were actually offered in their pharmacy, the employees involved, the frequency, and duration of the services; the barriers to the provision of these services in community pharmacy; their opinion regarding the most appropriate health professionals to provide them; and the characteristics of pharmacists, pharmacies and their clientele. Results In all, 571 out of 1,234 (46.3%) eligible community pharmacists completed and returned the questionnaire. Most believed they should be very involved in health promotion and prevention, particularly in smoking cessation (84.3%); screening for hypertension (81.8%), diabetes (76.0%) and dyslipidemia (56.9%); and sexual health (61.7% to 89.1%); however, fewer respondents reported actually being very involved in providing such services (5.7% [lifestyle, including smoking cessation], 44.5%, 34.8%, 6.5% and 19.3%, respectively). The main barriers to the

  10. Bonus algorithm for large scale stochastic nonlinear programming problems

    CERN Document Server

    Diwekar, Urmila

    2015-01-01

    This book presents the details of the BONUS algorithm and its real world applications in areas like sensor placement in large scale drinking water networks, sensor placement in advanced power systems, water management in power systems, and capacity expansion of energy systems. A generalized method for stochastic nonlinear programming based on a sampling based approach for uncertainty analysis and statistical reweighting to obtain probability information is demonstrated in this book. Stochastic optimization problems are difficult to solve since they involve dealing with optimization and uncertainty loops. There are two fundamental approaches used to solve such problems. The first being the decomposition techniques and the second method identifies problem specific structures and transforms the problem into a deterministic nonlinear programming problem. These techniques have significant limitations on either the objective function type or the underlying distributions for the uncertain variables. Moreover, these ...
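
    The blurb's key idea, sampling once and statistically reweighting to handle the uncertainty loop, can be sketched in a few lines. The distributions, the stand-in model and the self-normalized estimator below are our assumptions for illustration, not the BONUS algorithm itself:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # One-off base sample from a wide covering distribution, with the
        # expensive model evaluated only on these points (a cheap stand-in
        # function here).
        base = rng.uniform(0.0, 2.0, size=5000)
        f_base = (base - 1.2) ** 2
        base_pdf = stats.uniform(0.0, 2.0).pdf(base)   # loc=0, scale=2

        def expected_objective(mu, sigma):
            """E[f(X)] for X ~ N(mu, sigma), by reweighting the base sample
            instead of re-running the model (self-normalized estimate)."""
            w = stats.norm(mu, sigma).pdf(base) / base_pdf
            return np.sum(w * f_base) / np.sum(w)

        # The outer optimizer can now probe many candidate designs cheaply:
        print(expected_objective(1.0, 0.1))            # ~ 0.2^2 + 0.1^2 = 0.05

    The design choice this illustrates is the one the book motivates: the expensive simulation runs once over the base sample, and every subsequent optimization iterate costs only a density evaluation and a weighted sum.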

  11. Optimization of large-scale fabrication of dielectric elastomer transducers

    DEFF Research Database (Denmark)

    Hassouneh, Suzan Sager

    Dielectric elastomers (DEs) have gained substantial ground in many different applications, such as wave energy harvesting, valves and loudspeakers. For DE technology to be commercially viable, it is necessary that any large-scale production operation is nondestructive, efficient and cheap. Danfoss ... electrodes on the corrugated surface, and due to these corrugated surfaces the metal electrodes maintain conductivities up to more than 100% strain of the elastomer film. The films are then laminated in multiple layers to fabricate DE transducers. However, the current manufacturing process is not trouble... ...-strength laminates to perform as monolithic elements. For the front-to-back and front-to-front configurations, conductive elastomers were utilised. One approach involved adding the cheap and conductive filler, exfoliated graphite (EG), to a PDMS matrix to increase dielectric permittivity. The results showed that even ...

  12. Optimal Multilevel Control for Large Scale Interconnected Systems

    Directory of Open Access Journals (Sweden)

    Ahmed M. A. Alomar,

    2014-04-01

    Full Text Available A mathematical model of the finishing mill, as an example of a large-scale interconnected dynamical system, is presented. First, the system response due to disturbance alone is presented. The control technique applied to the finishing hot-rolling steel mill is then optimal multilevel control using state feedback. An optimal controller is developed based on the integrated system model but, due to the complexity of the controllers and the tremendous computational effort involved, a multilevel technique is used in designing and implementing the controllers. The basis of the multilevel technique is described and a computational algorithm is discussed for the control of the finishing-mill system. To reduce the mass storage, memory requirements and computational time of the processor, a sub-optimal multilevel technique is applied to design the controllers of the finishing mill. A comparison between these controllers is presented, along with conclusions.

  13. Large Scale Spectral Clustering Using Approximate Commute Time Embedding

    CERN Document Server

    Khoa, Nguyen Lu Dang

    2011-01-01

    Spectral clustering is a novel clustering method which can detect complex shapes of data clusters. However, it requires the eigendecomposition of the graph Laplacian matrix, which costs $O(n^3)$ and is thus not suitable for large-scale systems. Recently, many methods have been proposed to accelerate the computational time of spectral clustering. These approximate methods usually involve sampling techniques, through which much of the information in the original data may be lost. In this work, we propose a fast and accurate spectral clustering approach using an approximate commute time embedding, which is similar to the spectral embedding. The method does not require using any sampling technique or computing any eigenvector at all. Instead it uses random projection and a linear time solver to find the approximate embedding. The experiments in several synthetic and real datasets show that the proposed approach has better clustering quality and is faster than the state-of-the-art approximate spectral clustering ...
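
    A sketch of the embedding idea the abstract describes, random projection plus a linear solver in place of an eigendecomposition, is below. The solver (conjugate gradients standing in for a near-linear-time Laplacian solver), the projection dimension and the small regularization shift are our assumptions:

        import numpy as np
        import scipy.sparse as sp
        from scipy.sparse.linalg import cg

        def commute_embedding(W, k=25, seed=0):
            """W: symmetric scipy.sparse adjacency matrix (n x n, non-negative
            weights). Returns an n x k embedding whose pairwise Euclidean
            distances approximate sqrt(effective resistance); commute times
            are this squared, scaled by the graph's total edge weight."""
            n = W.shape[0]
            upper = sp.triu(W, k=1).tocoo()
            rows, cols, weights = upper.row, upper.col, upper.data
            m = len(rows)
            # Weighted edge-vertex incidence matrix B (m x n): B^T B = Laplacian.
            data = np.concatenate([np.sqrt(weights), -np.sqrt(weights)])
            B = sp.csr_matrix((data, (np.r_[:m, :m], np.r_[rows, cols])),
                              shape=(m, n))
            L = (B.T @ B) + 1e-8 * sp.eye(n)   # small shift makes CG well posed
            # Johnson-Lindenstrauss projection: k random +/-1 combinations of
            # the m edge rows, instead of computing eigenvectors.
            rng = np.random.default_rng(seed)
            P = rng.choice([-1.0, 1.0], size=(m, k)) / np.sqrt(k)
            Y = B.T @ P                        # n x k right-hand sides
            # One (near-)linear-time Laplacian solve per projected column.
            Z = np.column_stack([cg(L, Y[:, j])[0] for j in range(k)])
            return Z                           # rows: per-node coordinates

    The rows of the returned embedding can then be handed to k-means exactly as in ordinary spectral clustering, which is the substitution the paper's experiments evaluate.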

  14. Cosmic Ray Acceleration during Large Scale Structure Formation

    CERN Document Server

    Blasi, P

    2004-01-01

    Clusters of galaxies are storage rooms of cosmic rays. They confine the hadronic component of cosmic rays over cosmological time scales due to diffusion, and the electron component due to energy losses. Hadronic cosmic rays can be accelerated during the process of structure formation, because of the supersonic motion of gas in the potential wells created by dark matter. At the shock waves that result from this motion, charged particles can be energized through the first-order Fermi process. After discussing the most important evidence for non-thermal phenomena in large-scale structures, we describe in some detail the main issues related to the acceleration of particles at these shock waves, emphasizing the possible role of the dynamical backreaction of the accelerated particles on the plasmas involved.

  15. Large-Scale Organizational Performance Improvement.

    Science.gov (United States)

    Pilotto, Rudy; Young, Jonathan O'Donnell

    1999-01-01

    Describes the steps involved in a performance improvement program in the context of a large multinational corporation. Highlights include a training program for managers that explained performance improvement; performance matrices; divisionwide implementation, including strategic planning; organizationwide training of all personnel; and the…

  16. Thermal power generation projects "Large Scale Solar Heating"; EU-Thermie-Projekte "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the experts but was not immediately (1996) accepted for financial subsidies. In November 1997 the EU Commission provided 1.5 million ECU, which allowed the realisation of an updated project proposal. By mid-1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and is mainly carried out for the transfer of technology. (orig.)

  17. Adaptive Sampling for Large Scale Boosting

    OpenAIRE

    Dubout, Charles; Fleuret, Francois

    2014-01-01

    Classical Boosting algorithms, such as AdaBoost, build a strong classifier without concern for the computational cost. Some applications, in particular in computer vision, may involve millions of training examples and very large feature spaces. In such contexts, the training time of off-the-shelf Boosting algorithms may become prohibitive. Several methods exist to accelerate training, typically either by sampling the features or the examples used to train the weak learners. Even if some of th...
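
    A generic illustration of the example-and-feature sampling the abstract alludes to, grafted onto plain AdaBoost with decision stumps; this sketches the family of accelerations discussed, not the specific estimator the paper proposes:

        import numpy as np

        def boosted_stumps(X, y, rounds=50, n_examples=500, n_features=20, seed=0):
            """AdaBoost with decision stumps, y in {-1, +1}. Each round fits
            the stump using a weight-driven sample of examples and a random
            subset of features; the reweighting itself stays exact."""
            rng = np.random.default_rng(seed)
            n, d = X.shape
            w = np.full(n, 1.0 / n)
            ensemble = []                      # (alpha, feature, threshold, sign)
            for _ in range(rounds):
                ex = rng.choice(n, size=min(n_examples, n), p=w)
                feats = rng.choice(d, size=min(n_features, d), replace=False)
                best = None
                for j in feats:                # candidate thresholds from sample
                    for t in np.quantile(X[ex, j], [0.25, 0.5, 0.75]):
                        for s in (1.0, -1.0):
                            pred = s * np.sign(X[:, j] - t + 1e-12)
                            err = w[pred != y].sum()
                            if best is None or err < best[0]:
                                best = (err, j, t, s)
                err, j, t, s = best
                err = np.clip(err, 1e-10, 1.0 - 1e-10)
                alpha = 0.5 * np.log((1.0 - err) / err)
                pred = s * np.sign(X[:, j] - t + 1e-12)
                w *= np.exp(-alpha * y * pred) # standard AdaBoost reweighting
                w /= w.sum()
                ensemble.append((alpha, j, t, s))
            return ensemble   # predict with the sign of the alpha-weighted sum

    Sampling only where the weak learner is fitted keeps the per-round cost roughly constant as the training set and feature space grow, which is the bottleneck these methods target.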

  18. Baryon Oscillations in the Large Scale Structure

    OpenAIRE

    Cooray, Asantha

    2001-01-01

    We study the possibility for an observational detection of oscillations due to baryons in the matter power spectrum and suggest a new cosmological test using the angular power spectrum of halos. The "standard rulers" of the proposed test involve overall shape of the matter power spectrum and baryon oscillation peaks in projection, as a function of redshift. Since oscillations are erased at non-linear scales, traces at redshifts greater than 1 are generally preferred. Given the decrease in num...

  19. Networking in a Large-Scale Distributed Agile Project

    OpenAIRE

    Moe, Nils Brede; Šmite, Darja; Šāblis, Aivars; Börjesson, Anne-Lie; Andréasson, Pia

    2014-01-01

    Context: In large-scale distributed software projects the expertise may be scattered across multiple locations. Goal: We describe and discuss a large-scale distributed agile project at Ericsson, a multinational telecommunications company headquartered in Sweden. The project is distributed across four development locations (one in Sweden, one in Korea and two in China) and employs 17 teams. In such a large scale environment the challenge is to have as few dependences between teams as possible,...

  20. The disposal of Canada's nuclear fuel waste: public involvement and social aspects

    International Nuclear Information System (INIS)

    This report describes the activities undertaken to provide information to the public about the Canadian Nuclear Fuel Waste Management Program as well as the opportunities for public involvement in the direction and development of the disposal concept through government inquiries and commissions and specific initiatives undertaken by AECL. Public viewpoints and the major issues identified by the public to be of particular concern and importance in evaluating the acceptability of the concept are described. In addition, how the issues have been addressed during the development of the disposal concept or how they could be addressed during implementation of the disposal concept are presented. There is also discussion of public perspectives of risk, the ethical aspects of nuclear fuel waste disposal, and public involvement in siting a nuclear fuel waste disposal facility. The Canadian Nuclear Fuel Waste Management Program is funded jointly by AECL and Ontario Hydro under the auspices of the CANDU Owners Group. (author)

  1. Eviction and loss of income assistance among street-involved youth in Canada.

    Science.gov (United States)

    Zivanovic, Rebecca; Omura, John; Wood, Evan; Nguyen, Paul; Kerr, Thomas; DeBeck, Kora

    2016-05-01

    Loss of housing and income assistance among vulnerable youth has not been well described in the literature, yet it is a crucial issue for public health. This study examines the prevalence and correlates of loss of income assistance as well as eviction among street-involved youth. We collected data from a prospective cohort of street-involved youth aged 14-26. Among 770 participants, 64.3 per cent reported having housing and 77.1 per cent reported receiving income assistance at some point during the study period. Further, 28.6 and 20.0 per cent of youth reported having been evicted and losing income assistance, respectively. In multivariable generalized estimating equations analysis, heavy alcohol use, unprotected sex, being a victim of violence, and homelessness were all independently associated with eviction. Separately, homelessness, recent incarceration, and drug dealing were independently associated with loss of income assistance. Eviction and loss of income assistance are common experiences among street-involved youth with multiple vulnerabilities. Our findings highlight the importance of improving continued engagement with critical social services. PMID:26961260
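
    For readers unfamiliar with the modelling step, a multivariable GEE of the kind reported here might be set up as follows. The data frame and variable names are hypothetical stand-ins (with synthetic data so the sketch runs), not the study's actual variables:

        import numpy as np
        import pandas as pd
        import statsmodels.api as sm
        import statsmodels.formula.api as smf

        # Synthetic stand-in data: 100 participants, 4 follow-up interviews each.
        rng = np.random.default_rng(0)
        n = 400
        df = pd.DataFrame({
            "participant_id": np.repeat(np.arange(100), 4),
            "evicted": rng.integers(0, 2, n),
            "heavy_alcohol": rng.integers(0, 2, n),
            "unprotected_sex": rng.integers(0, 2, n),
            "violence_victim": rng.integers(0, 2, n),
            "homeless": rng.integers(0, 2, n),
        })

        # Binary outcome with repeated measures per participant: binomial GEE
        # with an exchangeable within-person correlation structure.
        model = smf.gee(
            "evicted ~ heavy_alcohol + unprotected_sex + violence_victim + homeless",
            groups="participant_id",
            data=df,
            family=sm.families.Binomial(),
            cov_struct=sm.cov_struct.Exchangeable(),
        )
        print(model.fit().summary())

    The GEE framework is the natural fit here because each youth contributes several interviews over follow-up, and the working correlation structure absorbs that within-person dependence.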

  2. Using Large-Scale Assessment Scores to Determine Student Grades

    Science.gov (United States)

    Miller, Tess

    2013-01-01

    Many Canadian provinces provide guidelines for teachers to determine students' final grades by combining a percentage of students' scores from provincial large-scale assessments with their term scores. This practice is thought to hold students accountable by motivating them to put effort into completing the large-scale assessment, thereby…

  3. Safeguards instruments for Large-Scale Reprocessing Plants

    Energy Technology Data Exchange (ETDEWEB)

    Hakkila, E.A. [Los Alamos National Lab., NM (United States); Case, R.S.; Sonnier, C. [Sandia National Labs., Albuquerque, NM (United States)

    1993-06-01

    Between 1987 and 1992 a multi-national forum known as LASCAR (Large Scale Reprocessing Plant Safeguards) met to assist the IAEA in development of effective and efficient safeguards for large-scale reprocessing plants. The US provided considerable input for safeguards approaches and instrumentation. This paper reviews and updates instrumentation of importance in measuring plutonium and uranium in these facilities.

  4. INTERNATIONAL WORKSHOP ON LARGE-SCALE REFORESTATION: PROCEEDINGS

    Science.gov (United States)

    The purpose of the workshop was to identify major operational and ecological considerations needed to successfully conduct large-scale reforestation projects throughout the forested regions of the world. "Large-scale" for this workshop means projects where, by human effort, approx...

  5. ACTIVE DIMENSIONAL CONTROL OF LARGE-SCALED STEEL STRUCTURES

    OpenAIRE

    Radosław Rutkowski

    2013-01-01

    The article discusses the issues of dimensional control in the construction process of large-scaled steel structures. The main focus is on the analysis of manufacturing tolerances. The article presents a procedure for using tolerance analysis in the design and manufacturing of large-scaled steel structures. The proposed solution could significantly improve the manufacturing process.

  6. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Full Text Available Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  7. Leashes and Lies: Navigating the Colonial Tensions of Institutional Ethics of Research Involving Indigenous Peoples in Canada

    Directory of Open Access Journals (Sweden)

    Martha L. Stiegman

    2015-06-01

    Full Text Available Ethical standards of conduct in research undertaken at Canadian universities involving humans have been guided by the three federal research funding agencies through the Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans (or TCPS for short) since 1998. The statement was revised for the first time in 2010 and is now commonly referred to as the TCPS2, which includes an entire chapter (Chapter 9) devoted to the subject of research involving First Nations, Inuit, and Métis peoples of Canada. While the establishment of TCPS2 is an important initial step on the long road towards decolonizing Indigenous research within the academy, our frustrations—which echo those of many colleagues struggling to do research “in a good way” (see, for example, Ball & Janyst, 2008; Bull, 2008; Guta et al., 2010) within this framework—highlight the urgent work that remains to be done if university-based researchers are to be enabled by establishment channels to do “ethical” research with Aboriginal peoples. In our (and others') experience to date, we seem to have been able to do research in a good way despite, not because of, the TCPS2 (see Castleden et al., 2012). The disconnect between the stated goals of TCPS2, and the challenges researchers face when attempting to navigate how individual, rotating members of REBs interpret the TCPS2 and operate within this framework, begs the question: Wherein lies the disconnect? A number of scholars are currently researching this divide (see, for example, Guta et al., 2010; Flicker & Worthington, 2011; and Guta et al., 2013). In this editorial, we offer an anecdote to illustrate our experience regarding some of these tensions and then offer reflections about what might need to change for the next iteration of the TCPS.

  8. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark, and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from the small- and large-scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model, no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping, scale effects are present, as the small-scale model underpredicts the overtopping discharge.
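
    Such scale comparisons are normally made in dimensionless form. As a hedged note (an exponential fit of the Owen/Van der Meer type is standard in this field; the coefficients a and b are fitted per structure and are not taken from this paper), results of this kind are typically expressed as

        \[
          \frac{q}{\sqrt{g\,H_s^{3}}} \;=\; a\,\exp\!\left(-b\,\frac{R_c}{H_s}\right)
        \]

    with q the mean overtopping discharge per metre of crest width, H_s the significant wave height and R_c the crest freeboard; a small-scale underprediction at low overtopping would then show up as a steeper fitted slope b in the small-scale data.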

  9. Probabilistic cartography of the large-scale structure

    CERN Document Server

    Leclercq, Florent; Lavaux, Guilhem; Wandelt, Benjamin

    2015-01-01

    The BORG algorithm is an inference engine that derives the initial conditions given a cosmological model and galaxy survey data, and produces physical reconstructions of the underlying large-scale structure by assimilating the data into the model. We present the application of BORG to real galaxy catalogs and describe the primordial and late-time large-scale structure in the considered volumes. We then show how these results can be used for building various probabilistic maps of the large-scale structure, with rigorous propagation of uncertainties. In particular, we study dynamic cosmic web elements and secondary effects in the cosmic microwave background.

  10. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  11. Large scale photovoltaic field trials. Second technical report: monitoring phase

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2007-09-15

    This report provides an update on the Large-Scale Building Integrated Photovoltaic Field Trials (LS-BIPV FT) programme commissioned by the Department of Trade and Industry (later the Department for Business, Enterprise and Regulatory Reform; BERR). It provides detailed profiles of the 12 projects making up this programme, which is part of the UK programme on photovoltaics and has run in parallel with the Domestic Field Trial. These field trials aim to record the experience and use the lessons learnt to raise awareness of, and confidence in, the technology and to increase UK capabilities. The projects involved: the visitor centre at the Gaia Energy Centre in Cornwall; a community church hall in London; council offices in West Oxfordshire; a sports science centre at Gloucester University; the visitor centre at Cotswold Water Park; the headquarters of the Insolvency Service; a Welsh Development Agency building; an athletics centre in Birmingham; a research facility at the University of East Anglia; a primary school in Belfast; and Barnstaple civic centre in Devon. The report describes the aims of the field trials, monitoring issues, performance, observations and trends, lessons learnt and the results of occupancy surveys.

  12. Large-scale native preparation of in vitro transcribed RNA.

    Science.gov (United States)

    Keel, Amanda Y; Easton, Laura E; Lukavsky, Peter J; Kieft, Jeffrey S

    2009-01-01

    Biophysical studies of RNA require concentrated samples that are chemically and structurally homogeneous. Historically, the most widely used methods for preparing these samples involve in vitro transcription, denaturation of the RNA, purification based on size, and subsequent refolding. These methods are useful but are inherently slow and do not guarantee that the RNA is properly folded. Possible mis-folding is of particular concern with large, complexly folded RNAs. To address these problems, we have developed methods for purifying in vitro transcribed RNAs in their native, folded states. These methods also have the advantage of being rapid and readily scaled to virtually any size RNA or transcription amount. Two methods are presented: the first is an affinity chromatography approach and the second is a weak ion-exchange chromatography approach. Both use equipment and materials readily available to almost any lab and hence should provide flexibility for those seeking alternate approaches to large-scale purification of RNA in the folded state. PMID:20946782

  13. Parallel block schemes for large scale least squares computations

    Energy Technology Data Exchange (ETDEWEB)

    Golub, G.H.; Plemmons, R.J.; Sameh, A.

    1986-04-01

Large scale least squares computations arise in a variety of scientific and engineering problems, including geodetic adjustments and surveys, medical image analysis, molecular structures, partial differential equations and substructuring methods in structural engineering. In each of these problems, matrices often arise which possess a block structure that reflects the local connection nature of the underlying physical problem. For example, such super-large nonlinear least squares computations arise in geodesy. Here the coordinates of positions are calculated by iteratively solving overdetermined systems of nonlinear equations by the Gauss-Newton method. The US National Geodetic Survey will complete this year (1986) the readjustment of the North American Datum, a problem which involves over 540 thousand unknowns and over 6.5 million observations (equations). The observation matrix for these least squares computations has a block angular form with 161 diagonal blocks, each containing 3 to 4 thousand unknowns. In this paper parallel schemes are suggested for the orthogonal factorization of matrices in block angular form and for the associated backsubstitution phase of the least squares computations. In addition, a parallel scheme for the calculation of certain elements of the covariance matrix for such problems is described. It is shown that these algorithms are ideally suited for multiprocessors with three levels of parallelism such as the Cedar system at the University of Illinois. 20 refs., 7 figs.
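
The block angular form is what enables parallelism: each diagonal block can be orthogonally reduced independently, leaving a small coupled system for the shared unknowns, followed by block-wise back-substitution. A minimal NumPy sketch of this scheme (toy dimensions, not the geodetic problem sizes quoted above):

```python
import numpy as np

rng = np.random.default_rng(1)

# Block-angular least squares: k diagonal blocks A_i coupled by shared columns B_i.
k, m, n, p = 4, 30, 5, 3              # blocks, rows per block, local and coupled unknowns
A = [rng.normal(size=(m, n)) for _ in range(k)]
B = [rng.normal(size=(m, p)) for _ in range(k)]
b = [rng.normal(size=m) for _ in range(k)]

reduced_rows, reduced_rhs, factors = [], [], []
for Ai, Bi, bi in zip(A, B, b):       # each block reducible independently (in parallel)
    Q, R = np.linalg.qr(Ai, mode="complete")
    QtB, Qtb = Q.T @ Bi, Q.T @ bi
    factors.append((R[:n], QtB[:n], Qtb[:n]))   # kept for back-substitution
    reduced_rows.append(QtB[n:])                # rows with local unknowns eliminated
    reduced_rhs.append(Qtb[n:])

# Small coupled least-squares system for the shared unknowns y.
y, *_ = np.linalg.lstsq(np.vstack(reduced_rows), np.concatenate(reduced_rhs), rcond=None)

# Back-substitute per block for the local unknowns x_i.
x = [np.linalg.solve(R, tb - tB @ y) for R, tB, tb in factors]
```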

  14. Power Suppression at Large Scales in String Inflation

    CERN Document Server

    Cicoli, Michele; Dutta, Bhaskar

    2013-01-01

    We study a possible origin of the anomalous suppression of the power spectrum at large angular scales in the cosmic microwave background within the framework of explicit string inflationary models where inflation is driven by a closed string modulus parameterizing the size of the extra dimensions. In this class of models the apparent power loss at large scales is caused by the background dynamics which involves a sharp transition from a fast-roll power law phase to a period of Starobinsky-like slow-roll inflation. An interesting feature of this class of string inflationary models is that the number of e-foldings of inflation is inversely proportional to the string coupling to a positive power. Therefore once the string coupling is tuned to small values in order to trust string perturbation theory, enough e-foldings of inflation are automatically obtained without the need of extra tuning. Moreover, in the less tuned cases the sharp transition responsible for the power loss takes place just before the last 50-6...

  15. Large Scale Applications of HTS in New Zealand

    Science.gov (United States)

    Wimbush, Stuart C.

    New Zealand has one of the longest-running and most consistently funded (relative to GDP) programmes in high temperature superconductor (HTS) development and application worldwide. As a consequence, it has a sustained breadth of involvement in HTS technology development stretching from the materials discovery right through to burgeoning commercial exploitation. This review paper outlines the present large scale projects of the research team at the newly-established Robinson Research Institute of Victoria University of Wellington. These include the construction and grid-based testing of a three-phase 1 MVA 2G HTS distribution transformer utilizing Roebel cable for its high-current secondary windings and the development of a cryogen-free conduction-cooled 1.5 T YBCO-based human extremity magnetic resonance imaging system. Ongoing activities supporting applications development such as low-temperature full-current characterization of commercial superconducting wires and the implementation of inductive flux-pump technologies for efficient brushless coil excitation in superconducting magnets and rotating machines are also described.

  16. Rock sealing - large scale field test and accessory investigations

    International Nuclear Information System (INIS)

The experience from the pilot field test and the basic knowledge extracted from the lab experiments have formed the basis of the planning of a Large Scale Field Test. The intention is to find out how the 'instrument of rock sealing' can be applied to a number of practical cases, where cutting-off and redirection of groundwater flow in repositories are called for. Five field subtests, which are integrated mutually or with other Stripa projects (3D), are proposed. One of them concerns 'near-field' sealing, i.e. sealing of tunnel floors hosting deposition holes, while two involve sealing of 'disturbed' rock around tunnels. The fourth concerns sealing of a natural fracture zone in the 3D area, and this latter test has the expected spin-off effect of obtaining additional information on the general flow pattern around the northeastern wing of the 3D cross. The fifth test is an option of sealing structures in the Validation Drift. The longevity of major grout types is focussed on as the most important part of the 'Accessory Investigations', and detailed plans have been worked out for that purpose. It is foreseen that the continuation of the project, as outlined in this report, will yield suitable methods and grouts for effective and long-lasting sealing of rock for use at strategic points in repositories. (author)

  17. Reorganizing Complex Network to Improve Large-Scale Multiagent Teamwork

    Directory of Open Access Journals (Sweden)

    Yang Xu

    2014-01-01

Full Text Available Large-scale multiagent teamwork is common in various domains. As in human social infrastructure, each agent coordinates with only some of the others, forming a peer-to-peer complex network structure. This organization has been shown to be a key factor influencing team performance. Our analysis identifies three key factors for expediting team performance. First, complex network effects can promote team performance. Second, coordination interactions must be routed from their sources to capable agents; although they can be transferred across the network via different paths, their sources and sinks depend on the intrinsic nature of the team, independent of the network connections. In addition, agents involved in the same plan often form a subteam and communicate with each other more frequently. Therefore, if the interactions between agents are statistically recorded, an integrated network adjustment algorithm can be set up by combining the three key factors. Based on our abstracted teamwork simulations and the coordination statistics, we implemented the adaptive reorganization algorithm. The experimental results support the design, showing that the reorganized network is more capable of coordinating heterogeneous agents.
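
A minimal sketch of one plausible reading of this idea (the counts, degree cap, and greedy rewiring rule below are illustrative assumptions, not the authors' published algorithm): record pairwise coordination statistics, then rewire each agent toward its most frequent partners under a degree cap.

```python
import numpy as np

rng = np.random.default_rng(2)
n_agents, degree_cap = 20, 4

# Statistically recorded coordination interactions (symmetric count matrix).
counts = rng.poisson(1.0, size=(n_agents, n_agents))
counts = np.triu(counts, 1) + np.triu(counts, 1).T

def reorganize(counts, cap):
    """Greedily link each agent to its most frequent partners, capped by degree."""
    adj = np.zeros(counts.shape, dtype=bool)
    for i in range(counts.shape[0]):
        for j in np.argsort(counts[i])[::-1]:   # most frequent partners first
            if j != i and adj[i].sum() < cap and adj[j].sum() < cap:
                adj[i, j] = adj[j, i] = True
    return adj

network = reorganize(counts, degree_cap)
```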

  18. Factors influencing the decommissioning of large-scale nuclear plants

    International Nuclear Information System (INIS)

The decision-making process involving the decommissioning of the UK graphite-moderated, gas-cooled nuclear power stations is complex. There are timing, engineering, waste disposal, cost and lost generation capacity factors to consider, and the overall decision of when and how to proceed with decommissioning may include political and public tolerance dimensions. For the final stage of decommissioning, the nuclear industry could either completely dismantle the reactor island, leaving a green-field site, or maintain the reactor island indefinitely with additional super- and substructure containment. At this time the first of these options, with deferred decommissioning, prevails, and the nuclear industry has expressed considerable confidence that the required technology will become available with passing time, that acceptable radioactive waste disposal methods and facilities will be available, and that the eventual costs of decommissioning will not escalate without restraint. If the deferred decommissioning strategy is wrong and it proves impossible to completely dismantle the reactor islands a century into the future, it may then be too late to effect sufficient longer-term containment to maintain the reactor hulks in a reliable condition. With respect to the final decommissioning of large-scale nuclear plant, it is concluded that the nuclear industry does not yet know precisely how decommissioning will be done, when it will be attempted or completed, or how much it will eventually cost. (author)

  19. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used
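
For context, aerosol size data of the kind gathered in such tests are conventionally summarized with a lognormal fit; a hedged Python sketch on synthetic (not measured) droplet diameters, taking 10 um as the nominal upper bound of the respirable range:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Hypothetical droplet diameters in micrometers; illustrative only.
diameters = rng.lognormal(mean=np.log(40.0), sigma=0.6, size=2000)

shape, loc, scale = stats.lognorm.fit(diameters, floc=0)
median_diameter = scale            # median of the fitted lognormal
gsd = np.exp(shape)                # geometric standard deviation

respirable_fraction = np.mean(diameters <= 10.0)
print(f"median {median_diameter:.1f} um, GSD {gsd:.2f}, "
      f"respirable fraction {respirable_fraction:.1%}")
```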

  20. Ferroelectric opening switches for large-scale pulsed power drivers.

    Energy Technology Data Exchange (ETDEWEB)

    Brennecka, Geoffrey L.; Rudys, Joseph Matthew; Reed, Kim Warren; Pena, Gary Edward; Tuttle, Bruce Andrew; Glover, Steven Frank

    2009-11-01

Fast electrical energy storage or Voltage-Driven Technology (VDT) has dominated fast, high-voltage pulsed power systems for the past six decades. Fast magnetic energy storage or Current-Driven Technology (CDT) is characterized by 10,000× higher energy density than VDT and has a great number of other substantial advantages, but it has all but been neglected for all of these decades. The uniform explanation for neglect of CDT technology is invariably that the industry has never been able to make an effective opening switch, which is essential for the use of CDT. Most approaches to opening switches have involved plasma of one sort or another. On a large scale, gaseous plasmas have been used as a conductor to bridge the switch electrodes that provides an opening function when the current wave front propagates through to the output end of the plasma and fully magnetizes the plasma - this is called a Plasma Opening Switch (POS). Opening can be triggered in a POS using a magnetic field to push the plasma out of the A-K gap - this is called a Magnetically Controlled Plasma Opening Switch (MCPOS). On a small scale, depletion of electron plasmas in semiconductor devices is used to effect opening-switch behavior, but these devices are relatively low voltage and low current compared to the hundreds of kilovolts and tens of kiloamperes of interest to pulsed power. This work is an investigation into an entirely new approach to opening switch technology that utilizes new materials in new ways. The new materials are ferroelectrics, and using them as an opening switch is a stark contrast to their traditional applications in optics and transducers. Emphasis is on the use of high performance ferroelectrics with the objective of developing an opening switch that would be suitable for large scale pulsed power applications. Over the course of exploring this new ground, we have discovered new behaviors and properties of these materials that were heretofore unknown. Some of ...

  1. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We ...

  2. Some perspective on the Large Scale Scientific Computation Research

    Institute of Scientific and Technical Information of China (English)

DU Qiang

    2004-01-01

    The "Large Scale Scientific Computation (LSSC) Research"project is one of the State Major Basic Research projects funded by the Chinese Ministry of Science and Technology in the field ofinformation science and technology.……

  4. PetroChina to Expand Dushanzi Refinery on Large Scale

    Institute of Scientific and Technical Information of China (English)

    2005-01-01

A large-scale expansion project for PetroChina Dushanzi Petrochemical Company has been given the green light, a move which will make it one of the largest refineries and petrochemical complexes in the country.

  5. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Document Server

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  6. New Visions for Large Scale Networks: Research and Applications

    Data.gov (United States)

Networking and Information Technology Research and Development, Executive Office of the President — This paper documents the findings of the March 12-14, 2001 Workshop on New Visions for Large-Scale Networks: Research and Applications. The workshop's objectives ...

  7. The fractal octahedron network of the large scale structure

    OpenAIRE

    Battaner, E

    1998-01-01

    In a previous article, we have proposed that the large scale structure network generated by large scale magnetic fields could consist of a network of octahedra only contacting at their vertexes. Assuming such a network could arise at different scales producing a fractal geometry, we study here its properties, and in particular how a sub-octahedron network can be inserted within an octahedron of the large network. We deduce that the scale of the fractal structure would range from $\\approx$100 ...

  8. Optimization of large-scale fabrication of dielectric elastomer transducers

    OpenAIRE

    Hassouneh, Suzan Sager; Skov, Anne Ladegaard; Daugaard, Anders Egede

    2015-01-01

Dielectric elastomers (DEs) have gained substantial ground in many different applications, such as wave energy harvesting, valves and loudspeakers. For DE technology to be commercially viable, it is necessary that any large-scale production operation is nondestructive, efficient and cheap. Danfoss Polypower A/S employs a large-scale process for manufacturing DE films with one-sided corrugated surfaces. The DEs are manufactured by coating an elastomer mixture to a corrugated carrier web, thereb...

  9. Water Implications of Large-Scale Land Acquisitions in Ghana

    OpenAIRE

    Timothy Olalekan Williams; Benjamin Gyampoh; Fred Kizito; Regassa Namara

    2012-01-01

    This paper examines the water dimensions of recent large-scale land acquisitions for biofuel production in the Ashanti, Brong-Ahafo and Northern regions of Ghana. Using secondary sources of data complemented by individual and group interviews, the paper reveals an almost universal lack of consideration of the implications of large-scale land deals for crop water requirements, the ecological functions of freshwater ecosystems and water rights of local smallholder farmers and other users. It do...

  10. "Blueprint" for the UP Modelling System for Large Scale Hydrology

    OpenAIRE

    J. Ewen

    1997-01-01

    There are at least two needs to be met by the current research efforts on large scale hydrological modelling. The first is for practical conceptual land-surface hydrology schemes for use with existing operational climate and weather forecasting models, to replace the overly simple schemes often used in such models. The second is for models of large scale hydrology which are properly sensitive to changes in physical properties and inputs measured (or predicted) over a wide range of scales, fro...

  11. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  12. Constraining cosmological ultra-large scale structure using numerical relativity

    OpenAIRE

    Braden, Jonathan; Johnson, Matthew C.; Peiris, Hiranya V.; Aguirre, Anthony

    2016-01-01

    Cosmic inflation, a period of accelerated expansion in the early universe, can give rise to large amplitude ultra-large scale inhomogeneities on distance scales comparable to or larger than the observable universe. The cosmic microwave background (CMB) anisotropy on the largest angular scales is sensitive to such inhomogeneities and can be used to constrain the presence of ultra-large scale structure (ULSS). We numerically evolve nonlinear inhomogeneities present at the beginning of inflation...

  13. Land consolidation for large-scale infrastructure projects in Germany

    OpenAIRE

    Hendrickss, Andreas; Lisec, Anka

    2014-01-01

    Large-scale infrastructure projects require the acquisition of appropriate land for their construction and maintenance, while they often cause extensive fragmentations of the affected landscape and land plots as well as significant land loss of the immediately affected land owners. A good practice in this field comes from Germany. In Germany, the so-called “land consolidation for large-scale projects” is used to distribute the land loss among a larger group of land own...

  14. Structural Analysis of Large-Scale Power Systems

    OpenAIRE

    Dai, X Z; Zhang, K. F.

    2012-01-01

    Some fundamental structural characteristics of large-scale power systems are analyzed in the paper. Firstly, the large-scale power system is decomposed into various hierarchical levels: the main system, subsystems, sub-subsystems, down to its basic components. The proposed decomposition method is suitable for arbitrary system topology, and the relations among various decomposed hierarchical levels are explicitly expressed by introducing the interface concept. Then, the structural models of va...

  15. Balancing modern Power System with large scale of wind power

    OpenAIRE

    Basit, Abdul; Altin, Müfit; Anca Daniela HANSEN; Sørensen, Poul Ejnar

    2014-01-01

Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact on the control of power system balance and thus deviations in the power system frequency in small or islanded power systems or tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns the s...

  16. Increasing Reliability of Communication in Large Scale Distributed Systems

    OpenAIRE

    Malloth, C.

    1995-01-01

This paper discusses the problem of reliable communication in large scale networks. More precisely, we describe the problem of non-transitive communication in a large scale distributed system due to link failures which lead to partial partitions. A definition of a partial partition, in contrast to a total partition, is given. The solution to mask these partial partitions, and as a consequence provide transitive communication, consists in using every possible ...

  17. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost in particular limits its use. As computer models have grown in size (for example, in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  18. Parallel Computational Fluid Dynamics 2007 : Implementations and Experiences on Large Scale and Grid Computing

    CERN Document Server

    2009-01-01

At the 19th Annual Conference on Parallel Computational Fluid Dynamics held in Antalya, Turkey, in May 2007, the most recent developments and implementations of large-scale and grid computing were presented. This book, comprising the invited and selected papers of this conference, details those advances, which are of particular interest to CFD and CFD-related communities. It also offers results related to applications of various scientific and engineering problems involving flows and flow-related topics. Intended for CFD researchers and graduate students, this book is a state-of-the-art presentation of the relevant methodology and implementation techniques of large-scale computing.

  19. Body mass index and its association with lumbar disc herniation and sciatica: a large-scale, population-based study

    OpenAIRE

    Samartzis, D; Karppinen, JI; Luk, KDK; Cheung, KMC

    2014-01-01

INTRODUCTION: This large-scale study addressed the association of body mass index (BMI), especially overweight/obesity, with lumbar disc herniation, its global lumbar involvement, and its implications for sciatica, little of which is ...

  20. Large Scale Computing and Storage Requirements for Nuclear Physics Research

    Energy Technology Data Exchange (ETDEWEB)

    Gerber, Richard A.; Wasserman, Harvey J.

    2012-03-02

The National Energy Research Scientific Computing Center (NERSC) is the primary computing center for the DOE Office of Science, serving approximately 4,000 users and hosting some 550 projects that involve nearly 700 codes for a wide variety of scientific disciplines. In addition to large-scale computing resources, NERSC provides critical staff support and expertise to help scientists make the most efficient use of these resources to advance the scientific mission of the Office of Science. In May 2011, NERSC, DOE’s Office of Advanced Scientific Computing Research (ASCR) and DOE’s Office of Nuclear Physics (NP) held a workshop to characterize HPC requirements for NP research over the next three to five years. The effort is part of NERSC’s continuing involvement in anticipating future user needs and deploying necessary resources to meet these demands. The workshop revealed several key requirements, in addition to achieving its goal of characterizing NP computing. The key requirements include: 1. Larger allocations of computational resources at NERSC; 2. Visualization and analytics support; and 3. Support at NERSC for the unique needs of experimental nuclear physicists. This report expands upon these key points and adds others. The results are based upon representative samples, called “case studies,” of the needs of science teams within NP. The case studies were prepared by NP workshop participants and contain a summary of science goals, methods of solution, current and future computing requirements, and special software and support needs. Participants were also asked to describe their strategy for computing in the highly parallel, “multi-core” environment that is expected to dominate HPC architectures over the next few years. The report also includes a section with NERSC responses to the workshop findings. NERSC has many initiatives already underway that address key workshop findings and all of the action items are aligned with NERSC strategic plans.

  1. The need and the prospects for developing large-scale green corridors to protect biodiversity

    Directory of Open Access Journals (Sweden)

    Shaojie Mu

    2014-03-01

Full Text Available Habitat fragmentation can decrease species numbers and mobility and increase mortality rates. This process is widely recognized as the primary driver of biodiversity loss and the primary cause of species extinction. Green corridors provide a natural link between large areas of natural landscape. These corridors facilitate the movement and migration of wildlife. Several foreign environmental conservation groups have been aware of the positive impacts of large-scale green corridors on the protection of biodiversity and landscape connectivity, and a number of attempts have been made to pursue this goal. In China, the Three-North Shelterbelt Forest project, launched in the 1970s, established a large-scale ecological barrier system across much of the arid region and is an example of a large-scale green corridor. At present, the applications of green corridors in China primarily involve urban greening. These corridors are generally planned and implemented on a small, local scale without adequate consideration for ecological integrity and connectivity. This article introduces the primary mission and development history of the world’s principal large-scale corridors. The progress of development and the practice of planning for green corridors in China are topics of concern, and the shortage of green corridors in the country has been recognized. Based on the current status of the Nature Reserve System, Urban Corridor Network and specific natural landscapes in China, a feasible development scheme for large-scale green corridors in China has been proposed.

  2. A global review of large-scale experimental manipulations of streamflow

    Science.gov (United States)

    Konrad, C. P.; Olden, J. D.

    2010-12-01

    Ecological objectives have been integrated increasingly into the operation of water resources systems including dams, diversions, and pumps. In some cases, streamflow has been manipulated as a large-scale experiment aimed at both learning and improving river, floodplain, and estuarine resources. Our working group at the National Center for Ecological Analysis and Synthesis has conducted a global review of large-scale flow experiments to assess how these efforts can be effective both scientifically and socially. Rivers, floodplains, and estuaries are particularly challenging subjects for large-scale experiments because they are open systems with strong longitudinal connectivity and heterogeneity. These systems rarely can be isolated from their social context, thus stakeholders and water managers must be involved in experimental planning and implementation. Large-scale flow experiments have been effective for learning and informing management decisions when 1) treatments can be applied discretely - separate from other management interventions - and repeated, 2) directly measured responses have mechanistic links to broader resource objectives, and 3) there is institutional support for investigations and adoption of new information into operating policies. In focused applications with a clear nexus to decision-making, large-scale flow experiments complement longer term monitoring of aquatic ecosystems and evaluation of operations in adaptive management of water resources.

  3. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made
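
As standard background (not taken from the report itself), flow through a single discrete fracture is commonly idealized by the parallel-plate "cubic law", under which the volumetric flow rate scales with the cube of the fracture aperture:

```latex
Q \;=\; -\,\frac{w\,a^{3}}{12\,\mu}\,\frac{\mathrm{d}P}{\mathrm{d}x}
```

where $a$ is the fracture aperture, $w$ the fracture width, $\mu$ the fluid viscosity and $\mathrm{d}P/\mathrm{d}x$ the pressure gradient; summing such contributions over a fracture network is what underlies estimates of large-scale permeability and of the size of the representative elementary volume.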

  4. Large-scale-vortex dynamos in planar rotating convection

    CERN Document Server

    Guervilly, Céline; Jones, Chris A

    2016-01-01

    Several recent studies have demonstrated how large-scale vortices may arise spontaneously in rotating planar convection. Here we examine the dynamo properties of such flows in rotating Boussinesq convection. For moderate values of the magnetic Reynolds number ($100 \\lesssim Rm \\lesssim 550$, with $Rm$ based on the box depth and the convective velocity), a large-scale (i.e. system-size) magnetic field is generated. The amplitude of the magnetic energy oscillates in time, out of phase with the oscillating amplitude of the large-scale vortex. The dynamo mechanism relies on those components of the flow that have length scales lying between that of the large-scale vortex and the typical convective cell size; smaller-scale flows are not required. The large-scale vortex plays a crucial role in the magnetic induction despite being essentially two-dimensional. For larger magnetic Reynolds numbers, the dynamo is small scale, with a magnetic energy spectrum that peaks at the scale of the convective cells. In this case, ...

  5. Unsaturated Hydraulic Conductivity for Evaporation in Large scale Heterogeneous Soils

    Science.gov (United States)

    Sun, D.; Zhu, J.

    2014-12-01

In this study we aim to provide practical guidelines on how the commonly used simple averaging schemes (arithmetic, geometric, or harmonic mean) perform in simulating large-scale evaporation in a heterogeneous landscape. Previous studies on hydraulic property upscaling, focusing on steady-state flux exchanges, illustrated that an effective hydraulic property is usually more difficult to define for evaporation. This study focuses on upscaling hydraulic properties of large-scale transient evaporation dynamics using the idea of the stream-tube approach. Specifically, the two main objectives are: (1) to determine whether the three simple averaging schemes (i.e., arithmetic, geometric and harmonic means) of hydraulic parameters are appropriate for representing large-scale evaporation processes, and (2) to examine how the applicability of these schemes depends on the time scale of evaporation processes in heterogeneous soils. Multiple realizations of local evaporation processes are carried out using the HYDRUS-1D computational code (Simunek et al., 1998). The three averaging schemes of soil hydraulic parameters are used to simulate the cumulative flux exchange, which is then compared with the large-scale average cumulative flux. The sensitivity of the relative errors to the time frame of evaporation processes is also discussed.
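
The three averaging schemes are elementary to compute, and their ordering (arithmetic >= geometric >= harmonic) holds for any positive conductivity field; a minimal Python sketch on a synthetic lognormal field (illustrative values only):

```python
import numpy as np

rng = np.random.default_rng(4)
K = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # heterogeneous conductivities

arithmetic = K.mean()                   # upper bound; exact for flow along layers
geometric = np.exp(np.log(K).mean())    # classic choice for lognormal media
harmonic = 1.0 / np.mean(1.0 / K)       # lower bound; exact for flow across layers

print(f"arithmetic {arithmetic:.3f} >= geometric {geometric:.3f} "
      f">= harmonic {harmonic:.3f}")
```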

  6. Combining p-values in large scale genomics experiments

    Science.gov (United States)

    Zaykin, Dmitri V.; Zhivotovsky, Lev A.; Czika, Wendy; Shao, Susan; Wolfinger, Russell D.

    2008-01-01

In large-scale genomics experiments involving thousands of statistical tests, such as association scans and microarray expression experiments, a key question is: Which of the L tests represent true associations (TAs)? The traditional way to control false findings is via individual adjustments. In the presence of multiple TAs, p-value combination methods offer certain advantages. Both Fisher’s and Lancaster’s combination methods use an inverse gamma transformation. We identify the relation of the shape parameter of that distribution to the implicit threshold value; p-values below that threshold are favored by the inverse gamma method (GM). We explore this feature to improve power over Fisher’s method when L is large and the number of TAs is moderate. However, the improvement in power provided by combination methods is at the expense of a weaker claim made upon rejection of the null hypothesis – that there are some TAs among the L tests. Thus, GM remains a global test. To allow a stronger claim about a subset of p-values that is smaller than L, we investigate two methods with an explicit truncation: the rank truncated product method (RTP) that combines the first K ordered p-values, and the truncated product method (TPM) that combines p-values that are smaller than a specified threshold. We conclude that TPM allows claims to be made about subsets of p-values, while the claim of the RTP is, like GM, more appropriately about all L tests. GM gives somewhat higher power than TPM, RTP, Fisher, and Simes methods across a range of simulations. PMID:17879330
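
A hedged Python sketch of two of the methods discussed: Fisher's combination, whose null distribution is exactly chi-square with 2L degrees of freedom, and a truncated product method calibrated here by Monte Carlo (one standard way to obtain its null distribution; the threshold and simulation size are illustrative):

```python
import numpy as np
from scipy import stats

def fisher(pvals):
    """Fisher's method: -2*sum(log p) ~ chi-square with 2L dof under the null."""
    stat = -2.0 * np.sum(np.log(pvals))
    return stats.chi2.sf(stat, df=2 * len(pvals))

def tpm(pvals, tau=0.05, n_sim=20_000, seed=0):
    """Truncated product method: combine only p-values at or below tau."""
    rng = np.random.default_rng(seed)
    L = len(pvals)
    def log_w(p):                       # log product of the p-values <= tau
        small = p[p <= tau]
        return np.sum(np.log(small)) if small.size else 0.0
    observed = log_w(np.asarray(pvals))
    null = np.array([log_w(rng.uniform(size=L)) for _ in range(n_sim)])
    return np.mean(null <= observed)    # smaller log-product is more extreme

pvals = [0.001, 0.02, 0.4, 0.7, 0.03]
print(fisher(pvals), tpm(pvals))
```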

  8. Physiological and technological aspects of large-scale heterologous-protein production with yeasts.

    Science.gov (United States)

    Hensing, M C; Rouwenhorst, R J; Heijnen, J J; van Dijken, J P; Pronk, J T

    1995-01-01

    Commercial production of heterologous proteins by yeasts has gained considerable interest. Expression systems have been developed for Saccharomyces cerevisiae and a number of other yeasts. Generally, much attention is paid to the molecular aspects of heterologous-gene expression. The success of this approach is indicated by the high expression levels that have been obtained in shake-flask cultures. For large-scale production however, possibilities and restrictions related to host-strain physiology and fermentation technology also have to be considered. In this review, these physiological and technological aspects have been evaluated with the aid of numerical simulations. Factors that affect the choice of a carbon substrate for large-scale production involve price, purity and solubility. Since oxygen demand and heat production (which are closely linked) limit the attainable growth rate in large-scale processes, the biomass yield on oxygen is also a key parameter. Large-scale processes impose restrictions on the expression system. Many promoter systems that work well in small-scale systems cannot be implemented in industrial environments. Furthermore, large-scale fed-batch fermentations involve a substantial number of generations. Therefore, even low expression-cassette instability has a profound effect on the overall productivity of the system. Multicopy-integration systems may provide highly stable expression systems for industrial processes. Large-scale fed-batch processes are typically performed at a low growth rate. Therefore, effects of a low growth rate on the physiology and product formation rates of yeasts are of key importance. Due to the low growth rates in the industrial process, a substantial part of the substrate carbon is expended to meet maintenance-energy requirements. Factors that reduce maintenance-energy requirements will therefore have a positive effect on product yield. The relationship between specific growth rate and specific product formation

  9. Toward Improved Support for Loosely Coupled Large Scale Simulation Workflows

    Energy Technology Data Exchange (ETDEWEB)

    Boehm, Swen [ORNL; Elwasif, Wael R [ORNL; Naughton, III, Thomas J [ORNL; Vallee, Geoffroy R [ORNL

    2014-01-01

High-performance computing (HPC) workloads are increasingly leveraging loosely coupled large scale simulations. Unfortunately, most large-scale HPC platforms, including Cray/ALPS environments, are designed for the execution of long-running jobs based on coarse-grained launch capabilities (e.g., one MPI rank per core on all allocated compute nodes). This assumption limits capability-class workload campaigns that require large numbers of discrete or loosely coupled simulations, and where time-to-solution is an untenable pacing issue. This paper describes the challenges related to the support of fine-grained launch capabilities that are necessary for the execution of loosely coupled large scale simulations on Cray/ALPS platforms. More precisely, we present the details of an enhanced runtime system to support this use case, and report on initial results from early testing on systems at Oak Ridge National Laboratory.
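
As a generic illustration of the fine-grained pattern (a stand-in, not the enhanced Cray/ALPS runtime described above): farm many short, independent simulation cases over worker processes within a single allocation, rather than launching one coarse-grained job per case.

```python
import multiprocessing as mp
import subprocess

def run_case(case_id: int) -> str:
    # Each task would normally invoke a simulation binary; 'echo' is a placeholder.
    result = subprocess.run(["echo", f"case-{case_id}"],
                            capture_output=True, text=True)
    return result.stdout.strip()

if __name__ == "__main__":
    with mp.Pool(processes=8) as pool:             # fine-grained workers
        outputs = pool.map(run_case, range(1000))  # 1000 loosely coupled cases
    print(len(outputs), "cases completed")
```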

  10. Large Scale Anomalies of the Cosmic Microwave Background with Planck

    DEFF Research Database (Denmark)

    Frejsel, Anne Mette

This thesis focuses on the large scale anomalies of the Cosmic Microwave Background (CMB) and their possible origins. The investigations consist of two main parts. The first part is on statistical tests of the CMB, and the consistency of both maps and power spectrum. We find that the Planck data is very consistent, while the WMAP 9 year release appears more contaminated by non-CMB residuals than the 7 year release. The second part is concerned with the anomalies of the CMB from two approaches. One is based on an extended inflationary model as the origin of one specific large scale anomaly, namely … Here we find evidence that the Planck CMB maps contain residual radiation in the loop areas, which can be linked to some of the large scale CMB anomalies: the point-parity asymmetry, the alignment of quadrupole and octupole, and the dipole modulation.

  11. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  12. Coupling between convection and large-scale circulation

    Science.gov (United States)

    Becker, T.; Stevens, B. B.; Hohenegger, C.

    2014-12-01

The ultimate drivers of convection - radiation, tropospheric humidity and surface fluxes - are altered both by the large-scale circulation and by convection itself. A quantity to which all drivers of convection contribute is moist static energy (or, in its normalized budget form, gross moist stability). Therefore, a variance analysis of the moist static energy budget in radiative-convective equilibrium helps in understanding the interaction of precipitating convection and the large-scale environment. In addition, this method provides insights concerning the impact of convective aggregation on this coupling. As a starting point, the interaction is analyzed with a general circulation model, but a model intercomparison study using a hierarchy of models is planned. Effective coupling parameters will be derived from cloud resolving models and these will in turn be related to assumptions used to parameterize convection in large-scale models.
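
For reference, the moist static energy whose budget variance is analyzed is the standard quantity (notation assumed here, not quoted from the abstract):

```latex
h \;=\; c_p T \,+\, g z \,+\, L_v q
```

with temperature $T$, height $z$, specific humidity $q$, specific heat $c_p$, gravitational acceleration $g$, and latent heat of vaporization $L_v$; radiation, surface fluxes and tropospheric humidity all enter convection through this single budget.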

  13. Large Scale Magnetohydrodynamic Dynamos from Cylindrical Differentially Rotating Flows

    CERN Document Server

    Ebrahimi, F

    2015-01-01

    For cylindrical differentially rotating plasmas threaded with a uniform vertical magnetic field, we study large-scale magnetic field generation from finite amplitude perturbations using analytic theory and direct numerical simulations. Analytically, we impose helical fluctuations, a seed field, and a background flow and use quasi-linear theory for a single mode. The predicted large-scale field growth agrees with numerical simulations in which the magnetorotational instability (MRI) arises naturally. The vertically and azimuthally averaged toroidal field is generated by a fluctuation-induced EMF that depends on differential rotation. Given fluctuations, the method also predicts large-scale field growth for MRI-stable rotation profiles and flows with no rotation but shear.

  14. Large-scale current systems in the dayside Venus ionosphere

    Science.gov (United States)

    Luhmann, J. G.; Elphic, R. C.; Brace, L. H.

    1981-01-01

    The occasional observation of large-scale horizontal magnetic fields within the dayside ionosphere of Venus by the flux gate magnetometer on the Pioneer Venus orbiter suggests the presence of large-scale current systems. Using the measured altitude profiles of the magnetic field and the electron density and temperature, together with the previously reported neutral atmosphere density and composition, it is found that the local ionosphere can be described at these times by a simple steady state model which treats the unobserved quantities, such as the electric field, as parameters. When the model is appropriate, the altitude profiles of the ion and electron velocities and the currents along the satellite trajectory can be inferred. These results elucidate the configurations and sources of the ionospheric current systems which produce the observed large-scale magnetic fields, and in particular illustrate the effect of ion-neutral coupling in the determination of the current system at low altitudes.

  15. Policy Writing as Dialogue: Drafting an Aboriginal Chapter for Canada's Tri-Council Policy Statement: Ethical Conduct for Research Involving Humans

    Directory of Open Access Journals (Sweden)

    Jeff Reading

    2010-07-01

    Full Text Available Writing policy that applies to First Nations, Inuit and Métis peoples in Canada has become more interactive as communities and their representative organizations press for practical recognition of an Aboriginal right of self-determination. When the policy in development is aimed at supporting “respect for human dignity” as it is in the case of ethics of research involving humans, the necessity of engaging the affected population becomes central to the undertaking.

  16. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    Science.gov (United States)

    Blackman, Eric G.

    2015-05-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. Here I discuss how magnetic helicity has come to help us understand the saturation of and sustenance of large scale dynamos, the need for either local or global helicity fluxes to avoid dynamo quenching, and the associated observational consequences. I also discuss how magnetic helicity acts as a hindrance to turbulent diffusion of large scale fields, and thus a helper for fossil remnant large scale field origin models in some contexts. I briefly discuss the connection between large scale fields and accretion disk theory as well. The goal here is to provide a conceptual primer to help the reader efficiently penetrate the literature.
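
For reference, the standard definition underlying the discussion: the magnetic helicity contained in a volume $V$ is

```latex
H_M \;=\; \int_V \mathbf{A}\cdot\mathbf{B}\;\mathrm{d}V, \qquad \mathbf{B} = \nabla\times\mathbf{A}
```

and it is the near-conservation of $H_M$, together with its relaxation to the largest available scale at fixed helicity, that gives properties (1) and (2) above.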

  17. Large-scale ER-damper for seismic protection

    Science.gov (United States)

    McMahon, Scott; Makris, Nicos

    1997-05-01

    A large scale electrorheological (ER) damper has been designed, constructed, and tested. The damper consists of a main cylinder and a piston rod that pushes an ER-fluid through a number of stationary annular ducts. This damper is a scaled- up version of a prototype ER-damper which has been developed and extensively studied in the past. In this paper, results from comprehensive testing of the large-scale damper are presented, and the proposed theory developed for predicting the damper response is validated.

  18. Large-Scale Inverse Problems and Quantification of Uncertainty

    CERN Document Server

    Biegler, Lorenz; Ghattas, Omar

    2010-01-01

    Large-scale inverse problems and associated uncertainty quantification has become an important area of research, central to a wide range of science and engineering applications. Written by leading experts in the field, Large-scale Inverse Problems and Quantification of Uncertainty focuses on the computational methods used to analyze and simulate inverse problems. The text provides PhD students, researchers, advanced undergraduate students, and engineering practitioners with the perspectives of researchers in areas of inverse problems and data assimilation, ranging from statistics and large-sca

  19. The fractal octahedron network of the large scale structure

    CERN Document Server

    Battaner, E

    1998-01-01

    In a previous article, we have proposed that the large scale structure network generated by large scale magnetic fields could consist of a network of octahedra only contacting at their vertexes. Assuming such a network could arise at different scales producing a fractal geometry, we study here its properties, and in particular how a sub-octahedron network can be inserted within an octahedron of the large network. We deduce that the scale of the fractal structure would range from $\\approx$100 Mpc, i.e. the scale of the deepest surveys, down to about 10 Mpc, as other smaller scale magnetic fields were probably destroyed in the radiation dominated Universe.

  20. Participatory Design of Large-Scale Information Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

In this article we discuss how to engage in large-scale information systems development by applying a participatory design (PD) approach that acknowledges the unique situated work practices conducted by the domain experts of modern organizations. We reconstruct the iterative prototyping approach into a PD process model that (1) emphasizes PD experiments as transcending traditional prototyping by evaluating fully integrated systems exposed to real work practices; (2) incorporates improvisational change management including anticipated, emergent, and opportunity-based change; and (3) extends … We discuss three challenges to address when dealing with large-scale systems development.

  1. Distributed chaos tuned to large scale coherent motions in turbulence

    CERN Document Server

    Bershadskii, A

    2016-01-01

It is shown, using direct numerical simulation and laboratory experiment data, that distributed chaos is often tuned to large scale coherent motions in anisotropic inhomogeneous turbulence. The examples considered are: fully developed turbulent boundary layer (range of coherence: $14 < y^{+} < 80$), turbulent thermal convection (in a horizontal cylinder), and Couette-Taylor flow. Two ways of tuning are described: one via fundamental frequency (wavenumber) and another via subharmonic (period doubling). In the second case, the large scale coherent motions are a natural component of distributed chaos. In all considered cases spontaneous breaking of space translational symmetry is accompanied by reflexional symmetry breaking.

  2. The CLASSgal code for Relativistic Cosmological Large Scale Structure

    CERN Document Server

    Di Dio, Enea; Lesgourgues, Julien; Durrer, Ruth

    2013-01-01

    We present some accurate and efficient computations of large scale structure observables, obtained with a modified version of the CLASS code which is made publicly available. This code includes all relativistic corrections and computes both the power spectrum Cl(z1,z2) and the corresponding correlation function xi(theta,z1,z2) in linear perturbation theory. For Gaussian initial perturbations, these quantities contain the full information encoded in the large scale matter distribution at the level of linear perturbation theory. We illustrate the usefulness of our code for cosmological parameter estimation through a few simple examples.
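
For orientation, the two linear-theory quantities the code outputs are related by the standard Legendre expansion (a general relation, not a CLASSgal-specific formula):

```latex
\xi(\theta, z_1, z_2) \;=\; \sum_{\ell} \frac{2\ell+1}{4\pi}\, C_\ell(z_1, z_2)\, P_\ell(\cos\theta)
```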

  3. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela;

    2014-01-01

Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from the intermittent wind in large proportion may impact the control of the power system balance, and the resulting deviations in the power system need to be analysed with improved analytical tools and techniques. This paper proposes techniques for active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with a reduced planning horizon and the real-time imbalances are minimized with the automatic generation controller, which is programmed to regulate active power reserves.

  4. Large-Scale Structure Observables in General Relativity

    CERN Document Server

    Jeong, Donghui

    2014-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider i) redshift perturbation of cosmic clock events; ii) distortion of cosmic rulers, including weak lensing shear and magnification; iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann-Robertson-Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order.

  5. Report of the LASCAR forum: Large scale reprocessing plant safeguards

    International Nuclear Information System (INIS)

    This report has been prepared to provide information on the studies which were carried out from 1988 to 1992 under the auspices of the multinational forum known as Large Scale Reprocessing Plant Safeguards (LASCAR) on safeguards for four large scale reprocessing plants operated or planned to be operated in the 1990s. The report summarizes all of the essential results of these studies. The participants in LASCAR were from France, Germany, Japan, the United Kingdom, the United States of America, the Commission of the European Communities - Euratom, and the International Atomic Energy Agency

  6. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Morteza Hajizadeh-Oghaz; Reza Shoja Razavi; Mohammadreza Loghman Estarki

    2014-08-01

    Yttria-stabilized zirconia nanopowders were synthesized on a relatively large scale using the Pechini method. In the present paper, nearly spherical yttria-stabilized zirconia nanopowders with tetragonal structure were synthesized by the Pechini process from zirconium oxynitrate hexahydrate, yttrium nitrate, citric acid and ethylene glycol. The phase and structural analyses were accomplished by X-ray diffraction; morphological analysis was carried out by field emission scanning electron microscopy and transmission electron microscopy. The results revealed nearly spherical yttria-stabilized zirconia powder with tetragonal crystal structure and a chemical purity of 99.1%, as determined by inductively coupled plasma optical emission spectroscopy, on a large scale.

  7. Generation Expansion Planning Considering Integrating Large-scale Wind Generation

    DEFF Research Database (Denmark)

    Zhang, Chunyu; Ding, Yi; Østergaard, Jacob;

    2013-01-01

    Generation expansion planning (GEP) is the problem of finding the optimal strategy to plan the construction of new generation while satisfying technical and economical constraints. In the deregulated and competitive environment, large-scale integration of wind generation (WG) in power systems has...... necessitated the inclusion of more innovative and sophisticated approaches in power system investment planning. A bi-level generation expansion planning approach considering large-scale wind generation is proposed in this paper. The first phase is investment decision, while the second phase is production...

  8. Clearing and Labeling Techniques for Large-Scale Biological Tissues.

    Science.gov (United States)

    Seo, Jinyoung; Choe, Minjin; Kim, Sung-Yon

    2016-06-30

    Clearing and labeling techniques for large-scale biological tissues enable simultaneous extraction of molecular and structural information with minimal disassembly of the sample, facilitating the integration of molecular, cellular and systems biology across different scales. Recent years have witnessed an explosive increase in the number of such methods and their applications, reflecting heightened interest in organ-wide clearing and labeling across many fields of biology and medicine. In this review, we provide an overview and comparison of existing clearing and labeling techniques and discuss challenges and opportunities in the investigations of large-scale biological systems. PMID:27239813

  9. Reactor vessel integrity analysis based upon large scale test results

    International Nuclear Information System (INIS)

    The fracture mechanics analysis of a nuclear reactor pressure vessel is discussed to illustrate the impact of knowledge gained by large scale testing on the demonstration of the integrity of such a vessel. The analysis must be able to predict crack initiation, arrest and reinitiation. The basis for the capability to make each prediction, including the large scale test information which is judged appropriate, is identified and the confidence in the applicability of the experimental data to a vessel is discussed. Where there is inadequate data to make a prediction with confidence or where there are apparently conflicting data, recommendations for future testing are presented. 15 refs., 6 figs., 1 tab.

  10. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising...... with high wind power penetration. This paper presents a review of the electricity storage technologies relevant for large power systems. The paper also presents an estimation of the economic feasibility of electricity storage using the west Danish power market area as a case....

  11. Practical Large Scale Syntheses of New Drug Candidates

    Institute of Scientific and Technical Information of China (English)

    Hui-Yin Li

    2001-01-01

    This presentation will focus on practical large scale syntheses of lead compounds and drug candidates from three major therapeutic areas from the DuPont Pharmaceuticals Research Laboratory: 1) DMP777, a selective, non-toxic, orally active human elastase inhibitor; 2) DMP754, a potent glycoprotein IIb/IIIa antagonist; 3) R-warfarin, the pure enantiomeric form of warfarin. The key technology used for preparing these drug candidates is asymmetric hydrogenation under very mild reaction conditions, which produced very high quality final products at large scale (>99% de, >99 A% and >99 wt%). Some practical and GMP aspects of process development will also be discussed.

  13. Reliability Evaluation considering Structures of a Large Scale Wind Farm

    OpenAIRE

    Shin, Je-Seok; Cha, Seung-Tae; Wu, Qiuwei; Kim, Jin-O

    2012-01-01

    Wind energy is one of the most widely used renewable energy resources. Wind power has been connected to the grid as large scale wind farms, which are made up of dozens of wind turbines, and the scale of wind farms has increased recently. Due to the intermittent and variable wind source, reliability evaluation of wind farms is necessarily required. Also, because a large scale offshore wind farm has a long repair time and a high repair cost as well as a high investment cost, it is essential to take into...

  14. Fast paths in large-scale dynamic road networks

    CERN Document Server

    Nannicini, Giacomo; Barbier, Gilles; Krob, Daniel; Liberti, Leo

    2007-01-01

    Efficiently computing fast paths in large scale dynamic road networks (where dynamic traffic information is known over a part of the network) is a practical problem faced by several traffic information service providers who wish to offer a realistic fast path computation to GPS terminal enabled vehicles. The heuristic solution method we propose is based on a highway hierarchy-based shortest path algorithm for static large-scale networks; we maintain a static highway hierarchy and perform each query on the dynamically evaluated network.
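
    A minimal sketch of the underlying query problem (illustrative only, not the authors' highway-hierarchy method): a time-dependent Dijkstra search over a network where some edges carry dynamic traffic information and the rest keep static costs. The graph layout and cost functions below are invented for illustration, and correctness assumes the usual FIFO property of the edge cost functions.

      import heapq

      def fastest_path(graph, source, target, t0):
          """graph[u] -> list of (v, cost_fn); cost_fn(t) is the travel time
          when entering the edge at time t (a constant lambda for static edges)."""
          best = {source: t0}
          heap = [(t0, source)]
          while heap:
              t, u = heapq.heappop(heap)
              if u == target:
                  return t - t0                  # total travel time
              if t > best.get(u, float("inf")):
                  continue                       # stale queue entry
              for v, cost_fn in graph.get(u, []):
                  arrival = t + cost_fn(t)
                  if arrival < best.get(v, float("inf")):
                      best[v] = arrival
                      heapq.heappush(heap, (arrival, v))
          return None

      # Toy network: edge A->B is congested early on (congestion clears over
      # time); the detour A->C->B has static costs.
      graph = {
          "A": [("B", lambda t: max(10.0, 30.0 - t / 3)), ("C", lambda t: 8.0)],
          "C": [("B", lambda t: 8.0)],
      }
      print(fastest_path(graph, "A", "B", t0=0))   # -> 16.0 (the detour wins at t0=0)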

  15. Large Scale Meteorological Pattern of Extreme Rainfall in Indonesia

    Science.gov (United States)

    Kuswanto, Heri; Grotjahn, Richard; Rachmi, Arinda; Suhermi, Novri; Oktania, Erma; Wijaya, Yosep

    2014-05-01

    dates involving observations from multiple sites (rain gauges). The approach combines the POT (Peaks Over Threshold) with 'declustering' of the data to approximate independence based on the autocorrelation structure of each rainfall series. The cross correlation among sites is also considered in developing the event criteria, yielding a rational choice of the extreme dates given the 'spotty' nature of the intense convection. Based on the identified dates, we are developing a supporting tool for forecasting extreme rainfall based on the corresponding large-scale meteorological patterns (LSMPs). The LSMPs methodology focuses on the larger-scale patterns that the models are better able to forecast, as those larger-scale patterns create the conditions fostering the local EWE. A bootstrap resampling method is applied to highlight the key features that are statistically significant for the extreme events. Grotjahn, R., and G. Faure. 2008: Composite Predictor Maps of Extraordinary Weather Events in the Sacramento California Region. Weather and Forecasting. 23: 313-335.
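
    A hedged sketch of the POT-plus-declustering step described above, on synthetic data: daily totals above a high quantile are taken as exceedances, and nearby exceedances are merged into one event. The fixed minimum separation below is an assumption; the study derives it from the autocorrelation structure of each series.

      import numpy as np

      def pot_decluster(rain, q=0.99, min_sep=3):
          """Return indices of declustered peaks over the q-quantile threshold."""
          threshold = np.quantile(rain, q)
          exceed = np.flatnonzero(rain > threshold)
          peaks = []
          for i in exceed:
              if peaks and i - peaks[-1] < min_sep:
                  if rain[i] > rain[peaks[-1]]:
                      peaks[-1] = i              # same cluster: keep the larger day
              else:
                  peaks.append(i)
          return np.array(peaks), threshold

      rng = np.random.default_rng(0)
      daily_rain = rng.gamma(shape=0.5, scale=8.0, size=3650)   # synthetic series
      idx, thr = pot_decluster(daily_rain)
      print(f"{len(idx)} extreme dates above {thr:.1f} mm")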

  16. Large-Scale Spray Releases: Initial Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Schonewill, Philip P.; Gauglitz, Phillip A.; Bontha, Jagannadha R.; Daniel, Richard C.; Kurath, Dean E.; Adkins, Harold E.; Billing, Justin M.; Burns, Carolyn A.; Davis, James M.; Enderlin, Carl W.; Fischer, Christopher M.; Jenks, Jeromy WJ; Lukins, Craig D.; MacFarlan, Paul J.; Shutthanandan, Janani I.; Smith, Dennese M.

    2012-12-01

    One of the events postulated in the hazard analysis at the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids with Newtonian fluid behavior. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and across processing facilities in the DOE complex. Two key technical areas were identified where testing results were needed to improve the technical basis by reducing the uncertainty due to extrapolating existing literature results. The first technical need was to quantify the role of slurry particles in small breaches where the slurry particles may plug and result in substantially reduced, or even negligible, respirable fraction formed by high-pressure sprays. The second technical need was to determine the aerosol droplet size distribution and volume from prototypic breaches and fluids, specifically including sprays from larger breaches with slurries where data from the literature are scarce. To address these technical areas, small- and large-scale test stands were constructed and operated with simulants to determine aerosol release fractions and generation rates from a range of breach sizes and geometries. The properties of the simulants represented the range of properties expected in the WTP process streams and included water, sodium salt solutions, slurries containing boehmite or gibbsite, and a hazardous chemical simulant. The effect of anti-foam agents was assessed with most of the simulants. Orifices included round holes and

  17. Main Achievements of Cotton Large-scale Transformation System

    Institute of Scientific and Technical Information of China (English)

    LI Fu-guang; LIU Chuan-liang; WU Zhi-xia; ZHANG Chao-jun; ZHANG Xue-yan

    2008-01-01

    A cotton large-scale transformation methods system was established based on innovation of cotton transformation methods. It obtains 8000 transgenic cotton plants per year by efficiently combining Agrobacterium tumefaciens-mediated, pollen-tube pathway and biolistic methods. More than 1000 transgenic lines are selected from the transgenic plants with molecular-assisted breeding and conventional breeding methods.

  19. Large-Scale Machine Learning for Classification and Search

    Science.gov (United States)

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  20. Flexibility in design of large-scale methanol plants

    Institute of Scientific and Technical Information of China (English)

    Esben Lauge Sørensen; Helge Holm-Larsen; Haldor Topsøe A/S

    2006-01-01

    This paper presents a cost effective design for large-scale methanol production. It is demonstrated how recent technological progress can be utilised to design a methanol plant, which is inexpensive and easy to operate, while at the same time very robust towards variations in feed-stock composition and product specifications.

  1. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  2. Large-scale search for dark-matter axions

    Energy Technology Data Exchange (ETDEWEB)

    Kinion, D; van Bibber, K

    2000-08-30

    We review the status of two ongoing large-scale searches for axions which may constitute the dark matter of our Milky Way halo. The experiments are based on the microwave cavity technique proposed by Sikivie, and mark a ''second generation'' relative to the original experiments performed by the Rochester-Brookhaven-Fermilab collaboration and the University of Florida group.

  3. International Large-Scale Assessments: What Uses, What Consequences?

    Science.gov (United States)

    Johansson, Stefan

    2016-01-01

    Background: International large-scale assessments (ILSAs) are a much-debated phenomenon in education. Increasingly, their outcomes attract considerable media attention and influence educational policies in many jurisdictions worldwide. The relevance, uses and consequences of these assessments are often the focus of research scrutiny. Whilst some…

  4. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  5. Large-scale V/STOL testing. [in wind tunnels

    Science.gov (United States)

    Koenig, D. G.; Aiken, T. N.; Aoyagi, K.; Falarski, M. D.

    1977-01-01

    Several facets of large-scale testing of V/STOL aircraft configurations are discussed with particular emphasis on test experience in the Ames 40- by 80-foot wind tunnel. Examples of powered-lift test programs are presented in order to illustrate tradeoffs confronting the planner of V/STOL test programs. It is indicated that large-scale V/STOL wind-tunnel testing can sometimes compete with small-scale testing in the effort required (overall test time) and program costs because of the possibility of conducting a number of different tests with a single large-scale model where several small-scale models would be required. The benefits of both high- and full-scale Reynolds numbers, more detailed configuration simulation, and number and type of onboard measurements increase rapidly with scale. Planning must be more detailed at large scale in order to balance the trade-offs between the increased costs, as number of measurements and model configuration variables increase and the benefits of larger amounts of information coming out of one test.

  6. Evaluating Large-scale National Public Management Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Morten Balle; Hansen, Hanne Foss

    This article explores differences and similarities between two evaluations of large-scale administrative reforms which were carried out in the 2000s: The evaluation of the Norwegian NAV reform (EVANAV) and the evaluation of the Danish Local Government Reform (LGR). We provide a comparative analys...

  7. The large scale microwave background anisotropy in decaying particle cosmology

    International Nuclear Information System (INIS)

    We investigate the large-scale anisotropy of the microwave background radiation in cosmological models with decaying particles. The observed value of the quadrupole moment combined with other constraints gives an upper limit on the redshift of the decay z_d < 3-5. 12 refs., 2 figs.

  8. How large-scale subsidence affects stratocumulus transitions (discussion paper)

    NARCIS (Netherlands)

    Van der Dussen, J.J.; De Roode, S.R.; Siebesma, A.P.

    2015-01-01

    Some climate modeling results suggest that the Hadley circulation might weaken in a future climate, causing a subsequent reduction in the large-scale subsidence velocity in the subtropics. In this study we analyze the cloud liquid water path (LWP) budget from large-eddy simulation (LES) results of t

  9. Measuring Large Scale Space Perception in Literary Texts

    OpenAIRE

    Rossi, Paolo

    2006-01-01

    The center and radius of perception associated with a written text are defined, and algorithms for their computation are presented. Indicators for anisotropy in large scale space perception are introduced. The relevance of these notions for the analysis of literary and historical records is briefly discussed and illustrated with an example taken from medieval historiography.

  10. Electric vehicles and large-scale integration of wind power

    DEFF Research Database (Denmark)

    Liu, Wen; Hu, Weihao; Lund, Henrik;

    2013-01-01

    Renewable energy is one of the possible solutions when addressing climate change. Today, large-scale renewable energy integration needs to include the experience to balance the discrepancy between electricity demand and supply. The electrification of transportation may have the potential to deal...

  11. Large-scale prediction of drug-target relationships

    DEFF Research Database (Denmark)

    Kuhn, Michael; Campillos, Mónica; González, Paula;

    2008-01-01

    also provides a more global view on drug-target relations. Here we review recent attempts to apply large-scale computational analyses to predict novel interactions of drugs and targets from molecular and cellular features. In this context, we quantify the family-dependent probability of two proteins to...

  12. Temporal Variation of Large Scale Flows in the Solar Interior

    Indian Academy of Sciences (India)

    Sarbani Basu; H. M. Antia

    2000-09-01

    We attempt to detect short-term temporal variations in the rotation rate and other large scale velocity fields in the outer part of the solar convection zone using the ring diagram technique applied to Michelson Doppler Imager (MDI) data. The measured velocity field shows variations by about 10 m/s on the scale of few days.

  13. Water Implications of Large-Scale Land Acquisitions in Ghana

    Directory of Open Access Journals (Sweden)

    Timothy Olalekan Williams

    2012-06-01

    The paper offers recommendations which can help the government to achieve its stated objective of developing a "policy framework and guidelines for large-scale land acquisitions by both local and foreign investors for biofuels that will protect the interests of investors and the welfare of Ghanaian farmers and landowners".

  14. Reconstruction of hadronic cascades in large-scale neutrino telescopes

    International Nuclear Information System (INIS)

    A strategy that allows for the reconstruction of the direction and energy of hadronic cascades is presented, as well as preliminary results from corresponding simulation studies of the ANTARES twelve-string detector. The analysis techniques are of a very generic nature and can thus be easily applied to large-scale neutrino telescopes, such as KM3NeT.

  15. Special issue on decentralized control of large scale complex systems

    Czech Academy of Sciences Publication Activity Database

    Bakule, Lubomír

    2009-01-01

    Roč. 45, č. 1 (2009), s. 1-2. ISSN 0023-5954 R&D Projects: GA MŠk(CZ) LA 282 Institutional research plan: CEZ:AV0Z10750506 Keywords : decentralized control * large scale complex systems Subject RIV: BC - Control Systems Theory Impact factor: 0.445, year: 2009

  16. Resilience of Florida Keys coral communities following large scale disturbances

    Science.gov (United States)

    The decline of coral reefs in the Caribbean over the last 40 years has been attributed to multiple chronic stressors and episodic large-scale disturbances. This study assessed the resilience of coral communities in two different regions of the Florida Keys reef system between 199...

  17. Over-driven control for large-scale MR dampers

    International Nuclear Information System (INIS)

    As semi-active electro-mechanical control devices increase in scale for use in real-world civil engineering applications, their dynamics become increasingly complicated. Control designs that are able to take these characteristics into account will be more effective in achieving good performance. Large-scale magnetorheological (MR) dampers exhibit a significant time lag in their force–response to voltage inputs, reducing the efficacy of typical controllers designed for smaller scale devices where the lag is negligible. A new control algorithm is presented for large-scale MR devices that uses over-driving and back-driving of the commands to overcome the challenges associated with the dynamics of these large-scale MR dampers. An illustrative numerical example is considered to demonstrate the controller performance. Via simulations of the structure using several seismic ground motions, the merits of the proposed control strategy to achieve reductions in various response parameters are examined and compared against several accepted control algorithms. Experimental evidence is provided to validate the improved capabilities of the proposed controller in achieving the desired control force levels. Through real-time hybrid simulation (RTHS), the proposed controllers are also examined and experimentally evaluated in terms of their efficacy and robust performance. The results demonstrate that the proposed control strategy has superior performance over typical control algorithms when paired with a large-scale MR damper, and is robust for structural control applications. (paper)
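
    The over-driving idea can be sketched in a few lines (a simplified illustration under stated assumptions, not the paper's control law): the damper's voltage response is modeled as a first-order lag, and the controller saturates the command at the supply limits whenever the effective voltage must move, which closes the gap faster than commanding the target directly.

      V_MAX, TAU, DT = 10.0, 0.06, 0.001    # supply limit [V], lag [s], time step [s]

      def overdriven_command(v_target, v_eff, band=0.05):
          """Drive hard toward v_target; ease off inside a small band."""
          err = v_target - v_eff
          if err > band * V_MAX:
              return V_MAX                  # over-drive upward
          if err < -band * V_MAX:
              return 0.0                    # back-drive toward zero
          return v_target                   # close enough: hold the target

      v_eff, history = 0.0, []
      for k in range(400):
          v_target = 6.0 if k < 200 else 2.0       # step changes in demand
          v_cmd = overdriven_command(v_target, v_eff)
          v_eff += DT / TAU * (v_cmd - v_eff)      # first-order voltage lag
          history.append(v_eff)

      print(f"effective voltage after first step: {history[199]:.2f} V")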

  18. High-Throughput, Large-Scale SNP Genotyping: Bioinformatics Considerations

    OpenAIRE

    Margetic, Nino

    2004-01-01

    In order to provide a high-throughput, large-scale genotyping facility at the national level we have developed a set of inter-dependent information systems. A combination of commercial, publicly-available and in-house developed tools links a series of data repositories based both on flat files and relational databases providing an almost complete semi-automated pipeline.

  19. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Roč. 44, č. 3 (2015), s. 247-253. ISSN 1392-124X Institutional support: RVO:67985556 Keywords : Combinatorial linear matrix inequalities * large-scale system * decentralized control Subject RIV: BC - Control Systems Theory Impact factor: 0.623, year: 2014

  20. The integrated Sachs-Wolfe Effect -- Large Scale Structure Correlation

    CERN Document Server

    Cooray, A R

    2002-01-01

    We discuss the correlation between late-time integrated Sachs-Wolfe (ISW) effect in the cosmic microwave background (CMB) temperature anisotropies and the large scale structure of the local universe. This correlation has been proposed and studied in the literature as a probe of the dark energy and its physical properties. We consider a variety of large scale structure tracers suitable for a detection of the ISW effect via a cross-correlation. In addition to luminous sources, we suggest the use of tracers such as dark matter halos or galaxy clusters. A suitable catalog of mass selected halos for this purpose can be constructed with upcoming wide-field lensing and Sunyaev-Zel'dovich (SZ) effect surveys. With multifrequency data, the presence of the ISW-large scale structure correlation can also be investigated through a cross-correlation of the frequency cleaned SZ and CMB maps. While convergence maps constructed from lensing surveys of the large scale structure via galaxy ellipticities are less correlated with...

  1. A large-scale industrial CT's data transfer system

    International Nuclear Information System (INIS)

    The large-scale industrial CT generates a large amount of data when it works. To guarantee the reliability of the real-time transfer of those data, the author designs a transfer scheme based on WLAN technology and solves the bottleneck caused by the data-rate limitation by using multi-thread technology. (author)
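
    The multi-threaded transfer can be sketched as a chunked producer-consumer pipeline (an illustrative reconstruction; names and parameters are invented, and send_chunk() is a stub standing in for the real WLAN socket transfer):

      import queue
      import threading

      def send_chunk(chunk: bytes) -> None:
          pass    # stand-in for the actual WLAN socket send

      def worker(chunks):
          while True:
              chunk = chunks.get()
              if chunk is None:          # poison pill: no more data
                  break
              send_chunk(chunk)

      CHUNK, N_THREADS = 64 * 1024, 4
      data = bytes(8 * 1024 * 1024)      # 8 MB of (dummy) CT slice data
      q = queue.Queue(maxsize=32)
      threads = [threading.Thread(target=worker, args=(q,)) for _ in range(N_THREADS)]
      for t in threads:
          t.start()
      for i in range(0, len(data), CHUNK):
          q.put(data[i:i + CHUNK])       # several senders drain the queue in parallel
      for _ in threads:
          q.put(None)
      for t in threads:
          t.join()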

  2. Efficient On-Demand Operations in Large-Scale Infrastructures

    Science.gov (United States)

    Ko, Steven Y.

    2009-01-01

    In large-scale distributed infrastructures such as clouds, Grids, peer-to-peer systems, and wide-area testbeds, users and administrators typically desire to perform "on-demand operations" that deal with the most up-to-date state of the infrastructure. However, the scale and dynamism present in the operating environment make it challenging to…

  3. Participatory Design and the Challenges of Large-Scale Systems

    DEFF Research Database (Denmark)

    Simonsen, Jesper; Hertzum, Morten

    2008-01-01

    With its 10th biannual anniversary conference, Participatory Design (PD) is leaving its teens and must now be considered ready to join the adult world. In this article we encourage the PD community to think big: PD should engage in large-scale information-systems development and opt for a PD...

  4. Automated and Accurate Detection of Soma Location and Surface Morphology in Large-Scale 3D Neuron Images

    OpenAIRE

    Cheng Yan; Anan Li; Bin Zhang,; Wenxiang Ding; Qingming Luo; Hui Gong

    2013-01-01

    Automated and accurate localization and morphometry of somas in 3D neuron images is essential for quantitative studies of neural networks in the brain. However, previous methods are limited in obtaining the location and surface morphology of somas with variable size and uneven staining in large-scale 3D neuron images. In this work, we proposed a method for automated soma locating in large-scale 3D neuron images that contain relatively sparse soma distributions. This method involves three step...

  5. The IR-resummed Effective Field Theory of Large Scale Structures

    CERN Document Server

    Senatore, Leonardo

    2014-01-01

    We present a new method to resum the effect of large scale motions in the Effective Field Theory of Large Scale Structures. Because the linear power spectrum in $\Lambda$CDM is not scale free, the effects of the large scale flows are enhanced. Although previous EFT calculations of the equal-time density power spectrum at one and two loops showed a remarkable agreement with numerical results, they also showed a 2% residual which appeared related to the BAO oscillations. We show that this was indeed the case, explain the physical origin and show how a Lagrangian-based calculation removes these differences. We propose a simple method to upgrade existing Eulerian calculations to effectively make them Lagrangian and compare the new results with existing fits to numerical simulations. Our new two-loop results agree with numerical results up to $k\sim 0.6 h/$Mpc to within 1% with no oscillatory residuals. We also compute power spectra involving momentum, which is significantly more affected by the large scale flows. W...

  6. Bayesian large-scale structure inference: initial conditions and the cosmic web

    OpenAIRE

    Leclercq, Florent; Wandelt, Benjamin

    2014-01-01

    We describe an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the large-scale structure of the inhomogeneous Universe. Our algorithm explores the joint posterior distribution of the many millions of parameters involved via efficient Hamiltonian Markov Chain Monte Carlo sampling. We describe its application to the Sloan Digital Sky Survey data release 7 and an additional non-linear filtering step. We illustrate the use of our ...

  7. The CAD/CAM Solution and Realization for Machining Large Scale Steel-concrete Objects

    Institute of Scientific and Technical Information of China (English)

    2002-01-01

    First the architecture of the integrated CAD/CAM system developed for machining large scale steel-concrete girders using I-DEAS and other development tools is introduced; then the technologies involved in the development of the system are presented, which include the method for determining the position and orientation of the manufacturing features, the method for geometrical modeling of the girders, the techniques for generating the toolpaths for the 5-axis machining centers, the generative process planning...

  8. The multilevel fast multipole algorithm (MLFMA) for solving large-scale computational electromagnetics problems

    CERN Document Server

    Ergul, Ozgur

    2014-01-01

    The Multilevel Fast Multipole Algorithm (MLFMA) for Solving Large-Scale Computational Electromagnetic Problems provides a detailed and instructional overview of implementing MLFMA. The book: presents a comprehensive treatment of the MLFMA algorithm, including basic linear algebra concepts, recent developments on parallel computation, and a number of application examples; covers solutions of electromagnetic problems involving dielectric objects and perfectly-conducting objects; discusses applications including scattering from airborne targets, scattering from red...

  9. Large-scale quantification of CVD graphene surface coverage

    Science.gov (United States)

    Ambrosi, Adriano; Bonanni, Alessandra; Sofer, Zdeněk; Pumera, Martin

    2013-02-01

    The extraordinary properties demonstrated for graphene and graphene-related materials can be fully exploited when a large-scale fabrication procedure is made available. Chemical vapor deposition (CVD) of graphene on Cu and Ni substrates is one of the most promising procedures to synthesize large-area and good quality graphene films. Parallel to the fabrication process, a large-scale quality monitoring technique is equally crucial. We demonstrate here a rapid and simple methodology that is able to probe the effectiveness of the growth process over a large substrate area for both Ni and Cu substrates. This method is based on inherent electrochemical signals generated by the underlying metal catalysts when fractures or discontinuities of the graphene film are present. The method can be applied immediately after the CVD growth process without the need for any graphene transfer step and represents a powerful quality monitoring technique for the assessment of large-scale fabrication of graphene by the CVD process.

  10. Critical Analysis of Middleware Architectures for Large Scale Distributed Systems

    CERN Document Server

    Pop, Florin; Costan, Alexandru; Andreica, Mugurel Ionut; Tirsa, Eliana-Dina; Stratan, Corina; Cristea, Valentin

    2009-01-01

    Distributed computing is increasingly being viewed as the next phase of Large Scale Distributed Systems (LSDSs). However, the vision of large scale resource sharing is not yet a reality in many areas - Grid computing is an evolving area of computing, where standards and technology are still being developed to enable this new paradigm. Hence, in this paper we analyze the current development of middleware tools for LSDS, from multiple perspectives: architecture, applications and market research. For each perspective we are interested in relevant technologies used in undergoing projects, existing products or services and useful design issues. In the end, based on this approach, we draw some conclusions regarding the future research directions in this area.

  11. Series Design of Large-Scale NC Machine Tool

    Institute of Scientific and Technical Information of China (English)

    TANG Zhi

    2007-01-01

    Product system design is a mature concept in western developed countries, where it has been applied in the defense industry since the last century. However, up until now, functional combination is still the main method for product system design in China, so in terms of product generation and product interaction we are in a weak position relative to the requirements of global markets. Today, the idea of serial product design has attracted much attention in the design field, and the definition of product generation as well as its parameters has already become the standard in serial product designs. Although the design of a large-scale NC machine tool is complicated, it can be further optimized by placing the concept of platform establishment firmly into serial product design. The essence of a serial product design is demonstrated by the design process of a large-scale NC machine tool.

  12. The complexity nature of large-scale software systems

    Institute of Scientific and Technical Information of China (English)

    Yan Dong; Qi Guo-Ning; Gu Xin-Jian

    2006-01-01

    In software engineering, class diagrams are often used to describe a system's class structure in the Unified Modelling Language (UML). A class diagram, as a graph, is a collection of static declarative model elements, such as classes, interfaces, and the relationships of their connections with each other. In this paper, class graphs are examined within several Java software systems provided by Sun and IBM, and some new features are found. For a large-scale Java software system, its in-degree distribution tends to an exponential distribution, while its out-degree and degree distributions reveal power-law behaviour. A directed preferential-random model is then established to describe the corresponding degree distribution features and the evolution of large-scale Java software systems.
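
    The degree measurement itself is easy to reproduce on any dependency graph. The toy generator below mixes preferential and random attachment for the edge sources, in the spirit of (but not identical to) the paper's directed preferential-random model, and exhibits the heavy-tailed out-degree the authors report:

      import collections
      import random

      random.seed(1)
      N = 2000                           # number of classes
      edges = []                         # (source, target) dependency edges
      for _ in range(4 * N):
          if random.random() < 0.6 and edges:
              src = random.choice(edges)[0]    # preferential: classes with many
                                               # dependencies acquire more
          else:
              src = random.randrange(N)        # random attachment
          dst = random.randrange(N)            # target chosen uniformly
          if src != dst:
              edges.append((src, dst))

      out_deg = collections.Counter(s for s, _ in edges)
      hist = collections.Counter(out_deg.values())
      for k in sorted(hist)[:8]:
          print(f"out-degree {k}: {hist[k]} classes")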

  13. Electron drift in a large scale solid xenon

    CERN Document Server

    Yoo, J

    2015-01-01

    A study of charge drift in a large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. This demonstrates an electron drift speed in large scale solid xenon that is a factor of two faster than in the liquid phase.
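
    As a quick consistency check on the quoted numbers, the implied transit times across the 8.0 cm uniform-field region follow directly from speed = distance / time:

      drift_cm = 8.0                     # drift distance [cm]
      v_liquid = 0.193                   # drift speed in liquid xenon [cm/us]
      v_solid = 0.397                    # drift speed in solid xenon  [cm/us]

      t_liquid = drift_cm / v_liquid     # ~41.5 us
      t_solid = drift_cm / v_solid       # ~20.2 us
      print(f"liquid: {t_liquid:.1f} us, solid: {t_solid:.1f} us, "
            f"ratio: {v_solid / v_liquid:.2f}x")   # ratio ~2.06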

  14. Active power reserves evaluation in large scale PVPPs

    DEFF Research Database (Denmark)

    Crăciun, Bogdan-Ionut; Kerekes, Tamas; Sera, Dezso;

    2013-01-01

    The present trend of investing in renewable electricity generation to the detriment of conventional fossil fuel-based plants will lead to a certain point where these plants have to provide ancillary services and contribute to overall grid stability. Photovoltaic (PV) power has the fastest...... growth among all renewable energies and has managed to reach high penetration levels, creating instabilities which at the moment are corrected by conventional generation. This paradigm will change in future scenarios where most of the power is supplied by large scale renewable plants and parts...... of the ancillary services have to be shared by the renewable plants. The main focus of the proposed paper is to technically and economically analyze the possibility of having active power reserves in large scale PV power plants (PVPPs) without any auxiliary storage equipment. The provided reserves should...

  15. Model for large scale circulation of nuclides in nature, 1

    Energy Technology Data Exchange (ETDEWEB)

    Ohnishi, Teruaki

    1988-12-01

    A model for large scale circulation of nuclides was developed, and a computer code named COCAIN was written to simulate this circulation system dynamically. The natural environment considered in the present paper consists of 2 atmospheres, 8 geospheres and 2 lithospheres. The biosphere is composed of 4 types of edible plants, 5 cattle and their products, 4 water biota and 16 human organs. The biosphere is assumed to receive nuclides from the natural environment mentioned above. With the use of COCAIN, two numerical case studies were carried out: one on nuclear pollution in nature by the radioactive nuclides originating from past nuclear bomb tests, and the other on the response of environment and biota to a pulse injection of nuclides into one compartment. The former case study verified that this model can explain the observations well and properly simulate the large scale circulation of nuclides in nature.

  16. Imprint of thawing scalar fields on large scale galaxy overdensity

    CERN Document Server

    Dinda, Bikash R

    2016-01-01

    We calculate the observed galaxy power spectrum for the thawing class of scalar field models taking into account various general relativistic corrections that occur on very large scales. As we need to consider the fluctuations in the scalar field on these large scales, the general relativistic corrections in thawing scalar field models are distinctly different from $\Lambda$CDM and the difference can be up to $15-20\%$ at some scales. Also there is an interpolation between suppression and enhancement of power in scalar field models compared to the $\Lambda$CDM model on smaller scales, and this happens in a specific redshift range that is quite robust to the form of the scalar field potentials or the choice of different cosmological parameters. This can be useful to distinguish scalar field models from $\Lambda$CDM with future optical/radio surveys.

  17. Constraining cosmological ultra-large scale structure using numerical relativity

    CERN Document Server

    Braden, Jonathan; Peiris, Hiranya V; Aguirre, Anthony

    2016-01-01

    Cosmic inflation, a period of accelerated expansion in the early universe, can give rise to large amplitude ultra-large scale inhomogeneities on distance scales comparable to or larger than the observable universe. The cosmic microwave background (CMB) anisotropy on the largest angular scales is sensitive to such inhomogeneities and can be used to constrain the presence of ultra-large scale structure (ULSS). We numerically evolve nonlinear inhomogeneities present at the beginning of inflation in full General Relativity to assess the CMB quadrupole constraint on the amplitude of the initial fluctuations and the size of the observable universe relative to a length scale characterizing the ULSS. To obtain a statistically significant number of simulations, we adopt a toy model in which inhomogeneities are injected along a preferred direction. We compute the likelihood function for the CMB quadrupole including both ULSS and the standard quantum fluctuations produced during inflation. We compute the posterior given...

  18. Bayesian large-scale structure inference and cosmic web analysis

    CERN Document Server

    Leclercq, Florent

    2015-01-01

    Surveys of the cosmic large-scale structure carry opportunities for building and testing cosmological theories about the origin and evolution of the Universe. This endeavor requires appropriate data assimilation tools, for establishing the contact between survey catalogs and models of structure formation. In this thesis, we present an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the cosmic web: the BORG algorithm infers the primordial density fluctuations and produces physical reconstructions of the dark matter distribution that underlies observed galaxies, by assimilating the survey data into a cosmological structure formation model. The method, based on Bayesian probability theory, provides accurate means of uncertainty quantification. We demonstrate the application of BORG to the Sloan Digital Sky Survey data and describe the primordial and late-time large-scale structure in the observed volume. We show how the approach has led to the fi...

  19. Ultra-large scale cosmology with next-generation experiments

    CERN Document Server

    Alonso, David; Ferreira, Pedro G; Maartens, Roy; Santos, Mario G

    2015-01-01

    Future surveys of large-scale structure will be able to measure perturbations on the scale of the cosmological horizon, and so could potentially probe a number of novel relativistic effects that are negligibly small on sub-horizon scales. These effects leave distinctive signatures in the power spectra of clustering observables and, if measurable, would open a new window on relativistic cosmology. We quantify the size and detectability of the effects for a range of future large-scale structure surveys: spectroscopic and photometric galaxy redshift surveys, intensity mapping surveys of neutral hydrogen, and continuum surveys of radio galaxies. Our forecasts show that next-generation experiments, reaching out to redshifts z ~ 4, will not be able to detect previously-undetected general-relativistic effects from the single-tracer power spectra alone, although they may be able to measure the lensing magnification in the auto-correlation. We also perform a rigorous joint forecast for the detection of primordial non-...

  20. Magnetic Helicity and Large Scale Magnetic Fields: A Primer

    CERN Document Server

    Blackman, Eric G

    2014-01-01

    Magnetic fields of laboratory, planetary, stellar, and galactic plasmas commonly exhibit significant order on large temporal or spatial scales compared to the otherwise random motions within the hosting system. Such ordered fields can be measured in the case of planets, stars, and galaxies, or inferred indirectly by the action of their dynamical influence, such as jets. Whether large scale fields are amplified in situ or a remnant from previous stages of an object's history is often debated for objects without a definitive magnetic activity cycle. Magnetic helicity, a measure of twist and linkage of magnetic field lines, is a unifying tool for understanding large scale field evolution for both mechanisms of origin. Its importance stems from its two basic properties: (1) magnetic helicity is typically better conserved than magnetic energy; and (2) the magnetic energy associated with a fixed amount of magnetic helicity is minimized when the system relaxes this helical structure to the largest scale available. H...

  1. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
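
    The low-rank approximation underlying the prototype idea can be illustrated with a Nystrom-style construction (one standard way to realize it; the paper's exact prototype-selection criteria may differ, and the random selection below stands in for k-means centers):

      import numpy as np

      def rbf(A, B, gamma=0.5):
          """RBF kernel matrix between row sets A and B."""
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma * d2)

      rng = np.random.default_rng(0)
      X = rng.normal(size=(1000, 5))     # n = 1000 samples, 5 features
      m = 50                             # number of prototype vectors
      prototypes = X[rng.choice(len(X), m, replace=False)]

      K_nm = rbf(X, prototypes)                             # n x m
      K_mm = rbf(prototypes, prototypes)                    # m x m
      K_approx = K_nm @ np.linalg.pinv(K_mm) @ K_nm.T       # rank-m surrogate

      K = rbf(X, X)                                         # exact n x n kernel
      err = np.linalg.norm(K - K_approx) / np.linalg.norm(K)
      print(f"relative Frobenius error with {m} prototypes: {err:.3f}")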

  2. Individual skill differences and large-scale environmental learning.

    Science.gov (United States)

    Fields, Alexa W; Shelton, Amy L

    2006-05-01

    Spatial skills are known to vary widely among normal individuals. This project was designed to address whether these individual differences are differentially related to large-scale environmental learning from route (ground-level) and survey (aerial) perspectives. Participants learned two virtual environments (route and survey) with limited exposure and were tested on judgments about relative locations of objects. They also performed a series of spatial and nonspatial component skill tests. With limited learning, performance after route encoding was worse than performance after survey encoding. Furthermore, performance after route and survey encoding appeared to be preferentially linked to perspective and object-based transformations, respectively. Together, the results provide clues to how different skills might be engaged by different individuals for the same goal of learning a large-scale environment. PMID:16719662

  3. Large-scale flow generation by inhomogeneous helicity

    CERN Document Server

    Yokoi, Nobumitsu

    2015-01-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with non-uniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of the angular velocity. Such a large-scale flow is not induced in the case of hom...

  4. Large-scale innovation and change in UK higher education

    Directory of Open Access Journals (Sweden)

    Stephen Brown

    2013-09-01

    This paper reflects on challenges universities face as they respond to change. It reviews current theories and models of change management, discusses why universities are particularly difficult environments in which to achieve large scale, lasting change and reports on a recent attempt by the UK JISC to enable a range of UK universities to employ technology to deliver such changes. Key lessons that emerged from these experiences are reviewed covering themes of pervasiveness, unofficial systems, project creep, opposition, pressure to deliver, personnel changes and technology issues. The paper argues that collaborative approaches to project management offer greater prospects of effective large-scale change in universities than either management-driven top-down or more champion-led bottom-up methods. It also argues that while some diminution of control over project outcomes is inherent in this approach, this is outweighed by potential benefits of lasting and widespread adoption of agreed changes.

  5. Volume measurement study for large scale input accountancy tank

    International Nuclear Information System (INIS)

    The Large Scale Tank Calibration (LASTAC) facility, including an experimental tank which has the same volume and structure as the input accountancy tank of the Rokkasho Reprocessing Plant (RRP), was constructed at the Nuclear Material Control Center of Japan. Demonstration experiments have been carried out to evaluate the precision of solution volume measurement and to establish a procedure for highly accurate pressure measurement for a large scale tank with a dip-tube bubbler probe system to be applied to the input accountancy tank of RRP. Solution volume in a tank is determined by substituting the solution level into the calibration function obtained in advance, which expresses the relation between the solution level and its volume in the tank. Therefore, precise solution volume measurement needs a precise calibration function that is determined carefully. The LASTAC calibration experiments using pure water showed good reproducibility. (J.P.N.)
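
    The volume determination step reduces to evaluating the calibration function at the measured level. A minimal sketch (with invented numbers, and a bubbler-pressure conversion h = dP / (rho * g) for the level) could look like:

      import numpy as np

      # calibration table: liquid level [mm] vs. tank volume [L] (illustrative)
      cal_level = np.array([0.0, 250.0, 500.0, 750.0, 1000.0, 1250.0])
      cal_volume = np.array([0.0, 480.0, 975.0, 1478.0, 1990.0, 2510.0])

      def volume_from_level(level_mm):
          """Interpolate the calibration function at the measured level."""
          return float(np.interp(level_mm, cal_level, cal_volume))

      # level inferred from the dip-tube bubbler differential pressure
      dP, rho, g = 8.35e3, 1.05e3, 9.81        # Pa, kg/m^3, m/s^2
      level = dP / (rho * g) * 1000.0          # m -> mm
      print(f"level {level:.1f} mm -> volume {volume_from_level(level):.0f} L")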

  6. Optimal algorithms for scheduling large scale application on heterogeneous systems

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    This paper studies optimal algorithms for scheduling large-scale applications on heterogeneous systems using Divisible Load Theory. A more realistic and general model is introduced, in which both processors and communication links may have different speeds and arbitrary start-up costs, and communication is in non-blocking mode. Under such an environment, the following results are obtained: (1) a mathematical model and closed-form expressions for both the processing time and the fraction of load for each processor are derived; (2) the influence of start-up costs on the optimal processing time is analyzed; (3) for a given heterogeneous system and a large-scale computing problem, optimal algorithms are proposed.
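
    For the computation-only special case, the closed form is short enough to sketch (a simplification of the paper's model, which also accounts for link speeds and non-blocking communication): requiring every processor to finish simultaneously gives alpha_i * w_i + c_i = T with sum(alpha_i) = 1, where w_i is processor i's time to handle the full load and c_i its start-up cost.

      def dlt_fractions(w, c):
          """w[i]: time to process the full load alone; c[i]: start-up cost."""
          inv = sum(1.0 / wi for wi in w)
          T = (1.0 + sum(ci / wi for wi, ci in zip(w, c))) / inv
          alpha = [(T - ci) / wi for wi, ci in zip(w, c)]
          assert all(a >= 0 for a in alpha), "too-slow starters should be dropped"
          return alpha, T

      w = [10.0, 20.0, 40.0]     # heterogeneous processing times
      c = [0.5, 1.0, 0.25]       # arbitrary start-up costs
      alpha, T = dlt_fractions(w, c)
      print([round(a, 3) for a in alpha], round(T, 3))
      # -> [0.582, 0.266, 0.152] 6.321  (fractions sum to 1)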

  8. LARGE-SCALE MOTIONS IN THE PERSEUS GALAXY CLUSTER

    International Nuclear Information System (INIS)

    By combining large-scale mosaics of ROSAT PSPC, XMM-Newton, and Suzaku X-ray observations, we present evidence for large-scale motions in the intracluster medium of the nearby, X-ray bright Perseus Cluster. These motions are suggested by several alternating and interleaved X-ray bright, low-temperature, low-entropy arcs located along the east-west axis, at radii ranging from ∼10 kpc to over a Mpc. Thermodynamic features qualitatively similar to these have previously been observed in the centers of cool-core clusters, and were successfully modeled as a consequence of the gas sloshing/swirling motions induced by minor mergers. Our observations indicate that such sloshing/swirling can extend out to larger radii than previously thought, on scales approaching the virial radius.

  9. Large-scale Alfvén vortices

    Energy Technology Data Exchange (ETDEWEB)

    Onishchenko, O. G., E-mail: onish@ifz.ru [Institute of Physics of the Earth, 10 B. Gruzinskaya, 123242 Moscow, Russian Federation and Space Research Institute, 84/32 Profsouznaya str., 117997 Moscow (Russian Federation); Pokhotelov, O. A., E-mail: pokh@ifz.ru [Institute of Physics of the Earth, 10 B. Gruzinskaya, 123242 Moscow (Russian Federation); Horton, W., E-mail: wendell.horton@gmail.com [Institute for Fusion Studies and Applied Research Laboratory, University of Texas at Austin, Austin, Texas 78713 (United States); Scullion, E., E-mail: scullie@tcd.ie [School of Physics, Trinity College Dublin, Dublin 2 (Ireland); Fedun, V., E-mail: v.fedun@sheffield.ac.uk [Department of Automatic Control and Systems Engineering, University of Sheffield, Sheffield S13JD (United Kingdom)

    2015-12-15

    A new type of large-scale vortex structure of dispersionless Alfvén waves in collisionless plasma is investigated. It is shown that Alfvén waves can propagate in the form of Alfvén vortices of finite characteristic radius, characterised by magnetic flux ropes carrying orbital angular momentum. The structures of the toroidal and radial velocity, the fluid and magnetic field vorticity, and the longitudinal electric current in the plane orthogonal to the external magnetic field are discussed.

  10. The Large-Scale Sugarcane Stripper with Automatic Feeding

    OpenAIRE

    Jiaxiang Lin; Wenjie Yan; Jiaping Lin

    2012-01-01

    This study introduces the large-scale sugarcane stripper with automatic feeding, which includes the automatic feeding module, leaf-cleaning module, collecting module and control module. The machine is an important part of the segmental-type sugarcane harvester, used to address the most labor-intensive task, cleaning the leaves. Collecting sugarcane in hilly areas and cleaning its leaves can greatly improve labor productivity and change the current mode of sugarcane harvesting.

  11. Split Architecture for Large Scale Wide Area Networks

    OpenAIRE

    John, Wolfgang; Devlic, Alisa; Ding, Zhemin; Jocha, David; Kern, Andras; Kind, Mario; Köpsel, Andreas; Nordell, Viktor; Sharma, Sachin; Sköldström, Pontus; Staessens, Dimitri; Takacs, Attila; Topp, Steffen; Westphal, F. -Joachim; Woesner, Hagen

    2014-01-01

    This report defines a carrier-grade split architecture based on requirements identified during the SPARC project. It presents the SplitArchitecture proposal, the SPARC concept for Software Defined Networking (SDN) introduced for large-scale wide area networks such as access/aggregation networks, and evaluates technical issues against architectural trade-offs. First we present the control and management architecture of the proposed SplitArchitecture. Here, we discuss a recursive control archit...

  12. Rotation invariant fast features for large-scale recognition

    Science.gov (United States)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF [1] while producing large-scale retrieval results that are comparable to SIFT [2]. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  13. A Large-Scale Study of Online Shopping Behavior

    OpenAIRE

    Nalchigar, Soroosh; Weber, Ingmar

    2012-01-01

    The continuous growth of electronic commerce has stimulated great interest in studying online consumer behavior. Given the significant growth in online shopping, a better understanding of customers allows better marketing strategies to be designed. While studies of online shopping attitudes are widespread in the literature, studies of differences in browsing habits in relation to online shopping are scarce. This research performs a large scale study of the relationship between Internet browsing hab...

  14. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential safety of the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.

  15. Subspace identification of large-scale interconnected systems

    OpenAIRE

    Haber, Aleksandar; Verhaegen, Michel

    2013-01-01

    We propose a decentralized subspace algorithm for identification of large-scale, interconnected systems that are described by sparse (multi) banded state-space matrices. First, we prove that the state of a local subsystem can be approximated by a linear combination of inputs and outputs of the local subsystems that are in its neighborhood. Furthermore, we prove that for interconnected systems with well-conditioned, finite-time observability Gramians (or observability matrices), the size of th...

  16. Experimental simulation of microinteractions in large scale explosions

    Energy Technology Data Exchange (ETDEWEB)

    Chen, X.; Luo, R.; Yuen, W.W.; Theofanous, T.G. [California Univ., Santa Barbara, CA (United States). Center for Risk Studies and Safety

    1998-01-01

    This paper presents data and analysis from recent experiments conducted in the SIGMA-2000 facility to simulate microinteractions in large scale explosions. Specifically, the fragmentation behavior of a high-temperature molten steel drop under high-pressure (beyond critical) conditions is investigated. The current data demonstrate, for the first time, the effect of high pressure in suppressing the thermal effect of fragmentation under supercritical conditions. The results support the microinteractions idea, and the ESPROSE.m prediction of the fragmentation rate. (author)

  17. Self-calibration of Large Scale Camera Networks

    OpenAIRE

    Goorts, Patrik; MAESEN, Steven; Liu, Yunjun; Dumont, Maarten; Bekaert, Philippe; Lafruit, Gauthier

    2014-01-01

    In this paper, we present a method to calibrate large scale camera networks for multi-camera computer vision applications in sport scenes. The calibration process determines precise camera parameters, both within each camera (focal length, principal point, etc.) and between the cameras (their relative position and orientation). To this end, we first extract candidate image correspondences over adjacent cameras, without using any calibration object, relying solely on existing feature matching...

  18. Large scale cross-drive correlation of digital media

    OpenAIRE

    Bruaene, Joseph Van

    2016-01-01

    Approved for public release; distribution is unlimited. Traditional digital forensic practices have focused on individual hard disk analysis. As the digital universe continues to grow, and cyber crimes become more prevalent, the ability to make large-scale cross-drive correlations among a large corpus of digital media becomes increasingly important. We propose a methodology that builds on bulk-analysis techniques to avoid operating-system- and file-system-specific parsing. In addition, we a...

  19. GroFi: Large-scale fiber placement research facility

    OpenAIRE

    Krombholz, Christian; Kruse, Felix; Wiedemann, Martin

    2016-01-01

    GroFi is a large research facility operated by the German Aerospace Center’s Center for Lightweight-Production-Technology in Stade. A combination of different layup technologies, namely (dry) fiber placement and tape laying, allows the development and validation of new production technologies and processes for large-scale composite components. Due to the use of coordinated and simultaneously working layup units, a high flexibility of the research platform is achieved. This allows the investiga...

  20. Network of Experts for Large-Scale Image Categorization

    OpenAIRE

    Ahmed, Karim; Baig, Mohammad Haris; Torresani, Lorenzo

    2016-01-01

    We present a tree-structured network architecture for large-scale image classification. The trunk of the network contains convolutional layers optimized over all classes. At a given depth, the trunk splits into separate branches, each dedicated to discriminate a different subset of classes. Each branch acts as an expert classifying a set of categories that are difficult to tell apart, while the trunk provides common knowledge to all experts in the form of shared features. The training of our ...

  1. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT]; Marzouk, Youssef [MIT]

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  2. Stochastic Optimization for Large-scale Optimal Transport

    OpenAIRE

    Aude, Genevay; Cuturi, Marco; Peyré, Gabriel; Bach, Francis

    2016-01-01

    Optimal transport (OT) defines a powerful framework to compare probability distributions in a geometrically faithful way. However, the practical impact of OT is still limited because of its computational burden. We propose a new class of stochastic optimization algorithms to cope with large-scale problems routinely encountered in machine learning applications. These methods are able to manipulate arbitrary distributions (either discrete or continuous) by simply requiring to be able to draw sa...

  3. Turbulent large-scale structure effects on wake meandering

    Science.gov (United States)

    Muller, Y.-A.; Masson, C.; Aubrun, S.

    2015-06-01

    This work studies the effects of large-scale turbulent structures on wake meandering using Large Eddy Simulations (LES) over an actuator disk. Other potential sources of wake meandering, such as the instability mechanisms associated with tip vortices, are not treated in this study. A crucial element of efficient, pragmatic and successful simulations of large-scale turbulent structures in the Atmospheric Boundary Layer (ABL) is the generation of the stochastic turbulent atmospheric flow. This is an essential capability since one source of wake meandering is these large (larger than the turbine diameter) turbulent structures. The unsteady wind turbine wake in the ABL is simulated using a combination of LES and actuator disk approaches. In order to dedicate the large majority of the available computing power to the wake, the ABL ground region of the flow is not part of the computational domain. Instead, mixed Dirichlet/Neumann boundary conditions are applied at all the computational surfaces except at the outlet. Prescribed values for the Dirichlet contribution of these boundary conditions are provided by a stochastic turbulent wind generator. This allows the simulation of large-scale turbulent structures (larger than the computational domain), leading to an efficient simulation technique for wake meandering. Since the stochastic wind generator includes shear, turbulence production is included in the analysis without the necessity of resolving the flow near the ground. The classical Smagorinsky sub-grid model is used. The resulting numerical methodology has been implemented in OpenFOAM. Comparisons with experimental measurements in porous-disk wakes have been undertaken, and the agreement is good. While the temporal resolution of experimental measurements is high, the spatial resolution is often too low. LES numerical results provide a more complete spatial description of the flow. They tend to demonstrate that inflow low-frequency content - or large-scale turbulent structures - is

  4. Large scale ocean models beyond the traditional approximation

    OpenAIRE

    Lucas, Carine; Mcwilliams, Jim; Rousseau, Antoine

    2016-01-01

    This work corresponds to classes given by A. Rousseau in February 2014 in Toulouse, in the framework of the CIMI labex. The objective is to describe and question the models that are traditionally used for large scale oceanography, whether in 2D or 3D. Starting from fundamental equations (mass and momentum conservation), it is explained how, thanks to approximations for which we provide justifications, one can build simpler models that allow a realistic numerical implem...

  5. Unsupervised Deep Hashing for Large-scale Visual Search

    OpenAIRE

    Xia, Zhaoqiang; Feng, Xiaoyi; Peng, Jinye; Hadid, Abdenour

    2016-01-01

    Learning based hashing plays a pivotal role in large-scale visual search. However, most existing hashing algorithms tend to learn shallow models that do not seek representative binary codes. In this paper, we propose a novel hashing approach based on unsupervised deep learning to hierarchically transform features into hash codes. Within the heterogeneous deep hashing framework, the autoencoder layers with specific constraints are considered to model the nonlinear mapping between features and ...

  6. Modeling of large-scale oxy-fuel combustion processes

    OpenAIRE

    Yin, Chungen

    2012-01-01

    Numerous studies have been conducted on implementing oxy-fuel combustion with flue gas recycle in conventional utility boilers as part of carbon capture and storage efforts. However, combustion under oxy-fuel conditions is significantly different from conventional air-fuel firing, and radiative heat transfer under oxy-fuel conditions is one of the fundamental issues. This paper demonstrates the nongray-gas effects in modeling of large-scale oxy-fuel combustion processes...

  7. Large Scale Artificial Neural Network Training Using Multi-GPUs

    OpenAIRE

    Wang, Linnan; Wei WU; Xiao, Jianxiong; Yi, Yang

    2015-01-01

    This paper describes a method for accelerating large scale Artificial Neural Network (ANN) training using multiple GPUs by reducing the forward and backward passes to matrix multiplication. We propose an out-of-core multi-GPU matrix multiplication and integrate the algorithm with the ANN training. The experiments demonstrate that our matrix multiplication algorithm achieves linear speedup on multiple inhomogeneous GPUs. The full paper of this project can be found at [1].
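    The reduction the abstract mentions is standard: a fully connected layer's forward and backward passes are three GEMMs, which is what gets tiled across devices. A minimal single-device sketch, with shapes and names invented for illustration:

        # Forward/backward pass of one dense layer expressed as matrix products.
        import numpy as np

        rng = np.random.default_rng(0)
        batch, n_in, n_out = 256, 1024, 512
        X = rng.standard_normal((batch, n_in))        # layer input
        W = rng.standard_normal((n_in, n_out)) * 0.01 # layer weights
        dY = rng.standard_normal((batch, n_out))      # gradient from the layer above

        Y = X @ W        # forward pass: one GEMM
        dW = X.T @ dY    # weight gradient: one GEMM
        dX = dY @ W.T    # input gradient: one GEMM

        # An out-of-core multi-GPU version would split these GEMMs into blocks,
        # stream the blocks to the devices, and accumulate the partial products.
        print(Y.shape, dW.shape, dX.shape)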

  8. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop type reactor. Through the design study, it is intended to construct a plant concept that can demonstrate its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has the prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection of candidate concepts at the end of Phase 2. (author)

  9. Large Scale Synthesis of Carbon Nanofibres on Sodium Chloride Support

    OpenAIRE

    Ravindra Rajarao; Badekai Ramachandra Bhat

    2012-01-01

    Large scale synthesis of carbon nanofibres (CNFs) on a sodium chloride support has been achieved. CNFs were synthesized using metal oxalates (Ni, Co and Fe) as catalyst precursors at 680 °C by the chemical vapour deposition method. Upon pyrolysis, these catalyst precursors yield catalyst nanoparticles directly. Sodium chloride was used as the catalyst support; it was chosen because of its non-toxic and water-soluble nature. Problems, such as the detrimental effect of CNFs, the detrimental ef...

  10. Petascale computations for Large-scale Atomic and Molecular collisions

    OpenAIRE

    McLaughlin, Brendan M.; Ballance, Connor P.

    2014-01-01

    Petaflop architectures are currently being utilized efficiently to perform large scale computations in Atomic, Molecular and Optical Collisions. We solve the Schroedinger or Dirac equation for the appropriate collision problem using the R-matrix or R-matrix with pseudo-states approach. We briefly outline the parallel methodology used and implemented for the current suite of Breit-Pauli and DARC codes. Various examples are shown of our theoretical results compared with those obtained from Sync...

  11. Fast transient stability simulation of large scale power systems

    OpenAIRE

    Kumar, Sreerama R; Ramanujam, R.; Khincha, HP; Jenkins, L

    1992-01-01

    This paper describes a computationally efficient algorithm for transient stability simulation of large-scale power system dynamics. The simultaneous implicit approach proposed by H.V. Dommel and N. Sato [1] has become the state-of-the-art technique for production-grade transient stability simulation programs. This paper proposes certain modifications to the Dommel-Sato method with which significant improvement in computational efficiency can be achieved. Preliminary investigations on a sta...
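    For readers unfamiliar with the simultaneous implicit approach, its essence is to discretize the machine differential equations with the trapezoidal rule and solve the resulting algebraic system at every step. A toy single-machine sketch, with parameter values that are illustrative rather than taken from the paper:

        # Trapezoidal-rule (implicit) integration of a single-machine swing equation.
        import numpy as np
        from scipy.optimize import fsolve

        M, D, Pm, Pmax = 0.1, 0.05, 0.8, 1.2    # inertia, damping, powers (p.u.)

        def f(y):                                # y = (rotor angle, speed deviation)
            delta, omega = y
            return np.array([omega, (Pm - Pmax * np.sin(delta) - D * omega) / M])

        h = 0.01
        y = np.array([np.arcsin(Pm / Pmax) + 0.5, 0.0])  # disturbed equilibrium
        for _ in range(500):                     # 5 s of simulated time
            step = lambda y_new: y_new - y - 0.5 * h * (f(y) + f(y_new))
            y = fsolve(step, y)                  # solve the implicit step equations
        print("delta, omega after 5 s:", y)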

  12. Exploring the technical challenges of large-scale lifelogging

    OpenAIRE

    Gurrin, Cathal; Smeaton, Alan F.; Qiu, Zhengwei; Doherty, Aiden R.

    2013-01-01

    Ambiently and automatically maintaining a lifelog is an activity that may help individuals track their lifestyle, learning, health and productivity. In this paper we motivate and discuss the technical challenges of developing real-world lifelogging solutions, based on seven years of experience. The gathering, organisation, retrieval and presentation challenges of large-scale lifelogging are discussed and we show how this can be achieved and the benefits that may accrue.

  13. Large-Scale Experiments in a Sandy Aquifer in Denmark

    DEFF Research Database (Denmark)

    Jensen, Karsten Høgh; Bitsch, Karen Bue; Bjerg, Poul Løgstrup

    1993-01-01

    A large-scale natural gradient dispersion experiment was carried out in a sandy aquifer in the western part of Denmark using tritium and chloride as tracers. For both plumes a marked spreading was observed in the longitudinal direction while the spreading in the transverse horizontal and transverse … closely. The following "best fit" dispersivity parameters were identified: longitudinal horizontal, 0.45 m; transverse horizontal, 0.001 m; and transverse vertical, 0.0005 m.

  14. HECTR analyses of large-scale premixed hydrogen combustion experiments

    International Nuclear Information System (INIS)

    The HECTR (Hydrogen Event: Containment Transient Response) computer code is a reactor accident analysis tool designed to calculate the transport and combustion of hydrogen and the transient response of the containment. As part of the assessment effort, HECTR has been used to analyze the Nevada Test Site (NTS) large-scale premixed hydrogen combustion experiments. The results of these analyses and a critical review of the combustion model in HECTR are presented in this paper.

  15. Large scale optimization algorithms : applications to solution of inverse problems

    OpenAIRE

    Repetti, Audrey

    2015-01-01

    An efficient approach for solving an inverse problem is to define the recovered signal/image as the minimizer of a penalized criterion, which is often split into a sum of simpler functions composed with linear operators. In situations of practical interest, these functions may be neither convex nor smooth. In addition, large scale optimization problems often have to be faced. This thesis is devoted to the design of new methods to solve such difficult minimization problems, while paying attenti...

  16. Large-Scale Post-Crisis Corporate Sector Restructuring

    OpenAIRE

    Mark R. Stone

    2000-01-01

    This paper summarizes the objectives, tasks, and modalities of large-scale, post-crisis corporate restructuring based on nine recent episodes with a view to organizing the policy choices and drawing some general conclusions. These episodes suggest that government-led restructuring efforts should integrate corporate and bank restructuring in a holistic and transparent strategy based on clearly defined objectives and including sunset provisions.

  17. Learning Compact Visual Attributes for Large-Scale Image Classification

    OpenAIRE

    Su, Yu; Jurie, Frédéric

    2012-01-01

    Attribute-based image classification has received a lot of attention recently, as an interesting tool to share knowledge across different categories or to produce compact signatures of images. However, when high classification performance is expected, state-of-the-art results are typically obtained by combining Fisher Vectors (FV) and Spatial Pyramid Matching (SPM), leading to image signatures with dimensionality up to 262,144 [1]. This is a hindrance to large-scale ...

  18. Large-scale acoustic and prosodic investigations of french

    OpenAIRE

    Nemoto, Rena

    2011-01-01

    This thesis focuses on acoustic and prosodic (fundamental frequency (F0), duration, intensity) analyses of French from large-scale audio corpora portraying different speaking styles: prepared and spontaneous speech. We are interested in particularities of segmental phonetics and prosody that may characterize pronunciation. In French, many errors caused by automatic speech recognition (ASR) systems arise from frequent homophone words, for which ASR systems depend on language model weights. Aut...

  19. Foundations of Large-Scale Multimedia Information Management and Retrieval

    CERN Document Server

    Chang, Edward Y

    2011-01-01

    "Foundations of Large-Scale Multimedia Information Management and Retrieval - Mathematics of Perception" covers knowledge representation and semantic analysis of multimedia data and scalability in signal extraction, data mining, and indexing. The book is divided into two parts: Part I - Knowledge Representation and Semantic Analysis focuses on the key components of mathematics of perception as it applies to data management and retrieval. These include feature selection/reduction, knowledge representation, semantic analysis, distance function formulation for measuring similarity, and

  20. Large-Scale Cortical Dynamics of Sleep Slow Waves

    OpenAIRE

    Botella-Soler, Vicente; Valderrama, Mario; Crépon, Benoît; Navarro, Vincent; Le Van Quyen, Michel

    2012-01-01

    Slow waves constitute the main signature of sleep in the electroencephalogram (EEG). They reflect alternating periods of neuronal hyperpolarization and depolarization in cortical networks. While recent findings have demonstrated their functional role in shaping and strengthening neuronal networks, a large-scale characterization of these two processes remains elusive in the human brain. In this study, by using simultaneous scalp EEG and intracranial recordings in 10 epileptic subjects, we exam...

  1. Indexing of CNN Features for Large Scale Image Search

    OpenAIRE

    Liu, Ruoyu; Zhao, Yao; Wei, Shikui; Zhu, Zhenfeng; Liao, Lixin; Qiu, Shuang

    2015-01-01

    Convolutional neural network (CNN) features, which represent an image with a global and high-dimensional vector, have shown highly discriminative capability in image search. Although CNN features are more compact than most local representation schemes, they still cannot efficiently deal with large-scale image search due to their non-negligible computational cost and storage usage. In this paper, we propose a simple but effective image indexing framework to improve the computational and stor...

  2. Query-driven indexing in large-scale distributed systems

    OpenAIRE

    Skobeltsyn, Gleb; Aberer, Karl

    2009-01-01

    Efficient and effective search in large-scale data repositories requires complex indexing solutions deployed on a large number of servers. Web search engines such as Google and Yahoo! already rely upon complex systems to be able to return relevant query results and keep processing times within the comfortable sub-second limit. Nevertheless, the exponential growth of the amount of content on the Web poses serious challenges with respect to scalability. Coping with these challenges requires nov...

  3. Punishment sustains large-scale cooperation in prestate warfare

    OpenAIRE

    Mathew, Sarah; Boyd, Robert

    2011-01-01

    Understanding cooperation and punishment in small-scale societies is crucial for explaining the origins of human cooperation. We studied warfare among the Turkana, a politically uncentralized, egalitarian, nomadic pastoral society in East Africa. Based on a representative sample of 88 recent raids, we show that the Turkana sustain costly cooperation in combat at a remarkably large scale, at least in part, through punishment of free-riders. Raiding parties comprised several hundred warriors an...

  4. Efficient Approximation Algorithms for Optimal Large-scale Network Monitoring

    OpenAIRE

    Kallitsis, Michalis; Stoev, Stilian; Michailidis, George

    2012-01-01

    The growing number of applications that generate vast amounts of data on short time scales renders the problem of partial monitoring, coupled with prediction, a rather fundamental one. We study the aforementioned canonical problem in the context of large-scale monitoring of communication networks. We consider the problem of selecting the "best" subset of links so as to optimally predict the quantity of interest at the remaining ones. This is a well-known NP-hard problem, and algorithms seekin...

  5. Large Scale Relationship between Aquatic Insect Traits and Climate

    OpenAIRE

    Bhowmik, Avit Kumar; Schäfer, Ralf B.

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated tra...

  6. A thermal energy storage process for large scale electric applications

    OpenAIRE

    Desrues, T; Ruer, J; Marty, P.; Fourmigué, JF

    2009-01-01

    A new type of thermal energy storage process for large scale electric applications is presented, based on a high-temperature heat pump cycle which transforms electrical energy into thermal energy and stores it inside two large regenerators, followed by a thermal engine cycle which transforms the stored thermal energy back into electrical energy. The storage principle is described, and its thermodynamic cycle is analyzed, leading to the theoretical efficiency of the storage...

  7. Large-scale flow experiments for managing river systems

    Science.gov (United States)

    Konrad, Christopher P.; Olden, Julian D.; Lytle, David A.; Melis, Theodore S.; Schmidt, John C.; Bray, Erin N.; Freeman, Mary C.; Gido, Keith B.; Hemphill, Nina P.; Kennard, Mark J.; McMullen, Laura E.; Mims, Meryl C.; Pyron, Mark; Robinson, Christopher T.; Williams, John G.

    2011-01-01

    Experimental manipulations of streamflow have been used globally in recent decades to mitigate the impacts of dam operations on river systems. Rivers are challenging subjects for experimentation, because they are open systems that cannot be isolated from their social context. We identify principles to address the challenges of conducting effective large-scale flow experiments. Flow experiments have both scientific and social value when they help to resolve specific questions about the ecological action of flow with a clear nexus to water policies and decisions. Water managers must integrate new information into operating policies for large-scale experiments to be effective. Modeling and monitoring can be integrated with experiments to analyze long-term ecological responses. Experimental design should include spatially extensive observations and well-defined, repeated treatments. Large-scale flow manipulations are only a part of dam operations that affect river systems. Scientists can ensure that experimental manipulations continue to be a valuable approach for the scientifically based management of river systems.

  8. Systematic Literature Review of Agile Scalability for Large Scale Projects

    Directory of Open Access Journals (Sweden)

    Hina Saeeda

    2015-09-01

    In new methods, "agile" has emerged as the top approach in the software industry for software development. Agile is applied in different forms to handle issues such as low cost, tight time-to-market schedules, continuously changing requirements, communication and coordination, team size and distributed environments. Agile has proved successful in small and medium-sized projects; however, it has several limitations when applied to large projects. The purpose of this study is to examine agile techniques in detail and to find and highlight their restrictions for large projects with the help of a systematic literature review. The systematic literature review seeks answers to the following research questions: (1) How can agile approaches be made scalable and adoptable for large projects? (2) What existing methods, approaches, frameworks and practices support the agile process in large-scale projects? (3) What are the limitations of existing agile approaches, methods, frameworks and practices with reference to large-scale projects? This study identifies the current research problems of agile scalability for large projects by giving a detailed literature review of the identified problems, of the existing work providing solutions to these problems, and of the limitations of that work. All results gathered are summarized statistically; based on these findings, remedial work will be planned in the future to handle the identified limitations of agile approaches for large-scale projects.

  9. Quasars and the large-scale structure of the Universe

    International Nuclear Information System (INIS)

    The problem of studying the large-scale structure of the Universe is discussed. In recent years the Zeldovich hypothesis has proved the most fruitful in this area. According to this hypothesis, plane large-scale inhomogeneities, so-called pancakes, form under the action of gravitation and of the shock waves that arise in the process. Numerical simulation of the development of such long-wave gravitational instability on a computer has confirmed the picture of pancakes as stretched large-scale formations which can create a cellular structure in the distribution of galaxies. However, the investigation of the structure of the Universe encounters a number of difficulties, the main one being the absence of statistically reliable data on distances to galaxies. To overcome these difficulties, scientists suggest using quasars, which, owing to their extreme luminosity, are visible almost from the boundary of the observable Universe. Quasars offer a possibility of revealing inhomogeneities in the distribution of galaxies and of investigating galaxy structures by subjecting them to powerful radiation along the line of sight.

  10. Large scale structure around a z=2.1 cluster

    CERN Document Server

    Hung, Chao-Ling; Chiang, Yi-Kuan; Capak, Peter; Cowley, Michael J; Darvish, Behnam; Kacprzak, Glenn G; Kovac, K; Lilly, Simon J; Nanayakkara, Themiya; Spitler, Lee R; Tran, Kim-Vy H; Yuan, Tiantian

    2016-01-01

    The most prodigious starburst galaxies are absent in massive galaxy clusters today, but their connection with large scale environments is less clear at z ≳ 2. We present a search for large scale structure around a galaxy cluster core at z = 2.095 using a set of spectroscopically confirmed galaxies. We find that both color-selected star-forming galaxies (SFGs) and dusty star-forming galaxies (DSFGs) show significant overdensities around the z = 2.095 cluster. A total of 8 DSFGs (including 3 X-ray luminous active galactic nuclei, AGNs) and 34 SFGs are found within a 10 arcmin radius (corresponding to ∼15 cMpc at z ∼ 2.1) from the cluster center and within a redshift range of Δz = 0.02, which leads to galaxy overdensities of δ_DSFG ∼ 12.3 and δ_SFG ∼ 2.8. The cluster core and the extended DSFG- and SFG-rich structure together demonstrate an active cluster formation phase, in which the cluster is accreting a significant amount of material from large scale structure whi...

  11. Design and fabrication of a large-scale oedometer

    Institute of Scientific and Technical Information of China (English)

    Maryam Mokhtari; Nader Shariatmadari; Ali Akbar Heshmati R; Hossein Salehzadeh

    2015-01-01

    The most common apparatus used to investigate the load-deformation parameters of homogeneous fine-grained soils is a Casagrande-type oedometer. A typical Casagrande oedometer cell has an internal diameter of 76 mm and a height of 19 mm. However, the dimensions of this kind of apparatus do not meet the requirements of some civil engineering applications, such as studying the load-deformation characteristics of specimens with large-diameter particles, e.g. granular materials or municipal solid waste. Therefore, it was decided to design and develop a large-scale oedometer with an internal diameter of 490 mm. The new apparatus makes it possible to evaluate the load-deformation characteristics of soil specimens with different diameter-to-height ratios, and it is able to measure the coefficient of lateral earth pressure at rest. The details and capabilities of the developed oedometer are provided and discussed. To study its performance and efficiency, a number of consolidation tests were performed on Firoozkoh No. 161 sand using both the newly developed large-scale oedometer and the 50 mm diameter Casagrande oedometer. Benchmark test results show that the consolidation parameters measured by the large-scale oedometer are comparable to the values measured by the Casagrande-type oedometer.

  12. A Model of Plasma Heating by Large-Scale Flow

    CERN Document Server

    Pongkitiwanichakul, P; Boldyrev, S; Mason, J; Perez, J C

    2015-01-01

    In this work we study the process of energy dissipation triggered by a slow large scale motion of a magnetized conducting fluid. Our consideration is motivated by the problem of heating the solar corona, which is believed to be governed by fast reconnection events set off by the slow motion of magnetic field lines anchored in the photospheric plasma. To elucidate the physics governing the disruption of the imposed laminar motion and the energy transfer to small scales, we propose a simplified model where the large-scale motion of magnetic field lines is prescribed not at the footpoints but rather imposed volumetrically. As a result, the problem can be treated numerically with an efficient, highly-accurate spectral method, allowing us to use a resolution and statistical ensemble exceeding those of the previous work. We find that, even though the large-scale deformations are slow, they eventually lead to reconnection events that drive a turbulent state at smaller scales. The small-scale turbulence displays many...

  13. Searching for Large Scale Structure in Deep Radio Surveys

    CERN Document Server

    Baleisis, Audra; Lahav, Ofer; Loan, Andrew J.; Wall, Jasper V.

    1997-01-01

    (Abridged Abstract) We calculate the expected amplitude of the dipole and higher spherical harmonics in the angular distribution of radio galaxies. The median redshift of radio sources in existing catalogues is z=1, which allows us to study large scale structure on scales between those accessible to present optical and infrared surveys, and that of the Cosmic Microwave Background (CMB). The dipole is due to 2 effects which turn out to be of comparable magnitude: (i) our motion with respect to the CMB, and (ii) large scale structure, parameterised here by a family of Cold Dark Matter power-spectra. We make specific predictions for the Green Bank (87GB) and Parkes-MIT-NRAO (PMN) catalogues. For these relatively sparse catalogues both the motion and large scale structure dipole effects are expected to be smaller than the Poisson shot-noise. However, we detect dipole and higher harmonics in the combined 87GB-PMN catalogue which are far larger than expected. We attribute this to a 2 % flux mismatch between the two...

  14. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models related to weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, together with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
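    As a deliberately simplified illustration of the kind of gauge-versus-grid verification statistics such comparisons rely on, assuming co-located station and product values (the numbers below are invented):

        # Basic verification statistics for a gridded product against gauges.
        import numpy as np

        gauge = np.array([2.0, 0.0, 5.1, 12.3, 0.4])   # station totals (mm)
        grid = np.array([1.6, 0.2, 4.0, 10.8, 0.9])    # co-located product values (mm)

        bias = np.mean(grid - gauge)                   # mean error
        rmse = np.sqrt(np.mean((grid - gauge) ** 2))   # root-mean-square error
        corr = np.corrcoef(grid, gauge)[0, 1]          # linear correlation
        print(f"bias={bias:.2f} mm  rmse={rmse:.2f} mm  r={corr:.2f}")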

  15. Star formation associated with a large-scale infrared bubble

    CERN Document Server

    Xu, Jin-Long

    2014-01-01

    Using data from the Galactic Ring Survey (GRS) and the Galactic Legacy Infrared Mid-Plane Survey Extraordinaire (GLIMPSE), we performed a study of a large-scale infrared bubble with a size of about 16 pc at a distance of 2.0 kpc. We present 12CO J=1-0, 13CO J=1-0 and C18O J=1-0 observations of the H II region G53.54-0.01 (Sh2-82), obtained at the Purple Mountain Observatory (PMO) 13.7 m radio telescope, to investigate the detailed distribution of the associated molecular material. The large-scale infrared bubble shows a half-shell morphology at 8 μm. The H II regions G53.54-0.01, G53.64+0.24, and G54.09-0.06 are situated on the bubble. Comparing the radio recombination line velocities and associated 13CO J=1-0 components of the three H II regions, we found that the 8 μm emission associated with H II region G53.54-0.01 belongs to the foreground and only overlaps with the large-scale infrared bubble along the line of sight. Three extended green objects (EGOs, candidate massive young stellar objects), ...

  16. Line segment extraction for large scale unorganized point clouds

    Science.gov (United States)

    Lin, Yangbin; Wang, Cheng; Cheng, Jun; Chen, Bili; Jia, Fukai; Chen, Zhonggui; Li, Jonathan

    2015-04-01

    Line segment detection in images is already a well-investigated topic, although it has received considerably less attention in 3D point clouds. Benefiting from current LiDAR devices, large-scale point clouds are becoming increasingly common. Most human-made objects have flat surfaces. Line segments that occur where pairs of planes intersect give important information regarding the geometric content of point clouds, which is especially useful for automatic building reconstruction and segmentation. This paper proposes a novel method that is capable of accurately extracting plane intersection line segments from large-scale raw scan points. The 3D line-support region, namely, a point set near a straight linear structure, is extracted simultaneously. The 3D line-support region is fitted by our Line-Segment-Half-Planes (LSHP) structure, which provides a geometric constraint for a line segment, making the line segment more reliable and accurate. We demonstrate our method on the point clouds of large-scale, complex, real-world scenes acquired by LiDAR devices. We also demonstrate the application of 3D line-support regions and their LSHP structures on urban scene abstraction.

  17. BILGO: Bilateral greedy optimization for large scale semidefinite programming

    KAUST Repository

    Hao, Zhifeng

    2013-10-03

    Many machine learning tasks (e.g. metric and manifold learning problems) can be formulated as convex semidefinite programs. To enable the application of these tasks on a large scale, scalability and computational efficiency are considered as desirable properties for a practical semidefinite programming algorithm. In this paper, we theoretically analyze a new bilateral greedy optimization (denoted BILGO) strategy in solving general semidefinite programs on large-scale datasets. As compared to existing methods, BILGO employs a bilateral search strategy during each optimization iteration. In such an iteration, the current semidefinite matrix solution is updated as a bilateral linear combination of the previous solution and a suitable rank-1 matrix, which can be efficiently computed from the leading eigenvector of the descent direction at this iteration. By optimizing for the coefficients of the bilateral combination, BILGO reduces the cost function in every iteration until the KKT conditions are fully satisfied, thus, it tends to converge to a global optimum. In fact, we prove that BILGO converges to the global optimal solution at a rate of O(1/k), where k is the iteration counter. The algorithm thus successfully combines the efficiency of conventional rank-1 update algorithms and the effectiveness of gradient descent. Moreover, BILGO can be easily extended to handle low rank constraints. To validate the effectiveness and efficiency of BILGO, we apply it to two important machine learning tasks, namely Mahalanobis metric learning and maximum variance unfolding. Extensive experimental results clearly demonstrate that BILGO can solve large-scale semidefinite programs efficiently.
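    The bilateral update is easy to prototype. The sketch below applies the idea to a toy problem (projecting a symmetric matrix onto the positive semidefinite cone), not to the metric-learning programs of the paper; the objective and step rule are assumptions for illustration only:

        # BILGO-style bilateral update: X <- a*X + b*v v^T, where v is the leading
        # eigenvector of the descent direction -grad f(X), and (a, b) >= 0 are
        # chosen by exact line search for the quadratic toy objective
        # f(X) = 0.5 * ||X - C||_F^2.
        import numpy as np

        rng = np.random.default_rng(1)
        A = rng.standard_normal((50, 50))
        C = 0.5 * (A + A.T)                        # symmetric target matrix

        X = np.zeros_like(C)
        for _ in range(200):
            G = X - C                              # gradient of f at X
            _, V = np.linalg.eigh(-G)
            v = V[:, -1]                           # leading eigenvector of -G
            R = np.outer(v, v)                     # rank-1 update direction
            M = np.array([[np.vdot(X, X), np.vdot(X, R)],
                          [np.vdot(X, R), np.vdot(R, R)]])
            rhs = np.array([np.vdot(X, C), np.vdot(R, C)])
            a, b = np.linalg.lstsq(M, rhs, rcond=None)[0]
            X = max(a, 0.0) * X + max(b, 0.0) * R  # nonnegative combination keeps X PSD
        print("final objective:", 0.5 * np.linalg.norm(X - C) ** 2)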

  18. Large-scale flow generation by inhomogeneous helicity

    Science.gov (United States)

    Yokoi, N.; Brandenburg, A.

    2016-03-01

    The effect of kinetic helicity (velocity-vorticity correlation) on turbulent momentum transport is investigated. The turbulent kinetic helicity (pseudoscalar) enters the Reynolds stress (mirror-symmetric tensor) expression in the form of a helicity gradient as the coupling coefficient for the mean vorticity and/or the angular velocity (axial vector), which suggests the possibility of mean-flow generation in the presence of inhomogeneous helicity. This inhomogeneous helicity effect, which was previously confirmed at the level of a turbulence- or closure-model simulation, is examined with the aid of direct numerical simulations of rotating turbulence with nonuniform helicity sustained by an external forcing. The numerical simulations show that the spatial distribution of the Reynolds stress is in agreement with the helicity-related term coupled with the angular velocity, and that a large-scale flow is generated in the direction of angular velocity. Such a large-scale flow is not induced in the case of homogeneous turbulent helicity. This result confirms the validity of the inhomogeneous helicity effect in large-scale flow generation and suggests that a vortex dynamo is possible even in incompressible turbulence where there is no baroclinicity effect.

  19. Large Scale Magnetic Fields: Density Power Spectrum in Redshift Space

    Indian Academy of Sciences (India)

    Rajesh Gopal; Shiv K. Sethi

    2003-09-01

    We compute the density redshift-space power spectrum in the presence of tangled magnetic fields and compare it with existing observations. Our analysis shows that if these magnetic fields originated in the early universe then it is possible to construct models for which the shape of the power spectrum agrees with the large-scale slope of the observed power spectrum. However, requiring compatibility with observed CMBR anisotropies, the normalization of the power spectrum is too low for magnetic fields to have a significant impact on the large-scale structure at present. Magnetic fields of a more recent origin generically give a density power spectrum ∝ k⁴, which does not agree with the shape of the observed power spectrum at any scale. Magnetic fields generate curl modes of the velocity field which increase both the quadrupole and hexadecapole of the redshift-space power spectrum. For curl modes, the hexadecapole dominates over the quadrupole. So the presence of curl modes could be indicated by an anomalously large hexadecapole, which has not yet been computed from observations. It appears difficult to construct models in which tangled magnetic fields could have played a major role in shaping the large-scale structure in the present epoch. However, if they did, one of the best ways to infer their presence would be from the redshift-space effects in the density power spectrum.
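    For reference, the quadrupole and hexadecapole discussed here are the ℓ = 2 and ℓ = 4 Legendre multipoles of the anisotropic power spectrum; a standard definition (stated for context, not quoted from the paper) is, in LaTeX notation,

        P_\ell(k) = \frac{2\ell + 1}{2} \int_{-1}^{1} P(k, \mu)\, \mathcal{L}_\ell(\mu)\, \mathrm{d}\mu,
        \qquad \ell = 0, 2, 4,

    where μ is the cosine of the angle between the wavevector and the line of sight and 𝓛_ℓ is the Legendre polynomial of order ℓ.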

  20. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI to visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method which scales well with dataset size. Our approach is based on constructing a per-pixel linked-list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
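    The per-pixel linked list at the heart of this approach is simple to express; here is a CPU-side sketch with invented record fields (the real implementation builds these lists on the GPU):

        # Per-pixel linked lists of pathline fragments for view-dependent
        # filtering and recoloring without re-reading the raw flow data.
        from collections import defaultdict

        pixel_lists = defaultdict(list)  # (x, y) -> [(pathline_id, depth, seed_time), ...]

        def splat(x, y, pathline_id, depth, seed_time):
            pixel_lists[(x, y)].append((pathline_id, depth, seed_time))

        splat(10, 20, pathline_id=7, depth=0.35, seed_time=1.2)
        splat(10, 20, pathline_id=9, depth=0.10, seed_time=0.4)

        # Composite a pixel: sort fragments front-to-back, apply a filter
        # (here, a seed-time window) without touching the original data.
        fragments = sorted(pixel_lists[(10, 20)], key=lambda r: r[1])
        visible = [f for f in fragments if 0.0 <= f[2] <= 1.0]
        print(visible)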

  1. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, J.; Quinby, T.; Caulfield, E.; Gerritsen, M.; Diffendorfer, J.; Haines, S.

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
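    A bare-bones version of such a multi-criteria siting score, with invented layer names and weights (the report's actual criteria and optimization algorithm are not reproduced here):

        # Weighted multi-criteria scoring over normalized raster layers.
        import numpy as np

        rng = np.random.default_rng(2)
        shape = (100, 100)           # grid cells over the study area
        solar = rng.random(shape)    # solar resource, 0..1 (higher is better)
        slope = rng.random(shape)    # slope penalty, 0..1 (higher is worse)
        txdist = rng.random(shape)   # transmission distance, 0..1 (higher is worse)

        score = 0.6 * solar - 0.2 * slope - 0.2 * txdist
        best = np.unravel_index(np.argmax(score), shape)
        print("best cell:", best, "score:", round(float(score[best]), 3))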

  2. CMB Lensing Bispectrum from Nonlinear Growth of the Large Scale Structure

    CERN Document Server

    Namikawa, Toshiya

    2016-01-01

    We discuss the detectability of the nonlinear growth of the large-scale structure in cosmic microwave background (CMB) lensing. Lensing signals involved in CMB anisotropies have been measured by multiple CMB experiments, such as the Atacama Cosmology Telescope (ACT), Planck, POLARBEAR, and the South Pole Telescope (SPT). Reconstructed lensing signals are useful to constrain cosmology via their angular power spectrum, while the detectability and cosmological application of their bispectrum induced by the nonlinear evolution are not well studied. Extending the analytic estimate of the galaxy lensing bispectrum presented in Takada and Jain (2004) to the CMB case, we show that even near-term CMB experiments such as Advanced ACT, Simons Array and SPT3G could detect the CMB lensing bispectrum induced by the nonlinear growth of the large-scale structure. In the case of CMB Stage-IV, we find that the lensing bispectrum is detectable at ≳ 50σ statistical significance. This precisely measured lensing bispectru...

  3. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    CERN Document Server

    Fonseca, Ricardo A; Fiúza, Frederico; Davidson, Asher; Tsung, Frank S; Mori, Warren B; Silva, Luís O

    2013-01-01

    A new generation of laser wakefield accelerators, supported by the extreme accelerating fields generated in the interaction of PW-Class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modeling for further understanding of the underlying physics and identification of optimal regimes, but large scale modeling of these scenarios is computationally heavy and requires efficient use of state-of-the-art Petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed / shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modeling of LWFA, demonstrating speedups of over 1 order of magni...

  4. Economic Model Predictive Control for Large-Scale and Distributed Energy Systems

    DEFF Research Database (Denmark)

    Standardi, Laura

    … dynamically decoupled. The mathematical model of the large-scale energy system embodies the decoupled dynamics of each power unit. Moreover, all units of the grid contribute to the overall power production. Economic Model Predictive Control (EMPC): this control strategy is an extension of Model Predictive … The share of Renewable Energy Sources (RESs) in smart grids is increasing. These energy sources bring uncertainty to the production due to their fluctuations. Hence, smart grids need suitable control systems that are able to continuously balance power production and consumption. We apply the Economic Model Predictive Control (EMPC …) strategy to optimise the economic performance of the energy systems and to balance power production and consumption. In the case of large-scale energy systems, the electrical grid connects a high number of power units. Because of this, the related control problem involves a high number of variables...

  5. Predicting protein functions from redundancies in large-scale protein interaction networks

    Science.gov (United States)

    Samanta, Manoj Pratim; Liang, Shoudan

    2003-01-01

    Interpreting data from large-scale protein interaction experiments has been a challenging task because of the widespread presence of random false positives. Here, we present a network-based statistical algorithm that overcomes this difficulty and allows us to derive functions of unannotated proteins from large-scale interaction data. Our algorithm uses the insight that if two proteins share a significantly larger number of common interaction partners than expected at random, they have close functional associations. Analysis of publicly available data from Saccharomyces cerevisiae reveals >2,800 reliable functional associations, 29% of which involve at least one unannotated protein. By further analyzing these associations, we derive tentative functions for 81 unannotated proteins with high certainty. Our method is not overly sensitive to the false positives present in the data. Even after adding 50% randomly generated interactions to the measured data set, we are able to recover almost all (approximately 89%) of the original associations.
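    The "more shared partners than random" insight can be scored with a hypergeometric tail probability; the sketch below is a simplification in that spirit, not a quotation of the paper's exact formula:

        # P-value that two proteins share >= m interaction partners by chance,
        # given that they have n1 and n2 partners among N proteins in total.
        from scipy.stats import hypergeom

        def shared_partner_pvalue(N, n1, n2, m):
            # Model: draw n2 items from a population of N containing n1 marked
            # items; sf(m - 1) gives P(overlap >= m).
            return hypergeom.sf(m - 1, N, n1, n2)

        # Illustrative numbers: 6000 yeast proteins, degrees 40 and 30, overlap 8.
        print(shared_partner_pvalue(6000, 40, 30, 8))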

  6. A Protocol for the Atomic Capture of Multiple Molecules at Large Scale

    CERN Document Server

    Bertier, Marin; Tedeschi, Cédric

    2012-01-01

    With the rise of service-oriented computing, applications are more and more based on the coordination of autonomous services. Envisioned over largely distributed and highly dynamic platforms, expressing this coordination calls for alternative programming models. The chemical programming paradigm, which models applications as chemical solutions where molecules, representing the digital entities involved in the computation, react together to produce a result, has recently been shown to provide the abstractions needed for the autonomic coordination of services. However, the execution of such programs over large scale platforms raises several problems that hinder this paradigm from actually being leveraged. Among them, the atomic capture of molecules participating in concurrent reactions is one of the most significant. In this paper, we propose a protocol for the atomic capture of these molecules distributed and evolving over a large scale platform. As the density of possible reactions is crucial for the liveness and efficiency of...

  7. Bursting and large-scale intermittency in turbulent convection with differential rotation

    International Nuclear Information System (INIS)

    The tilting mechanism, which generates differential rotation in two-dimensional turbulent convection, is shown to produce relaxation oscillations in the mean flow energy integral and bursts in the global fluctuation level, akin to Lotka-Volterra oscillations. The basic reason for such behavior is the unidirectional and conservative transfer of kinetic energy from the fluctuating motions to the mean component of the flows, and its dissipation at large scales. Results from numerical simulations further demonstrate the intimate relation between these low-frequency modulations and the large-scale intermittency of convective turbulence, as manifested by exponential tails in single-point probability distribution functions. Moreover, the spatio-temporal evolution of convective structures illustrates the mechanism triggering avalanche events in the transport process. The latter involves the overlap of delocalized mixing regions when the barrier to transport, produced by the mean component of the flow, transiently disappears.
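
    Such relaxation oscillations are commonly summarized by a predator-prey reduction in which the fluctuation energy E_f feeds the mean-flow energy E_m (a schematic model; the coefficients are illustrative and not taken from this record):

        \begin{aligned}
        \frac{dE_f}{dt} &= \gamma E_f - \alpha\, E_f E_m, \\
        \frac{dE_m}{dt} &= \alpha\, E_f E_m - \mu E_m,
        \end{aligned}

    where gamma is the linear drive of the convective fluctuations, alpha the rate of the conservative, unidirectional energy transfer to the mean flow, and mu the large-scale dissipation; the shared alpha*E_f*E_m term is exactly what produces Lotka-Volterra-type bursting.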

  8. Numerical Modeling of Large-Scale Rocky Coastline Evolution

    Science.gov (United States)

    Limber, P.; Murray, A. B.; Littlewood, R.; Valvo, L.

    2008-12-01

    Seventy-five percent of the world's ocean coastline is rocky. On large scales (i.e. greater than a kilometer), many intertwined processes drive rocky coastline evolution, including coastal erosion and sediment transport, tectonics, antecedent topography, and variations in sea cliff lithology. In areas such as California, an additional aspect of rocky coastline evolution involves submarine canyons that cut across the continental shelf and extend into the nearshore zone. These types of canyons intercept alongshore sediment transport and flush sand to abyssal depths during periodic turbidity currents, thereby delineating coastal sediment transport pathways and affecting shoreline evolution over large spatial and time scales. How tectonic, sediment transport, and canyon processes interact with inherited topographic and lithologic settings to shape rocky coastlines remains an unanswered, and largely unexplored, question. We will present numerical model results of rocky coastline evolution that start with an immature fractal coastline. The initial shape is modified by headland erosion, wave-driven alongshore sediment transport, and submarine canyon placement. Our previous model results have shown that, as expected, an initially sediment-free, irregularly shaped rocky coastline with homogeneous lithology will undergo smoothing in response to wave attack; headlands erode and mobile sediment is swept into bays, forming isolated pocket beaches. As this diffusive process continues, pocket beaches coalesce, and a continuous sediment transport pathway results. However, when a randomly placed submarine canyon is introduced to the system as a sediment sink, the end results are wholly different: sediment cover is reduced, which in turn increases weathering and erosion rates and causes the entire shoreline to move landward more rapidly. The canyon's alongshore position also affects coastline morphology. When placed offshore of a headland, the submarine canyon captures local sediment...
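
    The wave-driven smoothing described above acts like diffusion of the shoreline position; a toy one-dimensional sketch (purely illustrative, with an invented 'canyon' cell that removes the sediment flux that would otherwise accumulate there) is:

        import numpy as np

        def evolve_coastline(y, n_steps, D=0.1, canyon=None):
            """Diffusive smoothing of shoreline position y: headlands
            erode and bays fill. An optional canyon cell intercepts
            the incoming sediment, suppressing local beach growth."""
            y = y.copy()
            for _ in range(n_steps):
                flux = D * (np.roll(y, -1) - 2 * y + np.roll(y, 1))
                if canyon is not None:
                    flux[canyon] = min(flux[canyon], 0.0)  # sediment lost offshore
                y += flux
            return y

        rng = np.random.default_rng(0)
        rough = np.cumsum(rng.normal(size=200))   # fractal-like initial coast
        print(evolve_coastline(rough, 500, canyon=100)[95:105].round(2))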

  9. Application of simplified models to CO2 migration and immobilization in large-scale geological systems

    KAUST Repository

    Gasda, Sarah E.

    2012-07-01

    Long-term stabilization of injected carbon dioxide (CO2) is an essential component of risk management for geological carbon sequestration operations. However, migration and trapping phenomena are inherently complex, involving processes that act over multiple spatial and temporal scales. One example involves centimeter-scale density instabilities in the dissolved CO2 region leading to large-scale convective mixing that can be a significant driver for CO2 dissolution. Another example is the potentially important effect of capillary forces, in addition to buoyancy and viscous forces, on the evolution of mobile CO2. Local capillary effects lead to a capillary transition zone, or capillary fringe, where both fluids are present in the mobile state. This small-scale effect may have a significant impact on large-scale plume migration as well as long-term residual and dissolution trapping. Computational models that can capture both large- and small-scale effects are essential to predict the role of these processes in the long-term storage security of CO2 sequestration operations. Conventional modeling tools are unable to resolve sufficiently all of these relevant processes when modeling CO2 migration in large-scale geological systems. Herein, we present a vertically-integrated approach to CO2 modeling that employs upscaled representations of these subgrid processes. We apply the model to the Johansen formation, a prospective site for sequestration of Norwegian CO2 emissions, and explore the sensitivity of CO2 migration and trapping to subscale physics. Model results show the relative importance of different physical processes in large-scale simulations. The ability of models such as this to capture the relevant physical processes at large spatial and temporal scales is important for prediction and analysis of CO2 storage sites. © 2012 Elsevier Ltd.

  10. Statistical Modeling of Large-Scale Scientific Simulation Data

    Energy Technology Data Exchange (ETDEWEB)

    Eliassi-Rad, T; Baldwin, C; Abdulla, G; Critchlow, T

    2003-11-15

    With the advent of massively parallel computer systems, scientists are now able to simulate complex phenomena (e.g., explosions of stars). Such scientific simulations typically generate large-scale data sets over the spatio-temporal space. Unfortunately, the sheer size of the generated data sets makes efficient exploration of them impossible. Constructing queriable statistical models is an essential step in helping scientists glean new insight from their computer simulations. We define queriable statistical models to be descriptive statistics that (1) summarize and describe the data within a user-defined modeling error, and (2) are able to answer complex range-based queries over the spatio-temporal dimensions. In this chapter, we describe systems that build queriable statistical models for large-scale scientific simulation data sets. In particular, we present our Ad-hoc Queries for Simulation (AQSim) infrastructure, which reduces the data storage requirements and query access times by (1) creating and storing queriable statistical models of the data at multiple resolutions, and (2) evaluating queries on these models of the data instead of the entire data set. Within AQSim, we focus on three simple but effective statistical modeling techniques. AQSim's first modeling technique (called the univariate mean modeler) computes the "true" (unbiased) mean of systematic partitions of the data. AQSim's second technique (called the univariate goodness-of-fit modeler) applies the Anderson-Darling goodness-of-fit method to systematic partitions of the data. Finally, AQSim's third technique (called the multivariate clusterer) utilizes the cosine similarity measure to cluster the data into similar groups. Our experimental evaluations on several scientific simulation data sets illustrate the value of using these statistical models on large-scale simulation data sets.
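
    A minimal sketch of the first two modelers (illustrative only; AQSim's actual partitioning and modeling-error control are more involved than this):

        import numpy as np
        from scipy.stats import anderson

        def partition_models(data, n_parts):
            """Summarize systematic partitions of a 1-D field by an
            unbiased mean and an Anderson-Darling normality statistic."""
            models = []
            for part in np.array_split(data, n_parts):
                ad = anderson(part, dist='norm')            # goodness-of-fit modeler
                models.append({'mean': float(part.mean()),  # mean modeler
                               'ad_stat': float(ad.statistic),
                               'n': len(part)})
            return models

        field = np.random.default_rng(1).normal(5.0, 2.0, 10_000)
        for m in partition_models(field, 4):
            print(m)   # range queries are answered from these summaries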

  11. Infectious diseases in large-scale cat hoarding investigations.

    Science.gov (United States)

    Polak, K C; Levy, J K; Crawford, P C; Leutenegger, C M; Moriello, K A

    2014-08-01

    Animal hoarders accumulate animals in over-crowded conditions without adequate nutrition, sanitation, and veterinary care. As a result, animals rescued from hoarding frequently have a variety of medical conditions including respiratory infections, gastrointestinal disease, parasitism, malnutrition, and other evidence of neglect. The purpose of this study was to characterize the infectious diseases carried by clinically affected cats and to determine the prevalence of retroviral infections among cats in large-scale cat hoarding investigations. Records were reviewed retrospectively from four large-scale seizures of cats from failed sanctuaries from November 2009 through March 2012. The number of cats seized in each case ranged from 387 to 697. Cats were screened for feline leukemia virus (FeLV) and feline immunodeficiency virus (FIV) in all four cases and for dermatophytosis in one case. A subset of cats exhibiting signs of upper respiratory disease or diarrhea had been tested for infections by PCR and fecal flotation for treatment planning. Mycoplasma felis (78%), calicivirus (78%), and Streptococcus equi subspecies zooepidemicus (55%) were the most common respiratory infections. Feline enteric coronavirus (88%), Giardia (56%), Clostridium perfringens (49%), and Tritrichomonas foetus (39%) were most common in cats with diarrhea. The seroprevalence of FeLV and FIV was 8% each. In the one case in which cats with lesions suspicious for dermatophytosis were cultured for Microsporum canis, 69/76 lesional cats were culture-positive; of these, half were believed to be truly infected and half were believed to be fomite carriers. Cats from large-scale hoarding cases were at high risk of enteric and respiratory infections, retroviruses, and dermatophytosis. Case responders should be prepared for mass treatment of infectious diseases and should implement protocols to prevent transmission of feline or zoonotic infections during the emergency response and when...

  12. Properties of large-scale methane/hydrogen jet fires

    International Nuclear Information System (INIS)

    A future economy based on the reduction of carbon-based fuels for power generation and transportation may consider hydrogen as a possible energy carrier. Extensive and widespread use of hydrogen might require a pipeline network; the alternatives are to use the existing natural gas network or to design a dedicated network. Whatever the solution, mixing hydrogen with natural gas will substantially modify the consequences of accidents. The French National Research Agency (ANR) funded project HYDROMEL focuses on these critical questions. Within this project, large-scale jet fires have been studied experimentally and numerically. The main characteristics of these flames, including visible length, radiation fluxes and blowout, have been assessed. (authors)

  13. Large-scale glaciation on Earth and on Mars

    OpenAIRE

    Greve, Ralf

    2007-01-01

    This habilitation thesis combines ten publications of the author which are concerned with the large-scale dynamics and thermodynamics of ice sheets and ice shelves. Ice sheets are ice masses with a minimum area of 50,000 km2 which rest on solid land, whereas ice shelves consist of floating ice nourished by the mass flow from an adjacent ice sheet, typically stabilized by large bays. Together, they represent the major part of the cryosphere of the Earth. Furthermore, ice on Earth occurs in the...

  14. Large Scale Self-Similar Skeletal Structure of the Universe

    CERN Document Server

    Rantsev-Kartinov, Valentin A

    2005-01-01

    An analysis of the redshift maps of galaxies and quasars has revealed large-scale self-similar skeletal structures of the Universe of the same topology which had been found earlier in a wide range of phenomena, spatial scales and environments. A "cartwheel" type of structure with diameter ~1.5×10^27 cm is discovered in this analysis by means of the method of multi-level dynamical contrasting. Similar skeletal structures with sizes up to 1.5×10^28 cm are also found in the redshift maps of quasars.

  15. Decentralized stabilization of large-scale civil structures

    Czech Academy of Sciences Publication Activity Database

    Bakule, Lubomír; Papík, Martin; Rehák, Branislav

    Cape Town: IFAC, 2014, s. 10427-10432. ISBN 978-3-902823-62-5. [The 19th World Congress of the IFAC /2014/. Cape Town (ZA), 24.08.2014-29.08.2014] R&D Projects: GA ČR GA13-02149S; GA MŠk(CZ) LG12014; GA MŠk(CZ) LG12008 Institutional support: RVO:67985556 Keywords : Decentralized control * efficient strategies for large scale complex systems * monitoring and control of spatially distributed systems Subject RIV: BC - Control Systems Theory http://library.utia.cas.cz/separaty/2014/AS/bakule-0431251.pdf

  16. Large scale PV plants - also in Denmark. Project report

    Energy Technology Data Exchange (ETDEWEB)

    Ahm, P. (PA Energy, Malling (Denmark)); Vedde, J. (SiCon. Silicon and PV consulting, Birkeroed (Denmark))

    2011-04-15

    Large scale PV (LPV) plants, i.e. plants with a capacity of more than 200 kW, have since 2007 constituted an increasing share of global PV installations. In 2009, large-scale PV plants with a cumulative power of more than 1.3 GWp were connected to the grid. The necessary design data for LPV plants in Denmark are available or can be found, although irradiance data could be improved. There seem to be very few institutional barriers for LPV projects, but as no real LPV projects have yet been processed, these findings must be regarded as preliminary. The fast-growing number of very large-scale solar thermal plants for district heating applications supports these findings. It has further been investigated how to optimize the layout of LPV plants. Under Danish irradiance conditions, with several winter months of very low solar height, PV installations on flat surfaces will have to balance the requirements of physical space and cost against the loss of electricity production due to shadowing effects. The potential for LPV plants in Denmark is found in three main categories: PV installations on the flat roofs of large commercial buildings, PV installations on other large-scale infrastructure such as noise barriers, and ground-mounted PV installations. The technical potential for all three categories is significant, in the range of 50 - 250 km2. In terms of energy harvest, PV plants under Danish conditions exhibit an overall efficiency of about 10% in converting the energy content of the light, compared to about 0.3% for biomass. The theoretical ground area needed to produce the present annual electricity consumption of Denmark (33 - 35 TWh) is about 300 km2. The Danish grid codes and electricity safety regulations say very little about PV and nothing about LPV plants; it is expected that LPV plants will be treated similarly to big wind turbines. A number of LPV plant scenarios have been investigated in detail based on real commercial offers and...
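
    The quoted land requirement is easy to check with a back-of-the-envelope calculation (a sketch assuming roughly 1,000 kWh/m2 of annual horizontal irradiation in Denmark, together with the 10% overall efficiency stated above):

        annual_demand_twh = 34       # Danish electricity consumption, mid-range
        insolation_kwh_m2 = 1000     # assumed annual irradiation per m2
        efficiency = 0.10            # overall light-to-electricity efficiency

        yield_kwh_m2 = insolation_kwh_m2 * efficiency     # ~100 kWh per m2 per year
        area_m2 = annual_demand_twh * 1e9 / yield_kwh_m2  # 1 TWh = 1e9 kWh
        print(f"{area_m2 / 1e6:.0f} km2")                 # ~340 km2, near the ~300 km2 quoted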

  17. Hijacking Bitcoin: Large-scale Network Attacks on Cryptocurrencies

    OpenAIRE

    Apostolaki, Maria; Zohar, Aviv; Vanbever, Laurent

    2016-01-01

    Bitcoin is without a doubt the most successful cryptocurrency in circulation today, making it an extremely valuable target for attackers. Indeed, many studies have highlighted ways to compromise one or several Bitcoin nodes. In this paper, we take a different perspective and study the effect of large-scale network-level attacks such as the ones that may be launched by Autonomous Systems (ASes). We show that attacks that are commonly believed to be hard, such as isolating 50% of the mining power...

  18. Electric power generation in large-scale power plants

    International Nuclear Information System (INIS)

    Future electric power consumption will depend on the economic development of the Federal Republic of Germany. Thermal power plants are fueled with non-renewable energy sources, i.e. coal, petroleum, natural gas or nuclear power. It is therefore important to assess the global coverage of these energy sources and to take stock of the reserves of the Federal Republic of Germany. If the waste heat from electric power generation were used in dual-purpose power plants, total energy consumption could be considerably reduced. Large-scale power plants, however, have to cope with the lack of distribution networks to supply the consumer. (DG)

  19. Large-Scale Environmental Effects of the Cluster Distribution

    CERN Document Server

    Plionis, M

    2001-01-01

    Using the APM cluster distribution we find interesting alignment effects: (1) cluster substructure is strongly correlated with the tendency of clusters to be aligned with their nearest neighbour, and in general with the nearby clusters that belong to the same supercluster; (2) clusters belonging to superclusters show a statistically significant tendency to be aligned with the major axis orientation of their parent supercluster. Furthermore, we find that dynamically young clusters are more clustered than the overall cluster population. These are strong indications that clusters develop in a hierarchical fashion by merging along the large-scale filamentary superclusters within which they are embedded.

  20. GroFi: Large-scale fiber placement research facility

    OpenAIRE

    Krombholz, Christian; Kruse, Felix; Wiedemann, Martin

    2016-01-01

    GroFi is a large research facility operated by the German Aerospace Center's Center for Lightweight-Production-Technology in Stade. A combination of different layup technologies, namely (dry) fiber placement and tape laying, allows the development and validation of new production technologies and processes for large-scale composite components. Due to the use of coordinated and simultaneously working layup units, a high flexibility of the research platform is achieved. This allows the investigation of...

  1. Measuring Large-Scale Social Networks with High Resolution

    DEFF Research Database (Denmark)

    Stopczynski, Arkadiusz; Sekara, Vedran; Sapiezynski, Piotr;

    2014-01-01

    This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years: the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1,000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation...

  2. Large Scale Simulations of the Euler Equations on GPU Clusters

    KAUST Repository

    Liebmann, Manfred

    2010-08-01

    The paper investigates the scalability of a parallel Euler solver, using the Vijayasundaram method, on a GPU cluster with 32 Nvidia Geforce GTX 295 boards. The aim of this research is to enable large scale fluid dynamics simulations with up to one billion elements. We investigate communication protocols for the GPU cluster to compensate for the slow Gigabit Ethernet network between the GPU compute nodes and to maintain overall efficiency. A diesel engine intake-port and a nozzle, meshed in different resolutions, give good real world examples for the scalability tests on the GPU cluster. © 2010 IEEE.

  3. Large-Scale Experiments in a Sandy Aquifer in Denmark

    DEFF Research Database (Denmark)

    Jensen, Karsten Høgh; Bitsch, Karen Bue; Bjerg, Poul Løgstrup

    1993-01-01

    A large-scale natural gradient dispersion experiment was carried out in a sandy aquifer in the western part of Denmark using tritium and chloride as tracers. For both plumes a marked spreading was observed in the longitudinal direction, while the spreading in the transverse horizontal and transverse vertical directions was very small. The horizontal transport parameters of the advection-dispersion equation were investigated by applying an optimization model to observed breakthrough curves of tritium representing depth-averaged concentrations. No clear trend in dispersion parameters with travel...

  4. Large scale obscuration and related climate effects open literature bibliography

    Energy Technology Data Exchange (ETDEWEB)

    Russell, N.A.; Geitgey, J.; Behl, Y.K.; Zak, B.D.

    1994-05-01

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.

  5. Synthesis and sensing application of large scale bilayer graphene

    Science.gov (United States)

    Hong, Sung Ju; Yoo, Jung Hoon; Baek, Seung Jae; Park, Yung Woo

    2012-02-01

    We have synthesized large-scale bilayer graphene by chemical vapor deposition (CVD) at atmospheric pressure. The bilayer graphene was grown using CH4, H2 and Ar gases at a growth temperature of 1050 °C. Conventional FET measurements show ambipolar transfer characteristics. Raman spectroscopy, atomic force microscopy (AFM) and transmission electron microscopy (TEM) indicate that the film is bilayer graphene. In particular, the adlayer structures that interrupt uniformity were reduced under low methane flow conditions. Furthermore, the large-size CVD bilayer graphene film can be applied to sensor devices. Using a conventional photolithography process, we have fabricated device array structures and studied their sensing behavior.

  6. Monochromatic waves induced by large-scale parametric forcing.

    Science.gov (United States)

    Nepomnyashchy, A; Abarzhi, S I

    2010-03-01

    We study the formation and stability of monochromatic waves induced by large-scale modulations in the framework of the complex Ginzburg-Landau equation with parametric nonresonant forcing dependent on the spatial coordinate. In the limiting case of forcing with very large characteristic length scale, analytical solutions for the equation are found and conditions of their existence are outlined. Stability analysis indicates that the interval of existence of a monochromatic wave can contain a subinterval where the wave is stable. We discuss potential applications of the model in rheology, fluid dynamics, and optics. PMID:20365907
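
    A schematic form of such a model (hedged: the exact coefficients and forcing used by the authors may differ) is the complex Ginzburg-Landau equation with a spatially dependent parametric forcing term coupling to the conjugate amplitude:

        \partial_t A = \mu A + (1 + i\alpha)\,\partial_x^2 A - (1 + i\beta)\,|A|^2 A + \gamma(x)\,\bar{A},

    where gamma(x) varies on a length scale much larger than the wavelength of the selected monochromatic wave, which is the limit in which the analytical solutions are obtained.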

  7. Large-Scale Purification of Peroxisomes for Preparative Applications.

    Science.gov (United States)

    Cramer, Jana; Effelsberg, Daniel; Girzalsky, Wolfgang; Erdmann, Ralf

    2015-09-01

    This protocol is designed for large-scale isolation of highly purified peroxisomes from Saccharomyces cerevisiae using two consecutive density gradient centrifugations. Instructions are provided for harvesting up to 60 g of oleic acid-induced yeast cells for the preparation of spheroplasts and generation of organellar pellets (OPs) enriched in peroxisomes and mitochondria. The OPs are loaded onto eight continuous 36%-68% (w/v) sucrose gradients. After centrifugation, the peak peroxisomal fractions are determined by measurement of catalase activity. These fractions are subsequently pooled and subjected to a second density gradient centrifugation using 20%-40% (w/v) Nycodenz. PMID:26330621

  8. On decentralized control of large-scale systems

    Science.gov (United States)

    Siljak, D. D.

    1978-01-01

    A scheme is presented for decentralized control of large-scale linear systems which are composed of a number of interconnected subsystems. By ignoring the interconnections, local feedback controls are chosen to optimize each decoupled subsystem. Conditions are provided to establish compatibility of the individual local controllers and achieve stability of the overall system. Besides computational simplifications, the scheme is attractive because of its structural features and the fact that it produces a robust decentralized regulator for large dynamic systems, which can tolerate a wide range of nonlinearities and perturbations among the subsystems.
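
    A minimal sketch of the decoupled design step (illustrative; the subsystem matrices are invented and the paper's compatibility and overall-stability conditions are not checked here) solves one LQR problem per subsystem while ignoring the interconnections:

        import numpy as np
        from scipy.linalg import solve_continuous_are

        def local_lqr(A, B, Q, R):
            """LQR gain for one subsystem with interconnections ignored."""
            P = solve_continuous_are(A, B, Q, R)
            return np.linalg.solve(R, B.T @ P)    # K = R^{-1} B^T P

        # two toy subsystems; the full plant adds weak coupling terms
        subsystems = [
            (np.array([[0.0, 1.0], [2.0, -1.0]]), np.array([[0.0], [1.0]])),
            (np.array([[-1.0, 3.0], [0.0, 0.5]]), np.array([[1.0], [1.0]])),
        ]
        for A, B in subsystems:
            K = local_lqr(A, B, np.eye(2), np.eye(1))
            print(K)   # local feedback u_i = -K_i x_i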

  9. Large scale obscuration and related climate effects open literature bibliography

    International Nuclear Information System (INIS)

    Large scale obscuration and related climate effects of nuclear detonations first became a matter of concern in connection with the so-called "Nuclear Winter Controversy" in the early 1980s. Since then, the world has changed. Nevertheless, concern remains about the atmospheric effects of nuclear detonations, but the source of concern has shifted. Now it focuses less on global, and more on regional effects and their resulting impacts on the performance of electro-optical and other defense-related systems. This bibliography reflects the modified interest.

  10. Reliability Evaluation considering Structures of a Large Scale Wind Farm

    DEFF Research Database (Denmark)

    Shin, Je-Seok; Cha, Seung-Tae; Wu, Qiuwei;

    2012-01-01

    Wind energy is one of the most widely used renewable energy resources. Wind power has been connected to the grid in the form of large-scale wind farms made up of dozens of wind turbines, and the scale of wind farms has increased further recently. Due to the intermittent and variable nature of the wind source, reliability evaluation is important for a wind farm, which should enhance its capability of delivering power rather than attempt to control the uncontrollable output of wind power. This paper therefore introduces a method to evaluate reliability depending on the structure of the wind farm and to reflect the result in the planning stage of the wind farm.

  11. Petascale computations for Large-scale Atomic and Molecular collisions

    CERN Document Server

    McLaughlin, Brendan M

    2014-01-01

    Petaflop architectures are currently being utilized efficiently to perform large scale computations in Atomic, Molecular and Optical Collisions. We solve the Schroedinger or Dirac equation for the appropriate collision problem using the R-matrix or R-matrix with pseudo-states approach. We briefly outline the parallel methodology used and implemented for the current suite of Breit-Pauli and DARC codes. Various examples are shown of our theoretical results compared with those obtained from Synchrotron Radiation facilities and from Satellite observations. We also indicate future directions and implementation of the R-matrix codes on emerging GPU architectures.

  12. Catalytic synthesis of large-scale GaN nanorods

    International Nuclear Information System (INIS)

    A novel rare earth metal seed was employed as the catalyst for the growth of GaN nanorods. Large-scale GaN nanorods were synthesized successfully through ammoniating Ga2O3/Tb films sputtered on Si(1 1 1) substrates. Scanning electron microscopy, X-ray diffraction, transmission electron microscopy, high-resolution transmission electron microscopy, and X-ray photoelectron spectroscopy were used to characterize the structure, morphology, and composition of the samples. The results demonstrate that the nanorods are high-quality single-crystal GaN with hexagonal wurtzite structure. The growth mechanism of GaN nanorods is also discussed

  13. ROSA-IV large scale test facility (LSTF) system description

    International Nuclear Information System (INIS)

    The ROSA-IV Program's large scale test facility (LSTF) is a test facility for integral simulation of the thermal-hydraulic response of a pressurized water reactor (PWR) during a small-break loss-of-coolant accident (LOCA) or an operational transient. This document provides the necessary background information to interpret the experimental data obtained from the LSTF experiments. The information provided includes the LSTF test objectives and approach, the LSTF design philosophy, the component and geometry description, the instrumentation and data acquisition system description, and the outline of experiments to be performed. (author)

  14. Testing Dark Energy Models through Large Scale Structure

    CERN Document Server

    Avsajanishvili, Olga; Arkhipova, Natalia A; Kahniashvili, Tina

    2015-01-01

    We explore the scalar field quintessence freezing model of dark energy with the inverse Ratra-Peebles potential. We study the cosmic expansion and the large-scale structure growth rate. We use recent measurements of the growth rate and the baryon acoustic oscillation peak positions to constrain the matter density parameter $\Omega_\mathrm{m}$ and the model parameter $\alpha$ that describes the steepness of the scalar field potential. We solve jointly the equations for the background expansion and for the growth rate of matter perturbations. The obtained theoretical results are compared with the observational data. We perform a Bayesian data analysis to derive constraints on the model parameters.
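
    The growth-rate part of the comparison rests on the standard linear perturbation equation (a textbook relation, quoted here for context rather than taken from this record):

        \ddot{\delta} + 2H\dot{\delta} - 4\pi G \bar{\rho}_m \delta = 0,
        \qquad V(\phi) \propto \phi^{-\alpha},

    so with H(z) driven jointly by $\Omega_\mathrm{m}$ and the steepness $\alpha$ of the inverse Ratra-Peebles potential, measurements of the growth rate f = d ln(delta)/d ln(a) and of the BAO peak positions constrain both parameters at once.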

  15. Floodplain management in Africa: Large scale analysis of flood data

    Science.gov (United States)

    Padi, Philip Tetteh; Baldassarre, Giuliano Di; Castellarin, Attilio

    2011-01-01

    To mitigate the continuously increasing flood risk in Africa, sustainable actions are urgently needed. In this context, we describe a comprehensive statistical analysis of flood data for the African continent. The study refers to quality-controlled, large and consistent databases of flood data, i.e. maximum discharge values and time series of annual maximum flows. Probabilistic envelope curves are derived for the African continent by means of a large-scale regional analysis. Moreover, some initial insights into the statistical characteristics of African floods are provided. The results of this study are relevant and can be used to provide indications to support flood management in Africa.

  16. Large-Scale Self-Consistent Nuclear Mass Calculations

    CERN Document Server

    Stoitsov, M V; Dobaczewski, J; Nazarewicz, W

    2006-01-01

    The program of systematic large-scale self-consistent nuclear mass calculations that is based on the nuclear density functional theory represents a rich scientific agenda that is closely aligned with the main research directions in modern nuclear structure and astrophysics, especially the radioactive nuclear beam physics. The quest for the microscopic understanding of the phenomenon of nuclear binding represents, in fact, a number of fundamental and crucial questions of the quantum many-body problem, including the proper treatment of correlations and dynamics in the presence of symmetry breaking. Recent advances and open problems in the field of nuclear mass calculations are presented and discussed.

  17. Efficient topology estimation for large scale optical mapping

    OpenAIRE

    Elibol, Armagan

    2011-01-01

    Large-scale image mosaicing methods are in great demand among scientists who study different aspects of the seabed, and have been fostered by impressive advances in the capabilities of underwater robots in gathering optical data from the seafloor. Cost and weight constraints mean that low-cost remotely operated vehicles (ROVs) usually have a very limited number of sensors. When a low-cost robot carries out a seafloor survey using a down-looking camera, it usually follows a predetermined trajecto...

  18. Simple Method for Large-Scale Fabrication of Plasmonic Structures

    CERN Document Server

    Makarov, Sergey V; Mukhin, Ivan S; Shishkin, Ivan I; Mozharov, Alexey M; Krasnok, Alexander E; Belov, Pavel A

    2015-01-01

    A novel method for single-step, lithography-free, large-scale laser writing of nanoparticle-based plasmonic structures has been developed. By changing the energy of the femtosecond laser pulses and the thickness of the irradiated gold film, it is possible to vary the diameter of the gold nanoparticles, while the distance between them can be varied through the laser scanning parameters. This method has the advantage over most previously demonstrated methods of simplicity and versatility, while the quality of the structures is good enough for many applications. In particular, resonant light absorption/scattering and surface-enhanced Raman scattering have been demonstrated on the fabricated nanostructures.

  19. Cost Overruns in Large-scale Transportation Infrastructure Projects

    DEFF Research Database (Denmark)

    Cantarelli, Chantal C; Flyvbjerg, Bent; Molin, Eric J. E;

    2010-01-01

    Managing large-scale transportation infrastructure projects is difficult due to frequent misinformation about the costs, which results in large cost overruns that often threaten the overall project viability. This paper investigates the explanations for cost overruns that are given in the literature. Overall, four categories of explanations can be distinguished: technical, economic, psychological, and political. Political explanations have been seen to be the most dominant explanations for cost overruns. Agency theory is considered the most interesting for political explanations and an eclectic theory...

  20. Large-scale computing techniques for complex system simulations

    CERN Document Server

    Dubitzky, Werner; Schott, Bernard

    2012-01-01

    Complex systems modeling and simulation approaches are being adopted in a growing number of sectors, including finance, economics, biology, astronomy, and many more. Technologies ranging from distributed computing to specialized hardware are explored and developed to address the computational requirements arising in complex systems simulations. The aim of this book is to present a representative overview of contemporary large-scale computing technologies in the context of complex systems simulations applications. The intention is to identify new research directions in this field and

  1. Solar cycle changes of large-scale solar wind structure

    OpenAIRE

    Manoharan, P. K

    2011-01-01

    In this paper, I present results on the large-scale evolution of the density turbulence of the solar wind in the inner heliosphere during 1985 - 2009. At a given distance from the Sun, the density turbulence is maximal around the maximum phase of the solar cycle and falls to ~70% near the minimum phase. However, in the current minimum of solar activity, the level of turbulence has gradually decreased, starting from the year 2005, to the present level of ~30%. These results suggest that the sour...

  2. Robust morphological measures for large-scale structure

    CERN Document Server

    Buchert, T

    1994-01-01

    A complete family of statistical descriptors for the morphology of large-scale structure based on Minkowski functionals is presented. These robust and significant measures can be used to characterize the local and global morphology of spatial patterns formed by a coverage of point sets which represent galaxy samples. Basic properties of these measures are highlighted and their relation to the 'genus statistics' is discussed. Test models like a Poissonian point process and samples generated from a Voronoi model are put into perspective.
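
    For reference, the four three-dimensional Minkowski functionals of a body K with boundary ∂K are (standard definitions, not specific to this record):

        V_0 = \int_K dV, \qquad
        V_1 \propto \int_{\partial K} dA, \qquad
        V_2 \propto \int_{\partial K} \frac{\kappa_1 + \kappa_2}{2}\, dA, \qquad
        V_3 \propto \int_{\partial K} \kappa_1 \kappa_2 \, dA,

    where kappa_1 and kappa_2 are the principal curvatures; V_3 is proportional to the Euler characteristic, which is why this family contains the 'genus statistics' as a special case.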

  3. A relativistic view on large scale N-body simulations

    International Nuclear Information System (INIS)

    We discuss the relation between the output of Newtonian N-body simulations on scales that approach or exceed the particle horizon and the description of general relativity. At leading order, the Zeldovich approximation is correct on large scales, coinciding with the general relativistic result. At second order in the initial metric potential, the trajectories of particles deviate from the second-order Newtonian result, and hence the validity of second-order Lagrangian perturbation theory initial conditions should be reassessed when used in very large simulations. We also advocate using the expression for the synchronous gauge density as a well-behaved measure of density fluctuations on such scales. (paper)

  4. Enabling Large-Scale Biomedical Analysis in the Cloud

    Directory of Open Access Journals (Sweden)

    Ying-Chih Lin

    2013-01-01

    Recent progress in high-throughput instrumentation has led to an astonishing growth in both the volume and complexity of biomedical data collected from various sources. Such planet-size data bring serious challenges to storage and computing technologies. Cloud computing is an alternative able to address these challenges, because it gives concurrent consideration to enabling storage and high-performance computing on large-scale data. This work briefly introduces data-intensive computing systems and summarizes existing cloud-based resources in bioinformatics. These developments and applications will facilitate biomedical research by making the vast amount of diverse data meaningful and usable.

  5. Double beta decay with large scale Yb-loaded scintillators

    OpenAIRE

    Zuber, K.

    2000-01-01

    The potential of large-scale Yb-loaded liquid scintillators, as proposed for solar neutrino spectroscopy, is investigated with respect to double beta decay. The potential for the double beta minus decay of 176Yb as well as the beta+/EC decay of 168Yb is discussed. Not only would this provide the first experimental half-life limit on 176Yb decay; the limit would be at least comparable to or better than existing ones from other isotopes. Also, for the first time, there is a realistic chance to detect beta+/EC d...

  6. Large Scale Composite Manufacturing for Heavy Lift Launch Vehicles

    Science.gov (United States)

    Stavana, Jacob; Cohen, Leslie J.; Houseal, Keth; Pelham, Larry; Lort, Richard; Zimmerman, Thomas; Sutter, James; Western, Mike; Harper, Robert; Stuart, Michael

    2012-01-01

    Risk reduction for large-scale composite manufacturing is an important goal in producing lightweight components for heavy-lift launch vehicles. NASA and an industry team successfully employed a building-block approach using low-cost Automated Tape Layup (ATL) of autoclave and Out-of-Autoclave (OoA) prepregs. Several large, curved sandwich panels were fabricated at HITCO Carbon Composites. The aluminum honeycomb core sandwich panels are segments of a 1/16th arc from a 10 meter cylindrical barrel. Lessons learned highlight the manufacturing challenges involved in producing lightweight composite structures such as fairings for heavy-lift launch vehicles.

  7. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

    The 336,000 km2 Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of Water Security faced world-wide. It poses globally-important science challenges due to the diversity in its hydro-climate and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of flood and droughts; and assessing water quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including effects of economic growth and environmental change, in a highly fragmented water governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address the science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and

  8. Survey of large-scale isotope applications: nuclear technology field

    Energy Technology Data Exchange (ETDEWEB)

    Dewitt, R.

    1977-01-21

    A preliminary literature survey of potential large-scale isotope applications was made according to topical fields; i.e., nuclear, biological, medical, environmental, agricultural, geological, and industrial. Other than the possible expansion of established large-scale isotope applications such as uranium, boron, lithium, and hydrogen, no new immediate isotope usage appears to be developing. Over the long term, a change in emphasis for isotope applications was identified which appears to be more responsive to societal concerns for health, the environment, and the conservation of materials and energy. For gram-scale applications, a variety of isotopes may be required for use as nonradioactive "activable" tracers. A more detailed survey of the nuclear field identified a potential need for large amounts (tons) of special isotopic materials for advanced reactor components and structures. As this need for special materials and the development of efficient separation methods progress, the utilization of isotopes from nuclear wastes for beneficial uses should also progress.

  9. Large-scale mapping of mutations affecting zebrafish development

    Directory of Open Access Journals (Sweden)

    Neuhauss Stephan C

    2007-01-01

    Background: Large-scale mutagenesis screens in the zebrafish employing the mutagen ENU have isolated several hundred mutant loci that represent putative developmental control genes. In order to realize the potential of such screens, systematic genetic mapping of the mutations is necessary. Here we report on a large-scale effort to map the mutations generated in mutagenesis screening at the Max Planck Institute for Developmental Biology by genome scanning with microsatellite markers. Results: We have selected a set of microsatellite markers and developed methods and scoring criteria suitable for efficient, high-throughput genome scanning. We have used these methods to successfully obtain a rough map position for 319 mutant loci from the Tübingen I mutagenesis screen and subsequent screening of the mutant collection. For 277 of these the corresponding gene is not yet identified. Mapping was successful for 80% of the tested loci. By comparing 21 mutation and gene positions of cloned mutations we have validated the correctness of our linkage group assignments and estimated the standard error of our map positions to be approximately 6 cM. Conclusion: By obtaining rough map positions for over 300 zebrafish loci with developmental phenotypes, we have generated a dataset that will be useful not only for cloning of the affected genes, but also for suggesting allelism of mutations with similar phenotypes that will be identified in future screens. Furthermore, this work validates the usefulness of our methodology for rapid, systematic and inexpensive microsatellite mapping of zebrafish mutations.

  10. State-of-the-art of large scale biogas plants

    International Nuclear Information System (INIS)

    A survey of the technological state of large-scale biogas plants in Europe treating manure is given. 83 plants are in operation at present. Of these, 16 are centralised digestion plants. Transport costs at centralised digestion plants amount to between 25 and 40 percent of the total operational costs. Various transport equipment is used. Most large-scale digesters are CSTRs, but serial, contact, 2-step, and plug-flow digesters are also found. Construction materials are mostly steel and concrete. Mesophilic digestion is most common (56%), thermophilic digestion is used in 17% of the plants, and combined mesophilic and thermophilic digestion is used in 28% of the centralised plants. Mixing of the digester content is performed with gas injection, propellers, and gas-liquid displacement. Heating is carried out using external or internal heat exchangers. Heat recovery is only used in Denmark. Gas purification equipment is commonplace, but not often needed. Several plants use separation of the digested manure, often as part of a post-treatment/purification process or for the production of 'compost'. Screens, sieve belt separators, centrifuges and filter presses are employed. The use of biogas varies considerably. In some cases, combined heat and power stations supply the grid and district heating systems. Other plants use only either the electricity or the heat. (au)

  11. IP over optical multicasting for large-scale video delivery

    Science.gov (United States)

    Jin, Yaohui; Hu, Weisheng; Sun, Weiqiang; Guo, Wei

    2007-11-01

    In the IPTV systems, multicasting will play a crucial role in the delivery of high-quality video services, which can significantly improve bandwidth efficiency. However, the scalability and the signal quality of current IPTV can barely compete with the existing broadcast digital TV systems since it is difficult to implement large-scale multicasting with end-to-end guaranteed quality of service (QoS) in packet-switched IP network. China 3TNet project aimed to build a high performance broadband trial network to support large-scale concurrent streaming media and interactive multimedia services. The innovative idea of 3TNet is that an automatic switched optical networks (ASON) with the capability of dynamic point-to-multipoint (P2MP) connections replaces the conventional IP multicasting network in the transport core, while the edge remains an IP multicasting network. In this paper, we will introduce the network architecture and discuss challenges in such IP over Optical multicasting for video delivery.

  12. Properties of Cosmic Shock Waves in Large Scale Structure Formation

    CERN Document Server

    Miniati, F; Kang, H; Jones, T W; Cen, R; Ostriker, J P; Miniati, Francesco; Ryu, Dongsu; Kang, Hyesung; Cen, Renyue; Ostriker, Jeremiah P.

    2000-01-01

    We have examined the properties of shock waves in simulations of large-scale structure formation for two cosmological scenarios (an SCDM and a LCDM model, both with Omega = 1). Large-scale shocks result from accretion onto sheets, filaments and Galaxy Clusters (GCs) on a scale of circa 5 Mpc/h in both cases. Energetic motions, both residual of past accretion history and due to current asymmetric inflow along filaments, generate additional, common shocks on a scale of about 1 Mpc/h, which penetrate deep inside GCs. Collisions between substructures inside GCs also form merger shocks. Consequently, the topology of the shocks is very complex and highly connected. During cosmic evolution the comoving shock surface density decreases, reflecting the ongoing structure merger process in both scenarios. Accretion shocks have very high Mach numbers (10 - 10^3) when photo-heating of the pre-shock gas is not included. The typical shock speed is of order v_sh(z) = H(z) lambda_NL(z), with lambda_NL(z) the wavelength scale of the nonli...

  13. Alignment of quasar polarizations with large-scale structures

    Science.gov (United States)

    Hutsemékers, D.; Braibant, L.; Pelgrims, V.; Sluse, D.

    2014-12-01

    We have measured the optical linear polarization of quasars belonging to Gpc scale quasar groups at redshift z ~ 1.3. Out of 93 quasars observed, 19 are significantly polarized. We found that quasar polarization vectors are either parallel or perpendicular to the directions of the large-scale structures to which they belong. Statistical tests indicate that the probability that this effect can be attributed to randomly oriented polarization vectors is on the order of 1%. We also found that quasars with polarization perpendicular to the host structure preferentially have large emission line widths while objects with polarization parallel to the host structure preferentially have small emission line widths. Considering that quasar polarization is usually either parallel or perpendicular to the accretion disk axis depending on the inclination with respect to the line of sight, and that broader emission lines originate from quasars seen at higher inclinations, we conclude that quasar spin axes are likely parallel to their host large-scale structures. Based on observations made with ESO Telescopes at the La Silla Paranal Observatory under program ID 092.A-0221.Table 1 is available in electronic form at http://www.aanda.org

  14. Simulating subsurface heterogeneity improves large-scale water resources predictions

    Science.gov (United States)

    Hartmann, A. J.; Gleeson, T.; Wagener, T.; Wada, Y.

    2014-12-01

    Heterogeneity is abundant everywhere across the hydrosphere. It exists in the soil, the vadose zone and the groundwater. In large-scale hydrological models, subsurface heterogeneity is usually not considered. Instead average or representative values are chosen for each of the simulated grid cells, not incorporating any sub-grid variability. This may lead to unreliable predictions when the models are used for assessing future water resources availability, floods or droughts, or when they are used for recommendations for more sustainable water management. In this study we use a novel, large-scale model that takes into account sub-grid heterogeneity for the simulation of groundwater recharge by using statistical distribution functions. We choose all regions over Europe that are comprised by carbonate rock (~35% of the total area) because the well understood dissolvability of carbonate rocks (karstification) allows for assessing the strength of subsurface heterogeneity. Applying the model with historic data and future climate projections we show that subsurface heterogeneity lowers the vulnerability of groundwater recharge on hydro-climatic extremes and future changes of climate. Comparing our simulations with the PCR-GLOBWB model we can quantify the deviations of simulations for different sub-regions in Europe.

  15. What determines large scale clustering: halo mass or environment?

    CERN Document Server

    Pujol, Arnau; Jiménez, Noelia; Gaztañaga, Enrique

    2015-01-01

    We study the large scale halo bias b as a function of the environment (defined here as the background dark matter density fluctuation, d) and show that environment, and not halo mass m, is the main cause of large scale clustering. More massive haloes have a higher clustering because they live in denser regions, while low mass haloes can be found in a wide range of environments, and hence they have a lower clustering. Using a Halo Occupation Distribution (HOD) test, we can predict b(m) from b(d), but we cannot predict b(d) from b(m), which shows that environment is more fundamental for bias than mass. This has implications for the HOD model interpretation of the galaxy clustering, since when a galaxy selection is affected by environment, the standard HOD implementation fails. We show that the effects of environment are very important for colour selected samples in semi-analytic models of galaxy formation. In these cases, bias can be better recovered if we use environmental density instead of mass as the HOD va...

  16. Large-scale network-level processes during entrainment.

    Science.gov (United States)

    Lithari, Chrysa; Sánchez-García, Carolina; Ruhnau, Philipp; Weisz, Nathan

    2016-03-15

    Visual rhythmic stimulation evokes a robust power increase exactly at the stimulation frequency, the so-called steady-state response (SSR). Localization of visual SSRs normally shows a very focal modulation of power in visual cortex and led to the treatment and interpretation of SSRs as a local phenomenon. Given the brain network dynamics, we hypothesized that SSRs have additional large-scale effects on the brain functional network that can be revealed by means of graph theory. We used rhythmic visual stimulation at a range of frequencies (4-30 Hz), recorded MEG and investigated source level connectivity across the whole brain. Using graph theoretical measures we observed a frequency-unspecific reduction of global density in the alpha band "disconnecting" visual cortex from the rest of the network. Also, a frequency-specific increase of connectivity between occipital cortex and precuneus was found at the stimulation frequency that exhibited the highest resonance (30 Hz). In conclusion, we showed that SSRs dynamically re-organized the brain functional network. These large-scale effects should be taken into account not only when attempting to explain the nature of SSRs, but also when used in various experimental designs. PMID:26835557

  17. Large-Scale Mass Distribution in the Illustris-Simulation

    CERN Document Server

    Haider, Markus; Vogelsberger, Mark; Genel, Shy; Springel, Volker; Torrey, Paul; Hernquist, Lars

    2015-01-01

    Observations at low redshifts thus far fail to account for all of the baryons expected in the Universe according to cosmological constraints. A large fraction of the baryons presumably resides in a thin and warm-hot medium between the galaxies, where they are difficult to observe due to their low densities and high temperatures. Cosmological simulations of structure formation can be used to verify this picture and provide quantitative predictions for the distribution of mass in different large-scale structure components. Here we study the distribution of baryons and dark matter at different epochs using data from the Illustris Simulation. We identify regions of different dark matter density with the primary constituents of large-scale structure, allowing us to measure mass and volume of haloes, filaments and voids. At redshift zero, we find that 49 % of the dark matter and 23 % of the baryons are within haloes. The filaments of the cosmic web host a further 45 % of the dark matter and 46 % of the baryons. The...

  18. A Novel Approach Towards Large Scale Cross-Media Retrieval

    Institute of Scientific and Technical Information of China (English)

    Bo Lu; Guo-Ren Wang; Ye Yuan

    2012-01-01

    With the rapid development of Internet and multimedia technology, cross-media retrieval is concerned with retrieving all related media objects of multiple modalities in response to a query media object. Unfortunately, the complexity and heterogeneity of multi-modality data pose the following two major challenges for cross-media retrieval: 1) how to construct a unified and compact model for media objects with multiple modalities, and 2) how to improve the performance of retrieval for large-scale cross-media databases. In this paper, we propose a novel method dedicated to solving these issues to achieve effective and accurate cross-media retrieval. Firstly, a multi-modality semantic relationship graph (MSRG) is constructed using the semantic correlation amongst the media objects with multi-modality. Secondly, all the media objects in the MSRG are mapped onto an isomorphic semantic space. Further, an efficient index, the MK-tree, based on heterogeneous data distribution is proposed to manage the media objects within the semantic space and improve the performance of cross-media retrieval. Extensive experiments on real large-scale cross-media datasets indicate that our proposal dramatically improves the accuracy and efficiency of cross-media retrieval, significantly outperforming existing methods.

  19. Very large-scale motions in a turbulent pipe flow

    Science.gov (United States)

    Lee, Jae Hwa; Jang, Seong Jae; Sung, Hyung Jin

    2011-11-01

    Direct numerical simulation of a turbulent pipe flow with ReD=35000 was performed to investigate the spatially coherent structures associated with very large-scale motions. The corresponding friction Reynolds number, based on pipe radius R, is R+=934, and the computational domain length is 30 R. The computed mean flow statistics agree well with previous DNS data at ReD=44000 and 24000. Inspection of the instantaneous fields and two-point correlation of the streamwise velocity fluctuations showed that the very long meandering motions exceeding 25R exist in logarithmic and wake regions, and the streamwise length scale is almost linearly increased up to y/R ~0.3, while the structures in the turbulent boundary layer only reach up to the edge of the log-layer. Time-resolved instantaneous fields revealed that the hairpin packet-like structures grow with continuous stretching along the streamwise direction and create the very large-scale structures with meandering in the spanwise direction, consistent with the previous conceptual model of Kim & Adrian (1999). This work was supported by the Creative Research Initiatives of NRF/MEST of Korea (No. 2011-0000423).

  20. Large-scale direct shear testing of geocell reinforced soil

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    The shear properties of geocell-reinforced soils were investigated using large-scale direct shear equipment with shear box dimensions of 500 mm×500 mm×400 mm (length×width×height). Three types of specimens, silty gravel soil, geocell-reinforced silty gravel soil and geocell-reinforced, cement-stabilized silty gravel soil, were used to investigate the shear stress-displacement behavior, the shear strength and the strengthening mechanism of geocell-reinforced soils. Large-scale shear tests were also compared with triaxial compression tests on the same soil to evaluate the influence of the testing method on the measured shear strength. The test results show that the unreinforced soil and the geocell-reinforced soil exhibit similar nonlinear shear stress-displacement behavior. The geocell-reinforced, cement-stabilized soil behaves quasi-elastically for normal stresses up to 1.0 MPa. Geocell reinforcement increases cohesion by 244%, and geocell reinforcement combined with cement stabilization increases cohesion tenfold compared with the unreinforced soil, while the friction angle does not change markedly. Geocell reinforcement thus contributes a large amount of cohesion to the shear strength of soils.
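
    The cohesion and friction-angle figures quoted above come from fitting the Mohr-Coulomb failure envelope tau = c + sigma*tan(phi) to peak shear stresses measured at several normal stresses. A minimal fit; the stress values below are made up for illustration.

        import numpy as np

        # Illustrative peak shear stresses at four normal stress levels (kPa).
        sigma_n = np.array([100.0, 200.0, 300.0, 400.0])
        tau_peak = np.array([95.0, 160.0, 228.0, 291.0])

        slope, cohesion = np.polyfit(sigma_n, tau_peak, 1)  # tan(phi), c
        phi = np.degrees(np.arctan(slope))
        print(f"c = {cohesion:.1f} kPa, phi = {phi:.1f} deg")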

  1. Large Scale and Performance tests of the ATLAS Online Software

    Institute of Scientific and Technical Information of China (English)

    Alexandrov; H. Wolters; et al.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs in a configuration approaching the final size. Large scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the aspects of the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems has been emulated. This paper presents a brief overview of the online system structure, its components and the large scale integration tests and their results.

  2. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s^2 ≈ 10^-6 c^2 and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)^4. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc^-1.
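
    The role of the speed-of-sound counterterm can be made concrete. A schematic LaTeX form of the renormalized one-loop power spectrum, consistent with the quantities quoted above; normalization conventions for the counterterm vary between papers and the form below is an illustrative sketch, not a quotation from the paper.

        % One-loop EFT power spectrum with the leading speed-of-sound
        % counterterm (schematic; conventions vary between papers):
        \begin{equation}
          P_{\mathrm{EFT}}(k) = P_{11}(k) + P_{\mathrm{1loop}}(k)
            - 2\, c_s^2 \, \frac{k^2}{k_{\mathrm{NL}}^2} \, P_{11}(k),
          \qquad c_s^2 \approx 10^{-6} c^2 .
        \end{equation}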

  3. The large scale magnetic fields of thin accretion disks

    CERN Document Server

    Cao, Xinwu

    2013-01-01

    Large scale magnetic field threading an accretion disk is a key ingredient in the jet formation model. The most attractive scenario for the origin of such a large scale field is the advection of the field by the gas in the accretion disk from the interstellar medium or a companion star. However, it is realized that outward diffusion of the accreted field is fast compared to the inward accretion velocity in a geometrically thin accretion disk if the value of the Prandtl number Pm is around unity. In this work, we revisit this problem, assuming that the angular momentum of the disk is removed predominantly by magnetically driven outflows. The radial velocity of the disk is significantly increased due to the presence of the outflows. Using a simplified model for the vertical disk structure, we find that even moderately weak fields can cause sufficient angular momentum loss via a magnetic wind to balance outward diffusion. There are two equilibrium points, one at low field strengths corresponding to a plasma-bet...

  4. Evaluating Unmanned Aerial Platforms for Cultural Heritage Large Scale Mapping

    Science.gov (United States)

    Georgopoulos, A.; Oikonomou, C.; Adamopoulos, E.; Stathopoulou, E. K.

    2016-06-01

    When it comes to large scale mapping of limited areas, especially for cultural heritage sites, requirements become critical. Optical and non-optical sensors, such as LiDAR units, have been developed to sizes and weights that can be lifted by such platforms. At the same time there is increasing emphasis on solutions that enable users to get access to 3D information faster and cheaper. Considering the multitude of platforms and cameras, and the advancement of algorithms in conjunction with the increase of available computing power, this challenge should be, and indeed is being, further investigated. In this paper a short review of today's UAS technologies is attempted. A discussion follows as to their applicability and advantages, depending on their specifications, which vary immensely. The available on-board cameras are also compared and evaluated for large scale mapping. Furthermore, a thorough analysis, review and experimentation with different software implementations of Structure from Motion and Multiple View Stereo algorithms, able to process such dense and mostly unordered sequences of digital images, is conducted and presented. As a test data set, we use a rich optical and thermal data set from both fixed wing and multi-rotor platforms over an archaeological excavation with adverse height variations, acquired using different cameras. Dense 3D point clouds, digital terrain models and orthophotos have been produced and evaluated for their radiometric as well as metric qualities.

  5. Topographically Engineered Large Scale Nanostructures for Plasmonic Biosensing

    Science.gov (United States)

    Xiao, Bo; Pradhan, Sangram K.; Santiago, Kevin C.; Rutherford, Gugu N.; Pradhan, Aswini K.

    2016-04-01

    We demonstrate that a nanostructured metal thin film can achieve enhanced transmission efficiency and sharp resonances and use a large-scale and high-throughput nanofabrication technique for the plasmonic structures. The fabrication technique combines the features of nanoimprint and soft lithography to topographically construct metal thin films with nanoscale patterns. Metal nanogratings developed using this method show significantly enhanced optical transmission (up to a one-order-of-magnitude enhancement) and sharp resonances with full width at half maximum (FWHM) of ~15 nm in the zero-order transmission using an incoherent white light source. These nanostructures are sensitive to the surrounding environment, and the resonance can shift as the refractive index changes. We derive an analytical method using a spatial Fourier transformation to understand the enhancement phenomenon and the sensing mechanism. The use of real-time monitoring of protein-protein interactions in microfluidic cells integrated with these nanostructures is demonstrated to be effective for biosensing. The perpendicular transmission configuration and large-scale structures provide a feasible platform without sophisticated optical instrumentation to realize label-free surface plasmon resonance (SPR) sensing.

  6. Large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and returned into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs on a configuration which is getting closer to the final size. Large scale and performance tests of the integrated system were performed on this setup with emphasis on investigating the aspects of the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems has been emulated. The author presents a brief overview of the online system structure, its components and the large scale integration tests and their results.

  7. Maestro: an orchestration framework for large-scale WSN simulations.

    Science.gov (United States)

    Riliskis, Laurynas; Osipov, Evgeny

    2014-01-01

    Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation. PMID:24647123

  8. Maestro: An Orchestration Framework for Large-Scale WSN Simulations

    Directory of Open Access Journals (Sweden)

    Laurynas Riliskis

    2014-03-01

    Full Text Available Contemporary wireless sensor networks (WSNs) have evolved into large and complex systems and are one of the main technologies used in cyber-physical systems and the Internet of Things. Extensive research on WSNs has led to the development of diverse solutions at all levels of software architecture, including protocol stacks for communications. This multitude of solutions is due to the limited computational power and restrictions on energy consumption that must be accounted for when designing typical WSN systems. It is therefore challenging to develop, test and validate even small WSN applications, and this process can easily consume significant resources. Simulations are inexpensive tools for testing, verifying and generally experimenting with new technologies in a repeatable fashion. Consequently, as the size of the systems to be tested increases, so does the need for large-scale simulations. This article describes a tool called Maestro for the automation of large-scale simulation and investigates the feasibility of using cloud computing facilities for such a task. Using tools that are built into Maestro, we demonstrate a feasible approach for benchmarking cloud infrastructure in order to identify cloud Virtual Machine (VM) instances that provide an optimal balance of performance and cost for a given simulation.

  9. Halo detection via large-scale Bayesian inference

    Science.gov (United States)

    Merson, Alexander I.; Jasche, Jens; Abdalla, Filipe B.; Lahav, Ofer; Wandelt, Benjamin; Jones, D. Heath; Colless, Matthew

    2016-08-01

    We present a proof-of-concept of a novel and fully Bayesian methodology designed to detect haloes of different masses in cosmological observations subject to noise and systematic uncertainties. Our methodology combines the previously published Bayesian large-scale structure inference algorithm, HAmiltonian Density Estimation and Sampling algorithm (HADES), and a Bayesian chain rule (the Blackwell-Rao estimator), which we use to connect the inferred density field to the properties of dark matter haloes. To demonstrate the capability of our approach, we construct a realistic galaxy mock catalogue emulating the wide-area 6-degree Field Galaxy Survey, which has a median redshift of approximately 0.05. Application of HADES to the catalogue provides us with accurately inferred three-dimensional density fields and corresponding quantification of uncertainties inherent to any cosmological observation. We then use a cosmological simulation to relate the amplitude of the density field to the probability of detecting a halo with mass above a specified threshold. With this information, we can sum over the HADES density field realisations to construct maps of detection probabilities and demonstrate the validity of this approach within our mock scenario. We find that the probability of successful detection of haloes in the mock catalogue increases as a function of the signal to noise of the local galaxy observations. Our proposed methodology can easily be extended to account for more complex scientific questions and is a promising novel tool to analyse the cosmic large-scale structure in observations.
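
    The final step described above, summing over density-field realisations to obtain halo detection probabilities, amounts to a Monte Carlo average. A minimal sketch; the logistic relation standing in for the simulation-calibrated density-to-halo probability, and the Gaussian draws standing in for HADES posterior samples, are assumptions of this toy example.

        import numpy as np

        def p_halo_given_density(delta, delta0=2.0, width=0.5):
            """Toy stand-in for the simulation-calibrated P(halo | delta)."""
            return 1.0 / (1.0 + np.exp(-(delta - delta0) / width))

        rng = np.random.default_rng(42)
        n_samples, shape = 200, (32, 32, 32)
        prob_map = np.zeros(shape)
        for _ in range(n_samples):              # stand-ins for HADES samples
            delta = rng.normal(0.0, 1.5, size=shape)
            prob_map += p_halo_given_density(delta)
        prob_map /= n_samples                   # Blackwell-Rao style average
        print("mean detection probability:", prob_map.mean())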

  10. Ecohydrological modeling for large-scale environmental impact assessment.

    Science.gov (United States)

    Woznicki, Sean A; Nejadhashemi, A Pouyan; Abouali, Mohammad; Herman, Matthew R; Esfahanian, Elaheh; Hamaamin, Yaseen A; Zhang, Zhen

    2016-02-01

    Ecohydrological models are frequently used to assess the biological integrity of unsampled streams. These models vary in complexity and scale, and their utility depends on their final application. Tradeoffs are usually made in model scale, where large-scale models are useful for determining broad impacts of human activities on biological conditions, while regional-scale (e.g. watershed or ecoregion) models provide stakeholders greater detail at the individual stream reach level. Given these tradeoffs, the objective of this study was to develop large-scale stream health models with reach-level accuracy similar to regional-scale models, thereby allowing for impact assessments and improved decision-making capabilities. To accomplish this, four measures of biological integrity (Ephemeroptera, Plecoptera, and Trichoptera taxa (EPT), Family Index of Biotic Integrity (FIBI), Hilsenhoff Biotic Index (HBI), and fish Index of Biotic Integrity (IBI)) were modeled based on four thermal classes (cold, cold-transitional, cool, and warm) of streams that broadly dictate the distribution of aquatic biota in Michigan. The Soil and Water Assessment Tool (SWAT) was used to simulate streamflow and water quality in seven watersheds, and the Hydrologic Index Tool was used to calculate 171 ecologically relevant flow regime variables. Unique variables were selected for each thermal class using a Bayesian variable selection method. The variables were then used in the development of adaptive neuro-fuzzy inference system (ANFIS) models of EPT, FIBI, HBI, and IBI. ANFIS model accuracy improved when accounting for stream thermal class rather than developing a global model. PMID:26595397

  11. Large scale petroleum reservoir simulation and parallel preconditioning algorithms research

    Institute of Scientific and Technical Information of China (English)

    SUN; Jiachang; CAO; Jianwen

    2004-01-01

    Solving large scale linear systems efficiently plays an important role in a petroleum reservoir simulator, and the key part is how to choose an effective parallel preconditioner. Properly choosing a good preconditioner goes beyond the purely algebraic field: an integrated preconditioner should take into account the physical background, the characteristics of the PDE mathematical model, the nonlinear solution method, the linear solution algorithm, domain decomposition and parallel computation. We first discuss some parallel preconditioning techniques, and then construct an integrated preconditioner that is based on large scale distributed parallel processing and oriented towards reservoir simulation. The infrastructure of this preconditioner contains such well-known preconditioning construction techniques as coarse grid correction, constraint residual correction and subspace projection correction. We use a multi-step approach to integrate a total of eight types of preconditioning components into the final preconditioner. Industrial reservoir data at the million-grid-cell scale were tested on domestic high performance computers. Numerical statistics and analyses show that this preconditioner achieves satisfying parallel efficiency and acceleration effects.
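
    To make the role of a preconditioner concrete, the sketch below solves a sparse symmetric positive-definite system with preconditioned conjugate gradients. The simple incomplete-LU factorization is a stand-in for the integrated multi-step preconditioner described above; it illustrates only where such a component plugs into a Krylov solve.

        import numpy as np
        import scipy.sparse as sp
        import scipy.sparse.linalg as spla

        n = 10000
        # 1-D Laplacian: symmetric positive definite test matrix.
        A = sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n), format="csc")
        b = np.ones(n)

        ilu = spla.spilu(A, drop_tol=1e-4)      # incomplete LU factorization
        M = spla.LinearOperator((n, n), matvec=ilu.solve)   # M ~ A^{-1}

        x, info = spla.cg(A, b, M=M)            # preconditioned CG
        print("converged" if info == 0 else f"cg stopped with info={info}")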

  12. Development of large-scale functional brain networks in children.

    Directory of Open Access Journals (Sweden)

    Kaustubh Supekar

    2009-07-01

    Full Text Available The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y) and 22 young-adults (ages 19-22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism.

  13. Development of large-scale functional brain networks in children.

    Science.gov (United States)

    Supekar, Kaustubh; Musen, Mark; Menon, Vinod

    2009-07-01

    The ontogeny of large-scale functional organization of the human brain is not well understood. Here we use network analysis of intrinsic functional connectivity to characterize the organization of brain networks in 23 children (ages 7-9 y) and 22 young-adults (ages 19-22 y). Comparison of network properties, including path-length, clustering-coefficient, hierarchy, and regional connectivity, revealed that although children and young-adults' brains have similar "small-world" organization at the global level, they differ significantly in hierarchical organization and interregional connectivity. We found that subcortical areas were more strongly connected with primary sensory, association, and paralimbic areas in children, whereas young-adults showed stronger cortico-cortical connectivity between paralimbic, limbic, and association areas. Further, combined analysis of functional connectivity with wiring distance measures derived from white-matter fiber tracking revealed that the development of large-scale brain networks is characterized by weakening of short-range functional connectivity and strengthening of long-range functional connectivity. Importantly, our findings show that the dynamic process of over-connectivity followed by pruning, which rewires connectivity at the neuronal level, also operates at the systems level, helping to reconfigure and rebalance subcortical and paralimbic connectivity in the developing brain. Our study demonstrates the usefulness of network analysis of brain connectivity to elucidate key principles underlying functional brain maturation, paving the way for novel studies of disrupted brain connectivity in neurodevelopmental disorders such as autism. PMID:19621066
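
    The network properties compared in these two records (characteristic path length, clustering coefficient) are standard graph measures. A minimal sketch on a thresholded toy connectivity matrix; the random matrix and the threshold value are illustrative assumptions, not the study's data or parameters.

        import numpy as np
        import networkx as nx

        rng = np.random.default_rng(7)
        C = rng.uniform(0, 1, size=(90, 90))    # toy |correlation| matrix
        C = (C + C.T) / 2                       # symmetrize
        np.fill_diagonal(C, 0.0)

        G = nx.from_numpy_array((C > 0.8).astype(int))   # binarize edges
        print("clustering coefficient:", nx.average_clustering(G))
        if nx.is_connected(G):
            print("characteristic path length:",
                  nx.average_shortest_path_length(G))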

  14. Systematic renormalization of the effective theory of Large Scale Structure

    Science.gov (United States)

    Akbar Abolhasani, Ali; Mirbabayi, Mehrdad; Pajer, Enrico

    2016-05-01

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contributions of short distance perturbations to the large scale density contrast δ and momentum density π(k) scale as k^2 and k, respectively. Third, we demonstrate that the common practice of introducing counterterms only in the Euler equation when one is interested in correlators of δ is indeed valid to all orders.

  15. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications on the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  16. Online education in a large scale rehabilitation institution.

    Science.gov (United States)

    Mazzoleni, M Cristina; Rognoni, Carla; Pagani, Marco; Imbriani, Marcello

    2012-01-01

    Large scale, multiple-venue institutions face problems when delivering education to their healthcare staff. The present study is aimed at evaluating the feasibility of relying on e-learning for at least part of the training of the Salvatore Maugeri Foundation healthcare staff. The paper reports the results of the delivery of e-learning courses to the personnel over a span of 7 months, in order to assess the attitude towards online course attendance, the proportion between administered online education and administered traditional education, and the economic sustainability of the online education delivery process. 37% of the total healthcare staff attended online courses, and nurses proved to be the most active group (46%). The ratios between the total number of credits and the total number of courses are 18268/5 for online education and 20354/96 for traditional education. These results point out that e-learning is not at all a niche tool used (or usable) by a limited number of people. Economic sustainability, assessed via savings in personnel work hours, has been demonstrated. When distance learning is appropriate, online education is an effective, sustainable and well accepted means to support and promote healthcare staff education in a large scale institution. PMID:22491113

  17. Detecting differential protein expression in large-scale population proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Soyoung; Qian, Weijun; Camp, David G.; Smith, Richard D.; Tompkins, Ronald G.; Davis, Ronald W.; Xiao, Wenzhong

    2014-06-17

    Mass spectrometry-based high-throughput quantitative proteomics shows great potential in clinical biomarker studies, identifying and quantifying thousands of proteins in biological samples. However, methods are needed to appropriately handle issues and challenges unique to mass spectrometry data in order to detect as many biomarker proteins as possible. One issue is that different mass spectrometry experiments generate quite different total numbers of quantified peptides, which can result in more missing peptide abundances in an experiment with a smaller total number of quantified peptides. Another issue is that the quantification of peptides is sometimes absent, especially for less abundant peptides, and such missing values carry information about the peptide abundance. Here, we propose a Significance Analysis for Large-scale Proteomics Studies (SALPS) that handles missing peptide intensity values caused by the two mechanisms mentioned above. Our model shows robust performance on both simulated data and proteomics data from a large clinical study. Because variation in sample quality and drift in instrument performance are unavoidable in clinical studies performed over the course of several years, we believe that our approach will be useful for analyzing large-scale clinical proteomics data.

  18. Glass badge dosimetry system for large scale personal monitoring

    International Nuclear Information System (INIS)

    A glass badge using a silver-activated phosphate glass dosemeter was specially developed for large scale personal monitoring, together with a dosimetry system comprising an automatic reader and a dose equivalent calculation algorithm to achieve reliable personal monitoring. In large scale personal monitoring, both dosimetric precision and dependable handling of large volumes of personal data are very important. The silver-activated phosphate glass dosemeter has excellent basic characteristics for dosimetry, such as homogeneous and stable sensitivity and negligible fading. The glass badge was designed to measure photons in the 10 keV - 10 MeV range, beta particles in the 300 keV - 3 MeV range, and neutrons in the 0.025 eV - 15 MeV range by means of an included solid state nuclear track detector (SSNTD). The glass badge dosimetry system offers not only these basic characteristics but also many features that maintain good precision in dosimetry and data handling. In this presentation, the features of the glass badge dosimetry system and examples of practical personal monitoring systems are presented. (Author)

  19. Building Participation in Large-scale Conservation: Lessons from Belize and Panama

    Directory of Open Access Journals (Sweden)

    Jesse Guite Hastings

    2015-01-01

    Full Text Available Motivated by biogeography and a desire for alignment with the funding priorities of donors, the twenty-first century has seen big international NGOs shifting towards a large-scale conservation approach. This shift has meant that even before stakeholders at the national and local scale are involved, conservation programmes often have their objectives defined and funding allocated. This paper uses the experiences of Conservation International's Marine Management Area Science (MMAS) programme in Belize and Panama to explore how to build participation at the national and local scale while working within the bounds of the current conservation paradigm. Qualitative data about MMAS was gathered through a multi-sited ethnographic research process, utilising document review, direct observation, and semi-structured interviews with 82 informants in Belize, Panama, and the United States of America. Results indicate that while a large-scale approach to conservation disadvantages early national and local stakeholder participation, this effect can be mediated through focusing engagement efforts, paying attention to context, building horizontal and vertical partnerships, and using deliberative processes that promote learning. While explicit consideration of geopolitics and local complexity alongside biogeography in the planning phase of a large-scale conservation programme is ideal, actions taken by programme managers during implementation can still have a substantial impact on conservation outcomes.

  20. A Decentralized Multivariable Robust Adaptive Voltage and Speed Regulator for Large-Scale Power Systems

    Science.gov (United States)

    Okou, Francis A.; Akhrif, Ouassima; Dessaint, Louis A.; Bouchard, Derrick

    2013-05-01

    This paper introduces a decentralized multivariable robust adaptive voltage and frequency regulator to ensure the stability of large-scale interconnected generators. Interconnection parameters (i.e. load, line and transformer parameters) are assumed to be unknown. The proposed design approach requires the reformulation of conventional power system models into a multivariable model with generator terminal voltages as state variables, and excitation and turbine valve inputs as control signals. This model, while suitable for the application of modern control methods, introduces problems with regard to current design techniques for large-scale systems. Interconnection terms, which are treated as perturbations, do not meet the common matching condition assumption. A new adaptive method for a certain class of large-scale systems is therefore introduced that does not require the matching condition. The proposed controller consists of nonlinear inputs that cancel some nonlinearities of the model. Auxiliary controls with linear and nonlinear components are used to stabilize the system. They compensate for unknown parameters of the model by updating both the nonlinear component gains and the excitation parameters. The adaptation algorithms involve the sigma-modification approach for the auxiliary control gains, and the projection approach for the excitation parameters to prevent estimation drift. The computation of the matrix gain of the controller's linear component requires the solution of an algebraic Riccati equation and helps to solve the perturbation-mismatching problem. A realistic power system is used to assess the proposed controller's performance. The results show that both stability and transient performance are considerably improved following a severe contingency.
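
    The linear-component matrix gain described above is obtained from an algebraic Riccati equation. A minimal LQR-style computation; the system matrices below are toy placeholders, not the paper's power-system model.

        import numpy as np
        from scipy.linalg import solve_continuous_are

        # Toy two-state plant; a real application would use the multivariable
        # reformulation described in the abstract.
        A = np.array([[0.0, 1.0],
                      [-2.0, -3.0]])
        B = np.array([[0.0],
                      [1.0]])
        Q = np.eye(2)              # state weighting
        R = np.array([[1.0]])      # input weighting

        # Solve A'P + PA - P B R^{-1} B' P + Q = 0 for P.
        P = solve_continuous_are(A, B, Q, R)
        K = np.linalg.solve(R, B.T @ P)   # feedback gain, u = -K x
        print(K)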

  1. Improvements in materials and fabrication-large scale production and testing

    Energy Technology Data Exchange (ETDEWEB)

    Peter Kneisel

    1991-08-19

    Several laboratories around the world are presently involved in large scale construction projects utilizing superconducting cavities. The reasons for this multitude of activities are a matured technology, based on a better understanding of the phenomena encountered in superconducting cavities, and the influence of material properties and quality assurance measures. At KEK, a string of 32 five-cell 508 MHz niobium cavities has been assembled for Tristan, and since autumn of 1989 many thousands of hours of operating experience have been accumulated with this system. CEBAF has started the production of its nearly 200 m long (338 cavities) superconducting accelerator section. The successful acceleration of electrons to 45 MeV in the front-end test provided a glimpse into the complexity of such a system. CERN and DESY are also building superconducting cavities on a large scale, and at Darmstadt much experience has been gained with a recirculating linac. At several laboratories, plans exist for future applications of superconducting cavities in B-factories, linear colliders, or free electron lasers. This paper gives an overview of the present construction projects and discusses some of the difficulties associated with the transition from laboratory scale to large scale projects.

  2. Nearly incompressible fluids: hydrodynamics and large scale inhomogeneity.

    Science.gov (United States)

    Hunana, P; Zank, G P; Shaikh, D

    2006-08-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for the flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as "nearly incompressible hydrodynamics," is obtained. The nearly incompressible theory, was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations and we therefore use the term "locally incompressible" to describe the equations. This term should be distinguished from the term "nearly incompressible," which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. Inhomogeneous nearly

  3. The Genetic Etiology of Tourette Syndrome: Large-Scale Collaborative Efforts on the Precipice of Discovery

    Science.gov (United States)

    Georgitsi, Marianthi; Willsey, A. Jeremy; Mathews, Carol A.; State, Matthew; Scharf, Jeremiah M.; Paschou, Peristera

    2016-01-01

    Gilles de la Tourette Syndrome (TS) is a childhood-onset neurodevelopmental disorder that is characterized by multiple motor and phonic tics. It has a complex etiology with multiple genes likely interacting with environmental factors to lead to the onset of symptoms. The genetic basis of the disorder remains elusive. However, multiple resources and large-scale projects are coming together, launching a new era in the field and bringing us on the verge of discovery. The large-scale efforts outlined in this report are complementary and represent a range of different approaches to the study of disorders with complex inheritance. The Tourette Syndrome Association International Consortium for Genetics (TSAICG) has focused on large families, parent-proband trios and cases for large case-control designs such as genomewide association studies (GWAS), copy number variation (CNV) scans, and exome/genome sequencing. TIC Genetics targets rare, large effect size mutations in simplex trios, and multigenerational families. The European Multicentre Tics in Children Study (EMTICS) seeks to elucidate gene-environment interactions including the involvement of infection and immune mechanisms in TS etiology. Finally, TS-EUROTRAIN, a Marie Curie Initial Training Network, aims to act as a platform to unify large-scale projects in the field and to educate the next generation of experts. Importantly, these complementary large-scale efforts are joining forces to uncover the full range of genetic variation and environmental risk factors for TS, holding great promise for identifying definitive TS susceptibility genes and shedding light into the complex pathophysiology of this disorder. PMID:27536211

  4. Advection/diffusion of large scale magnetic field in accretion disks

    Directory of Open Access Journals (Sweden)

    R. V. E. Lovelace

    2009-02-01

    Full Text Available Activity of the nuclei of galaxies and stellar mass systems involving disk accretion to black holes is thought to be due to (1) a small-scale turbulent magnetic field in the disk (due to the magneto-rotational instability or MRI) which gives a large viscosity enhancing accretion, and (2) a large-scale magnetic field which gives rise to matter outflows and/or electromagnetic jets from the disk which also enhance accretion. An important problem with this picture is that the enhanced viscosity is accompanied by an enhanced magnetic diffusivity which acts to prevent the build up of a significant large-scale field. Recent work has pointed out that the disk's surface layers are non-turbulent and thus highly conducting (or non-diffusive) because the MRI is suppressed high in the disk where the magnetic and radiation pressures are larger than the thermal pressure. Here, we calculate the vertical (z) profiles of the stationary accretion flows (with radial and azimuthal components), and the profiles of the large-scale magnetic field, taking into account the turbulent viscosity and diffusivity due to the MRI and the fact that the turbulence vanishes at the surface of the disk. We derive a sixth-order differential equation for the radial flow velocity v_r(z) which depends mainly on the midplane thermal-to-magnetic pressure ratio β>1 and the Prandtl number of the turbulence P = viscosity/diffusivity. Boundary conditions at the disk surface take into account a possible magnetic wind or jet and allow for a surface current in the highly conducting surface layer. The stationary solutions we find indicate that a weak (β>1) large-scale field does not diffuse away as suggested by earlier work.

  5. Penetration of Large Scale Electric Field to Inner Magnetosphere

    Science.gov (United States)

    Chen, S. H.; Fok, M. C. H.; Sibeck, D. G.; Wygant, J. R.; Spence, H. E.; Larsen, B.; Reeves, G. D.; Funsten, H. O.

    2015-12-01

    The direct penetration of the large scale global electric field into the inner magnetosphere is a critical element in controlling how the background thermal plasma populates the radiation belts. These plasma populations provide the source of particles and the free energy needed for the generation and growth of various plasma waves that, at critical points of resonance in time and phase space, can scatter or energize radiation belt particles to regulate the flux level of the relativistic electrons in the system. At high geomagnetic activity levels, the distribution of large scale electric fields serves as an important indicator of how the prevalence of strong wave-particle interactions extends over local times and radial distances. To understand the complex relationship between the global electric fields and thermal plasmas, particularly due to the ionospheric dynamo and magnetospheric convection effects, and their relation to geomagnetic activity, we analyze the electric field and cold plasma measurements from the Van Allen Probes over a period of more than two years and simulate a geomagnetic storm event using the Coupled Inner Magnetosphere-Ionosphere Model (CIMI). Our statistical analysis of the measurements from the Van Allen Probes and CIMI simulations of the March 17, 2013 storm event indicates that: (1) The global dawn-dusk electric field can penetrate the inner magnetosphere inside the inner belt below L~2. (2) Stronger convection occurred in the dusk and midnight sectors than in the noon and dawn sectors. (3) Strong convection at multiple locations exists at all activity levels but is more complex at higher activity levels. (4) At high activity levels, the strongest convection occurs in the midnight sector at larger distances from the Earth and in the dusk sector at closer distances. (5) Two plasma populations with distinct ion temperature isotropies are divided at L-shell ~2, indicating distinct heating mechanisms between the inner and outer radiation belts. (6) CIMI

  6. Large-Scale Graphene Film Deposition for Monolithic Device Fabrication

    Science.gov (United States)

    Al-shurman, Khaled

    Since 1958, the concept of the integrated circuit (IC) has achieved great technological developments and helped in shrinking electronic devices. Nowadays, an IC consists of more than a million compacted transistors. The majority of current ICs use silicon as the semiconductor material. According to Moore's law, the number of transistors built onto a microchip doubles roughly every two years. However, silicon device manufacturing is reaching its physical limits: as circuitry shrinks toward seven nanometers, quantum effects such as tunneling can no longer be controlled. Hence, there is an urgent need for a new platform material to replace Si. Graphene is considered a promising material with enormous potential applications in many electronic and optoelectronic devices due to its superior properties. There are several techniques to produce graphene films. Among these techniques, chemical vapor deposition (CVD) offers a very convenient method to fabricate large-scale graphene films. Though the CVD method is suitable for large area growth of graphene, the film must then be transferred to silicon-based substrates. Furthermore, the graphene films thus achieved are, in fact, not single crystalline. Also, graphene fabrication utilizing Cu and Ni at high growth temperatures contaminates both the substrate that holds the Si CMOS circuitry and the CVD chamber. So, lowering the deposition temperature is another technological milestone for the successful adoption of graphene in integrated circuit fabrication. In this research, direct large-scale graphene film fabrication on silicon-based platforms (i.e. SiO2 and Si3N4) at low temperature was achieved. With a focus on low-temperature graphene growth, hot-filament chemical vapor deposition (HF-CVD) was utilized to synthesize graphene film using a 200 nm thick nickel film. Raman spectroscopy was utilized to examine graphene formation on the bottom side of the Ni film

  7. Climatological context for large-scale coral bleaching

    Science.gov (United States)

    Barton, A. D.; Casey, K. S.

    2005-12-01

    Large-scale coral bleaching was first observed in 1979 and has occurred throughout virtually all of the tropics since that time. Severe bleaching may result in the loss of live coral and in a decline of the integrity of the impacted coral reef ecosystem. Despite the extensive scientific research and increased public awareness of coral bleaching, uncertainties remain about the past and future of large-scale coral bleaching. In order to reduce these uncertainties and place large-scale coral bleaching in the longer-term climatological context, specific criteria and methods for using historical sea surface temperature (SST) data to examine coral bleaching-related thermal conditions are proposed by analyzing three, 132 year SST reconstructions: ERSST, HadISST1, and GISST2.3b. These methodologies are applied to case studies at Discovery Bay, Jamaica (77.27°W, 18.45°N), Sombrero Reef, Florida, USA (81.11°W, 24.63°N), Academy Bay, Galápagos, Ecuador (90.31°W, 0.74°S), Pearl and Hermes Reef, Northwest Hawaiian Islands, USA (175.83°W, 27.83°N), Midway Island, Northwest Hawaiian Islands, USA (177.37°W, 28.25°N), Davies Reef, Australia (147.68°E, 18.83°S), and North Male Atoll, Maldives (73.35°E, 4.70°N). The results of this study show that (1) The historical SST data provide a useful long-term record of thermal conditions in reef ecosystems, giving important insight into the thermal history of coral reefs and (2) While coral bleaching and anomalously warm SSTs have occurred over much of the world in recent decades, case studies in the Caribbean, Northwest Hawaiian Islands, and parts of other regions such as the Great Barrier Reef exhibited SST conditions and cumulative thermal stress prior to 1979 that were comparable to those conditions observed during the strong, frequent coral bleaching events since 1979. This climatological context and knowledge of past environmental conditions in reef ecosystems may foster a better understanding of how coral reefs will
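
    One common way to quantify cumulative thermal stress from a historical SST record is to sum exceedances above the maximum monthly climatological mean, in the spirit of degree-heating metrics; the exact criteria proposed in this study may differ. A sketch on synthetic monthly data standing in for the reconstructed SSTs.

        import numpy as np

        rng = np.random.default_rng(3)
        years, months = 132, 12                 # a 132 yr monthly record
        t = np.arange(years * months)
        sst = 26 + 2 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 0.5, t.size)

        clim = sst.reshape(years, months).mean(axis=0)   # monthly climatology
        mmm = clim.max()                                 # maximum monthly mean
        exceed = np.clip(sst - mmm, 0.0, None)           # positive anomalies
        stress_per_year = exceed.reshape(years, months).sum(axis=1)
        print("most thermally stressful year index:", int(stress_per_year.argmax()))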

  8. Sex Trade Involvement in Sao Paulo, Brazil and Toronto, Canada: Narratives of Social Exclusion and Fragmented Identities

    Science.gov (United States)

    Kidd, Sean A.; Liborio, Renata Maria Coimbra

    2011-01-01

    An extensive international literature has been developed regarding the risk trajectories of sex trade-involved children and youth. This literature has not, however, substantially incorporated the narratives of youths regarding their experiences. In this article, the contemporary literature on child and youth sex trade-involvement is reviewed and…

  9. Collaboration and involvement of persons with lived experience in planning Canada's At Home/Chez Soi project.

    Science.gov (United States)

    Nelson, Geoffrey; Macnaughton, Eric; Curwood, Susan Eckerle; Egalité, Nathalie; Voronka, Jijian; Fleury, Marie-Josée; Kirst, Maritt; Flowers, Linsay; Patterson, Michelle; Dudley, Michael; Piat, Myra; Goering, Paula

    2016-03-01

    Planning the implementation of evidence-based mental health services entails commitment to both rigour and community relevance, which requires navigating the challenges of collaboration between professionals and community members in a planning environment which is neither 'top-down' nor 'bottom-up'. This research focused on collaboration among different stakeholders (e.g. researchers, service-providers, persons with lived experience [PWLE]) at five project sites across Canada in the planning of At Home/Chez Soi, a Housing First initiative for homeless people with mental health problems. The research addressed the question of what strategies worked well or less well in achieving successful collaboration, given the opportunities and challenges within this complex 'hybrid' planning environment. Using qualitative methods, 131 local stakeholders participated in key informant or focus group interviews between October 2009 and February 2010. Site researchers identified themes in the data, using the constant comparative method. Strategies that enhanced collaboration included the development of a common vision, values and purpose around the Housing First approach, developing a sense of belonging and commitment among stakeholders, bridging strategies employed by Site Co-ordinators and multiple strategies to engage PWLE. At the same time, a tight timeline, initial tensions, questions and resistance regarding project and research parameters, and lack of experience in engaging PWLE challenged collaboration. In a hybrid planning environment, clear communication and specific strategies are required that flow from an understanding that the process is neither fully participatory nor expert-driven, but rather a hybrid of both. PMID:25689287

  10. Using Large Scale Test Results for Pedagogical Purposes

    DEFF Research Database (Denmark)

    Dolin, Jens

    2012-01-01

    The use and influence of large scale tests (LST), both national and international, has increased dramatically within the last decade. This process has revealed a tension between the legitimate need for information about the performance of the educational system and teachers to inform policy, and the teachers’ and students’ use of this information for pedagogical purposes in the classroom. We know well how the policy makers interpret and use the outcomes of such tests, but we know less about how teachers make use of LSTs to inform their pedagogical practice. An important question is whether ... educational system and the different theoretical foundations of PISA and most teachers’ pedagogically oriented, formative assessment, thus explaining the teacher resentment towards LSTs. Finally, some principles for linking LSTs to teachers’ pedagogical practice will be presented.

  11. Including investment risk in large-scale power market models

    DEFF Research Database (Denmark)

    Lemming, Jørgen Kjærgaard; Meibom, P.

    2003-01-01

    Long-term energy market models can be used to examine investments in production technologies, however, with market liberalisation it is crucial that such models include investment risks and investor behaviour. This paper analyses how the effect of investment risk on production technology selection can be included in large-scale partial equilibrium models of the power market. The analyses are divided into a part about risk measures appropriate for power market investors and a more technical part about the combination of a risk-adjustment model and a partial-equilibrium model. To illustrate the analyses quantitatively, a framework based on an iterative interaction between the equilibrium model and a separate risk-adjustment module was constructed. To illustrate the features of the proposed modelling approach we examined how uncertainty in demand and variable costs affects the optimal choice of...

  12. On the Hyperbolicity of Large-Scale Networks

    CERN Document Server

    Kennedy, W Sean; Saniee, Iraj

    2013-01-01

    Through detailed analysis of scores of publicly available data sets corresponding to a wide range of large-scale networks, from communication and road networks to various forms of social networks, we explore a little-studied geometric characteristic of real-life networks, namely their hyperbolicity. In smooth geometry, hyperbolicity captures the notion of negative curvature; within the more abstract context of metric spaces, it can be generalized as δ-hyperbolicity. This generalized definition can be applied to graphs, which we explore in this report. We provide strong evidence that communication and social networks exhibit this fundamental property, and through extensive computations we quantify the degree of hyperbolicity of each network in comparison to its diameter. By contrast, and as evidence of the validity of the methodology, applying the same methods to the road networks shows that they are not hyperbolic, which is as expected. Finally, we present practical computational means for detection of hyperb...
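
    δ-hyperbolicity can be computed directly from the four-point condition: for every quadruple of nodes, the two largest of the three pairwise distance sums differ by at most 2δ. A brute-force sketch (O(n^4), so practical only for small graphs; the random toy graph is an illustrative stand-in for the real data sets).

        from itertools import combinations
        import networkx as nx

        def gromov_delta(G):
            """Largest four-point defect over all node quadruples."""
            d = dict(nx.all_pairs_shortest_path_length(G))
            delta = 0.0
            for w, x, y, z in combinations(G.nodes, 4):
                s = sorted([d[w][x] + d[y][z],
                            d[w][y] + d[x][z],
                            d[w][z] + d[x][y]])
                delta = max(delta, (s[2] - s[1]) / 2.0)
            return delta

        G = nx.erdos_renyi_graph(16, 0.3, seed=1)   # small toy graph
        if nx.is_connected(G):
            print("delta =", gromov_delta(G))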

  13. Large Scale 3D Image Reconstruction in Optical Interferometry

    CERN Document Server

    Schutz, Antony; Mary, David; Thiébaut, Eric; Soulez, Ferréol

    2015-01-01

    Astronomical optical interferometers (OI) sample the Fourier transform of the intensity distribution of a source at the observation wavelength. Because of rapid atmospheric perturbations, the phases of the complex Fourier samples (visibilities) cannot be directly exploited, and instead linear relationships between the phases are used (phase closures and differential phases). Consequently, specific image reconstruction methods have been devised in the last few decades. Modern polychromatic OI instruments are now paving the way to multiwavelength imaging. This paper presents the derivation of a spatio-spectral ("3D") image reconstruction algorithm called PAINTER (Polychromatic opticAl INTErferometric Reconstruction software). The algorithm is able to solve large scale problems. It relies on an iterative process, which alternates estimation of polychromatic images and of complex visibilities. The complex visibilities are not only estimated from squared moduli and closure phases, but also from differential phase...

  14. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-01-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images – the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances. PMID:25501877

  15. Large-Scale Quantitative Analysis of Painting Arts

    Science.gov (United States)

    Kim, Daniel; Son, Seung-Woo; Jeong, Hawoong

    2014-12-01

    Scientists have made efforts to understand the beauty of painting art in their own languages. As digital image acquisition of painting arts has made rapid progress, researchers have come to a point where it is possible to perform statistical analysis of a large-scale database of artistic paints to make a bridge between art and science. Using digital image processing techniques, we investigate three quantitative measures of images - the usage of individual colors, the variety of colors, and the roughness of the brightness. We found a difference in color usage between classical paintings and photographs, and a significantly low color variety of the medieval period. Interestingly, moreover, the increment of roughness exponent as painting techniques such as chiaroscuro and sfumato have advanced is consistent with historical circumstances.
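
    Two of the three measures used in these records can be approximated with a few lines of code: color variety as the Shannon entropy of a quantized color histogram, and brightness roughness from the scaling of brightness increments with lag. Both definitions are plausible stand-ins rather than the paper's exact estimators, and the random image is a placeholder for a digitized painting.

        import numpy as np

        def color_entropy(img, bins=8):
            """Shannon entropy of a quantized RGB histogram (color variety)."""
            q = (img // (256 // bins)).reshape(-1, 3)
            codes = q[:, 0] * bins * bins + q[:, 1] * bins + q[:, 2]
            p = np.bincount(codes, minlength=bins ** 3) / codes.size
            p = p[p > 0]
            return -(p * np.log2(p)).sum()

        def roughness_exponent(brightness, lags=(1, 2, 4, 8, 16)):
            """Slope of log(std of brightness increments) vs. log(lag)."""
            sig = [np.std(brightness[:, l:] - brightness[:, :-l]) for l in lags]
            slope, _ = np.polyfit(np.log(lags), np.log(sig), 1)
            return slope

        img = np.random.default_rng(5).integers(0, 256, (128, 128, 3))
        print(color_entropy(img), roughness_exponent(img.mean(axis=2)))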

  16. Spatial solitons in photonic lattices with large-scale defects

    Institute of Scientific and Technical Information of China (English)

    Yang Xiao-Yu; Zheng Jiang-Bo; Dong Liang-Wei

    2011-01-01

    We address the existence, stability and propagation dynamics of solitons supported by large-scale defects surrounded by harmonic photonic lattices imprinted in a defocusing saturable nonlinear medium. Several families of soliton solutions, including flat-topped, dipole-like, and multipole-like solitons, can be supported by the defected lattices with different heights of defects. The width of the existence domain of solitons is determined solely by the saturable parameter. The existence domains of various types of solitons can be shifted by variations of the defect size, lattice depth and soliton order. Solitons in the model are stable in a wide parameter window, provided that the propagation constant exceeds a critical value, which is in sharp contrast to the case where soliton trains are supported by periodic lattices imprinted in a defocusing saturable nonlinear medium. We also find stable solitons in the semi-infinite gap, which rarely occur in defocusing media.

  17. Matrix-free Large Scale Bayesian inference in cosmology

    CERN Document Server

    Jasche, Jens

    2014-01-01

    In this work we propose a new matrix-free implementation of the Wiener sampler, which is traditionally applied to high dimensional analyses when signal covariances are unknown. Specifically, the proposed method addresses the problem of jointly inferring a high dimensional signal and its corresponding covariance matrix from a set of observations. Our method implements a Gibbs sampling adaptation of the previously presented messenger approach, permitting the complex multivariate inference problem to be cast into a sequence of univariate random processes. In this fashion, the traditional requirement of inverting high dimensional matrices is completely eliminated from the inference process, resulting in an efficient algorithm that is trivial to implement. Using cosmic large scale structure data as a showcase, we demonstrate the capabilities of our Gibbs sampling approach by performing a joint analysis of three dimensional density fields and corresponding power-spectra from Gaussian mock catalogues. These tests clear...
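
    A toy sketch of the alternating (Gibbs) structure in a one-parameter diagonal model, assuming a Jeffreys prior for the variance draw; the messenger-field algebra that makes the real method matrix-free is not reproduced here:

        import numpy as np

        rng = np.random.default_rng(0)

        # Toy joint inference of a signal s and its variance S from data
        # d = s + n with known noise variance N: alternate (1) a signal
        # draw given the variance and (2) a variance draw given the signal.
        n_pix, N, S_true = 1000, 1.0, 4.0
        d = rng.normal(0.0, np.sqrt(S_true + N), n_pix)

        S = 1.0
        for _ in range(2000):
            var = 1.0 / (1.0 / S + 1.0 / N)            # Wiener posterior variance
            s = rng.normal(var * d / N, np.sqrt(var))  # (1) signal draw
            S = np.sum(s**2) / rng.chisquare(n_pix)    # (2) variance draw
        print("posterior sample of S:", S)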

  18. Towards large-scale plasma-assisted synthesis of nanowires

    International Nuclear Information System (INIS)

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents large-scale synthesis as a three-dimensional problem connecting the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  19. Large-scale characterization of the murine cardiac proteome.

    Science.gov (United States)

    Cosme, Jake; Emili, Andrew; Gramolini, Anthony O

    2013-01-01

    Cardiomyopathies are diseases of the heart that result in impaired cardiac muscle function. This dysfunction can progress to an inability to supply blood to the body. Cardiovascular diseases play a large role in overall global morbidity. Investigating the protein changes in the heart during disease can uncover pathophysiological mechanisms and potential therapeutic targets. Establishing a global protein expression "footprint" can facilitate more targeted studies of diseases of the heart. In this technical review, we present methods to elucidate the heart's proteome through subfractionation of the cellular compartments to reduce sample complexity and improve detection of lower abundance proteins during multidimensional protein identification technology analysis. Analyzing the cytosolic, microsomal, and mitochondrial subproteomes separately simplifies complex cardiac protein mixtures and aids characterization of the murine cardiac proteome. In combination with bioinformatic analysis and genome correlation, large-scale protein changes can be identified at the cellular compartment level in this animal model. PMID:23606244

  20. Unfolding large-scale online collaborative human dynamics

    CERN Document Server

    Zha, Yilong; Zhou, Changsong

    2015-01-01

    Large-scale interacting human activities underlie all social and economic phenomena, but quantitative understanding of regular patterns and mechanisms is very challenging and still rare. Self-organized online collaborative activities with precise records of event timing provide an unprecedented opportunity. Our empirical analysis of the history of millions of updates in Wikipedia shows a universal double power-law distribution of time intervals between consecutive updates of an article. We then propose a generic model to unfold collaborative human activities into three modules: (i) individual behavior characterized by Poissonian initiation of an action, (ii) human interaction captured by a cascading response to others with a power-law waiting time, and (iii) population growth due to an increasing number of interacting individuals. This unfolding allows us to obtain an analytical formula that is fully supported by the universal patterns in empirical data. Our modeling approaches reveal "simplicity" beyond complex interac...
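
    A toy simulation of modules (i) and (ii), with Poissonian initiations and power-law-delayed responses (parameters are illustrative; module (iii), population growth, is omitted), whose inter-event intervals develop a heavy tail:

        import numpy as np

        rng = np.random.default_rng(1)

        # Module (i): Poissonian initiations.  Module (ii): each event
        # may trigger a response after a power-law waiting time.
        rate, horizon, p_respond = 0.05, 1e5, 0.8
        events = list(np.cumsum(rng.exponential(1.0 / rate, 5000)))
        queue = list(events)
        while queue:
            t = queue.pop()
            if t < horizon and rng.random() < p_respond:
                wait = 1.0 + rng.pareto(1.5)   # power-law waiting time
                events.append(t + wait)
                queue.append(t + wait)
        intervals = np.diff(np.sort([t for t in events if t < horizon]))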

  1. Planck intermediate results. XLII. Large-scale Galactic magnetic fields

    CERN Document Server

    Adam, R; Alves, M I R; Ashdown, M; Aumont, J; Baccigalupi, C; Banday, A J; Barreiro, R B; Bartolo, N; Battaner, E; Benabed, K; Benoit-Lévy, A; Bernard, J -P; Bersanelli, M; Bielewicz, P; Bonavera, L; Bond, J R; Borrill, J; Bouchet, F R; Boulanger, F; Bucher, M; Burigana, C; Butler, R C; Calabrese, E; Cardoso, J -F; Catalano, A; Chiang, H C; Christensen, P R; Colombo, L P L; Combet, C; Couchot, F; Crill, B P; Curto, A; Cuttaia, F; Danese, L; Davis, R J; de Bernardis, P; de Rosa, A; de Zotti, G; Delabrouille, J; Dickinson, C; Diego, J M; Dolag, K; Doré, O; Ducout, A; Dupac, X; Elsner, F; Enßlin, T A; Eriksen, H K; Ferrière, K; Finelli, F; Forni, O; Frailis, M; Fraisse, A A; Franceschi, E; Galeotta, S; Ganga, K; Ghosh, T; Giard, M; Gjerløw, E; González-Nuevo, J; Górski, K M; Gregorio, A; Gruppuso, A; Gudmundsson, J E; Hansen, F K; Harrison, D L; Hernández-Monteagudo, C; Herranz, D; Hildebrandt, S R; Hobson, M; Hornstrup, A; Hurier, G; Jaffe, A H; Jaffe, T R; Jones, W C; Juvela, M; Keihänen, E; Keskitalo, R; Kisner, T S; Knoche, J; Kunz, M; Kurki-Suonio, H; Lamarre, J -M; Lasenby, A; Lattanzi, M; Lawrence, C R; Leahy, J P; Leonardi, R; Levrier, F; Lilje, P B; Linden-Vørnle, M; López-Caniego, M; Lubin, P M; Macías-Pérez, J F; Maggio, G; Maino, D; Mandolesi, N; Mangilli, A; Maris, M; Martin, P G; Masi, S; Melchiorri, A; Mennella, A; Migliaccio, M; Miville-Deschênes, M -A; Moneti, A; Montier, L; Morgante, G; Munshi, D; Murphy, J A; Naselsky, P; Nati, F; Natoli, P; Nørgaard-Nielsen, H U; Oppermann, N; Orlando, E; Pagano, L; Pajot, F; Paladini, R; Paoletti, D; Pasian, F; Perotto, L; Pettorino, V; Piacentini, F; Piat, M; Pierpaoli, E; Plaszczynski, S; Pointecouteau, E; Polenta, G; Ponthieu, N; Pratt, G W; Prunet, S; Puget, J -L; Rachen, J P; Reinecke, M; Remazeilles, M; Renault, C; Renzi, A; Ristorcelli, I; Rocha, G; Rossetti, M; Roudier, G; Rubiño-Martín, J A; Rusholme, B; Sandri, M; Santos, D; Savelainen, M; Scott, D; Spencer, L D; Stolyarov, V; Stompor, R; Strong, A W; Sudiwala, R; Sunyaev, R; Suur-Uski, A -S; Sygnet, J -F; Tauber, J A; Terenzi, L; Toffolatti, L; Tomasi, M; Tristram, M; Tucci, M; Valenziano, L; Valiviita, J; Van Tent, B; Vielva, P; Villa, F; Wade, L A; Wandelt, B D; Wehus, I K; Yvon, D; Zacchei, A; Zonca, A

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature were largely constrained by synchrotron emission and Faraday rotation measures. We select three different but representative models and compare their predicted polarized synchrotron and dust emission with that measured by the Planck satellite. We first update these models to match the Planck synchrotron products using a common model for the cosmic-ray leptons. We discuss the impact on this analysis of the ongoing problems of component separation in the Planck microwave bands and of the uncertain cosmic-ray spectrum. In particular, the inferred degree of ordering in the magnetic fields is sensitive to these systematic uncertainties. We then compare the resulting simulated emission to the observed dust emission and find that the dust predictions do not match the morphology in the Planck data, particularly the vertical profile in latitude. We show how the dust data can then be used to further improve these magnetic field models, particu...

  2. An optimal design methodology for large-scale gas liquefaction

    International Nuclear Information System (INIS)

    Highlights: ► Configuration selection and parametric optimization carried out simultaneously for gas liquefaction systems. ► Effective Heat Transfer Factor proposed to indicate the performance of heat exchanger networks. ► Relatively high exergy efficiency of liquefaction process achievable under some general assumptions. -- Abstract: This paper presents an optimization methodology for the thermodynamic design of large scale gas liquefaction systems. Such a methodology enables configuration selection and parametric optimization to be implemented simultaneously. Exergy efficiency and a genetic algorithm have been chosen as the evaluation index and the optimization method, respectively. The methodology has been applied to the design of expander cycle based liquefaction processes. Liquefaction processes of hydrogen, methane and nitrogen are selected as case studies, and the simulation results show that relatively high exergy efficiencies (52% for hydrogen and 58% for methane and nitrogen) are achievable under some general assumptions.
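
    A compact sketch of the kind of search loop such a methodology implies, with a genome mixing a discrete configuration choice and continuous parameters; the fitness function below is a stand-in toy, not an exergy model of a liquefaction cycle:

        import numpy as np

        rng = np.random.default_rng(2)

        def fitness(genome):
            # genome[0] selects one of three configurations; the rest
            # are continuous design parameters (toy objective only).
            config, params = int(genome[0]) % 3, genome[1:]
            return -(config * 0.01) - np.sum((params - 0.5) ** 2)

        pop = rng.random((40, 4)) * [3, 1, 1, 1]
        for _ in range(100):
            scores = np.array([fitness(g) for g in pop])
            parents = pop[np.argsort(scores)[-20:]]             # selection
            children = parents[rng.integers(0, 20, 20)].copy()
            children[:, 1:] += rng.normal(0, 0.05, (20, 3))     # mutation
            pop = np.vstack([parents, children])
        best = pop[np.argmax([fitness(g) for g in pop])]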

  3. Robust Failure Detection Architecture for Large Scale Distributed Systems

    CERN Document Server

    Dobre, Ciprian Mihai; Costan, Alexandru; Andreica, Mugurel Ionut; Cristea, Valentin

    2009-01-01

    Failure detection is a fundamental building block for ensuring fault tolerance in large scale distributed systems. There are many approaches to and implementations of failure detectors. Providing flexible failure detection in off-the-shelf distributed systems is difficult. In this paper we present an innovative solution to this problem. Our approach is based on adaptive, decentralized failure detectors, capable of working asynchronously and independently of the application flow. The proposed solution considers an architecture for the failure detectors based on clustering, the use of a gossip-based algorithm for detection at the local level, and the use of a hierarchical structure among clusters of detectors along which traffic is channeled. The solution can scale to a large number of nodes, considers the QoS requirements of both applications and resources, and includes fault tolerance and system orchestration mechanisms, added in order to assess the reliability and availability of distributed systems.
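
    A minimal sketch of the local-level detection idea, assuming plain heartbeat timestamps and a fixed timeout; the paper's detectors are adaptive and gossip-based, which this does not reproduce:

        import time

        class FailureDetector:
            """A peer is suspected once its last heartbeat is older
            than the timeout (fixed here; adaptive in the paper)."""

            def __init__(self, timeout=5.0):
                self.timeout = timeout
                self.last_seen = {}

            def heartbeat(self, node):
                self.last_seen[node] = time.monotonic()

            def suspects(self):
                now = time.monotonic()
                return [n for n, t in self.last_seen.items()
                        if now - t > self.timeout]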

  4. Applications of large-scale density functional theory in biology.

    Science.gov (United States)

    Cole, Daniel J; Hine, Nicholas D M

    2016-10-01

    Density functional theory (DFT) has become a routine tool for the computation of electronic structure in the physics, materials and chemistry fields. Yet the application of traditional DFT to problems in the biological sciences is hindered, to a large extent, by the unfavourable scaling of the computational effort with system size. Here, we review some of the major software and functionality advances that enable insightful electronic structure calculations to be performed on systems comprising many thousands of atoms. We describe some of the early applications of large-scale DFT to the computation of the electronic properties and structure of biomolecules, as well as to paradigmatic problems in enzymology, metalloproteins, photosynthesis and computer-aided drug design. With this review, we hope to demonstrate that first principles modelling of biological structure-function relationships is approaching reality. PMID:27494095

  5. Tidal power plant may develop into large-scale industry

    International Nuclear Information System (INIS)

    Hammerfest was the first city in Norway with hydroelectric power production and the first city in Northern Europe to have electric street lights. Recently, technologists within the city's electricity supply industry have suggested that Hammerfest should pioneer the field of tidal energy. The idea is to create a new Norwegian large-scale industry. The technology is being developed by the company Hammerfest Stroem. A complete plant is planned to be installed in Kvalsundet. It will include turbine, generator, converters, transmission to land and delivery to the network. Once fully developed, in 2004, the plant will be sold. The company expects to install similar plants elsewhere in Norway and abroad. It is calculated that for a tidewater current of 2.5 m/s, the worldwide potential is about 450 TWh

  6. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore;

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous techniques and support input rates that can be orders of magnitude larger. The following three contributions make the grouping algorithms scalable. First, the basic grouping algorithm is expressed as a continuous stream query in a data stream management system to allow for a very large flow of requests...
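
    A rough illustration of the grouping step, assuming a simple grid-and-time-window hash of requests; the paper instead runs the algorithm as a continuous stream query inside a data stream management system:

        from collections import defaultdict

        def group_requests(requests, cell=0.01, window=300):
            """requests: iterable of (lat, lon, t_sec) tuples.
            Requests falling in the same spatial cell and time window
            become a candidate shared trip (toy batch version)."""
            groups = defaultdict(list)
            for lat, lon, t in requests:
                key = (round(lat / cell), round(lon / cell), int(t // window))
                groups[key].append((lat, lon, t))
            return [g for g in groups.values() if len(g) > 1]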

  7. Large-scale Structure in f(T) Gravity

    CERN Document Server

    Li, Baojiu; Barrow, John D

    2011-01-01

    In this work we study the cosmology of the general f(T) gravity theory. We express the modified Einstein equations using covariant quantities, and derive the gauge-invariant perturbation equations in covariant form. We consider a specific choice of f(T), designed to explain the observed late-time accelerating cosmic expansion without including an exotic dark energy component. Our numerical solution shows that the extra degree of freedom of such f(T) gravity models generally decays as one goes to smaller scales, and consequently its effects on scales such as galaxies and galaxy clusters are small. But on large scales, this degree of freedom can produce large deviations from the standard LCDM scenario, leading to severe constraints on f(T) gravity models as an explanation of the cosmic acceleration.

  8. Preliminary design study of a large scale graphite oxidation loop

    International Nuclear Information System (INIS)

    A preliminary design study of a large scale graphite oxidation loop was performed in order to assess feasibility and to estimate capital costs. The nominal design operates at 50 atmospheres helium and 1800 F with a graphite specimen 30 inches long and 10 inches in diameter. It was determined that a simple single walled design was not practical at this time because of a lack of commercially available thick walled high temperature alloys. Two alternative concepts, at reduced operating pressure, were investigated. Both were found to be readily fabricable to operate at 1800 F and capital cost estimates for these are included. A design concept, which is outside the scope of this study, was briefly considered

  9. Large-scale dynamic compaction of natural salt

    International Nuclear Information System (INIS)

    A large-scale dynamic compaction demonstration of natural salt was successfully completed. About 40 m^3 of salt were compacted in three 2-m lifts by dropping a 9,000-kg weight from a height of 15 m in a systematic pattern to achieve the desired compaction energy. To enhance compaction, 1 wt% water was added to the relatively dry mine-run salt. The average compacted mass fractional density was 0.90 of natural intact salt, and in situ nitrogen permeabilities averaged 9×10^-14 m^2. This established the viability of dynamic compaction for placing salt shaft seal components. The demonstration also provided compacted salt parameters needed for shaft seal system design and performance assessments of the Waste Isolation Pilot Plant

  10. Large scale magnetic fields in galaxies at high redshifts

    Science.gov (United States)

    Bernet, M. L.; Miniati, F.; Lilly, S. J.; Kronberg, P. P.; Dessauges-Zavadsky, M.

    2012-09-01

    In a recent study we have used a large sample of extragalactic radio sources to investigate the redshift evolution of the Rotation Measure (RM) of polarized quasars up to z ≈ 3.0. We found that the dispersion in the RM distribution of quasars increases at higher redshifts and hypothesized that MgII intervening systems were responsible for the observed trend. To test this hypothesis, we have recently obtained high-resolution UVES/VLT spectra for 76 quasars in our sample and in the redshift range 0.6 < z < 2.0. We found a clear correlation between the presence of strong MgII systems and large RMs. This implies that normal galaxies at z ≈ 1 already had large-scale magnetic fields comparable to those seen today.

  11. Alignment of quasar polarizations with large-scale structures

    CERN Document Server

    Hutsemékers, Damien; Pelgrims, Vincent; Sluse, Dominique

    2014-01-01

    We have measured the optical linear polarization of quasars belonging to Gpc-scale quasar groups at redshift z ~ 1.3. Out of 93 quasars observed, 19 are significantly polarized. We found that quasar polarization vectors are either parallel or perpendicular to the directions of the large-scale structures to which they belong. Statistical tests indicate that the probability that this effect can be attributed to randomly oriented polarization vectors is of the order of 1%. We also found that quasars with polarization perpendicular to the host structure preferentially have large emission line widths while objects with polarization parallel to the host structure preferentially have small emission line widths. Considering that quasar polarization is usually either parallel or perpendicular to the accretion disk axis depending on the inclination with respect to the line of sight, and that broader emission lines originate from quasars seen at higher inclinations, we conclude that quasar spin axes are likely parallel ...

  12. SIMON: Remote collaboration system based on large scale simulation

    International Nuclear Information System (INIS)

    Development of the SIMON (SImulation MONitoring) system is described. SIMON aims to investigate many physical phenomena of tokamak-type nuclear fusion plasma by simulation, and to exchange information and carry out joint research with scientists around the world using the internet. The characteristics of SIMON are the following: 1) reduction of the simulation load by a trigger-sending method; 2) visualization of simulation results and a hierarchical structure of analysis; 3) a reduced number of licenses by using the command line when software is used; 4) improved support for networked use of simulation data output by use of HTML (Hyper Text Markup Language); 5) avoidance of complex built-in work in the client part; and 6) small-sized and portable software. The visualization method for large scale simulation, the remote collaboration system by HTML, the trigger-sending method, the hierarchical analytical method, introduction into a three-dimensional electromagnetic transportation code and the technologies of the SIMON system are explained. (S.Y.)

  13. Large-scale anisotropy in stably stratified rotating flows

    CERN Document Server

    Marino, R; Rosenberg, D L; Pouquet, A

    2014-01-01

    We present results from direct numerical simulations of the Boussinesq equations in the presence of rotation and/or stratification, both in the vertical direction. The runs are forced isotropically and randomly at small scales and have spatial resolutions of up to $1024^3$ grid points and Reynolds numbers of $\approx 1000$. We first show that solutions with negative energy flux and inverse cascades develop in rotating turbulence, whether or not stratification is present. However, the purely stratified case is characterized instead by an early-time, highly anisotropic transfer to large scales with almost zero net isotropic energy flux. This is consistent with previous studies that observed the development of vertically sheared horizontal winds, although only at substantially later times. However, and unlike previous works, when sufficient scale separation is allowed between the forcing scale and the domain size, the total energy displays a perpendicular (horizontal) spectrum with power law behavior compatible ...

  14. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  15. Chirping for large-scale maritime archaeological survey

    DEFF Research Database (Denmark)

    Grøn, Ole; Boldreel, Lars Ole

    2014-01-01

    ... high-resolution subbottom profilers. This paper presents a strategy for cost-effective, large-scale mapping of previously undetected sediment-embedded sites and wrecks based on subbottom profiling with chirp systems. The mapping strategy described includes (a) definition of line spacing depending on the target; (b) interactive surveying, for example, immediate detailed investigation of potential archaeological anomalies on detection with a denser pattern of subbottom survey lines; (c) onboard interpretation during data acquisition; (d) recognition of nongeological anomalies. Consequently, this strategy differs from those employed in several detailed studies of known wreck sites and from the way in which geologists map the sea floor and the geological column beneath it. The strategy has been developed on the basis of extensive practical experience gained during the use of an off-the-shelf 2D chirp system and, given...

  16. Recovery Act - Large Scale SWNT Purification and Solubilization

    Energy Technology Data Exchange (ETDEWEB)

    Michael Gemano; Dr. Linda B. McGown

    2010-10-07

    The goal of this Phase I project was to establish a quantitative foundation for development of binary G-gels for large-scale, commercial processing of SWNTs and to develop scientific insight into the underlying mechanisms of solubilization, selectivity and alignment. In order to accomplish this, we performed systematic studies to determine the effects of G-gel composition and experimental conditions that will enable us to achieve our goals that include (1) preparation of ultra-high purity SWNTs from low-quality, commercial SWNT starting materials, (2) separation of MWNTs from SWNTs, (3) bulk, non-destructive solubilization of individual SWNTs in aqueous solution at high concentrations (10-100 mg/mL) without sonication or centrifugation, (4) tunable enrichment of subpopulations of the SWNTs based on metallic vs. semiconductor properties, diameter, or chirality and (5) alignment of individual SWNTs.

  18. The large-scale radio structure of R Aquarii

    Science.gov (United States)

    Hollis, J. M.; Michalitsianos, A. G.; Oliversen, R. J.; Yusef-Zadeh, F.; Kafatos, M.

    1987-01-01

    Radio continuum observations of the R Aqr symbiotic star system, using the compact D configuration of the VLA at 6-cm wavelength, reveal a large-scale (about 2-arcmin) structure engulfing the binary, which has long been known to have a similar optical nebula. This optical/radio nebula possesses about 4×10^42 ergs of kinetic energy, which is typical of a recurrent nova outburst. Moreover, a cluster of a dozen additional 6-cm radio sources was observed in proximity to R Aqr; most of these discrete sources lie about 3 arcmin south and/or west of R Aqr and, coupled with previous 20-cm data, spectral index limits suggest a thermal nature for some of these sources. If the thermal members of the cluster are associated with R Aqr, it may indicate a prehistoric eruption of the system's suspected recurrent nova. The nonthermal cluster members may be extragalactic background radio sources.

  19. Large scale production of wood chips for fuel

    International Nuclear Information System (INIS)

    The paper is based on the results of the national Wood Energy Technology Programme in 1999 - 2004 and the practical experiences of forest fuel production organizations in Finland. Traditionally, the major barriers to the large-scale use of forest residues for fuel are high cost of production, unsatisfactory fuel quality and unreliable supply. To overcome the barriers, the supply system must be integrated with the existing timber procurement organizations of the forest industries, procurement logistics must be refined, productivity of work must be improved through machine and system development and through learning, and the receiving and handling of chips at a plant must be adapted to wood fuels of variable quality. When the special requirements are met, wood chips are a viable and environmentally friendly fuel for large heating and CHP plants. (author)

  20. Structural Quality of Service in Large-Scale Networks

    DEFF Research Database (Denmark)

    Pedersen, Jens Myrup

    Digitalization has created the base for co-existence and convergence in communications, leading to an increasing use of multi service networks. This is for example seen in the Fiber To The Home implementations, where a single fiber is used for virtually all means of communication, including TV, telephony and data. To meet the requirements of the different applications, and to handle the increased vulnerability to failures, the ability to design robust networks providing good Quality of Service is crucial. However, most planning of large-scale networks today is ad-hoc based, leading to highly complex networks lacking predictability and global structural properties. The thesis applies the concept of Structural Quality of Service to formulate desirable global properties, and it shows how regular graph structures can be used to obtain such properties.

  1. How smooth is the Universe on large scales?

    CERN Document Server

    Wu, K K S; Rees, Martin J; Wu, Kelvin K. S.; Lahav, Ofer; Rees, Martin J.

    1998-01-01

    New measurements of galaxy clustering and background radiations can provide improved constraints on the isotropy and homogeneity of the Universe on scales larger than 100 $h^{-1}$ Mpc. In particular, the angular distribution of radio sources and the X-Ray Background probe density fluctuations on scales intermediate between those explored by galaxy surveys and Cosmic Microwave Background experiments. On scales larger than 300 $h^{-1}$ Mpc the distribution of both mass and luminous sources satisfies well the `Cosmological Principle' of isotropy and homogeneity. Although the fractal dimension of the galaxy distribution on scales $\lesssim 20\,h^{-1}$ Mpc is $D_2 \approx 1.2-2.2$, the fluctuations in the X-ray Background and in the Cosmic Microwave Background are consistent with $D_2=3$ to within $10^{-4}$ on the very large scales. We also discuss limits on non-Gaussian fluctuations.

  2. Morphological fluctuations of large-scale structure the PSCz survey

    CERN Document Server

    Kerscher, M; Schmalzing, J; Beisbart, C; Buchert, T; Wagner, H

    2001-01-01

    In a follow-up study to a previous analysis of the IRAS 1.2 Jy catalogue, we quantify the morphological fluctuations in the PSCz survey. We use a variety of measures, among them the family of scalar Minkowski functionals. We confirm the existence of significant fluctuations that are discernible in volume-limited samples out to 200 Mpc/h. In contrast to earlier findings, comparisons with cosmological N-body simulations reveal that the observed fluctuations roughly agree with the cosmic variance found in corresponding mock samples. While two-point measures, e.g. the variance of count-in-cells, fluctuate only mildly, the fluctuations in the morphology on large scales indicate the presence of coherent structures that are at least as large as the sample.

  3. Statistics of Caustics in Large-Scale Structure Formation

    CERN Document Server

    Feldbrugge, Job; van de Weygaert, Rien

    2014-01-01

    The cosmic web is a complex spatial pattern of walls, filaments, cluster nodes and underdense void regions. It emerged through gravitational amplification from the Gaussian primordial density field. Here we infer analytical expressions for the spatial statistics of caustics in the evolving large-scale mass distribution. In our analysis, following the quasi-linear Zeldovich formalism and confined to the 1D and 2D situations, we compute number density and correlation properties of caustics in cosmic density fields that evolve from Gaussian primordial conditions. The analysis can be straightforwardly extended to the 3D situation. Moreover, we are currently extending the approach to the non-linear regime of structure formation by including higher order Lagrangian approximations and Lagrangian effective field theory.

  4. Experimental Investigation of Large-Scale Bubbly Plumes

    International Nuclear Information System (INIS)

    Carefully planned and instrumented experiments under well-defined boundary conditions have been carried out on large-scale, isothermal, bubbly plumes. The data obtained is meant to validate newly developed, high-resolution numerical tools for 3D transient, two-phase flow modelling. Several measurement techniques have been utilised to collect data from the experiments: particle image velocimetry, optical probes, electromagnetic probes, and visualisation. Bubble and liquid velocity fields, void-fraction distributions, bubble size and interfacial-area-concentration distributions have all been measured in the plume region, as well as recirculation velocities in the surrounding pool. The results obtained from the different measurement techniques have been compared. In general, the two-phase flow data obtained from the different techniques are found to be consistent, and of high enough quality for validating numerical simulation tools for 3D bubbly flows. (author)

  5. Large-scale quantum networks based on graphs

    Science.gov (United States)

    Epping, Michael; Kampermann, Hermann; Bruß, Dagmar

    2016-05-01

    Society relies and depends increasingly on information exchange and communication. In the quantum world, security and privacy is a built-in feature for information processing. The essential ingredient for exploiting these quantum advantages is the resource of entanglement, which can be shared between two or more parties. The distribution of entanglement over large distances constitutes a key challenge for current research and development. Due to losses of the transmitted quantum particles, which typically scale exponentially with the distance, intermediate quantum repeater stations are needed. Here we show how to generalise the quantum repeater concept to the multipartite case, by describing large-scale quantum networks, i.e. network nodes and their long-distance links, consistently in the language of graphs and graph states. This unifying approach comprises both the distribution of multipartite entanglement across the network, and the protection against errors via encoding. The correspondence to graph states also provides a tool for optimising the architecture of quantum networks.

  6. Large scale cosmic-ray anisotropy with KASCADE

    CERN Document Server

    Antoni, T; Badea, A F; Bekk, K; Bercuci, A; Blümer, H; Bozdog, H; Brancus, I M; Büttner, C; Daumiller, K; Doll, P; Engel, R; Engler, J; Fessler, F; Gils, H J; Glasstetter, R; Haungs, A; Heck, D; Hörandel, J R; Kampert, K H; Klages, H O; Maier, G; Mathes, H J; Mayer, H J; Milke, J; Müller, M; Obenland, R; Oehlschläger, J; Ostapchenko, S; Petcu, M; Rebel, H; Risse, A; Risse, M; Roth, M; Schatz, G; Schieler, H; Scholz, J; Thouw, T; Ulrich, H; Van, J; Buren; Vardanyan, A S; Weindl, A; Wochele, J; Zabierowski, J

    2004-01-01

    The results of an analysis of the large scale anisotropy of cosmic rays in the PeV range are presented. The Rayleigh formalism is applied to the right ascension distribution of extensive air showers measured by the KASCADE experiment. The data set contains about 10^8 extensive air showers in the energy range from 0.7 to 6 PeV. No hints of anisotropy are visible in the right ascension distributions in this energy range. This holds for all showers as well as for subsets containing showers induced by predominantly light and predominantly heavy primary particles. Upper flux limits for Rayleigh amplitudes are determined to be between 10^-3 at 0.7 PeV and 10^-2 at 6 PeV primary energy.
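
    The Rayleigh formalism mentioned above reduces to a few lines; a sketch under the standard first-harmonic definitions (the experiment's actual analysis includes acceptance corrections not shown here):

        import numpy as np

        def rayleigh(alpha):
            """First-harmonic Rayleigh analysis of right ascensions
            alpha (radians): returns the amplitude, the phase and the
            chance probability of a larger amplitude from isotropy."""
            n = len(alpha)
            c, s = np.cos(alpha).sum(), np.sin(alpha).sum()
            amplitude = 2.0 * np.hypot(c, s) / n
            phase = np.arctan2(s, c)
            p_chance = np.exp(-n * amplitude**2 / 4.0)
            return amplitude, phase, p_chance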

  7. Optimal Wind Energy Integration in Large-Scale Electric Grids

    Science.gov (United States)

    Albaijat, Mohammad H.

    The major concern in electric grid operation is operating in the most economical and reliable fashion, to ensure affordability and continuity of electricity supply. This dissertation investigates challenges that affect electric grid reliability and economic operations: 1. congestion of transmission lines, 2. transmission line expansion, 3. large-scale wind energy integration, and 4. optimal placement of Phasor Measurement Units (PMUs) for highest electric grid observability. Performing congestion analysis aids in evaluating the required increase of transmission line capacity in electric grids. However, it is also necessary to evaluate methods of expanding transmission line capacity to ensure optimal electric grid operation. The expansion of transmission line capacity must enable grid operators to provide low-cost electricity while maintaining reliable operation of the electric grid. Because congestion affects the reliability of delivering power and increases its cost, congestion analysis in electric grid networks is an important subject. Consequently, next-generation electric grids require novel methodologies for studying and managing congestion in electric grids. We suggest a novel method of long-term congestion management in large-scale electric grids. Owing to the complexity and size of transmission line systems and the competitive nature of current grid operation, it is important for electric grid operators to determine how much transmission line capacity to add. The traditional questions requiring answers are "where" to add, "how much transmission line capacity" to add, and "at which voltage level". Because of electric grid deregulation, transmission line expansion is more complicated, as it is now open to investors, whose main interest is to generate revenue, to build new transmission lines. Adding new transmission capacity will help the system to relieve transmission congestion, create

  8. Split Bregman method for large scale fused Lasso

    CERN Document Server

    Ye, Gui-Bo

    2010-01-01

    Ordering of regression or classification coefficients occurs in many real-world applications. Fused Lasso exploits this ordering by explicitly regularizing the differences between neighboring coefficients through an $\ell_1$ norm regularizer. However, due to the nonseparability and nonsmoothness of the regularization term, solving the fused Lasso problem is computationally demanding. Existing solvers can only deal with problems of small or medium size, or a special case of the fused Lasso problem in which the predictor matrix is the identity matrix. In this paper, we propose an iterative algorithm based on the split Bregman method to solve a class of large-scale fused Lasso problems, including a generalized fused Lasso and a fused Lasso support vector classifier. We derive our algorithm using the augmented Lagrangian method and prove its convergence properties. The performance of our method is tested on both artificial data and real-world applications including proteomic data from mass spectrometry and genomic data from array...
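
    A sketch of the splitting for the standard fused Lasso objective $\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \lambda_1\|x\|_1 + \lambda_2\|Dx\|_1$, with $D$ the first-difference matrix, in the common Goldstein-Osher form; the paper's generalized variants and its convergence analysis differ in details, and sign conventions vary between presentations. Auxiliary variables $d \approx x$ and $e \approx Dx$ decouple the nonsmooth terms:

        x^{k+1} = \arg\min_x \tfrac{1}{2}\|Ax-b\|_2^2 + \tfrac{\mu_1}{2}\|d^k - x - u^k\|_2^2 + \tfrac{\mu_2}{2}\|e^k - Dx - v^k\|_2^2
        d^{k+1} = \mathrm{shrink}(x^{k+1} + u^k, \lambda_1/\mu_1)
        e^{k+1} = \mathrm{shrink}(Dx^{k+1} + v^k, \lambda_2/\mu_2)
        u^{k+1} = u^k + x^{k+1} - d^{k+1}, \qquad v^{k+1} = v^k + Dx^{k+1} - e^{k+1}

    where shrink(z, g) = sign(z)·max(|z| − g, 0) acts componentwise, so the only nontrivial step is the x-update, a linear solve with the fixed matrix A^T A + mu_1 I + mu_2 D^T D.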

  9. Theoretical expectations for bulk flows in large-scale surveys

    Science.gov (United States)

    Feldman, Hume A.; Watkins, Richard

    1994-01-01

    We calculate the theoretical expectation for the bulk motion of a large-scale survey of the type recently carried out by Lauer and Postman. Included are the effects of survey geometry, errors in the distance measurements, clustering properties of the sample, and different assumed power spectra. We considered the power spectrum calculated from the Infrared Astronomical Satellite (IRAS)-QDOT survey, as well as spectra from hot + cold and standard cold dark matter models. We find that measurement uncertainty, sparse sampling, and clustering can lead to a much larger expectation for the bulk motion of a cluster sample than for the volume as a whole. However, our results suggest that the expected bulk motion is still inconsistent with that reported by Lauer and Postman at the 95%-97% confidence level.

  10. Interloper bias in future large-scale structure surveys

    CERN Document Server

    Pullen, Anthony R; Dore, Olivier; Raccanelli, Alvise

    2015-01-01

    Next-generation spectroscopic surveys will map the large-scale structure of the observable universe, using emission line galaxies as tracers. While each survey will map the sky with a specific emission line, interloping emission lines can masquerade as the survey's intended emission line at different redshifts. Interloping lines from galaxies that are not removed can contaminate the power spectrum measurement, mixing correlations from various redshifts and diluting the true signal. We assess the potential for power spectrum contamination, finding that an interloper fraction worse than 0.2% could bias power spectrum measurements for future surveys by more than 10% of statistical errors, while also biasing inferences based on the power spectrum. We also construct a formalism for predicting biases for cosmological parameter measurements, and we demonstrate that a 0.3% interloper fraction could bias measurements of the growth rate by more than 10% of the error, which can affect constraints from upcoming surveys o...

  11. An Extensible Timing Infrastructure for Adaptive Large-scale Applications

    CERN Document Server

    Stark, Dylan; Goodale, Tom; Radke, Thomas; Schnetter, Erik

    2007-01-01

    Real-time access to accurate and reliable timing information is necessary to profile scientific applications, and crucial as simulations become increasingly complex, adaptive, and large-scale. The Cactus Framework provides flexible and extensible capabilities for timing information through a well designed infrastructure and timing API. Applications built with Cactus automatically gain access to built-in timers, such as gettimeofday and getrusage, system-specific hardware clocks, and high-level interfaces such as PAPI. We describe the Cactus timer interface, its motivation, and its implementation. We then demonstrate how this timing information can be used by an example scientific application to profile itself, and to dynamically adapt itself to a changing environment at run time.

  12. Solar cycle, solar rotation and large-scale circulation

    International Nuclear Information System (INIS)

    The Glossary is designed to be a technical dictionary that will provide solar workers of various specialties, students, other astronomers and theoreticians with concise information on the nature and the properties of phenomena of solar and solar-terrestrial physics. Each term, or group of related terms, is given a concise phenomenological and quantitative description, including the relationship to other phenomena and an interpretation in terms of physical processes. The references are intended to lead the non-specialist reader into the literature. This section deals with: solar (activity) cycle; Hale cycle; long-term activity variations; dynamos; differential rotation; rotation of the convective zone; Carrington rotation; oblateness; meridional flow; and giant cells or large-scale circulation. (B.R.H.)

  13. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research]|[Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results for large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
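
    A toy illustration of the two-stage structure, assuming a made-up problem (build capacity now, buy recourse later) solved as a sampled deterministic equivalent with SciPy; the report's decomposition and importance sampling replace this brute-force form for realistic sizes:

        import numpy as np
        from scipy.optimize import linprog

        rng = np.random.default_rng(3)

        # min  x + E[3 * y_s]  s.t.  x + y_s >= d_s,  x, y_s >= 0,
        # with demand d_s sampled per scenario (toy model).
        S = 200
        d = rng.uniform(50, 150, S)
        c = np.concatenate([[1.0], np.full(S, 3.0 / S)])   # x, then y_s
        A = np.zeros((S, S + 1))
        A[:, 0] = -1.0
        A[np.arange(S), np.arange(1, S + 1)] = -1.0        # -x - y_s <= -d_s
        res = linprog(c, A_ub=A, b_ub=-d, bounds=[(0, None)] * (S + 1))
        print("first-stage capacity:", res.x[0])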

  14. Large-Scale PC Management and Configuration for SNS Diagnostics

    International Nuclear Information System (INIS)

    The Spallation Neutron Source (SNS) project's diagnostics group has begun its implementation of more than 300 PC-based Network Attached Devices (NADs). An implementation of this size creates many challenges, such as distribution of patches and software upgrades; virus/worm potentials; and the configuration management, including interaction with the SNS relational database. As part of the initial solution, a base operating system (OS) configuration has been determined and computer management software has been implemented. Each PC requires a unique configuration, but all are based on a common OS and supporting applications. The diagnostics group has started with an implementation of an XP Embedded (XPe) OS and uses the Altiris® eXpress Deployment Solution™. The use of XPe and Altiris gives the diagnostics group the ability to easily configure, distribute, and manage software on a large scale. This paper describes the initial experience and discusses plans for the future

  15. SOLVING TRUST REGION PROBLEM IN LARGE SCALE OPTIMIZATION

    Institute of Scientific and Technical Information of China (English)

    Bing-sheng He

    2000-01-01

    This paper presents a new method for solving the basic problem in the “model trust region” approach to large scale minimization: compute a vector x such that (1/2)x^T H x + c^T x = min, subject to the constraint ‖x‖_2 ≤ a. The method is a combination of the CG method and a projection and contraction (PC) method. The first (CG) method, with x_0 = 0 as the starting point, either directly offers a solution of the problem or, as soon as the norm of the iterate becomes greater than a, gives a suitable starting point and a favourable choice of a crucial scaling parameter in the second (PC) method. Some numerical examples are given, which indicate that the method is applicable.
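
    A sketch of the first (CG) phase under the stated starting point x_0 = 0, stopping when the iterate leaves the ball of radius a (a Steihaug-style truncation); the projection and contraction phase that the paper then switches to is not shown:

        import numpy as np

        def boundary_step(x, p, a):
            """Positive tau with ||x + tau*p|| = a (x inside the ball)."""
            A, B, C = p @ p, 2.0 * (x @ p), x @ x - a * a
            return (-B + np.sqrt(B * B - 4.0 * A * C)) / (2.0 * A)

        def truncated_cg(H, c, a, tol=1e-8, max_iter=200):
            """CG for min 0.5*x'Hx + c'x s.t. ||x|| <= a, from x0 = 0.
            Returns an interior solution or a boundary point."""
            x = np.zeros_like(c, dtype=float)
            r = -c.astype(float)
            p = r.copy()
            for _ in range(max_iter):
                if np.linalg.norm(r) < tol:
                    return x                     # interior minimizer found
                Hp = H @ p
                curv = p @ Hp
                if curv <= 0.0:                  # negative curvature direction
                    return x + boundary_step(x, p, a) * p
                alpha = (r @ r) / curv
                if np.linalg.norm(x + alpha * p) >= a:
                    return x + boundary_step(x, p, a) * p
                x = x + alpha * p
                r_new = r - alpha * Hp
                p = r_new + ((r_new @ r_new) / (r @ r)) * p
                r = r_new
            return x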

  16. A large-scale evaluation of computational protein function prediction.

    Science.gov (United States)

    Radivojac, Predrag; Clark, Wyatt T; Oron, Tal Ronnen; Schnoes, Alexandra M; Wittkop, Tobias; Sokolov, Artem; Graim, Kiley; Funk, Christopher; Verspoor, Karin; Ben-Hur, Asa; Pandey, Gaurav; Yunes, Jeffrey M; Talwalkar, Ameet S; Repo, Susanna; Souza, Michael L; Piovesan, Damiano; Casadio, Rita; Wang, Zheng; Cheng, Jianlin; Fang, Hai; Gough, Julian; Koskinen, Patrik; Törönen, Petri; Nokso-Koivisto, Jussi; Holm, Liisa; Cozzetto, Domenico; Buchan, Daniel W A; Bryson, Kevin; Jones, David T; Limaye, Bhakti; Inamdar, Harshal; Datta, Avik; Manjari, Sunitha K; Joshi, Rajendra; Chitale, Meghana; Kihara, Daisuke; Lisewski, Andreas M; Erdin, Serkan; Venner, Eric; Lichtarge, Olivier; Rentzsch, Robert; Yang, Haixuan; Romero, Alfonso E; Bhat, Prajwal; Paccanaro, Alberto; Hamp, Tobias; Kaßner, Rebecca; Seemayer, Stefan; Vicedo, Esmeralda; Schaefer, Christian; Achten, Dominik; Auer, Florian; Boehm, Ariane; Braun, Tatjana; Hecht, Maximilian; Heron, Mark; Hönigschmid, Peter; Hopf, Thomas A; Kaufmann, Stefanie; Kiening, Michael; Krompass, Denis; Landerer, Cedric; Mahlich, Yannick; Roos, Manfred; Björne, Jari; Salakoski, Tapio; Wong, Andrew; Shatkay, Hagit; Gatzmann, Fanny; Sommer, Ingolf; Wass, Mark N; Sternberg, Michael J E; Škunca, Nives; Supek, Fran; Bošnjak, Matko; Panov, Panče; Džeroski, Sašo; Šmuc, Tomislav; Kourmpetis, Yiannis A I; van Dijk, Aalt D J; ter Braak, Cajo J F; Zhou, Yuanpeng; Gong, Qingtian; Dong, Xinran; Tian, Weidong; Falda, Marco; Fontana, Paolo; Lavezzo, Enrico; Di Camillo, Barbara; Toppo, Stefano; Lan, Liang; Djuric, Nemanja; Guo, Yuhong; Vucetic, Slobodan; Bairoch, Amos; Linial, Michal; Babbitt, Patricia C; Brenner, Steven E; Orengo, Christine; Rost, Burkhard; Mooney, Sean D; Friedberg, Iddo

    2013-03-01

    Automated annotation of protein function is challenging. As the number of sequenced genomes rapidly grows, the overwhelming majority of protein products can only be annotated computationally. If computational predictions are to be relied upon, it is crucial that the accuracy of these methods be high. Here we report the results from the first large-scale community-based critical assessment of protein function annotation (CAFA) experiment. Fifty-four methods representing the state of the art for protein function prediction were evaluated on a target set of 866 proteins from 11 organisms. Two findings stand out: (i) today's best protein function prediction algorithms substantially outperform widely used first-generation methods, with large gains on all types of targets; and (ii) although the top methods perform well enough to guide experiments, there is considerable need for improvement of currently available tools. PMID:23353650

  17. Mass Efficiencies for Common Large-Scale Precision Space Structures

    Science.gov (United States)

    Williams, R. Brett; Agnes, Gregory S.

    2005-01-01

    This paper presents a mass-based trade study for large-scale deployable triangular trusses, where the longerons can be monocoque tubes, isogrid tubes, or coilable longeron trusses. Such structures are typically used to support heavy reflectors, solar panels, or other instruments, and are subject to thermal gradients that can vary a great deal based on orbital altitude, location in orbit, and self-shadowing. While multilayer insulation (MLI) blankets are commonly used to minimize the magnitude of these thermal disturbances, they subject the truss to a nonstructural mass penalty. This paper investigates the impact of these add-on thermal protection layers on selecting the lightest precision structure for a given loading scenario.

  18. Measuring galaxy environments in large scale photometric surveys

    CERN Document Server

    Etherington, James

    2015-01-01

    The properties of galaxies in the local universe have been shown to depend upon their environment. Future large scale photometric surveys such as DES and Euclid will be vital to gain insight into the evolution of galaxy properties and the role of environment. Large samples come at the cost of redshift precision, and this affects the measurement of environment. We study this by measuring environments using SDSS spectroscopic and photometric redshifts and also simulated photometric redshifts with a range of uncertainties. We consider the Nth nearest neighbour and fixed aperture methods and evaluate the impact of the aperture parameters and the redshift uncertainty. We find that photometric environments have a smaller dynamic range than spectroscopic measurements because uncertain redshifts scatter galaxies from dense environments into less dense environments. At the expected redshift uncertainty of DES, 0.1, there is a Spearman rank correlation coefficient of 0.4 between the measurements using the optimal paramete...
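
    One common concrete form of the Nth nearest neighbour measure, sketched with a k-d tree; the aperture parameters and survey-specific corrections studied in the paper are omitted:

        import numpy as np
        from scipy.spatial import cKDTree

        def nth_neighbour_density(positions, n=5):
            """Projected density from the distance to the Nth nearest
            neighbour, Sigma_N = N / (pi * d_N**2)."""
            tree = cKDTree(positions)
            # k = n + 1 because each galaxy's nearest "neighbour" is itself
            d, _ = tree.query(positions, k=n + 1)
            return n / (np.pi * d[:, -1] ** 2)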

  19. Forced vibration test of the Hualien large scale SSI model

    International Nuclear Information System (INIS)

    A Large-Scale Seismic Test (LSST) Program has been conducted at Hualien, Taiwan (Tang et al., 1991), to obtain earthquake-induced soil-structure interaction (SSI) data in a stiff soil site environment. The Hualien program is a follow-on to the Lotung program, which was conducted at a soft soil site. Forced vibration tests of the Hualien 1/4-scale containment SSI test model were conducted in October 1992 before backfill (without embedment) and in February 1993 after backfill (with embedment) for the purpose of defining the basic dynamic characteristics of the soil-structure system. Excitation in two horizontal directions (NS, EW) is applied on the roof floor and on the basemat. Vertical excitation is applied on the basemat only. This paper describes the results of the forced vibration tests of the model without embedment. (author)

  20. Safeguarding of large scale reprocessing and MOX plants

    International Nuclear Information System (INIS)

    In May 1997, the IAEA Board of Governors approved the final measures of the ''93+2'' safeguards strengthening programme, thus improving the international non-proliferation regime by enhancing the effectiveness and efficiency of safeguards verification. These enhancements are not, however, a revolution in current practices, but rather an important step in the continuous evolution of the safeguards system. The principles embodied in 93+2, for broader access to information and increased physical access, already apply, in a pragmatic way, to large scale reprocessing and MOX fabrication plants. In these plants, qualitative measures and process monitoring play an important role, in addition to accountancy and material balance evaluations, in attaining the safeguards goals. This paper will reflect on the safeguards approaches adopted for these large bulk handling facilities and draw analogies, conclusions and lessons for the forthcoming implementation of the 93+2 Programme. (author)

  1. How Large-Scale Research Facilities Connect to Global Research

    DEFF Research Database (Denmark)

    Lauto, Giancarlo; Valentin, Finn

    2013-01-01

    Policies for large-scale research facilities (LSRFs) often highlight their spillovers to industrial innovation and their contribution to the external connectivity of the regional innovation system hosting them. Arguably, the particular institutional features of LSRFs are conducive for collaborative research. However, based on data on publications produced in 2006–2009 at the Neutron Science Directorate of Oak Ridge National Laboratory in Tennessee (United States), we find that internationalization of its collaborative research is restrained by coordination costs similar to those characterizing other institutional settings. Policies mandating LSRFs should consider that research prioritized on the basis of technological relevance limits the international reach of collaborations. Additionally, the propensity for international collaboration is lower for resident scientists than for those affiliated...

  2. Large-scale oxide nanostructures grown by thermal oxidation

    International Nuclear Information System (INIS)

    Large scale oxide nanostructures of CuO, Fe2O3, Co3O4, ZnO, etc. were prepared by a catalyst-free thermal oxidation process in atmosphere using pure metal as the starting material. Various single crystalline nanostructure arrays, including nanowires, nanobelts, nanoneedles, nanoflakes, and nanowalls, were obtained. These nanostructures can be grown from bulk materials, like foils or sheets, or from micro-sized metal powders and pre-deposited metal films. The growth time, temperature and substrate have important effects on the morphology, size and distribution of the nanostructures. Different from V-S or V-L-S mechanisms, the growth of the nanostructures is found to be based on a metal ion diffusion process. The gradual oxidation process of the metals was clearly demonstrated. The properties of these nanostructures, including gas sensing, magnetism, photoluminescence, and field emission, were extensively investigated

  3. Impact of Parallel Computing on Large Scale Aeroelastic Computations

    Science.gov (United States)

    Guruswamy, Guru P.; Kwak, Dochan (Technical Monitor)

    2000-01-01

    Aeroelasticity is computationally one of the most intensive fields in aerospace engineering. Though over the last three decades the computational speed of supercomputers has substantially increased, it is still inadequate for large scale aeroelastic computations using high fidelity flow and structural equations. In addition to reaching a saturation in computational speed because of changes in economics, computer manufacturers are ceasing production of mainframe-type supercomputers. This has led computational aeroelasticians to face the gigantic task of finding alternate approaches for fulfilling their needs. The alternate path to overcome the speed and availability limitations of mainframe-type supercomputers is to use parallel computers. During this decade several different architectures have evolved. In FY92 the US Government started the High Performance Computing and Communication (HPCC) program. As a participant in this program NASA developed several parallel computational tools for aeroelastic applications. This talk describes the impact of those application tools on high fidelity based multidisciplinary analysis.

  4. Scalable Parallel Distance Field Construction for Large-Scale Applications.

    Science.gov (United States)

    Yu, Hongfeng; Xie, Jinrong; Ma, Kwan-Liu; Kolla, Hemanth; Chen, Jacqueline H

    2015-10-01

    Computing distance fields is fundamental to many scientific and engineering applications. Distance fields can be used to direct analysis and reduce data. In this paper, we present a highly scalable method for computing 3D distance fields on massively parallel distributed-memory machines. A new distributed spatial data structure, named parallel distance tree, is introduced to manage the level sets of data and facilitate surface tracking over time, resulting in significantly reduced computation and communication costs for calculating the distance to the surface of interest from any spatial location. Our method supports several data types and distance metrics from real-world applications. We demonstrate its efficiency and scalability on state-of-the-art supercomputers using both large-scale volume datasets and surface models. We also demonstrate in-situ distance field computation on dynamic turbulent flame surfaces for a petascale combustion simulation. Our work greatly extends the usability of distance fields for demanding applications. PMID:26357251
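
    For orientation, the underlying notion of a distance field can be illustrated on a single node with SciPy's Euclidean distance transform; the paper's contribution, the parallel distance tree for distributed-memory machines, is not reproduced here, and the level-set threshold below is an arbitrary choice for the example:

        import numpy as np
        from scipy import ndimage

        # Distance from every voxel to the nearest "surface" voxel,
        # where the surface is a thin level set of a scalar field.
        field = np.random.default_rng(4).random((64, 64, 64))
        surface = np.abs(field - 0.5) < 0.01
        # distance_transform_edt measures distance to the nearest
        # zero element, so invert the surface mask.
        distance = ndimage.distance_transform_edt(~surface)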

  5. Deep Feature Learning and Cascaded Classifier for Large Scale Data

    DEFF Research Database (Denmark)

    Prasoon, Adhish

    This thesis focuses on voxel/pixel classification based approaches for image segmentation. The main application is segmentation of articular cartilage in knee MRIs. The first major contribution of the thesis deals with large scale machine learning problems. Many medical imaging problems need huge ... state-of-the-art method for cartilage segmentation using a one-stage nearest neighbour classifier. Our method achieved better results than the state-of-the-art method for tibial as well as femoral cartilage segmentation. The next main contribution of the thesis deals with learning features autonomously ... learning architecture that autonomously learns the features from the images is the main insight of this study. While training convolutional neural networks for segmentation purposes, the commonly used cost function does not consider the labels of the neighbourhood pixels/voxels. We propose spatially ...

  6. Honeycomb: Visual Analysis of Large Scale Social Networks

    Science.gov (United States)

    van Ham, Frank; Schulz, Hans-Jörg; Dimicco, Joan M.

    The rise in the use of social network sites allows us to collect large amounts of user-reported data on social structures, and analysis of this data could provide useful insights for many of the social sciences. This analysis is typically the domain of Social Network Analysis, and visualization of these structures often proves invaluable in understanding them. However, currently available visual analysis tools are not well suited to handle the massive scale of this network data, and often resort to displaying small ego networks or heavily abstracted networks. In this paper, we present Honeycomb, a visualization tool that is able to deal with much larger scale data (with millions of connections), which we illustrate by using a large-scale corporate social networking site as an example. Additionally, we introduce a new probability-based network metric to guide users to potentially interesting or anomalous patterns, and discuss lessons learned during design and implementation.

  7. Reconstructing a Large-Scale Population for Social Simulation

    Science.gov (United States)

    Fan, Zongchen; Meng, Rongqing; Ge, Yuanzheng; Qiu, Xiaogang

    The advent of social simulation has provided an opportunity for research into social systems. More and more researchers tend to describe the components of social systems at a more detailed level. Any simulation needs the support of population data to initialize and run the simulation systems. However, it is impossible to get data that provide full information about individuals and households. We propose a two-step method to reconstruct a large-scale population for a Chinese city according to Chinese culture. Firstly, a baseline population is generated by gathering individuals into households one by one; secondly, social relationships such as friendship are assigned to the baseline population. Through a case study, a population of 3,112,559 individuals gathered in 1,133,835 households is reconstructed for Urumqi city, and the results show that the generated data match the real data quite well. The generated data can be applied to support the modeling of social phenomena.
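
    As a concrete, hedged illustration of the two-step idea (household sizes, ages and the friendship rule below are invented placeholders, not the paper's calibrated model):

        # Step 1: gather individuals into households one by one.
        # Step 2: overlay a social layer by assigning friends across households.
        import random

        random.seed(42)

        def build_households(n_households):
            population, households, pid = [], [], 0
            for hid in range(n_households):
                size = random.choices([1, 2, 3, 4, 5], weights=[1, 2, 4, 2, 1])[0]
                members = []
                for _ in range(size):
                    population.append({"id": pid, "age": random.randint(0, 90), "hh": hid})
                    members.append(pid)
                    pid += 1
                households.append(members)
            return population, households

        def add_friendships(population, k=3):
            by_id = {p["id"]: p for p in population}
            ids = list(by_id)
            friends = {}
            for p in population:
                pool = [i for i in ids if by_id[i]["hh"] != p["hh"]]
                friends[p["id"]] = random.sample(pool, k)  # k friends, other households
            return friends

        pop, hh = build_households(1000)
        net = add_friendships(pop)
        print(len(pop), "individuals in", len(hh), "households")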

  8. Properties of large-scale TIDs observed in Central China

    Institute of Scientific and Technical Information of China (English)

    TANG; Qiulin(汤秋林); WAN; Weixing(万卫星); NING; Baiqi(宁百齐); YUAN; Hong(袁洪)

    2002-01-01

    This paper investigates large-scale travelling ionospheric disturbances (LSTIDs) using observation data from an HF Doppler array located in Central China. Data observed in a year of high solar activity (1989) are analyzed to obtain the main propagation parameters of LSTIDs, such as period, horizontal phase velocity and propagation direction. The results are as follows: most LSTIDs propagate southward, while the others tend to propagate northward, mostly in summer; the dispersion of most LSTIDs matches that of the Lamb pseudomode, while the others show the dispersion of a long-period gravity-wave mode. The horizontal phase velocities of these two modes are about 220 and 450 m/s, respectively. The analysis shows that LSTIDs are strongly correlated with solar activity and geomagnetic storms; the results presented here are thus significant for research on ionospheric weather in the mid- and low-latitude regions.

  9. Large scale oil lease automation and electronic custody transfer

    International Nuclear Information System (INIS)

    Typically, oil field production operations have only been automated at fields with long-term production profiles and enhanced recovery. The automation generally consists of monitoring and control at the wellhead and centralized facilities. However, Union Pacific Resources Co. (UPRC) has successfully implemented a large-scale automation program for rapid-decline primary recovery Austin Chalk wells, where purchasers buy and transport oil from each individual wellsite. This project has resulted in two significant benefits. First, operators are using the system to re-engineer their work processes. Second, an inter-company team created a new electronic custody transfer method. This paper describes: the progression of the company's automation objectives in the area; the field operator's interaction with the system, and the related benefits; and the research and development of the new electronic custody transfer method.

  10. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring (PHM) for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for systems of the scale on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main components. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, whether due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.
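
    To make the detection side concrete, here is a hedged sketch of one PHM-style diagnostic: flagging a monitored performance metric once it drifts beyond a rolling z-score threshold. The metric, window and threshold are assumptions of ours; the report does not specify them.

        from collections import deque
        import math

        class RollingDetector:
            """Flag samples that deviate strongly from the recent window."""
            def __init__(self, window=50, threshold=3.0):
                self.buf = deque(maxlen=window)
                self.threshold = threshold

            def update(self, x):
                anomalous = False
                if len(self.buf) >= 10:
                    mean = sum(self.buf) / len(self.buf)
                    var = sum((v - mean) ** 2 for v in self.buf) / len(self.buf)
                    std = math.sqrt(var) or 1e-9      # guard against zero variance
                    anomalous = abs(x - mean) / std > self.threshold
                self.buf.append(x)
                return anomalous

        det = RollingDetector()
        samples = [1.0] * 60 + [5.0]                  # e.g. a sudden bandwidth collapse
        print([det.update(s) for s in samples][-1])   # True: the last sample is flagged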

  11. Galaxy clustering and the origin of large-scale flows

    Science.gov (United States)

    Juszkiewicz, R.; Yahil, A.

    1989-01-01

    Peebles's 'cosmic virial theorem' is extended from its original range of validity at small separations, where hydrostatic equilibrium holds, to large separations, where linear gravitational stability theory applies. The rms pairwise velocity difference at separation r is shown to depend on the spatial galaxy correlation function xi(x) only for x less than r. Gravitational instability theory can therefore be tested by comparing the two up to the maximum separation for which both can reliably be determined, with no dependence on the poorly known large-scale density and velocity fields. With the expected improvement in the data over the next few years, this method should yield a reliable determination of Omega.

  12. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in the modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans.

  13. Thermophoretically induced large-scale deformations around microscopic heat centers

    Science.gov (United States)

    Puljiz, Mate; Orlishausen, Michael; Köhler, Werner; Menzel, Andreas M.

    2016-05-01

    Selectively heating a microscopic colloidal particle embedded in a soft elastic matrix is a situation of high practical relevance. For instance, during hyperthermic cancer treatment, cell tissue surrounding heated magnetic colloidal particles is destroyed. Experiments on soft elastic polymeric matrices suggest a very long-ranged, non-decaying radial component of the thermophoretically induced displacement fields around the microscopic heat centers. We theoretically confirm this conjecture using a macroscopic hydrodynamic two-fluid description. Both thermophoretic and elastic effects are included in this theory. Indeed, we find that the elasticity of the environment can cause the experimentally observed large-scale radial displacements in the embedding matrix. Additional experiments confirm the central role of elasticity. Finally, a linearly decaying radial component of the displacement field in the experiments is attributed to the finite size of the experimental sample. Similar results are obtained from our theoretical analysis under modified boundary conditions.

  14. Observational signatures of modified gravity on ultra-large scales

    CERN Document Server

    Baker, Tessa

    2015-01-01

    Extremely large surveys with future experiments like Euclid and the SKA will soon allow us to access perturbation modes close to the Hubble scale, with wavenumbers $k \sim \mathcal{H}$. If a modified gravity theory is responsible for cosmic acceleration, the Hubble scale is a natural regime for deviations from General Relativity (GR) to become manifest. The majority of studies to date have concentrated on the consequences of alternative gravity theories for the subhorizon, quasi-static regime, however. We investigate how modifications to the gravitational field equations affect perturbations around the Hubble scale, and how this translates into deviations of ultra-large-scale relativistic observables from their GR behaviour. Adopting a model-independent ethos that relies only on the broad physical properties of gravity theories, we find that the deviations of the observables are small unless modifications to GR are drastic. The angular dependence and redshift evolution of the deviations are highly parameterisatio...

  15. Large Scale Cosmic Perturbation from Evaporation of Primordial Black Holes

    CERN Document Server

    Fujita, Tomohiro; Kawasaki, Masahiro

    2013-01-01

    We present a novel mechanism for generating the cosmic perturbation from the evaporation of primordial black holes. The mass of a field fluctuates if it is set by the vacuum expectation value of a light scalar field, because of the quantum fluctuations of that field during inflation. The fluctuated mass causes variations in the evaporation time of the primordial black holes. Therefore, provided the primordial black holes dominate the universe when they evaporate, primordial cosmic perturbations are generated. We find that the amplitude of the large-scale curvature perturbation generated in this scenario can be consistent with the observed value. Interestingly, our mechanism works even if all fields responsible for inflation and the generation of the cosmic perturbation are decoupled from the visible sector except through the gravitational interaction. An implication for the running of the spectral index is also discussed.

  16. Large-scale research experts voice opinion on Chernobyl

    International Nuclear Information System (INIS)

    In mid-September 1986 the Work Study Group for Large-Scale Research (AGF), which comprises the 13 German publicly funded research institutes, invited experts to the Science Centre at Bonn-Bad Godesberg to discuss the inter-relationship between the Ukrainian reactor accident and the future energy supply in the Federal Republic of Germany. The event was attended predominantly by scientists belonging to the AGF and by representatives of the media, colleges, scientific organisations, ministries and authorities, associations, embassies of various countries, the German Government and the German Atomic Forum. Thanks to the objectivity that prevailed, the general atmosphere of the discussions differed pleasantly from the politically and election-motivated statements of recent months. (orig.)

  17. Isolating relativistic effects in large-scale structure

    CERN Document Server

    Bonvin, Camille

    2014-01-01

    We present a fully relativistic calculation of the observed galaxy number counts in the linear regime. We show that besides the density fluctuations and redshift-space distortions, various relativistic effects contribute to observations at large scales. These effects all have the same physical origin: they result from the fact that our coordinate system, namely the galaxy redshift and the incoming photons' direction, is distorted by inhomogeneities in our universe. We then discuss the impact of the relativistic effects on the angular power spectrum and on the two-point correlation function in configuration space. We show that the latter is very well adapted to isolate the relativistic effects since it naturally makes use of the symmetries of the different contributions. In particular, we discuss how the Doppler effect and the gravitational redshift distortions can be isolated by looking for a dipole in the cross-correlation function between a bright and a faint population of galaxies.

  18. Automatic Installation and Configuration for Large Scale Farms

    CERN Document Server

    Novák, J

    2005-01-01

    Since the early appearance of commodity hardware, the utilization of computers has risen rapidly, and they have become essential in all areas of life. Soon it was realized that nodes are able to work cooperatively in order to solve new, more complex tasks. This concept materialized in coherent aggregations of computers called farms and clusters. Collective application of nodes, being efficient and economical, was adopted in education, research and industry before long. But maintenance, especially at large scale, emerged as a problem to be resolved. New challenges needed new methods and tools. Development work has been started to build farm management applications and frameworks. In the first part of the thesis, these systems are introduced. After a general description of the matter, a comparative analysis of different approaches and tools illustrates the practical aspects of the theoretical discussion. CERN, the European Organization for Nuclear Research, is the largest particle physics laboratory in the world....

  19. Systematic Renormalization of the Effective Theory of Large Scale Structure

    CERN Document Server

    Abolhasani, Ali Akbar; Pajer, Enrico

    2015-01-01

    A perturbative description of Large Scale Structure is a cornerstone of our understanding of the observed distribution of matter in the universe. Renormalization is an essential and defining step to make this description physical and predictive. Here we introduce a systematic renormalization procedure, which neatly associates counterterms to the UV-sensitive diagrams order by order, as is commonly done in quantum field theory. As a concrete example, we renormalize the one-loop power spectrum and bispectrum of both density and velocity. In addition, we present a series of results that are valid to all orders in perturbation theory. First, we show that while systematic renormalization requires temporally non-local counterterms, in practice one can use an equivalent basis made of local operators. We give an explicit prescription to generate all counterterms allowed by the symmetries. Second, we present a formal proof of the well-known general argument that the contribution of short distance perturbations to l...

  20. Building a Large-Scale Knowledge Base for Machine Translation

    CERN Document Server

    Knight, K; Knight, Kevin; Luk, Steve K.

    1994-01-01

    Knowledge-based machine translation (KBMT) systems have achieved excellent results in constrained domains, but have not yet scaled up to newspaper text. The reason is that knowledge resources (lexicons, grammar rules, world models) must be painstakingly handcrafted from scratch. One of the hypotheses being tested in the PANGLOSS machine translation project is whether or not these resources can be semi-automatically acquired on a very large scale. This paper focuses on the construction of a large ontology (or knowledge base, or world model) for supporting KBMT. It contains representations for some 70,000 commonly encountered objects, processes, qualities, and relations. The ontology was constructed by merging various online dictionaries, semantic networks, and bilingual resources, through semi-automatic methods. Some of these methods (e.g., conceptual matching of semantic taxonomies) are broadly applicable to problems of importing/exporting knowledge from one KB to another. Other methods (e.g., bilingual match...

  1. Large-scale mean patterns in turbulent convection

    CERN Document Server

    Emran, Mohammad S

    2015-01-01

    Large-scale patterns, which are well known from the spiral defect chaos regime of thermal convection at Rayleigh numbers $Ra < 10^4$, are reported here for turbulent convection at $Ra > 10^5$. They are uncovered when the turbulent fields are averaged in time and turbulent fluctuations are thus removed. We apply the Boussinesq closure to estimate turbulent viscosities and diffusivities, respectively. The resulting turbulent Rayleigh number $Ra_{*}$, which describes the convection of the mean patterns, is indeed in the spiral defect chaos range. The turbulent Prandtl numbers are smaller than one, with $0.2 \le Pr_{*} \le 0.4$ for Prandtl numbers $0.7 \le Pr \le 10$. Finally, we demonstrate that these mean flow patterns are robust to an additional finite-amplitude side-wall forcing when the level of turbulent fluctuations in the flow is sufficiently high.
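
    As a sketch of the closure step mentioned above (notation ours; the record does not spell it out), replacing the molecular viscosity and diffusivity by their turbulent counterparts $\nu_*$ and $\kappa_*$ would give

        \mathrm{Ra}_{*} = \frac{\alpha g \,\Delta T\, H^{3}}{\nu_{*}\,\kappa_{*}},
        \qquad
        \mathrm{Pr}_{*} = \frac{\nu_{*}}{\kappa_{*}},

    where $\alpha$ is the thermal expansion coefficient, $g$ gravity, $\Delta T$ the applied temperature difference and $H$ the cell height.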

  2. Isolating relativistic effects in large-scale structure

    International Nuclear Information System (INIS)

    We present a fully relativistic calculation of the observed galaxy number counts in the linear regime. We show that besides the density fluctuations and redshift-space distortions, various relativistic effects contribute to observations at large scales. These effects all have the same physical origin: they result from the fact that our coordinate system, namely the galaxy redshift and the incoming photons’ direction, is distorted by inhomogeneities in our Universe. We then discuss the impact of the relativistic effects on the angular power spectrum and on the two-point correlation function in configuration space. We show that the latter is very well adapted to isolate the relativistic effects since it naturally makes use of the symmetries of the different contributions. In particular, we discuss how the Doppler effect and the gravitational redshift distortions can be isolated by looking for a dipole in the cross-correlation function between a bright and a faint population of galaxies. (paper)

  3. Optimization of large scale food production using Lean Manufacturing principles

    DEFF Research Database (Denmark)

    Engelund, Eva Høy; Friis, Alan; Breum, Gitte

    2009-01-01

    This paper discusses how the production principles of Lean Manufacturing (Lean) can be applied in large-scale meal production. Lean principles are briefly presented, followed by a field study of how a kitchen at a Danish hospital has implemented Lean in its daily production. In the kitchen, the main purposes of implementing Lean were to rationalise internal procedures and to increase production efficiency following a change from cook-serve production to cook-chill, and a reduction in the number of employees. It was also important that product quality and the working environment should not be negatively affected by the rationalisation of production procedures. The field study shows that Lean principles can be applied in meal production and can result in increased production efficiency and systematic improvement of product quality without negative effects on the working environment. The results ...

  4. Design Performance Standards for Large Scale Wind Farms

    DEFF Research Database (Denmark)

    Gordon, Mark

    2009-01-01

    This document presents, discusses and provides a general guide on electrical performance standard requirements for connection of large-scale onshore wind farms into HV transmission networks. Experiences presented here refer mainly to technical requirements and issues encountered during the process of connection into the Eastern Australian power system under the Rules and guidelines set out by AEMC and NEMMCO (AEMO). Where applicable, some international practices are also mentioned. Standards are designed to serve as a technical envelope under which wind farm proponents design the plant and maintain ongoing technical compliance of the plant during its operational lifetime. This report is designed to provide general technical information for the wind farm connection engineer to be aware of during the process of connection, registration and operation of wind power plants interconnected into the HV transmission network.

  5. Biased galaxy formation and large-scale structure

    Science.gov (United States)

    Berlind, Andreas Alan

    The biased relation between the galaxy and mass distributions lies at the intersection of large-scale structure in the universe and the process of galaxy formation. I study the nature of galaxy bias and its connections to galaxy clustering and galaxy formation physics. Galaxy bias has traditionally been viewed as an obstacle to constraining cosmological parameters by studying galaxy clustering. I examine the effect of bias on measurements of the cosmological density parameter Ω_m by techniques that exploit the gravity-induced motions of galaxies. Using a variety of environmental bias models applied to N-body simulations, I find that, in most cases, the quantity estimated by these techniques is the value of Ω_m^0.6/b_s, where b_s is the ratio of rms galaxy fluctuations to rms mass fluctuations on large scales. Moreover, I find that different methods should, in principle, agree with each other, and it is thus unlikely that non-linear or scale-dependent bias is responsible for the discrepancies that exist among current measurements. One can also view the influence of bias on galaxy clustering as a strength rather than a weakness, since it provides us with a potentially powerful way to constrain galaxy formation theories. With this goal in mind, I develop the "Halo Occupation Distribution" (HOD), a physically motivated and complete formulation of bias that is based on the distribution of galaxies within virialized dark matter halos. I explore the sensitivity of galaxy clustering statistics to features of the HOD and focus on how the HOD may be empirically constrained from galaxy clustering data. I make the connection to the physics of galaxy formation by studying the HOD predicted by the two main theoretical methods of modeling galaxy formation. I find that, despite many differences between them, the two methods predict the same HOD, suggesting that galaxy bias is determined by robust features of the hierarchical galaxy formation process rather than details of gas cooling.
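
    The abstract leaves the functional form of the HOD unspecified; purely for orientation, a widely used parameterization of the mean occupation (in the style of Zheng et al. 2005, an assumption on our part rather than the form adopted in this thesis) separates central and satellite galaxies:

        \langle N_{\mathrm{cen}}(M)\rangle
          = \frac{1}{2}\left[1 + \operatorname{erf}\!\left(
              \frac{\log M - \log M_{\min}}{\sigma_{\log M}}\right)\right],
        \qquad
        \langle N_{\mathrm{sat}}(M)\rangle
          = \langle N_{\mathrm{cen}}(M)\rangle
            \left(\frac{M - M_{0}}{M_{1}'}\right)^{\alpha},

    where $M_{\min}$, $\sigma_{\log M}$, $M_{0}$, $M_{1}'$ and $\alpha$ are free parameters constrained by clustering data.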

  6. Large-scale mass distribution in the Illustris simulation

    Science.gov (United States)

    Haider, M.; Steinhauser, D.; Vogelsberger, M.; Genel, S.; Springel, V.; Torrey, P.; Hernquist, L.

    2016-04-01

    Observations at low redshifts thus far fail to account for all of the baryons expected in the Universe according to cosmological constraints. A large fraction of the baryons presumably resides in a thin and warm-hot medium between the galaxies, where they are difficult to observe due to their low densities and high temperatures. Cosmological simulations of structure formation can be used to verify this picture and provide quantitative predictions for the distribution of mass in different large-scale structure components. Here we study the distribution of baryons and dark matter at different epochs using data from the Illustris simulation. We identify regions of different dark matter density with the primary constituents of large-scale structure, allowing us to measure mass and volume of haloes, filaments and voids. At redshift zero, we find that 49 per cent of the dark matter and 23 per cent of the baryons are within haloes more massive than the resolution limit of 2 × 10^8 M⊙. The filaments of the cosmic web host a further 45 per cent of the dark matter and 46 per cent of the baryons. The remaining 31 per cent of the baryons reside in voids. The majority of these baryons have been transported there through active galactic nuclei feedback. We note that the feedback model of Illustris is too strong for heavy haloes, therefore it is likely that we are overestimating this amount. Categorizing the baryons according to their density and temperature, we find that 17.8 per cent of them are in a condensed state, 21.6 per cent are present as cold, diffuse gas, and 53.9 per cent are found in the state of a warm-hot intergalactic medium.
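
    A hedged sketch of the phase classification described above; the exact density and temperature cuts are not given in this record, so the values below (T = 1e5 K, overdensity 1000) are illustrative stand-ins only.

        import numpy as np

        def classify_baryons(temperature, overdensity, t_cut=1e5, rho_cut=1e3):
            """Label each gas element as condensed, diffuse, or warm-hot (WHIM)."""
            phase = np.full(temperature.shape, "whim", dtype=object)
            cold = temperature < t_cut
            phase[cold & (overdensity >= rho_cut)] = "condensed"
            phase[cold & (overdensity < rho_cut)] = "diffuse"
            return phase

        rng = np.random.default_rng(1)
        T = 10 ** rng.uniform(3, 8, size=100_000)       # temperature in K
        delta = 10 ** rng.uniform(-2, 6, size=100_000)  # density / mean density
        phase = classify_baryons(T, delta)
        for name in ("condensed", "diffuse", "whim"):
            print(name, round(float((phase == name).mean()) * 100, 1), "%")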

  7. Investigation of the large scale regional hydrogeological situation at Ceberg

    International Nuclear Information System (INIS)

    The present study forms part of the large-scale groundwater flow studies within the SR 97 project. The site of interest is Ceberg. Within the present study two regional-scale groundwater models have been constructed: one large regional model with an areal extent of about 300 km2 and one semi-regional model with an areal extent of about 50 km2. Different types of boundary conditions have been applied to the models: topography-driven pressures, constant infiltration rates, non-linear infiltration combined with specified-pressure boundary conditions, and transfer of groundwater pressures from the larger model to the semi-regional model. The modelling has shown the following. Groundwater flow paths are mainly local; large-scale flow paths are seen only below the depth of the hypothetical repository (below 500 metres) and are very slow. Locations of recharge and discharge, to and from the site area, lie in the close vicinity of the site. The low contrast between major structures and the rock mass means that the factor with the major effect on the flow paths is the topography. A model sufficiently large to incorporate the recharge and discharge areas of the local site is of the order of kilometres in extent. A uniform infiltration-rate boundary condition does not give a good representation of the groundwater movements in the model. A local site model may be located to cover the site area and a few kilometres of the surrounding region; in order to incorporate all recharge and discharge areas, the site model will be somewhat larger than site-scale models at other sites, because the discharge areas are divided into three distinct areas to the east, south and west of the site. Boundary conditions may be supplied to the site model by transferring groundwater pressures obtained with the semi-regional model.

  8. High Fidelity Simulations of Large-Scale Wireless Networks

    Energy Technology Data Exchange (ETDEWEB)

    Onunkwo, Uzoma [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Benz, Zachary [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-11-01

    The worldwide proliferation of wireless connected devices continues to accelerate. There are tens of billions of wireless links across the planet, with an additional explosion of new wireless usage anticipated as the Internet of Things develops. Wireless technologies not only provide convenience for mobile applications, but are also extremely cost-effective to deploy. Thus, this trend towards wireless connectivity will only continue, and Sandia must develop the necessary simulation technology to proactively analyze the associated emerging vulnerabilities. Wireless networks are marked by mobility and proximity-based connectivity. The de facto standard for exploratory studies of wireless networks is discrete event simulation (DES). However, the simulation of large-scale wireless networks is extremely difficult due to prohibitively large turnaround times. A path forward is to expedite simulations with parallel discrete event simulation (PDES) techniques. The mobility and distance-based connectivity associated with wireless simulations, however, typically doom PDES approaches to poor scaling (e.g., the OPNET and ns-3 simulators). We propose a PDES-based tool aimed at reducing the communication overhead between processors. The proposed solution will use light-weight processes to dynamically distribute computation workload while mitigating the communication overhead associated with synchronizations. This work is vital to the analytics and validation capabilities of simulation and emulation at Sandia. We have years of experience in Sandia's simulation and emulation projects (e.g., MINIMEGA and FIREWHEEL). Sandia's current highly-regarded capabilities in large-scale emulations have focused on wired networks, where two assumptions prevent scalable wireless studies: (a) the connections between objects are mostly static, and (b) the nodes have fixed locations.
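
    To make the term concrete, the following is a minimal serial discrete event simulation loop, i.e. only the core that PDES techniques parallelize; it is not OPNET, ns-3, or the proposed Sandia tool, and the packet-arrival example is invented.

        import heapq
        import itertools

        counter = itertools.count()   # tie-breaker so the heap never compares handlers

        def simulate(initial_events, horizon=30.0):
            queue = []
            for t, name, handler in initial_events:
                heapq.heappush(queue, (t, next(counter), name, handler))
            while queue:
                t, _, name, handler = heapq.heappop(queue)
                if t > horizon:
                    break
                for nt, nname, nhandler in handler(t):   # handlers schedule follow-ups
                    heapq.heappush(queue, (nt, next(counter), nname, nhandler))

        def arrival(t):
            print(f"{t:5.1f}: packet arrives; schedule the next one")
            return [(t + 5.0, "arrival", arrival)]

        simulate([(0.0, "arrival", arrival)])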

  9. In situ vitrification large-scale operational acceptance test analysis

    International Nuclear Information System (INIS)

    A thermal treatment process is currently under study to provide possible enhancement of in-place stabilization of transuranic and chemically contaminated soil sites. The process is known as in situ vitrification (ISV). In situ vitrification is a remedial action process that destroys solid and liquid organic contaminants and incorporates radionuclides into a glass-like material that renders contaminants substantially less mobile and less likely to impact the environment. A large-scale operational acceptance test (LSOAT) was recently completed in which more than 180 t of vitrified soil were produced in each of three adjacent settings. The LSOAT demonstrated that the process conforms to the functional design criteria necessary for the large-scale radioactive test (LSRT) to be conducted following verification of the performance capabilities of the process. The energy requirements and vitrified block size, shape, and mass are sufficiently equivalent to those predicted by the ISV mathematical model to confirm its usefulness as a predictive tool. The LSOAT demonstrated an electrode replacement technique, which can be used if an electrode fails, and techniques have been identified to minimize air oxidation, thereby extending electrode life. A statistical analysis was employed during the LSOAT to identify graphite collars and an insulative surface as successful cold cap subsidence techniques. The LSOAT also showed that even under worst-case conditions, the off-gas system exceeds the flow requirements necessary to maintain a negative pressure on the hood covering the area being vitrified. The retention of simulated radionuclides and chemicals in the soil and off-gas system exceeds requirements so that projected emissions are one to two orders of magnitude below the maximum permissible concentrations of contaminants at the stack

  10. Disinformative data in large-scale hydrological modelling

    Directory of Open Access Journals (Sweden)

    A. Kauffeldt

    2013-07-01

    Large-scale hydrological modelling has become an important tool for the study of global and regional water resources, climate impacts, and water-resources management. However, modelling efforts over large spatial domains are fraught with problems of data scarcity, uncertainties and inconsistencies between model forcing and evaluation data. Model-independent methods to screen and analyse data for such problems are needed. This study aimed at identifying data inconsistencies in global datasets using a pre-modelling analysis, inconsistencies that can be disinformative for subsequent modelling. The consistency between (i) basin areas for different hydrographic datasets, and (ii) climate data (precipitation and potential evaporation) and discharge data, was examined in terms of how well basin areas were represented in the flow networks and the possibility of water-balance closure. It was found that (i) most basins could be well represented in both gridded basin delineations and polygon-based ones, but some basins exhibited large area discrepancies between flow-network datasets and archived basin areas, (ii) basins exhibiting too-high runoff coefficients were abundant in areas where precipitation data were likely affected by snow undercatch, and (iii) the occurrence of basins exhibiting losses exceeding the potential-evaporation limit was strongly dependent on the potential-evaporation data, both in terms of numbers and geographical distribution. Some inconsistencies may be resolved by considering sub-grid variability in climate data, surface-dependent potential-evaporation estimates, etc., but further studies are needed to determine the reasons for the inconsistencies found. Our results emphasise the need for pre-modelling data analysis to identify dataset inconsistencies as an important first step in any large-scale study. Applying data-screening methods before modelling should also increase our chances to draw robust conclusions from subsequent modelling.
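
    The runoff-coefficient and evaporation checks lend themselves to a short sketch. The function below flags physically implausible long-term water balances; the variable names and example numbers are our own illustrations, not the study's data.

        def screen_basin(precip, discharge, pot_evap):
            """All inputs in mm/yr; return a list of consistency flags."""
            flags = []
            if discharge / precip > 1.0:
                flags.append("runoff coefficient > 1 (snow undercatch?)")
            if precip - discharge > pot_evap:
                flags.append("losses exceed the potential-evaporation limit")
            return flags

        basins = {
            "A": dict(precip=400.0, discharge=520.0, pot_evap=300.0),  # undercatch suspect
            "B": dict(precip=900.0, discharge=100.0, pot_evap=500.0),  # PE limit exceeded
            "C": dict(precip=800.0, discharge=400.0, pot_evap=600.0),  # consistent
        }
        for name, data in basins.items():
            print(name, screen_basin(**data) or ["ok"])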

  11. Large scale genotyping study for asthma in the Japanese population

    Directory of Open Access Journals (Sweden)

    Shibasaki Masanao

    2009-03-01

    Background: Asthma is a complex phenotype that is influenced by both genetic and environmental factors. Genome-wide linkage and association studies have been performed to identify susceptibility genes for asthma. These studies identified new genes and pathways implicated in this disease, many of which were previously unknown. Objective: To perform a large-scale genotyping study to identify asthma-susceptibility genes in the Japanese population. Methods: We performed a large-scale, three-stage association study on 288 atopic asthmatics and 1032 controls, using multiplex PCR-Invader assay methods at 82,935 single nucleotide polymorphisms (SNPs) (1st stage). SNPs that were strongly associated with asthma were further genotyped in samples from asthmatic families (216 families, 762 members; 2nd stage), and in 541 independent patients and 744 controls (3rd stage). Results: SNPs located in the 5' region of PEX19 (rs2820421) were significantly associated with asthma from the 1st to the 3rd stage analyses; however, the P values did not reach statistically significant levels (combined, P = 3.8 × 10^-5; statistically significant level with Bonferroni correction, P = 6.57 × 10^-7). SNPs on HPCAL1 (rs3771140) and on IL18R1 (rs3213733) were associated with asthma in the 1st and 2nd stage analyses, but the associations were not observed in the 3rd stage analysis. Conclusion: No association attained genome-wide significance, but several loci of possible association emerged. Future studies are required to validate these results for the prevention and treatment of asthma.
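
    The Bonferroni cut-off quoted above is simply the family-wise error rate divided by the number of tests. Assuming alpha = 0.05 (our inference; the record does not state it), the arithmetic is:

        alpha = 0.05
        threshold = 6.57e-7
        print(round(alpha / threshold))   # ~76,104 effective tests implied by the quoted cut-off
        print(alpha / 82935)              # ~6.03e-7 if all genotyped SNPs counted as tests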

  12. A method for the large scale isolation of high transformation efficiency fungal genomic DNA.

    Science.gov (United States)

    Zhang, D; Yang, Y; Castlebury, L A; Cerniglia, C E

    1996-12-01

    A procedure for the isolation of genomic DNA from the zygomycete Cunninghamella elegans and other filamentous fungi and yeasts is reported. This procedure involves disruption of cells by grinding with dry ice, removal of polysaccharides using cetyltrimethylammonium bromide and phenol extractions, and precipitation of DNA with isopropanol at room temperature. The isolation method produced large amounts (approximately 1 mg DNA per 5 g wet cells) of highly purified, high-molecular-mass DNA. Sau3AI partially digested DNA showed high transformation efficiency (> 10^6/100 ng DNA) when ligated into the ZAP-Express lambda vector. PMID:8961565

  13. Bayesian large-scale structure inference: initial conditions and the cosmic web

    CERN Document Server

    Leclercq, Florent

    2014-01-01

    We describe an innovative statistical approach for the ab initio simultaneous analysis of the formation history and morphology of the large-scale structure of the inhomogeneous Universe. Our algorithm explores the joint posterior distribution of the many millions of parameters involved via efficient Hamiltonian Markov Chain Monte Carlo sampling. We describe its application to the Sloan Digital Sky Survey data release 7 and an additional non-linear filtering step. We illustrate the use of our findings for cosmic web analysis: identification of structures via tidal shear analysis and inference of dark matter voids.
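
    Purely to unpack the sampler named above, here is a minimal Hamiltonian Monte Carlo step on a toy Gaussian target; the million-parameter sampler behind this record is far more elaborate, and the step size and trajectory length below are arbitrary choices of ours.

        import numpy as np

        def hmc_step(x, log_prob_grad, eps=0.1, n_leapfrog=20, rng=np.random):
            p = rng.standard_normal(x.shape)             # fresh momenta
            x_new, p_new = x.copy(), p.copy()
            p_new += 0.5 * eps * log_prob_grad(x_new)    # half kick
            for _ in range(n_leapfrog - 1):
                x_new += eps * p_new                     # drift
                p_new += eps * log_prob_grad(x_new)      # full kick
            x_new += eps * p_new
            p_new += 0.5 * eps * log_prob_grad(x_new)    # final half kick
            # Metropolis accept/reject on the Hamiltonian (standard normal target).
            h_old = 0.5 * (x @ x) + 0.5 * (p @ p)
            h_new = 0.5 * (x_new @ x_new) + 0.5 * (p_new @ p_new)
            return x_new if np.log(rng.uniform()) < h_old - h_new else x

        grad = lambda x: -x                              # gradient of log N(0, I)
        x, samples = np.zeros(5), []
        for _ in range(2000):
            x = hmc_step(x, grad)
            samples.append(x.copy())
        print(np.std(samples, axis=0).round(2))          # ~1 in each dimension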

  14. Minimization of Linear Functionals Defined on| Solutions of Large-Scale Discrete Ill-Posed Problems

    DEFF Research Database (Denmark)

    Elden, Lars; Hansen, Per Christian; Rojas, Marielba

    2003-01-01

    The minimization of linear functionals defined on the solutions of discrete ill-posed problems arises, e.g., in the computation of confidence intervals for these solutions. In 1990, Elden proposed an algorithm for this minimization problem based on a parametric-programming reformulation involving the solution of a sequence of trust-region problems, using matrix factorizations. In this paper, we describe MLFIP, a large-scale version of this algorithm in which a limited-memory trust-region solver is used on the subproblems. We illustrate the use of our algorithm in connection with an inverse heat conduction problem.
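
    One common statement of this problem class (our notation; the abstract does not give the formulation explicitly) is

        \min_{x}\; w^{\mathsf T} x
        \quad \text{subject to} \quad
        \|Ax - b\|_{2} \le \varepsilon,

    where $A$ is the discretized ill-posed operator, $b$ the noisy data, $w$ defines the linear functional of interest, and $\varepsilon$ bounds the residual; solving for both the minimum and the maximum brackets a confidence interval, and a parametric reformulation of this constrained problem leads to the sequence of trust-region subproblems mentioned above.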

  15. Direct large-scale synthesis of perovskite barium strontium titanate nano-particles from solutions

    International Nuclear Information System (INIS)

    This paper reports a wet-chemical synthesis technique for the large-scale fabrication of perovskite barium strontium titanate nano-particles near room temperature and under ambient pressure. The process employs titanium alkoxide and alkaline earth hydroxides as starting materials and involves very simple operation steps. The particle size and crystallinity are controllable by changing the processing parameters. Observations by X-ray diffraction, scanning electron microscopy and transmission electron microscopy (TEM) indicate that the particles are well-crystallized, chemically stoichiometric and ∼50 nm in diameter. The nanoparticles can be sintered into ceramics at 1150 °C and show typical ferroelectric hysteresis loops.

  16. Towards large scale modelling of wetland water dynamics in northern basins.

    Science.gov (United States)

    Pedinotti, V.; Sapriza, G.; Stone, L.; Davison, B.; Pietroniro, A.; Quinton, W. L.; Spence, C.; Wheater, H. S.

    2015-12-01

    Understanding the hydrological behaviour of low-topography, wetland-dominated sub-arctic areas is a major requirement for the improvement of large-scale hydrological models. These wet organic soils cover a large extent of northern North America and have a considerable impact on the rainfall-runoff response of a catchment. Moreover, their strong interactions with the lower atmosphere and the carbon cycle make these areas a noteworthy component of the regional climate system. In the framework of the Changing Cold Regions Network (CCRN), this study aims at providing a model for wetland water dynamics that can be used for large-scale applications in cold regions. The modelling system has two main components: (a) the simulation of surface runoff using the Modélisation Environnementale Communautaire - Surface and Hydrology (MESH) land surface model driven with several gridded atmospheric datasets, and (b) the routing of surface runoff using the WATROUTE channel scheme. As a preliminary study, we focus on two small representative study basins in northern Canada: Scotty Creek, in the lower Liard River valley of the Northwest Territories, and Baker Creek, located a few kilometres north of Yellowknife. Both areas present characteristic landscapes dominated by a series of peat plateaus, channel fens, small lakes and bogs. Moreover, they constitute important fieldwork sites with detailed data to support our modelling study. The challenge for our new wetland model is to represent the hydrological functioning of the various landscape units encountered in those watersheds and their interactions, using simple numerical formulations that can later be extended to larger basins such as the Mackenzie River basin. Using observed datasets, the performance of the model in simulating the temporal evolution of hydrological variables such as water table depth, frost table depth and discharge is assessed.

  17. Large-Scale Solvent-Free Chlorination of Hydroxy-Pyrimidines, -Pyridines, -Pyrazines and -Amides Using Equimolar POCl3

    Directory of Open Access Journals (Sweden)

    Zhihua Sun

    2012-04-01

    Chlorination with equimolar POCl3 can be efficiently achieved not only for hydroxypyrimidines, but also for many other substrates such as 2-hydroxy-pyridines, -quinoxalines, or even -amides. The procedure is solvent-free and involves heating in a sealed reactor at high temperatures using one equivalent of pyridine as base. It is suitable for large-scale (multigram) batch preparations.

  18. Potential climatic impacts and reliability of large-scale offshore wind farms

    International Nuclear Information System (INIS)

    …-based installations. However, the intermittency caused by the significant seasonal wind variations over several major offshore sites is substantial, and demands further options to ensure the reliability of large-scale offshore wind power. The method that we used to simulate the offshore wind turbine effect on the lower atmosphere involved simply increasing the ocean surface drag coefficient. While this method is consistent with several detailed fine-scale simulations of wind turbines, it still needs further study to ensure its validity. New field observations of actual wind turbine arrays are definitely required to provide ultimate validation of the model predictions presented here.
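
    The modelling device described above amounts to raising the surface momentum flux. A hedged sketch, where the baseline and "farm" drag coefficients are illustrative assumptions rather than the study's values:

        # Surface stress tau = rho_air * C_d * U^2 extracted from the lower atmosphere.
        RHO_AIR = 1.225            # air density, kg/m^3

        def surface_stress(wind_speed, drag_coeff):
            return RHO_AIR * drag_coeff * wind_speed ** 2

        u10 = 10.0                 # 10 m wind speed, m/s
        cd_ocean = 1.5e-3          # typical open-ocean drag coefficient
        cd_farm = 7.0e-3           # elevated value standing in for turbine drag
        print(surface_stress(u10, cd_ocean))   # ~0.18 N/m^2
        print(surface_stress(u10, cd_farm))    # ~0.86 N/m^2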

  19. Energy from the desert very large scale PV power : state of the art and into the future

    CERN Document Server

    Komoto, Keiichi; Cunow, Edwin; Megherbi, Karim; Faiman, David; van der Vleuten, Peter

    2013-01-01

    The fourth volume in the established Energy from the Desert series examines and evaluates the potential and feasibility of Very Large Scale Photovoltaic Power Generation (VLS-PV) systems, which have capacities ranging from several megawatts to gigawatts, and develops practical project proposals toward implementing VLS-PV systems in the future. It comprehensively analyses all major issues involved in such large-scale applications, based on the latest scientific and technological developments and by means of close international co-operation with experts from different countries. From t…

  20. Large-scale simulations of layered double hydroxide nanocomposite materials

    Science.gov (United States)

    Thyveetil, Mary-Ann

    Layered double hydroxides (LDHs) have the ability to intercalate a multitude of anionic species. Atomistic simulation techniques such as molecular dynamics have provided considerable insight into the behaviour of these materials. We review these techniques and recent algorithmic advances which considerably improve the performance of MD applications. In particular, we discuss how the advent of high performance computing and computational grids has allowed us to explore large-scale models with considerable ease. Our simulations have been heavily reliant on computational resources on the UK's NGS (National Grid Service), the US TeraGrid and the Distributed European Infrastructure for Supercomputing Applications (DEISA). In order to utilise computational grids we rely on grid middleware to launch, computationally steer and visualise our simulations. We have integrated the RealityGrid steering library into the Large-scale Atomic/Molecular Massively Parallel Simulator (LAMMPS), which has enabled us to perform remote computational steering and visualisation of molecular dynamics simulations on grid infrastructures. We also use the Application Hosting Environment (AHE) in order to launch simulations on remote supercomputing resources, and we show that data transfer rates between local clusters and supercomputing resources can be considerably enhanced by using optically switched networks. We perform large-scale molecular dynamics simulations of Mg-Al LDHs intercalated with either chloride ions or a mixture of DNA and chloride ions. The systems exhibit undulatory modes, which are suppressed in smaller-scale simulations, caused by the collective thermal motion of atoms in the LDH layers. Thermal undulations provide elastic properties of the system, including the bending modulus, Young's moduli and Poisson's ratios. To explore the interaction between LDHs and DNA, we use molecular dynamics techniques to perform simulations of double-stranded, linear and plasmid DNA up …
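
    The record breaks off above, but the elastic-property extraction it describes can be sketched: assuming a Helfrich-type analysis (our inference, not a quote from the thesis), the spectrum of thermal height undulations $h(\mathbf{q})$ of a layer of area $A$ is fitted to

        \left\langle \lvert h(\mathbf{q}) \rvert^{2} \right\rangle
          = \frac{k_{B} T}{A\,\kappa\, q^{4}},

    so the bending modulus $\kappa$ follows from the small-$q$ behaviour, and Young's moduli and Poisson's ratios can then be obtained via thin-plate elasticity relations.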