WorldWideScience

Sample records for future large-scale observatories

  1. Supernova relic electron neutrinos and anti-neutrinos in future large-scale observatories

    International Nuclear Information System (INIS)

    Volpe, C.; Welzel, J.

    2007-01-01

We investigate the signal from supernova relic neutrinos in future large-scale observatories, such as MEMPHYS (UNO, Hyper-K), LENA and GLACIER, at present under study. We show that complementary information might be gained from the observation of supernova relic electron antineutrinos and neutrinos using scattering on protons on the one hand, and on nuclei such as oxygen, carbon or argon on the other. When determining the relic neutrino fluxes we also include, for the first time, the coupling of the neutrino magnetic moment to magnetic fields within the core-collapse supernova. We present numerical results on both the relic ν_e and ν̄_e fluxes and on the number of events for ν_e + ¹²C, ν_e + ¹⁶O, ν_e + ⁴⁰Ar and ν̄_e + p for various oscillation scenarios. The observation of supernova relic neutrinos might provide us with unique information on core-collapse supernova explosions, on the star formation history and on neutrino properties that still remain unknown. (authors)

  2. Supernova relic electron neutrinos and anti-neutrinos in future large-scale observatories

    Energy Technology Data Exchange (ETDEWEB)

    Volpe, C.; Welzel, J. [Institut de Physique Nucléaire, 91 - Orsay (France)]

    2007-07-01

We investigate the signal from supernova relic neutrinos in future large-scale observatories, such as MEMPHYS (UNO, Hyper-K), LENA and GLACIER, at present under study. We show that complementary information might be gained from the observation of supernova relic electron antineutrinos and neutrinos using scattering on protons on the one hand, and on nuclei such as oxygen, carbon or argon on the other. When determining the relic neutrino fluxes we also include, for the first time, the coupling of the neutrino magnetic moment to magnetic fields within the core-collapse supernova. We present numerical results on both the relic ν_e and ν̄_e fluxes and on the number of events for ν_e + ¹²C, ν_e + ¹⁶O, ν_e + ⁴⁰Ar and ν̄_e + p for various oscillation scenarios. The observation of supernova relic neutrinos might provide us with unique information on core-collapse supernova explosions, on the star formation history and on neutrino properties that still remain unknown. (authors)

  3. A future large-aperture UVOIR space observatory: reference designs

    Science.gov (United States)

    Rioux, Norman; Thronson, Harley; Feinberg, Lee; Stahl, H. Philip; Redding, Dave; Jones, Andrew; Sturm, James; Collins, Christine; Liu, Alice

    2015-09-01

    Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. We describe the feasibility assessment of system thermal and dynamic stability for supporting coronagraphy. The observatory is in a Sun-Earth L2 orbit providing a stable thermal environment and excellent field of regard. Reference designs include a 36-segment 9.2 m aperture telescope that stows within a five meter diameter launch vehicle fairing. Performance needs developed under the study are traceable to a variety of reference designs including options for a monolithic primary mirror.

  4. The Saskatchewan River Basin - a large scale observatory for water security research (Invited)

    Science.gov (United States)

    Wheater, H. S.

    2013-12-01

The 336,000 km² Saskatchewan River Basin (SaskRB) in Western Canada illustrates many of the issues of water security faced world-wide. It poses globally important science challenges due to the diversity of its hydro-climatic and ecological zones. With one of the world's more extreme climates, it embodies environments of global significance, including the Rocky Mountains (source of the major rivers in Western Canada), the Boreal Forest (representing 30% of Canada's land area) and the Prairies (home to 80% of Canada's agriculture). Management concerns include: provision of water resources to more than three million inhabitants, including indigenous communities; balancing competing needs for water between different uses, such as urban centres, industry, agriculture, hydropower and environmental flows; issues of water allocation between upstream and downstream users in the three prairie provinces; managing the risks of floods and droughts; and assessing water-quality impacts of discharges from major cities and intensive agricultural production. Superimposed on these issues is the need to understand and manage uncertain water futures, including the effects of economic growth and environmental change, in a highly fragmented water-governance environment. Key science questions focus on understanding and predicting the effects of land and water management and environmental change on water quantity and quality. To address these science challenges, observational data are necessary across multiple scales. This requires focussed research at intensively monitored sites and small watersheds to improve process understanding and fine-scale models. To understand large-scale effects on river flows and quality, land-atmosphere feedbacks, and regional climate, integrated monitoring, modelling and analysis is needed at large basin scale. And to support water management, new tools are needed for operational management and scenario-based planning that can be implemented across multiple scales and

  5. Large-Scale Science Observatories: Building on What We Have Learned from USArray

    Science.gov (United States)

    Woodward, R.; Busby, R.; Detrick, R. S.; Frassetto, A.

    2015-12-01

With the NSF-sponsored EarthScope USArray observatory, the Earth science community has built the operational capability and experience to tackle scientific challenges at the largest scales, such as a Subduction Zone Observatory. In the first ten years of USArray, geophysical instruments were deployed across roughly 2% of the Earth's surface. The USArray operated a rolling deployment of seismic stations that occupied ~1,700 sites across the USA, made co-located atmospheric observations, occupied hundreds of sites with magnetotelluric sensors, expanded a backbone reference network of seismic stations, and provided instruments to PI-led teams that deployed thousands of additional seismic stations. USArray included a comprehensive outreach component that directly engaged hundreds of students at over 50 colleges and universities to locate station sites and provided Earth science exposure to roughly 1,000 landowners who hosted stations. The project also included a comprehensive data-management capability that received, archived and distributed data, metadata, and data products; data were acquired and distributed in real time. The USArray project was completed on time and under budget and developed a number of best practices that can inform other large-scale science initiatives that the Earth science community is contemplating. Key strategies employed by USArray included: using a survey mode of observation, rather than a hypothesis-driven one, to generate comprehensive, high-quality data at large scale for exploration and discovery; making data freely and openly available to any investigator from the very onset of the project; and using proven, commercial, off-the-shelf systems to ensure a fast start and avoid delays due to over-reliance on unproven technology or concepts. Scope was set ambitiously, but managed carefully to avoid overextending. Configuration was controlled to ensure efficient operations while providing consistent, uniform observations. Finally, community

  6. An Engineering Design Reference Mission for a Future Large-Aperture UVOIR Space Observatory

    Science.gov (United States)

    Thronson, Harley A.; Bolcar, Matthew R.; Clampin, Mark; Crooke, Julie A.; Redding, David; Rioux, Norman; Stahl, H. Philip

    2016-01-01

    From the 2010 NRC Decadal Survey and the NASA Thirty-Year Roadmap, Enduring Quests, Daring Visions, to the recent AURA report, From Cosmic Birth to Living Earths, multiple community assessments have recommended development of a large-aperture UVOIR space observatory capable of achieving a broad range of compelling scientific goals. Of these priority science goals, the most technically challenging is the search for spectroscopic biomarkers in the atmospheres of exoplanets in the solar neighborhood. Here we present an engineering design reference mission (EDRM) for the Advanced Technology Large-Aperture Space Telescope (ATLAST), which was conceived from the start as capable of breakthrough science paired with an emphasis on cost control and cost effectiveness. An EDRM allows the engineering design trade space to be explored in depth to determine what are the most demanding requirements and where there are opportunities for margin against requirements. Our joint NASA GSFC/JPL/MSFC/STScI study team has used community-provided science goals to derive mission needs, requirements, and candidate mission architectures for a future large-aperture, non-cryogenic UVOIR space observatory. The ATLAST observatory is designed to operate at a Sun-Earth L2 orbit, which provides a stable thermal environment and excellent field of regard. Our reference designs have emphasized a serviceable 36-segment 9.2 m aperture telescope that stows within a five-meter diameter launch vehicle fairing. As part of our cost-management effort, this particular reference mission builds upon the engineering design for JWST. Moreover, it is scalable to a variety of launch vehicle fairings. Performance needs developed under the study are traceable to a variety of additional reference designs, including options for a monolithic primary mirror.

  7. The Landscape Evolution Observatory: a large-scale controllable infrastructure to study coupled Earth-surface processes

    Science.gov (United States)

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferré, T.P.A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-01-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m2 surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  8. S-net : Construction of large scale seafloor observatory network for tsunamis and earthquakes along the Japan Trench

    Science.gov (United States)

    Mochizuki, M.; Uehira, K.; Kanazawa, T.; Shiomi, K.; Kunugi, T.; Aoi, S.; Matsumoto, T.; Sekiguchi, S.; Yamamoto, N.; Takahashi, N.; Nakamura, T.; Shinohara, M.; Yamada, T.

    2017-12-01

NIED launched the project of constructing a seafloor observatory network for tsunamis and earthquakes after the 2011 Tohoku Earthquake, to enhance the reliability of early warnings of tsunamis and earthquakes. The observatory network was named "S-net". The S-net project has been financially supported by MEXT. The S-net consists of 150 seafloor observatories connected in line by submarine optical cables; the total length of submarine optical cable is about 5,500 km. The S-net covers the focal region of the 2011 Tohoku Earthquake and its vicinity. Each observatory is equipped with two high-sensitivity pressure gauges serving as tsunami meters and four sets of three-component seismometers. The S-net is composed of six segment networks. Five of the six had already been installed, and installation of the last segment network, covering the outer-rise area, was finished by the end of FY2016. The outer-rise segment has special features unlike the other five segments: deep water and long distance. Most of the 25 observatories on the outer-rise segment are located at depths greater than 6,000 m. In particular, three observatories sit on seafloor deeper than about 7,000 m, and these are therefore equipped with pressure gauges rated for use at up to 8,000 m depth. The total length of the submarine cables of the outer-rise segment is about twice that of the other segments. The longer the cable system, the higher the supply voltage needed, so the observatories on the outer-rise segment have high withstand-voltage characteristics. We employ a low-loss dispersion-managed line, formed by combining a plurality of optical fibers, for the outer-rise segment cable in order to achieve long-distance, high-speed and large-capacity data transmission. Installation of the outer-rise segment is finished and full-scale operation of S-net has started.

  9. Property and instrumental heritage of the Bordeaux Astronomical Observatory; What future?

    Science.gov (United States)

    de La Noë, J.; Charlot, P.; Grousset, F.

    2009-11-01

In the 1870s, the Government of the Third Republic decided to develop scientific and technical research. This effort contributed to supporting and creating universities and other institutes such as astronomical observatories. The dual wish of the Bordeaux council and of professors at the Faculté des Sciences de Bordeaux led to the foundation of the astronomical Observatory of Bordeaux, set up by Georges Rayet in the 1880s. The observatory owns a 12-hectare property with a dozen buildings, five domes each housing an instrument, a Würzburg radiotelescope, a 2.5-meter radiotelescope, and a large collection of about 250 instruments, 4,500 photographic plates, drawings, slides for teaching astronomy, maps of the Carte du Ciel and 200 files of archives. In addition, the library contains about a thousand books from the period 1600-1950. The future of the observatory is not clear at present, as the Laboratoire d'Astrophysique will move to the campus in a few years.

  10. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, a state of the art of the electrolysis modules currently available was drawn up, and a review of the large-scale electrolysis plants installed around the world was carried out. The main projects related to large-scale electrolysis were also listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the cost of hydrogen production by large-scale electrolysis was evaluated. (authors)
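The influence of electricity prices on hydrogen production cost, as evaluated in the study, can be illustrated with a toy levelized-cost model. The specific-energy figure (~50 kWh per kg, a typical alkaline-electrolyser value) and the lumped capital/O&M term are illustrative assumptions, not numbers from the ALPHEA study:

```python
def hydrogen_cost(elec_price_eur_per_mwh, kwh_per_kg=50.0, capex_eur_per_kg=1.0):
    """Toy levelized cost of electrolytic hydrogen in EUR/kg.

    kwh_per_kg: specific electricity consumption (illustrative typical value).
    capex_eur_per_kg: annualized capital and O&M lumped per kg (illustrative).
    """
    energy_cost = elec_price_eur_per_mwh / 1000.0 * kwh_per_kg  # EUR/kWh * kWh/kg
    return energy_cost + capex_eur_per_kg

# Electricity price dominates: doubling it nearly doubles the hydrogen cost.
for price in (20, 50, 100):  # EUR/MWh
    print(price, round(hydrogen_cost(price), 2))
```

The sketch makes the study's qualitative point concrete: at these assumed parameters, the energy term outweighs the capital term for all but very cheap power.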

  11. Eliminating large-scale magnetospheric current perturbations from long-term geomagnetic observatory data

    Science.gov (United States)

    Pick, L.; Korte, M. C.

    2016-12-01

Magnetospheric currents generate the largest external contribution to the geomagnetic field observed on Earth. Of particular importance is the solar-driven effect of the ring current, whose fluctuations overlap with internal-field secular variation (SV). Recent core-field models therefore co-estimate this effect, but their validity is limited to the last 15 years for which satellite data are available. We aim at eliminating the magnetospheric modulation from the whole geomagnetic observatory record from 1840 onwards, in order to obtain a clean long-term SV that will enhance core-flow and geodynamo studies. The ring-current effect takes the form of a southward-directed external dipole field aligned with the geomagnetic main-field axis. Commonly the Dst index (Sugiura, 1964) is used to parametrize temporal variations of this dipole term. Because of baseline instabilities, the alternative RC index was derived from hourly means of 21 stations spanning 1997-2013 (Olsen et al., 2014). We follow their methodology, based on annual means from a reduced station set spanning 1960-2010. The absolute level of the variation so determined is "hidden" in the static lithospheric offsets taken as quiet-time means. We tackle this issue by subtracting crustal biases independently calculated for each observatory from an inversion of combined Swarm satellite and observatory data. Our index reproduces the original annual RC-index variability with a reasonable offset of -10 nT in the reference time window 2000-2010. Prior to that, it depicts a long-term trend consistent with the external dipole term from COV-OBS (Gillet et al., 2013), the only long-term field model available for comparison. Sharper variations that are better correlated with the Ap index than the COV-OBS solution lend support to the usefulness of our initial modeling approach. Following a detailed sensitivity study of station choice, future work will focus on increasing the resolution from annual to hourly means.
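The core of an RC-style index estimate is a least-squares fit of an external axial-dipole amplitude to observatory residuals. A minimal sketch, with invented colatitudes and residuals and an assumed sign convention (Z positive downward, so an external dipole of amplitude q1 contributes Z = q1·cos(colatitude)); the real procedure works on hourly or annual means after removing an internal field model and crustal biases:

```python
import numpy as np

# Hypothetical annual-mean Z-component residuals (nT) at four observatories,
# after an internal field model and station crustal biases have been removed.
colat = np.radians([30.0, 55.0, 80.0, 120.0])   # geomagnetic colatitudes
z_resid = np.array([-17.5, -11.4, -3.6, 10.1])  # residuals, nT (invented)

# External axial dipole: Z = q1 * cos(colatitude) at each station.
design = np.cos(colat)[:, None]
q1, *_ = np.linalg.lstsq(design, z_resid, rcond=None)
print(f"estimated external dipole amplitude: {q1[0]:.1f} nT")
```

A negative amplitude corresponds to the southward-directed external dipole of an enhanced ring current; repeating the fit per epoch yields the index time series.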

  12. Future changes in large-scale transport and stratosphere-troposphere exchange

    Science.gov (United States)

    Abalos, M.; Randel, W. J.; Kinnison, D. E.; Garcia, R. R.

    2017-12-01

Future changes in large-scale transport are investigated in long-term (1955-2099) simulations of the Community Earth System Model - Whole Atmosphere Community Climate Model (CESM-WACCM) under an RCP6.0 climate-change scenario. We examine artificial passive tracers in order to isolate transport changes from future changes in emissions and chemical processes. The model suggests enhanced stratosphere-troposphere exchange (STE) in both directions, with tropospheric tracer concentrations decreasing and stratospheric tracer concentrations increasing in the troposphere. Changes in the different transport processes are evaluated using the Transformed Eulerian Mean continuity equation, including parameterized convective transport. Dynamical changes associated with the rise of the tropopause height are shown to play a crucial role in future transport trends.
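For reference, the Transformed Eulerian Mean tracer continuity equation mentioned above has the following standard log-pressure form in common middle-atmosphere texts; the paper's exact formulation may differ, for instance by carrying the parameterized convective transport as an explicit extra term:

```latex
\frac{\partial \bar{\chi}}{\partial t}
  + \bar{v}^{*}\,\frac{\partial \bar{\chi}}{\partial y}
  + \bar{w}^{*}\,\frac{\partial \bar{\chi}}{\partial z}
  = \frac{1}{\rho_{0}}\,\nabla\cdot\left(\rho_{0}\,\mathbf{M}\right) + \bar{S}
```

Here $\bar{\chi}$ is the zonal-mean tracer mixing ratio, $(\bar{v}^{*},\bar{w}^{*})$ the residual-mean meridional circulation, $\mathbf{M}$ the eddy tracer flux vector, $\rho_{0}$ the background density, and $\bar{S}$ the sources and sinks (zero for the artificial passive tracers used in the study).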

  13. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

Pavel Ambrož, Astronomical Institute of the Academy of Sciences of the Czech Republic, CZ-25165 Ondřejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöhe Solar Observatory of the University of Graz, A-9521 Treffen, Austria. e-mail: schroll@solobskh.ac.at.

  14. Assessment of present and future large-scale semiconductor detector systems

    International Nuclear Information System (INIS)

    Spieler, H.G.; Haller, E.E.

    1984-11-01

The performance of large-scale semiconductor detector systems is assessed with respect to their theoretical potential and to the practical limitations imposed by processing techniques, readout electronics and radiation damage. In addition to devices which detect reaction products directly, the analysis includes photodetectors for scintillator arrays. Beyond present technology, we also examine currently evolving structures and techniques which show potential for producing practical devices in the foreseeable future.

  15. Architecture for large-scale automatic web accessibility evaluation based on the UWEM methodology

    DEFF Research Database (Denmark)

    Ulltveit-Moe, Nils; Olsen, Morten Goodwin; Pillai, Anand B.

    2008-01-01

The European Internet Accessibility Observatory (EIAO) project has developed an Observatory for performing large-scale automatic web accessibility evaluations of public-sector web sites in Europe. The architecture includes a distributed web crawler that crawls web sites for links until either a given budget … of web pages have been identified or the web site has been crawled exhaustively. Subsequently, a uniform random subset of the crawled web pages is sampled and sent for accessibility evaluation, and the evaluation results are stored in a Resource Description Framework (RDF) database that is later loaded … challenges that the project faced and the solutions developed towards building a system capable of regular large-scale accessibility evaluations with sufficient capacity and stability. It also outlines some possible future architectural improvements.
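Drawing a uniform random subset from a crawl stream whose total size is unknown in advance is a textbook fit for reservoir sampling. A minimal sketch (the abstract does not say which sampling algorithm EIAO actually used, and the URL scheme below is invented):

```python
import random

def sample_pages(crawled_urls, k, seed=0):
    """Return a uniform random sample of k items from an iterable of unknown
    length, using single-pass reservoir sampling (Algorithm R)."""
    rng = random.Random(seed)
    reservoir = []
    for i, url in enumerate(crawled_urls):
        if i < k:
            reservoir.append(url)          # fill the reservoir first
        else:
            j = rng.randint(0, i)          # each item kept with prob k/(i+1)
            if j < k:
                reservoir[j] = url
    return reservoir

# Usage: the crawler can feed URLs as a generator; no need to hold the
# whole crawl in memory before sampling.
urls = (f"http://example.org/page{i}" for i in range(10_000))
sample = sample_pages(urls, k=100)
print(len(sample))  # 100
```

Single-pass sampling matters here because the crawl may terminate either on budget exhaustion or on exhausting the site, so the population size is only known at the end.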

  16. Prototyping a large-scale distributed system for the Great Observatories era - NASA Astrophysics Data System (ADS)

    Science.gov (United States)

    Shames, Peter

    1990-01-01

The NASA Astrophysics Data System (ADS) is a distributed information system intended to support research in the Great Observatories era, to simplify access to data, and to enable simultaneous analyses of multispectral data sets. Here, the user agent and interface, its functions, and the system components are examined, and the system architecture and infrastructure are addressed. The present status of the system and related future activities are also discussed.

  17. Large scale anisotropy studies with the Auger Observatory

    International Nuclear Information System (INIS)

    Santos, E.M.; Letessier-Selvon, A.

    2006-01-01

With the increasing Auger surface-array data sample of the highest-energy cosmic rays, large-scale anisotropy studies in this part of the spectrum become a promising path towards understanding the origin of ultra-high-energy cosmic particles. We describe the methods underlying the search for distortions in the cosmic-ray arrival directions over large angular scales, that is, larger than those commonly employed in the search for correlations with point-like sources. The widely used tools, known as coverage maps, are described, and some of the issues involved in their calculation are presented through Monte Carlo based studies. Coverage computation requires a deep knowledge of the local detection efficiency, including the influence of weather parameters like temperature and pressure. Particular attention is devoted to a newly proposed method to extract the coverage, based upon the assumption of time factorization of an extensive-air-shower detector's acceptance. We use Auger monitoring data to test the validity of this hypothesis. We finally show the necessity of using more than one coverage map to extract any possible anisotropic pattern on the sky, by pointing to some of the biases present in commonly used methods based, for example, on the scrambling of the UTC arrival times of each event. (author)
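The UTC-scrambling idea mentioned at the end can be sketched in a few lines: shuffling arrival times among events while keeping their detector-frame coordinates destroys genuine sky anisotropy but preserves the detector acceptance, so the averaged scrambled sky serves as a coverage estimate. Everything below is illustrative: the event sample is synthetic and the time-to-right-ascension mapping is a toy (15°/hour), where a real analysis would use proper local sidereal time and full local angles:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic events: arrival UTC times (hours) and a detector-frame angle (deg).
n = 5000
utc = rng.uniform(0.0, 24.0, n)
hour_angle = rng.normal(0.0, 30.0, n)   # illustrative stand-in for local coords

def right_ascension(utc_h, ha_deg):
    # Toy mapping: sidereal angle advances ~15 deg per hour.
    return (15.0 * utc_h - ha_deg) % 360.0

# Coverage map: average the RA histogram over many time scrambles.
n_scrambles = 100
bins = np.linspace(0.0, 360.0, 25)
coverage = np.zeros(len(bins) - 1)
for _ in range(n_scrambles):
    shuffled = rng.permutation(utc)                   # scramble times only
    ra = right_ascension(shuffled, hour_angle)
    coverage += np.histogram(ra, bins=bins)[0]
coverage /= n_scrambles

# Anisotropy estimate: relative excess of the data sky over the coverage.
data_hist = np.histogram(right_ascension(utc, hour_angle), bins=bins)[0]
relative_excess = data_hist / coverage - 1.0
```

The abstract's caveat applies directly to this sketch: because the scramble reuses the observed time distribution, a genuine large-scale anisotropy partly leaks into the coverage itself, which is one of the biases the paper discusses.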

  18. The World Space Observatory Ultraviolet (WSO-UV), as a bridge to future UV astronomy

    Science.gov (United States)

    Shustov, B.; Gómez de Castro, A. I.; Sachkov, M.; Vallejo, J. C.; Marcos-Arenal, P.; Kanev, E.; Savanov, I.; Shugarov, A.; Sichevskii, S.

    2018-04-01

Ultraviolet (UV) astronomy is a vital branch of space astronomy. Many dozens of short-term UV experiments in space, as well as long-term observatories, have brought very important knowledge on the physics and chemistry of the Universe during the last decades. Unfortunately, no large UV observatories are planned to be launched by most space agencies in the coming 10-15 years; the large UVOIR observatories of the future will appear no earlier than the 2030s. This paper briefly describes the projects that have been proposed by various groups. We conclude that the World Space Observatory-Ultraviolet (WSO-UV) will be the only 2-m-class UV telescope with capabilities similar to those of the HST for the next decade. The WSO-UV has been described in detail in previous publications; this paper updates the main characteristics of its instruments and the current state of the whole project. It also addresses the major science topics included in the core program of the WSO-UV, making this core program very relevant to the current state of UV astronomy. Finally, we also present the ground-segment architecture that will implement this program.

  19. Moving toward queue operations at the Large Binocular Telescope Observatory

    Science.gov (United States)

    Edwards, Michelle L.; Summers, Doug; Astier, Joseph; Suarez Sola, Igor; Veillet, Christian; Power, Jennifer; Cardwell, Andrew; Walsh, Shane

    2016-07-01

The Large Binocular Telescope Observatory (LBTO), a joint scientific venture between the Istituto Nazionale di Astrofisica (INAF), the LBT Beteiligungsgesellschaft (LBTB), the University of Arizona, Ohio State University (OSU), and the Research Corporation, is one of the newest additions to the world's collection of large optical/infrared ground-based telescopes. With its unique twin 8.4 m mirror design providing a 22.8 meter interferometric baseline and the collecting area of an 11.8 m telescope, LBT has a window of opportunity to exploit its singular status as the "first" of the next generation of Extremely Large Telescopes (ELTs). Prompted by the urgency to maximize scientific output during this favorable interval, LBTO recently re-evaluated its operations model and developed a new strategy that augments classical observing with queue observing. Aided by trained observatory staff, queue mode will allow for flexible, multi-instrument observing responsive to site conditions. Our plan is to implement a staged rollout that will provide many of the benefits of queue observing sooner rather than later, with more bells and whistles coming in future stages. In this paper, we outline LBTO's new scientific model, focusing specifically on our "lean" resourcing and development, reuse and adaptation of existing software, challenges presented by our one-of-a-kind binocular operations, and lessons learned. We also outline further stages of development and our ultimate goals for queue.

  20. Early laser operations at the Large Binocular Telescope Observatory

    Science.gov (United States)

    Rahmer, Gustavo; Lefebvre, Michael; Christou, Julian; Raab, Walfried; Rabien, Sebastian; Ziegleder, Julian; Borelli, José L.; Gässler, Wolfgang

    2014-08-01

ARGOS is the GLAO (Ground-Layer Adaptive Optics) Rayleigh-based LGS (Laser Guide Star) facility for the Large Binocular Telescope Observatory (LBTO). It is dedicated to observations with LUCI1 and LUCI2, LBTO's pair of NIR imagers and multi-object spectrographs. The system projects three laser beams from the back of each of the two secondary-mirror units, which create two constellations circumscribed on circles of 2 arcmin radius with 120-degree spacing. Each of the six Nd:YAG lasers provides a beam of green (532 nm) pulses at a rate of 10 kHz with a power of 14 W to 18 W. We achieved first on-sky propagation on the night of November 5, 2013, and commissioning of the full system will take place during 2014. We present the initial results of laser operations at the observatory, including safety procedures and the required coordination with external agencies (FAA, Space Command, and Military Airspace Manager). We also describe our operational procedures and report on our experiences with aircraft spotters. Future plans for safer and more efficient aircraft monitoring and detection are discussed.

  1. Important aspects of Eastern Mediterranean large-scale variability revealed from data of three fixed observatories

    Science.gov (United States)

    Bensi, Manuel; Velaoras, Dimitris; Cardin, Vanessa; Perivoliotis, Leonidas; Pethiakis, George

    2015-04-01

Long-term variations of temperature and salinity observed in the Adriatic and Aegean Seas seem to be regulated by larger-scale circulation modes of the Eastern Mediterranean (EMed) Sea, such as the recently discovered feedback mechanisms, namely the BiOS (Bimodal Oscillating System) and the internal thermohaline pump theories. These theories are the result of interpreting many years of observations, highlighting possible interactions between two key regions of the EMed. Although repeated oceanographic cruises carried out in the past or planned for the future are a very useful tool for understanding the interaction between the two basins (e.g. alternating dense-water formation, salt ingressions), recent long time series of high-frequency (up to 1 h) sampling have added valuable information to the interpretation of internal mechanisms in both areas (i.e. mesoscale eddies, evolution of fast internal processes, etc.). During the last 10 years, three deep observatories were deployed and maintained in the Adriatic, Ionian, and Aegean Seas: respectively, the E2-M3A, the Pylos, and the E1-M3A. All are part of the largest European network of Fixed Point Open Ocean Observatories (FixO3, http://www.fixo3.eu/). Herein, from the analysis of temperature, salinity, and potential-density time series collected at the three sites from the surface down to the intermediate and deep layers, we will discuss the almost perfectly anti-correlated behavior of the Adriatic and the Aegean Seas. Our data, collected almost continuously since 2006, reveal that these observatories represent well the thermohaline variability of their own areas. Interestingly, temperature and salinity in the intermediate layer suddenly increased in the South Adriatic from the end of 2011, exactly when they started decreasing in the Aegean Sea. Moreover, Pylos data used together with additional ones (e.g. absolute dynamic topography, temperature and salinity data from other platforms) collected

  2. The LAGO (Large Aperture GRB Observatory) in Peru

    Science.gov (United States)

    Tueros-Cuadros, E.; Otiniano, L.; Chirinos, J.; Soncco, C.; Guevara-Day, W.

    2012-07-01

The Large Aperture GRB Observatory is a continent-wide observatory devised to detect the high-energy (around 100 GeV) component of Gamma-Ray Bursts (GRBs) by using the single-particle technique in arrays of Water Cherenkov Detectors (WCDs) at high-mountain sites in Argentina, Bolivia, Colombia, Guatemala, Mexico, Venezuela and Peru. Details of the installation and operation of the detectors at Marcapomacocha, Peru, at 4550 m a.s.l. are given. The detector calibration method is also described.
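The single-particle technique reduces, at its simplest, to looking for a counting-rate excess over the steady background in short time bins. A toy sketch with invented rates (real LAGO analyses use more careful statistics, e.g. Li & Ma 1983, and account for atmospheric rate modulation):

```python
import math

def burst_significance(observed, expected):
    """Toy significance of a counting excess in one time bin: (N - B) / sqrt(B).
    Valid only for large expected counts; invented numbers below."""
    return (observed - expected) / math.sqrt(expected)

# Illustration: a WCD counting ~10,000 background particles per time bin
# records a brief excess during a hypothetical GRB.
background = 10_000.0
observed = 10_550.0
print(f"{burst_significance(observed, background):.1f} sigma")  # 5.5 sigma
```

The point of the technique is that no shower reconstruction is needed: every detector simply counts single particles, and a GRB appears as a simultaneous rate excess across the array.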

  3. Expected Future Conditions for Secure Power Operation with Large Scale of RES Integration

    International Nuclear Information System (INIS)

    Majstrovic, G.; Majstrovic, M.; Sutlovic, E.

    2015-01-01

    EU energy strategy is strongly focused on the large scale integration of renewable energy sources. The most dominant part here is taken by variable sources - wind power plants. Grid integration of intermittent sources along with keeping the system stable and secure is one of the biggest challenges for the TSOs. This part is often neglected by the energy policy makers, so this paper deals with expected future conditions for secure power system operation with large scale wind integration. It gives an overview of expected wind integration development in EU, as well as expected P/f regulation and control needs. The paper is concluded with several recommendations. (author).

  4. Toward a global multi-scale heliophysics observatory

    Science.gov (United States)

    Semeter, J. L.

    2017-12-01

    We live within the only known stellar-planetary system that supports life. What we learn about this system is not only relevant to human society and its expanding reach beyond Earth's surface, but also to our understanding of the origins and evolution of life in the universe. Heliophysics is focused on solar-terrestrial interactions mediated by the magnetic and plasma environment surrounding the planet. A defining feature of energy flow through this environment is interaction across physical scales. A solar disturbance aimed at Earth can excite geospace variability on scales ranging from thousands of kilometers (e.g., global convection, region 1 and 2 currents, electrojet intensifications) to tens of meters (e.g., equatorial spread-F, dispersive Alfven waves, plasma instabilities). Most "geospace observatory" concepts are focused on a single modality (e.g., HF/UHF radar, magnetometer, optical) providing a limited parameter set at a particular spatiotemporal resolution. Data assimilation methods have been developed to couple heterogeneous and distributed observations, but resolution has typically been prescribed a priori and according to physical assumptions. This paper develops a conceptual framework for the next generation multi-scale heliophysics observatory, capable of revealing and quantifying the complete spectrum of cross-scale interactions occurring globally within the geospace system. The envisioned concept leverages existing assets, enlists citizen scientists, and exploits low-cost access to the geospace environment. Examples are presented where distributed multi-scale observations have resulted in substantial new insight into the inner workings of our stellar-planetary system.

  5. The Large Observatory For x-ray Timing

    DEFF Research Database (Denmark)

    Feroci, M.; Herder, J. W. den; Bozzo, E.

    2014-01-01

    The Large Observatory For x-ray Timing (LOFT) was studied within ESA M3 Cosmic Vision framework and participated in the final down-selection for a launch slot in 2022-2024. Thanks to the unprecedented combination of effective area and spectral resolution of its main instrument, LOFT will study th...

  6. Developing a NASA strategy for the verification of large space telescope observatories

    Science.gov (United States)

    Crooke, Julie A.; Gunderson, Johanna A.; Hagopian, John G.; Levine, Marie

    2006-06-01

    In July 2005, the Office of Program Analysis and Evaluation (PA&E) at NASA Headquarters was directed to develop a strategy for verification of the performance of large space telescope observatories, which occurs predominantly in a thermal vacuum test facility. A mission model of the expected astronomical observatory missions over the next 20 years was identified along with performance, facility and resource requirements. Ground testing versus alternatives was analyzed to determine the pros, cons and break points in the verification process. Existing facilities and their capabilities were examined across NASA, industry and other government agencies as well as the future demand for these facilities across NASA's Mission Directorates. Options were developed to meet the full suite of mission verification requirements, and performance, cost, risk and other analyses were performed. Findings and recommendations from the study were presented to the NASA Administrator and the NASA Strategic Management Council (SMC) in February 2006. This paper details the analysis, results, and findings from this study.

  7. The effect of the geomagnetic field on cosmic ray energy estimates and large scale anisotropy searches on data from the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Abreu, P.; /Lisbon, IST; Aglietta, M.; /IFSI, Turin; Ahn, E.J.; /Fermilab; Albuquerque, I.F.M.; /Sao Paulo U.; Allard, D.; /APC, Paris; Allekotte, I.; /Centro Atomico Bariloche; Allen, J.; /New York U.; Allison, P.; /Ohio State U.; Alvarez Castillo, J.; /Mexico U., ICN; Alvarez-Muniz, J.; /Santiago de Compostela U.; Ambrosio, M.; /Naples U. /INFN, Naples /Nijmegen U., IMAPP

    2011-11-01

    We present a comprehensive study of the influence of the geomagnetic field on the energy estimation of extensive air showers with a zenith angle smaller than 60°, detected at the Pierre Auger Observatory. The geomagnetic field induces an azimuthal modulation of the estimated energy of cosmic rays up to the ~2% level at large zenith angles. We present a method to account for this modulation of the reconstructed energy. We analyse the effect of the modulation on large scale anisotropy searches in the arrival direction distributions of cosmic rays. At a given energy, the geomagnetic effect is shown to induce a pseudo-dipolar pattern at the percent level in the declination distribution that needs to be accounted for. In this work, we have identified and quantified a systematic uncertainty affecting the energy determination of cosmic rays detected by the surface detector array of the Pierre Auger Observatory. This systematic uncertainty, induced by the influence of the geomagnetic field on the shower development, has a strength which depends on both the zenith and the azimuthal angles. Consequently, we have shown that it induces distortions of the estimated cosmic ray event rate at a given energy at the percent level in both the azimuthal and the declination distributions, the latter of which mimics an almost dipolar pattern. We have also shown that the induced distortions are already at the level of the statistical uncertainties for a number of events N ≈ 32 000 (we note that the full Auger surface detector array collects about 6500 events per year with energies above 3 EeV). Accounting for these effects is thus essential for the correct interpretation of large scale anisotropy measurements that explicitly exploit the declination distribution.
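An azimuthal energy modulation of the kind described above is typically removed by dividing the estimated energy by the modulation factor. A minimal sketch, where the cosine form, the zenith scaling, and the reference azimuth are illustrative assumptions, not the Auger parameterization:

```python
import math

def correct_energy(e_est, zenith_deg, azimuth_deg, g60=0.02, phi0_deg=0.0):
    """Remove an assumed azimuthal modulation of the estimated energy,
    E_est = E_true * (1 + g(theta) * cos(phi - phi0)), where the
    amplitude g grows with zenith angle and reaches ~2% at theta = 60 deg
    (the level quoted in the abstract). Returns the corrected energy."""
    theta = math.radians(zenith_deg)
    # assumed sin^2(theta) growth, normalized so g(60 deg) = g60
    g = g60 * math.sin(theta) ** 2 / math.sin(math.radians(60.0)) ** 2
    modulation = 1.0 + g * math.cos(math.radians(azimuth_deg - phi0_deg))
    return e_est / modulation

# A 10.2 EeV estimate at 60 deg zenith, aligned with the modulation
# maximum, corrects down to 10.0 EeV.
e_corr = correct_energy(10.2, 60.0, 0.0)
```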

  8. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    Science.gov (United States)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half-century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns about their negative impacts, such as high initial costs and damage to our ecosystems (e.g. river environment and species) and socio-economic fabric (e.g. relocation and socio-economic changes of affected people), have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates has inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. Then, it discusses some major challenges in future water planning

  9. FutureGen 2.0 Oxy-combustion Large Scale Test – Final Report

    Energy Technology Data Exchange (ETDEWEB)

    Kenison, LaVesta [URS, Pittsburgh, PA (United States); Flanigan, Thomas [URS, Pittsburgh, PA (United States); Hagerty, Gregg [URS, Pittsburgh, PA (United States); Gorrie, James [Air Liquide, Kennesaw, GA (United States); Leclerc, Mathieu [Air Liquide, Kennesaw, GA (United States); Lockwood, Frederick [Air Liquide, Kennesaw, GA (United States); Falla, Lyle [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Macinnis, Jim [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Fedak, Mathew [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Yakle, Jeff [Babcock & Wilcox and Burns McDonnell, Kansas City, MO (United States); Williford, Mark [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States); Wood, Paul [Futuregen Industrial Alliance, Inc., Morgan County, IL (United States)

    2016-04-01

    The primary objectives of the FutureGen 2.0 CO2 Oxy-Combustion Large Scale Test Project were to site, permit, design, construct, and commission an oxy-combustion boiler, gas quality control system, air separation unit, and CO2 compression and purification unit, together with the necessary supporting and interconnection utilities. The project was to demonstrate at commercial scale (168 MWe gross) the capability to cleanly produce electricity through coal combustion at a retrofitted, existing coal-fired power plant, thereby resulting in near-zero emissions of all commonly regulated air emissions, as well as 90% CO2 capture in steady-state operations. The project was to be fully integrated in terms of project management, capacity, capabilities, technical scope, cost, and schedule with the companion FutureGen 2.0 CO2 Pipeline and Storage Project, a separate but complementary project whose objective was to safely transport, permanently store and monitor the CO2 captured by the Oxy-combustion Power Plant Project. The FutureGen 2.0 Oxy-Combustion Large Scale Test Project successfully achieved all technical objectives inclusive of front-end engineering and design, and the advanced design required to accurately estimate and contract for the construction, commissioning, and start-up of a commercial-scale "ready to build" power plant using oxy-combustion technology, including full integration with the companion CO2 Pipeline and Storage project. Ultimately the project did not proceed to construction due to insufficient time to complete necessary EPC contract negotiations and commercial financing prior to expiration of federal co-funding, which triggered a DOE decision to close out its participation in the project. Through the work that was completed, valuable technical, commercial, and programmatic lessons were learned. This project has significantly advanced the development of near-zero emission technology and will

  10. Large scale anisotropy studies of ultra high energy cosmic rays using data taken with the surface detector of the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Grigat, Marius

    2011-06-10

    The distribution of arrival directions of cosmic rays is remarkably uniform over the complete spectrum of energies. At large angular scales only tiny deviations from isotropy have been observed and huge statistics are required to quantify the corresponding amplitudes. The measurement of cosmic rays with energies above 10^15 eV is only feasible with large, earthbound observatories: The cosmic ray primary particles initiate cascades of secondary particles in the Earth's atmosphere. Every aspect of the development of these air showers down to the measurement of the resulting particles at ground level needs to be well understood and controlled in order to precisely reconstruct the properties of the primary particle. The development of air showers is subject to systematic distortions caused by the magnetic field of the Earth. Both this and other local effects are capable of inducing false anisotropy into the distribution of arrival directions. In this thesis, the effect of the geomagnetic field on the energy measurement is modelled and quantified; consequently, a correction of the energy estimator is derived. Furthermore, a method is introduced to fit dipolar patterns to the distribution of arrival directions of cosmic rays as observed from the field of view of the surface detector of the Pierre Auger Observatory. After correcting for all relevant local effects the method is applied to data and the parameters of a potentially underlying dipole are determined and evaluated. (orig.)
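The classical first step toward a dipole fit like the one described above is a Rayleigh analysis: fitting the first harmonic in right ascension, which is simple because a ground array's exposure is nearly uniform in RA. This is a simplified stand-in for the thesis' full dipole fit (which also corrects for local effects):

```python
import math

def first_harmonic(right_ascensions_deg):
    """First-harmonic (Rayleigh) analysis in right ascension.
    Returns (amplitude, phase_deg): the amplitude of the best-fit
    first harmonic in the RA distribution and the RA of its maximum."""
    n = len(right_ascensions_deg)
    # Fourier coefficients of the first harmonic
    a = sum(math.cos(math.radians(ra)) for ra in right_ascensions_deg) * 2.0 / n
    b = sum(math.sin(math.radians(ra)) for ra in right_ascensions_deg) * 2.0 / n
    amplitude = math.hypot(a, b)
    phase_deg = math.degrees(math.atan2(b, a)) % 360.0
    return amplitude, phase_deg

# Degenerate check: events all at RA = 10 deg give the maximal
# amplitude 2 with phase 10 deg.
amp, phase = first_harmonic([10.0] * 100)
```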

  12. Large scale anisotropy studies of ultra high energy cosmic rays using data taken with the surface detector of the Pierre Auger Observatory

    International Nuclear Information System (INIS)

    Grigat, Marius

    2011-01-01

    The distribution of arrival directions of cosmic rays is remarkably uniform over the complete spectrum of energies. At large angular scales only tiny deviations from isotropy have been observed and huge statistics are required to quantify the corresponding amplitudes. The measurement of cosmic rays with energies above 10^15 eV is only feasible with large, earthbound observatories: The cosmic ray primary particles initiate cascades of secondary particles in the Earth's atmosphere. Every aspect of the development of these air showers down to the measurement of the resulting particles at ground level needs to be well understood and controlled in order to precisely reconstruct the properties of the primary particle. The development of air showers is subject to systematic distortions caused by the magnetic field of the Earth. Both this and other local effects are capable of inducing false anisotropy into the distribution of arrival directions. In this thesis, the effect of the geomagnetic field on the energy measurement is modelled and quantified; consequently, a correction of the energy estimator is derived. Furthermore, a method is introduced to fit dipolar patterns to the distribution of arrival directions of cosmic rays as observed from the field of view of the surface detector of the Pierre Auger Observatory. After correcting for all relevant local effects the method is applied to data and the parameters of a potentially underlying dipole are determined and evaluated. (orig.)

  13. Observing trans-Planckian ripples in the primordial power spectrum with future large scale structure probes

    DEFF Research Database (Denmark)

    Hamann, Jan; Hannestad, Steen; Sloth, Martin Snoager

    2008-01-01

    We revisit the issue of ripples in the primordial power spectra caused by trans-Planckian physics, and the potential for their detection by future cosmological probes. We find that for reasonably large values of the first slow-roll parameter epsilon (> 0.001), a positive detection of trans-Planckian ripples can be made even if the amplitude is as low as 10^-4. Data from the Large Synoptic Survey Telescope (LSST) and the proposed future 21 cm survey with the Fast Fourier Transform Telescope (FFTT) will be particularly useful in this regard. If the scale of inflation is close to its present upper bound...

  14. Large high altitude air shower observatory (LHAASO) project

    International Nuclear Information System (INIS)

    He Huihai

    2010-01-01

    The Large High Altitude Air Shower Observatory (LHAASO) project focuses mainly on the study of 40 GeV-1 PeV gamma ray astronomy and 10 TeV-1 EeV cosmic ray physics. It consists of a 1 km² extensive air shower array with 40 000 m² of muon detectors, a 90 000 m² water Cherenkov detector array, a 5 000 m² shower core detector array and an air Cherenkov/fluorescence telescope array. Prototype detectors are designed, with some of them already in operation. A prototype array 1% the size of LHAASO will be built at the Yangbajing Cosmic Ray Observatory and used to measure cosmic rays in coincidence with the ARGO-YBJ experiment. (authors)

  15. Large-scale theoretical calculations in molecular science - design of a large computer system for molecular science and necessary conditions for future computers

    Energy Technology Data Exchange (ETDEWEB)

    Kashiwagi, H [Institute for Molecular Science, Okazaki, Aichi (Japan)

    1982-06-01

    A large computer system was designed and established for molecular science under the leadership of molecular scientists. Features of the computer system are an automated operation system and an open self-service system. Large-scale theoretical calculations have been performed to solve many problems in molecular science, using the computer system. Necessary conditions for future computers are discussed on the basis of this experience.

  16. Large-scale theoretical calculations in molecular science - design of a large computer system for molecular science and necessary conditions for future computers

    International Nuclear Information System (INIS)

    Kashiwagi, H.

    1982-01-01

    A large computer system was designed and established for molecular science under the leadership of molecular scientists. Features of the computer system are an automated operation system and an open self-service system. Large-scale theoretical calculations have been performed to solve many problems in molecular science, using the computer system. Necessary conditions for future computers are discussed on the basis of this experience. (orig.)

  17. The future of primordial features with large-scale structure surveys

    International Nuclear Information System (INIS)

    Chen, Xingang; Namjoo, Mohammad Hossein; Dvorkin, Cora; Huang, Zhiqi; Verde, Licia

    2016-01-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys on the detection and constraints of these features. We classify primordial feature models into several classes, and for each class we present a simple template of power spectrum that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, thanks to the 3D information. For a broad range of models, these surveys will be able to reduce the errors of the amplitudes of the features by a factor of 5 or more, including several interesting candidates identified in the recent Planck data. Therefore, LSS surveys offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of both types of surveys.

  18. The future of primordial features with large-scale structure surveys

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Xingang; Namjoo, Mohammad Hossein [Institute for Theory and Computation, Harvard-Smithsonian Center for Astrophysics, 60 Garden Street, Cambridge, MA 02138 (United States); Dvorkin, Cora [Department of Physics, Harvard University, Cambridge, MA 02138 (United States); Huang, Zhiqi [School of Physics and Astronomy, Sun Yat-Sen University, 135 Xingang Xi Road, Guangzhou, 510275 (China); Verde, Licia, E-mail: xingang.chen@cfa.harvard.edu, E-mail: dvorkin@physics.harvard.edu, E-mail: huangzhq25@sysu.edu.cn, E-mail: mohammad.namjoo@cfa.harvard.edu, E-mail: liciaverde@icc.ub.edu [ICREA and ICC-UB, University of Barcelona (IEEC-UB), Marti i Franques, 1, Barcelona 08028 (Spain)

    2016-11-01

    Primordial features are one of the most important extensions of the Standard Model of cosmology, providing a wealth of information on the primordial Universe, ranging from discrimination between inflation and alternative scenarios, new particle detection, to fine structures in the inflationary potential. We study the prospects of future large-scale structure (LSS) surveys on the detection and constraints of these features. We classify primordial feature models into several classes, and for each class we present a simple template of power spectrum that encodes the essential physics. We study how well the most ambitious LSS surveys proposed to date, including both spectroscopic and photometric surveys, will be able to improve the constraints with respect to the current Planck data. We find that these LSS surveys will significantly improve the experimental sensitivity to feature signals that are oscillatory in scale, thanks to the 3D information. For a broad range of models, these surveys will be able to reduce the errors of the amplitudes of the features by a factor of 5 or more, including several interesting candidates identified in the recent Planck data. Therefore, LSS surveys offer an impressive opportunity for primordial feature discovery in the next decade or two. We also compare the advantages of both types of surveys.
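A feature template of the kind the abstract describes is typically a smooth power-law spectrum with a small superimposed oscillation. The sketch below uses a sharp-feature form (oscillation linear in k); the parameter names and values are illustrative, not the paper's templates:

```python
import math

def feature_template(k, k_pivot=0.05, amp_s=2.1e-9, n_s=0.965,
                     a_osc=0.01, omega=10.0, phi=0.0):
    """Toy primordial power spectrum with a sharp-feature oscillation:
    P(k) = P0(k) * (1 + A * sin(omega * k / k_pivot + phi)),
    where P0 is the standard power law amp_s * (k / k_pivot)^(n_s - 1).
    k is in the same units as k_pivot (conventionally 1/Mpc)."""
    p0 = amp_s * (k / k_pivot) ** (n_s - 1.0)
    return p0 * (1.0 + a_osc * math.sin(omega * k / k_pivot + phi))

# At the pivot scale the smooth part reduces to amp_s, modulated by
# the oscillation evaluated at omega.
pk = feature_template(0.05)
```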

  19. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  20. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (e.g., in number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  1. Globalizing Lessons Learned from Regional-scale Observatories

    Science.gov (United States)

    Glenn, S. M.

    2016-02-01

    The Mid Atlantic Regional Association Coastal Ocean Observing System (MARACOOS) has accumulated a decade of experience designing, building and operating a Regional Coastal Ocean Observing System for the U.S. Integrated Ocean Observing System (IOOS). MARACOOS serves societal goals and supports scientific discovery at the scale of a Large Marine Ecosystem (LME). Societal themes include maritime safety, ecosystem decision support, coastal inundation, water quality and offshore energy. Scientific results that feed back on societal goals with better products include improved understanding of seasonal transport pathways and their impact on phytoplankton blooms and hypoxia, seasonal evolution of the subsurface Mid Atlantic Cold Pool and its impact on fisheries, biogeochemical transformations in coastal plumes, coastal ocean evolution and impact on hurricane intensities, and storm sediment transport pathways. As the global ocean observing requirements grow to support additional societal needs for information on fisheries and aquaculture, ocean acidification and deoxygenation, water quality and offshore development, global observing will necessarily evolve to include more coastal observations and forecast models at the scale of the world's many LMEs. Here we describe our efforts to share lessons learned between the observatory operators at the regional scale of the LMEs. Current collaborators are spread across Europe, and also include Korea, Indonesia, Australia, Brazil and South Africa. Specific examples include the development of a world standard QA/QC approach for HF Radar data that will foster the sharing of data between countries, basin-scale underwater glider missions between internationally distributed glider ports to develop a shared understanding of operations and an ongoing evaluation of the global ocean models in which the regional models for the LME will be nested, and joint training programs to develop the distributed teams of scientists and technicians

  2. A new proposed approach for future large-scale de-carbonization coal-fired power plants

    International Nuclear Information System (INIS)

    Xu, Gang; Liang, Feifei; Wu, Ying; Yang, Yongping; Zhang, Kai; Liu, Wenyi

    2015-01-01

    The post-combustion CO2 capture technology provides a feasible and promising method for large-scale CO2 capture in coal-fired power plants. However, large-scale CO2 capture in conventionally designed coal-fired power plants is confronted with various problems, such as the selection of the steam extraction point and steam parameter mismatch. To resolve these problems, an improved design idea for the future coal-fired power plant with large-scale de-carbonization is proposed. A main characteristic of the proposed design is the adoption of a back-pressure steam turbine, which extracts suitable steam for CO2 capture and ensures the stability of the integrated system. A new let-down steam turbine generator is introduced to retrieve the surplus energy from the exhaust steam of the back-pressure steam turbine when CO2 capture is cut off. Results show that the net plant efficiency of the improved design is 2.56 percentage points higher than that of the conventional one when the CO2 capture ratio reaches 80%. Meanwhile, the net plant efficiency of the improved design remains at the same level as that of the conventional design when CO2 capture is cut off. Finally, the match between the extracted steam and the heat demand of the reboiler is significantly improved, which solves the steam parameter mismatch problem. The techno-economic analysis indicates that the proposed design is a cost-effective approach for large-scale CO2 capture in coal-fired power plants. - Highlights: • Problems caused by CO2 capture in the power plant are deeply analyzed. • An improved design idea for coal-fired power plants with CO2 capture is proposed. • Thermodynamic, exergy and techno-economic analyses are quantitatively conducted. • Energy-saving effects are found in the proposed coal-fired power plant design idea

  3. The LOFT (Large Observatory for X-ray Timing) background simulations

    DEFF Research Database (Denmark)

    Campana, R.; Feroci, M.; Del Monte, E.

    2012-01-01

    The Large Observatory For X-ray Timing (LOFT) is an innovative medium-class mission selected for an assessment phase in the framework of the ESA M3 Cosmic Vision call. LOFT is intended to answer fundamental questions about the behavior of matter in the very strong gravitational and magnetic fields...

  4. Rapid Large Scale Reprocessing of the ODI Archive using the QuickReduce Pipeline

    Science.gov (United States)

    Gopu, A.; Kotulla, R.; Young, M. D.; Hayashi, S.; Harbeck, D.; Liu, W.; Henschel, R.

    2015-09-01

    The traditional model of astronomers collecting their observations as raw instrument data is being increasingly replaced by astronomical observatories serving standard calibrated data products to observers and to the public at large once proprietary restrictions are lifted. For this model to be effective, observatories need the ability to periodically re-calibrate archival data products as improved master calibration products or pipeline improvements become available, and also to allow users to rapidly calibrate their data on-the-fly. Traditional astronomy pipelines are heavily I/O dependent and do not scale with increasing data volumes. In this paper, we present the One Degree Imager - Portal, Pipeline and Archive (ODI-PPA) calibration pipeline framework, which integrates the efficient and parallelized QuickReduce pipeline to enable a large number of simultaneous, parallel data reduction jobs - initiated by operators and/or users - while also ensuring rapid processing times and full data provenance. Our integrated pipeline system allows re-processing of the entire ODI archive (~15,000 raw science frames, ~3.0 TB compressed) within ~18 hours using twelve 32-core compute nodes on the Big Red II supercomputer. Our flexible, fast, easy to operate, and highly scalable framework improves access to ODI data, in particular when data rates double with an upgraded focal plane (scheduled for 2015), and also serves as a template for future data processing infrastructure across the astronomical community and beyond.
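The throughput figures in the abstract imply a concrete per-core processing rate, which is worth working out when sizing a reprocessing campaign. A back-of-envelope sketch using only the numbers quoted above (the per-core rate assumes perfectly even load across cores, which real pipelines only approximate):

```python
def reprocessing_rate(frames=15000, hours=18.0, nodes=12, cores_per_node=32):
    """Overall and per-core throughput implied by reprocessing `frames`
    raw science frames in `hours` on `nodes` machines with
    `cores_per_node` cores each. Returns (frames/hour, frames/core-hour)."""
    per_hour = frames / hours
    per_core_hour = per_hour / (nodes * cores_per_node)
    return per_hour, per_core_hour

# With the abstract's numbers: ~833 frames/hour overall, i.e. roughly
# 2 frames per core-hour on 384 cores.
overall, per_core = reprocessing_rate()
```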

  5. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power system with additional wind power capacity there will be an increased need for large-scale power management as well as reliable balancing and reserve capabilities. Different technologies for large-scale electricity storage provide solutions to the different challenges arising w...

  6. High-Energy Physics Strategies and Future Large-Scale Projects

    CERN Document Server

    Zimmermann, F

    2015-01-01

    We sketch the current European and international strategies and possible future facilities. In the near term, the High Energy Physics (HEP) community will fully exploit the physics potential of the Large Hadron Collider (LHC) through its high-luminosity upgrade (HL-LHC). Post-LHC options include a linear e+e- collider in Japan (ILC) or at CERN (CLIC), as well as circular lepton or hadron colliders in China (CepC/SppC) and Europe (FCC). We conclude with linear and circular acceleration approaches based on crystals, and some perspectives for the far future of accelerator-based particle physics.

  7. Dossier of the observatory workshops of April 10, 2002; Dossier des ateliers de l'Observatoire du 10 avril 2002

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2002-04-01

    The workshops organized by the French observatory of indoor air quality on April 10, 2002 aimed at taking stock of the lessons learned from the pilot study carried out by the observatory during 2001 on 90 residential buildings and 9 schools. This document summarizes the preliminary results obtained during this study and presents the 2002-2003 operational campaign of investigation, which will be carried out on 800 sites (accommodations and schools). The pilot study permitted a first adjustment of the measurement methods and demonstrated the feasibility of the future large-scale campaign. (J.S.)

  8. Operations of and Future Plans for the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, : J.; Abreu, P.; Aglietta, M.; Aguirre, C.; Ahn, E.J.; Allard, D.; Allekotte, I.; Allen, J.; Alvarez-Muniz, J.; Ambrosio, M.; Anchordoqui, L.

    2009-06-01

    These presentations were prepared for the 31st International Cosmic Ray Conference, held in Lodz, Poland, in July 2009. They consist of the following: (1) Performance and operation of the Surface Detectors of the Pierre Auger Observatory; (2) Extension of the Pierre Auger Observatory using high-elevation fluorescence telescopes (HEAT); (3) AMIGA - Auger Muons and Infill for the Ground Array of the Pierre Auger Observatory; (4) Radio detection of Cosmic Rays at the southern Auger Observatory; (5) Hardware Developments for the AMIGA enhancement at the Pierre Auger Observatory; (6) A simulation of the fluorescence detectors of the Pierre Auger Observatory using GEANT 4; (7) Education and Public Outreach at the Pierre Auger Observatory; (8) BATATA: A device to characterize the punch-through observed in underground muon detectors and to operate as a prototype for AMIGA; and (9) Progress with the Northern Part of the Pierre Auger Observatory.

  9. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. In particular, we advocate that the scale of image retrieval systems should be significantly increased, to the point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real-time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  10. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe.

    Science.gov (United States)

    Blaas, Harry; Kroeze, Carolien

    2014-10-15

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed. To this end, scenarios for the year 2050 are analysed, assuming that in the 27 countries of the European Union fossil diesel will be replaced by biodiesel from algae. Estimates are made for the required fertiliser inputs to algae parks, and how this may increase concentrations of nitrogen and phosphorus in coastal waters, potentially leading to eutrophication. The Global NEWS (Nutrient Export from WaterSheds) model has been used to estimate the transport of nitrogen and phosphorus to the European coastal waters. The results indicate that the amount of nitrogen and phosphorus in the coastal waters may increase considerably in the future as a result of large-scale production of algae for the production of biodiesel, even in scenarios assuming effective waste water treatment and recycling of waste water in algae production. To ensure sustainable production of biodiesel from micro-algae, it is important to develop cultivation systems with low nutrient losses to the environment. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. The Carl Sagan solar and stellar observatories as remote observatories

    Science.gov (United States)

    Saucedo-Morales, J.; Loera-Gonzalez, P.

    In this work we summarize recent efforts made by the University of Sonora, with the goal of expanding the capability for remote operation of the Carl Sagan Solar and Stellar Observatories, as well as the first steps that have been taken in order to achieve autonomous robotic operation in the near future. The solar observatory was established in 2007 on the university campus by our late colleague A. Sánchez-Ibarra. It consists of four solar telescopes mounted on a single equatorial mount. The stellar observatory, which saw first light on 16 February 2010, is located 21 km from Hermosillo, Sonora at the site of the School of Agriculture of the University of Sonora. Both observatories can now be remotely controlled, and to some extent are able to operate autonomously. In this paper we discuss how this has been accomplished in terms of the software used as well as the instruments under control. We also briefly discuss the main scientific and educational objectives, the future plans to improve the control software and to construct an autonomous observatory on a mountain site, as well as the opportunities for collaborations.

  12. The Pierre Auger Observatory Upgrade - Preliminary Design Report

    Energy Technology Data Exchange (ETDEWEB)

    Aab, Alexander [Univ. Siegen (Germany); et al.

    2016-04-12

    The Pierre Auger Observatory has begun a major Upgrade of its already impressive capabilities, with an emphasis on improved mass composition determination using the surface detectors of the Observatory. Known as AugerPrime, the upgrade will include new 4 m2 plastic scintillator detectors on top of all 1660 water-Cherenkov detectors, updated and more flexible surface detector electronics, a large array of buried muon detectors, and an extended duty cycle for operations of the fluorescence detectors. This Preliminary Design Report was produced by the Collaboration in April 2015 as an internal document and information for funding agencies. It outlines the scientific and technical case for AugerPrime. We now release it to the public via the arXiv server. We invite you to review the large number of fundamental results already achieved by the Observatory and our plans for the future.

  13. Resolving the Circumstellar Environment of the Galactic B[e] Supergiant Star MWC 137 from Large to Small Scales

    Science.gov (United States)

    Kraus, Michaela; Liimets, Tiina; Cappa, Cristina E.; Cidale, Lydia S.; Nickeler, Dieter H.; Duronea, Nicolas U.; Arias, Maria L.; Gunawan, Diah S.; Oksala, Mary E.; Borges Fernandes, Marcelo; Maravelias, Grigoris; Curé, Michel; Santander-García, Miguel

    2017-11-01

    The Galactic object MWC 137 has been suggested to belong to the group of B[e] supergiants. However, with its large-scale optical bipolar ring nebula and high-velocity jet and knots, it is a rather atypical representative of this class. We performed multiwavelength observations spanning from the optical to the radio regime. Based on optical imaging and long-slit spectroscopic data, we found that the northern parts of the large-scale nebula are predominantly blueshifted, while the southern regions appear mostly redshifted. We developed a geometrical model consisting of two double cones. Although various observational features can be approximated with such a scenario, the observed velocity pattern is more complex. Using near-infrared integral-field unit spectroscopy, we studied the hot molecular gas in the vicinity of the star. The emission from the hot CO gas arises in a small-scale disk revolving around the star on Keplerian orbits. Although the disk itself cannot be spatially resolved, its emission is reflected by the dust arranged in arc-like structures and the clumps surrounding MWC 137 on small scales. In the radio regime, we mapped the cold molecular gas in the outskirts of the optical nebula. We found that large amounts of cool molecular gas and warm dust embrace the optical nebula in the east, south, and west. No cold gas or dust was detected in the north and northwestern regions. Despite the new insights into the nebula kinematics gained from our studies, the real formation scenario of the large-scale nebula remains an open issue. Based on observations collected with (1) the ESO VLT Paranal Observatory under programs 094.D-0637(B) and 097.D-0033(A), (2) the MPG 2.2 m telescope at La Silla Observatory, Chile, under programs 096.A-9030(A) and 096.A-9039(A), (3) the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the NSF on behalf of the Gemini partnership: the

  14. THE LARGE-SCALE COSMIC-RAY ANISOTROPY AS OBSERVED WITH MILAGRO

    International Nuclear Information System (INIS)

    Abdo, A. A.; Allen, B. T.; Chen, C.; Aune, T.; Berley, D.; Goodman, J. A.; Hopper, B.; Lansdell, C. P.; Casanova, S.; Dingus, B. L.; Hoffman, C. M.; Huentemeyer, P. H.; Ellsworth, R. W.; Fleysher, L.; Fleysher, R.; Kolterman, B. E.; Mincer, A. I.; Gonzalez, M. M.; Linnemann, J. T.; McEnery, J. E.

    2009-01-01

    Results are presented of a harmonic analysis of the large-scale cosmic-ray (CR) anisotropy as observed by the Milagro observatory. We show a two-dimensional display of the sidereal anisotropy projections in right ascension (R.A.) generated by the fitting of three harmonics to 18 separate declination bands. The Milagro observatory is a water Cherenkov detector located in the Jemez mountains near Los Alamos, New Mexico. With a high duty cycle and large field of view, Milagro is an excellent instrument for measuring this anisotropy with high sensitivity at TeV energies. The analysis is conducted using a seven-year data sample consisting of more than 95 billion events, the largest such data set in existence. We observe an anisotropy with a magnitude around 0.1% for CRs with a median energy of 6 TeV. The dominant feature is a deficit region of depth (2.49 ± 0.02 stat. ± 0.09 sys.) × 10⁻³ in the direction of the Galactic north pole centered at 189 deg R.A. We observe a steady increase in the magnitude of the signal over seven years.
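The harmonic analysis described above (three harmonics in R.A. fitted per declination band) reduces to a linear least-squares fit. A minimal sketch with synthetic data: the ~0.1% dipole amplitude and 189° deficit phase are borrowed from the abstract purely to illustrate the recovery; the binning, noise level, and fitting code are our own assumptions, not Milagro's actual analysis.

```python
import numpy as np

# Synthetic relative intensity vs. right ascension for one declination band:
# a ~0.1% dipole deficit centered at 189 deg R.A. plus Gaussian noise.
ra = np.linspace(0.0, 360.0, 72, endpoint=False) + 2.5   # bin centers (deg)
rng = np.random.default_rng(0)
true_amp, true_phase = 1e-3, 189.0
intensity = 1.0 - true_amp * np.cos(np.radians(ra - true_phase))
intensity += rng.normal(0.0, 5e-5, ra.size)

# Design matrix for a constant plus three harmonics:
# c + sum_n [ a_n cos(n*ra) + b_n sin(n*ra) ],  n = 1..3
cols = [np.ones_like(ra)]
for n in (1, 2, 3):
    cols += [np.cos(np.radians(n * ra)), np.sin(np.radians(n * ra))]
X = np.column_stack(cols)
coef, *_ = np.linalg.lstsq(X, intensity, rcond=None)

# Amplitude and phase of the first harmonic; since the signal is a deficit,
# a1 = -A*cos(phi) and b1 = -A*sin(phi), so the deficit R.A. is atan2(-b1, -a1).
a1, b1 = coef[1], coef[2]
amp1 = np.hypot(a1, b1)
phase1 = np.degrees(np.arctan2(-b1, -a1)) % 360.0
```

Repeating such a fit independently in each declination band, then assembling the fitted projections into a 2D (declination vs. R.A.) map, gives a display of the kind the abstract describes.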

  15. The COronal Solar Magnetism Observatory (COSMO) Large Aperture Coronagraph

    Science.gov (United States)

    Tomczyk, Steve; Gallagher, Dennis; Wu, Zhen; Zhang, Haiying; Nelson, Pete; Burkepile, Joan; Kolinksi, Don; Sutherland, Lee

    2013-04-01

    The COSMO is a facility dedicated to observing coronal and chromospheric magnetic fields. It will be located on a mountaintop in the Hawaiian Islands and will replace the current Mauna Loa Solar Observatory (MLSO). COSMO will provide unique observations of the global coronal magnetic field and its environment to enhance the value of data collected by other observatories on the ground (e.g. SOLIS, BBO NST, Gregor, ATST, EST, Chinese Giant Solar Telescope, NLST, FASR) and in space (e.g. SDO, Hinode, SOHO, GOES, STEREO, Solar-C, Solar Probe+, Solar Orbiter). COSMO will employ a fleet of instruments to cover many aspects of measuring magnetic fields in the solar atmosphere. The dynamics and energy flow in the corona are dominated by magnetic fields. Understanding the formation of CMEs, their relation to other forms of solar activity, and their progression out into the solar wind requires measurements of coronal magnetic fields. The large aperture coronagraph, the Chromospheric and Prominence Magnetometer and the K-Coronagraph form the COSMO instrument suite to measure magnetic fields and the polarization brightness of the low corona used to infer electron density. The large aperture coronagraph will employ a 1.5 meter fused silica singlet lens, birefringent filters, and a spectropolarimeter to cover fields of view of up to 1 degree. It will observe the corona over a wide range of emission lines from 530.3 nm through 1083.0 nm, allowing for magnetic field measurements over a wide range of coronal temperatures (e.g. Fe XIV at 530.3 nm, Fe X at 637.4 nm, Fe XIII at 1074.7 and 1079.8 nm). These lines are faint and require the very large aperture. NCAR and NSF have provided funding to bring the large aperture coronagraph to a preliminary design review state by the end of 2013. As with all data from Mauna Loa, the data products from COSMO will be available to the community via the Mauna Loa website: http://mlso.hao.ucar.edu

  16. The European Drought Observatory (EDO): Current State and Future Directions

    Science.gov (United States)

    Vogt, Jürgen; Sepulcre, Guadalupe; Magni, Diego; Valentini, Luana; Singleton, Andrew; Micale, Fabio; Barbosa, Paulo

    2013-04-01

    Europe has repeatedly been affected by droughts, resulting in considerable ecological and economic damage, and climate change studies indicate a trend towards increasing climate variability, most likely resulting in more frequent drought occurrences in Europe as well. Against this background, the European Commission's Joint Research Centre (JRC) is developing methods and tools for assessing, monitoring and forecasting droughts in Europe, along with a European Drought Observatory (EDO) to complement and integrate national activities from a European perspective. At the core of EDO is a portal, including a map server, a metadata catalogue, a media monitor and analysis tools. The map server presents Europe-wide up-to-date information on the occurrence and severity of droughts, which is complemented by more detailed information provided by regional, national and local observatories through OGC-compliant web mapping and web coverage services. In addition, time series of historical maps as well as graphs of the temporal evolution of drought indices for individual grid cells and administrative regions in Europe can be retrieved and analysed. Current work is focusing on validating the available products, developing combined indicators, improving the functionalities, extending the linkage to additional national and regional drought information systems and testing options for medium-range probabilistic drought forecasting across Europe. Longer-term goals include the development of long-range drought forecasting products, the analysis of drought hazard and risk, the monitoring of drought impact and the integration of EDO in a global drought information system. The talk will provide an overview of the development and state of EDO, the different products, and the ways to include a wide range of stakeholders (i.e. European, national river basin, and local authorities) in the development of the system, as well as an outlook on future developments.

  17. New ultracool subdwarfs identified in large-scale surveys using Virtual Observatory tools. I. UKIDSS LAS DR5 vs. SDSS DR7

    Science.gov (United States)

    Lodieu, N.; Espinoza Contreras, M.; Zapatero Osorio, M. R.; Solano, E.; Aberasturi, M.; Martín, E. L.

    2012-06-01

    Aims: The aim of the project is to improve our knowledge of the low-mass and low-metallicity population to investigate the influence of metallicity on the stellar (and substellar) mass function. Methods: We present the results of a photometric and proper motion search aimed at discovering ultracool subdwarfs in large-scale surveys. We employed and combined the Fifth Data Release (DR5) of the UKIRT Infrared Deep Sky Survey (UKIDSS) Large Area Survey (LAS) and the Sloan Digital Sky Survey (SDSS) Data Release 7, complemented with ancillary data from the Two Micron All-Sky Survey (2MASS), the DEep Near-Infrared Survey (DENIS) and the SuperCOSMOS Sky Surveys (SSS). Results: The SDSS DR7 vs. UKIDSS LAS DR5 search returned a total of 32 ultracool subdwarf candidates, only two of which are recognised as subdwarfs in the literature. Twenty-seven candidates, including the two known ones, were followed up spectroscopically in the optical between 600 and 1000 nm, thus covering strong spectral features indicative of low metallicity (e.g., CaH): 21 with the Very Large Telescope, one with the Nordic Optical Telescope, and five extracted from the Sloan spectroscopic database to assess (or refute) their low metal content. We confirm 20 candidates as subdwarfs, extreme subdwarfs, or ultra-subdwarfs with spectral types later than M5; this represents a success rate of ≥ 60%. Among those 20 new subdwarfs, we identify two early-L subdwarfs that are very likely located within 100 pc, which we propose as templates for future searches because they are the first examples of their subclass. Another seven sources are solar-metallicity M dwarfs with spectral types between M4 and M7 without Hα emission, suggesting that they are old M dwarfs. The remaining five candidates do not have spectroscopic follow-up yet; only one remains as a bona-fide ultracool subdwarf after revision of their proper motions. We assigned spectral types based on the current classification schemes and, when

  18. Pro-Amateur Observatories as a Significant Resource for Professional Astronomers - Taurus Hill Observatory

    Science.gov (United States)

    Haukka, H.; Hentunen, V.-P.; Nissinen, M.; Salmi, T.; Aartolahti, H.; Juutilainen, J.; Vilokki, H.

    2013-09-01

    Taurus Hill Observatory (THO), observatory code A95, is an amateur observatory located in Varkaus, Finland. The observatory is maintained by the local astronomical association Warkauden Kassiopeia [8]. The THO research team has observed and measured various stellar objects and phenomena. The observatory has mainly focused on asteroid [1] and exoplanet light curve measurements, observation of gamma-ray bursts, and supernova discovery and monitoring [2]. We also run long-term monitoring projects [3]. The THO research team has presented its research at previous EPSC meetings ([4], [5], [6], [7]) and received very supportive reactions from the European planetary science community. The results and publications that pro-amateur observatories like THO have contributed clearly demonstrate that pro-amateurs are a significant resource for professional astronomers now, and will be even more so in the future.

  19. European Southern Observatory

    CERN Multimedia

    CERN PhotoLab

    1970-01-01

    Professor A. Blaauw, Director general of the European Southern Observatory, with George Hampton on his right, signs the Agreement covering collaboration with CERN in the construction of the large telescope to be installed at the ESO Observatory in Chile.

  20. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  1. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from the homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide information additional to that of traditional probes, which yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding the vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a strong contender alongside traditional probes based on current and future distance indicator measurements.
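The conversion underlying this approach is the standard astrometric relation v_t = 4.74 × μ × d, with μ in arcsec/yr and d in pc. A minimal sketch, using a low-redshift Hubble-law distance; the H0 value and the example numbers are our own illustrative assumptions, not figures from the paper:

```python
# Transverse peculiar velocity from a proper motion and a redshift distance.
# v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc]; d from Hubble's law at low z.
H0 = 70.0          # Hubble constant, km/s/Mpc (assumed value)
C = 299792.458     # speed of light, km/s
K = 4.740470464    # km/s per (arcsec/yr * pc)

def transverse_velocity(mu_arcsec_yr: float, z: float) -> float:
    """Transverse velocity (km/s) from proper motion and low-z redshift distance."""
    d_pc = (C * z / H0) * 1e6   # low-z distance in pc
    return K * mu_arcsec_yr * d_pc

# Inverting: a galaxy at z = 0.01 (~43 Mpc) moving transversely at ~300 km/s
# shows a proper motion of only ~1.5 microarcsec/yr -- the regime that
# Gaia-like astrometry of bright, compact galaxies would need to reach.
mu = 300.0 / (K * (C * 0.01 / H0) * 1e6)
```

The microarcsecond-per-year scale of the answer is what makes the measurement hard, and why the abstract restricts the prospect to bright galaxies with high central surface brightness.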

  2. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  3. Community Observatories: Fostering Ideas that STEM From Ocean Sense: Local Observations. Global Connections.

    Science.gov (United States)

    Pelz, M. S.; Ewing, N.; Hoeberechts, M.; Riddell, D. J.; McLean, M. A.; Brown, J. C. K.

    2015-12-01

    Ocean Networks Canada (ONC) uses education and communication to inspire, engage and educate via innovative "meet them where they are, and take them where they need to go" programs. ONC data are accessible via the internet, allowing for the promotion of programs wherever the learners are located. We use technologies such as web portals, mobile apps and citizen science to share ocean science data with many different audiences. Here we focus specifically on one of ONC's most innovative programs: community observatories and the accompanying Ocean Sense program. The approach is based on equipping communities with the same technology enabled on ONC's large cabled observatories. ONC operates the world-leading NEPTUNE and VENUS cabled ocean observatories, which collect data on physical, chemical, biological, and geological aspects of the ocean over long time periods, supporting research on complex Earth processes in ways not previously possible. Community observatories allow for similar monitoring on a smaller scale, and support STEM efforts via a teacher-led program: Ocean Sense. This program, based on local observations and global connections, improves data-rich teaching and learning via visualization tools, interactive plotting interfaces and lesson plans for teachers that focus on student inquiry and exploration. For example, students use all aspects of STEM by accessing, selecting, and interpreting data in multiple dimensions, from their local community observatories to the larger VENUS and NEPTUNE networks. The students make local observations and global connections in all STEM areas. The first year of the program with teachers and students who use this innovative technology is described. Future community observatories and their technological applications in education, communication and STEM efforts are also described.

  4. Future hydrogen markets for large-scale hydrogen production systems

    International Nuclear Information System (INIS)

    Forsberg, Charles W.

    2007-01-01

    The cost of delivered hydrogen includes production, storage, and distribution. For equal production costs, large users (>10⁶ m³/day) will favor high-volume centralized hydrogen production technologies to avoid collection costs for hydrogen from widely distributed sources. Potential hydrogen markets were examined to identify and characterize those markets that will favor large-scale hydrogen production technologies. The two high-volume centralized hydrogen production technologies are nuclear energy and fossil energy with carbon dioxide sequestration. The potential markets for these technologies are: (1) production of liquid fuels (gasoline, diesel and jet) including liquid fuels with no net greenhouse gas emissions and (2) peak electricity production. The development of high-volume centralized hydrogen production technologies requires an understanding of the markets to (1) define hydrogen production requirements (purity, pressure, volumes, need for co-product oxygen, etc.); (2) define and develop technologies to use the hydrogen, and (3) create the industrial partnerships to commercialize such technologies. (author)

  5. Challenges and Opportunities to Developing Synergies Among Diverse Environmental Observatories: FSML, NEON, and GLEON

    Science.gov (United States)

    Williamson, C. E.; Weathers, K. C.; Knoll, L. B.; Brentrup, J.

    2012-12-01

    Recent rapid advances in sensor technology and cyberinfrastructure have enabled the development of numerous environmental observatories ranging from local networks at field stations and marine laboratories (FSML) to continental scale observatories such as the National Ecological Observatory Network (NEON) to global scale observatories such as the Global Lake Ecological Observatory Network (GLEON). While divergent goals underlie the initial development of these observatories, and they are often designed to serve different communities, many opportunities for synergies exist. In addition, the use of existing infrastructure may enhance the cost-effectiveness of building and maintaining large scale observatories. For example, FSMLs are established facilities with the staff and infrastructure to host sensor nodes of larger networks. Many field stations have existing staff and long-term databases as well as smaller sensor networks that are the product of a single or small group of investigators with a unique data management system embedded in a local or regional community. These field station based facilities and data are a potentially untapped gold mine for larger continental and global scale observatories; common ecological and environmental challenges centered on understanding the impacts of changing climate, land use, and invasive species often underlie these efforts. The purpose of this talk is to stimulate a dialog on the challenges of merging efforts across these different spatial and temporal scales, as well as addressing how to develop synergies among observatory networks with divergent roots and philosophical approaches. For example, FSMLs have existing long-term databases and facilities, while NEON has sparse past data but a well-developed template and closely coordinated team working in a coherent format across a continental scale. 
GLEON on the other hand is a grass-roots network of experts in science, information technology, and engineering with a common goal

  6. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    Science.gov (United States)

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using the Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in the educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrant a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially identifying the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  7. The role of large scale storage in a GB low carbon energy future: Issues and policy challenges

    International Nuclear Information System (INIS)

    Gruenewald, Philipp; Cockerill, Tim; Contestabile, Marcello; Pearson, Peter

    2011-01-01

    Large scale storage offers the prospect of capturing and using excess electricity within a low carbon energy system, which otherwise might have to be wasted. Incorporating the role of storage into current scenario tools is challenging, because it requires high temporal resolution to reflect the effects of intermittent sources on system balancing. This study draws on results from a model with such resolution. It concludes that large scale storage could become economically viable for scenarios with high penetration of renewables. As the proportion of intermittent sources increases, the optimal type of storage shifts towards solutions with low energy related costs, even at the expense of efficiency. However, a range of uncertainties have been identified, concerning storage technology development, the regulatory environment, alternatives to storage and the stochastic uncertainty of year-on-year revenues. All of these negatively affect the cost of finance and the chances of successful market uptake. We argue, therefore, that, if the possible wider system and social benefits from the presence of storage are to be achieved, stronger and more strategic policy support may be necessary. More work on the social and system benefits of storage is needed to gauge the appropriate extent of support measures. - Highlights: → Time resolved modelling shows future potential for large scale power storage in GB. → The value of storage is highly sensitive to a range of parameters. → Uncertainty over the revenue from storage could pose a barrier to investment. → To realise wider system benefits stronger and more strategic policy support may be necessary.

  8. Transformative Science for the Next Decade with the Green Bank Observatory

    Science.gov (United States)

    O'Neil, Karen; Frayer, David; Ghigo, Frank; Lockman, Felix; Lynch, Ryan; Maddalena, Ronald; Minter, Anthony; Prestage, Richard

    2018-01-01

    With new instruments and improved performance, the 100m Green Bank Telescope is now demonstrating its full potential. On this 60th anniversary of the groundbreaking for the Green Bank Observatory, we can look forward to the future of the facility for the next 5, 10, and even 20 years. Here we describe the results from a recent workshop, “Transformative Science for the Next Decade with the Green Bank Observatory: Big Questions, Large Programs, and New Instruments,” and describe the scientific plans for our facility.

  9. Data standards for the international virtual observatory

    Directory of Open Access Journals (Sweden)

    R J Hanisch

    2006-11-01

    Full Text Available A primary goal of the International Virtual Observatory Alliance, which brings together Virtual Observatory Projects from 16 national and international development projects, is to develop, evaluate, test, and agree upon standards for astronomical data formatting, data discovery, and data delivery. In the three years that the IVOA has been in existence, substantial progress has been made on standards for tabular data, imaging data, spectroscopic data, and large-scale databases and on managing the metadata that describe data collections and data access services. In this paper, I describe how the IVOA operates and give my views as to why such a broadly based international collaboration has been able to make such rapid progress.

  10. The Renovation and Future Capabilities of the Thacher Observatory

    Science.gov (United States)

    O'Neill, Katie; Osuna, Natalie; Edwards, Nick; Klink, Douglas; Swift, Jonathan; Vyhnal, Chris; Meyer, Kurt

    2016-01-01

    The Thacher School is in the process of renovating the campus observatory with a new meter-class telescope and full automation capabilities for the purpose of scientific research and education. New equipment on site has provided a preliminary site characterization including seeing and V-band sky brightness measurements. These data, along with commissioning data from the MINERVA project (which uses comparable hardware), are used to estimate the capabilities of the observatory once renovation is complete. Our V-band limiting magnitude is expected to be better than 21.3 for a one-minute integration time, and we estimate that milli-magnitude precision photometry will be possible for a V = 14.5 point source over approximately 5-minute timescales. The quick response, autonomous operation, and multi-band photometric capabilities of the renovated observatory will make it a powerful follow-up science facility for exoplanets, eclipsing binaries, near-Earth objects, stellar variability, and supernovae.
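    A limiting magnitude like the one quoted scales with aperture, integration time, and sky brightness in a standard way. The following is a minimal sketch of a sky-background-limited estimate; the photon zero point, throughput, sky brightness, and photometric aperture used here are purely illustrative assumptions, not Thacher Observatory specifications:

```python
import math

def limiting_magnitude(aperture_m=1.0, t_exp=60.0, snr_limit=5.0,
                       zp_flux=80.0, sky_mag_arcsec2=21.0,
                       aper_area_arcsec2=20.0, throughput=0.3):
    """Rough V-band limiting magnitude for a sky-background-limited CCD.

    zp_flux is an assumed V-band photon rate (photons/s/m^2) from a
    V = 20 star; all defaults are illustrative placeholders.
    """
    area = math.pi * (aperture_m / 2.0) ** 2
    # Sky electrons collected in the photometric aperture during t_exp
    sky_rate = zp_flux * 10 ** (-0.4 * (sky_mag_arcsec2 - 20.0)) * aper_area_arcsec2
    sky_e = sky_rate * area * throughput * t_exp
    # In the background-limited regime, SNR ~ S / sqrt(sky), so solve for S
    src_e = snr_limit * math.sqrt(sky_e)
    src_rate = src_e / (area * throughput * t_exp)
    return 20.0 - 2.5 * math.log10(src_rate / zp_flux)
```

    With these assumed inputs the estimate lands near V ≈ 21, in the same range as the quoted 21.3; in this background-limited regime each doubling of exposure time deepens the limit by roughly 0.4 mag.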

  11. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of VSC-based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  12. Production of black holes in TeV-scale gravity

    International Nuclear Information System (INIS)

    Ringwald, A.

    2002-12-01

    Copious production of microscopic black holes is one of the least model-dependent predictions of TeV-scale gravity scenarios. We review the arguments behind this assertion and discuss opportunities to track the striking associated signatures in the near future. These include searches at neutrino telescopes, such as AMANDA and RICE, at cosmic ray air shower facilities, such as the Pierre Auger Observatory, and at colliders, such as the Large Hadron Collider. (orig.)

  13. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sectors.

  14. Future axion searches with the International Axion Observatory (IAXO)

    CERN Document Server

    Irastorza, I G; Cantatore, G; Carmona, J M; Caspi, S; Cetin, S A; Christensen, F E; Dael, A; Dafni, T; Davenport, M; Derbin, A V; Desch, K; Diago, A; Döbrich, B; Dudarev, A; Eleftheriadis, C; Fanourakis, G; Ferrer-Ribas, E; Galán, J; García, J A; Garza, J.G; Geralis, T; Gimeno, B; Giomataris, I; Gninenko, S; Gómez, H; Guendelman, E; Hailey, C J; Hiramatsu, T; Hoffmann, D H H; Horns, D; Iguaz, F J; Isern, J; Jakobsen, A C; Jaeckel, J; Jakovčić, K; Kaminski, J; Kawasaki, M; Krčmar, M; Krieger, C; Lakić, B; Lindner, A; Liolios, A; Luzón, G; Ortega, I; Papaevangelou, T; Pivovaroff, M J; Raffelt, G; Redondo, J; Ringwald, A; Russenschuck, S; Ruz, J; Saikawa, K; Savvidis, I; Sekiguchi, T; Shilon, I; Sikivie, P; Silva, H; Kate, H ten; Tomas, A; Troitsky, S; Vafeiadis, T; Bibber, K van; Vedrine, P; Villar, J A; Vogel, J K; Walckiers, L; Wester, W; Yildiz, S C; Zioutas, K

    2013-01-01

    The International Axion Observatory (IAXO) is a new-generation axion helioscope aiming at a sensitivity to the axion-photon coupling of g_aγ of a few × 10⁻¹² GeV⁻¹, i.e. 1–1.5 orders of magnitude beyond the one achieved by CAST, currently the most sensitive axion helioscope. The main elements of IAXO are an increased magnetic field volume together with extensive use of x-ray focusing optics and low background detectors, innovations already successfully tested in CAST. Additional physics cases of IAXO could include the detection of electron-coupled axions invoked to explain the white dwarf cooling, relic axions, and a large variety of more generic axion-like particles (ALPs) and other novel excitations at the low-energy frontier of elementary particle physics.

  15. NEMO-SN1 observatory developments in view of the European Research Infrastructures EMSO and KM3NET

    Energy Technology Data Exchange (ETDEWEB)

    Favali, Paolo, E-mail: emsopp@ingv.i [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Roma 2, Via di Vigna Murata 605, 00143 Roma (Italy); Beranzoli, Laura [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Roma 2, Via di Vigna Murata 605, 00143 Roma (Italy); Italiano, Francesco [Istituto Nazionale di Geofisica e Vulcanologia (INGV), Sect. Palermo, Via Ugo La Malfa 153, 90146 Palermo (Italy); Migneco, Emilio; Musumeci, Mario; Papaleo, Riccardo [Istituto Nazionale di Fisica Nucleare (INFN), Laboratori Nazionali del Sud, Via di S. Sofia 62, 95125 Catania (Italy)

    2011-01-21

    NEMO-SN1 (Western Ionian Sea off Eastern Sicily), the first real-time multiparameter observatory operating in Europe since 2005, is one of the nodes of the upcoming European ESFRI large-scale research infrastructure EMSO (European Multidisciplinary Seafloor Observatory), a network of seafloor observatories placed at marine sites on the European Continental Margin. NEMO-SN1 constitutes also an important test-site for the study of prototypes of Kilometre Cube Neutrino Telescope (KM3NeT), another European ESFRI large-scale research infrastructure. Italian resources have been devoted to the development of NEMO-SN1 facilities and logistics, as with the PEGASO project, while the EC project ESONET-NoE is funding a demonstration mission and a technological test. EMSO and KM3NeT are presently in the Preparatory Phase as projects funded under the EC-FP7.

  16. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  17. An Observatory to Enhance the Preparation of Future California Teachers

    Science.gov (United States)

    Connolly, L.; Lederer, S.

    2004-12-01

    With a major grant from the W. M. Keck Foundation, California State University, San Bernardino is establishing a state-of-the-art teaching astronomical observatory. The Observatory will be fundamental to an innovative undergraduate physics and astronomy curriculum for Physics and Liberal Studies majors and will be integrated into our General Education program. The critical need for a research and educational observatory is linked to changes in California's Science Competencies for teacher certification. Development of the Observatory will also complement a new infusion of NASA funding and equipment support for our growing astronomy education programs and the University's established Strategic Plan for excellence in education and teacher preparation. The Observatory will consist of two domed towers. One tower will house a 20" Ritchey-Chretien telescope equipped with a CCD camera in conjunction with either UBVRI broadband filters or a spectrometer for evening laboratories and student research projects. The second tower will house the university's existing 12" Schmidt-Cassegrain optical telescope coupled with a CCD camera and an array of filters. A small-aperture solar telescope will be attached to the 12" for observing solar prominences, while a Mylar filter can be attached to the 12" for sunspot viewing. We have been very fortunate to receive a challenge grant of $600,000 from the W. M. Keck Foundation to equip the two domed towers; we continue to seek a further $800,000 to meet our construction needs. Funding is also provided by California State University, San Bernardino.

  18. Development of Armenian-Georgian Virtual Observatory

    Science.gov (United States)

    Mickaelian, Areg; Kochiashvili, Nino; Astsatryan, Hrach; Harutyunian, Haik; Magakyan, Tigran; Chargeishvili, Ketevan; Natsvlishvili, Rezo; Kukhianidze, Vasil; Ramishvili, Giorgi; Sargsyan, Lusine; Sinamyan, Parandzem; Kochiashvili, Ia; Mikayelyan, Gor

    2009-10-01

    The Armenian-Georgian Virtual Observatory (ArGVO) project is the first initiative in the world to create a regional VO infrastructure based on national VO projects and a regional Grid. The Byurakan and Abastumani Astrophysical Observatories have been scientific partners since 1946, following the establishment of the Byurakan observatory. The Armenian VO project (ArVO) has been under development since 2005 and is a part of the International Virtual Observatory Alliance (IVOA). It is based on the Digitized First Byurakan Survey (DFBS, the digitized version of the famous Markarian survey) and other Armenian archival data. Similarly, the Georgian VO will be created to serve as a research environment to utilize the digitized Georgian plate archives. Therefore, one of the main goals in creating the regional VO is the digitization of the large number of plates preserved at the plate stacks of these two observatories. The total amount of plates is more than 100,000 units. Observational programs of high importance have been selected and some 3000 plates will be digitized during the next two years; the priority is defined by the usefulness of the material for future science projects, such as searches for new objects, optical identifications of radio, IR, and X-ray sources, studies of variability and proper motions, etc. Having the digitized material in VO standards, a VO database through the regional Grid infrastructure will be active. This partnership is being carried out in the framework of the ISTC project A-1606 "Development of Armenian-Georgian Grid Infrastructure and Applications in the Fields of High Energy Physics, Astrophysics and Quantum Physics".

  19. Production of black holes in TeV-scale gravity

    International Nuclear Information System (INIS)

    Ringwald, A.

    2003-01-01

    Copious production of microscopic black holes is one of the least model-dependent predictions of TeV-scale gravity scenarios. We review the arguments behind this assertion and discuss opportunities to track the striking associated signatures in the near future. These include searches at neutrino telescopes, such as AMANDA and RICE, at cosmic ray air shower facilities, such as the Pierre Auger Observatory, and at colliders, such as the Large Hadron Collider. (Abstract Copyright [2003], Wiley Periodicals, Inc.)

  20. Decommissioning of nuclear reprocessing plants French past experience and approach to future large scale operations

    International Nuclear Information System (INIS)

    Jean Jacques, M.; Maurel, J.J.; Maillet, J.

    1994-01-01

    Over the years, France has built up significant experience in dismantling nuclear fuel reprocessing facilities or various types of units representative of a modern reprocessing plant. However, only small or medium scale operations have been carried out so far. To prepare the future decommissioning of large size industrial facilities such as UP1 (Marcoule) and UP2 (La Hague), new technologies must be developed to maximize waste recycling and optimize direct operations by operators, taking the integrated dose and cost aspects into account. The decommissioning and dismantling methodology comprises: a preparation phase for inventory, choice and installation of tools and arrangement of working areas, a dismantling phase with decontamination, and a final contamination control phase. Detailed description of dismantling operations of the MA Pu finishing facility (La Hague) and of the RM2 radio metallurgical laboratory (CEA-Fontenay-aux-Roses) are given as examples. (J.S.). 3 tabs

  1. The Virtual Watershed Observatory: Cyberinfrastructure for Model-Data Integration and Access

    Science.gov (United States)

    Duffy, C.; Leonard, L. N.; Giles, L.; Bhatt, G.; Yu, X.

    2011-12-01

    The Virtual Watershed Observatory (VWO) is a concept where scientists, water managers, educators and the general public can create a virtual observatory from integrated hydrologic model results, national databases and historical or real-time observations via web services. In this paper, we propose a prototype for automated and virtualized web services software using national data products for climate reanalysis, soils, geology, terrain and land cover. The VWO has the broad purpose of making accessible water resource simulations, real-time data assimilation, calibration and archival at the scale of HUC 12 watersheds (Hydrologic Unit Code) anywhere in the continental US. Our prototype for model-data integration focuses on creating tools for fast data storage from selected national databases, as well as the computational resources necessary for a dynamic, distributed watershed simulation. The paper will describe cyberinfrastructure tools and workflows that attempt to resolve the problem of model-data accessibility and scalability such that individuals, research teams, managers and educators can create a VWO in a desired context. Examples are given for the NSF-funded Shale Hills Critical Zone Observatory and the European Critical Zone Observatories within the SoilTrEC project. In the future, implementation of VWO services will benefit from the development of a cloud cyberinfrastructure as the prototype evolves toward data- and model-intensive computation for continental-scale water resource predictions.

  2. Norwegian Ocean Observatory Network (NOON)

    Science.gov (United States)

    Ferré, Bénédicte; Mienert, Jürgen; Winther, Svein; Hageberg, Anne; Rune Godoe, Olav; Partners, Noon

    2010-05-01

    The Norwegian Ocean Observatory Network (NOON) is led by the University of Tromsø and collaborates with the Universities of Oslo and Bergen, UniResearch, the Institute of Marine Research, Christian Michelsen Research and SINTEF. It is supported by the Research Council of Norway and oil and gas (O&G) industries like Statoil to develop science, technology and new educational programs. Main topics relate to ocean climate and environment as well as marine resources offshore Norway, from the northern North Atlantic to the Arctic Ocean. NOON's vision is to bring Norway to the international forefront in using cable-based ocean observatory technology for marine science and management, by establishing an infrastructure that enables real-time and long-term monitoring of processes and interactions between hydrosphere, geosphere and biosphere. This activity is in concert with the EU-funded European Strategy Forum on Research Infrastructures (ESFRI) roadmap and the European Multidisciplinary Seafloor Observation (EMSO) project to attract international leading research developments. NOON envisions developing towards a European Research Infrastructure Consortium (ERIC). Besides, the research community in Norway already possesses a considerable marine infrastructure that can expand towards an international focus for real-time multidisciplinary observations in times of rapid climate change. The presently established cable-based fjord observatory, followed by the establishment of a cable-based ocean observatory network towards the Arctic from an O&G installation, will provide invaluable knowledge and experience necessary to make a successful larger cable-based observatory network at the Norwegian and Arctic margin (figure 1). Access to large quantities of real-time observations from the deep sea, including high-definition video, could be used to provide the public and future recruits to science a fascinating insight into an almost unexplored part of the Earth beyond the Arctic Circle.

  3. Highly Adjustable Systems: An Architecture for Future Space Observatories

    Science.gov (United States)

    Arenberg, Jonathan; Conti, Alberto; Redding, David; Lawrence, Charles R.; Hachkowski, Roman; Laskin, Robert; Steeves, John

    2017-06-01

    Mission costs for groundbreaking space astronomical observatories are increasing to the point of unsustainability. We are investigating the use of adjustable or correctable systems as a means to reduce development and therefore mission costs. The poster introduces the promise and possibility of realizing a “net zero CTE” system for the general problem of observatory design and introduces the basic systems architecture we are considering. This poster concludes with an overview of our planned study and demonstrations for proving the value and worth of highly adjustable telescopes and systems ahead of the upcoming decadal survey.

  4. Large-scale computer networks and the future of legal knowledge-based systems

    NARCIS (Netherlands)

    Leenes, R.E.; Svensson, Jorgen S.; Hage, J.C.; Bench-Capon, T.J.M.; Cohen, M.J.; van den Herik, H.J.

    1995-01-01

    In this paper we investigate the relation between legal knowledge-based systems and large-scale computer networks such as the Internet. On the one hand, researchers of legal knowledge-based systems have claimed huge possibilities, but despite the efforts over the last twenty years, the number of

  5. Flathead River Basin Hydrologic Observatory, Northern Rocky Mountains

    Science.gov (United States)

    Woessner, W. W.; Running, S. W.; Potts, D. F.; Kimball, J. S.; Deluca, T. H.; Fagre, D. B.; Makepeace, S.; Hendrix, M. S.; Lorang, M. S.; Ellis, B. K.; Lafave, J.; Harper, J.

    2004-12-01

    We are proposing the 22,515 km2 glacially-sculpted Flathead River Basin, located in Montana and British Columbia, as a Hydrologic Observatory. This hydrologic landscape is diverse and includes large pristine watersheds, rapidly developing intermountain valleys, and a 95 km2 regulated reservoir and 510 km2 lake. The basin has a topographic gradient of over 2,339 m, and spans high alpine to arid climatic zones and a range of biomes. Stream flows are snow-melt dominated and underpinned by groundwater baseflow. The site headwaters contain 37 glaciers and thousands of square kilometers of watersheds in which fire and disease are the only disturbances. In contrast, the HO also contains watersheds at multiple scales that were dominated by glaciers within the last 100 years but are now glacier free, impacted by timber harvests and fires of varying ages to varying degrees, modified by water management practices including irrigation diversion and dams, and altered by development for homes, cities and agriculture. This Observatory provides a sensitive monitor of historic and future climatic shifts, air shed influences and impacts, and the consequences of land and water management practices on the hydrologic system. The HO watersheds are some of the only pristine watersheds left in the contiguous U.S. They provide critical habitat for key species including the native, threatened bull trout and lynx, and the listed western cutthroat trout, bald eagle, gray wolf and grizzly bear. For the last several thousand years this system has been dominated by snow-melt runoff and moderated by large quantities of water stored in glacial ice. However, the timing and magnitude of droughts and summer flows have changed dramatically. With the information that can be gleaned from sediment cores and landscape records at different scales, this HO provides scientists with opportunities to establish baseline watershed conditions and data on natural hydrologic variability within the system. 
Such a

  6. Future development of the PLATO Observatory for Antarctic science

    Science.gov (United States)

    Ashley, Michael C. B.; Bonner, Colin S.; Everett, Jon R.; Lawrence, Jon S.; Luong-Van, Daniel; McDaid, Scott; McLaren, Campbell; Storey, John W. V.

    2010-07-01

    PLATO is a self-contained robotic observatory built into two 10-foot shipping containers. It has been successfully deployed at Dome A on the Antarctic plateau since January 2008, and has accumulated over 730 days of uptime at the time of writing. PLATO provides 0.5–1 kW of continuous electrical power for a year from diesel engines running on Jet-A1, supplemented during the summertime with solar panels. One of the 10-foot shipping containers houses the power system and fuel, the other provides a warm environment for instruments. Two Iridium satellite modems allow 45 MB/day of data to be transferred across the internet. Future enhancements to PLATO, currently in development, include a more modular design, using lithium iron-phosphate batteries, higher power output, and a light-weight low-power version for field deployment from a Twin Otter aircraft. Technologies used in PLATO include a CAN (Controller Area Network) bus, high-reliability PC/104 computers, ultracapacitors for starting the engines, and fault-tolerant redundant design.

  7. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a break-even price level. Norway possesses vast energy resources and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  8. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  9. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of the power system balance and thus cause deviations in the power system frequency in small or islanded power systems, or in tie-line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has to be analysed with improved analytical tools and techniques. This paper proposes techniques for active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  10. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics has been greatly facilitated by the explosion of digital imaging techniques, with huge amounts of medical images produced at ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, and searching. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
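    The retrieval step of the pipeline described above — ranking an indexed collection of images by similarity of their feature descriptors to a query — can be sketched in a few lines. This is a toy brute-force version that assumes descriptors are already extracted; the image IDs and vectors are hypothetical, and real large-scale systems replace the linear scan with approximate indexing (hashing, quantization, inverted files):

```python
import math

def cosine(u, v):
    """Cosine similarity between two feature vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def retrieve(index, query, k=2):
    """Return the k image IDs whose descriptors best match the query."""
    ranked = sorted(index, key=lambda image_id: cosine(index[image_id], query),
                    reverse=True)
    return ranked[:k]

# Toy index: image ID -> precomputed feature descriptor (hypothetical values)
index = {"img_a": [1.0, 0.0, 0.0],
         "img_b": [0.9, 0.1, 0.0],
         "img_c": [0.0, 1.0, 0.0]}
print(retrieve(index, [1.0, 0.05, 0.0]))  # → ['img_a', 'img_b']
```

    The brute-force scan costs O(N) per query; the indexing techniques the review surveys trade exactness for sublinear search time, which is what makes retrieval feasible at the scales discussed.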

  11. A large-scale dataset of solar event reports from automated feature recognition modules

    Science.gov (United States)

    Schuh, Michael A.; Angryk, Rafal A.; Martens, Petrus C.

    2016-05-01

    The massive repository of images of the Sun captured by the Solar Dynamics Observatory (SDO) mission has ushered in the era of Big Data for Solar Physics. In this work, we investigate the entire public collection of events reported to the Heliophysics Event Knowledgebase (HEK) from automated solar feature recognition modules operated by the SDO Feature Finding Team (FFT). With the SDO mission recently surpassing five years of operations, and over 280,000 event reports for seven types of solar phenomena, we present the broadest and most comprehensive large-scale dataset of the SDO FFT modules to date. We also present numerous statistics on these modules, providing valuable contextual information for better understanding and validating of the individual event reports and the entire dataset as a whole. After extensive data cleaning through exploratory data analysis, we highlight several opportunities for knowledge discovery from data (KDD). Through these important prerequisite analyses presented here, the results of KDD from Solar Big Data will be overall more reliable and better understood. As the SDO mission remains operational over the coming years, these datasets will continue to grow in size and value. Future versions of this dataset will be analyzed in the general framework established in this work and maintained publicly online for easy access by the community.
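    Exploratory analysis of such an event-report collection typically begins with per-module counts and time coverage, the kind of contextual statistics described above. A minimal sketch over HEK-style records follows; the field names 'event_type' and 'start_time' and the sample values are illustrative assumptions, not the actual HEK schema:

```python
from collections import Counter
from datetime import datetime

def summarize_reports(reports):
    """Count reports per event type and find the overall time span covered."""
    counts = Counter(r["event_type"] for r in reports)
    times = sorted(datetime.fromisoformat(r["start_time"]) for r in reports)
    return counts, times[0], times[-1]

# Hypothetical sample of event reports
reports = [
    {"event_type": "FL", "start_time": "2010-05-13T05:34:00"},  # flare
    {"event_type": "FL", "start_time": "2011-02-15T01:44:00"},
    {"event_type": "AR", "start_time": "2010-06-01T00:00:00"},  # active region
]
counts, first, last = summarize_reports(reports)
print(counts["FL"], first.year, last.year)  # → 2 2010 2011
```

    Summaries of this kind are the prerequisite sanity checks the authors describe: they expose gaps, duplicates, and module-specific reporting quirks before any knowledge-discovery step is run on the full dataset.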

  12. Security and VO management capabilities in a large-scale Grid operating system

    OpenAIRE

    Aziz, Benjamin; Sporea, Ioana

    2014-01-01

    This paper presents a number of security and VO management capabilities in a large-scale distributed Grid operating system. The capabilities formed the basis of the design and implementation of a number of security and VO management services in the system. The main aim of the paper is to provide some idea of the various functionality cases that need to be considered when designing similar large-scale systems in the future.

  13. A small-scale, rolled-membrane microfluidic artificial lung designed towards future large area manufacturing.

    Science.gov (United States)

    Thompson, A J; Marks, L H; Goudie, M J; Rojas-Pena, A; Handa, H; Potkay, J A

    2017-03-01

    Artificial lungs have been used in the clinic for multiple decades to supplement patient pulmonary function. Recently, small-scale microfluidic artificial lungs (μAL) have been demonstrated with large surface area to blood volume ratios, biomimetic blood flow paths, and pressure drops compatible with pumpless operation. Initial small-scale microfluidic devices with blood flow rates in the μl/min to ml/min range have exhibited excellent gas transfer efficiencies; however, current manufacturing techniques may not be suitable for scaling up to human applications. Here, we present a new manufacturing technology for a microfluidic artificial lung in which the structure is assembled via a continuous "rolling" and bonding procedure from a single, patterned layer of polydimethylsiloxane (PDMS). This method is demonstrated in a small-scale four-layer device, but is expected to easily scale to larger area devices. The presented devices have a biomimetic branching blood flow network, 10 μm tall artificial capillaries, and a 66 μm thick gas transfer membrane. Gas transfer efficiency in blood was evaluated over a range of blood flow rates (0.1-1.25 ml/min) for two different sweep gases (pure O2, atmospheric air). The achieved gas transfer data closely follow predicted theoretical values for oxygenation and CO2 removal, while pressure drop is marginally higher than predicted. This work is the first step in developing a scalable method for creating large area microfluidic artificial lungs. Although designed for microfluidic artificial lungs, the presented technique is expected to result in the first manufacturing method capable of simply and easily creating large area microfluidic devices from PDMS.

  14. An astronomical observatory for Peru

    Science.gov (United States)

    del Mar, Juan Quintanilla; Sicardy, Bruno; Giraldo, Víctor Ayma; Callo, Víctor Raúl Aguilar

    2011-06-01

    Peru and France are to conclude an agreement to provide Peru with an astronomical observatory equipped with a 60-cm diameter telescope. The principal aims of this project are to establish and develop research and teaching in astronomy. Since 2004, a team of researchers from Paris Observatory has been working with the University of Cusco (UNSAAC) on the educational, technical and financial aspects of implementing this venture. During an international astronomy conference in Cusco in July 2009, the foundation stone of the future Peruvian Observatory was laid at the top of Pachatusan Mountain. UNSAAC, represented by its Rector, together with the town of Oropesa and the Cusco regional authority, undertook to make the sum of 300,000€ available to the project. An agreement between Paris Observatory and UNSAAC now enables Peruvian students to study astronomy through online teaching.

  15. Global-scale hydrological response to future glacier mass loss

    Science.gov (United States)

    Huss, Matthias; Hock, Regine

    2018-01-01

    Worldwide glacier retreat and associated future runoff changes raise major concerns over the sustainability of global water resources [1-4], but global-scale assessments of glacier decline and the resulting hydrological consequences are scarce [5,6]. Here we compute global glacier runoff changes for 56 large-scale glacierized drainage basins to 2100 and analyse the glacial impact on streamflow. In roughly half of the investigated basins, the modelled annual glacier runoff continues to rise until a maximum ('peak water') is reached, beyond which runoff steadily declines. In the remaining basins, this tipping point has already been passed. Peak water occurs later in basins with larger glaciers and higher ice-cover fractions. Typically, future glacier runoff increases in early summer but decreases in late summer. Although most of the 56 basins have less than 2% ice coverage, by 2100 one-third of them might experience runoff decreases greater than 10% due to glacier mass loss in at least one month of the melt season, with the largest reductions in central Asia and the Andes. We conclude that, even in large-scale basins with minimal ice-cover fraction, the downstream hydrological effects of continued glacier wastage can be substantial, but the magnitudes vary greatly among basins and throughout the melt season.
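The 'peak water' concept above reduces, for a single basin, to locating the maximum of a modelled annual runoff series and checking whether the series is still rising at its end. A minimal sketch with made-up numbers (the values below are illustrative, not model output):

```python
def peak_water(years, runoff):
    """Year of maximum modelled glacier runoff ('peak water') and whether
    the series is still rising at its end (i.e. the peak lies beyond 2100)."""
    i = max(range(len(runoff)), key=runoff.__getitem__)
    return years[i], i == len(runoff) - 1

# Synthetic decadal runoff series: rises to a maximum, then declines as the
# glacier wastes away and the melt contribution shrinks.
years = list(range(2000, 2110, 10))
runoff = [10, 12, 14, 15, 16, 15.5, 14, 12, 10, 9, 8]
peak, still_rising = peak_water(years, runoff)
print(peak, still_rising)  # 2040 False
```

A basin whose series ends while still rising would be classified as not yet having passed its tipping point.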

  16. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in the small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For the SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For the LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to the SSD.

  17. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. For a better understanding of hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictands: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) at a monthly time step.
This approach
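The wavelet multiresolution decomposition at the heart of the ESD approach splits a signal into detail components per time scale plus a smooth residual that sum back to the original. The sketch below uses the simplest (Haar) wavelet purely as an illustration; the study's actual wavelet basis and implementation are not specified here:

```python
import numpy as np

def haar_mra(x, levels):
    """Haar multiresolution decomposition of a signal of length 2**n into
    one detail component per scale plus a smooth residual."""
    x = np.asarray(x, dtype=float)
    details, approx = [], x
    for _ in range(levels):
        pairs = approx.reshape(-1, 2)
        smooth = pairs.mean(axis=1)
        # The detail at this scale is what the pairwise smoothing removes.
        details.append(approx - np.repeat(smooth, 2))
        approx = smooth
    return details, approx

rng = np.random.default_rng(0)
x = rng.normal(size=16)
details, approx = haar_mra(x, 3)

# Additivity check: upsampling each component back to full length and
# summing reconstructs the original signal exactly.
recon = np.repeat(approx, 2 ** 3)
for lvl, d in enumerate(details):
    recon = recon + np.repeat(d, 2 ** lvl)
print(np.allclose(recon, x))  # True
```

This additivity is what lets scale-by-scale predictors (e.g. SLP components) be related to the hydrological signal at matching time scales.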

  18. Auger ACCESS—Remote Controlling and Monitoring the Pierre Auger Observatory

    Science.gov (United States)

    Jejkal, Thomas

    2013-10-01

    Ultra-high-energy cosmic rays are the most energetic particles in the universe. They are measured to have energies of up to 10^20 eV and occur at a rate of about once per square kilometer per century. To increase the probability of detecting one of these events, a huge detector covering a large area is needed. For this purpose, the Pierre Auger Collaboration built an observatory covering 3000 square kilometers of the Pampa Amarilla close to Malargüe. Until now, the Auger Observatory has been controlled exclusively via the local network for security and performance reasons. As local operation is associated with high travel costs, the Auger ACCESS project, started in 2005, has constructed a secure, operable and sustainable solution for remote control and monitoring. The implemented solution includes Grid technologies for secured access and infrastructure virtualization for building a fully featured testing environment for the Auger Observatory. Measurements showed only a negligible delay in communicating with the observatory in Argentina, which allows the establishment of remote control rooms in the near future for full remote operation and remarkable cost reduction.

  19. ESO's Two Observatories Merge

    Science.gov (United States)

    2005-02-01

    , a unique instrument capable of measuring stellar radial velocities with an unsurpassed accuracy better than 1 m/s, making it a very powerful tool for the discovery of extra-solar planets. In addition, astronomers also have access to the 2.2-m ESO/MPG telescope with its Wide Field Imager camera. A new control room, the RITZ (Remote Integrated Telescope Zentrum), allows operating all three ESO telescopes at La Silla from a single place. The La Silla Observatory is also the first world-class observatory to have been granted certification under the International Organization for Standardization (ISO) 9001 Quality Management System. Moreover, the infrastructure of La Silla is still used by many of the ESO member states for targeted projects such as the Swiss 1.2-m Euler telescope and the Italian REM (Rapid Eye Mount), a robotic telescope specialized in the follow-up of gamma-ray bursts detected by satellites. In addition, La Silla is in charge of the APEX (Atacama Pathfinder Experiment) 12-m sub-millimetre telescope, which will soon start routine observations at Chajnantor, the site of the future Atacama Large Millimeter Array (ALMA). The APEX project is a collaboration between the Max Planck Society in Germany, Onsala Observatory in Sweden and ESO. ESO also operates Paranal, home of the Very Large Telescope (VLT) and the VLT Interferometer (VLTI). Antu, the first 8.2-m Unit Telescope of the VLT, saw First Light in May 1998, starting what has become a revolution in European astronomy. Since then, the three other Unit Telescopes - Kueyen, Melipal and Yepun - have been successfully put into operation with an impressive suite of the most advanced astronomical instruments. The interferometric mode of the VLT (VLTI) is also operational and fully integrated in the VLT data flow system. In the VLTI mode, one state-of-the-art instrument is already available and another will follow soon.
With its remarkable resolution and unsurpassed surface area, the VLT is at the forefront of

  20. Scientific Workflows and the Sensor Web for Virtual Environmental Observatories

    Science.gov (United States)

    Simonis, I.; Vahed, A.

    2008-12-01

    Virtual observatories have matured beyond their original domain and are becoming common practice for Earth observation research and policy building. The term Virtual Observatory originally came from the astronomical research community, where virtual observatories provide universal access to the available astronomical data archives of space- and ground-based observatories. Furthermore, as those virtual observatories aim at integrating heterogeneous resources provided by a number of participating organizations, the virtual observatory acts as a coordinating entity that strives for common data analysis techniques and tools based on common standards. The Sensor Web is on its way to becoming one of the major virtual observatories outside of the astronomical research community. Like the original observatory, which consists of a number of telescopes, each observing a specific part of the wave spectrum, together with a collection of astronomical instruments, the Sensor Web provides a multi-eyed perspective on the current, past, as well as future situation of our planet and its surrounding spheres. The current view of the Sensor Web is that of a single worldwide collaborative, coherent, consistent and consolidated sensor data collection, fusion and distribution system. The Sensor Web can perform as an extensive monitoring and sensing system that provides timely, comprehensive, continuous and multi-mode observations. This technology is key to monitoring and understanding our natural environment, including key areas such as climate change, biodiversity, or natural disasters on local, regional, and global scales. The Sensor Web concept has been well established through ongoing global research and deployment of Sensor Web middleware and standards, and represents the foundation layer of systems like the Global Earth Observation System of Systems (GEOSS). The Sensor Web consists of a huge variety of physical and virtual sensors as well as observational data, made available on the Internet at standardized

  1. Searches for Large-Scale Anisotropy in the Arrival Directions of Cosmic Rays Detected above Energy of $10^{19}$ eV at the Pierre Auger Observatory and the Telescope Array

    Energy Technology Data Exchange (ETDEWEB)

    Aab, Alexander; et al.

    2014-10-07

    Spherical harmonic moments are well-suited for capturing anisotropy at any scale in the flux of cosmic rays. An unambiguous measurement of the full set of spherical harmonic coefficients requires full-sky coverage. This can be achieved by combining data from observatories located in both the northern and southern hemispheres. To this end, a joint analysis using data recorded at the Telescope Array and the Pierre Auger Observatory above 10^19 eV is presented in this work. The resulting multipolar expansion of the flux of cosmic rays allows us to perform a series of anisotropy searches, and in particular to report on the angular power spectrum of cosmic rays above 10^19 eV. No significant deviation from isotropic expectations is found throughout the analyses performed. Upper limits on the amplitudes of the dipole and quadrupole moments are derived as a function of the direction in the sky, varying between 7% and 13% for the dipole and between 7% and 10% for a symmetric quadrupole.
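The angular power spectrum mentioned above is built from the spherical harmonic coefficients a_lm of the flux: C_l = (1/(2l+1)) Σ_m |a_lm|². A toy numpy sketch with synthetic coefficients (not Auger/TA data, and ignoring the partial-sky estimation issues the joint analysis addresses):

```python
import numpy as np

def angular_power(alm):
    """C_l = (1/(2l+1)) * sum_m |a_lm|^2, with alm given as a dict
    {l: array of coefficients for m = -l..l}."""
    return {l: float(np.sum(np.abs(a) ** 2)) / (2 * l + 1) for l, a in alm.items()}

# Toy coefficients: a pure dipole (l=1) on an otherwise isotropic sky.
alm = {
    0: np.array([1.0]),
    1: np.array([0.0, 0.3, 0.0]),  # m = -1, 0, +1
    2: np.zeros(5),
}
C = angular_power(alm)
print(round(C[1], 3), C[2])  # 0.03 0.0
```

An isotropic flux would have all C_l (l ≥ 1) consistent with zero, which is what the searches above test against.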

  2. The Fram Strait integrated ocean observatory

    Science.gov (United States)

    Fahrbach, E.; Beszczynska-Möller, A.; Rettig, S.; Rohardt, G.; Sagen, H.; Sandven, S.; Hansen, E.

    2012-04-01

    A long-term oceanographic moored array has been operated since 1997 to measure the ocean water column properties and oceanic advective fluxes through Fram Strait. While the mooring line along 78°50'N is devoted to monitoring variability of the physical environment, the AWI Hausgarten observatory, located north of it, focuses on ecosystem properties and benthic biology. Under the EU DAMOCLES and ACOBAR projects, the oceanographic observatory has been extended towards an innovative integrated observing system, combining deep-ocean moorings, a multipurpose acoustic system and a network of gliders. The main aim of this system is long-term environmental monitoring in Fram Strait, combining satellite data, acoustic tomography, oceanographic measurements at moorings and glider sections with high-resolution ice-ocean circulation models through data assimilation. As a future perspective, a cable connection between the Hausgarten observatory and a land base on Svalbard is planned as the implementation of the ESONET Arctic node. To take advantage of the planned cabled node, different technologies for underwater data transmission were reviewed and partially tested under the ESONET DM AOEM. The main focus was to design and evaluate available technical solutions for collecting data from the different components of the Fram Strait ocean observing system, and to integrate the available data streams for optimal delivery to the future cabled node. The main components of the Fram Strait integrated observing system will be presented and the current status of available technologies for underwater data transfer will be reviewed. In the long term, an initiative of Helmholtz observatories foresees the interdisciplinary Earth Observing System FRAM, which combines observatories such as the long-term deep-sea ecological observatory HAUSGARTEN, the oceanographic Fram Strait integrated observing system and the Svalbard coastal stations maintained by the Norwegian ARCTOS network.
A vision

  3. NEON: the first continental-scale ecological observatory with airborne remote sensing of vegetation canopy biochemistry and structure

    Science.gov (United States)

    Thomas U. Kampe; Brian R. Johnson; Michele Kuester; Michael. Keller

    2010-01-01

    The National Ecological Observatory Network (NEON) is an ecological observation platform for discovering, understanding and forecasting the impacts of climate change, land use change, and invasive species on continental-scale ecology. NEON will operate for 30 years and gather long-term data on ecological response changes and on feedbacks with the geosphere, hydrosphere...

  4. Active power reserves evaluation in large scale PVPPs

    DEFF Research Database (Denmark)

    Crăciun, Bogdan-Ionut; Kerekes, Tamas; Sera, Dezso

    2013-01-01

    The present trend of investing in renewable ways of producing electricity to the detriment of conventional fossil fuel-based plants will lead to a certain point where these plants have to provide ancillary services and contribute to overall grid stability. Photovoltaic (PV) power has the fastest growth among all renewable energies and has managed to reach high penetration levels, creating instabilities which at the moment are corrected by conventional generation. This paradigm will change in future scenarios where most of the power is supplied by large scale renewable plants and parts of the ancillary services have to be shared by the renewable plants. The main focus of the proposed paper is to technically and economically analyze the possibility of having active power reserves in large scale PV power plants (PVPPs) without any auxiliary storage equipment. The provided reserves should ...

  5. Possible future effects of large-scale algae cultivation for biofuels on coastal eutrophication in Europe

    NARCIS (Netherlands)

    Blaas, H.; Kroeze, C.

    2014-01-01

    Biodiesel is increasingly considered as an alternative for fossil diesel. Biodiesel can be produced from rapeseed, palm, sunflower, soybean and algae. In this study, the consequences of large-scale production of biodiesel from micro-algae for eutrophication in four large European seas are analysed.

  6. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas Surveys varied subject areas and reports on individual results of research in the field Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field

  7. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  8. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses at the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on classic Boolean visibility, which is usually determined within a GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. A case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
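Boolean visibility of the kind computed within a GIS reduces, along a single terrain profile, to checking whether the sight line from the observer clears every intervening cell. A one-dimensional sketch (not the article's algorithm; real viewsheds sweep many such profiles over a 2-D surface model):

```python
def profile_viewshed(heights, observer_height=1.7):
    """Boolean visibility along a terrain profile: cell i is visible from
    the observer (at cell 0) if its sight-line slope is not exceeded by any
    earlier cell. Distances are in cell units."""
    eye = heights[0] + observer_height
    visible = [True]            # the observer's own cell
    max_slope = float("-inf")   # steepest slope seen so far (the horizon)
    for i in range(1, len(heights)):
        slope = (heights[i] - eye) / i
        visible.append(slope >= max_slope)
        max_slope = max(max_slope, slope)
    return visible

# A ridge at index 2 hides the lower ground behind it, but the higher
# peak at index 4 rises back above the local horizon.
vis = profile_viewshed([100, 100, 110, 100, 130])
print(vis)  # [True, True, True, False, True]
```

The `max_slope` variable plays the role of the "local horizon" that the extended viewshed measures target angles against.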

  9. Scale-Up: Improving Large Enrollment Physics Courses

    Science.gov (United States)

    Beichner, Robert

    1999-11-01

    The Student-Centered Activities for Large Enrollment University Physics (SCALE-UP) project is working to establish a learning environment that will promote increased conceptual understanding, improved problem-solving performance, and greater student satisfaction, while still maintaining class sizes of approximately 100. We are also addressing the new ABET engineering accreditation requirements for inquiry-based learning along with communication and team-oriented skills development. Results of studies of our latest classroom design, plans for future classroom space, and the current iteration of instructional materials will be discussed.

  10. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish a large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  11. On the scaling features of high-latitude geomagnetic field fluctuations during a large geomagnetic storm

    Science.gov (United States)

    De Michelis, Paola; Federica Marcucci, Maria; Consolini, Giuseppe

    2015-04-01

    Recently we have investigated the spatial distribution of the scaling features of short-time-scale magnetic field fluctuations using measurements from several ground-based geomagnetic observatories distributed in the northern hemisphere. We have found that the scaling features of fluctuations of the horizontal magnetic field component at time scales below 100 minutes are correlated with the geomagnetic activity level and with changes in the currents flowing in the ionosphere. Here, we present a detailed analysis of the dynamical changes of the magnetic field scaling features as a function of the geomagnetic activity level during the well-known large geomagnetic storm that occurred on 15 July 2000 (the Bastille Day event). The observed dynamical changes are discussed in relation to the changes of the overall ionospheric polar convection and potential structure as reconstructed using SuperDARN data. This work is supported by the Italian National Program for Antarctic Research (PNRA) - Research Project 2013/AC3.08 and by the European Community's Seventh Framework Programme ([FP7/2007-2013]) under Grant no. 313038/STORM and
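One common way to quantify the scaling features of such fluctuations (not necessarily the method used in the record above) is the slope of the first-order structure function S(τ) = ⟨|x(t+τ) − x(t)|⟩ in log-log coordinates. A sketch on a synthetic Brownian-like series, whose exponent is known to be 0.5:

```python
import numpy as np

def scaling_exponent(x, lags):
    """Slope of log S(tau) vs log tau, where S(tau) = <|x(t+tau) - x(t)|>
    is the first-order structure function of the series."""
    S = [np.mean(np.abs(x[lag:] - x[:-lag])) for lag in lags]
    slope, _ = np.polyfit(np.log(lags), np.log(S), 1)
    return slope

# Synthetic test series: cumulative sum of white noise (Brownian-like),
# whose increments scale as tau^0.5, so the estimate should be near 0.5.
rng = np.random.default_rng(1)
x = np.cumsum(rng.normal(size=20000))
gamma = scaling_exponent(x, [2, 4, 8, 16, 32, 64])
print(round(gamma, 1))
```

Changes in this exponent across observatories and storm phases are the kind of "scaling feature" that can then be mapped against geomagnetic activity.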

  12. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

      The subject of this paper is long-term large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed, which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  13. Geospatial intelligence and visual classification of environmentally observed species in the Future Internet

    Science.gov (United States)

    Arbab-Zavar, B.; Chakravarthy, A.; Sabeur, Z. A.

    2012-04-01

    The rapid development of advanced smart communication tools with good-quality, high-resolution video cameras, audio and GPS devices in the last few years will have profound impacts on the way future environmental observations are conducted and accessed by communities. The resulting large-scale interconnections of these "Future Internet Things" form a large environmental sensing network which will generate large volumes of quality environmental observations at highly localised spatial scales. This enablement of environmental sensing at local scales will be of great importance for the study of fauna and flora in the near future, particularly regarding the effect of climate change on biodiversity in various regions of Europe and beyond. The Future Internet could also potentially become the de facto information space for participative real-time sensing by communities, improving our situation awareness of the effect of climate on local environments. In the ENVIROFI (2011-2013) Usage Area project in the FP7 FI-PPP programme, a set of requirements for specific (and generic) enablers has been achieved, with the potential establishment of participating community observatories of the future. In particular, the specific enablement of interest concerns the building of future interoperable services for the intelligent management of environmental data with tagged contextual geospatial information generated by multiple operators in communities (using smartphones). The classification of observed species in the resulting images is achieved with structured data pre-processing, semantic enrichment using contextual geospatial information, and high-level fusion with controlled uncertainty estimations. The returned identification of species is further improved using future ground-truth corrections and learning by the specific enablers.

  14. The UNH Earth Systems Observatory: A Regional Application in Support of GEOSS Global-Scale Objectives

    Science.gov (United States)

    Vorosmarty, C. J.; Braswell, B.; Fekete, B.; Glidden, S.; Hartmann, H.; Magill, A.; Prusevich, A.; Wollheim, W.; Blaha, D.; Justice, D.; Hurtt, G.; Jacobs, J.; Ollinger, S.; McDowell, W.; Rock, B.; Rubin, F.; Schloss, A.

    2006-12-01

    The Northeast corridor of the US is emblematic of the many changes taking place across the nation's and indeed the world's watersheds. Because ecosystem and watershed change occurs over many scales and is so multifaceted, transferring scientific knowledge to applications as diverse as remediation of local ground water pollution, setting State-wide best practices for non-point source pollution control, enforcing regional carbon sequestration treaties, or creating public/private partnerships for protecting ecosystem services requires a new generation of integrative environmental surveillance systems, information technology, and information transfer to the user community. Geographically complex ecosystem interactions justify moving toward more integrative, regionally-based management strategies to deal with issues affecting land, inland waterways, and coastal waterways. A unified perspective that considers the full continuum of processes which link atmospheric forcings, terrestrial responses, watershed exports along drainage networks, and the final delivery to the coastal zone, nearshore, and off shore waters is required to adequately support the management challenge. A recent inventory of NOAA-supported environmental surveillance systems, IT resources, new sensor technologies, and management-relevant decision support systems shows the community poised to formulate an integrated and operational picture of the environment of New England. This paper presents the conceptual framework and early products of the newly-created UNH Earth Systems Observatory. The goal of the UNH Observatory is to serve as a regionally-focused yet nationally-prominent platform for observation-based, integrative science and management of the New England/Gulf of Maine's land, air, and ocean environmental systems. Development of the UNH Observatory is being guided by the principles set forth under the Global Earth Observation System of Systems and is cast as an end-to-end prototype for GEOSS

  15. Large scale scenario analysis of future low carbon energy options

    International Nuclear Information System (INIS)

    Olaleye, Olaitan; Baker, Erin

    2015-01-01

    In this study, we use a multi-model framework to examine a set of possible future energy scenarios resulting from R&D investments in Solar, Nuclear, Carbon Capture and Storage (CCS), Bio-fuels, Bio-electricity, and Batteries for Electric Transportation. Based on a global scenario analysis, we examine the impact on the economy of advancement in energy technologies, considering both individual technologies and the interactions between pairs of technologies, with a focus on the role of uncertainty. Nuclear and CCS have the most impact on abatement costs, with CCS mostly important at high levels of abatement. We show that CCS and Bio-electricity are complements, while most of the other energy technology pairs are substitutes. We also examine for stochastic dominance between R&D portfolios: given the uncertainty in R&D outcomes, we examine which portfolios would be preferred by all decision-makers, regardless of their attitude toward risk. We observe that portfolios with CCS tend to stochastically dominate those without CCS; and portfolios lacking CCS and Nuclear tend to be stochastically dominated by others. We find that the dominance of CCS becomes even stronger as uncertainty in climate damages increases. Finally, we show that there is significant value in carefully choosing a portfolio, as relatively small portfolios can dominate large portfolios. - Highlights: • We examine future energy scenarios in the face of R&D and climate uncertainty. • We examine the impact of advancement in energy technologies and pairs of technologies. • CCS complements Bio-electricity while most technology pairs are substitutes. • R&D portfolios without CCS are stochastically dominated by portfolios with CCS. • Higher damage uncertainty favors R&D development of CCS and Bio-electricity
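First-order stochastic dominance, the criterion used above to rank R&D portfolios regardless of risk attitude, can be checked by comparing sorted outcome samples quantile by quantile. A sketch with made-up abatement-cost numbers (illustrative only, not values from the study):

```python
def dominates(costs_a, costs_b):
    """First-order stochastic dominance for costs (lower is better):
    portfolio A dominates B if, quantile by quantile, A is never costlier
    than B and strictly cheaper somewhere. Equal-probability samples of
    equal size are assumed."""
    a, b = sorted(costs_a), sorted(costs_b)
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

# Hypothetical cost outcomes across R&D scenarios for two portfolios.
with_ccs = [3.0, 4.5, 6.0, 8.0]
without_ccs = [3.5, 5.0, 7.0, 9.5]
print(dominates(with_ccs, without_ccs), dominates(without_ccs, with_ccs))  # True False
```

A dominated portfolio would be rejected by every expected-utility decision-maker with costs-averse preferences, which is what makes the criterion attitude-free.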

  16. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryon acoustic oscillations (BAO). This observational evidence has led to a broad consensus on the cosmological model, so-called LambdaCDM, and to tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement of cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, quantified by its direction and amplitude. A large dipolar modulation has been measured in the CMB, originating mainly from the solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a large number of well-identified objects. In this thesis, we explore the measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for the detection of the kinematic dipole in future surveys.
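A dipole in source number counts can be estimated with a very simple statistic: for sources drawn from a density proportional to 1 + A d·n, the mean of the unit direction vectors satisfies E[n] = A d / 3, so three times the mean direction recovers the dipole vector. A minimal numpy sketch on a synthetic full sky with an invented amplitude (real analyses must additionally handle survey masks, flux limits, and shot noise):

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_dipole_sky(n_src, amp, axis):
    """Draw source directions from density proportional to 1 + amp*(axis.n)
    by rejection sampling of uniform unit vectors."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    dirs = []
    while len(dirs) < n_src:
        v = rng.normal(size=(n_src, 3))
        v /= np.linalg.norm(v, axis=1, keepdims=True)      # uniform on the sphere
        keep = rng.uniform(0, 1 + amp, size=n_src) < 1 + amp * (v @ axis)
        dirs.extend(v[keep])
    return np.array(dirs[:n_src])

def estimate_dipole(dirs):
    """For density 1 + A d.n, E[n] = A d / 3, so 3 x (mean unit vector)
    estimates the dipole vector (direction and amplitude)."""
    return 3.0 * dirs.mean(axis=0)

sky = sample_dipole_sky(200_000, amp=0.05, axis=[0, 0, 1])
d = estimate_dipole(sky)
print(np.round(d, 3))  # roughly [0, 0, 0.05], up to shot noise
```

The shot-noise floor of this estimator scales as 1/sqrt(N), which is why detecting the small kinematic dipole requires the very large source samples of future surveys.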

  17. Future space missions and ground observatory for measurements of coronal magnetic fields

    Science.gov (United States)

    Fineschi, Silvano; Gibson, Sarah; Bemporad, Alessandro; Zhukov, Andrei; Damé, Luc; Susino, Roberto; Larruquert, Juan

    2016-07-01

    This presentation gives an overview of the near-future perspectives for probing coronal magnetism from space missions (i.e., SCORE and ASPIICS) and a ground-based observatory (ESCAPE). Spectro-polarimetric imaging of coronal emission lines in the visible-light wavelength band provides an important diagnostic tool for coronal magnetism. The interpretation, in terms of the Hanle and Zeeman effects, of the line polarization in forbidden emission lines yields information on the direction and strength of the coronal magnetic field. As a case study, this presentation will describe the Torino Coronal Magnetograph (CorMag) for the spectro-polarimetric observation of the FeXIV 530.3 nm forbidden emission line. CorMag consists of a Liquid Crystal (LC) Lyot filter and an LC linear polarimeter. The CorMag filter is part of the ESCAPE experiment to be based at the French-Italian Concordia base in Antarctica. The linear polarization by resonance scattering of coronal permitted line emission in the ultraviolet (UV) can be modified by magnetic fields through the Hanle effect. Space-based UV spectro-polarimeters would provide an additional tool for the diagnostics of coronal magnetism. As a case study of space-borne UV spectro-polarimeters, this presentation will describe the future upgrade of the Sounding-rocket Coronagraphic Experiment (SCORE) to include a new-generation, high-efficiency UV polarizer with the capability of imaging polarimetry of the HI Lyman-α line at 121.6 nm. SCORE is a multi-wavelength imager for the emission lines HeII 30.4 nm and HI 121.6 nm, and for the visible-light broad-band emission of the polarized K-corona. SCORE flew successfully in 2009. The second launch is scheduled for 2016. Proba-3 is the other future solar mission that would provide the opportunity of diagnosing the coronal magnetic field. Proba-3, the first precision formation-flying mission, is to be launched in 2019. A pair of satellites will fly together maintaining a fixed configuration as a 'large rigid

  18. Visualization of Large Amount of Spectra in Virtual Observatory Environment

    Czech Academy of Sciences Publication Activity Database

    Šaloun, P.; Andrešič, D.; Škoda, Petr; Zelinka, I.

    2014-01-01

    Roč. 11, č. 6 (2014), s. 613-620 ISSN 1476-8186 Institutional support: RVO:67985815 Keywords : SPLAT-VO * virtual observatory * spectra Subject RIV: BN - Astronomy, Celestial Mechanics, Astrophysics

  19. Optical Spectroscopy with the Technology of Virtual Observatory

    Science.gov (United States)

    Škoda, P.

    Contemporary astronomy is flooded with exponentially growing, petabyte-scale data volumes produced by powerful ground- and space-based instrumentation, as well as by extensive computer simulations and computations of complex numerical models. The efficient organization and seamless handling of this information avalanche, stored in heterogeneous databases spread world-wide, and the facilitation of the extraction of new physical knowledge about the Universe are the primary goals of the rapidly evolving astronomical Virtual Observatory (VO). We give an overview of the current spectroscopic capabilities of the VO and identify the future requirements indispensable for the detailed multi-wavelength analysis of huge amounts of spectra in a semi-automatic manner.

  20. The Astrophysical Multimessenger Observatory Network (AMON)

    Science.gov (United States)

    Smith, M. W. E.; Fox, D. B.; Cowen, D. F.; Meszaros, P.; Tesic, G.; Fixelle, J.; Bartos, I.; Sommers, P.; Ashtekar, Abhay; Babu, G. Jogesh; et al.

    2013-01-01

    We summarize the science opportunity, design elements, current and projected partner observatories, and anticipated science returns of the Astrophysical Multimessenger Observatory Network (AMON). AMON will link multiple current and future high-energy, multimessenger, and follow-up observatories together into a single network, enabling near real-time coincidence searches for multimessenger astrophysical transients and their electromagnetic counterparts. Candidate and high-confidence multimessenger transient events will be identified, characterized, and distributed as AMON alerts within the network and to interested external observers, leading to follow-up observations across the electromagnetic spectrum. In this way, AMON aims to evoke the discovery of multimessenger transients from within observatory subthreshold data streams and facilitate the exploitation of these transients for purposes of astronomy and fundamental physics. As a central hub of global multimessenger science, AMON will also enable cross-collaboration analyses of archival datasets in search of rare or exotic astrophysical phenomena.

  1. Primordial Non-Gaussianity and Bispectrum Measurements in the Cosmic Microwave Background and Large-Scale Structure

    Directory of Open Access Journals (Sweden)

    Michele Liguori

    2010-01-01

    The most direct probe of non-Gaussian initial conditions has come from bispectrum measurements of temperature fluctuations in the Cosmic Microwave Background and of the matter and galaxy distribution at large scales. Such bispectrum estimators are expected to continue to provide the best constraints on the non-Gaussian parameters in future observations. We review and compare the theoretical and observational problems, current results, and future prospects for the detection of a nonvanishing primordial component in the bispectrum of the Cosmic Microwave Background and large-scale structure, and the relation to specific predictions from different inflationary models.
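As a toy illustration of what a bispectrum estimator measures, the sketch below uses a simplified 1D periodic analogue with an invented quadratic, local-type non-Gaussianity (it is not one of the CMB/LSS estimators reviewed in the paper): the averaged bispectrum vanishes for a Gaussian field but not for a mode-coupled one.

```python
import numpy as np

rng = np.random.default_rng(1)

def bispectrum(field, k1, k2):
    """Estimate B(k1, k2) = Re[d(k1) d(k2) conj(d(k1+k2))] / N for a 1D
    periodic field, with d = FFT(field). For a Gaussian field this averages
    to zero over realizations; mode coupling makes it nonvanishing."""
    d = np.fft.fft(field)
    return (d[k1] * d[k2] * np.conj(d[k1 + k2])).real / len(field)

n, trials, fnl = 256, 2000, 2.0
b_gauss = b_ng = 0.0
for _ in range(trials):
    phi = rng.normal(size=n)                  # Gaussian "primordial" field
    ng = phi + fnl * (phi**2 - phi.var())     # local-type quadratic non-Gaussianity
    b_gauss += bispectrum(phi, 3, 5) / trials
    b_ng += bispectrum(ng, 3, 5) / trials
print(abs(b_ng) > abs(b_gauss))
```

Averaging over realizations is essential: a single realization of either field gives a noisy, nonzero triple product, and only the ensemble mean distinguishes Gaussian from non-Gaussian initial conditions.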

  2. Studies of dark energy with X-ray observatories.

    Science.gov (United States)

    Vikhlinin, Alexey

    2010-04-20

    I review the contribution of Chandra X-ray Observatory to studies of dark energy. There are two broad classes of observable effects of dark energy: evolution of the expansion rate of the Universe, and slow down in the rate of growth of cosmic structures. Chandra has detected and measured both of these effects through observations of galaxy clusters. A combination of the Chandra results with other cosmological datasets leads to 5% constraints on the dark energy equation-of-state parameter, and limits possible deviations of gravity on large scales from general relativity.

  3. Development of the simulation package 'ELSES' for extra-large-scale electronic structure calculation

    International Nuclear Information System (INIS)

    Hoshi, T; Fujiwara, T

    2009-01-01

    An early-stage version of the simulation package 'ELSES' (extra-large-scale electronic structure calculation) is developed for simulating the electronic structure and dynamics of large systems, particularly nanometer-scale and ten-nanometer-scale systems (see www.elses.jp). Input and output files are written in the extensible markup language (XML) style for general users. Related pre-/post-simulation tools are also available. A practical workflow and an example are described. A test calculation for the GaAs bulk system is shown, to demonstrate that the present code can handle systems with more than one atom species. Several future aspects are also discussed.

  4. Some ecological guidelines for large-scale biomass plantations

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, W.; Cook, J.H.; Beyea, J. [National Audubon Society, Tavernier, FL (United States)

    1993-12-31

    The National Audubon Society sees biomass as an appropriate and necessary source of energy to help replace fossil fuels in the near future, but is concerned that large-scale biomass plantations could displace significant natural vegetation and wildlife habitat, and reduce national and global biodiversity. We support the development of an industry large enough to provide significant portions of our energy budget, but we see a critical need to ensure that plantations are designed and sited in ways that minimize ecological disruption, or even provide environmental benefits. We have been studying the habitat value of intensively managed short-rotation tree plantations. Our results show that these plantations support large populations of some birds, but not all of the species using the surrounding landscape, and indicate that their value as habitat can be increased greatly by including small areas of mature trees within them. We believe short-rotation plantations can benefit regional biodiversity if they can be deployed as buffers for natural forests, or as corridors connecting forest tracts. To realize these benefits, and to avoid habitat degradation, regional biomass plantation complexes (e.g., the plantations supplying all the fuel for a powerplant) need to be planned, sited, and developed as large-scale units in the context of the regional landscape mosaic.

  5. Large-scale educational telecommunications systems for the US: An analysis of educational needs and technological opportunities

    Science.gov (United States)

    Morgan, R. P.; Singh, J. P.; Rothenberg, D.; Robinson, B. E.

    1975-01-01

    The needs to be served, the subsectors in which the system might be used, the technology employed, and the prospects for future utilization of an educational telecommunications delivery system are described and analyzed. Educational subsectors are analyzed with emphasis on the current status and trends within each subsector. Issues which affect future development, and prospects for future use of media, technology, and large-scale electronic delivery within each subsector are included. Information on technology utilization is presented. Educational telecommunications services are identified and grouped into categories: public television and radio, instructional television, computer aided instruction, computer resource sharing, and information resource sharing. Technology based services, their current utilization, and factors which affect future development are stressed. The role of communications satellites in providing these services is discussed. Efforts to analyze and estimate future utilization of large-scale educational telecommunications are summarized. Factors which affect future utilization are identified. Conclusions are presented.

  6. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates the scientific, policy, and political spheres. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither undertake the production of innovative goods with regard to profitability, nor can it hope for full recognition by the basic-research-oriented scientific community. Policy consulting has neither the political system's assigned competence to make decisions, nor can it judge successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of the institutionalization of science: 1) external control, 2) the form of organization, 3) the theoretical conception of large-scale research and policy consulting. (orig.) [de]

  7. Safeguarding future large-scale plutonium bulk handling facilities

    International Nuclear Information System (INIS)

    1979-01-01

    The paper reviews the current status, advantages, limitations and probable future developments of material accountancy and of containment and surveillance. The major limitations on the use of material accountancy in applying safeguards to future plants arise from the uncertainty with which flows and inventories can be measured (0.5 to 1.0%), and the necessity to carry out periodical physical inventories to determine whether material has been diverted. The use of plant instrumentation to determine in-process inventories has commenced and so has the development of statistical methods for the evaluations of the data derived from a series of consecutive material balance periods. The limitations of accountancy can be overcome by increased use of containment and surveillance measures which have the advantage that they are independent of the operator's actions. In using these measures it will be necessary to identify the credible diversion paths, build in sufficient redundancy to reduce false alarm rates, develop automatic data recording and alarming

  8. Site Protection Efforts at the AURA Observatory in Chile

    Science.gov (United States)

    Smith, R. Chris; Smith, Malcolm G.; Sanhueza, Pedro

    2015-08-01

    The AURA Observatory (AURA-O) was the first of the major international observatories to be established in northern Chile to exploit the optimal astronomical conditions available there. The site was originally established in 1962 to host the Cerro Tololo Inter-American Observatory (CTIO). It now hosts more than 20 operational telescopes, including some of the leading U.S. and international astronomical facilities in the southern hemisphere, such as the Blanco 4m telescope on Cerro Tololo and the Gemini-South and SOAR telescopes on Cerro Pachón. Construction of the next generation facility, the Large Synoptic Survey Telescope (LSST), has recently begun on Cerro Pachón, while additional smaller telescopes continue to be added to the complement on Cerro Tololo.While the site has become a major platform for international astronomical facilities over the last 50 years, development in the region has led to an ever-increasing threat of light pollution around the site. AURA-O has worked closely with local, regional, and national authorities and institutions (in particular with the Chilean Ministries of Environment and Foreign Relations) in an effort to protect the site so that future generations of telescopes, as well as future generations of Chileans, can benefit from the dark skies in the region. We will summarize our efforts over the past 15 years to highlight the importance of dark sky protection through education and public outreach as well as through more recent promotion of IDA certifications in the region and support for the World Heritage initiatives described by others in this conference.

  9. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  10. Large underground, liquid based detectors for astro-particle physics in Europe scientific case and prospects

    CERN Document Server

    Autiero, D; Badertscher, A; Bezrukov, L; Bouchez, J; Bueno, A; Busto, J; Campagne, J -E; Cavata, C; De Bellefon, A; Dumarchez, J; Ebert, J; Enqvist, T; Ereditato, A; Von Feilitzsch, F; Perez, P Fileviez; Goger-Neff, M; Gninenko, S; Gruber, W; Hagner, C; Hess, M; Hochmuth, K A; Kisiel, J; Knecht, L; Kreslo, I; Kudryavtsev, V A; Kuusiniemi, P; Lachenmaier, T; Laffranchi, M; Lefièvre, B; Lightfoot, P K; Lindner, M; Maalampi, J; Maltoni, M; Marchionni, A; Undagoitia, T Marrodan; Meregaglia, A; Messina, M; Mezzetto, M; Mirizzi, A; Mosca, L; Moser, U; Müller, A; Natterer, G; Oberauer, L; Otiougova, P; Patzak, T; Peltoniemi, J; Potzel, W; Pistillo, C; Raffelt, G G; Rondio, E; Roos, M; Rossi, B; Rubbia, André; Savvinov, N; Schwetz, T; Sobczyk, J; Spooner, N J C; Stefan, D; Tonazzo, A; Trzaska, W; Ulbricht, J; Volpe, C; Winter, J; Wurm, M; Zalewska-Bak, A; Zimmermann, R

    2007-01-01

    This document reports on a series of experimental and theoretical studies conducted to assess the astro-particle physics potential of three future large-scale particle detectors proposed in Europe as next-generation underground observatories. The proposed apparatuses employ three different and, to some extent, complementary detection techniques: GLACIER (liquid argon TPC), LENA (liquid scintillator) and MEMPHYS (water Cherenkov), based on the use of large masses of liquid as active detection media. The results of these studies are presented along with a critical discussion of the performance attainable by the three proposed approaches coupled to existing or planned underground laboratories, in relation to open and outstanding physics issues such as the search for matter instability, the detection of astrophysical and geo-neutrinos, and the possible use of these detectors in future high-intensity neutrino beams.

  11. Observation of a large-scale anisotropy in the arrival directions of cosmic rays above 8 × 10^18 eV

    Czech Academy of Sciences Publication Activity Database

    Aab, A.; Abreu, P.; Aglietta, M.; Blažek, Jiří; Boháčová, Martina; Chudoba, Jiří; Ebr, Jan; Juryšek, Jakub; Mandát, Dušan; Palatka, Miroslav; Pech, Miroslav; Prouza, Michael; Řídký, Jan; Schovánek, Petr; Trávníček, Petr; Vícha, Jakub

    2017-01-01

    Roč. 357, č. 6357 (2017), s. 1266-1270 ISSN 0036-8075 R&D Projects: GA MŠk LM2015038; GA MŠk LG15014; GA ČR(CZ) GA14-17501S Institutional support: RVO:68378271 Keywords : cosmic rays * Pierre Auger Observatory * ultrahigh energy * large-scale anisotropy Subject RIV: BF - Elementary Particles and High Energy Physics OBOR OECD: Particles and field physics Impact factor: 37.205, year: 2016

  12. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  13. Impacts of Changing Climatic Drivers and Land use features on Future Stormwater Runoff in the Northwest Florida Basin: A Large-Scale Hydrologic Modeling Assessment

    Science.gov (United States)

    Khan, M.; Abdul-Aziz, O. I.

    2017-12-01

    Potential changes in climatic drivers and land cover features can significantly influence the stormwater budget in the Northwest Florida Basin. We investigated the hydro-climatic and land use sensitivities of stormwater runoff by developing a large-scale process-based rainfall-runoff model for the large basin by using the EPA Storm Water Management Model (SWMM 5.1). Climatic and hydrologic variables, as well as land use/cover features were incorporated into the model to account for the key processes of coastal hydrology and its dynamic interactions with groundwater and sea levels. We calibrated and validated the model by historical daily streamflow observations during 2009-2012 at four major rivers in the basin. Downscaled climatic drivers (precipitation, temperature, solar radiation) projected by twenty GCMs-RCMs under CMIP5, along with the projected future land use/cover features were also incorporated into the model. The basin storm runoff was then simulated for the historical (2000s = 1976-2005) and two future periods (2050s = 2030-2059, and 2080s = 2070-2099). Comparative evaluation of the historical and future scenarios leads to important guidelines for stormwater management in Northwest Florida and similar regions under a changing climate and environment.
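Calibration against daily streamflow observations of the kind described above is commonly scored with goodness-of-fit metrics such as the Nash-Sutcliffe efficiency (NSE); the record does not state which metric was used, so the sketch below, with invented flow values, is purely illustrative:

```python
import numpy as np

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe efficiency: 1 - SSE / variance of the observations.
    NSE = 1 is a perfect fit; NSE <= 0 means the simulation is no better
    than simply predicting the mean observed flow."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

obs = np.array([12.0, 18.0, 35.0, 22.0, 15.0])   # invented observed daily flows (m^3/s)
sim = np.array([10.0, 20.0, 30.0, 25.0, 14.0])   # invented simulated flows
print(round(nash_sutcliffe(obs, sim), 3))  # -> 0.866
```

Because NSE normalizes by the observed variance, it allows calibration quality to be compared across the basin's four rivers even though their flow magnitudes differ.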

  14. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  15. What the Heliophysics System Observatory is teaching us about future constellations

    Science.gov (United States)

    Angelopoulos, V.

    2017-12-01

    Owing to the benign space weather during the recent solar cycle, numerous Heliophysics missions have outlived their original purpose and have exceeded expectations in terms of science return. The simultaneous availability of several multi-spacecraft fleets also offers conjunction opportunities that compound their science yield. It allows the Heliophysics System, a vast region of Sun-Earth interactions, to be peered through the collective eyes of a fortuitous grand Observatory. The success of this Heliophysics/Geospace System Observatory (H/GSO) has been partly due to the fuel resources available on THEMIS, allowing it to reconfigure its orbital lines of apsides, apogees, and mean anomalies to optimize conjunctions with the rest of the H/GSO. The other part of the success has been a mandatory open data policy, the accessibility of the data through common data formats, unified analysis tools (e.g., SPEDAS), and distributed data repositories. Future constellations are motivated by the recent science lessons learned: tight connections between dayside and nightside processes, evidenced by fortuitous conjunctions of ground- and space-based assets, suggest that regional activations drive classical global modes of circulation. Just as regional tornadoes and hurricanes synthesize global atmospheric weather that cannot be studied with five weather stations alone, one per continent, so do dayside reconnection and nightside injections require more than a handful of point measurements. Like atmospheric weather, space weather too requires networks of stations built to meet a minimum set of requirements to "play together" and build on each other over time. Just as Argo's >3000 buoys have revolutionized research, modeling, and prediction by global circulation models, "space buoys" can study space weather fronts and double up as monitors and inputs to space weather models, increasing fidelity and advance warning. Reconfigurability can allow versatility as the scientific targets adjust to the knowledge

  16. Imprint of thawing scalar fields on the large scale galaxy overdensity

    Science.gov (United States)

    Dinda, Bikash R.; Sen, Anjan A.

    2018-04-01

    We investigate the observed galaxy power spectrum for the thawing class of scalar field models, taking into account various general relativistic corrections that occur on very large scales. We consider the full general relativistic perturbation equations for the matter as well as the dark energy fluid. We form a single autonomous system of equations containing both the background and the perturbed equations of motion, which we subsequently solve for different scalar field potentials. First we study the percentage deviation from the ΛCDM model for different cosmological parameters as well as in the observed galaxy power spectra on different scales in scalar field models for various choices of scalar field potentials. Interestingly, the difference in background expansion results in an enhancement of power relative to ΛCDM on small scales, whereas the inclusion of general relativistic (GR) corrections results in a suppression of power relative to ΛCDM on large scales. This can be useful to distinguish scalar field models from ΛCDM with future optical/radio surveys. We also compare the observed galaxy power spectra for tracking and thawing types of scalar fields using particular choices for the scalar field potentials. We show that thawing and tracking models can have large differences in the observed galaxy power spectra on large scales and at smaller redshifts due to different GR effects. On smaller scales and at larger redshifts, however, the difference is small and is mainly due to the difference in background expansion.

  17. Optical Spectroscopy with the Technology of Virtual Observatory

    Directory of Open Access Journals (Sweden)

    Škoda P.

    2011-12-01

    Contemporary astronomy is flooded with exponentially growing, petabyte-scale data volumes produced by powerful ground- and space-based instrumentation, as well as by extensive computer simulations and computations of complex numerical models. The efficient organization and seamless handling of this information avalanche, stored in heterogeneous databases spread world-wide, and the facilitation of the extraction of new physical knowledge about the Universe are the primary goals of the rapidly evolving astronomical Virtual Observatory (VO). We give an overview of the current spectroscopic capabilities of the VO and identify the future requirements indispensable for the detailed multi-wavelength analysis of huge amounts of spectra in a semi-automatic manner.

  18. Investigation on the integral output power model of a large-scale wind farm

    Institute of Scientific and Technical Information of China (English)

    BAO Nengsheng; MA Xiuqian; NI Weidou

    2007-01-01

    The integral output power model of a large-scale wind farm is needed when estimating the wind farm's output over a period of time in the future. The actual wind speed power model and calculation method of a wind farm made up of many wind turbine units are discussed. After analyzing the incoming wind flow characteristics and their energy distributions, and after considering the multiple interaction effects among the wind turbine units under certain assumptions, the incoming wind flow model for multiple units is built. The calculation algorithms and steps of the integral output power model of a large-scale wind farm are provided. Finally, the actual power output of a wind farm is calculated and analyzed using practical wind speed measurement data. The characteristics of a large-scale wind farm are also discussed.
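The per-unit aggregation such a model performs can be sketched with a simplified piecewise power curve and a single lumped wake-interaction factor. All parameter values and function names are invented for illustration; the paper's actual multi-unit incoming-flow model is more detailed:

```python
import numpy as np

def turbine_power(v, v_cut_in=3.0, v_rated=12.0, v_cut_out=25.0, p_rated=2.0):
    """Simplified turbine power curve (MW): zero below cut-in and above
    cut-out, cubic ramp between cut-in and rated speed, flat at rated."""
    v = np.asarray(v, float)
    p = np.zeros_like(v)
    ramp = (v >= v_cut_in) & (v < v_rated)
    p[ramp] = p_rated * (v[ramp]**3 - v_cut_in**3) / (v_rated**3 - v_cut_in**3)
    p[(v >= v_rated) & (v <= v_cut_out)] = p_rated
    return p

def farm_output(speeds, n_units=50, wake_factor=0.9):
    """Integral farm output: per-unit power summed over identical units,
    derated by a single lumped wake-interaction factor."""
    return n_units * wake_factor * turbine_power(speeds)

hourly = np.array([2.0, 6.0, 11.0, 14.0, 26.0])   # invented measured speeds (m/s)
print(farm_output(hourly))   # farm power (MW) for each hour of record
```

Summing a scalar power curve over identical units with one wake factor is the crudest version of the idea; the record's model instead builds an incoming-flow model that accounts for the interactions among individual turbine units.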

  19. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the faultless function (reliability) of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. It presents a comparative analysis of the factors that determine the reliability of integrated circuits, an analysis of existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.
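A common starting point for such fault-rate analysis is the constant-failure-rate (exponential) model, in which the failure rates of series-connected elements add. The sketch below uses invented per-gate rates and is a generic textbook model, not the authors' algorithm:

```python
import math

def system_reliability(t_hours, lambda_per_gate, n_gates):
    """Series-system exponential model: element failure rates add, so the
    probability of fault-free operation is R(t) = exp(-n_gates * lambda * t)."""
    return math.exp(-n_gates * lambda_per_gate * t_hours)

# Invented figures: 1e-11 failures/hour per gate, 1e6 gates (VLSI scale)
lam_gate, gates = 1e-11, 1_000_000
r_1yr = system_reliability(8760, lam_gate, gates)   # one year of operation
mtbf = 1.0 / (gates * lam_gate)                     # mean time between failures (hours)
print(round(r_1yr, 4), mtbf)
```

The model makes the LSI-to-VLSI scaling problem explicit: with a fixed per-gate rate, the system rate grows linearly with gate count, so VLSI-scale reliability requires drastically lower per-element failure rates.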

  20. Potential climatic impacts and reliability of large-scale offshore wind farms

    International Nuclear Information System (INIS)

    Wang Chien; Prinn, Ronald G

    2011-01-01

    The vast availability of wind power has fueled substantial interest in this renewable energy source as a potential near-zero greenhouse gas emission technology for meeting future world energy needs while addressing the climate change issue. However, in order to provide even a fraction of the estimated future energy needs, a large-scale deployment of wind turbines (several million) is required. The consequent environmental impacts, and the inherent reliability of such large-scale usage of intermittent wind power, would have to be carefully assessed, in addition to the need to lower the currently high unit wind power costs. Our previous study (Wang and Prinn 2010 Atmos. Chem. Phys. 10 2053) using a three-dimensional climate model suggested that a large deployment of wind turbines over land to meet about 10% of predicted world energy needs in 2100 could lead to a significant temperature increase in the lower atmosphere over the installed regions. A global-scale perturbation to the general circulation patterns as well as to the cloud and precipitation distribution was also predicted. In the follow-up study reported here, we conducted a set of six additional model simulations using an improved climate model to further address the potential environmental and intermittency issues of large-scale deployment of offshore wind turbines for differing installation areas and spatial densities. In contrast to the previous land-installation results, the offshore wind turbine installations are found to cause a surface cooling over the installed offshore regions. This cooling is due principally to the enhanced latent heat flux from the sea surface to the lower atmosphere, driven by an increase in turbulent mixing caused by the wind turbines, which was not entirely offset by the concurrent reduction of mean wind kinetic energy. We found that the perturbation of the large-scale deployment of offshore wind turbines to the global climate is relatively small compared to the case of land

  1. A virtual observatory in a real world: building capacity for an uncertain future

    Science.gov (United States)

    Blair, Gordon; Buytaert, Wouter; Emmett, Bridget; Freer, Jim; Gurney, Robert; Haygarth, Phil; McDonald, Adrian; Rees, Gwyn; Tetzlaff, Doerthe

    2010-05-01

    Environmental managers and policy makers face a challenging future trying to accommodate growing expectations of environmental well-being, while subject to maturing regulation, constrained budgets and a public scrutiny that expects easier and more meaningful access. Supporting such a challenge requires new tools and new approaches. The Virtual Observatory (VO) is a new initiative from the Natural Environment Research Council (NERC) designed to deliver proof of concept for these new tools and approaches. The VO is at an early stage, and we first evaluate the role of existing 'observatories' in the UK and elsewhere, both to learn good practice (and, just as valuable, errors) and to define boundaries. A series of exemplar 'big catchment science questions' are posed, distinguishing between science and society positions, and the prospects for their solution are assessed. The VO vision of being driven by these questions is outlined, as are its seven key ambitions, namely: (i) being driven by the need to contribute to the solution of major environmental issues that impinge on, or link to, catchment science; (ii) having the flexibility and adaptability to address future problems not yet defined or fully clarified; (iii) being able to communicate issues and solutions to a range of audiences; (iv) supporting easy access by a variety of users; (v) drawing meaningful information from data and models and identifying the constraints on application in terms of errors, uncertainties, etc.; (vi) adding value and cost effectiveness to current investigations by supporting transfer and scale adjustment, thus limiting the repetition of expensive field monitoring addressing essentially the same issues in varying locations; (vii) promoting effective interfacing of robust science with a variety of end users by using terminology or measures familiar to the user (or required by regulation), including financial and carbon accounting, whole-life or fixed-period costing, and risk expressed as probability or as disability-adjusted life years

  2. Ten years of the Spanish Virtual Observatory

    Science.gov (United States)

    Solano, E.

    2015-05-01

    The main objective of the Virtual Observatory (VO) is to guarantee easy and efficient access to, and analysis of, the information hosted in astronomical archives. The Spanish Virtual Observatory (SVO) is a project born in 2004 with the goal of promoting and coordinating VO-related activities at the national level. SVO is also the national contact point for the international VO initiatives, in particular the International Virtual Observatory Alliance (IVOA) and the Euro-VO project. The project, led by Centro de Astrobiología (INTA-CSIC), is structured around four major topics: a) VO compliance of astronomical archives, b) VO science, c) VO and data-mining tools, and d) education and outreach. In this paper I describe the most important results obtained by the Spanish Virtual Observatory in its first ten years of life, as well as future lines of work.

  3. Triggers for the Pierre Auger Observatory, the current status and plans for the future

    CERN Document Server

    Szadkowski, Z

    2009-01-01

    The Pierre Auger Observatory is a multi-national organization for research on ultra-high energy cosmic rays. The Southern Auger Observatory (Auger-South) in the province of Mendoza, Argentina, was completed in 2008. The first results on the energy spectrum, mass composition and distribution of arrival directions on the southern sky are impressive. The planned Northern Auger Observatory in Colorado, USA (Auger-North) will open a new window onto the universe and establish charged-particle astronomy to determine the origin and nature of ultra-high energy cosmic rays. These cosmic particles carry information complementary to neutrinos, photons and gravitational waves. They also provide an extremely energetic beam for the study of particle interactions at energies thirty times higher than those reached in terrestrial accelerators. The Auger Observatory is a hybrid detector consisting of a Surface Detector (SD) and an atmospheric Fluorescence Detector (FD). The hybrid data set obtained when both...

  4. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  5. Copy of Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet.

    Energy Technology Data Exchange (ETDEWEB)

    Adalsteinsson, Helgi; Armstrong, Robert C.; Chiang, Ken; Gentile, Ann C.; Lloyd, Levi; Minnich, Ronald G.; Vanderveen, Keith; Van Randwyk, Jamie A; Rudish, Don W.

    2008-10-01

    We report on the work done in the late-start LDRD "Using Emulation and Simulation to Understand the Large-Scale Behavior of the Internet". We describe the creation of a research platform that emulates many thousands of machines, to be used for the study of large-scale internet behavior. We describe a proof-of-concept simple attack we performed in this environment. We describe the successful capture of a Storm bot and, from the study of the bot and further literature search, establish large-scale aspects we seek to understand via emulation of Storm on our research platform in possible follow-on work. Finally, we discuss possible future work.

  6. HTS cables open the window for large-scale renewables

    International Nuclear Information System (INIS)

    Geschiere, A; Willen, D; Piga, E; Barendregt, P

    2008-01-01

    In a realistic approach to future energy consumption, the effects of sustainable power sources and of growing welfare with increased use of electricity need to be considered. These factors lead to an increased transfer of electric energy over the networks. A dominant part of the energy need will come from expanded large-scale renewable sources. To use them efficiently across Europe, large energy transits between different countries are required. Bottlenecks in the existing infrastructure will be avoided by strengthening the network. For environmental reasons, more infrastructure will be built underground. Nuon is studying HTS technology as a component to solve these challenges. This technology offers a tremendously large power transport capacity as well as the possibility to reduce short-circuit currents, making integration of renewables easier. Furthermore, power transport will be possible at lower voltage levels, giving the opportunity to upgrade the existing network while re-using it. This will result in large cost savings while meeting the future energy challenges. In a 6 km backbone structure in Amsterdam, Nuon wants to install a 50 kV HTS Triax cable for a significant increase of the transport capacity, while developing its capabilities. Nevertheless, several barriers have to be overcome

  7. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
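    The scaling exponent at the heart of this record's method can be sketched in a few lines. Below is a minimal, hypothetical implementation of detrended fluctuation analysis (DFA) for a 1-D numeric signal; applying it to DNA presumes a prior mapping of the sequence to numbers (e.g. +1/-1 for purine/pyrimidine), which is an assumption of this illustration, not a detail from the abstract.

```python
import numpy as np

def dfa_exponent(signal, scales):
    """Detrended fluctuation analysis: return the scaling exponent alpha."""
    profile = np.cumsum(signal - np.mean(signal))   # integrated, mean-removed
    flucts = []
    for s in scales:
        n_win = len(profile) // s
        t = np.arange(s)
        f2 = []
        for i in range(n_win):
            seg = profile[i * s:(i + 1) * s]
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
            f2.append(np.mean((seg - trend) ** 2))
        flucts.append(np.sqrt(np.mean(f2)))
    # alpha is the log-log slope of fluctuation vs. window size
    alpha = np.polyfit(np.log(scales), np.log(flucts), 1)[0]
    return alpha

# Uncorrelated noise gives alpha near 0.5; long-range correlated
# (isochore-like) sequences give larger exponents.
rng = np.random.default_rng(0)
alpha = dfa_exponent(rng.standard_normal(4096), [16, 32, 64, 128, 256])
```

    For genome data, it is the local variation of alpha along the chromosome (computed in sliding windows) that reveals the isochore-like patchiness described above.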

  8. Networking of Bibliographical Information: Lessons learned for the Virtual Observatory

    Science.gov (United States)

    Genova, Françoise; Egret, Daniel

    Networking of bibliographic information is particularly remarkable in astronomy. On-line journals, the ADS bibliographic database, SIMBAD and NED are everyday tools for research, and provide easy navigation from one resource to another. Tables are published on line, in close collaboration with data centers. Recent developments include the links between observatory archives and the ADS, as well as the large-scale prototyping of object links between Astronomy and Astrophysics and SIMBAD, following those implemented a few years ago with New Astronomy and the International Bulletin of Variable Stars. This networking has been made possible by close collaboration between the ADS, data centers such as the CDS and NED, and the journals, and this partnership is now being extended to observatory archives. Simple, de facto exchange standards, like the bibcode used to refer to a published paper, have been the key for building links and exchanging data. This partnership, in which practitioners from different disciplines agree to link their resources and to work together to define useful and usable standards, has produced a revolution in scientists' practice. It is an excellent model for the Virtual Observatory projects.

  9. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large-scale model and data base system is presented. Experience in operating and developing a large-scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified; then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large-scale models and data bases

  10. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  11. The Ownership Structure Dilemma and its Implications on the Transition from Small-Scale to Large-Scale Electric Road Systems

    OpenAIRE

    BEDNARCIK ABDULHADI, EMMA; VITEZ, MARINA

    2016-01-01

    This master's thesis is written on behalf of KTH Royal Institute of Technology and the Swedish National Road and Transport Research Institute (VTI). The study investigates how infrastructure ownership could affect the transition from small-scale to large-scale electric road systems (ERS), and how infrastructure ownership affects the foreseen future roles of the ERS stakeholders. The authors have used a qualitative research method, including a literature study within the areas of infrastructure t...

  12. Impact of large-scale circulation changes in the North Atlantic sector on the current and future Mediterranean winter hydroclimate

    Science.gov (United States)

    Barcikowska, Monika J.; Kapnick, Sarah B.; Feser, Frauke

    2018-03-01

    The Mediterranean region, located in the transition zone between the dry subtropical and wet European mid-latitude climate, is very sensitive to changes in the global mean climate state. Projecting future changes of the Mediterranean hydroclimate under global warming therefore requires dynamic climate models to reproduce the main mechanisms controlling regional hydroclimate with sufficiently high resolution to realistically simulate climate extremes. To assess future winter precipitation changes in the Mediterranean region we use the Geophysical Fluid Dynamics Laboratory high-resolution general circulation model for control simulations with pre-industrial greenhouse gas and aerosol concentrations, which are compared to future scenario simulations. Here we show that the coupled model is able to reliably simulate the large-scale winter circulation, including the North Atlantic Oscillation and Eastern Atlantic patterns of variability, and its associated impacts on the mean Mediterranean hydroclimate. The model also realistically reproduces the regional features of daily heavy rainfall, which are absent in lower-resolution simulations. A five-member future projection ensemble, which assumes comparatively high greenhouse gas emissions (RCP8.5) until 2100, indicates a strong winter decline in Mediterranean precipitation for the coming decades. Consistent with dynamical and thermodynamical consequences of a warming atmosphere, the derived changes feature a distinct bipolar behavior, i.e. wetting in the north and drying in the south. Changes are most pronounced over the northwest African coast, where the projected winter precipitation decline reaches 40% of present values. Despite a decrease in mean precipitation, heavy rainfall indices show drastic increases across most of the Mediterranean, except the North African coast, which is under the strong influence of the cold Canary Current.

  13. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large-scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility to generate a periodic distribution with the characteristic scale 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low-temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)
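    A characteristic scale such as the 120 h⁻¹ Mpc discussed above shows up as a peak in the Fourier power spectrum of the density distribution. The following toy sketch uses synthetic 1-D data; the box size, noise level and all numbers are invented for illustration, not taken from the paper.

```python
import numpy as np

# Toy 1-D "density field": quasi-periodic fluctuations with a built-in
# 120 Mpc/h characteristic scale plus noise (all numbers invented).
L = 12000.0                                # box length in Mpc/h
n = 4096
x = np.linspace(0.0, L, n, endpoint=False)
rng = np.random.default_rng(1)
delta = np.cos(2 * np.pi * x / 120.0) + 0.5 * rng.standard_normal(n)

# The characteristic scale appears as a peak in the power spectrum.
power = np.abs(np.fft.rfft(delta - delta.mean())) ** 2
freqs = np.fft.rfftfreq(n, d=L / n)        # cycles per Mpc/h
k_peak = freqs[1:][np.argmax(power[1:])]   # skip the zero mode
print(round(1.0 / k_peak))                 # → 120
```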

  14. Hartle-Hawking wave function and large-scale power suppression of CMB*

    Directory of Open Access Journals (Sweden)

    Yeom Dong-han

    2018-01-01

    In this presentation, we first describe the Hartle-Hawking wave function in the Euclidean path integral approach. After introducing perturbations to the background instanton solution, following the formalism developed by Halliwell-Hawking and Laflamme, one can obtain the scale-invariant power spectrum for small scales. We further emphasize that the Hartle-Hawking wave function can explain the large-scale power suppression by choosing suitable potential parameters, where this will be a possible window to confirm or falsify models of quantum cosmology. Finally, we comment on possible future applications, e.g., Euclidean wormholes, which can result in distinct signatures in the power spectrum.
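    The large-scale power suppression mentioned above can be pictured with a purely phenomenological toy spectrum; the functional form and the cutoff value below are our illustration, not the Hartle-Hawking prediction.

```python
import numpy as np

def toy_spectrum(k, amplitude=1.0, k_cut=0.05):
    """Nearly scale-invariant spectrum damped below the cutoff wavenumber.

    The damping factor (1 - exp(-(k/k_cut)^2)) is an invented illustration
    of large-scale (small-k) power suppression, not the model's formula.
    """
    return amplitude * (1.0 - np.exp(-(k / k_cut) ** 2))

k = np.array([0.001, 0.01, 0.1, 1.0])   # small k = largest scales
print(toy_spectrum(k))                  # power strongly suppressed at small k
```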

  15. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: administrators carry a heavy workload, and much time must be spent on the management and maintenance of such systems. The nodes in a large-scale cluster easily become disorganized; with thousands of nodes housed in large machine rooms, administrators can easily confuse one machine with another. How can accurate management be carried out effectively in a large-scale cluster system? This article introduces ELFms in the large-scale cluster system, and proposes how to realize automatic management of large-scale cluster systems. (authors)
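    One small task such an automatic-management layer performs is flagging nodes that have stopped reporting. A minimal sketch, with hostnames, timestamps and the timeout all invented for illustration (ELFms itself is a far larger framework):

```python
def stale_nodes(heartbeats, now, timeout=60.0):
    """Return hostnames whose last heartbeat is older than `timeout` seconds.

    heartbeats: dict mapping hostname -> last heartbeat time (unix seconds).
    """
    return sorted(host for host, t in heartbeats.items() if now - t > timeout)

# Invented example inventory: node003 last reported 99 seconds ago.
beats = {"node001": 1000.0, "node002": 990.0, "node003": 901.0}
print(stale_nodes(beats, now=1000.0))  # → ['node003']
```

    A real system would feed this from a monitoring daemon and trigger automated repair or reinstallation, which is the kind of workflow the abstract describes.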

  16. COVE: a visual environment for ocean observatory design

    International Nuclear Information System (INIS)

    Grochow, K; Lazowska, E; Stoermer, M; Kelley, D; Delaney, J

    2008-01-01

    Physical, chemical, and biological ocean processes play a crucial role in determining Earth's environment. Unfortunately, our knowledge of these processes is limited because oceanography is carried out today largely the way it was a century ago: as expeditionary science, going to sea in ships and measuring a relatively small number of parameters (e.g., temperature, salinity, and pressure) as time and budget allow. The NSF Ocean Observatories Initiative is a US$330 million project that will help transform oceanography from a data-poor to a data-rich science. A cornerstone of this project is the deep water Regional Scale Nodes (RSN) that will be installed off the coasts of Washington and Oregon. The RSN will include 1500 km of fiber optic cable providing power and bandwidth to the seafloor and throughout the water column. Thousands of sensors will be deployed to stream data and imagery to shore, where they will be available in real time for ocean scientists and the public at large. The design of the RSN is a complex undertaking, requiring a combination of many different interactive tools and areas of visualization: geographic visualization to see the available seafloor bathymetry, scientific visualization to examine existing geospatially located datasets, layout tools to place the sensors, and collaborative tools to communicate across the team during the design. COVE, the Common Observatory Visualization Environment, is a visualization environment designed to meet all these needs. COVE has been built by computer scientists working closely with the engineering and scientific teams who will build and use the RSN. This paper discusses the data and activities of cabled observatory design, the design of COVE, and results from its use across the team

  17. Development of large scale production of Nd-doped phosphate glasses for megajoule-scale laser systems

    International Nuclear Information System (INIS)

    Ficini, G.; Campbell, J.H.

    1996-01-01

    Nd-doped phosphate glasses are the preferred gain medium for high-peak-power lasers used for Inertial Confinement Fusion research because they have excellent energy storage and extraction characteristics. In addition, these glasses can be manufactured defect-free in large sizes and at relatively low cost. To meet the requirements of the future megajoule-size lasers, advanced laser glass manufacturing methods are being developed that would enable laser glass to be continuously produced at the rate of several thousand large (790 × 440 × 44 mm³) plates of glass per year. This represents more than a 10 to 100-fold improvement in the scale of the present manufacturing technology

  18. Assessment of Future Whole-System Value of Large-Scale Pumped Storage Plants in Europe

    Directory of Open Access Journals (Sweden)

    Fei Teng

    2018-01-01

    This paper analyses the impacts and benefits of the pumped storage plant (PSP) and its upgrade to variable speed on generation and transmission capacity requirements, capital costs, system operating costs and carbon emissions in the future European electricity system. The combination of a deterministic system planning tool, the Whole-electricity System Investment Model (WeSIM), and a stochastic system operation optimisation tool, Advanced Stochastic Unit Commitment (ASUC), is used to analyse the whole-system value of PSP technology and to quantify the impact of European balancing market integration and other competing flexible technologies on the value of the PSP. Case studies on the Pan-European system demonstrate that PSPs can reduce the total system cost by up to €13 billion per annum by 2050 in a scenario with a high share of renewables. Upgrading the PSP to variable-speed drive enhances its long-term benefits by 10–20%. On the other hand, balancing market integration across Europe may potentially reduce the overall value of the variable-speed PSP, although the effect can vary across different European regions. The results also suggest that large-scale deployment of demand-side response (DSR) leads to a significant reduction in the value of PSPs, while the value of PSPs increases by circa 18% when the total European interconnection capacity is halved. The benefit of PSPs in reducing emissions is relatively negligible by 2030 but constitutes around 6–10% of total annual carbon emissions from the European power sector by 2050.
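    Part of the value such planning tools capture for storage comes from simple energy arbitrage. The sketch below illustrates only that single mechanism, with invented day-ahead prices and an assumed 75% round-trip efficiency; the paper's whole-system optimisation additionally values capacity, reserves and emissions.

```python
# Toy single-day arbitrage value of a pumped-storage plant: pump at the
# cheapest hour, generate at the most expensive one. Prices, plant size and
# the 75% round-trip efficiency are invented for illustration.
def arbitrage_value(prices, energy_mwh, efficiency=0.75):
    cost = energy_mwh * min(prices)                  # buy energy to pump
    revenue = energy_mwh * efficiency * max(prices)  # sell after losses
    return revenue - cost

# Illustrative hourly day-ahead prices in EUR/MWh
prices = [32, 30, 28, 27, 29, 35, 48, 60, 55, 50, 45, 44,
          43, 42, 41, 46, 52, 70, 75, 68, 58, 50, 40, 35]
print(arbitrage_value(prices, energy_mwh=1000))  # → 29250.0
```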

  19. TOPOLOGY OF A LARGE-SCALE STRUCTURE AS A TEST OF MODIFIED GRAVITY

    International Nuclear Information System (INIS)

    Wang Xin; Chen Xuelei; Park, Changbom

    2012-01-01

    The genus of the isodensity contours is a robust measure of the topology of a large-scale structure, and it is relatively insensitive to nonlinear gravitational evolution, galaxy bias, and redshift-space distortion. We show that the growth of density fluctuations is scale dependent even in the linear regime in some modified gravity theories, which opens a new possibility of testing the theories observationally. We propose to use the genus of the isodensity contours, an intrinsic measure of the topology of the large-scale structure, as a statistic to be used in such tests. In Einstein's general theory of relativity, density fluctuations grow at the same rate on all scales in the linear regime, and the genus per comoving volume is almost conserved as structures grow homologously, so we expect that the genus-smoothing-scale relation is basically time independent. However, in some modified gravity models where structures grow with different rates on different scales, the genus-smoothing-scale relation should change over time. This can be used to test the gravity models with large-scale structure observations. We study the cases of the f(R) theory, DGP braneworld theory as well as the parameterized post-Friedmann models. We also forecast how the modified gravity models can be constrained with optical/IR or redshifted 21 cm radio surveys in the near future.
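    The genus statistic measures the topology of excursion sets of the density field. As a toy 2-D analogue (the paper's genus is the 3-D version measured on smoothed fields), the Euler characteristic of a thresholded field, chi = components minus holes, can be computed by cubical counting: chi = vertices - edges + faces. All numbers below are illustrative.

```python
import numpy as np

def euler_characteristic_2d(mask):
    """Euler characteristic (components minus holes) of a binary 2-D mask,
    counted on the pixel cubical complex: chi = V - E + F."""
    m = np.asarray(mask, dtype=bool)
    V = int(m.sum())                                 # vertices (pixels on)
    E = int((m[:, :-1] & m[:, 1:]).sum() +           # horizontal edges
            (m[:-1, :] & m[1:, :]).sum())            # vertical edges
    F = int((m[:-1, :-1] & m[:-1, 1:] &
             m[1:, :-1] & m[1:, 1:]).sum())          # filled 2x2 faces
    return V - E + F

# Genus-like curve: topology of excursion sets as a function of threshold
rng = np.random.default_rng(3)
field = rng.standard_normal((64, 64))
curve = [euler_characteristic_2d(field > nu) for nu in (-1.0, 0.0, 1.0)]
```

    In the modified-gravity test described above, it is whether this curve, as a function of smoothing scale, stays the same over cosmic time that discriminates between models.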

  20. Compensating active power imbalances in power system with large-scale wind power penetration

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Altin, Müfit

    2016-01-01

    Large-scale wind power penetration can affect the supply continuity in the power system. This is a matter of high priority to investigate, as more regulating reserves and specified control strategies for generation control are required in the future power system with even more high wind power penetrat...

  1. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. 
It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  2. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Science.gov (United States)

    Dong, Xianlei; Bollen, Johan

    2015-01-01

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.
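    The core computational move here, relating an online search-volume time series to a confidence series, can be sketched as a lagged correlation scan. Everything below (the series, the lag, the noise level) is synthetic illustration; the paper's C3I construction is more elaborate.

```python
import numpy as np

def best_lagged_corr(search, confidence, max_lag=3):
    """Scan lags 0..max_lag (search leading confidence) and return the
    (lag, Pearson r) pair with the strongest correlation."""
    best_lag, best_r = 0, 0.0
    for lag in range(max_lag + 1):
        s = search[:len(search) - lag] if lag else search
        c = confidence[lag:]
        r = np.corrcoef(s, c)[0, 1]
        if abs(r) > abs(best_r):
            best_lag, best_r = lag, r
    return best_lag, best_r

# Synthetic demo: confidence follows search behavior two periods later.
rng = np.random.default_rng(2)
search = rng.standard_normal(120)
confidence = np.concatenate([rng.standard_normal(2), search[:-2]])
confidence = confidence + 0.1 * rng.standard_normal(120)
lag, r = best_lagged_corr(search, confidence)
print(lag)  # → 2
```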

  3. Computational models of consumer confidence from large-scale online attention data: crowd-sourcing econometrics.

    Directory of Open Access Journals (Sweden)

    Xianlei Dong

    Economies are instances of complex socio-technical systems that are shaped by the interactions of large numbers of individuals. The individual behavior and decision-making of consumer agents is determined by complex psychological dynamics that include their own assessment of present and future economic conditions as well as those of others, potentially leading to feedback loops that affect the macroscopic state of the economic system. We propose that the large-scale interactions of a nation's citizens with its online resources can reveal the complex dynamics of their collective psychology, including their assessment of future system states. Here we introduce a behavioral index of Chinese Consumer Confidence (C3I) that computationally relates large-scale online search behavior recorded by Google Trends data to the macroscopic variable of consumer confidence. Our results indicate that such computational indices may reveal the components and complex dynamics of consumer psychology as a collective socio-economic phenomenon, potentially leading to improved and more refined economic forecasting.

  4. Solving large scale structure in ten easy steps with COLA

    Energy Technology Data Exchange (ETDEWEB)

    Tassev, Svetlin [Department of Astrophysical Sciences, Princeton University, 4 Ivy Lane, Princeton, NJ 08544 (United States); Zaldarriaga, Matias [School of Natural Sciences, Institute for Advanced Study, Olden Lane, Princeton, NJ 08540 (United States); Eisenstein, Daniel J., E-mail: stassev@cfa.harvard.edu, E-mail: matiasz@ias.edu, E-mail: deisenstein@cfa.harvard.edu [Center for Astrophysics, Harvard University, 60 Garden Street, Cambridge, MA 02138 (United States)

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10^9 M_sun/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10^11 M_sun/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.
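    The LPT ingredient that COLA computes exactly can be illustrated with a toy first-order LPT (Zel'dovich) displacement in one dimension. The box size, spectrum slope and growth factor below are invented for illustration, and no force solver (the actual N-body part of COLA) is included.

```python
import numpy as np

# Toy 1-D Zel'dovich approximation (first-order LPT): displace particles
# from a uniform grid by a Gaussian random displacement field.
n, L = 256, 100.0                      # particles, box size (Mpc/h, toy)
q = np.arange(n) * L / n               # Lagrangian (initial grid) positions
rng = np.random.default_rng(4)

# Gaussian random displacement field with an illustrative spectrum slope
k = np.fft.rfftfreq(n, d=L / n) * 2 * np.pi
amp = np.zeros_like(k)
amp[1:] = k[1:] ** -1.0                # toy power-law amplitude
phases = rng.standard_normal(len(k)) + 1j * rng.standard_normal(len(k))
psi = np.fft.irfft(amp * phases, n)
psi *= 1.0 / psi.std()                 # normalize to unit rms displacement

D = 0.5                                # linear growth factor (toy epoch)
x = (q + D * psi) % L                  # Zel'dovich-displaced positions
```

    COLA evolves the residual between these LPT trajectories and the full dynamics, which is why so few timesteps suffice at large scales.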

  5. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continue to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  6. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  7. Large-scale impact of climate change vs. land-use change on future biome shifts in Latin America

    NARCIS (Netherlands)

    Boit, Alice; Sakschewski, Boris; Boysen, Lena; Cano-Crespo, Ana; Clement, Jan; Garcia-alaniz, Nashieli; Kok, Kasper; Kolb, Melanie; Langerwisch, Fanny; Rammig, Anja; Sachse, René; Eupen, van Michiel; Bloh, von Werner; Clara Zemp, Delphine; Thonicke, Kirsten

    2016-01-01

    Climate change and land-use change are two major drivers of biome shifts causing habitat and biodiversity loss. What is missing is a continental-scale future projection of the estimated relative impacts of both drivers on biome shifts over the course of this century. Here, we provide such a

  8. Review of Dynamic Modeling and Simulation of Large Scale Belt Conveyor System

    Science.gov (United States)

    He, Qing; Li, Hong

Belt conveyors are among the most important devices for transporting bulk solid material over long distances. Dynamic analysis is the key to deciding whether a design is technically sound, safe and reliable in operation, and economically feasible. It is therefore important to study dynamic properties in order to improve efficiency and productivity and to guarantee safe, reliable and stable conveyor operation. The dynamic research on and applications of large scale belt conveyors are discussed. The main research topics and the state of the art of dynamic research on belt conveyors are analyzed. The main future work focuses on dynamic analysis, modeling and simulation of the main components and of the whole system, and on nonlinear modeling, simulation and vibration analysis of large scale conveyor systems.
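Dynamic models of long belts typically discretize the belt into lumped masses joined by elastic elements, so that longitudinal stress waves during starting and stopping can be simulated. A minimal sketch of that idea (all parameters are illustrative, not taken from the review):

```python
import numpy as np

# Minimal lumped-parameter sketch of longitudinal belt dynamics: the belt is
# a chain of point masses joined by springs, the drive end is given a constant
# starting velocity, and we time the stress wave reaching the tail.
# All parameters are illustrative.

N = 100                      # number of lumped masses
k, m, dx = 100.0, 1.0, 1.0   # segment stiffness [N/m], mass [kg], length [m]
c = dx * np.sqrt(k / m)      # longitudinal wave speed = 10 m/s
v_drive = 1.0                # imposed drive-end velocity [m/s]
dt = 0.002

u = np.zeros(N)              # displacements
v = np.zeros(N)              # velocities
t, t_arrival = 0.0, None

while t < 20.0 and t_arrival is None:
    u[0] = v_drive * t                 # kinematically driven head pulley
    f = np.zeros(N)
    f[1:] += k * (u[:-1] - u[1:])      # spring force from the left neighbour
    f[1:-1] += k * (u[2:] - u[1:-1])   # spring force from the right neighbour
    v[1:] += dt * f[1:] / m            # semi-implicit Euler update
    u[1:] += dt * v[1:]
    t += dt
    if abs(u[-1]) > 1e-2:              # tail starts to move: wave has arrived
        t_arrival = t

# Continuum estimate of the arrival time is (N*dx)/c = 10 s, with the
# discrete chain adding a little dispersive delay.
```

Even this crude model reproduces the finite propagation time of the starting wave, which is exactly why quasi-static design rules are insufficient for long conveyors.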

  9. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales, already observed in other works, depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low-speed side of the mixing layer, and with a reduced activity on the high-speed side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the modulation of the small scales by the large-scale gradients has additionally been investigated.

  10. Computational challenges of large-scale, long-time, first-principles molecular dynamics

    International Nuclear Information System (INIS)

    Kent, P R C

    2008-01-01

    Plane wave density functional calculations have traditionally been able to use the largest available supercomputing resources. We analyze the scalability of modern projector-augmented wave implementations to identify the challenges in performing molecular dynamics calculations of large systems containing many thousands of electrons. Benchmark calculations on the Cray XT4 demonstrate that global linear-algebra operations are the primary reason for limited parallel scalability. Plane-wave related operations can be made sufficiently scalable. Improving parallel linear-algebra performance is an essential step to reaching longer timescales in future large-scale molecular dynamics calculations
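The limited-scalability argument above can be caricatured with a simple strong-scaling model in which local plane-wave work shrinks as 1/P while each global reduction in the dense linear algebra adds a latency term growing like log2(P). The constants below are hypothetical, not Cray XT4 measurements:

```python
import math

# Toy strong-scaling model (hypothetical constants, not benchmark data):
# local work scales as 1/P, but each global all-reduce in the dense linear
# algebra (orthogonalisation, subspace diagonalisation) costs latency*log2(P).

T_WORK = 1000.0     # serial compute time per MD step [arbitrary units]
N_ALLREDUCE = 200   # global reductions per MD step
LATENCY = 0.02      # cost per all-reduce tree level [same units]

def step_time(p):
    return T_WORK / p + N_ALLREDUCE * LATENCY * math.log2(p)

def speedup(p):
    return step_time(1) / step_time(p)

# Speedup rises, peaks, then falls: communication eventually dominates.
s = {p: speedup(p) for p in (16, 256, 4096, 65536)}
```

The model's qualitative message matches the abstract: past some process count, adding cores makes each molecular-dynamics step slower, so reaching longer timescales requires attacking the global linear-algebra communication itself.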

  11. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties, such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  12. Seafloor Observatory Science: a Review

    Directory of Open Access Journals (Sweden)

    L. Beranzoli

    2006-06-01

Full Text Available The ocean exerts a pervasive influence on Earth’s environment. It is therefore important that we learn how this system operates (NRC, 1998b; 1999). For example, the ocean is an important regulator of climate change (e.g., IPCC, 1995). Understanding the link between natural and anthropogenic climate change and ocean circulation is essential for predicting the magnitude and impact of future changes in Earth’s climate. Understanding the ocean, and the complex physical, biological, chemical, and geological systems operating within it, should be an important goal for the opening decades of the 21st century. Another fundamental reason for increasing our understanding of ocean systems is that the global economy is highly dependent on the ocean (e.g., for tourism, fisheries, hydrocarbons, and mineral resources; Summerhayes, 1996). The establishment of a global network of seafloor observatories will help to provide the means to accomplish this goal. These observatories will have power and communication capabilities and will provide support for spatially distributed sensing systems and mobile platforms. Sensors and instruments will potentially collect data from above the air-sea interface to below the seafloor. Seafloor observatories will also be a powerful complement to satellite measurement systems by providing the ability to collect vertically distributed measurements within the water column for use with the spatial measurements acquired by satellites, while also providing the capability to calibrate remotely sensed satellite measurements (NRC, 2000). Ocean observatory science has already had major successes. For example the TAO array has enabled the detection, understanding and prediction of El Niño events (e.g., Fujimoto et al., 2003). This paper is a world-wide review of the new emerging “Seafloor Observatory Science”, and describes both the scientific motivations for seafloor observatories and the technical solutions applied to their architecture. A

  13. Towards large-scale plasma-assisted synthesis of nanowires

    Science.gov (United States)

    Cvelbar, U.

    2011-05-01

    Large quantities of nanomaterials, e.g. nanowires (NWs), are needed to overcome the high market price of nanomaterials and make nanotechnology widely available for general public use and applications to numerous devices. Therefore, there is an enormous need for new methods or routes for synthesis of those nanostructures. Here plasma technologies for synthesis of NWs, nanotubes, nanoparticles or other nanostructures might play a key role in the near future. This paper presents a three-dimensional problem of large-scale synthesis connected with the time, quantity and quality of nanostructures. Herein, four different plasma methods for NW synthesis are presented in contrast to other methods, e.g. thermal processes, chemical vapour deposition or wet chemical processes. The pros and cons are discussed in detail for the case of two metal oxides: iron oxide and zinc oxide NWs, which are important for many applications.

  14. REQUIREMENTS FOR SYSTEMS DEVELOPMENT LIFE CYCLE MODELS FOR LARGE-SCALE DEFENSE SYSTEMS

    Directory of Open Access Journals (Sweden)

    Kadir Alpaslan DEMIR

    2015-10-01

Full Text Available Large-scale defense system projects are strategic for maintaining and increasing the national defense capability. Therefore, governments spend billions of dollars in the acquisition and development of large-scale defense systems. The scale of defense systems is always increasing and the costs to build them are skyrocketing. Today, defense systems are software intensive and they are either a system of systems or a part of one. Historically, the project performances observed in the development of these systems have been significantly poor when compared to other types of projects. It is obvious that the currently used systems development life cycle models are insufficient to address today’s challenges of building these systems. Using a systems development life cycle model that is specifically designed for large-scale defense system developments and is effective in dealing with today’s and near-future challenges will help to improve project performances. The first step in the development of a large-scale defense systems development life cycle model is the identification of requirements for such a model. This paper contributes to the body of literature in the field by providing a set of requirements for systems development life cycle models for large-scale defense systems. Furthermore, a research agenda is proposed.

  15. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  16. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009
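Among the keyword topics above, limited-memory quasi-Newton methods are the standard workhorse for large-scale smooth optimization. Their core is the L-BFGS two-loop recursion, sketched here in a generic textbook form (this is not the published LSA Fortran code): the last M curvature pairs (s, y) stand in for the Hessian, keeping memory and per-iteration work at O(M·n).

```python
import numpy as np

# Textbook L-BFGS with the two-loop recursion and Armijo backtracking
# (a generic sketch, not the LSA library routines). The implicit inverse
# Hessian is built from the last m curvature pairs (s, y).

def lbfgs(f, grad, x0, m=10, iters=100, tol=1e-8):
    x = x0.copy()
    g = grad(x)
    S, Y = [], []
    for _ in range(iters):
        # Two-loop recursion: q becomes H*g for the implicit L-BFGS Hessian.
        q = g.copy()
        alphas = []
        for s, y in zip(reversed(S), reversed(Y)):   # newest pair first
            a = np.dot(s, q) / np.dot(y, s)
            alphas.append(a)
            q -= a * y
        if S:  # scale by gamma = s.y / y.y (standard initial Hessian choice)
            q *= np.dot(S[-1], Y[-1]) / np.dot(Y[-1], Y[-1])
        for (s, y), a in zip(zip(S, Y), reversed(alphas)):  # oldest first
            b = np.dot(y, q) / np.dot(y, s)
            q += (a - b) * s
        d = -q  # descent direction

        # Armijo backtracking line search.
        t, fx, gd = 1.0, f(x), np.dot(g, d)
        while f(x + t * d) > fx + 1e-4 * t * gd:
            t *= 0.5

        x_new = x + t * d
        g_new = grad(x_new)
        S.append(x_new - x)
        Y.append(g_new - g)
        if len(S) > m:     # drop the oldest pair: limited memory
            S.pop(0); Y.pop(0)
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

# Demo: an ill-conditioned convex quadratic 0.5*x'Dx - b'x with diagonal D.
diag = np.linspace(1.0, 100.0, 200)
b = np.ones(200)
xmin = lbfgs(lambda x: 0.5 * x @ (diag * x) - b @ x,
             lambda x: diag * x - b, np.zeros(200))
```

The limited memory is what distinguishes this from full BFGS: the Hessian approximation is never formed, only applied, which is the property that scales to the problem sizes the article targets.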

  17. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

© 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale, via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low-pass filtered (large scales) velocity fields tends to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
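The amplitude-modulation analysis used in these studies can be illustrated on a synthetic one-dimensional signal (this stands in for the DNS fields, which are not reproduced here): low-pass filter to isolate the large scales, take a local r.m.s. of the high-pass residual as the "small-scale activity", and correlate the two.

```python
import numpy as np

# Synthetic illustration of large-scale amplitude modulation of small scales
# (illustrative 1-D signal, not the mixing-layer DNS data): the small-scale
# amplitude is constructed to follow the large-scale signal, and the analysis
# recovers this via filtering and a local r.m.s. measure of activity.

rng = np.random.default_rng(0)
n = 40000
x = np.linspace(0, 40 * np.pi, n)
u_large = np.sin(0.5 * x)                       # large-scale motion
envelope = 1.0 + 0.5 * u_large                  # modulation by large scales
u_small = 0.1 * envelope * np.sin(60.0 * x + rng.uniform(0, 2 * np.pi))
u = u_large + u_small

def moving_average(sig, w):
    return np.convolve(sig, np.ones(w) / w, mode="same")

w = 400                                          # filter width between scales
uL = moving_average(u, w)                        # low-pass: large scales
uS = u - uL                                      # high-pass: small scales
activity = np.sqrt(moving_average(uS**2, w))     # local small-scale r.m.s.

# Correlation between the large-scale signal and the small-scale activity
# (interior points only, to avoid filter edge effects).
r = np.corrcoef(uL[w:-w], activity[w:-w])[0, 1]
```

In the real flow the interesting step is the one the abstract describes: replacing uL by its gradient in the correlation, which is what makes the coefficient nearly uniform across the layer.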

  18. Construction and testing of a large scale prototype of a silicon tungsten electromagnetic calorimeter for a future lepton collider

    International Nuclear Information System (INIS)

    Rouëné, Jérémy

    2013-01-01

The CALICE collaboration is preparing large scale prototypes of highly granular calorimeters for detectors to be operated at a future linear electron-positron collider. After several beam campaigns at DESY, CERN and FNAL, the CALICE collaboration has demonstrated the principle of highly granular electromagnetic calorimeters with a first prototype called the physics prototype. The next prototype, called the technological prototype, addresses the engineering challenges which come along with the realisation of highly granular calorimeters. This prototype will comprise 30 layers where each layer is composed of four 9×9 cm{sup 2} silicon wafers. The front end electronics is integrated into the detector layers. The size of each pixel is 5×5 mm{sup 2}. This prototype is entering its construction phase. We present results from the first layers of the technological prototype obtained during beam test campaigns in spring and summer 2012. According to these results, the signal-to-noise ratio of the detector exceeds the R&D goal of 10:1.

  19. Progress Report on the US Critical Zone Observatory Program

    Science.gov (United States)

    Barrera, E. C.

    2014-12-01

The Critical Zone Observatory (CZO) program supported by the National Science Foundation originated from the recommendation of the Earth Science community, published in the National Research Council report "Basic Research Opportunities in Earth Sciences" (2001), to establish natural laboratories to study processes and systems of the Critical Zone - the surface and near-surface environment sustaining nearly all terrestrial life. After a number of critical zone community workshops to develop a science plan, the CZO program was initiated in 2007 with three sites and has now grown to 10 sites and a National Office, which coordinates research, education and outreach activities of the network. Several of the CZO sites are collocated with sites supported by the US Long Term Ecological Research (LTER) and the Long Term Agricultural Research (LTAR) programs, and the National Ecological Observatory Network (NEON). Future collaboration with additional sites of these networks will add to the potential to answer questions about critical zone form and function more comprehensively and at a larger regional scale. At the international level, CZOs have been established in many countries and strong collaborations with the US program have been in place for many years. The next step is the development of a coordinated international program of critical zone research. The success of the CZO network of sites can be measured in transformative results that elucidate properties and processes controlling the critical zone and how the critical zone structure, stores and fluxes respond to climate and land use change. This understanding of the critical zone can be used to enhance resilience and sustainability, and restore ecosystem function. Thus, CZO science can address major societal challenges. The US CZO network is a facility open to research by the critical zone community at large. Scientific data and information about the US program are available at www.criticalzone.org.

  20. The Need for Large-Scale, Longitudinal Empirical Studies in Middle Level Education Research

    Science.gov (United States)

    Mertens, Steven B.; Caskey, Micki M.; Flowers, Nancy

    2016-01-01

    This essay describes and discusses the ongoing need for large-scale, longitudinal, empirical research studies focused on middle grades education. After a statement of the problem and concerns, the essay describes and critiques several prior middle grades efforts and research studies. Recommendations for future research efforts to inform policy…

  1. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  2. The National Astronomical Observatory of Japan and Post-war Japanese Optical Astronomy

    Science.gov (United States)

    Tajima, Toshiyuki

This paper depicts some aspects of the formative process of the Japanese optical and infrared astronomical community in the post-war period, featuring the transition of the National Astronomical Observatory of Japan (NAOJ). We take up three cases of telescope construction, examining their background and their contribution to the Japanese astronomical community. Through these cases, the characteristics of the traditions and cultures of optical and infrared astronomy in Japan are considered. Although the Tokyo Astronomical Observatory (TAO) of the University of Tokyo, the predecessor of NAOJ, was originally founded as an agency for practical astronomical observation such as time and almanac service, it has become an international centre for all types of astrophysical research. Research and development of telescopes and observational instruments have become an important part of the astronomers' practice. Now, however, a number of Japanese universities are planning to have their own large to middle-sized telescopes, and a new style of astronomical research is emerging involving astrophysical studies utilising data acquired from the Virtual Observatory, so there is a distinct possibility that the status of the NAOJ will change even further in the future.

  3. Nonlinear evolution of large-scale structure in the universe

    International Nuclear Information System (INIS)

    Frenk, C.S.; White, S.D.M.; Davis, M.

    1983-01-01

Using N-body simulations we study the nonlinear development of primordial density perturbations in an Einstein--de Sitter universe. We compare the evolution of an initial distribution without small-scale density fluctuations to evolution from a random Poisson distribution. These initial conditions mimic the assumptions of the adiabatic and isothermal theories of galaxy formation. The large-scale structures which form in the two cases are markedly dissimilar. In particular, the correlation function xi(r) and the visual appearance of our adiabatic (or ''pancake'') models match better the observed distribution of galaxies. This distribution is characterized by large-scale filamentary structure. Because the pancake models do not evolve in a self-similar fashion, the slope of xi(r) steepens with time; as a result there is a unique epoch at which these models fit the galaxy observations. We find the ratio of cutoff length to correlation length at this time to be lambda{sub min}/r{sub 0} = 5.1; its expected value in a neutrino dominated universe is 4(Ωh){sup -1} (H{sub 0} = 100h km s{sup -1} Mpc{sup -1}). At early epochs these models predict a negligible amplitude for xi(r) and could explain the lack of measurable clustering in the Lyα absorption lines of high-redshift quasars. However, large-scale structure in our models collapses after z = 2. If this collapse precedes galaxy formation as in the usual pancake theory, galaxies formed uncomfortably recently. The extent of this problem may depend on the cosmological model used; the present series of experiments should be extended in the future to include models with Ω<1.
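The statistic xi(r) used above is, in practice, estimated by pair counting against a random catalogue. A minimal sketch of the "natural" estimator DD/RR - 1 on synthetic points (not the paper's N-body data; real analyses use refinements such as the Landy-Szalay estimator and survey-geometry corrections):

```python
import numpy as np

# Minimal two-point correlation function estimator xi(r) = DD/RR - 1 on
# synthetic points in a unit box. Illustrative only: no periodic or
# survey-geometry corrections, and the simple "natural" estimator is used.

rng = np.random.default_rng(1)

def pair_dists(p):
    """All unique pairwise distances (fine for a few thousand points)."""
    d = p[:, None, :] - p[None, :, :]
    r = np.sqrt((d**2).sum(-1))
    return r[np.triu_indices(len(p), k=1)]

# Clustered sample: points scattered tightly around random cluster centres.
centres = rng.uniform(0, 1, size=(50, 3))
data = (centres[:, None, :] + 0.02 * rng.normal(size=(50, 20, 3))).reshape(-1, 3)
rand = rng.uniform(0, 1, size=(len(data), 3))   # unclustered comparison set

bins = np.linspace(0.01, 0.5, 11)
dd, _ = np.histogram(pair_dists(data), bins=bins)
rr, _ = np.histogram(pair_dists(rand), bins=bins)
xi = dd / np.maximum(rr, 1) - 1.0   # natural estimator DD/RR - 1
```

For the clustered sample, xi is strongly positive in the smallest separation bin and falls toward zero at separations larger than the cluster size, which is the qualitative shape the paper's models are compared against.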

  4. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  5. Properties of large-scale methane/hydrogen jet fires

    Energy Technology Data Exchange (ETDEWEB)

    Studer, E. [CEA Saclay, DEN, LTMF Heat Transfer and Fluid Mech Lab, 91 - Gif-sur-Yvette (France); Jamois, D.; Leroy, G.; Hebrard, J. [INERIS, F-60150 Verneuil En Halatte (France); Jallais, S. [Air Liquide, F-78350 Jouy En Josas (France); Blanchetiere, V. [GDF SUEZ, 93 - La Plaine St Denis (France)

    2009-12-15

A future economy based on the reduction of carbon-based fuels for power generation and transportation may consider hydrogen as a possible energy carrier. Extensive and widespread use of hydrogen might require a pipeline network. The alternatives might be the use of the existing natural gas network or the design of a dedicated network. Whatever the solution, mixing hydrogen with natural gas will substantially modify the consequences of accidents. The French National Research Agency (ANR) funded project called HYDROMEL focuses on these critical questions. Within this project, large-scale jet fires have been studied experimentally and numerically. The main characteristics of these flames, including visible length, radiation fluxes and blowout, have been assessed. (authors)

  6. Space Active Optics: toward optimized correcting mirrors for future large spaceborne observatories

    Science.gov (United States)

    Laslandes, Marie; Hugot, Emmanuel; Ferrari, Marc; Lemaitre, Gérard; Liotard, Arnaud

    2011-10-01

Wave-front correction in optical instruments is often needed, either to compensate for Optical Path Differences, off-axis aberrations or mirror deformations. Active optics techniques are developed to allow efficient corrections with deformable mirrors. In this paper, we present the design of particular deformation systems which could be used in space telescopes and instruments in order to improve their performance while relaxing specifications on the stability of the global system. A first section is dedicated to the design and performance analysis of an active mirror specifically designed to compensate for aberrations that might appear in future 3m-class space telescopes, due to lightweight primary mirrors, thermal variations or weightless conditions. A second section is dedicated to a brand new design of active mirror, able to compensate for given combinations of aberrations with a single actuator. If the aberrations to be corrected in an instrument and their evolution are known in advance, an optimal system geometry can be determined thanks to elasticity theory and Finite Element Analysis.

  7. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    Science.gov (United States)

    Jensen, Tue V.; Pinson, Pierre

    2017-11-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset, open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.
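The step of scaling the generation series "according to the envisioned degrees of renewable penetration" amounts to multiplying the weather-driven output by a single factor so that its energy share of demand over the period hits a target. A schematic sketch with made-up time series (not the actual RE-Europe data files or schema):

```python
import numpy as np

# Schematic of scaling renewable generation to a target energy penetration
# (toy hourly series, not the RE-Europe dataset): one multiplicative factor
# makes the renewable energy share of total demand equal the target.

rng = np.random.default_rng(2)
hours = 24 * 365
demand = 50.0 + 10.0 * rng.random(hours)     # GW, toy demand profile
renewable = 12.0 * rng.random(hours)         # GW, toy wind+solar output

def scale_to_penetration(gen, load, target_share):
    """Return gen scaled so that sum(gen)/sum(load) equals target_share."""
    factor = target_share * load.sum() / gen.sum()
    return factor * gen

scaled = scale_to_penetration(renewable, demand, target_share=0.5)
share = scaled.sum() / demand.sum()
```

Uniform scaling preserves the weather-driven temporal structure of the series, which is the property that makes such scaled scenarios useful for studying forecasting and coordination under high penetration.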

  9. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  10. Large-scale Meteorological Patterns Associated with Extreme Precipitation Events over Portland, OR

    Science.gov (United States)

    Aragon, C.; Loikith, P. C.; Lintner, B. R.; Pike, M.

    2017-12-01

Extreme precipitation events can have profound impacts on human life and infrastructure, with broad implications across a range of stakeholders. Changes to extreme precipitation events are a projected outcome of climate change that warrants further study, especially at regional to local scales. While global climate models are generally capable of simulating mean climate at global to regional scales with reasonable skill, resiliency and adaptation decisions are made at local scales, where most state-of-the-art climate models are limited by coarse resolution. Characterization of large-scale meteorological patterns associated with extreme precipitation events at local scales can provide climatic information without this scale limitation, thus facilitating stakeholder decision-making. This research will use synoptic climatology as a tool by which to characterize the key large-scale meteorological patterns associated with extreme precipitation events in the Portland, Oregon metro region. Composite analysis of meteorological patterns associated with extreme precipitation days, and associated watershed-specific flooding, is employed to enhance understanding of the climatic drivers behind such events. The self-organizing maps approach is then used to characterize the within-composite variability of the large-scale meteorological patterns associated with extreme precipitation events, allowing us to better understand the different types of meteorological conditions that lead to high-impact precipitation events and associated hydrologic impacts. A more comprehensive understanding of the meteorological drivers of extremes will aid in evaluating the ability of climate models to capture key patterns associated with extreme precipitation over Portland and in better interpreting projections of future climate at impact-relevant scales.
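The composite step can be sketched with synthetic fields standing in for reanalysis data: define extreme days from the precipitation tail, then average the large-scale field anomalies over those days. All data below are synthetic and illustrative:

```python
import numpy as np

# Composite analysis sketch on synthetic data (stand-ins for reanalysis
# fields): a fixed circulation pattern drives precipitation, and averaging
# the field anomalies over extreme-precipitation days recovers that pattern.

rng = np.random.default_rng(3)
ndays, ny, nx = 3000, 20, 30
pattern = np.outer(np.sin(np.linspace(0, np.pi, ny)),
                   np.cos(np.linspace(0, 2 * np.pi, nx)))  # driving pattern
amp = rng.normal(size=ndays)                               # daily amplitude
field = amp[:, None, None] * pattern + 0.5 * rng.normal(size=(ndays, ny, nx))
precip = np.maximum(0.0, amp + 0.3 * rng.normal(size=ndays))  # driven precip

threshold = np.quantile(precip, 0.95)     # define "extreme" days (top 5%)
extreme = precip >= threshold
anom = field - field.mean(axis=0)         # anomalies from the climatology
composite = anom[extreme].mean(axis=0)    # composite over extreme days

# The composite should resemble the driving circulation pattern.
r = np.corrcoef(composite.ravel(), pattern.ravel())[0, 1]
```

The self-organizing maps step described in the abstract would then cluster the individual extreme-day fields to expose the variability that a single composite averages away.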

  11. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done so far by SINTEF Energy Research shows that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  12. Development of the simulation package 'ELSES' for extra-large-scale electronic structure calculation

    Energy Technology Data Exchange (ETDEWEB)

    Hoshi, T [Department of Applied Mathematics and Physics, Tottori University, Tottori 680-8550 (Japan); Fujiwara, T [Core Research for Evolutional Science and Technology, Japan Science and Technology Agency (CREST-JST) (Japan)

    2009-02-11

    An early-stage version of the simulation package 'ELSES' (extra-large-scale electronic structure calculation) is developed for simulating the electronic structure and dynamics of large systems, particularly nanometer-scale and ten-nanometer-scale systems (see www.elses.jp). Input and output files are written in the extensible markup language (XML) style for general users. Related pre-/post-simulation tools are also available. A practical workflow and an example are described. A test calculation for the GaAs bulk system is shown, to demonstrate that the present code can handle systems with more than one atom species. Several future aspects are also discussed.

  13. Using unplanned fires to help suppressing future large fires in Mediterranean forests.

    Directory of Open Access Journals (Sweden)

    Adrián Regos

    Full Text Available Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale.
We do, however, suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved.

  14. Using unplanned fires to help suppressing future large fires in Mediterranean forests.

    Science.gov (United States)

    Regos, Adrián; Aquilué, Núria; Retana, Javier; De Cáceres, Miquel; Brotons, Lluís

    2014-01-01

    Despite the huge resources invested in fire suppression, the impact of wildfires has considerably increased across the Mediterranean region since the second half of the 20th century. Modulating fire suppression efforts in mild weather conditions is an appealing but hotly-debated strategy to use unplanned fires and associated fuel reduction to create opportunities for suppression of large fires in future adverse weather conditions. Using a spatially-explicit fire-succession model developed for Catalonia (Spain), we assessed this opportunistic policy by using two fire suppression strategies that reproduce how firefighters in extreme weather conditions exploit previous fire scars as firefighting opportunities. We designed scenarios by combining different levels of fire suppression efficiency and climatic severity for a 50-year period (2000-2050). An opportunistic fire suppression policy induced large-scale changes in fire regimes and decreased the area burnt under extreme climate conditions, but only accounted for up to 18-22% of the area to be burnt in reference scenarios. The area suppressed in adverse years tended to increase in scenarios with increasing amounts of area burnt during years dominated by mild weather. Climate change had counterintuitive effects on opportunistic fire suppression strategies. Climate warming increased the incidence of large fires under uncontrolled conditions but also indirectly increased opportunities for enhanced fire suppression. Therefore, to shift fire suppression opportunities from adverse to mild years, we would require a disproportionately large amount of area burnt in mild years. We conclude that the strategic planning of fire suppression resources has the potential to become an important cost-effective fuel-reduction strategy at large spatial scale. 
We do, however, suggest that this strategy should probably be accompanied by other fuel-reduction treatments applied at broad scales if large-scale changes in fire regimes are to be achieved.

  15. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    ]; Peach et al., 1998; DeSante et al., 2001 are generally co-ordinated by ringing centres such as those that make up the membership of EURING. In some countries volunteer census work (often called Breeding Bird Surveys) is undertaken by the same organizations, while in others different bodies may co-ordinate this aspect of the work. This session was concerned with the analysis of such extensive data sets and the approaches that are being developed to address the key theoretical and applied issues outlined above. The papers reflect the development of more spatially explicit approaches to analyses of data gathered at large spatial scales. They show that while the statistical tools that have been developed in recent years can be used to derive useful biological conclusions from such data, there is additional need for further developments. Future work should also consider how best to implement such analytical developments within future study designs. In his plenary paper Andy Royle (Royle, 2004) addresses this theme directly by describing a general framework for modelling spatially replicated abundance data. The approach is based on the idea that a set of spatially referenced local populations constitutes a metapopulation, within which local abundance is determined as a random process. This provides an elegant and general approach in which the metapopulation model as described above is combined with a data-generating model specific to the type of data being analysed to define a simple hierarchical model that can be analysed using conventional methods. It should be noted, however, that further software development will be needed if the approach is to be made readily available to biologists. The approach is well suited to dealing with sparse data and avoids the need for data aggregation prior to analysis. Spatial synchrony has received most attention in studies of species whose populations show cyclic fluctuations, particularly certain game birds and small mammals.
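    Royle's hierarchical framework pairs a random local-abundance process with a data-generating model for the observed counts. A minimal simulation of that structure (NumPy; the Poisson mean, detection probability, and survey dimensions are assumed purely for illustration):

```python
import numpy as np

# Sketch of the hierarchical (metapopulation) abundance model described
# above: latent local abundance N_i at each site is a Poisson draw, and
# each repeat survey records a binomial count given a detection
# probability p. lambda_, p, and the survey dimensions are assumptions.
rng = np.random.default_rng(42)
n_sites, n_visits = 200, 3
lambda_, p = 5.0, 0.4

N = rng.poisson(lambda_, size=n_sites)                   # latent abundance
counts = rng.binomial(N[:, None], p, size=(n_sites, n_visits))

# A naive moment check: marginally, counts ~ Poisson(lambda * p),
# so the mean count should be close to lambda * p.
mean_count = counts.mean()
expected = lambda_ * p
```

    Fitting this model to real ringing or census data (rather than simulating from it) is where the further software development noted in the abstract comes in.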

  16. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R&D.

  17. Large-Scale Astrophysical Visualization on Smartphones

    Science.gov (United States)

    Becciani, U.; Massimino, P.; Costa, A.; Gheller, C.; Grillo, A.; Krokos, M.; Petta, C.

    2011-07-01

    Nowadays, digital sky surveys and long-duration, high-resolution numerical simulations using high performance computing and grid systems produce multidimensional astrophysical datasets on the order of several petabytes. Sharing visualizations of such datasets within communities and collaborating research groups is of paramount importance for disseminating results and advancing astrophysical research. Moreover, educational and public outreach programs can benefit greatly from novel ways of presenting these datasets by promoting understanding of complex astrophysical processes, e.g., formation of stars and galaxies. We have previously developed VisIVO Server, a grid-enabled platform for high-performance large-scale astrophysical visualization. This article reviews the latest developments on VisIVO Web, a custom designed web portal wrapped around VisIVO Server, then introduces VisIVO Smartphone, a gateway connecting VisIVO Web and data repositories for mobile astrophysical visualization. We discuss current work and summarize future developments.

  18. A framework for cross-observatory volcanological database management

    Science.gov (United States)

    Aliotta, Marco Antonio; Amore, Mauro; Cannavò, Flavio; Cassisi, Carmelo; D'Agostino, Marcello; Dolce, Mario; Mastrolia, Andrea; Mangiagli, Salvatore; Messina, Giuseppe; Montalto, Placido; Fabio Pisciotta, Antonino; Prestifilippo, Michele; Rossi, Massimo; Scarpato, Giovanni; Torrisi, Orazio

    2017-04-01

    In recent years, it has been clearly shown that the multiparametric approach is the winning strategy to investigate the complex dynamics of volcanic systems. This involves the use of different sensor networks, each one dedicated to the acquisition of particular data useful for research and monitoring. The increasing interest devoted to the study of volcanological phenomena led to the constitution of different research organizations or observatories, sometimes covering the same volcanoes, which acquire large amounts of data from sensor networks for multiparametric monitoring. At INGV we developed a framework, hereinafter called TSDSystem (Time Series Database System), which makes it possible to acquire data streams from several geophysical and geochemical permanent sensor networks (also represented by different data sources such as ASCII, ODBC, URL, etc.), located on the main volcanic areas of Southern Italy, and relate them within a relational database management system. Furthermore, spatial data related to the different datasets are managed using a GIS module for sharing and visualization purposes. The standardization provides the ability to perform operations, such as query and visualization, on many measures, synchronizing them on a common space and time scale. In order to share data between INGV observatories, and also with Civil Protection, whose activity covers the same volcanic districts, we designed a "Master View" system that, starting from the implementation of a number of instances of the TSDSystem framework (one for each observatory), makes possible the joint interrogation of data, both temporal and spatial, on instances located in different observatories, through the use of web services technology (RESTful, SOAP). Similarly, it provides metadata for equipment using standard schemas (such as FDSN StationXML).
The "Master View" is also responsible for managing the data policy through a "who owns what" system, which allows viewing/download permissions to be associated with each dataset and its owning observatory.
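    The core of such a framework is placing heterogeneous streams on a common time scale before joint interrogation. A minimal sketch of that alignment step (pure Python; the function, the instance names, and the sample values are illustrative, not the actual TSDSystem API):

```python
from bisect import bisect_left

def align_to_grid(series, grid, tolerance):
    """Map an irregular (timestamp, value) series onto a common time grid,
    taking the nearest sample within `tolerance` seconds, else None."""
    times = [t for t, _ in series]
    out = []
    for t in grid:
        i = bisect_left(times, t)
        best = None
        # Check the two neighbors around the insertion point.
        for j in (i - 1, i):
            if 0 <= j < len(times) and abs(times[j] - t) <= tolerance:
                if best is None or abs(times[j] - t) < abs(times[best] - t):
                    best = j
        out.append(series[best][1] if best is not None else None)
    return out

# Two hypothetical instances sampling the same volcano at different rates.
seismic = [(0, 1.0), (10, 1.2), (20, 1.1), (30, 1.4)]
geochem = [(2, 7.5), (19, 7.9), (41, 8.2)]
grid = [0, 10, 20, 30, 40]

merged = {
    "seismic": align_to_grid(seismic, grid, tolerance=5),
    "geochem": align_to_grid(geochem, grid, tolerance=5),
}
```

    Once every stream is expressed on the shared grid, cross-network queries and joint visualization reduce to simple column operations.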

  19. Large-scale computation at PSI scientific achievements and future requirements

    International Nuclear Information System (INIS)

    Adelmann, A.; Markushin, V.

    2008-11-01

    ' (SNSP-HPCN) is discussing this complex. Scientific results made possible by PSI's engagement at CSCS (named Horizon) are summarised, and PSI's future high-performance computing requirements are evaluated. The data collected show the current situation, and a 5-year extrapolation of the users' needs with respect to HPC resources is made. Consequently, this report can serve as a basis for future strategic decisions on an HPC road-map for PSI, which does not yet exist. PSI's institutional HPC area started, hardware-wise, in approximately 1999 with the assembly of a 32-processor LINUX cluster called Merlin. Merlin was upgraded several times, most recently in 2007. The Merlin cluster at PSI is used for small-scale parallel jobs and is the only general-purpose computing system at PSI. Several dedicated small-scale clusters followed the Merlin scheme. Many of the clusters are used to analyse data from experiments at PSI or CERN, because dedicated clusters are most efficient. The intellectual and financial involvement in the procurement (including a machine update in 2007) results in a PSI share of 25% of the available computing resources at CSCS. The (over)usage of available computing resources by PSI scientists is demonstrated: we actually get more computing cycles than we have paid for. The reason is the fair-share policy implemented on the Horizon machine. This policy allows us to obtain cycles, at low priority, even when our bi-monthly share is used up.
Five important observations can be drawn from the analysis of the scientific output and the survey of future requirements of the main PSI HPC users: (1) high-performance computing is a main pillar in many important PSI research areas; (2) there is a shortfall on the order of 10 times the current computing resources (measured in available core-hours per year); (3) there is a trend towards using on the order of 600 processors per average production run; (4) the disk and tape storage growth is dramatic; (5) small HPC clusters located
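    The fair-share behaviour described above (jobs keep running, at low priority, once the bi-monthly share is exhausted) can be illustrated with a toy priority function. The exponential form and decay constant are assumptions for illustration, not the actual Horizon scheduler configuration:

```python
import math

def fair_share_priority(used_hours, share_hours, decay=1.0):
    """Toy fair-share priority: 1.0 for an untouched allocation,
    decaying smoothly toward 0 as usage exceeds the share, so that
    over-quota jobs still run, just at low priority."""
    usage_ratio = used_hours / share_hours
    return math.exp(-decay * usage_ratio)

fresh = fair_share_priority(0, 1000)        # untouched allocation
at_quota = fair_share_priority(1000, 1000)  # share fully used
over = fair_share_priority(3000, 1000)      # 3x over-subscription
```

    The key property, mirrored in the report's observation, is that priority never reaches zero: a group past its share can still accumulate cycles whenever the machine has idle capacity.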

  20. Large-scale computation at PSI scientific achievements and future requirements

    Energy Technology Data Exchange (ETDEWEB)

    Adelmann, A.; Markushin, V

    2008-11-15

    and Networking' (SNSP-HPCN) is discussing this complex. Scientific results made possible by PSI's engagement at CSCS (named Horizon) are summarised, and PSI's future high-performance computing requirements are evaluated. The data collected show the current situation, and a 5-year extrapolation of the users' needs with respect to HPC resources is made. Consequently, this report can serve as a basis for future strategic decisions on an HPC road-map for PSI, which does not yet exist. PSI's institutional HPC area started, hardware-wise, in approximately 1999 with the assembly of a 32-processor LINUX cluster called Merlin. Merlin was upgraded several times, most recently in 2007. The Merlin cluster at PSI is used for small-scale parallel jobs and is the only general-purpose computing system at PSI. Several dedicated small-scale clusters followed the Merlin scheme. Many of the clusters are used to analyse data from experiments at PSI or CERN, because dedicated clusters are most efficient. The intellectual and financial involvement in the procurement (including a machine update in 2007) results in a PSI share of 25% of the available computing resources at CSCS. The (over)usage of available computing resources by PSI scientists is demonstrated: we actually get more computing cycles than we have paid for. The reason is the fair-share policy implemented on the Horizon machine. This policy allows us to obtain cycles, at low priority, even when our bi-monthly share is used up. Five important observations can be drawn from the analysis of the scientific output and the survey of future requirements of the main PSI HPC users: (1) high-performance computing is a main pillar in many important PSI research areas; (2) there is a shortfall on the order of 10 times the current computing resources (measured in available core-hours per year); (3) there is a trend towards using on the order of 600 processors per average production run; (4) the disk and tape storage growth

  1. III. FROM SMALL TO BIG: METHODS FOR INCORPORATING LARGE SCALE DATA INTO DEVELOPMENTAL SCIENCE.

    Science.gov (United States)

    Davis-Kean, Pamela E; Jager, Justin

    2017-06-01

    For decades, developmental science has been based primarily on relatively small-scale data collections with children and families. Part of the reason for the dominance of this type of data collection is the complexity of collecting cognitive and social data on infants and small children. These small data sets are limited in both power to detect differences and the demographic diversity to generalize clearly and broadly. Thus, in this chapter we will discuss the value of using existing large-scale data sets to test the complex questions of child development and how to develop future large-scale data sets that are both representative and can answer the important questions of developmental scientists. © 2017 The Society for Research in Child Development, Inc.

  2. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs
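    The cyclic tests above determined crack growth rates in the elastic-plastic region. A standard way to relate growth rate to load cycles is the Paris law, da/dN = C(ΔK)^m; the sketch below integrates it numerically with illustrative material constants (C, m, and the geometry factor Y are assumptions, not the measured WWER vessel data):

```python
import math

def grow_crack(a0_m, delta_sigma_mpa, cycles, C=1e-11, m=3.0, Y=1.12):
    """Integrate the Paris law da/dN = C * (dK)**m cycle by cycle.
    dK = Y * dSigma * sqrt(pi * a), in MPa*sqrt(m). C, m and Y are
    illustrative constants, not values from the SKODA WORKS tests."""
    a = a0_m
    for _ in range(cycles):
        dK = Y * delta_sigma_mpa * math.sqrt(math.pi * a)
        a += C * dK ** m
    return a

# A 15 mm surface defect (the smallest defect depth in the tests above)
# under an assumed 100 MPa stress range for 10 000 cycles.
a_final = grow_crack(a0_m=0.015, delta_sigma_mpa=100.0, cycles=10_000)
```

    In practice C and m would be fitted to the measured elastic-plastic crack growth data rather than assumed, and elastic-plastic conditions may call for a ΔJ-based formulation instead of ΔK.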

  3. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  4. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology. Enables practitioners to study distributed large scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  5. Astronomical Virtual Observatories Through International Collaboration

    Directory of Open Access Journals (Sweden)

    Masatoshi Ohishi

    2010-03-01

    Full Text Available Astronomical Virtual Observatories (VOs) are an emerging research environment for astronomy, and 16 countries and a region have funded the development of their VOs based on international standard protocols for interoperability. The 16 funded VO projects have established the International Virtual Observatory Alliance (http://www.ivoa.net/) to develop standard interoperable interfaces such as the registry (metadata), data access, query languages, output format (VOTable), data model, application interface, and so on. The IVOA members have each constructed their VO environment through the IVOA interfaces. The National Astronomical Observatory of Japan (NAOJ) started its VO project (Japanese Virtual Observatory - JVO) in 2002 and developed its VO system. We have succeeded in interoperating the latest JVO system with other VOs in the USA and Europe since December 2004. Data observed by the Subaru telescope, satellite data taken by JAXA/ISAS, etc., are connected to the JVO system. Successful interoperation of the JVO system with other VOs means that astronomers around the world will be able to utilize top-level data obtained by these telescopes from anywhere at any time. The system design of the JVO system, experiences during our development, including problems with current standard protocols defined in the IVOA, and proposals to resolve these problems in the near future are described.

  6. Large-Scale Sequencing: The Future of Genomic Sciences Colloquium

    Energy Technology Data Exchange (ETDEWEB)

    Margaret Riley; Merry Buckley

    2009-01-01

    Genetic sequencing and the various molecular techniques it has enabled have revolutionized the field of microbiology. Examining and comparing the genetic sequences borne by microbes - including bacteria, archaea, viruses, and microbial eukaryotes - provides researchers with insights into the processes microbes carry out, their pathogenic traits, and new ways to use microorganisms in medicine and manufacturing. Until recently, sequencing entire microbial genomes has been laborious and expensive, and the decision to sequence the genome of an organism was made on a case-by-case basis by individual researchers and funding agencies. Now, thanks to new technologies, the cost and effort of sequencing are within reach for even the smallest facilities, and sequencing the genomes of a significant fraction of microbial life may be possible. The availability of numerous microbial genomes will enable unprecedented insights into microbial evolution, function, and physiology. However, the current ad hoc approach to gathering sequence data has resulted in an unbalanced and highly biased sampling of microbial diversity. A well-coordinated, large-scale effort to target the breadth and depth of microbial diversity would result in the greatest impact. The American Academy of Microbiology convened a colloquium to discuss the scientific benefits of engaging in a large-scale, taxonomically-based sequencing project. A group of individuals with expertise in microbiology, genomics, informatics, ecology, and evolution deliberated on the issues inherent in such an effort and generated a set of specific recommendations for how best to proceed. The vast majority of microbes are presently uncultured and, thus, pose significant challenges to such a taxonomically-based approach to sampling genome diversity. However, we have yet to even scratch the surface of the genomic diversity among cultured microbes.
A coordinated sequencing effort of cultured organisms is an appropriate place to begin

  7. Review of social issues for large-scale land investment in Zambia

    OpenAIRE

    Henley, Giles

    2017-01-01

    Given unsuccessful experiences to date in establishing large-scale investments for biofuels in Zambia, this paper explores the social constraints that may hinder future efforts to use the same models. The author reviews the legal framework that has guided the establishment of most agricultural investments to date (including investment in biofuels), and analyses some of the issues and social repercussions associated with them, through a review of existing case studies. He also explores through...

  8. Technical instrumentation R&D for ILD SiW ECAL large scale device

    Science.gov (United States)

    Balagura, V.

    2018-03-01

    Calorimeters with silicon detectors have many unique features and are proposed for several world-leading experiments. We describe the R&D program of the large scale detector element with up to 12 000 readout channels for the International Large Detector (ILD) at the future e+e- ILC collider. The program is focused on the readout front-end electronics embedded inside the calorimeter. The first part with 2 000 channels and two small silicon sensors has already been constructed, the full prototype is planned for the beginning of 2018.

  9. Technical instrumentation R&D for ILD SiW ECAL large scale device

    CERN Document Server

    Balagura, V. (on behalf of SIW ECAL ILD collaboration)

    2018-01-01

    Calorimeters with silicon detectors have many unique features and are proposed for several world-leading experiments. We describe the R&D program of the large scale detector element with up to 12 000 readout channels for the International Large Detector (ILD) at the future e+e- ILC collider. The program is focused on the readout front-end electronics embedded inside the calorimeter. The first part with 2 000 channels and two small silicon sensors has already been constructed, the full prototype is planned for the beginning of 2018.

  10. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  11. Inflation Physics from the Cosmic Microwave Background and Large Scale Structure

    Science.gov (United States)

    Abazajian, K.N.; Arnold, K.; Austermann, J.; Benson, B.A.; Bischoff, C.; Bock, J.; Bond, J.R.; Borrill, J.; Buder, I.; Burke, D.L.; et al.

    2013-01-01

    Fluctuations in the intensity and polarization of the cosmic microwave background (CMB) and the large-scale distribution of matter in the universe each contain clues about the nature of the earliest moments of time. The next generation of CMB and large-scale structure (LSS) experiments are poised to test the leading paradigm for these earliest moments, the theory of cosmic inflation, and to detect the imprints of the inflationary epoch, thereby dramatically increasing our understanding of fundamental physics and the early universe. A future CMB experiment with sufficient angular resolution and frequency coverage that surveys at least 1% of the sky to a depth of 1 uK-arcmin can deliver a constraint on the tensor-to-scalar ratio that will either result in a 5-sigma measurement of the energy scale of inflation or rule out all large-field inflation models, even in the presence of foregrounds and the gravitational lensing B-mode signal. LSS experiments, particularly spectroscopic surveys such as the Dark Energy Spectroscopic Instrument, will complement the CMB effort by improving current constraints on the running of the spectral index by up to a factor of four, improving constraints on curvature by a factor of ten, and providing non-Gaussianity constraints that are competitive with the current CMB bounds.

  12. Self-* and Adaptive Mechanisms for Large Scale Distributed Systems

    Science.gov (United States)

    Fragopoulou, P.; Mastroianni, C.; Montero, R.; Andrjezak, A.; Kondo, D.

    Large-scale distributed computing systems and infrastructure, such as Grids, P2P systems and desktop Grid platforms, are decentralized, pervasive, and composed of a large number of autonomous entities. The complexity of these systems is such that human administration is nearly impossible and centralized or hierarchical control is highly inefficient. These systems need to run on highly dynamic environments, where content, network topologies and workloads are continuously changing. Moreover, they are characterized by the high degree of volatility of their components and the need to provide efficient service management and to handle efficiently large amounts of data. This paper describes some of the areas for which adaptation emerges as a key feature, namely, the management of computational Grids, the self-management of desktop Grid platforms and the monitoring and healing of complex applications. It also elaborates on the use of bio-inspired algorithms to achieve self-management. Related future trends and challenges are described.

  13. Reengineering observatory operations for the time domain

    Science.gov (United States)

    Seaman, Robert L.; Vestrand, W. T.; Hessman, Frederic V.

    2014-07-01

    Observatories are complex scientific and technical institutions serving diverse users and purposes. Their telescopes, instruments, software, and human resources engage in interwoven workflows over a broad range of timescales. These workflows have been tuned to be responsive to concepts of observatory operations that were applicable when various assets were commissioned, years or decades in the past. The astronomical community is entering an era of rapid change increasingly characterized by large time domain surveys, robotic telescopes and automated infrastructures, and - most significantly - by operating modes and scientific consortia that span our individual facilities, joining them into complex network entities. Observatories must adapt, and numerous initiatives are in progress that focus on redesigning individual components of the astronomical toolkit. New instrumentation is both more capable and more complex than ever, and even simple instruments may have powerful observation scripting capabilities. Remote and queue observing modes are now widespread. Data archives are becoming ubiquitous. Virtual observatory standards and protocols, and the astroinformatics data-mining techniques layered on these, are areas of active development. Indeed, new large-aperture ground-based telescopes may be as expensive as space missions and have similarly formal project management processes and large data management requirements. This piecewise approach is not enough. Whatever challenges of funding or politics face the national and international astronomical communities, it will be more efficient - scientifically as well as in the usual figures of merit of cost, schedule, performance, and risks - to explicitly address the systems engineering of the astronomical community as a whole.

  14. Sierra Stars Observatory Network: An Accessible Global Network

    Science.gov (United States)

    Williams, Richard; Beshore, Edward

    2011-03-01

    The Sierra Stars Observatory Network (SSON) is a unique partnership among professional observatories that provides its users with affordable high-quality calibrated image data. SSON comprises observatories in the Northern and Southern Hemisphere and is in the process of expanding to a truly global network capable of covering the entire sky 24 hours a day in the near future. The goal of SSON is to serve the needs of science-based projects and programs. Colleges, universities, institutions, and individuals use SSON for their education and research projects. The mission of SSON is to promote and expand the use of its facilities among the thousands of colleges and schools worldwide that do not have access to professional-quality automated observatory systems to use for astronomy education and research. With appropriate leadership and guidance educators can use SSON to help teach astronomy and do meaningful scientific projects. The relatively small cost of using SSON for this type of work makes it affordable and accessible for educators to start using immediately. Remote observatory services like SSON need to evolve to better support education and research initiatives of colleges, institutions and individual investigators. To meet these needs, SSON is developing a sophisticated interactive scheduling system to integrate among the nodes of the observatory network. This will enable more dynamic observations, including immediate priority interrupts, acquiring moving objects using ephemeris data, and more.

  15. Large-Scale Academic Achievement Testing of Deaf and Hard-of-Hearing Students: Past, Present, and Future

    Science.gov (United States)

    Qi, Sen; Mitchell, Ross E.

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the…

  16. Large-Scale Atmospheric Circulation Patterns Associated with Temperature Extremes as a Basis for Model Evaluation: Methodological Overview and Results

    Science.gov (United States)

    Loikith, P. C.; Broccoli, A. J.; Waliser, D. E.; Lintner, B. R.; Neelin, J. D.

    2015-12-01

    Anomalous large-scale circulation patterns often play a key role in the occurrence of temperature extremes. For example, large-scale circulation can drive horizontal temperature advection or influence local processes that lead to extreme temperatures, such as by inhibiting moderating sea breezes, promoting downslope adiabatic warming, and affecting the development of cloud cover. Additionally, large-scale circulation can influence the shape of temperature distribution tails, with important implications for the magnitude of future changes in extremes. As a result of the prominent role these patterns play in the occurrence and character of extremes, the way in which temperature extremes change in the future will be strongly influenced by whether and how these patterns change. It is therefore critical to identify and understand the key patterns associated with extremes at local to regional scales in the current climate and to use this foundation as a target for climate model validation. This presentation provides an overview of recent and ongoing work aimed at developing and applying novel approaches to identifying and describing the large-scale circulation patterns associated with temperature extremes in observations, and at using this foundation to evaluate state-of-the-art global and regional climate models. Emphasis is given to anomalies in sea level pressure and 500 hPa geopotential height over North America, using several methods to identify circulation patterns, including self-organizing maps and composite analysis. Overall, evaluation results suggest that models are able to reproduce observed patterns associated with temperature extremes with reasonable fidelity in many cases. Model skill is often highest when and where synoptic-scale processes are the dominant mechanisms for extremes, and lower where sub-grid scale processes (such as those related to topography) are important. Where model skill in reproducing these patterns is high, it can be inferred that extremes are
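The composite-analysis method mentioned above can be illustrated in a few lines: average the circulation anomaly field over the days on which a local temperature extreme occurs. The data below are synthetic stand-ins for reanalysis fields, used only to show the mechanics.

```python
import numpy as np

rng = np.random.default_rng(0)
ndays, ny, nx = 1000, 20, 30

# Synthetic daily fields: 500 hPa geopotential height anomalies and a
# local temperature series (illustrative data, not reanalysis).
z500_anom = rng.normal(0.0, 50.0, size=(ndays, ny, nx))
ridge = np.exp(-((np.arange(nx) - 15) ** 2) / 20.0)  # idealized ridge shape
temp = rng.normal(0.0, 2.0, size=ndays) \
    + 0.01 * (z500_anom[:, 10, :] * ridge).sum(axis=1)

# Composite analysis: average the circulation field over the days when
# the local temperature exceeds its 95th percentile.
hot_days = temp > np.percentile(temp, 95)
composite = z500_anom[hot_days].mean(axis=0)   # one mean anomaly map (ny, nx)
```

The resulting composite map is the kind of observed pattern against which a model's extreme-day circulation can then be benchmarked.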

  17. Analysis for preliminary evaluation of discrete fracture flow and large-scale permeability in sedimentary rocks

    International Nuclear Information System (INIS)

    Kanehiro, B.Y.; Lai, C.H.; Stow, S.H.

    1987-05-01

    Conceptual models for sedimentary rock settings that could be used in future evaluation and suitability studies are being examined through the DOE Repository Technology Program. One area of concern for the hydrologic aspects of these models is discrete fracture flow analysis as related to the estimation of the size of the representative elementary volume, evaluation of the appropriateness of continuum assumptions and estimation of the large-scale permeabilities of sedimentary rocks. A basis for preliminary analysis of flow in fracture systems of the types that might be expected to occur in low permeability sedimentary rocks is presented. The approach used involves numerical modeling of discrete fracture flow for the configuration of a large-scale hydrologic field test directed at estimation of the size of the representative elementary volume and large-scale permeability. Analysis of fracture data on the basis of this configuration is expected to provide a preliminary indication of the scale at which continuum assumptions can be made.

  18. First Joint Workshop on Energy Management for Large-Scale Research Infrastructures

    CERN Document Server

    2011-01-01

      CERN, ERF (European Association of National Research Facilities) and ESS (European Spallation Source) announce the first Joint Workshop on Energy Management for Large-Scale Research Infrastructures. The event will take place on 13-14 October 2011 at the ESS office in Sparta - Lund, Sweden.   The workshop will bring together international experts on energy and representatives from laboratories and future projects all over the world in order to identify the challenges and best practice in respect of energy efficiency and optimization, solutions and implementation as well as to review the challenges represented by potential future technical solutions and the tools for effective collaboration. Further information at: http://ess-scandinavia.eu/general-information

  19. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  20. James Webb Space Telescope Core 2 Test - Cryogenic Thermal Balance Test of the Observatory's Core Area Thermal Control Hardware

    Science.gov (United States)

    Cleveland, Paul; Parrish, Keith; Thomson, Shaun; Marsh, James; Comber, Brian

    2016-01-01

    The James Webb Space Telescope (JWST), successor to the Hubble Space Telescope, will be the largest astronomical telescope ever sent into space. To observe the very first light of the early universe, JWST requires a large deployed 6.5-meter primary mirror cryogenically cooled to less than 50 Kelvin. Three scientific instruments are further cooled via a large radiator system to less than 40 Kelvin. A fourth scientific instrument is cooled to less than 7 Kelvin using a combined pulse-tube and Joule-Thomson mechanical cooler. Passive cryogenic cooling enables the large scale of the telescope, which must be highly folded for launch on an Ariane 5 launch vehicle and deployed once on orbit during its journey to the second Earth-Sun Lagrange point. Passive cooling of the observatory is enabled by the deployment of a large, tennis-court-sized, five-layer Sunshield combined with a network of high-efficiency radiators. A high-purity aluminum heat strap system connects the three instruments' detector systems to the radiator systems to dissipate less than a single watt of parasitic and instrument-dissipated heat. JWST's large-scale features, while enabling passive cooling, also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. This paper describes the JWST Core 2 Test, a cryogenic thermal balance test of a full-size, high-fidelity engineering model of the Observatory's 'Core' area thermal control hardware. The 'Core' area is the key mechanical and cryogenic interface area between all Observatory elements. The 'Core' area thermal control hardware allows for a temperature transition from 300 K to approximately 50 K by attenuating heat from the room-temperature IEC (instrument electronics) and the Spacecraft Bus. Since the flight hardware is not available for test, the Core 2 test uses high-fidelity, flight-like reproductions.

  1. Private Observatories in South Africa

    Science.gov (United States)

    Rijsdijk, C.

    2016-12-01

    Descriptions of private observatories in South Africa, written by their owners. Positions, equipment descriptions and observing programmes are given. Included are: Klein Karoo Observatory (B. Monard), Cederberg Observatory (various), Centurion Planetary and Lunar Observatory (C. Foster), Le Marischel Observatory (L. Ferreira), Sterkastaaing Observatory (M. Streicher), Henley on Klip (B. Fraser), Archer Observatory (B. Dumas), Overbeek Observatory (A. Overbeek), Overberg Observatory (A. van Staden), St Cyprian's School Observatory, Fisherhaven Small Telescope Observatory (J. Retief), COSPAR 0433 (G. Roberts), COSPAR 0434 (I. Roberts), Weltevreden Karoo Observatory (D. Bullis), Winobs (M. Shafer)

  2. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  3. CSCW Challenges in Large-Scale Technical Projects - a case study

    DEFF Research Database (Denmark)

    Grønbæk, Kaj; Kyng, Morten; Mogensen, Preben Holst

    1992-01-01

    This paper investigates CSCW aspects of large-scale technical projects based on a case study of a specific Danish engineering company and uncovers challenges to CSCW applications in this setting. The company is responsible for management and supervision of one of the world's largest tunnel....... The initial qualitative analysis identified a number of bottlenecks in daily work, where support for cooperation is needed. Examples of bottlenecks are: sharing materials, issuing tasks, and keeping track of task status. Grounded in the analysis, cooperative design workshops based on scenarios of future work...

  4. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented, with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems ... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological ... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its ...

  5. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need to address energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  6. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test with a view to ensuring the safety of light water reactors was started in fiscal 1976 based on the special account act for power source development promotion measures by the entrustment from the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April, 1980. Thereupon, the large-scale reflood test is now included in this program. It consists of two tests using a cylindrical core testing apparatus for examining the overall system effect and a plate core testing apparatus for testing individual effects. Each apparatus is composed of the mock-ups of pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  7. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of these observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints on Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  8. Hydrogen-combustion analyses of large-scale tests

    International Nuclear Information System (INIS)

    Gido, R.G.; Koestel, A.

    1986-01-01

    This report uses results of the large-scale tests with turbulence performed by the Electric Power Research Institute at the Nevada Test Site to evaluate hydrogen burn-analysis procedures based on lumped-parameter codes like COMPARE-H2 and associated burn-parameter models. The test results: (1) confirmed, in a general way, the procedures for application to pulsed burning, (2) increased significantly our understanding of the burn phenomenon by demonstrating that continuous burning can occur, and (3) indicated that steam can terminate continuous burning. Future actions recommended include: (1) modification of the code to perform continuous-burn analyses, which is demonstrated, (2) analyses to determine the type of burning (pulsed or continuous) that will exist in nuclear containments and the stable location if the burning is continuous, and (3) changes to the models for estimating burn parameters.

  10. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  11. Addressing the social dimensions of citizen observatories: The Ground Truth 2.0 socio-technical approach for sustainable implementation of citizen observatories

    Science.gov (United States)

    Wehn, Uta; Joshi, Somya; Pfeiffer, Ellen; Anema, Kim; Gharesifard, Mohammad; Momani, Abeer

    2017-04-01

    Owing to ICT-enabled citizen observatories, citizens can take on new roles in environmental monitoring, decision making, co-operative planning, and environmental stewardship. And yet implementing advanced citizen observatories for data collection, knowledge exchange and interactions to support policy objectives is neither always easy nor always successful, given the required commitment and trust and the concerns about data reliability. Many efforts face problems with uptake and sustained engagement by citizens, limited scalability, unclear long-term sustainability and limited actual impact on governance processes. Similarly, to sustain the engagement of decision makers in citizen observatories, mechanisms are required from the start of the initiative in order to have them invest in and, hence, commit to and own the entire process. In order to implement sustainable citizen observatories, these social dimensions therefore need to be soundly managed. We provide empirical evidence of how the social dimensions of citizen observatories are being addressed in the Ground Truth 2.0 project, drawing on a range of relevant social science approaches. This project combines the social dimensions of citizen observatories with enabling technologies - via a socio-technical approach - so that their customisation and deployment is tailored to the envisaged societal and economic impacts of the observatories. The project consists of the demonstration and validation of six scaled-up citizen observatories in real operational conditions, both in the EU and in Africa, with a specific focus on flora and fauna as well as water availability and water quality for land and natural resources management. The demonstration cases (4 EU and 2 African) cover the full 'spectrum' of citizen-sensed data usage and citizen engagement, and therefore allow testing and validation of the socio-technical concept for citizen observatories under a range of conditions.

  12. The Russian-Ukrainian Observatories Network for the European Astronomical Observatory Route Project

    Science.gov (United States)

    Andrievsky, S. M.; Bondar, N. I.; Karetnikov, V. G.; Kazantseva, L. V.; Nefedyev, Y. A.; Pinigin, G. I.; Pozhalova, Zh. A.; Rostopchina-Shakhovskay, A. N.; Stepanov, A. V.; Tolbin, S. V.

    2011-09-01

    In 2004, the UNESCO World Heritage Centre announced a new initiative, "Astronomy & World Heritage", directed at the search for and preservation of objects related to astronomy and its history that are of global value as historical and cultural properties. A strategy for the thematic programme "Initiative" was defined, together with general criteria for selecting ancient astronomical objects and observatories: in particular, properties that are situated or have significance in relation to celestial objects or astronomical events; representations of the sky and/or celestial bodies and astronomical events; observatories and instruments; and properties closely connected with the history of astronomy. In 2005-2006, in accordance with the programme "Initiative", information about outstanding properties connected with astronomy was collected. In Ukraine this work was organized by an astronomical expert group at the Nikolaev Astronomical Observatory. In 2007, the Nikolaev observatory was included in the Tentative List of UNESCO under # 5116. Later, in 2008, the network of four astronomical observatories of Ukraine in Kiev, Crimea, Nikolaev and Odessa, considering their high authenticity and integrity, was included in the Tentative List of UNESCO under # 5267, "Astronomical Observatories of Ukraine". In 2008-2009, a new project, "Thematic Study", was opened as a successor to "Initiative". It includes all fields of astronomical heritage from early prehistory to space astronomy (14 themes in total). We present the Ukraine-Russian observatories network for the "European astronomical observatory Route project". From Russia two observatories are presented, Kazan Observatory and Pulkovo Observatory, in the theme "Astronomy from the Renaissance to the mid-twentieth century". The description of the astronomical observatories of Ukraine is given in accordance with the project "Thematic Study"; for the theme "Astronomy from the Renaissance to the mid-twentieth century", the astronomical observatories in Kiev, Nikolaev and Odessa; the

  13. Brazil to Join the European Southern Observatory

    Science.gov (United States)

    2010-12-01

    The Federative Republic of Brazil has yesterday signed the formal accession agreement paving the way for it to become a Member State of the European Southern Observatory (ESO). Following government ratification Brazil will become the fifteenth Member State and the first from outside Europe. On 29 December 2010, at a ceremony in Brasilia, the Brazilian Minister of Science and Technology, Sergio Machado Rezende, and the ESO Director General, Tim de Zeeuw, signed the formal accession agreement intended to make Brazil a Member State of the European Southern Observatory. Since the agreement means accession to an international convention, it must now be submitted to the Brazilian Parliament for ratification [1]. The signing of the agreement followed the unanimous approval by the ESO Council during an extraordinary meeting on 21 December 2010. "Joining ESO will give new impetus to the development of science, technology and innovation in Brazil as part of the considerable efforts our government is making to keep the country advancing in these strategic areas," says Rezende. The European Southern Observatory has a long history of successful involvement with South America, ever since Chile was selected as the best site for its observatories in 1963. Until now, however, no non-European country has joined ESO as a Member State. "The membership of Brazil will give the vibrant Brazilian astronomical community full access to the most productive observatory in the world and open up opportunities for Brazilian high-tech industry to contribute to the European Extremely Large Telescope project. It will also bring new resources and skills to the organisation at the right time for them to make a major contribution to this exciting project," adds ESO Director General Tim de Zeeuw. The European Extremely Large Telescope (E-ELT) telescope design phase was recently completed and a major review was

  14. Downscaling the Impacts of Large-Scale LUCC on Surface Temperature along with IPCC RCPs: A Global Perspective

    Directory of Open Access Journals (Sweden)

    Xiangzheng Deng

    2014-04-01

    This study focuses on the potential impacts of large-scale land use and land cover changes (LUCC) on surface temperature from a global perspective. As important types of LUCC, urbanization, deforestation, cultivated land reclamation, and grassland degradation all have effects on the climate; the potential changes in surface temperature caused by these four types of large-scale LUCC from 2010 to 2050 are downscaled and analyzed worldwide along with the Representative Concentration Pathways (RCPs) of the Intergovernmental Panel on Climate Change (IPCC). The first case study presents some evidence of the effects of future urbanization on surface temperature in the Northeast megalopolis of the United States of America (USA). In order to understand the potential climatological variability and vulnerability caused by future deforestation, we chose the Brazilian Amazon region as the second case study. The third case study selects a region in India as typical of cultivated land reclamation, where the possible climatic impacts are explored. In the fourth case study, we simulate the surface temperature changes caused by future grassland degradation in Mongolia. Results show that the temperature in built-up areas would increase markedly across all four land types. In addition, the effects of all four types of large-scale LUCC on monthly average temperature change would vary from month to month, with obvious spatial heterogeneity.

  15. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw materials powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacturing of the large-scale mother tube, with dimensions of 32 mm OD, 21 mm ID, and 2 m length, was successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and the microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube has been observed. (3) The long-length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, a manufacturing process for mother tubes using large-scale hollow capsules is promising. (author)

  16. How large-scale subsidence affects stratocumulus transitions

    Directory of Open Access Journals (Sweden)

    J. J. van der Dussen

    2016-01-01

    Some climate modeling results suggest that the Hadley circulation might weaken in a future climate, causing a subsequent reduction in the large-scale subsidence velocity in the subtropics. In this study we analyze the cloud liquid water path (LWP) budget from large-eddy simulation (LES) results for three idealized stratocumulus transition cases, each with a different subsidence rate. As shown in previous studies, a reduced subsidence is found to lead to a deeper stratocumulus-topped boundary layer, an enhanced cloud-top entrainment rate and a delay in the transition of stratocumulus clouds into shallow cumulus clouds during their equatorward advection by the prevailing trade winds. The effect of a reduction of the subsidence rate can be summarized as follows. The initial deepening of the stratocumulus layer is partly counteracted by an enhanced absorption of solar radiation. After some hours the deepening of the boundary layer is accelerated by an enhancement of the entrainment rate. Because this is accompanied by a change in the cloud-base turbulent fluxes of moisture and heat, the net change in the LWP due to changes in the turbulent flux profiles is negligibly small.

  17. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of the large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of the large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation, and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.
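The kind of scale-by-scale diagnostic the abstract relies on - decomposing energy into wavenumber shells - can be sketched as follows for a single synthetic 2D field. The field here is random noise rather than MHD output, and serves only to show the shell-binning mechanics.

```python
import numpy as np

# Shell-decomposed energy of one (synthetic) magnetic-field component.
n = 64
rng = np.random.default_rng(1)
b = rng.normal(size=(n, n))

bk = np.fft.fft2(b) / b.size              # normalized Fourier amplitudes
kx = np.fft.fftfreq(n) * n
ky = np.fft.fftfreq(n) * n
kmag = np.sqrt(kx[:, None] ** 2 + ky[None, :] ** 2)

# Sum |b_k|^2 / 2 over integer wavenumber shells k-0.5 <= |k| < k+0.5.
shells = np.arange(1, n // 2)
spectrum = np.array([
    0.5 * (np.abs(bk[(kmag >= k - 0.5) & (kmag < k + 0.5)]) ** 2).sum()
    for k in shells
])

# Parseval sanity check: the shell sums cannot exceed the total energy.
total = 0.5 * (np.abs(bk) ** 2).sum()
```

In a dynamo simulation, growth of the low-k shells of this spectrum over time is what signals amplification of the large-scale field.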

  18. Improved Large-Scale Inundation Modelling by 1D-2D Coupling and Consideration of Hydrologic and Hydrodynamic Processes - a Case Study in the Amazon

    Science.gov (United States)

    Hoch, J. M.; Bierkens, M. F.; Van Beek, R.; Winsemius, H.; Haag, A.

    2015-12-01

    Understanding the dynamics of fluvial floods is paramount to accurate flood hazard and risk modeling. Currently, economic losses due to flooding constitute about one third of all damage resulting from natural hazards. Given future projections of climate change, the anticipated increase in the World's population and the associated implications, sound knowledge of flood hazard and related risk is crucial. Fluvial floods are cross-border phenomena that need to be addressed accordingly. Yet, only few studies model floods at the large-scale which is preferable to tiling the output of small-scale models. Most models cannot realistically model flood wave propagation due to a lack of either detailed channel and floodplain geometry or the absence of hydrologic processes. This study aims to develop a large-scale modeling tool that accounts for both hydrologic and hydrodynamic processes, to find and understand possible sources of errors and improvements and to assess how the added hydrodynamics affect flood wave propagation. Flood wave propagation is simulated by DELFT3D-FM (FM), a hydrodynamic model using a flexible mesh to schematize the study area. It is coupled to PCR-GLOBWB (PCR), a macro-scale hydrological model, that has its own simpler 1D routing scheme (DynRout) which has already been used for global inundation modeling and flood risk assessments (GLOFRIS; Winsemius et al., 2013). A number of model set-ups are compared and benchmarked for the simulation period 1986-1996: (0) PCR with DynRout; (1) using a FM 2D flexible mesh forced with PCR output and (2) as in (1) but discriminating between 1D channels and 2D floodplains, and, for comparison, (3) and (4) the same set-ups as (1) and (2) but forced with observed GRDC discharge values. Outputs are subsequently validated against observed GRDC data at Óbidos and flood extent maps from the Dartmouth Flood Observatory. The present research constitutes a first step into a globally applicable approach to fully couple
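The coupling idea described above - a hydrological model supplying runoff that forces a routing/hydrodynamic step - can be caricatured with a linear-reservoir cascade. Everything here (grid size, routing constant, forcing) is invented for illustration and stands in for, rather than reproduces, PCR-GLOBWB and DELFT3D-FM.

```python
import numpy as np

# Toy one-way coupling: per-step "hydrology" runoff feeds a simplified
# 1D linear-reservoir routing scheme standing in for the hydrodynamics.
n_cells, n_steps = 10, 48
dt = 3600.0                         # time step (s)
storage = np.zeros(n_cells)         # water volume per reach (m^3)
k = 0.2                             # fraction routed downstream per step

rng = np.random.default_rng(3)
for _ in range(n_steps):
    runoff = rng.uniform(0.0, 5.0, n_cells)   # hydrological forcing (m^3/s)
    storage += runoff * dt                    # lateral inflow to each reach
    outflow = k * storage                     # linear-reservoir routing
    storage -= outflow
    storage[1:] += outflow[:-1]               # pass water downstream
    discharge_at_outlet = outflow[-1] / dt    # m^3/s, comparable to gauge data
```

The outlet discharge series is the quantity one would validate against observed gauge records, analogous to the GRDC comparison at Óbidos.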

  19. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
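    The modulation diagnostic described above can be sketched numerically. The following Python toy (all signal parameters and the boxcar filter are invented for illustration; this is not the authors' hot-wire analysis) decomposes a synthetic velocity signal into large- and small-scale parts and checks that the small-scale envelope correlates with the large-scale motion:

```python
import math

def moving_average(x, w):
    # boxcar low-pass filter with a window of w samples (shrinks at the edges)
    half = w // 2
    out = []
    for i in range(len(x)):
        seg = x[max(0, i - half):i + half + 1]
        out.append(sum(seg) / len(seg))
    return out

def corr(a, b):
    # Pearson correlation coefficient
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

# synthetic "velocity": a slow 2 Hz large-scale wave amplitude-modulating
# fast 80 Hz small-scale fluctuations (parameters are arbitrary)
n, fs = 4000, 1000.0
t = [i / fs for i in range(n)]
large = [math.sin(2 * math.pi * 2.0 * ti) for ti in t]
small = [(1.0 + 0.5 * l) * math.sin(2 * math.pi * 80.0 * ti)
         for l, ti in zip(large, t)]
u = [l + s for l, s in zip(large, small)]

u_large = moving_average(u, 101)                           # large-scale component
u_small = [ui - li for ui, li in zip(u, u_large)]          # small-scale residual
envelope = moving_average([abs(s) for s in u_small], 101)  # crude envelope

R = corr(u_large, envelope)
print(round(R, 2))  # strongly positive: the envelope tracks the large-scale motion
```

In real wall-turbulence studies the decomposition uses a spectral cutoff and a Hilbert-transform envelope, but the correlation diagnostic is the same in spirit.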

  20. Use of water-Cherenkov detectors to detect Gamma Ray Bursts at the Large Aperture GRB Observatory (LAGO)

    International Nuclear Information System (INIS)

    Allard, D.; Allekotte, I.; Alvarez, C.; Asorey, H.; Barros, H.; Bertou, X.; Burgoa, O.; Gomez Berisso, M.; Martinez, O.; Miranda Loza, P.; Murrieta, T.; Perez, G.; Rivera, H.; Rovero, A.; Saavedra, O.; Salazar, H.; Tello, J.C.; Ticona Peralda, R.; Velarde, A.; Villasenor, L.

    2008-01-01

    The Large Aperture GRB Observatory (LAGO) project aims at the detection of high energy photons from Gamma Ray Bursts (GRB) using the single particle technique in ground-based water-Cherenkov detectors (WCD). To reach a reasonable sensitivity, high altitude mountain sites have been selected in Mexico (Sierra Negra, 4550 m a.s.l.), Bolivia (Chacaltaya, 5300 m a.s.l.) and Venezuela (Merida, 4765 m a.s.l.). We report on detector calibration and operation at high altitude, a search for bursts in 4 months of preliminary data, as well as a search for a signal at ground level when satellites report a burst.

  1. Use of water-Cherenkov detectors to detect Gamma Ray Bursts at the Large Aperture GRB Observatory (LAGO)

    Energy Technology Data Exchange (ETDEWEB)

    Allard, D. [APC, CNRS et Universite Paris 7 (France); Allekotte, I. [Centro Atomico Bariloche, Instituto Balseiro (Argentina); Alvarez, C. [Facultad de Ciencias Fisico-Matematicas de la BUAP (Mexico); Asorey, H. [Centro Atomico Bariloche, Instituto Balseiro (Argentina); Barros, H. [Laboratorio de Fisica Nuclear, Universidad Simon Bolivar, Caracas (Venezuela, Bolivarian Republic of); Bertou, X. [Centro Atomico Bariloche, Instituto Balseiro (Argentina)], E-mail: bertou@cab.cnea.gov.ar; Burgoa, O. [Instituto de Investigaciones Fisicas, UMSA (Bolivia); Gomez Berisso, M. [Centro Atomico Bariloche, Instituto Balseiro (Argentina); Martinez, O. [Facultad de Ciencias Fisico-Matematicas de la BUAP (Mexico); Miranda Loza, P. [Instituto de Investigaciones Fisicas, UMSA (Bolivia); Murrieta, T.; Perez, G. [Facultad de Ciencias Fisico-Matematicas de la BUAP (Mexico); Rivera, H. [Instituto de Investigaciones Fisicas, UMSA (Bolivia); Rovero, A. [Instituto de Astronomia y Fisica del Espacio (Argentina); Saavedra, O. [Dipartimento di Fisica Generale and INFN, Torino (Italy); Salazar, H. [Facultad de Ciencias Fisico-Matematicas de la BUAP (Mexico); Tello, J.C. [Laboratorio de Fisica Nuclear, Universidad Simon Bolivar, Caracas (Venezuela, Bolivarian Republic of); Ticona Peralda, R.; Velarde, A. [Instituto de Investigaciones Fisicas, UMSA (Bolivia); Villasenor, L. [Facultad de Ciencias Fisico-Matematicas de la BUAP (Mexico); Instituto de Fisica y Matematicas, Universidad de Michoacan (Mexico)

    2008-09-21

    The Large Aperture GRB Observatory (LAGO) project aims at the detection of high energy photons from Gamma Ray Bursts (GRB) using the single particle technique in ground-based water-Cherenkov detectors (WCD). To reach a reasonable sensitivity, high altitude mountain sites have been selected in Mexico (Sierra Negra, 4550 m a.s.l.), Bolivia (Chacaltaya, 5300 m a.s.l.) and Venezuela (Merida, 4765 m a.s.l.). We report on detector calibration and operation at high altitude, search for bursts in 4 months of preliminary data, as well as search for signal at ground level when satellites report a burst.

  2. Magnetic observations at Geophysical Observatory Paratunka IKIR FEB RAS: tasks, possibilities and future prospects

    Science.gov (United States)

    Khomutov, Sergey Y.

    2017-10-01

    Continuous magnetic measurements at Geophysical Observatory "Paratunka" (PET) of IKIR FEB RAS have been performed since 1967. In the new millennium the analogue magnetometers were upgraded to digital ones, the technology of absolute observations was changed, data processing was completely transferred to computers, and INTERMAGNET observatory status was obtained. Currently, the observatory uses the following magnetometers: (a) for absolute observations - DIflux LEMI-203 (theodolite 3T2KP) and Mag-01 (theodolite Wild-T1), and Overhauser magnetometers POS-1 and GSM-19W; (b) for variation measurements - fluxgate magnetometers FGE-DTU, FRG-601 and MAGDAS (installed under international agreements of IKIR), vector magnetometers dIdD GSM-19FD and POS-4 with Overhauser sensors and coil systems, scalar magnetometer GSM-90 and induction magnetometer STELAB. During the spring-autumn season a dIdD is also installed at the remote station "Karymshina", 15 km from the observatory. A monitoring system tracks the conditions under which the magnetic observations are performed; it includes semi-professional weather stations (Davis Vantage Pro2 and WS2000) and a network of digital temperature sensors (DS18B20) located at various points in the magnetic pavilions and outdoors. All measurements are synchronized with UTC. The results of observations are collected by the IKIR data server from the recorders and loggers, including in real time. Specialized software (based on the MATLAB and Octave packages) was developed that allows automatic and semi-automatic processing of the data, comparison of results from different magnetometers, and presentation of final data in formats defined by international standards, including INTERMAGNET. Significant efforts of the observatory staff are directed to the archived (raw) magnetic data, a significant part of which has not been fully processed, is not presented in international data centers and is still not available to the scientific community. 
Digital images of

  3. Magnetic observations at Geophysical Observatory Paratunka IKIR FEB RAS: tasks, possibilities and future prospects

    Directory of Open Access Journals (Sweden)

    Khomutov Sergey Y.

    2017-01-01

    Full Text Available Continuous magnetic measurements at Geophysical Observatory “Paratunka” (PET) of IKIR FEB RAS have been performed since 1967. In the new millennium the analogue magnetometers were upgraded to digital ones, the technology of absolute observations was changed, data processing was completely transferred to computers, and INTERMAGNET observatory status was obtained. Currently, the observatory uses the following magnetometers: (a) for absolute observations – DIflux LEMI-203 (theodolite 3T2KP) and Mag-01 (theodolite Wild-T1), and Overhauser magnetometers POS-1 and GSM-19W; (b) for variation measurements – fluxgate magnetometers FGE-DTU, FRG-601 and MAGDAS (installed under international agreements of IKIR), vector magnetometers dIdD GSM-19FD and POS-4 with Overhauser sensors and coil systems, scalar magnetometer GSM-90 and induction magnetometer STELAB. During the spring-autumn season a dIdD is also installed at the remote station “Karymshina”, 15 km from the observatory. A monitoring system tracks the conditions under which the magnetic observations are performed; it includes semi-professional weather stations (Davis Vantage Pro2 and WS2000) and a network of digital temperature sensors (DS18B20) located at various points in the magnetic pavilions and outdoors. All measurements are synchronized with UTC. The results of observations are collected by the IKIR data server from the recorders and loggers, including in real time. Specialized software (based on the MATLAB and Octave packages) was developed that allows automatic and semi-automatic processing of the data, comparison of results from different magnetometers, and presentation of final data in formats defined by international standards, including INTERMAGNET. Significant efforts of the observatory staff are directed to the archived (raw) magnetic data, a significant part of which has not been fully processed, is not presented in international data centers and is still not available to the scientific

  4. Planning under uncertainty solving large-scale stochastic linear programs

    Energy Technology Data Exchange (ETDEWEB)

    Infanger, G. [Stanford Univ., CA (United States). Dept. of Operations Research]|[Technische Univ., Vienna (Austria). Inst. fuer Energiewirtschaft

    1992-12-01

    For many practical problems, solutions obtained from deterministic models are unsatisfactory because they fail to hedge against certain contingencies that may occur in the future. Stochastic models address this shortcoming, but until recently seemed intractable due to their size. Recent advances both in solution algorithms and in computer technology now allow us to solve important and general classes of practical stochastic problems. We show how large-scale stochastic linear programs can be efficiently solved by combining classical decomposition and Monte Carlo (importance) sampling techniques. We discuss the methodology for solving two-stage stochastic linear programs with recourse, present numerical results for large problems with numerous stochastic parameters, show how to efficiently implement the methodology on a parallel multi-computer and derive the theory for solving a general class of multi-stage problems with dependency of the stochastic parameters within a stage and between different stages.
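    The core idea of combining a first-stage decision with a sampled expectation of recourse costs can be illustrated on a toy two-stage problem. The sketch below is a plain sample-average approximation with a grid search standing in for the decomposition master problem; the costs and demand distribution are invented, and no importance sampling is shown:

```python
import random

random.seed(0)

# toy two-stage problem: choose an order quantity x now; after demand d is
# revealed, pay a recourse cost q * max(d - x, 0) for unmet demand
c, q = 1.0, 3.0                                     # unit cost, shortage penalty
scenarios = [random.uniform(0.0, 100.0) for _ in range(5000)]  # sampled demands

def saa_cost(x):
    # sample-average approximation of first-stage cost + expected recourse cost
    recourse = sum(q * max(d - x, 0.0) for d in scenarios) / len(scenarios)
    return c * x + recourse

# crude grid search standing in for the decomposition master problem
x_star = min(range(101), key=saa_cost)
print(x_star)  # near the analytic optimum, the 2/3 quantile of U(0, 100) ≈ 66.7
```

Benders (L-shaped) decomposition replaces the grid search with cutting planes, and importance sampling concentrates the scenarios where the recourse cost matters most.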

  5. A Green Robotic Observatory for Astronomy Education

    Science.gov (United States)

    Reddy, Vishnu; Archer, K.

    2008-09-01

    With the development of robotic telescopes and stable remote observing software, it is currently possible for a small institution to have an affordable astronomical facility for astronomy education. However, a faculty member has to deal with light pollution (if the observatory is located on campus), its nightly operations and regular maintenance on top of daytime teaching and research responsibilities. While building an observatory at a remote location is a solution, the costs of constructing and operating such a facility, not to mention the environmental impact, are beyond the reach of most institutions. In an effort to resolve these issues we have developed a robotic remote observatory that can be operated via the internet from anywhere in the world, has a zero operating carbon footprint and minimal impact on the local environment. The prototype observatory is a clam-shell design that houses an 8-inch telescope with an SBIG ST-10 CCD detector. The brain of the observatory is a low-draw 12-volt harsh-duty computer that runs the dome, telescope, CCD camera, focuser, and weather monitoring. All equipment runs off a 12-volt AGM-style battery that has a low lead content and hence is more environmentally friendly to dispose of. The total capacity of 12-14 amp-hours is supplied by a set of solar panels that are large enough to maintain a full battery charge for several cloudy days. This completely eliminates the need for a local power grid. Internet access is accomplished via a high-speed cell phone broadband connection or satellite link, eliminating the need for a phone network. An independent observatory monitoring system interfaces with the observatory computer during operation. The observatory converts to a trailer for transportation to the site and is then converted to a semi-permanent building without wheels and towing equipment. This ensures minimal disturbance to the local environment.

  6. Active self-testing noise measurement sensors for large-scale environmental sensor networks.

    Science.gov (United States)

    Domínguez, Federico; Cuong, Nguyen The; Reinoso, Felipe; Touhafi, Abdellah; Steenhaut, Kris

    2013-12-13

    Large-scale noise pollution sensor networks consist of hundreds of spatially distributed microphones that measure environmental noise. These networks provide historical and real-time environmental data to citizens and decision makers and are therefore a key technology to steer environmental policy. However, the high cost of certified environmental microphone sensors renders large-scale environmental networks prohibitively expensive. Several environmental network projects have started using off-the-shelf low-cost microphone sensors to reduce their costs, but these sensors have higher failure rates and produce lower quality data. To offset this disadvantage, we developed a low-cost noise sensor that actively checks its condition and, indirectly, the integrity of the data it produces. The main design concept is to embed a 13 mm speaker in the noise sensor casing and, by regularly scheduling a frequency sweep, estimate the evolution of the microphone's frequency response over time. This paper presents our noise sensor's hardware and software design together with the results of a test deployment in a large-scale environmental network in Belgium. Our middle-range-value sensor (around €50) effectively detected all experienced malfunctions, in laboratory tests and outdoor deployments, with a few false positives. Future improvements could further lower the cost of our sensor below €10.
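    The self-test concept, an embedded speaker stepping through probe tones while the microphone's response at each tone is compared against a stored baseline, can be sketched as follows. The acoustic path, sample rate, probe frequencies and 3 dB threshold are all invented for illustration; a real deployment would record actual microphone samples:

```python
import math

FS = 8000  # sample rate (Hz), hypothetical

def tone_response(signal, freq):
    # magnitude of a single-bin DFT at `freq` (Goertzel-style correlation)
    n = len(signal)
    re = sum(s * math.cos(2 * math.pi * freq * i / FS) for i, s in enumerate(signal))
    im = sum(s * math.sin(2 * math.pi * freq * i / FS) for i, s in enumerate(signal))
    return 2.0 * math.hypot(re, im) / n

def measure_response(gain_at, probe_freqs, n=2000):
    # play each probe tone through the built-in speaker and record the mic
    # output; here the acoustic path is simulated by the `gain_at` function
    out = {}
    for f in probe_freqs:
        recorded = [gain_at(f) * math.sin(2 * math.pi * f * i / FS) for i in range(n)]
        out[f] = tone_response(recorded, f)
    return out

probes = [250, 500, 1000, 2000]
baseline = measure_response(lambda f: 1.0, probes)                        # healthy mic
degraded = measure_response(lambda f: 0.4 if f >= 1000 else 1.0, probes)  # HF loss

# flag the sensor if any band drifts more than 3 dB from its baseline
faulty = [f for f in probes
          if abs(20 * math.log10(degraded[f] / baseline[f])) > 3.0]
print(faulty)  # → [1000, 2000]
```

A drift beyond the threshold in any band marks the sensor as suspect and, indirectly, its recent data as questionable.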

  7. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years many PKI (Public Key Infrastructure) architectures have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network including multi-domain PKI infrastructures.

  8. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  9. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  10. Isocurvature modes and Baryon Acoustic Oscillations II: gains from combining CMB and Large Scale Structure

    International Nuclear Information System (INIS)

    Carbone, Carmelita; Mangilli, Anna; Verde, Licia

    2011-01-01

    We consider cosmological parameter estimation in the presence of a non-zero isocurvature contribution in the primordial perturbations. A previous analysis showed that even a tiny amount of isocurvature perturbation, if not accounted for, could affect standard rulers calibration from Cosmic Microwave Background observations such as those provided by the Planck mission, affect Baryon Acoustic Oscillations interpretation, and introduce biases in the recovered dark energy properties that are larger than forecasted statistical errors from future surveys. Extending on this work, here we adopt a general fiducial cosmology which includes a varying dark energy equation of state parameter and curvature. Beside Baryon Acoustic Oscillations measurements, we include the information from the shape of the galaxy power spectrum and consider a joint analysis of a Planck-like Cosmic Microwave Background probe and a future, space-based, Large Scale Structure probe not too dissimilar from recently proposed surveys. We find that this allows one to break the degeneracies that affect the Cosmic Microwave Background and Baryon Acoustic Oscillations combination. As a result, most of the cosmological parameter systematic biases arising from an incorrect assumption on the isocurvature fraction parameter f_iso become negligible with respect to the statistical errors. We find that the Cosmic Microwave Background and Large Scale Structure combination gives a statistical error σ(f_iso) ∼ 0.008, even when curvature and a varying dark energy equation of state are included, which is smaller than the error obtained from Cosmic Microwave Background alone when flatness and a cosmological constant are assumed. These results confirm the synergy and complementarity between Cosmic Microwave Background and Large Scale Structure, and the great potential of future and planned galaxy surveys.

  11. The Cosmic Ray Energy Spectrum and Related Measurements with the Pierre Auger Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Abraham, : J.; Abreu, P.; Aglietta, M.; Aguirre, C.; Ahn, E.J.; Allard, D.; Allekotte, I.; Allen, J.; Alvarez-Muniz, J.; Ambrosio, M.; Anchordoqui, L.

    2009-06-01

    These are presentations to be presented at the 31st International Cosmic Ray Conference, in Lodz, Poland during July 2009. It consists of the following presentations: (1) Measurement of the cosmic ray energy spectrum above 10^18 eV with the Pierre Auger Observatory; (2) The cosmic ray flux observed at zenith angles larger than 60 degrees with the Pierre Auger Observatory; (3) Energy calibration of data recorded with the surface detectors of the Pierre Auger Observatory; (4) Exposure of the Hybrid Detector of The Pierre Auger Observatory; and (5) Energy scale derived from Fluorescence Telescopes using Cherenkov Light and Shower Universality.

  12. Observatories and Telescopes of Modern Times

    Science.gov (United States)

    Leverington, David

    2016-11-01

    Preface; Part I. Optical Observatories: 1. Palomar Mountain Observatory; 2. The United States Optical Observatory; 3. From the Next Generation Telescope to Gemini and SOAR; 4. Competing primary mirror designs; 5. Active optics, adaptive optics and other technical innovations; 6. European Northern Observatory and Calar Alto; 7. European Southern Observatory; 8. Mauna Kea Observatory; 9. Australian optical observatories; 10. Mount Hopkins' Whipple Observatory and the MMT; 11. Apache Point Observatory; 12. Carnegie Southern Observatory (Las Campanas); 13. Mount Graham International Optical Observatory; 14. Modern optical interferometers; 15. Solar observatories; Part II. Radio Observatories: 16. Australian radio observatories; 17. Cambridge Mullard Radio Observatory; 18. Jodrell Bank; 19. Early radio observatories away from the Australian-British axis; 20. The American National Radio Astronomy Observatory; 21. Owens Valley and Mauna Kea; 22. Further North and Central American observatories; 23. Further European and Asian radio observatories; 24. ALMA and the South Pole; Name index; Optical observatory and telescope index; Radio observatory and telescope index; General index.

  13. Space astrophysical observatory 'Orion-2'

    International Nuclear Information System (INIS)

    Gurzadyan, G.A.; Jarakyan, A.L.; Krmoyan, M.N.; Kashin, A.L.; Loretsyan, G.M.; Ohanesyan, J.B.

    1976-01-01

    Ultraviolet spectrograms of a large number of faint stars up to 13^m were obtained in the wavelength range 2000-5000 Å by means of the space observatory 'Orion-2' installed in the spaceship 'Soyuz-13' with two spacemen on board. The paper deals with a description of the operation modes of this observatory, the designs and basic schemes of the scientific and auxiliary devices and the method of combining the work of the flight engineer and the automation system of the observatory itself. It also treats of the combination of the particular parts of the 'Orion-2' observatory on board the spaceship and the measures taken to provide for its normal functioning in terms of the space flight. A detailed description is given of the optical, electrical and mechanical schemes of the devices - meniscus telescope with an objective prism, stellar diffraction spectrographs, single-coordinate and two-coordinate stellar and solar transducers, control panel, control systems, etc. The paper also provides the functional scheme of astronavigation, six-wheel stabilization, the design of mounting (assembling) the stabilized platform carrying the telescopes and the drives used in it. Problems relating to the observation program in orbit, the ballistic provision of initial data, and control of the operation of the observatory are also dealt with. In addition, the paper carries information on the photomaterials used, the methods of their energy calibration, standardization and the like. Matters of pre-start tests of apparatus, the preparation of the spacemen for conducting astronomical observations with the given devices, etc. are likewise dwelt on. The paper ends with a brief survey of the results obtained and the elaboration of the observed material. (Auth.)

  14. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  15. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  16. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

    It is thus possible to reconstruct the distribution of matter in 3 dimensions in gigantic volumes. We can then extract various statistical observables to measure the BAO scale and the scale of homogeneity of the universe. Using the Data Release 12 CMASS galaxy catalogs, we obtained a precision on the homogeneity scale five times better than the WiggleZ measurement. At large scales, the universe is remarkably well described in linear order by the ΛCDM model, the standard model of cosmology. In general, it is not necessary to take into account the nonlinear effects which complicate the model at small scales. On the other hand, at large scales the measurement of our observables becomes very sensitive to systematic effects. This is particularly true for the analysis of cosmic homogeneity, which requires an observational method that does not bias the measurement. In order to study the homogeneity principle in a model-independent way, we explore a new way to infer distances using cosmic clocks and type Ia supernovae. This establishes the Cosmological Principle using only a small number of a priori assumptions, i.e. the theory of General Relativity and astrophysical assumptions that are independent of Friedmann universes and, by extension, of the homogeneity assumption. This manuscript is organized as follows. After a short presentation of the knowledge in cosmology necessary for the understanding of this manuscript, presented in Chapter 1, Chapter 2 deals with the challenges of the Cosmological Principle and how to overcome them. In Chapter 3, we discuss the technical characteristics of large scale structure surveys, focusing in particular on the BOSS and eBOSS galaxy surveys. Chapter 4 presents the detailed analysis of the measurement of cosmic homogeneity and the various systematic effects likely to impact our observables. 
Chapter 5 will discuss how to use the cosmic homogeneity as a standard ruler to constrain dark energy models from current and future surveys. In
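    The counts-in-spheres observable behind the homogeneity scale can be sketched in a few lines. For a homogeneous point set the mean count N(<r) grows as r^3, so the correlation dimension D2 = d ln N / d ln r approaches 3; the Python toy below (a Poisson cube with invented sizes, not a galaxy catalogue) estimates D2 from two radii:

```python
import math
import random

random.seed(1)

# homogeneous (Poisson) point set in the unit cube; a fractal set would give D2 < 3
pts = [(random.random(), random.random(), random.random()) for _ in range(4000)]
# keep sphere centres away from the faces so spheres of radius <= 0.2 fit inside
centers = [p for p in pts if all(0.3 < c < 0.7 for c in p)][:200]

def mean_count(r):
    # mean number of neighbours within distance r of a centre (centre excluded)
    rr = r * r
    total = 0
    for cx, cy, cz in centers:
        total += sum(0.0 < (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 < rr
                     for x, y, z in pts)
    return total / len(centers)

# correlation dimension D2 = d ln N(<r) / d ln r estimated between two radii
r1, r2 = 0.1, 0.2
D2 = math.log(mean_count(r2) / mean_count(r1)) / math.log(r2 / r1)
print(round(D2, 1))  # close to 3 for a homogeneous distribution
```

In survey analyses the same statistic is computed on galaxy positions against a random catalogue, and the homogeneity scale is the radius where D2 reaches 3 within tolerance.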

  17. LAGO: The Latin American giant observatory

    Science.gov (United States)

    Sidelnik, Iván; Asorey, Hernán; LAGO Collaboration

    2017-12-01

    The Latin American Giant Observatory (LAGO) is an extended cosmic ray observatory composed of a network of water-Cherenkov detectors (WCD) spanning over different sites located at significantly different altitudes (from sea level up to more than 5000 m a.s.l.) and latitudes across Latin America, covering a wide range of geomagnetic rigidity cut-offs and atmospheric absorption/reaction levels. The LAGO WCD is simple and robust, and incorporates several integrated devices to allow time synchronization, autonomous operation, on board data analysis, as well as remote control and automated data transfer. This detection network is designed to make detailed measurements of the temporal evolution of the radiation flux coming from outer space at ground level. LAGO is mainly oriented to perform basic research in three areas: high energy phenomena, space weather and atmospheric radiation at ground level. It is an observatory designed, built and operated by the LAGO Collaboration, a non-centralized collaborative union of more than 30 institutions from ten countries. In this paper we describe the scientific and academic goals of the LAGO project - illustrating its present status with some recent results - and outline its future perspectives.

  18. Taurus Hill Observatory Scientific Observations for Pulkovo Observatory during the 2016-2017 Season

    Science.gov (United States)

    Hentunen, V.-P.; Haukka, H.; Heikkinen, E.; Salmi, T.; Juutilainen, J.

    2017-09-01

    Taurus Hill Observatory (THO), observatory code A95, is an amateur observatory located in Varkaus, Finland. The observatory is maintained by the local astronomical association Warkauden Kassiopeia. The THO research team has observed and measured various stellar objects and phenomena. The observatory has mainly focused on exoplanet light curve measurements, gamma-ray burst observations, and supernova discovery and monitoring. We also run long-term monitoring projects.

  19. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...
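    The study's basic observable, the width/height proportion, is cheap to reproduce. A minimal sketch (the six (width, height) pairs are invented stand-ins for the image databases):

```python
# hypothetical sample of (width, height) pairs standing in for the image databases
paintings = [(100, 81), (120, 100), (90, 60), (100, 100), (162, 100), (75, 100)]

ratios = sorted(w / h for w, h in paintings)
median = ratios[len(ratios) // 2]          # upper median for an even-sized sample

squares = sum(1 for r in ratios if abs(r - 1.0) < 0.02)   # near-square items
print(squares, round(median, 2))  # → 1 1.23
```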

  20. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. Only fragments of the abstract survive: "...arises, to reduce the spread in the LSGT 50% gap value." and "The worst charges, such as those with the highest or lowest densities, the largest re-pressed..."

  1. Large scale and performance tests of the ATLAS online software

    International Nuclear Information System (INIS)

    Alexandrov; Kotov, V.; Mineev, M.; Roumiantsev, V.; Wolters, H.; Amorim, A.; Pedro, L.; Ribeiro, A.; Badescu, E.; Caprini, M.; Burckhart-Chromek, D.; Dobson, M.; Jones, R.; Kazarov, A.; Kolos, S.; Liko, D.; Lucio, L.; Mapelli, L.; Nassiakou, M.; Schweiger, D.; Soloviev, I.; Hart, R.; Ryabov, Y.; Moneta, L.

    2001-01-01

    One of the sub-systems of the Trigger/DAQ system of the future ATLAS experiment is the Online Software system. It encompasses the functionality needed to configure, control and monitor the DAQ. Its architecture is based on a component structure described in the ATLAS Trigger/DAQ technical proposal. Regular integration tests ensure its smooth operation in test beam setups during its evolutionary development towards the final ATLAS online system. Feedback is received and fed back into the development process. Studies of the system behavior have been performed on a set of up to 111 PCs in a configuration approaching the final size. Large scale and performance tests of the integrated system were performed on this setup, with emphasis on investigating the inter-dependence of the components and the performance of the communication software. Of particular interest were the run control state transitions in various configurations of the run control hierarchy. For the purpose of the tests, the software from other Trigger/DAQ sub-systems was emulated. The authors present a brief overview of the online system structure, its components and the large scale integration tests and their results.

  2. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  3. Synthesis of Large-Scale Single-Crystalline Monolayer WS2 Using a Semi-Sealed Method

    Directory of Open Access Journals (Sweden)

    Feifei Lan

    2018-02-01

    Full Text Available As a two-dimensional semiconductor, WS2 has attracted great attention due to its rich physical properties and potential applications. However, it is still difficult to synthesize monolayer single-crystalline WS2 on a large scale. Here, we report the growth of large-scale triangular single-crystalline WS2 with a semi-sealed installation by chemical vapor deposition (CVD). Through this method, triangular single-crystalline WS2 with an average length of more than 300 µm was obtained. The largest one was about 405 μm in length. WS2 triangles with different sizes and thicknesses were analyzed by optical microscope and atomic force microscope (AFM). Their optical properties were evaluated by Raman and photoluminescence (PL) spectra. This report paves the way to fabricating large-scale single-crystalline monolayer WS2, which is useful for the growth of high-quality WS2 and its potential applications in the future.

  4. Future of Space Astronomy: A Global Road Map for the Next Decades

    Science.gov (United States)

    Ubertini, Pietro; Gehrels, Neil; Corbett, Ian; DeBernardis, Paolo; Machado, Marcos; Griffin, Matt; Hauser, Michael; Manchanda, Ravinder K.; Kawai, Nobuyuki; Zhang, Shuang-Nan; hide

    2012-01-01

    The use of space techniques continues to play a key role in the advance of astrophysics by providing access to the entire electromagnetic spectrum, from radio observations to high-energy gamma rays. The increasing size, complexity and cost of large space observatories place a growing emphasis on international collaboration. Furthermore, combining existing and future datasets from space and ground based observatories is an emerging mode of powerful and relatively inexpensive research to address problems that can only be tackled by the application of large multi-wavelength observations. While the present set of space and ground-based astronomy facilities is impressive and complete, with space and ground based astronomy telescopes nicely complementing each other, the situation becomes concerning and critical over the next 10-20 years. In fact, only a few major space missions are planned, possibly restricted to JWST and, perhaps, WFIRST and SPICA, since no other major facilities have yet been recommended. A "Working Group on the Future of Space Astronomy" was established at the 38th COSPAR Assembly held in Bremen, Germany in July 2010. The purpose of this Working Group was to establish a roadmap for future major space missions to complement future large ground-based telescopes. This paper presents the results of this study, including a number of recommendations and a road map for the next decades of Space Astronomy research.

  5. Designing Observatories for the Hydrologic Sciences

    Science.gov (United States)

    Hooper, R. P.

    2004-05-01

    The need for longer-term, multi-scale, coherent, and multi-disciplinary data to test hypotheses in hydrologic science has been recognized by numerous prestigious review panels over the past decade (e.g. NRC's Basic Research Opportunities in Earth Science). Designing such observatories has proven to be a challenge not only on the scientific level, but also on technological, economic and even sociological levels. The Consortium of Universities for the Advancement of Hydrologic Science, Inc. (CUAHSI) has undertaken a "paper" prototype design of a hydrologic observatory (HO) for the Neuse River Basin, NC and plans to solicit proposals and award grants to develop implementation plans for approximately 10 basins (which may be defined by topographic or groundwater divides) during the summer of 2004. These observatories are envisioned to be community resources with data available to all scientists, with support facilities to permit their use by both local and remote investigators. This paper presents the broad design concepts which were developed by a national team of scientists for the Neuse River Basin Prototype. There are three fundamental characteristics of a watershed or river basin that are critical for answering the major scientific questions proposed by the NRC to advance hydrologic, biogeochemical and ecological sciences: (1) the stores and fluxes of water, sediment, nutrients and contaminants across interfaces at multiple scales must be identified; (2) the residence times of these constituents must be determined; and (3) their flowpaths and response spectra to forcing must be estimated. "Stores" consist of subsurface, land surface and atmospheric volumes partitioned over the watershed. The HO will require "core measurements" which will serve the communities of hydrologic science for long range research questions. The core measurements will also provide context for shorter-term or hypothesis-driven research investigations. 
The HO will support "mobile measurement facilities" designed to support teams

  6. ArgonCube: a novel, fully-modular approach for the realization of large-mass liquid argon TPC neutrino detectors

    CERN Document Server

    Amsler, C; Asaadi, J; Auger, M; Barbato, F; Bay, F; Bishai, M; Bleiner, D; Borgschulte, A; Bremer, J; Cavus, E; Chen, H; De Geronimo, G; Ereditato, A; Fleming, B; Goldi, D; Hanni, R; Kose, U; Kreslo, I; La Mattina, F; Lanni, F; Lissauer, D; Luthi, M; Lutz, P; Marchionni, A; Mladenov, D; Nessi, M; Noto, F; Palamara, O; Raaf, J L; Radeka, V; Rudolph Von Rohr, Ch; Smargianaki, D; Soderberg, M; Strauss, Th; Weber, M; Yu, B; Zeller, G P; Zeyrek, M; CERN. Geneva. SPS and PS Experiments Committee; SPSC

    2015-01-01

    The Liquid Argon Time Projection Chamber is a prime candidate detector for future neutrino oscillation physics experiments, underground neutrino observatories and proton decay searches. A large international project based on this technology is currently being considered at the future LBNF facility in the United States on the very large mass scale of 40 kton. In this document, following the long-standing R&D work conducted over recent years in several laboratories in Europe and in the United States, we propose a novel Liquid Argon TPC approach based on a fully-modular, innovative design, the ArgonCube. The related R&D work will proceed along two main directions: one aimed at the assessment of the proposed modular detector design, the other at the exploitation of new signal readout methods. Such a strategy will provide high performance while being cost-effective and robust at the same time. According to our plans, we will firstly realize a detector prototype hosted in a cryostat that is a...

  7. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near-term projects within High Energy Physics and other computing communities will deploy clusters of thousands of processors, used by hundreds to thousands of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; (4) to identify and connect groups with similar interests within HENP and the larger clustering community.

  8. Maps on large-scale air quality concentrations in the Netherlands

    International Nuclear Information System (INIS)

    Velders, G.J.M.; Aben, J.M.M.; Beck, J.P.; Blom, W.F.; Van Dam, J.D.; Elzenga, H.E.; Geilenkirchen, G.P.; Hoen, A.; Jimmink, B.A.; Matthijsen, J.; Peek, C.J.; Van Velze, K.; Visser, H.; De Vries, W.J.

    2007-01-01

    Every year MNP produces maps showing large-scale concentrations of several air quality components in the Netherlands for which there are European regulations. The concentration maps are based on a combination of model calculations and measurements. These maps (called GCN maps) show the large-scale contribution of these components in air in the Netherlands for both past and future years. Local, provincial and other authorities use these maps for reporting exceedances in the framework of the EU Air Quality Directive and for planning. The report gives the underlying assumptions applied to the GCN maps in this 2007 report. The Dutch Ministry of Housing, Spatial Planning and the Environment (VROM) is legally responsible for selecting the scenario to be used in the GCN maps. The Ministry has chosen to base the current maps of nitrogen dioxide, particulate matter (PM10) and sulphur dioxide for 2010 up to 2020 on standing and proposed Dutch and European policies. That means that the calculations assume that the Netherlands and other European countries will meet their National Emission Ceilings (NEC) by 2010, and that emissions up to 2020 will follow the ambitions of the Thematic Strategy on Air Pollution of the European Commission. The large-scale concentrations of NO2 and PM10 presented in the GCN maps are, in 2006 and for the 2010-2020 period, below the European annual-average limit value of 40 μg m⁻³ everywhere in the Netherlands. In some locations in 2006, the large-scale concentration exceeds the European limit value for the daily average of PM10, which permits at most 35 days per year above 50 μg m⁻³. This applies close to the harbours of Amsterdam and Rotterdam and is associated with storage and handling of dry bulk material. The large-scale concentration of PM10 is below the European limit value for the daily average everywhere in 2010-2020. Several changes have been implemented, in addition to the changes in the GCN maps of last year (report March 2006). New insights into
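The daily-average PM10 limit works as a count of exceedance days rather than a single threshold; a minimal sketch of that check (the sample concentration series is invented for illustration):

```python
# EU daily PM10 limit: no more than 35 days per calendar year may have a
# daily mean above 50 ug/m3. The data below are hypothetical.
LIMIT = 50.0        # ug/m3, daily-average limit value
ALLOWED_DAYS = 35   # permitted exceedance days per year

def exceedance_days(daily_means):
    """Count the days whose daily-mean concentration exceeds the limit."""
    return sum(1 for c in daily_means if c > LIMIT)

# Hypothetical year near a bulk-handling harbour: 40 high days out of 365
daily = [60.0] * 40 + [30.0] * 325
n = exceedance_days(daily)
print(n, "exceedance days; compliant:", n <= ALLOWED_DAYS)
```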

  9. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  10. Discovery of a Large-Scale Filament Connected to the Massive Galaxy Cluster MACS J0717.5+3745 at z=0.55

    Science.gov (United States)

    Ebeling, H.; Barrett, E.; Donovan, D.

    2004-07-01

    We report the detection of a 4 h₇₀⁻¹ Mpc long large-scale filament leading into the massive galaxy cluster MACS J0717.5+3745. The extent of this object well beyond the cluster's nominal virial radius (~2.3 Mpc) rules out prior interaction between its constituent galaxies and the cluster and makes it a prime candidate for a genuine filament as opposed to a merger remnant or a double cluster. The structure was discovered as a pronounced overdensity of galaxies selected to have V-R colors close to the cluster red sequence. Extensive spectroscopic follow-up of over 300 of these galaxies in a region covering the filament and the cluster confirms that the entire structure is located at the cluster redshift of z=0.545. Featuring galaxy surface densities of typically 15 Mpc⁻² down to luminosities of 0.13 L*_V, the most diffuse parts of the filament are comparable in density to the clumps of red galaxies found around A851 in the only similar study carried out to date (Kodama et al.). Our direct detection of an extended large-scale filament funneling matter onto a massive distant cluster provides a superb target for in-depth studies of the evolution of galaxies in environments of greatly varying density and supports the predictions from theoretical models and numerical simulations of structure formation in a hierarchical picture. Some of the data presented herein were obtained at the W. M. Keck Observatory, which is operated as a scientific partnership among the California Institute of Technology, the University of California, and the National Aeronautics and Space Administration. The observatory was made possible by the generous financial support of the W. M. Keck Foundation. 
Based partly on observations obtained at the Gemini Observatory, which is operated by the Association of Universities for Research in Astronomy, Inc., under a cooperative agreement with the NSF on behalf of the Gemini partnership: the National Science Foundation (US), the Particle Physics and Astronomy

  11. Regional modeling of large wildfires under current and potential future climates in Colorado and Wyoming, USA

    Science.gov (United States)

    West, Amanda; Kumar, Sunil; Jarnevich, Catherine S.

    2016-01-01

    Regional analysis of large wildfire potential given climate change scenarios is crucial to understanding areas most at risk in the future, yet wildfire models are not often developed and tested at this spatial scale. We fit three historical climate suitability models for large wildfires (i.e. ≥ 400 ha) in Colorado and Wyoming using topography and decadal climate averages corresponding to wildfire occurrence at the same temporal scale. The historical models classified points of known large wildfire occurrence with high accuracies. Using a novel approach in wildfire modeling, we applied the historical models to independent climate and wildfire datasets, and the resulting sensitivities were 0.75, 0.81, and 0.83 for Maxent, Generalized Linear, and Multivariate Adaptive Regression Splines, respectively. We projected the historical models into future climate space using data from 15 global circulation models and two representative concentration pathway scenarios. Maps from these geospatial analyses can be used to evaluate the changing spatial distribution of climate suitability of large wildfires in these states. April relative humidity was the most important covariate in all models, providing insight into the climate space of large wildfires in this region. These methods incorporate monthly and seasonal climate averages at a spatial resolution relevant to land management (i.e. 1 km²) and provide a tool that can be modified for other regions of North America, or adapted for other parts of the world.
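The sensitivity scores reported above are the fraction of known fire locations that a fitted model classifies as climatically suitable; a minimal sketch of that validation step (the suitability threshold and sample data are illustrative assumptions, not values from the study):

```python
# Sensitivity (true-positive rate) of a suitability model on an
# independent wildfire dataset. Data and threshold are hypothetical.
def sensitivity(predicted_suitability, observed_fire, threshold=0.5):
    """Fraction of known fire points with predicted suitability >= threshold."""
    true_pos = sum(1 for p, o in zip(predicted_suitability, observed_fire)
                   if o and p >= threshold)
    fire_count = sum(1 for o in observed_fire if o)
    return true_pos / fire_count if fire_count else float("nan")

# Hypothetical model outputs at 8 validation points (1 = large wildfire occurred)
pred = [0.9, 0.7, 0.4, 0.8, 0.2, 0.6, 0.95, 0.3]
obs  = [1,   1,   1,   1,   0,   0,   1,    0]
print(round(sensitivity(pred, obs), 2))
```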

  12. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  13. Off the scale: a new species of fish-scale gecko (Squamata: Gekkonidae: Geckolepis) with exceptionally large scales

    Directory of Open Access Journals (Sweden)

    Mark D. Scherz

    2017-02-01

    Full Text Available The gecko genus Geckolepis, endemic to Madagascar and the Comoro archipelago, is taxonomically challenging. One reason is its members' ability to autotomize a large portion of their scales when grasped or touched, most likely to escape predation. Based on an integrative taxonomic approach including external morphology, morphometrics, genetics, pholidosis, and osteology, we here describe the first new species from this genus in 75 years: Geckolepis megalepis sp. nov. from the limestone karst of Ankarana in northern Madagascar. The new species has the largest known body scales of any gecko (both relatively and absolutely), which come off with exceptional ease. We provide a detailed description of the skeleton of the genus Geckolepis based on micro-Computed Tomography (micro-CT) analysis of the new species, the holotype of G. maculata, the recently resurrected G. humbloti, and a specimen belonging to an operational taxonomic unit (OTU) recently suggested to represent G. maculata. Geckolepis is characterized by highly mineralized, imbricated scales, paired frontals, and unfused subolfactory processes of the frontals, among other features. We identify diagnostic characters in the osteology of these geckos that help define our new species and show that the OTU assigned to G. maculata is probably not conspecific with it, leaving the taxonomic identity of this species unclear. We discuss possible reasons for the extremely enlarged scales of G. megalepis in the context of an anti-predator defence mechanism, and the future of Geckolepis taxonomy.

  14. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large-scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large-scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large-scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large-scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such a reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activity. We investigate for individual sites the exceedance probability with which large-scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
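The divergence field of the vertically integrated moisture flux mentioned above can be computed on a regular grid with central differences; a minimal sketch (grid spacing and the toy flux field are illustrative assumptions):

```python
# Central-difference divergence of a 2-D flux field (qx, qy) on a regular
# grid: div = d(qx)/dx + d(qy)/dy, evaluated at interior points only.
def divergence(qx, qy, dx, dy):
    ny, nx = len(qx), len(qx[0])
    div = [[0.0] * nx for _ in range(ny)]
    for j in range(1, ny - 1):
        for i in range(1, nx - 1):
            dqx = (qx[j][i + 1] - qx[j][i - 1]) / (2 * dx)
            dqy = (qy[j + 1][i] - qy[j - 1][i]) / (2 * dy)
            div[j][i] = dqx + dqy
    return div

# Toy field qx = x, qy = y: the analytic divergence is 2 everywhere
n = 5
qx = [[float(i) for i in range(n)] for _ in range(n)]
qy = [[float(j) for _ in range(n)] for j in range(n)]
d = divergence(qx, qy, dx=1.0, dy=1.0)
print(d[2][2])
```

In the study's setting, maps of this field over the days preceding each flood are the high-dimensional inputs that the kernel PCA step then compresses before clustering.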

  15. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications developed for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  16. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  17. A DATA-DRIVEN ANALYTIC MODEL FOR PROTON ACCELERATION BY LARGE-SCALE SOLAR CORONAL SHOCKS

    Energy Technology Data Exchange (ETDEWEB)

    Kozarev, Kamen A. [Smithsonian Astrophysical Observatory (United States); Schwadron, Nathan A. [Institute for the Study of Earth, Oceans, and Space, University of New Hampshire (United States)

    2016-11-10

    We have recently studied the development of an eruptive filament-driven, large-scale off-limb coronal bright front (OCBF) in the low solar corona, using remote observations from the Solar Dynamics Observatory's Advanced Imaging Assembly EUV telescopes. In that study, we obtained high-temporal resolution estimates of the OCBF parameters regulating the efficiency of charged particle acceleration within the theoretical framework of diffusive shock acceleration (DSA). These parameters include the time-dependent front size, speed, and strength, as well as the upstream coronal magnetic field orientations with respect to the front's surface normal direction. Here we present an analytical particle acceleration model, specifically developed to incorporate the coronal shock/compressive front properties described above, derived from remote observations. We verify the model's performance through a grid of idealized case runs using input parameters typical for large-scale coronal shocks, and demonstrate that the results approach the expected DSA steady-state behavior. We then apply the model to the event of 2011 May 11 using the OCBF time-dependent parameters derived by Kozarev et al. We find that the compressive front likely produced energetic particles as low as 1.3 solar radii in the corona. Comparing the modeled and observed fluences near Earth, we also find that the bulk of the acceleration during this event must have occurred above 1.5 solar radii. With this study we have taken a first step in using direct observations of shocks and compressions in the innermost corona to predict the onsets and intensities of solar energetic particle events.
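The DSA steady-state behavior the model is verified against has a classic benchmark; as a reminder, the standard test-particle result (a textbook relation, not the paper's own derivation) is:

```latex
% Steady-state test-particle DSA: a shock with compression ratio r
% produces a downstream phase-space power law
f(p) \propto p^{-q}, \qquad q = \frac{3r}{r-1},
% so a strong shock (r = 4) yields q = 4, i.e. f(p) \propto p^{-4}.
```

The data-driven model goes beyond this limit by letting the front size, speed, strength, and field obliquity vary in time as measured from the EUV observations.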

  18. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  20. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  1. Newly Uncovered Large-Scale Component of the Northern Jet in R Aqr

    Science.gov (United States)

    Montez, Rodolfo; Karovska, Margarita; Nichols, Joy S.; Kashyap, Vinay

    2017-06-01

    R Aqr is a symbiotic system comprising a compact white dwarf and a Mira giant star. The interaction of these stars is responsible for the presence of a two-sided jet structure that is seen across the electromagnetic spectrum. X-ray emission from the jet was first discovered in 2000 with an observation by the Chandra X-ray Observatory. Since then, follow-up observations have traced the evolution of the X-ray emission from the jet and a central compact source. In X-rays, the NE jet is brighter than the SW jet, but the SW jet was larger in extent before it began fading below the detection threshold. However, we have uncovered evidence for large-scale emission associated with the NE jet that matches the extent of the SW jet. The emission has escaped previous identification because it is near the detection threshold, but it has been present since the first 2000 observation and clearly evolves in subsequent observations. We present our study of the emission from this component of the NE jet, its relationship to multiwavelength observations, and how it impacts our interpretation of the jet phenomenon in R Aqr.

  2. Citizen Observatories: A Standards Based Architecture

    Science.gov (United States)

    Simonis, Ingo

    2015-04-01

    A number of large-scale research projects are currently under way exploring the various components of citizen observatories, e.g. CITI-SENSE (http://www.citi-sense.eu), Citclops (http://citclops.eu), COBWEB (http://cobwebproject.eu), OMNISCIENTIS (http://www.omniscientis.eu), and WeSenseIt (http://www.wesenseit.eu). Common to all projects is the motivation to develop a platform enabling effective participation by citizens in environmental projects, while considering important aspects such as security, privacy, long-term storage and availability, accessibility of raw and processed data, and its proper integration into catalogues and international exchange and collaboration systems such as GEOSS or INSPIRE. This paper describes the software architecture implemented for setting up crowdsourcing campaigns using standardized components, interfaces, security features, and distribution capabilities. It illustrates the Citizen Observatory Toolkit, a software suite for defining crowdsourcing campaigns, inviting registered and unregistered participants to take part in them, and analyzing, processing, and visualizing raw and quality-enhanced crowdsourcing data and derived products. The Citizen Observatory Toolkit is not a single software product. Instead, it is a framework of components that are built using internationally adopted standards wherever possible (e.g. OGC standards from Sensor Web Enablement, GeoPackage, and Web Mapping and Processing Services, as well as security and metadata/cataloguing standards), defines profiles of those standards where necessary (e.g. SWE O&M profile, SensorML profile), and implements design decisions based on the motivation to maximize interoperability and reusability of all components. 
The toolkit contains tools to set up, manage and maintain crowdsourcing campaigns, allows building on-demand apps optimized for the specific sampling focus, supports offline and online sampling modes using modern cell phones with

  3. The U.S. NSF Ocean Observatories Initiative: A Modern Virtual Observatory

    Science.gov (United States)

    Orcutt, John; Vernon, Frank; Peach, Cheryl; Arrott, Matthew; Graybeal, John; Farcas, Claudiu; Farcas, Emilia; Krueger, Ingolf; Meisinger, Michael; Chave, Alan

    2010-05-01

    The NSF Ocean Observatories Initiative (OOI) began a five-year construction period in October 2009. The Consortium for Ocean Leadership (COL) manages the overall program with Implementing Organizations for Coastal/Global Scale Nodes (CGSN) at Woods Hole, Oregon State and Scripps; the Regional Cabled Network (RCN) at U of Washington and Cyberinfrastructure (CI) at UCSD and more than ten subcontractors. The NSF has made a commitment to support the observatory operations and maintenance for a 30-year period; a minimal period of time to measure physical, chemical and biological data over a length of time possibly sufficient to measure secular changes associated with climate and geodesy. The CI component is a substantial departure from previous approaches to data distribution and management. These innovations include the availability of data in near-real-time with latencies of seconds, open access to all data, analysis of the data stream for detection and modeling, use of the derived knowledge to modify the network with minimal or no human interaction, and maintenance of data provenance through time as new versions of the data are created through QA/QC processes. The network architecture is designed to be scalable so that addition of new sensors is straightforward and inexpensive, with costs increasing linearly at worst. Rather than building new computer infrastructure (disk farms and computer clusters), we are presently exploiting Amazon's Elastic Compute Cloud (EC2) and Simple Storage Service (S3) to reduce long-term commitments to hardware and maintenance in order to minimize operations and maintenance costs. The OOI CI is actively partnering with other organizations (e.g. NOAA's IOOS) to integrate existing data systems using many of the same technologies to improve broad access to existing and planned observing systems, including those that provide critical climate data. Because seasonal and annual variability of most measurable parameters is so large, the

  4. Integrating Near Fault Observatories (NFO) for EPOS Implementation Phase

    Science.gov (United States)

    Chiaraluce, Lauro

    2015-04-01

    Following the European Plate Observing System (EPOS) project vision aimed at creating a pan-European infrastructure for Earth sciences to support science for a more sustainable society, we are working on the integration of Near-Fault Observatories (NFOs). NFOs are state-of-the-art research infrastructures consisting of advanced networks of multi-parametric sensors continuously monitoring the chemical and physical processes related to the common underlying earth instabilities governing active fault evolution and the genesis of earthquakes. Such a methodological approach, currently applicable only at the local scale (areas of tens to a few hundred kilometres), is based on extremely dense networks and less common instruments, requiring extraordinary work on data quality control and multi-parameter data description. These networks in fact usually complement regional seismic and geodetic networks (typically with station spacing of 50-100 km) with high-density distributions of seismic, geodetic, geochemical and geophysical sensors located typically within 10-20 km of active faults where large earthquakes are expected in the future. In the initial phase of EPOS-IP, seven NFO nodes will be linked: the Alto Tiberina and Irpinia Observatories in Italy, the Corinth Observatory in Greece, the South-Iceland Seismic Zone, the Valais Observatory in Switzerland, the Marmara Sea GEO Supersite in Turkey (EU MARSite) and the Vrancea Observatory in Romania. Our work is aimed at establishing standards and integration within this first core group of NFOs, while other NFOs are expected to be installed in the coming years adopting the standards established and developed within the EPOS Thematic Core Services (TCS). The goal of our group is to build upon the initial development supported by these few key national observatories coordinated under previous EU projects (NERA and REAKT), inclusive and harmonised TCS supporting the installation over the next decade of tens of near

  5. Optical Manufacturing and Testing Requirements Identified by the NASA Science Instruments, Observatories and Sensor Systems Technology Assessment

    Science.gov (United States)

    Stahl, H. Philip; Barney, Rich; Bauman, Jill; Feinberg, Lee; Mcleese, Dan; Singh, Upendra

    2011-01-01

    In August 2010, the NASA Office of the Chief Technologist (OCT) commissioned an assessment of 15 different technology areas of importance to the future of NASA. Technology assessment #8 (TA8) was Science Instruments, Observatories and Sensor Systems (SIOSS). SIOSS assessed the needs for optical technology ranging from detectors to lasers, X-ray mirrors to microwave antennas, and in-situ spectrographs for on-surface planetary sample characterization to large space telescopes. The needs assessment looked across the entirety of NASA, not just the Science Mission Directorate. This paper reviews the optical manufacturing and testing technologies identified by SIOSS that require development in order to enable future high-priority NASA missions.

  6. Large-scale tropospheric transport in the Chemistry-Climate Model Initiative (CCMI) simulations

    Science.gov (United States)

    Orbe, Clara; Yang, Huang; Waugh, Darryn W.; Zeng, Guang; Morgenstern, Olaf; Kinnison, Douglas E.; Lamarque, Jean-Francois; Tilmes, Simone; Plummer, David A.; Scinocca, John F.; Josse, Beatrice; Marecal, Virginie; Jöckel, Patrick; Oman, Luke D.; Strahan, Susan E.; Deushi, Makoto; Tanaka, Taichu Y.; Yoshida, Kohei; Akiyoshi, Hideharu; Yamashita, Yousuke; Stenke, Andreas; Revell, Laura; Sukhodolov, Timofei; Rozanov, Eugene; Pitari, Giovanni; Visioni, Daniele; Stone, Kane A.; Schofield, Robyn; Banerjee, Antara

    2018-05-01

    Understanding and modeling the large-scale transport of trace gases and aerosols is important for interpreting past (and projecting future) changes in atmospheric composition. Here we show that there are large differences in the global-scale atmospheric transport properties among the models participating in the IGAC SPARC Chemistry-Climate Model Initiative (CCMI). Specifically, we find up to 40 % differences in the transport timescales connecting the Northern Hemisphere (NH) midlatitude surface to the Arctic and to Southern Hemisphere high latitudes, where the mean age ranges between 1.7 and 2.6 years. We show that these differences are related to large differences in vertical transport among the simulations, in particular to differences in parameterized convection over the oceans. While stronger convection over NH midlatitudes is associated with slower transport to the Arctic, stronger convection in the tropics and subtropics is associated with faster interhemispheric transport. We also show that the differences among simulations constrained with fields derived from the same reanalysis products are as large as (and in some cases larger than) the differences among free-running simulations, most likely due to larger differences in parameterized convection. Our results indicate that care must be taken when using simulations constrained with analyzed winds to interpret the influence of meteorology on tropospheric composition.
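    One plausible reading of how the quoted "up to 40 %" figure relates to the quoted mean-age range (an illustrative interpretation, not necessarily the paper's actual metric) is the range expressed as a relative spread about the midpoint:

```python
# Quoted figures from the abstract: mean age from the NH midlatitude surface
# ranges from 1.7 to 2.6 years across the CCMI models.
lo, hi = 1.7, 2.6

# Relative spread about the midpoint of the two extremes, roughly consistent
# with the quoted ~40 % differences in transport timescales.
spread = (hi - lo) / ((hi + lo) / 2)
print(f"relative spread: {spread:.0%}")
```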

  7. Large-scale tropospheric transport in the Chemistry–Climate Model Initiative (CCMI) simulations

    Directory of Open Access Journals (Sweden)

    C. Orbe

    2018-05-01

    Full Text Available Understanding and modeling the large-scale transport of trace gases and aerosols is important for interpreting past (and projecting future) changes in atmospheric composition. Here we show that there are large differences in the global-scale atmospheric transport properties among the models participating in the IGAC SPARC Chemistry–Climate Model Initiative (CCMI). Specifically, we find up to 40 % differences in the transport timescales connecting the Northern Hemisphere (NH) midlatitude surface to the Arctic and to Southern Hemisphere high latitudes, where the mean age ranges between 1.7 and 2.6 years. We show that these differences are related to large differences in vertical transport among the simulations, in particular to differences in parameterized convection over the oceans. While stronger convection over NH midlatitudes is associated with slower transport to the Arctic, stronger convection in the tropics and subtropics is associated with faster interhemispheric transport. We also show that the differences among simulations constrained with fields derived from the same reanalysis products are as large as (and in some cases larger than) the differences among free-running simulations, most likely due to larger differences in parameterized convection. Our results indicate that care must be taken when using simulations constrained with analyzed winds to interpret the influence of meteorology on tropospheric composition.

  8. Laboratory astrophysics. Model experiments of astrophysics with large-scale lasers

    International Nuclear Information System (INIS)

    Takabe, Hideaki

    2012-01-01

    I would like to review the model experiment of astrophysics with high-power, large-scale lasers constructed mainly for laser nuclear fusion research. The four research directions of this new field named 'Laser Astrophysics' are described with four examples mainly promoted in our institute. The description is of magazine style so as to be easily understood by non-specialists. A new theory and its model experiment on the collisionless shock and particle acceleration observed in supernova remnants (SNRs) are explained in detail and its result and coming research direction are clarified. In addition, the vacuum breakdown experiment to be realized with the near future ultra-intense laser is also introduced. (author)

  9. Observatory Magnetometer In-Situ Calibration

    Directory of Open Access Journals (Sweden)

    A Marusenkov

    2011-07-01

    Full Text Available An experimental validation of the in-situ calibration procedure, which allows estimating the parameters of observatory magnetometers (scale factors, sensor misalignment) without interrupting their operation, is presented. In order to control the validity of the procedure, the records provided by two magnetometers calibrated independently in a coil system have been processed. The in-situ estimates of the parameters are in very good agreement with the values provided by the coil-system calibration.
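    The kind of in-situ estimation described, recovering scale factors and misalignment from parallel records without interrupting recording, can be sketched as a linear least-squares fit of one instrument's components against a co-located reference. The 3×3 response matrix, noise level and field statistics below are illustrative assumptions, not the authors' procedure:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic field variations (nT) seen by both co-located instruments.
B_ref = rng.normal(scale=20.0, size=(500, 3))

# Hypothetical instrument response: per-axis scale factors plus a small
# misalignment, modeled together as a 3x3 matrix M acting on the true field.
M_true = np.diag([1.02, 0.98, 1.01]) + np.array([[0.0, 0.004, -0.002],
                                                 [-0.003, 0.0, 0.005],
                                                 [0.001, -0.004, 0.0]])
B_test = B_ref @ M_true.T + rng.normal(scale=0.05, size=(500, 3))  # sensor noise

# In-situ estimate: least-squares fit of B_test on B_ref recovers M from the
# routine records themselves, with no interruption of operation.
X, *_ = np.linalg.lstsq(B_ref, B_test, rcond=None)
M_est = X.T
print("estimated scale factors:", np.round(np.diag(M_est), 3))
```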

  10. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representative scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has yet to be clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before reaching a stable state where Heaps' law still holds while strict Zipf's law disappears. These findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of the scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early stage of a pandemic disease.
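    The interplay of the two laws can be illustrated with a toy sketch (the finite type pool and Zipf exponent below are chosen for illustration and have nothing to do with the paper's metapopulation model): drawing tokens from a Zipf-distributed pool yields the sublinear growth of distinct types that Heaps' law describes.

```python
import numpy as np

rng = np.random.default_rng(0)

# Zipf's law: the k-th most common type occurs with probability ~ 1/k^a.
a, K = 1.1, 5000
ranks = np.arange(1, K + 1)
p = ranks ** (-a)
p /= p.sum()

# Draw a token stream and track vocabulary growth (Heaps' law: V(n) ~ n^b).
n_tokens = 20_000
stream = rng.choice(ranks, size=n_tokens, p=p)
seen, vocab = set(), []
for tok in stream:
    seen.add(tok)
    vocab.append(len(seen))

# Fit the Heaps exponent on a log-log scale; sublinear growth gives b < 1.
n = np.arange(1, n_tokens + 1)
b = np.polyfit(np.log(n[100:]), np.log(np.array(vocab)[100:]), 1)[0]
print(f"Heaps exponent b ≈ {b:.2f}")
```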

  11. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.
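    The paper's central diagnostic, anomalous suppression of large-scale density fluctuations, can be illustrated in one dimension (a toy sketch, not the authors' amorphous-ice analysis): the variance of point counts in large windows grows with window size for a Poisson pattern but stays bounded for a hyperuniform pattern such as a jittered lattice.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 100_000
box = float(N)                # unit number density in 1D

# Two point patterns: an ideal-gas (Poisson) pattern, and a jittered lattice,
# a textbook example of a hyperuniform system.
poisson = rng.uniform(0, box, N)
lattice = np.arange(N) + rng.uniform(-0.3, 0.3, N)

def count_variance(points, L, trials=2000):
    """Variance of the number of points in randomly placed windows of length L."""
    pts = np.sort(points)
    starts = rng.uniform(0, box - L, trials)
    counts = np.searchsorted(pts, starts + L) - np.searchsorted(pts, starts)
    return counts.var()

L = 1000.0
v_poisson = count_variance(poisson, L)
v_lattice = count_variance(lattice, L)
print(f"Poisson window variance: {v_poisson:.1f}")   # grows ~ L (extensive)
print(f"Lattice window variance: {v_lattice:.3f}")   # stays O(1): hyperuniform
```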

  12. Local scale decision support systems - actual situation and trends for the future

    International Nuclear Information System (INIS)

    Govaerts, P.

    1993-01-01

    Based on the communications presented in the session on local-scale decision support systems, some common trends for these models have been identified. Over the last decade, the evolutionary change of these models has been related to better insight into decisions to be taken with respect to interventions, the acceptance of large uncertainties, the perceived importance of social and economic factors, and a shift in the identity of the user. A more revolutionary change is predicted for the near future, putting most emphasis on the predictive mode, extending the integration of monitoring data into the decision support system, and the use of pre-established scenarios. The local-scale decision support system will become the key module of the off-site emergency control room. (author)

  13. Lockheed Solar Observatory and the Discovery of Moreton-Ramsey Waves

    Science.gov (United States)

    Tarbell, Theodore D.

    2014-06-01

    Moreton Waves are high-speed disturbances seen traveling away from large solar flares in H-alpha movies of the solar chromosphere. They were discovered by the observer Harry Ramsey in the late 1950s, and then published and publicized by the director Gail Moreton, both of the Lockheed Solar Observatory in the Hollywood Hills of Southern California. These efforts established the scientific reputation and secured continuing funding of the observatory, whose present-day successor is the Lockheed Martin Solar and Astrophysics Lab in Palo Alto. Moreton waves are rare, and there was limited interest in them until the EIT instrument on SOHO began seeing large numbers of similar waves in the corona in the late 1990s. The exact relation between the two observations is still a research topic today. This talk will describe some of the history of the observatory and the discovery and early interpretation of the waves.

  14. Passive technologies for future large-scale photonic integrated circuits on silicon: polarization handling, light non-reciprocity and loss reduction

    Directory of Open Access Journals (Sweden)

    Daoxin Dai

    2012-03-01

    Full Text Available Silicon-based large-scale photonic integrated circuits are becoming important, due to the need for higher complexity and lower cost for optical transmitters, receivers and optical buffers. In this paper, passive technologies for large-scale photonic integrated circuits are described, including polarization handling, light non-reciprocity and loss reduction. The design rule for polarization beam splitters based on asymmetrical directional couplers is summarized and several novel designs for ultra-short polarization beam splitters are reviewed. A novel concept for realizing a polarization splitter–rotator is presented with a very simple fabrication process. Realization of silicon-based light non-reciprocity devices (e.g., optical isolators), which is very important for transmitters to avoid sensitivity to reflections, is also demonstrated with the help of magneto-optical material by the bonding technology. Low-loss waveguides are another important technology for large-scale photonic integrated circuits. Ultra-low loss optical waveguides are achieved by designing a Si3N4 core with a very high aspect ratio. The loss is reduced further to <0.1 dB m⁻¹ with an improved fabrication process incorporating a high-quality thermal oxide upper cladding by means of wafer bonding. With the developed ultra-low loss Si3N4 optical waveguides, some devices are also demonstrated, including ultra-high-Q ring resonators, low-loss arrayed-waveguide grating (de)multiplexers, and high-extinction-ratio polarizers.

  15. Managing sensitive phenotypic data and biomaterial in large-scale collaborative psychiatric genetic research projects: practical considerations.

    Science.gov (United States)

    Demiroglu, S Y; Skrowny, D; Quade, M; Schwanke, J; Budde, M; Gullatz, V; Reich-Erkelenz, D; Jakob, J J; Falkai, P; Rienhoff, O; Helbing, K; Heilbronner, U; Schulze, T G

    2012-12-01

    Large-scale collaborative research will be a hallmark of future psychiatric genetic research. Ideally, both academic and non-academic institutions should be able to participate in such collaborations to allow for the establishment of very large samples in a straightforward manner. Any such endeavor requires an easy-to-implement information technology (IT) framework. Here we present the requirements for a centralized framework and describe how they can be met through a modular IT toolbox.

  16. A new regard about Surlari National Geomagnetic Observatory

    Science.gov (United States)

    Asimopolos, Laurentiu; Asimopolos, Natalia-Silvia; Pestina, Agata-Monica

    2010-05-01

    Geomagnetic field study at Romanian stations started with irregular measurements in the late 19th century. In 1943, the foundation of the Surlari National Geomagnetic Observatory (SNGO) marked the beginning of a new era in the systematic study of the geomagnetic field, through continuous registration of its variations and standard absolute measurements in a fundamental station. The location of the observatory meets the highest exigencies, being situated in physical-geological conditions of a uniform local field, at a reasonably long distance from human activities. Its laboratories observe strict conditions of non-magnetism, ensuring the possibility of absolute standard measurements (national magnetic standards) for all units in the country, civil or military, that are equipped with instruments based on geomagnetic metrology. These basic conditions have allowed the observatory, by developing its initial preoccupations, to become a centre of complex geomagnetic research, constantly involved in national and international issues, promoting new themes in our country and bringing significant contributions. During the last two decades, the infrastructure and equipment used to monitor the geomagnetic field at European and planetary level have developed remarkably. New registering techniques have allowed complete automation of data acquisition, while the sampling step and measurement precision have improved by two orders of magnitude. Systems transmitting these data in real time to world collecting centres have made global studies possible, suitable for following phenomena at planetary scale. At the same time, significant development of the procedures for processing primary data has been achieved, based on standardized programmes.
The new stage of this fundamental research, largely applicable in various fields, is also marked by the simultaneous observation of the space-time distribution of the terrestrial electromagnetic field by means of

  17. The Observatory as Laboratory: Spectral Analysis at Mount Wilson Observatory

    Science.gov (United States)

    Brashear, Ronald

    2018-01-01

    This paper will discuss the seminal changes in astronomical research practices made at the Mount Wilson Observatory in the early twentieth century by George Ellery Hale and his staff. Hale’s desire to set the agenda for solar and stellar astronomical research is often described in terms of his new telescopes, primarily the solar tower observatories and the 60- and 100-inch telescopes on Mount Wilson. This paper will focus more on the ancillary but no less critical parts of Hale’s research mission: the establishment of associated “physical” laboratories as part of the observatory complex where observational spectral data could be quickly compared with spectra obtained using specialized laboratory equipment. Hale built a spectroscopic laboratory on the mountain and a more elaborate physical laboratory in Pasadena and staffed it with highly trained physicists, not classically trained astronomers. The success of Hale’s vision for an astronomical observatory quickly made the Carnegie Institution’s Mount Wilson Observatory one of the most important astrophysical research centers in the world.

  18. The US Department of Energy Nuclear Data and Low Energy Physics Programs: Aspects of current operational status and future direction

    International Nuclear Information System (INIS)

    Whetstone, S.L.; Meyer, R.A.

    1991-01-01

    The Nuclear Data and Low-Energy Programs are operated within the Division of Nuclear Physics of the US Department of Energy. The data program supports a range of activities including large-scale data measurements, nuclear cross-section modelling, and nuclear data compilation and dissemination. US nuclear data needs and prospects for the future of this effort are currently being addressed, and the program's present status is reviewed. Possibilities for next-generation nuclear data accessibility will be discussed and examples presented. The Low-Energy Nuclear Physics Program supports investigations into low-energy nuclear structure and neutrino physics. Among the examples of the latter that are covered is the Sudbury Neutrino Observatory

  19. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large density fluctuations on small scales and small fluctuations on large scales. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low-density universe, as observed. Detailed analytical calculations and numerical simulations of the spatial and velocity correlations are given. 38 refs., 6 figs

  20. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Applying fracture mechanics to very important and/or complicated structures, such as reactor pressure vessels, raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases, verification by large-scale testing is necessary and valuable. This paper discusses problems connected with planning such experiments with respect to their limitations and the requirements for reliably transferring the results to an actual vessel. At the same time, the potential of small-scale model experiments is analysed, mostly in connection with relating results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  1. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  2. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark, and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from...... small and large scale model tests show no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping scale effects

  3. A Combined Ethical and Scientific Analysis of Large-scale Tests of Solar Climate Engineering

    Science.gov (United States)

    Ackerman, T. P.

    2017-12-01

    Our research group recently published an analysis of the combined ethical and scientific issues surrounding large-scale testing of stratospheric aerosol injection (SAI; Lenferna et al., 2017, Earth's Future). We are expanding this study in two directions. The first is extending this same analysis to other geoengineering techniques, particularly marine cloud brightening (MCB). MCB differs substantially from SAI in this context because MCB can be tested over significantly smaller areas of the planet and, following injection, has a much shorter lifetime of weeks as opposed to years for SAI. We examine issues such as the role of intent, the lesser of two evils, and the nature of consent. In addition, several groups are currently considering climate engineering governance tools such as a code of ethics and a registry. We examine how these tools might influence climate engineering research programs and, specifically, large-scale testing. The second direction of expansion is asking whether the ethical and scientific issues associated with large-scale testing are so significant that they effectively preclude moving ahead with climate engineering research and testing. Some previous authors have suggested that no research should take place until these issues are resolved. We think this position is too draconian and consider a more nuanced version of this argument. We note, however, that there are serious questions regarding the ability of the scientific research community to move to the point of carrying out large-scale tests.

  4. The National Virtual Observatory Science Definition Team: Report and Status

    Science.gov (United States)

    Djorgovski, S. G.; NVO SDT Team

    2002-05-01

    Astronomy has become an enormously data-rich science, with numerous multi-Terabyte sky surveys and archives over the full range of wavelengths, and Petabyte-scale data sets already on the horizon. The amount of the available information is growing exponentially, largely driven by the progress in detector and information technology, and the quality and complexity of the data are unprecedented. This great quantitative advance will result in qualitative changes in the way astronomy is done. The Virtual Observatory concept is the astronomy community's organized response to the challenges posed by efficient handling and scientific exploration of new, massive data sets. The NAS Decadal Survey, Astronomy and Astrophysics in the New Millennium, recommends as the first priority in the ``small'' projects category creation of the National Virtual Observatory (NVO). In response to this, the NSF and NASA formed in June 2001 the NVO Science Definition Team (SDT), with a mandate to: (1) Define and formulate a joint NASA/NSF initiative to pursue the NVO goals; (2) Solicit input from the U.S. astronomy community, and incorporate it in the NVO definition documents and recommendations for further actions; and (3) Serve as liaison to broader space science, computer science, and statistics communities for the NVO initiative, and as liaison with the similar efforts in Europe, looking forward towards a truly Global Virtual Observatory. The Team has delivered its report to the agencies and made it publicly available on its website (http://nvosdt.org), where many other relevant links can be found. We will summarize the report, its conclusions, and recommendations.

  5. Concepts for Future Large Fire Modeling

    Science.gov (United States)

    A. P. Dimitrakopoulos; R. E. Martin

    1987-01-01

    A small number of fires escape initial attack suppression efforts and become large, but their effects are significant and disproportionate. In 1983, of 200,000 wildland fires in the United States, only 4,000 exceeded 100 acres. However, these escaped fires accounted for roughly 95 percent of wildfire-related costs and damages (Pyne, 1984). Thus, future research efforts...

  6. Searching the Heavens and the Earth: The History of Jesuit Observatories

    Science.gov (United States)

    Udías, Agustín

    2003-10-01

    Jesuits established a large number of astronomical, geophysical and meteorological observatories during the 17th and 18th centuries and again during the 19th and 20th centuries throughout the world. The history of these observatories has never been published in a complete form. Many early European astronomical observatories were established in Jesuit colleges. During the 17th and 18th centuries Jesuits were the first western scientists to enter into contact with China and India. It was through them that western astronomy was first introduced in these countries. They made early astronomical observations in India and China and they directed for 150 years the Imperial Observatory of Beijing. In the 19th and 20th centuries a new set of observatories were established. Besides astronomy these now included meteorology and geophysics. Jesuits established some of the earliest observatories in Africa, South America and the Far East. Jesuit observatories constitute an often forgotten chapter of the history of these sciences. This volume is aimed at all scientists and students who do not want to forget the Jesuit contributions to science. Link: http://www.wkap.nl/prod/b/1-4020-1189-X

  7. The Importance of Marine Observatories and of RAIA in Particular

    Directory of Open Access Journals (Sweden)

    Luísa Bastos

    2016-08-01

    Full Text Available Coastal and Oceanic Observatories are important tools to provide information on ocean state, phenomena and processes. They meet the need for a better understanding of coastal and ocean dynamics, revealing regional characteristics and vulnerabilities. These observatories are extremely useful to guide human actions in response to natural events and potential climate change impacts, anticipating the occurrence of extreme weather and oceanic events and helping to minimize consequent personal and material damages and costs. International organizations and local governments have shown an increasing interest in operational oceanography and coastal, marine and oceanic observations, which resulted in substantial investments in these areas. A variety of physical, chemical and biological data have been collected to better understand the specific characteristics of each ocean area and its importance in the global context. Also the general public’s interest in marine issues and observatories has been raised, mainly in relation to vulnerability, sustainability and climate change issues. Data and products obtained by an observatory are hence useful to a broad range of stakeholders, from national and local authorities to the population in general. An introduction to Ocean Observatories, including their national and regional importance, and a brief analysis of the societal interest in these observatories and related issues are presented. The potential of a Coastal and Ocean Observatory is then demonstrated using the RAIA observatory as example. This modern and comprehensive observatory is dedicated to improve operational oceanography, technology and marine science for the North Western Iberian coast, and to provide services to a large range of stakeholders.

  8. Going Digital: A Survey on Digitalization and Large Scale Data Analytics in Healthcare

    OpenAIRE

    Tresp, Volker; Overhage, J. Marc; Bundschus, Markus; Rabizadeh, Shahrooz; Fasching, Peter A.; Yu, Shipeng

    2016-01-01

    We provide an overview of the recent trends towards digitalization and large scale data analytics in healthcare. It is expected that these trends are instrumental in the dramatic changes in the way healthcare will be organized in the future. We discuss the recent political initiatives designed to shift care delivery processes from paper to electronic, with the goals of more effective treatments with better outcomes; cost pressure is a major driver of innovation. We describe newly developed ne...

  9. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems, discussed at the ''Large-scale Structure of the Universe'' symposium are considered on a popular level. Described are the cell structure of galaxy distribution in the Universe, principles of mathematical galaxy distribution modelling. The images of cell structures, obtained after reprocessing with the computer are given. Discussed are three hypothesis - vortical, entropic, adiabatic, suggesting various processes of galaxy and galaxy clusters origin. A considerable advantage of the adiabatic hypothesis is recognized. The relict radiation, as a method of direct studying the processes taking place in the Universe is considered. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the turbance properties at the pre-galaxy stage. The discussion of problems, pertaining to studying the hot gas, contained in galaxy clusters, the interactions within galaxy clusters and with the inter-galaxy medium, is recognized to be a notable contribution into the development of theoretical and observational cosmology

  10. Nutrient removal from Chinese coastal waters by large-scale seaweed aquaculture

    KAUST Repository

    Xiao, Xi

    2017-04-21

    China is facing intense coastal eutrophication. Large-scale seaweed aquaculture in China is popular, now accounting for over two-thirds of global production. Here, we estimate the nutrient removal capability of large-scale Chinese seaweed farms to determine its significance in mitigating eutrophication. We combined estimates of yield and nutrient concentration of Chinese seaweed aquaculture to quantify that one hectare of seaweed aquaculture removes the equivalent nutrient inputs entering 17.8 ha for nitrogen and 126.7 ha for phosphorus of Chinese coastal waters, respectively. Chinese seaweed aquaculture annually removes approximately 75,000 t nitrogen and 9,500 t phosphorus. Whereas removal of the total N inputs to Chinese coastal waters requires a seaweed farming area 17 times larger than the extant area, one and a half times more of the seaweed area would be able to remove close to 100% of the P inputs. With the current growth rate of seaweed aquaculture, we project this industry will remove 100% of the current phosphorus inputs to Chinese coastal waters by 2026. Hence, seaweed aquaculture already plays a hitherto unrealized role in mitigating coastal eutrophication, a role that may be greatly expanded with future growth of seaweed aquaculture.

  11. Nutrient removal from Chinese coastal waters by large-scale seaweed aquaculture

    KAUST Repository

    Xiao, Xi; Agusti, Susana; Lin, Fang; Li, Ke; Pan, Yaoru; Yu, Yan; Zheng, Yuhan; Wu, Jiaping; Duarte, Carlos M.

    2017-01-01

    China is facing intense coastal eutrophication. Large-scale seaweed aquaculture in China is popular, now accounting for over two-thirds of global production. Here, we estimate the nutrient removal capability of large-scale Chinese seaweed farms to determine its significance in mitigating eutrophication. We combined estimates of yield and nutrient concentration of Chinese seaweed aquaculture to quantify that one hectare of seaweed aquaculture removes the equivalent nutrient inputs entering 17.8 ha for nitrogen and 126.7 ha for phosphorus of Chinese coastal waters, respectively. Chinese seaweed aquaculture annually removes approximately 75,000 t nitrogen and 9,500 t phosphorus. Whereas removal of the total N inputs to Chinese coastal waters requires a seaweed farming area 17 times larger than the extant area, one and a half times more of the seaweed area would be able to remove close to 100% of the P inputs. With the current growth rate of seaweed aquaculture, we project this industry will remove 100% of the current phosphorus inputs to Chinese coastal waters by 2026. Hence, seaweed aquaculture already plays a hitherto unrealized role in mitigating coastal eutrophication, a role that may be greatly expanded with future growth of seaweed aquaculture.

  12. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling-and-blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert evaluations for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter recording to a laptop. The registration results of surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises near 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The seismic effect was evaluated for safety against the permissible value of vibration velocity, and for cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.

  13. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight to large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathlines data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improves the performance and scalability of our approach.
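
    The per-pixel linked list idea can be sketched in a few lines: each screen pixel stores the list of pathline segments that project onto it, so filtering and color-coding later run entirely in image space without touching the original flow data. This is a minimal CPU sketch under assumed names and a pre-computed 2D projection, not the thesis' GPU implementation:

```python
from collections import defaultdict

def bin_segments(segments, width, height):
    """segments: iterable of (x, y, attribute) already projected to screen.
    Returns a per-pixel list of segment attributes (the 'linked list')."""
    pixel_lists = defaultdict(list)            # pixel -> list of segment attrs
    for x, y, attr in segments:
        if 0 <= x < width and 0 <= y < height:
            pixel_lists[(x, y)].append(attr)   # link segment into pixel's list
    return pixel_lists

def filter_pixels(pixel_lists, predicate):
    """Image-space filtering: keep only segments whose attribute passes."""
    return {p: [a for a in attrs if predicate(a)]
            for p, attrs in pixel_lists.items()}

# Three toy segments carrying a scalar attribute (e.g. velocity magnitude):
segs = [(0, 0, 1.0), (0, 0, 3.5), (1, 2, 2.0)]
lists = bin_segments(segs, 4, 4)
fast = filter_pixels(lists, lambda v: v > 1.5)
print(fast[(0, 0)])   # [3.5]
```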

  14. Cosmological Parameter Estimation with Large Scale Structure Observations

    CERN Document Server

    Di Dio, Enea; Durrer, Ruth; Lesgourgues, Julien

    2014-01-01

    We estimate the sensitivity of future galaxy surveys to cosmological parameters, using the redshift dependent angular power spectra of galaxy number counts, $C_\ell(z_1,z_2)$, calculated with all relativistic corrections at first order in perturbation theory. We pay special attention to the redshift dependence of the non-linearity scale and present Fisher matrix forecasts for Euclid-like and DES-like galaxy surveys. We compare the standard $P(k)$ analysis with the new $C_\ell(z_1,z_2)$ method. We show that for surveys with photometric redshifts the new analysis performs significantly better than the $P(k)$ analysis. For spectroscopic redshifts, however, the large number of redshift bins which would be needed to fully profit from the redshift information is severely limited by shot noise. We also identify surveys which can measure the lensing contribution and we study the monopole, $C_0(z_1,z_2)$.
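
    As a toy illustration of the Fisher-matrix machinery such forecasts rely on, the sketch below propagates numerical derivatives of a deliberately simplified spectrum (a power law standing in for the relativistic angular spectra) through a Gaussian covariance. The model, `f_sky` and parameter values are illustrative assumptions, not the paper's setup:

```python
import numpy as np

def cl_model(A, n, ells):
    """Stand-in observable: C_l = A * l^(-n)."""
    return A * ells.astype(float) ** (-n)

def fisher(A, n, ells, f_sky=0.35, eps=1e-6):
    cl = cl_model(A, n, ells)
    # Gaussian covariance of a single power spectrum per multipole:
    var = 2.0 * cl**2 / ((2 * ells + 1) * f_sky)
    # Central finite differences with respect to the two parameters:
    dA = (cl_model(A + eps, n, ells) - cl_model(A - eps, n, ells)) / (2 * eps)
    dn = (cl_model(A, n + eps, ells) - cl_model(A, n - eps, ells)) / (2 * eps)
    derivs = [dA, dn]
    # F_ij = sum_l dC_l/dp_i * dC_l/dp_j / Var(C_l)
    return np.array([[np.sum(d1 * d2 / var) for d2 in derivs] for d1 in derivs])

ells = np.arange(2, 500)
F = fisher(1.0, 1.5, ells)
sigma = np.sqrt(np.diag(np.linalg.inv(F)))   # 1-sigma marginalized errors
print(sigma.shape)
```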

  15. Scaling future tropical cyclone damage with global mean temperature

    Science.gov (United States)

    Geiger, T.; Bresch, D.; Frieler, K.

    2017-12-01

    Tropical cyclones (TC) are one of the most damaging natural hazards and severely affect many countries around the globe each year. Their nominal impact is projected to increase substantially as the exposed coastal population grows, per capita income increases, and anthropogenic climate change manifests. The magnitude of this increase, however, varies across regions and is obscured by the stochastic behaviour of TCs, so far impeding a rigorous quantification of trends in TC damage with global mean temperature (GMT) rise. Here, we build on the large sample of spatially explicit TC simulations generated within ISIMIP(2b) for 1) pre-industrial conditions, 2) the historical period, and 3) future projections under RCP2.6 and RCP6.0 to estimate future TC damage assuming fixed present-day socio-economic conditions or SSP-based future projections of population patterns and income. Damage estimates will be based on region-specific empirical damage models derived from reported damages and accounting for regional characteristics of vulnerability. Different combinations of 1) socio-economic drivers with pre-industrial climate or 2) changing climate with fixed socio-economic conditions will be used to derive functional relationships between regionally aggregated changes in damages on one hand and global mean temperature and socio-economic predictors on the other hand. The obtained region-specific scaling of future TC damage with GMT provides valuable input for IPCC's special report on the impacts of global warming of 1.5°C by quantifying the incremental changes in impact with global warming. The approach allows for an update of damage functions used in integrated assessment models, and contributes to assessing the adequateness of climate mitigation and adaptation strategies.

  16. The founding charter of the Genomic Observatories Network.

    Science.gov (United States)

    Davies, Neil; Field, Dawn; Amaral-Zettler, Linda; Clark, Melody S; Deck, John; Drummond, Alexei; Faith, Daniel P; Geller, Jonathan; Gilbert, Jack; Glöckner, Frank Oliver; Hirsch, Penny R; Leong, Jo-Ann; Meyer, Chris; Obst, Matthias; Planes, Serge; Scholin, Chris; Vogler, Alfried P; Gates, Ruth D; Toonen, Rob; Berteaux-Lecellier, Véronique; Barbier, Michèle; Barker, Katherine; Bertilsson, Stefan; Bicak, Mesude; Bietz, Matthew J; Bobe, Jason; Bodrossy, Levente; Borja, Angel; Coddington, Jonathan; Fuhrman, Jed; Gerdts, Gunnar; Gillespie, Rosemary; Goodwin, Kelly; Hanson, Paul C; Hero, Jean-Marc; Hoekman, David; Jansson, Janet; Jeanthon, Christian; Kao, Rebecca; Klindworth, Anna; Knight, Rob; Kottmann, Renzo; Koo, Michelle S; Kotoulas, Georgios; Lowe, Andrew J; Marteinsson, Viggó Thór; Meyer, Folker; Morrison, Norman; Myrold, David D; Pafilis, Evangelos; Parker, Stephanie; Parnell, John Jacob; Polymenakou, Paraskevi N; Ratnasingham, Sujeevan; Roderick, George K; Rodriguez-Ezpeleta, Naiara; Schonrogge, Karsten; Simon, Nathalie; Valette-Silver, Nathalie J; Springer, Yuri P; Stone, Graham N; Stones-Havas, Steve; Sansone, Susanna-Assunta; Thibault, Kate M; Wecker, Patricia; Wichels, Antje; Wooley, John C; Yahara, Tetsukazu; Zingone, Adriana

    2014-03-07

    The co-authors of this paper hereby state their intention to work together to launch the Genomic Observatories Network (GOs Network) for which this document will serve as its Founding Charter. We define a Genomic Observatory as an ecosystem and/or site subject to long-term scientific research, including (but not limited to) the sustained study of genomic biodiversity from single-celled microbes to multicellular organisms. An international group of 64 scientists first published the call for a global network of Genomic Observatories in January 2012. The vision for such a network was expanded in a subsequent paper and developed over a series of meetings in Bremen (Germany), Shenzhen (China), Moorea (French Polynesia), Oxford (UK), Pacific Grove (California, USA), Washington (DC, USA), and London (UK). While this community-building process continues, here we express our mutual intent to establish the GOs Network formally, and to describe our shared vision for its future. The views expressed here are ours alone as individual scientists, and do not necessarily represent those of the institutions with which we are affiliated.

  17. Large Scale Beam-beam Simulations for the CERN LHC using Distributed Computing

    CERN Document Server

    Herr, Werner; McIntosh, E; Schmidt, F

    2006-01-01

    We report on a large scale simulation of beam-beam effects for the CERN Large Hadron Collider (LHC). The stability of particles which experience head-on and long-range beam-beam effects was investigated for different optical configurations and machine imperfections. To cover the interesting parameter space required computing resources not available at CERN. The necessary resources were available in the LHC@home project, based on the BOINC platform. At present, this project makes more than 60000 hosts available for distributed computing. We shall discuss our experience using this system during a simulation campaign of more than six months and describe the tools and procedures necessary to ensure consistent results. The results from this extended study are presented and future plans are discussed.

  18. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
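
    The distinction the abstract draws can be stated compactly. The following is a sketch of the two operators, with an effective large-scale coefficient written as a harmonic average of the small-scale motility; this harmonic-mean form is how homogenization of ecological diffusion is commonly presented, given here as an illustrative sketch rather than the paper's exact result:

```latex
% Ecological diffusion: the motility \mu(x) sits inside the Laplacian,
% so individuals respond to local habitat, not to gradients:
\partial_t u = \Delta\bigl(\mu(x)\,u\bigr)
% Fickian diffusion, by contrast, organizes movement along gradients:
\partial_t u = \nabla\cdot\bigl(\mu(x)\,\nabla u\bigr)
% Homogenization replaces the rapidly varying \mu(x) by an effective
% constant on the large (10-100 km) scale, a harmonic average over the
% small (10-100 m) scale:
\bar{\mu} = \left\langle \mu(x)^{-1} \right\rangle^{-1}
```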

  19. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  20. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  1. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5KW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs

  2. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose the latent category learning (LCL) in large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use the latent semantic analysis with semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy by evaluating each category's discrimination. Finally, we propose the online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve the detection precision which outperforms previous results by a large margin and can be competitive to the supervised deformable part model 5.0 baseline on both data sets.
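
    The latent-semantic-analysis step underlying LCL can be sketched with a truncated SVD: factor a (visual-word x image) count matrix so that each latent dimension is a candidate "latent category" (object, object part, or background such as sky), then score categories for discrimination. Data, rank and the variance-based score are illustrative assumptions, not the paper's exact pipeline:

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy bag-of-visual-words counts: 50 visual words x 30 images.
X = rng.poisson(1.0, size=(50, 30)).astype(float)

# Truncated SVD = latent semantic analysis of the count matrix.
U, s, Vt = np.linalg.svd(X, full_matrices=False)
k = 5                            # number of latent categories to keep
topics = U[:, :k] * s[:k]        # visual-word loadings per latent category
image_mix = Vt[:k, :]            # per-image mixture over latent categories

# A simple per-category discrimination score (variance across images),
# standing in for the paper's category-selection strategy:
score = image_mix.var(axis=1)
best = int(np.argmax(score))
print(best, topics.shape, image_mix.shape)
```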

  3. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  4. Scientific Observatories And The Long-term Fate Of The Data They Collect

    Science.gov (United States)

    Pirenne, B.

    2017-12-01

    Observatories are designed to take successive snapshots of their environment (e.g., deep underground, the ocean abyss, land, sky and all the way to the farthest reaches of the Universe). In the humanities and social sciences, observatories collect data on people and their activities. In doing so, they help study the environment's history and its (rate of) change in an attempt to understand it, and ultimately predict its future. The mechanism for capturing snapshots relies on sensors: they range from transcribed human experiences to sophisticated apparatus that require billions of dollars of equipment and decades of planning, construction and operation. Today sensors are ubiquitous and their product is data. The advent of the digital era has allowed the collection of more and more data points, and offered almost no limits to the scope of insights that can be obtained from them through computing. Though an individual data point's value is continually decreasing due to the explosion of the number of sensors and their ever increasing rate of sampling, data analytics assigns data points a role, and consequently a non-zero value. It is therefore imperative to keep them all: new knowledge stemming from new ideas and methods is consistently obtained from re-analysis of older data. Observatory operators have typically proposed that their organization would be in the best position to deal with the long-term archival of the data produced at the facility. This was justified by the need to have the right level of expertise near the new, unique and novel data. Funding agencies have given the argument a positive nod and supported facility-based data archives. Nowadays, however, funding agencies will only fund facilities for 3 to 5 years. Data managers, when asked about their plans for the post-funding future of their datasets, respond with a promise to deposit them in large national or supra-national repositories. Is this a good approach? Are there suitable alternatives?

  5. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we
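
    The Equilibrium Chemistry Approximation (ECA) mentioned above can be contrasted with classical Michaelis-Menten kinetics in a few lines. The single-enzyme, single-substrate ECA form below is a commonly stated version of the approximation; the parameter values are illustrative only, not from the ACME/ILAMB work:

```python
def michaelis_menten(k, e, s, K):
    """Classical flux with enzyme implicitly in excess: v = k*E*S/(K+S)."""
    return k * e * s / (K + s)

def eca_flux(k, e, s, K):
    """ECA flux: enzyme and substrate both limit, v = k*E*S/(K+E+S)."""
    return k * e * s / (K + e + s)

# When enzyme abundance is comparable to substrate, ECA damps the flux
# relative to Michaelis-Menten:
v_mm = michaelis_menten(1.0, 5.0, 10.0, 2.0)
v_eca = eca_flux(1.0, 5.0, 10.0, 2.0)
print(f"{v_mm:.3f} {v_eca:.3f}")
```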

  6. Beyond MOS and fibers: Optical Fourier-transform Imaging Unit for Cananea Observatory (OFIUCO)

    Science.gov (United States)

    Nieto-Suárez, M. A.; Rosales-Ortega, F. F.; Castillo, E.; García, P.; Escobedo, G.; Sánchez, S. F.; González, J.; Iglesias-Páramo, J.; Mollá, M.; Chávez, M.; Bertone, E.; et al.

    2017-11-01

    Many physical processes in astronomy are still hampered by the lack of spatial and spectral resolution, and also restricted by the field-of-view (FoV) of current 2D spectroscopy instruments available worldwide. For that reason, many of the ongoing or proposed studies are based on large-scale imaging and/or spectroscopic surveys. Under this philosophy, large-aperture telescopes are dedicated to the study of intrinsically faint and/or distant objects, covering small FoVs with high spatial resolution, while smaller telescopes are devoted to wide-field explorations. However, future astronomical surveys should be addressed by acquiring un-biased, spatially resolved, high-quality spectroscopic information over a wide FoV. Therefore, in order to improve the current instrumental offer at the Observatorio Astrofísico Guillermo Haro (OAGH) in Cananea, Mexico (INAOE), and to explore a possible instrument for the future Telescopio San Pedro Mártir (6.5m), we are currently integrating at INAOE an instrument prototype that will provide us with un-biased wide-field (few arcmin) spectroscopic information, with the flexibility of operating at different spectral resolutions (R 1-20000) and with a spatial resolution limited by seeing, and that can therefore be used in a wide range of astronomical problems. This instrument, called OFIUCO: Optical Fourier-transform Imaging Unit for Cananea Observatory, will make use of the Fourier-transform spectroscopic technique, which has been proved feasible in the optical wavelength range (350-1000 nm) with designs such as SITELLE (CFHT). We describe here the basic technical description of a Fourier transform spectrograph with important modifications from previous astronomical versions, as well as its technical advantages and weaknesses, and the science cases in which this instrument can be employed.

  7. Comparing Existing Pipeline Networks with the Potential Scale of Future U.S. CO2 Pipeline Networks

    Energy Technology Data Exchange (ETDEWEB)

    Dooley, James J.; Dahowski, Robert T.; Davidson, Casie L.

    2008-02-29

    There is growing interest regarding the potential size of a future U.S. dedicated CO2 pipeline infrastructure if carbon dioxide capture and storage (CCS) technologies are commercially deployed on a large scale. In trying to understand the potential scale of a future national CO2 pipeline network, comparisons are often made to the existing pipeline networks used to deliver natural gas and liquid hydrocarbons to markets within the U.S. This paper assesses the potential scale of the CO2 pipeline system needed under two hypothetical climate policies and compares this to the extant U.S. pipeline infrastructures used to deliver CO2 for enhanced oil recovery (EOR), and to move natural gas and liquid hydrocarbons from areas of production and importation to markets. The data presented here suggest that the need to increase the size of the existing dedicated CO2 pipeline system should not be seen as a significant obstacle for the commercial deployment of CCS technologies.

  8. Large scale hydrogeological modelling of a low-lying complex coastal aquifer system

    DEFF Research Database (Denmark)

    Meyer, Rena

    2018-01-01

    intrusion. In this thesis a new methodological approach was developed to combine 3D numerical groundwater modelling with a detailed geological description and hydrological, geochemical and geophysical data. It was applied to a regional scale saltwater intrusion in order to analyse and quantify...... the groundwater flow dynamics, identify the driving mechanisms that formed the saltwater intrusion to its present extent and to predict its progression in the future. The study area is located in the transboundary region between Southern Denmark and Northern Germany, adjacent to the Wadden Sea. Here, a large-scale...... parametrization schemes that accommodate hydrogeological heterogeneities. Subsequently, density-dependent flow and transport modelling of multiple salt sources was successfully applied to simulate the formation of the saltwater intrusion during the last 4200 years, accounting for historic changes in the hydraulic...

  9. An Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described in detail the visualization of large-scale communications architecture in the IOT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communications architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communications in the IOT.

  10. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  11. Revisiting the EC/CMB model for extragalactic large scale jets

    Science.gov (United States)

    Lucchini, M.; Tavecchio, F.; Ghisellini, G.

    2017-04-01

    One of the most outstanding results of the Chandra X-ray Observatory was the discovery that AGN jets are bright X-ray emitters on very large scales, up to hundreds of kpc. Of these, the powerful and beamed jets of flat-spectrum radio quasars are particularly interesting, as the X-ray emission cannot be explained by an extrapolation of the lower frequency synchrotron spectrum. Instead, the most common model invokes inverse Compton scattering of photons of the cosmic microwave background (EC/CMB) as the mechanism responsible for the high-energy emission. The EC/CMB model has recently come under criticism, particularly because it should predict a significant steady flux in the MeV-GeV band which has not been detected by the Fermi/LAT telescope for two of the best studied jets (PKS 0637-752 and 3C273). In this work, we revisit some aspects of the EC/CMB model and show that electron cooling plays an important part in shaping the spectrum. This can solve the overproduction of γ-rays by suppressing the high-energy end of the emitting particle population. Furthermore, we show that cooling in the EC/CMB model predicts a new class of extended jets that are bright in X-rays but silent in the radio and optical bands. These jets are more likely to lie at intermediate redshifts and would have been missed in all previous X-ray surveys due to selection effects.
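
    Part of why EC/CMB is attractive for large-scale jets is that the CMB energy density seen in the jet frame is boosted by roughly the square of the bulk Lorentz factor and grows as (1+z)^4. A quick order-of-magnitude check using the standard radiation-constant formula (the Lorentz factor and the z ≈ 0.65 value, roughly that of PKS 0637-752, are illustrative):

```python
a_rad = 7.566e-16          # radiation constant, J m^-3 K^-4
T_cmb = 2.725              # CMB temperature today, K

def u_cmb_jet_frame(gamma, z):
    """Approximate CMB energy density in a frame moving with bulk Lorentz
    factor gamma at redshift z (boost factor valid for gamma >> 1)."""
    u_local = a_rad * (T_cmb * (1 + z)) ** 4   # comoving-frame CMB density
    return gamma**2 * u_local                   # boosted into the jet frame

u0 = u_cmb_jet_frame(1, 0)         # unboosted local value, ~4.2e-14 J/m^3
u_jet = u_cmb_jet_frame(10, 0.65)  # hundreds of times larger in the jet frame
print(f"{u0:.2e} {u_jet:.2e}")
```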

  12. Modeling the Hydrologic Effects of Large-Scale Green Infrastructure Projects with GIS

    Science.gov (United States)

    Bado, R. A.; Fekete, B. M.; Khanbilvardi, R.

    2015-12-01

    Impervious surfaces in urban areas generate excess runoff, which in turn causes flooding, combined sewer overflows, and degradation of adjacent surface waters. Municipal environmental protection agencies have shown a growing interest in mitigating these effects with 'green' infrastructure practices that partially restore the perviousness and water holding capacity of urban centers. Assessment of the performance of current and future green infrastructure projects is hindered by the lack of adequate hydrological modeling tools; conventional techniques fail to account for the complex flow pathways of urban environments, and detailed analyses are difficult to prepare for the very large domains in which green infrastructure projects are implemented. Currently, no standard toolset exists that can rapidly and conveniently predict runoff, consequent inundations, and sewer overflows at a city-wide scale. We demonstrate how streamlined modeling techniques can be used with open-source GIS software to efficiently model runoff in large urban catchments. Hydraulic parameters and flow paths through city blocks, roadways, and sewer drains are automatically generated from GIS layers, and ultimately urban flow simulations can be executed for a variety of rainfall conditions. With this methodology, users can understand the implications of large-scale land use changes and green/gray storm water retention systems on hydraulic loading, peak flow rates, and runoff volumes.
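The kind of first-order screening such a toolset performs can be illustrated with the rational method, a standard peak-runoff formula (Q = C·i·A). The runoff coefficients and storm below are hypothetical, not values from the study:

```python
def rational_runoff(c, intensity_mm_hr, area_ha):
    """Peak runoff (m^3/s) from the rational method, Q = C * i * A.

    c               -- dimensionless runoff coefficient (1.0 = fully impervious)
    intensity_mm_hr -- rainfall intensity in mm/h
    area_ha         -- catchment area in hectares
    """
    i_m_per_s = intensity_mm_hr / 1000.0 / 3600.0   # mm/h -> m/s
    area_m2 = area_ha * 10_000.0                    # ha -> m^2
    return c * i_m_per_s * area_m2

# Hypothetical greening scenario: lowering C from 0.90 (mostly pavement) to
# 0.55 (mixed green/gray retention) for a 25 mm/h storm over a 12 ha block.
q_before = rational_runoff(0.90, 25.0, 12.0)
q_after = rational_runoff(0.55, 25.0, 12.0)
print(f"peak runoff: {q_before:.3f} m^3/s -> {q_after:.3f} m^3/s")
```

A GIS-based model refines this kind of lumped estimate by routing the runoff along flow paths derived from the city-block and sewer layers.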

  13. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different operating modes: pulse, fluctuation and current. The study described in this note comprises three parts: - A theoretical study of the large scale channel and a brief description of it are given, together with the results obtained so far in this domain. - The fluctuation mode is studied thoroughly and the necessary improvements are identified. The study of a linear fluctuation channel with automatic scale switching is described and the test results are given. In this large scale channel, the data processing is analog. - To avoid the problems created by analog processing of the fluctuation signal, a digital data processing method is tested and its validity is confirmed. The results obtained on a test system built according to this method are given and a preliminary plan for further research is outlined [fr

  14. Feasibility analysis of large length-scale thermocapillary flow experiment for the International Space Station

    Science.gov (United States)

    Alberts, Samantha J.

    The investigation of microgravity fluid dynamics emerged out of necessity with the advent of space exploration. In particular, capillary research took a leap forward in the 1960s with regard to liquid settling and interfacial dynamics. Due to inherent temperature variations in large spacecraft liquid systems, such as fuel tanks, forces develop on gas-liquid interfaces that induce thermocapillary flows. To date, thermocapillary flows have been studied in small, idealized research geometries, usually under terrestrial conditions. The 1 to 3 m lengths of current and future large tanks and hardware are chosen on the basis of hardware constraints rather than research, which leaves spaceflight systems designers without the technological tools to create safe and efficient designs. This thesis focused on the design and feasibility of a large length-scale thermocapillary flow experiment, which uses temperature variations to drive a flow. A helical channel geometry ranging from 1 to 2.5 m in length permits a large length-scale thermocapillary flow experiment to fit in a seemingly small International Space Station (ISS) facility such as the Fluids Integrated Rack (FIR). An initial investigation determined that the proposed experiment produces measurable data while adhering to the FIR facility limitations. The computational portion of this thesis focused on the investigation of functional geometries of fuel tanks and depots using Surface Evolver. This work outlines the design of a large length-scale thermocapillary flow experiment for the ISS FIR. The results improve the understanding of thermocapillary flows and thus the technological tools for predicting heat and mass transfer in large length-scale thermocapillary flows. Without the tools to understand thermocapillary flows in these systems, engineers are forced to design larger, heavier vehicles to assure safety and mission success.
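The strength of thermocapillary driving at a given length scale is commonly characterized by the Marangoni number, Ma = |dσ/dT|·ΔT·L/(μα), which grows linearly with the characteristic length L. The sketch below uses assumed, loosely silicone-oil-like property values (not taken from the thesis) to show why metre-scale channels probe a very different regime than bench-scale cells:

```python
def marangoni_number(dsigma_dT, delta_T, length_m, mu, alpha):
    """Ma = |dsigma/dT| * dT * L / (mu * alpha): dimensionless ratio of
    thermocapillary driving to viscous and thermal diffusion."""
    return abs(dsigma_dT) * delta_T * length_m / (mu * alpha)

# Assumed working-fluid properties (illustrative numbers, not from the thesis):
props = dict(dsigma_dT=-8e-5,  # N/(m*K), surface-tension temperature coefficient
             mu=8.2e-4,        # Pa*s, dynamic viscosity
             alpha=9e-8)       # m^2/s, thermal diffusivity
for length in (0.01, 1.0, 2.5):   # bench scale vs. the proposed 1-2.5 m channel
    ma = marangoni_number(delta_T=10.0, length_m=length, **props)
    print(f"L = {length:5.2f} m  ->  Ma ~ {ma:.2e}")
```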

  15. Large Scale Leach Test Facility: Development of equipment and methods, and comparison to MCC-1 leach tests

    International Nuclear Information System (INIS)

    Pellarin, D.J.; Bickford, D.F.

    1985-01-01

    This report describes the test equipment and methods, and documents the results of the first large-scale MCC-1 experiments in the Large Scale Leach Test Facility (LSLTF). Two experiments were performed using 1-ft-long samples sectioned from the middle of canister MS-11. The leachant used in the experiments was ultrapure deionized water - an aggressive and well-characterized leachant providing high sensitivity for liquid sample analyses. All the original test plan objectives were successfully met. Equipment and procedures have been developed for large-sample-size leach testing. The statistical reliability of the method has been determined, and ''benchmark'' data developed to relate small-scale leach testing to full-size waste forms. The facility is unique, and provides sampling reliability and flexibility not possible in smaller laboratory-scale tests. Future use of this facility should simplify and accelerate the development of leaching models and repository-specific data. The less-than-threefold difference in leachability, despite a 200,000:1 increase in sample volume, enhances the credibility of the small-scale test data that preceded this work, and supports the ability of the DWPF waste form to meet repository criteria

  16. Astronomical publications of Melbourne Observatory

    Science.gov (United States)

    Andropoulos, Jenny Ioanna

    2014-05-01

    During the second half of the 19th century and the first half of the 20th century, four well-equipped government observatories were maintained in Australia - in Melbourne, Sydney, Adelaide and Perth. These institutions conducted astronomical observations, often in the course of providing a local time service, and they also collected and collated meteorological data. As well, some of these observatories were involved at times in geodetic surveying, geomagnetic recording, gravity measurements, seismology, tide recording and physical standards, so the term "observatory" was being used in a rather broad sense! Despite the international renown that once applied to Williamstown and Melbourne Observatories, relatively little has been written by modern-day scholars about astronomical activities at these observatories. This research is intended to rectify this situation to some extent by gathering, cataloguing and analysing the published astronomical output of the two Observatories to see what contributions they made to science and society. It also compares their contributions with those of Sydney, Adelaide and Perth Observatories. Overall, Williamstown and Melbourne Observatories produced a prodigious amount of material on astronomy in scientific and technical journals, in reports and in newspapers. The other observatories more or less did likewise, so no observatory of those studied markedly outperformed the others in the long term, especially when account is taken of their relative resourcing in staff and equipment.

  17. Properties of a large NaI(Tl) spectrometer for the energy measurement of high-energy gamma rays on the Gamma Ray Observatory

    International Nuclear Information System (INIS)

    Hughes, E.B.; Finman, L.C.; Hofstadter, R.; Lepetich, J.E.; Lin, Y.C.; Mattox, J.R.; Nolan, P.L.; Parks, R.; Walker, A.H.

    1986-01-01

    A large NaI(Tl) spectrometer is expected to play a crucial role in the measurement of the energy spectra from an all-sky survey of high-energy celestial gamma rays on the Gamma Ray Observatory. The crystal size and the requirements of space flight have resulted in a novel crystal-packaging and optics combination. The structure of this spectrometer and the operating characteristics determined in a test program using high-energy positrons are described

  18. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This Coastal and Hydraulics Engineering Technical Note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to its measurement systems, including pump flow meters, sediment trap weigh tanks, and beach-profiling lidar. The purpose of these upgrades was to increase... A detailed discussion of the original LSTF features and capabilities can be...

  19. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied because of their application potential for human behavior prediction and recommendation and for the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, covering both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
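A minimal sketch of the exploration-and-preferential-return mechanism named in the abstract (in the spirit of Song et al.'s EPR model). Parameter values are illustrative only, and the per-individual Gaussian draw of the exploration tendency described in the paper is noted but not implemented here:

```python
import random
from collections import Counter

def epr_trajectory(steps, rho=0.6, gamma=0.21, rng=None):
    """Exploration-and-preferential-return walk over abstract location ids.

    With probability rho * S**(-gamma), where S is the number of distinct
    locations visited so far, the walker explores a brand-new location;
    otherwise it returns to a known location with probability proportional
    to its visit frequency.  (The paper additionally draws the exploration
    tendency from a Gaussian distribution across individuals.)
    """
    rng = rng or random.Random(0)
    visits = Counter({0: 1})              # location id -> visit count
    next_id = 1
    traj = [0]
    for _ in range(steps - 1):
        s = len(visits)
        if rng.random() < rho * s ** (-gamma):
            loc, next_id = next_id, next_id + 1        # explore a new place
        else:                                          # preferential return
            locs, weights = zip(*visits.items())
            loc = rng.choices(locs, weights=weights)[0]
        visits[loc] += 1
        traj.append(loc)
    return traj, visits

traj, visits = epr_trajectory(5000)
top_share = max(visits.values()) / len(traj)
print(f"{len(visits)} distinct locations in {len(traj)} steps; "
      f"most-visited site holds {top_share:.0%} of all visits")
```

The preferential-return branch is what produces the heavy concentration of visits on a few locations, and hence the high predictability the abstract reports.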

  20. Characterizing Temperature Variability and Associated Large Scale Meteorological Patterns Across South America

    Science.gov (United States)

    Detzer, J.; Loikith, P. C.; Mechoso, C. R.; Barkhordarian, A.; Lee, H.

    2017-12-01

    South America's climate varies considerably owing to its large geographic range and diverse topographical features. Spanning the tropics to the mid-latitudes and from high peaks to tropical rainforest, the continent experiences an array of climate and weather patterns. Due to this considerable spatial extent, assessing temperature variability at the continent scale is particularly challenging. It is well documented in the literature that temperatures have been increasing across portions of South America in recent decades, and while there have been many studies that have focused on precipitation variability and change, temperature has received less scientific attention. Therefore, a more thorough understanding of the drivers of temperature variability is critical for interpreting future change. First, k-means cluster analysis is used to identify four primary modes of temperature variability across the continent, stratified by season. Next, composites of large scale meteorological patterns (LSMPs) are calculated for months assigned to each cluster. Initial results suggest that LSMPs, defined using meteorological variables such as sea level pressure (SLP), geopotential height, and wind, are able to identify synoptic scale mechanisms important for driving temperature variability at the monthly scale. Some LSMPs indicate a relationship with known recurrent modes of climate variability. For example, composites of geopotential height suggest that the Southern Annular Mode is an important, but not necessarily dominant, component of temperature variability over southern South America. This work will be extended to assess the drivers of temperature extremes across South America.
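A minimal stand-in for the clustering step described above: k-means applied to monthly anomaly maps flattened to vectors. The data here are synthetic (two assumed spatial patterns plus noise); the study itself clusters observed temperature fields and stratifies by season:

```python
import numpy as np

def kmeans(X, k, iters=50, seed=0):
    """Minimal Lloyd's k-means: rows of X are samples (here, monthly
    temperature-anomaly maps flattened to vectors); returns labels and
    centroids."""
    rng = np.random.default_rng(seed)
    centroids = X[rng.choice(len(X), size=k, replace=False)].astype(float)
    for _ in range(iters):
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):      # keep old centroid if cluster empties
                centroids[j] = X[labels == j].mean(axis=0)
    return labels, centroids

# Toy input: 120 "months" of anomalies at 50 grid points, generated around
# two synthetic spatial patterns.
rng = np.random.default_rng(1)
patterns = rng.normal(size=(2, 50))
months = np.vstack([p + 0.3 * rng.normal(size=(60, 50)) for p in patterns])
labels, centroids = kmeans(months, k=2)
print("cluster sizes:", np.bincount(labels, minlength=2))
```

Each cluster's centroid plays the role of a "primary mode" of variability; compositing SLP, geopotential height, and wind over the months assigned to a cluster then yields the associated LSMP.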

  1. The Paris Observatory is 350 years old

    Science.gov (United States)

    Lequeux, James

    2017-01-01

    The Paris Observatory is the oldest astronomical observatory that has operated without interruption from its foundation to the present day. The building designed by Claude Perrault is still in existence with few modifications, though of course other buildings have been added over the centuries to house new instruments and laboratories. In particular, a large dome was built on the terrace in 1847, with a 38-cm diameter telescope completed in 1857: both are still visible. The main initial purpose of the Observatory was to determine longitudes. This was achieved by Jean-Dominique Cassini using the eclipses of the satellites of Jupiter: a much better map of France was then produced using this method, which unfortunately does not work at sea. Incidentally, the observation of these eclipses led to the discovery in 1676 of the finite velocity of light by Cassini and Rømer. Cassini also discovered the differential rotation of Jupiter and four satellites of Saturn. Then geodesy was the main activity of the Observatory for more than a century, culminating in the famous Cassini map of France completed around 1790. During the first half of the 19th century, under François Arago, the Observatory was at the centre of French physics, which then developed very rapidly. Arago initiated astrophysics in 1810 by showing that the Sun and stars are made of incandescent gas. In 1854, the new director, Urbain Le Verrier, put emphasis on astrometry and celestial mechanics, discovering in particular the anomalous advance of the perihelion of Mercury, which was later to be a proof of General Relativity. In 1858, Léon Foucault built the first modern reflecting telescopes with their silvered glass mirrors. Le Verrier also created modern meteorology, including some primitive forecasts. The following period was not so bright, owing to the enormous project of the Carte du Ciel, which absorbed much of the Observatory's forces for half a century with little scientific return. In

  2. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  3. Status and Future Developments in Large Accelerator Control Systems

    International Nuclear Information System (INIS)

    Karen S. White

    2006-01-01

    Over the years, accelerator control systems have evolved from small hardwired systems to complex computer controlled systems with many types of graphical user interfaces and electronic data processing. Today's control systems often include multiple software layers, hundreds of distributed processors, and hundreds of thousands of lines of code. While it is clear that the next generation of accelerators will require much bigger control systems, they will also need better systems. Advances in technology will be needed to ensure the network bandwidth and CPU power can provide reasonable update rates and support the requisite timing systems. Beyond the scaling problem, next generation systems face additional challenges due to growing cyber security threats and the likelihood that some degree of remote development and operation will be required. With a large number of components, the need for high reliability increases and commercial solutions can play a key role towards this goal. Future control systems will operate more complex machines and need to present a well integrated, interoperable set of tools with a high degree of automation. Consistency of data presentation and exception handling will contribute to efficient operations. From the development perspective, engineers will need to provide integrated data management in the beginning of the project and build adaptive software components around a central data repository. This will make the system maintainable and ensure consistency throughout the inevitable changes during the machine lifetime. Additionally, such a large project will require professional project management and disciplined use of well-defined engineering processes. Distributed project teams will make the use of standards, formal requirements and design and configuration control vital. Success in building the control system of the future may hinge on how well we integrate commercial components and learn from best practices used in other industries

  4. The Atsa Suborbital Observatory: An Observatory for a Commercial Suborbital Spacecraft

    Science.gov (United States)

    Vilas, F.; Sollitt, L. S.

    2012-12-01

    The advantages of astronomical observations made above Earth's atmosphere have long been understood: free access to spectral regions inaccessible from Earth (e.g., UV) or affected by the atmosphere's content (e.g., IR). Most robotic, space-based telescopes maintain a large angular separation between the Sun and an observational target in order to avoid accidental damage to instruments from the Sun. For most astronomical targets, this is easily achieved by waiting until objects are visible away from the Sun. For Solar System objects inside Earth's orbit, this is never the case. Suborbital astronomical observations have over 50 years' history using NASA's sounding rockets and experimental space planes. Commercial suborbital spacecraft are largely expected to go to ~100 km altitude above Earth, providing a limited amount of time for astronomical observations. The unique scientific advantage of these observations is the ability to point close to the Sun: if a suborbital spacecraft accidentally turns too close to the Sun and fries an instrument, it is easy to land the spacecraft and repair the hardware for the next flight. Objects uniquely observable during the short observing window include inner-Earth asteroids, Mercury, Venus, and Sun-grazing comets. Both open-FOV and target-specific observations are possible. Despite many space probes to the inner Solar System, scientific questions remain. These include inner-Earth asteroid sizes and bulk densities, which inform Solar System evolution studies and efforts to develop mitigation methods against imminent Earth impactors, and the chemistry and dynamics of Venus' atmosphere, which address physical phenomena such as the greenhouse effect, atmospheric super-rotation, and global resurfacing on Venus. With the Atsa Suborbital Observatory, we combine the strengths of both ground-based observatories and space-based observing to create a facility where a telescope is maintained and used interchangeably with both in-house facility

  5. The Rapid Ice Sheet Change Observatory (RISCO)

    Science.gov (United States)

    Morin, P.; Howat, I. M.; Ahn, Y.; Porter, C.; McFadden, E. M.

    2010-12-01

    The recent expansion of observational capacity from space has revealed dramatic, rapid changes in the Earth’s ice cover. These discoveries have fundamentally altered how scientists view ice-sheet change. Instead of just slow changes in snow accumulation and melting over centuries or millennia, important changes can occur in sudden events lasting only months, weeks, or even a single day. Our understanding of these short time- and space-scale processes, which hold important implications for future global sea level rise, has been impeded by the low temporal and spatial resolution, delayed sensor tasking, incomplete coverage, inaccessibility and/or high cost of data available to investigators. New cross-agency partnerships and data access policies provide the opportunity to dramatically improve the resolution of ice sheet observations by an order of magnitude, from timescales of months and distances of 10’s of meters, to days and meters or less. Advances in image processing technology also enable application of currently under-utilized datasets. The infrastructure for systematically gathering, processing, analyzing and distributing these data does not currently exist. Here we present the development of a multi-institutional, multi-platform observatory for rapid ice change with the ultimate objective of helping to elucidate the relevant timescales and processes of ice sheet dynamics and response to climate change. The Rapid Ice Sheet Observatory (RISCO) gathers observations of short time- and space-scale Cryosphere events and makes them easily accessible to investigators, media and general public. As opposed to existing data centers, which are structured to archive and distribute diverse types of raw data to end users with the specialized software and skills to analyze them, RISCO focuses on three types of geo-referenced raster (image) data products in a format immediately viewable with commonly available software. These three products are (1) sequences of images

  6. RadioAstron and millimetron space observatories: Multiverse models and the search for life

    Science.gov (United States)

    Kardashev, N. S.

    2017-04-01

    The transition from the radio to the millimeter and submillimeter ranges is very promising for studies of galactic nuclei, as well as detailed studies of processes related to supermassive black holes, wormholes, and possible manifestations of multi-element Universe (Multiverse) models. This is shown by observations with the largest interferometer available, the RadioAstron observatory, and will inform the scientific program of the Millimetron observatory. Observations have also shown the promise of this range for studies of the formation and evolution of planetary systems and for searches for manifestations of intelligent life, since large-scale technological activities require the use of large amounts of condensed matter and energy. This range can also be used efficiently in the organisation of optimal channels for the transmission of information.

  7. Dynamic model of frequency control in Danish power system with large scale integration of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Hansen, Anca Daniela; Sørensen, Poul Ejnar

    2013-01-01

    This work evaluates the impact of large-scale integration of wind power in future power systems in which 50% of the load demand can be met from wind power. The focus is on active power balance control, where the main source of power imbalance is an inaccurate wind speed forecast. In this study, a Danish power system model with a large share of wind power is developed and a case study for an inaccurate wind power forecast is investigated. The goal of this work is to develop an adequate power system model that depicts relevant dynamic features of the power plants and compensates for load-generation imbalances, caused by inaccurate wind speed forecasts, through appropriate control of the active power production from power plants.

  8. A large-scale computer facility for computational aerodynamics

    International Nuclear Information System (INIS)

    Bailey, F.R.; Balhaus, W.F.

    1985-01-01

    The combination of computer system technology and numerical modeling have advanced to the point that computational aerodynamics has emerged as an essential element in aerospace vehicle design methodology. To provide for further advances in modeling of aerodynamic flow fields, NASA has initiated at the Ames Research Center the Numerical Aerodynamic Simulation (NAS) Program. The objective of the Program is to develop a leading-edge, large-scale computer facility, and make it available to NASA, DoD, other Government agencies, industry and universities as a necessary element in ensuring continuing leadership in computational aerodynamics and related disciplines. The Program will establish an initial operational capability in 1986 and systematically enhance that capability by incorporating evolving improvements in state-of-the-art computer system technologies as required to maintain a leadership role. This paper briefly reviews the present and future requirements for computational aerodynamics and discusses the Numerical Aerodynamic Simulation Program objectives, computational goals, and implementation plans

  9. Surface-subsurface flow modeling: an example of large-scale research at the new NEON user facility

    Science.gov (United States)

    Powell, H.; McKnight, D. M.

    2009-12-01

    Climate change is predicted to alter surface-subsurface interactions in freshwater ecosystems. These interactions are hypothesized to control nutrient release at diel and seasonal time scales, which may then exert control over epilithic algal growth rates. The mechanisms underlying shifts in complex physical-chemical-biological patterns can be elucidated by long-term observations at sites that span hydrologic and climate gradients across the continent. Development of the National Ecological Observatory Network (NEON) will provide researchers the opportunity to investigate continental-scale patterns by combining investigator-driven measurements with Observatory data. NEON is a national-scale research platform for analyzing and understanding the impacts of climate change, land-use change, and invasive species on ecology. NEON features sensor networks and experiments, linked by advanced cyberinfrastructure to record and archive ecological data for at least 30 years. NEON partitions the United States into 20 ecoclimatic domains. Each domain hosts one fully instrumented Core Aquatic site in a wildland area and one Relocatable site, which aims to capture ecologically significant gradients (e.g. land use, nitrogen deposition, urbanization). In the current definition of NEON there are 36 Aquatic sites: 30 streams/rivers and 6 ponds/lakes. Each site includes automated, in-situ sensors for groundwater elevation and temperature; stream flow (discharge and stage); pond water elevation; atmospheric conditions (air temperature, barometric pressure, PAR, radiation); and surface water chemistry (DO, water temperature, conductivity, pH, turbidity, cDOM, nutrients). Groundwater and surface water sites shall be regularly sampled for selected chemical and isotopic parameters.
The hydrologic and geochemical monitoring design provides basic information on water and chemical fluxes in streams and ponds and between groundwater and surface water, which is intended to support investigator-driven modeling studies

  10. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein-coding sequences require a number of inter-related preparatory steps, from finding gene families to generating alignments, reconstructing phylogenetic trees, and assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein-coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  11. Process optimization of large-scale production of recombinant adeno-associated vectors using dielectric spectroscopy.

    Science.gov (United States)

    Negrete, Alejandro; Esteban, Geoffrey; Kotin, Robert M

    2007-09-01

    A well-characterized manufacturing process for the large-scale production of recombinant adeno-associated vectors (rAAV) for gene therapy applications is required to meet current and future demands for pre-clinical and clinical studies and potential commercialization. Economic considerations argue in favor of suspension-culture-based production. Currently, the only feasible method for large-scale rAAV production utilizes baculovirus expression vectors and insect cells in suspension cultures. To maximize yields and achieve reproducibility between batches, online monitoring of various metabolic and physical parameters is useful for characterizing the early stages of baculovirus-infected insect cells. In this study, rAAVs were produced at 40-l scale, yielding ~1 x 10^15 particles. During the process, dielectric spectroscopy was performed by real-time scanning at radio frequencies between 300 kHz and 10 MHz, and the corresponding permittivity values were correlated with rAAV production. Both infected and uninfected cultures reached a maximum permittivity value; however, only the permittivity profile of infected cell cultures reached a second maximum. This second maximum correlated with the optimal harvest time for rAAV production. Analysis of rAAV indicated that harvesting around 48 h post-infection (hpi) and at 72 hpi produced similar quantities of biologically active rAAV. Thus, if operated continuously, the 24-h reduction in the rAAV production process gives sufficient time for an additional 18 runs a year, corresponding to an extra production of ~2 x 10^16 particles. As part of large-scale optimization studies, this new finding will facilitate the bioprocessing scale-up of rAAV and other bioproducts.
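The harvest-time criterion, detecting the second maximum in the permittivity trace, can be sketched as follows. Both the synthetic trace and the peak finder are illustrative stand-ins; the study used a commercial dielectric spectroscopy probe:

```python
def local_maxima(series, min_separation=5):
    """Indices of local peaks in a 1-D series, at least `min_separation`
    samples apart (a crude stand-in for the spectroscopy software)."""
    peaks = []
    for i in range(1, len(series) - 1):
        if series[i - 1] < series[i] >= series[i + 1]:
            if not peaks or i - peaks[-1] >= min_separation:
                peaks.append(i)
    return peaks

# Synthetic hourly permittivity trace for an infected culture (illustrative
# shape only): growth to a first maximum, a dip, then a second maximum
# near ~48 hpi.
trace = [10 + 0.5 * t for t in range(24)]                  # growth phase
trace += [22 - 0.3 * (t - 24) for t in range(24, 40)]      # post-infection dip
trace += [17.2 + 0.4 * (t - 40) for t in range(40, 49)]    # second rise
trace += [20.4 - 0.2 * (t - 48) for t in range(49, 72)]    # decline
peaks = local_maxima(trace)
print("permittivity maxima at hours:", peaks, "-> harvest near hour", peaks[-1])
```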

  12. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    One of the best ways for agriculture to become independent of precipitation shortages is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province. The average size of a sprinkler system reached 95 ha. In 1989 there were 98 sprinkler systems, covering an area of more than 10,130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares in 1986-1998. After the introduction of the market economy in the early 1990s and ownership changes in agriculture, the large-scale sprinkler systems underwent significant or total devastation. Land of the State Farms was leased or sold by the State Agricultural Property Agency, and the new owners used the existing sprinklers to a very small extent. This was accompanied by changes in crop structure and demand structure and an increase in operating costs, including a threefold increase in electricity prices. In practice, the operation of large-scale irrigation encountered all kinds of barriers, limitations of system solutions, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinklers. A field survey of the local area showed the current status of the remaining irrigation infrastructure. The scheme adopted for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in sprinkler systems.

  13. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99⋅1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords. Sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  14. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the safety of the public and property in the event of accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rate, for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills, 21 and 81 m in diameter, were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
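
    The burner tests correlate the flame height to fire diameter ratio with a nondimensional heat release rate. The report's exact formulation is not reproduced here; the sketch below uses the definition common in the fire-dynamics literature, Q* = Qdot / (rho_a · cp · T_a · sqrt(g) · D^2.5), with illustrative ambient values and a hypothetical heat release and diameter (not Sandia results).

```python
# Hedged sketch: nondimensional heat release rate Q* for a pool fire,
# per the common fire-dynamics definition. All numeric inputs below
# are illustrative, not data from the Phoenix test series.
import math

def q_star(qdot_w, diameter_m, rho=1.2, cp=1005.0, t_amb=293.0, g=9.81):
    """Q* for total heat release qdot_w (W) and pool diameter (m),
    with ambient density (kg/m^3), specific heat (J/kg/K) and
    temperature (K)."""
    return qdot_w / (rho * cp * t_amb * math.sqrt(g) * diameter_m ** 2.5)

# Hypothetical large LNG fire: 2 GW over an 81 m diameter pool.
print(q_star(2.0e9, 81.0))  # well below 1, as expected for pool fires
```

Large pool fires sit at low Q*, which is why correlations fitted to small burner data must be checked before extrapolating to spills tens of meters across.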

  15. The Pierre Auger Observatory status and the AugerPrime upgrade program

    Directory of Open Access Journals (Sweden)

    Martello Daniele

    2017-01-01

    Full Text Available The nature and the origin of ultra-high energy cosmic rays (UHECRs), above 10^17 eV, are still unknown. The Pierre Auger Observatory with its huge exposure provides us with a large set of high-quality data. The analysis of these data has led to major breakthroughs in the last decade, but a coherent interpretation is still missing. To answer the open questions the Observatory has started a major upgrade, with an emphasis on improved mass composition determination using the surface detectors. The latest results and the planned detector upgrade will be presented. The expected performance and the improved physics sensitivity of the Observatory will be discussed.

  16. Application of Large-Scale Database-Based Online Modeling to Plant State Long-Term Estimation

    Science.gov (United States)

    Ogawa, Masatoshi; Ogai, Harutoshi

    Recently, attention has been drawn to local modeling techniques based on a new idea called “Just-In-Time (JIT) modeling”. To apply JIT modeling online to a large database, “Large-scale database-based Online Modeling (LOM)” has been proposed. LOM is a technique that makes the retrieval of neighboring data more efficient by using both “stepwise selection” and quantization. In order to predict the long-term state of the plant without using future data of the manipulated variables, an Extended Sequential Prediction method of LOM (ESP-LOM) has been proposed. In this paper, LOM and ESP-LOM are introduced.
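
    The core idea behind JIT (local) modeling can be sketched briefly: for each query, retrieve the k nearest stored samples and fit a small model to them only. The function names below are illustrative; LOM's stepwise selection and quantization, which accelerate the neighbor search over a large database, are omitted.

```python
# Minimal sketch of just-in-time local modeling over a stored database
# of (x, y) pairs: nearest-neighbor retrieval plus a local
# least-squares line fitted per query. Hypothetical data.

def jit_predict(database, query, k=3):
    """Predict y at `query` from the k nearest (x, y) pairs via a
    local least-squares line; falls back to the neighbor mean when
    the fit is degenerate."""
    neighbors = sorted(database, key=lambda p: abs(p[0] - query))[:k]
    n = len(neighbors)
    sx = sum(x for x, _ in neighbors)
    sy = sum(y for _, y in neighbors)
    sxx = sum(x * x for x, _ in neighbors)
    sxy = sum(x * y for x, y in neighbors)
    denom = n * sxx - sx * sx
    if denom == 0:
        return sy / n
    slope = (n * sxy - sx * sy) / denom
    intercept = (sy - slope * sx) / n
    return slope * query + intercept

data = [(0.0, 0.0), (1.0, 2.0), (2.0, 4.0), (3.0, 6.0), (10.0, 5.0)]
print(jit_predict(data, 1.5))  # local fit on nearest points -> 3.0
```

Note how the distant outlier (10.0, 5.0) never enters the prediction: only the retrieved neighborhood shapes the local model, which is the point of the JIT approach.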

  17. Citizen Observatories and the New Earth Observation Science

    Directory of Open Access Journals (Sweden)

    Alan Grainger

    2017-02-01

    Full Text Available Earth observation is diversifying, and now includes new types of systems, such as citizen observatories, unmanned aerial vehicles and wireless sensor networks. However, the Copernicus Programme vision of a seamless chain from satellite data to usable information in the hands of decision makers is still largely unrealized, and remote sensing science lacks a conceptual framework to explain why. This paper reviews the literatures on citizen science, citizen observatories and conceptualization of remote sensing systems. It then proposes a Conceptual Framework for Earth Observation which can be used in a new Earth observation science to explain blockages in the chain from collecting data to disseminating information in any Earth observation system, including remote sensing systems. The framework differs from its predecessors by including social variables as well as technological and natural ones. It is used here, with evidence from successful citizen science projects, to compare the factors that are likely to influence the effectiveness of satellite remote sensing systems and citizen observatories. The paper finds that constraints on achieving the seamless “Copernicus Chain” are not solely technical, as assumed in the new Space Strategy for Europe, but include social constraints too. Achieving the Copernicus Chain will depend on the balance between: (a) the ‘forward’ momentum generated by the repetitive functioning of each component in the system, as a result of automatic operation or human institutions, and by the efficiency of interfaces between components; and (b) the ‘backward’ flow of information on the information needs of end users. Citizen observatories will face challenges in components which for satellite remote sensing systems are: (a) automatic or straightforward, e.g., sensor design and launch, data collection, and data products; and (b) also challenging, e.g., data processing. Since citizen observatories will rely even more on

  18. The prospects for large-scale import of biomass and biofuels to Sweden - A review of critical issues

    International Nuclear Information System (INIS)

    Hansson, Julia; Berndes, Goeran; Boerjesson, Paal

    2006-01-01

    Sweden is one of the biggest consumers of both domestic and imported biofuels in the EU. This paper evaluates the prospects for an increased and large-scale import of biofuels to Sweden in the future. The parameters included are prospective Swedish and global biofuel supply and demand; the cost, energy input and environmental impact of long-distance biofuel transport; and the capacity of global freight and of Swedish ports to handle increased biofuel flows. The Swedish bioenergy potential seems large enough to accommodate a substantial increase in the domestic use of biofuels. However, an extensive import of biofuel feedstock would be needed for a prospective Swedish biofuel industry to be able to export substantial volumes of biofuels. The costs, including transport, of biofuels imported from regions where the assessed potential supply of biomass is higher than the estimated future regional demand are estimated to be equivalent to or lower than the current costs of domestic biofuels. However, the price depends on future competition for biofuels as well as on freight and port capacity. The current specialization at Swedish ports may in the short term be an obstacle to a rapid increase in biofuel imports. The energy input in long-distance biofuel transport is estimated to be low. However, to make large-scale biofuel trade flows acceptable, special attention needs to be paid, e.g., to the impact on biodiversity and socioeconomic conditions in the exporting countries.

  19. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
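
    The multi-criteria evaluation such a siting tool performs can be sketched simply: each candidate cell carries normalized criterion scores, and user-supplied weights express stakeholder priorities. The criteria names, weights and scores below are hypothetical, not taken from the NREL tool.

```python
# Hedged sketch of weighted multi-criteria site ranking of the kind a
# GIS siting-assistance tool might perform. All inputs are invented
# for illustration.

def rank_sites(sites, weights):
    """Return site names ordered by descending weighted score.
    `sites` maps name -> {criterion: score in [0, 1]};
    `weights` maps criterion -> user-chosen weight."""
    def score(criteria):
        return sum(weights[c] * criteria[c] for c in weights)
    return sorted(sites, key=lambda s: score(sites[s]), reverse=True)

sites = {
    "site_A": {"solar_resource": 0.9, "grid_access": 0.4, "env_impact": 0.8},
    "site_B": {"solar_resource": 0.7, "grid_access": 0.9, "env_impact": 0.6},
}
weights = {"solar_resource": 0.5, "grid_access": 0.3, "env_impact": 0.2}
print(rank_sites(sites, weights))  # site_B edges out site_A
```

Because the weights are user-driven, different stakeholders can see how their priorities reorder the same candidate sites, which is the transparency the report argues existing static guidelines lack.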

  20. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries -Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  1. Evaluating cloud processes in large-scale models: Of idealized case studies, parameterization testbeds and single-column modelling on climate time-scales

    Science.gov (United States)

    Neggers, Roel

    2016-04-01

    Boundary-layer schemes have always formed an integral part of General Circulation Models (GCMs) used for numerical weather and climate prediction. The spatial and temporal scales associated with boundary-layer processes and clouds are typically much smaller than those at which GCMs are discretized, which makes their representation through parameterization a necessity. The need for generally applicable boundary-layer parameterizations has motivated many scientific studies, which in effect has created its own active research field in the atmospheric sciences. Of particular interest has been the evaluation of boundary-layer schemes at "process level". This means that parameterized physics are studied in isolation from the larger-scale circulation, using prescribed forcings and excluding any upscale interaction. Although feedbacks are thus prevented, the benefit is an enhanced model transparency, which might aid an investigator in identifying model errors and understanding model behavior. The popularity and success of the process-level approach is demonstrated by the many past and ongoing model inter-comparison studies that have been organized by initiatives such as GCSS/GASS. A common thread in the results of these studies is that although most schemes somehow manage to capture first-order aspects of boundary-layer cloud fields, there certainly remains room for improvement in many areas. Only too often are boundary-layer parameterizations still found to be at the heart of problems in large-scale models, negatively affecting the forecast skill of NWP models or causing uncertainty in numerical predictions of future climate. How to break this parameterization "deadlock" remains an open problem. This presentation attempts to give an overview of the various existing methods for the process-level evaluation of boundary-layer physics in large-scale models. This includes i) idealized case studies, ii) longer-term evaluation at permanent meteorological sites (the testbed approach

  2. Partially acoustic dark matter, interacting dark radiation, and large scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Chacko, Zackaria [Maryland Center for Fundamental Physics, Department of Physics, University of Maryland, Stadium Dr., College Park, MD 20742 (United States); Cui, Yanou [Maryland Center for Fundamental Physics, Department of Physics, University of Maryland, Stadium Dr., College Park, MD 20742 (United States); Department of Physics and Astronomy, University of California-Riverside, University Ave, Riverside, CA 92521 (United States); Perimeter Institute, 31 Caroline Street North, Waterloo, Ontario N2L 2Y5 (Canada); Hong, Sungwoo [Maryland Center for Fundamental Physics, Department of Physics, University of Maryland, Stadium Dr., College Park, MD 20742 (United States); Okui, Takemichi [Department of Physics, Florida State University, College Avenue, Tallahassee, FL 32306 (United States); Tsai, Yuhsinz [Maryland Center for Fundamental Physics, Department of Physics, University of Maryland, Stadium Dr., College Park, MD 20742 (United States)

    2016-12-21

    The standard paradigm of collisionless cold dark matter is in tension with measurements on large scales. In particular, the best fit values of the Hubble rate H₀ and the matter density perturbation σ₈ inferred from the cosmic microwave background seem inconsistent with the results from direct measurements. We show that both problems can be solved in a framework in which dark matter consists of two distinct components, a dominant component and a subdominant component. The primary component is cold and collisionless. The secondary component is also cold, but interacts strongly with dark radiation, which itself forms a tightly coupled fluid. The growth of density perturbations in the subdominant component is inhibited by dark acoustic oscillations due to its coupling to the dark radiation, solving the σ₈ problem, while the presence of tightly coupled dark radiation ameliorates the H₀ problem. The subdominant component of dark matter and dark radiation continue to remain in thermal equilibrium until late times, inhibiting the formation of a dark disk. We present an example of a simple model that naturally realizes this scenario in which both constituents of dark matter are thermal WIMPs. Our scenario can be tested by future stage-IV experiments designed to probe the CMB and large scale structure.

  3. Partially acoustic dark matter, interacting dark radiation, and large scale structure

    International Nuclear Information System (INIS)

    Chacko, Zackaria; Cui, Yanou; Hong, Sungwoo; Okui, Takemichi; Tsai, Yuhsinz

    2016-01-01

    The standard paradigm of collisionless cold dark matter is in tension with measurements on large scales. In particular, the best fit values of the Hubble rate H₀ and the matter density perturbation σ₈ inferred from the cosmic microwave background seem inconsistent with the results from direct measurements. We show that both problems can be solved in a framework in which dark matter consists of two distinct components, a dominant component and a subdominant component. The primary component is cold and collisionless. The secondary component is also cold, but interacts strongly with dark radiation, which itself forms a tightly coupled fluid. The growth of density perturbations in the subdominant component is inhibited by dark acoustic oscillations due to its coupling to the dark radiation, solving the σ₈ problem, while the presence of tightly coupled dark radiation ameliorates the H₀ problem. The subdominant component of dark matter and dark radiation continue to remain in thermal equilibrium until late times, inhibiting the formation of a dark disk. We present an example of a simple model that naturally realizes this scenario in which both constituents of dark matter are thermal WIMPs. Our scenario can be tested by future stage-IV experiments designed to probe the CMB and large scale structure.

  4. SIRTA, a ground-based atmospheric observatory for cloud and aerosol research

    Directory of Open Access Journals (Sweden)

    M. Haeffelin

    2005-02-01

    Full Text Available Ground-based remote sensing observatories have a crucial role to play in providing data to improve our understanding of atmospheric processes, to test the performance of atmospheric models, and to develop new methods for future space-borne observations. Institut Pierre Simon Laplace, a French research institute in environmental sciences, created the Site Instrumental de Recherche par Télédétection Atmosphérique (SIRTA), an atmospheric observatory with these goals in mind. Today SIRTA, located 20 km south of Paris, operates a suite of state-of-the-art active and passive remote sensing instruments dedicated to routine monitoring of cloud and aerosol properties and key atmospheric parameters. A detailed description of the state of the atmospheric column is progressively archived and made accessible to the scientific community. This paper describes the SIRTA infrastructure and database, and provides an overview of the scientific research associated with the observatory. Researchers using SIRTA data conduct research on atmospheric processes involving complex interactions between clouds, aerosols and radiative and dynamic processes in the atmospheric column. Atmospheric modellers working with SIRTA observations develop new methods to test their models and innovative analyses to improve parametric representations of sub-grid processes that must be accounted for in the model. SIRTA provides the means to develop data interpretation tools for future active remote sensing missions in space (e.g. CloudSat and CALIPSO). SIRTA observation and research activities take place in networks of atmospheric observatories that allow scientists to access consistent data sets from diverse regions on the globe.

  5. Astronomical virtual observatory and the place and role of Bulgarian one

    Science.gov (United States)

    Petrov, Georgi; Dechev, Momchil; Slavcheva-Mihova, Luba; Duchlev, Peter; Mihov, Bojko; Kochev, Valentin; Bachev, Rumen

    2009-07-01

    Virtual observatory could be defined as a collection of integrated astronomical data archives and software tools that utilize computer networks to create an environment in which research can be conducted. Several countries have initiated national virtual observatory programs that combine existing databases from ground-based and orbiting observatories (scientific facilities especially equipped to detect and record naturally occurring phenomena). As a result, data from all the world's major observatories will be available to all users and to the public. This is significant not only because of the immense volume of astronomical data but also because the data on stars and galaxies have been compiled from observations in a variety of wavelengths: optical, radio, infrared, gamma-ray, X-ray and more. In a virtual observatory environment, all of these data are integrated so that they can be synthesized and used in a given study. In the autumn of 2001 (26.09.2001), six organizations from Europe initiated the establishment of the Astronomical Virtual Observatory (AVO): ESO, ESA, Astrogrid, CDS, CNRS and Jodrell Bank (Dolensky et al., 2003). Its aims have been outlined as follows: - To provide comparative analysis of large sets of multiwavelength data; - To reuse data collected by a single source; - To provide uniform access to data; - To make data available to less-advantaged communities; - To be an educational tool. The Virtual observatory includes: - Tools that make it easy to locate and retrieve data from catalogues, archives, and databases worldwide; - Tools for data analysis, simulation, and visualization; - Tools to compare observations with results obtained from models, simulations and theory; - Interoperability: services that can be used regardless of the client's computing platform, operating system and software capabilities; - Access to data in near real-time, archived data and historical data; - Additional information - documentation, user-guides, reports

  6. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  7. EMSO: European multidisciplinary seafloor observatory

    Science.gov (United States)

    Favali, Paolo; Beranzoli, Laura

    2009-04-01

    EMSO has been identified by the ESFRI Report 2006 as one of the Research Infrastructures that European member and associated states are asked to develop in the coming decades. It will be based on a European-scale network of multidisciplinary seafloor observatories from the Arctic to the Black Sea, with the aim of long-term real-time monitoring of processes related to geosphere/biosphere/hydrosphere interactions. EMSO will enhance our understanding of these processes, providing long time-series data for the different phenomenon scales that constitute the new frontier for the study of the Earth's interior, deep-sea biology and chemistry, and ocean processes. The development of an underwater network builds on past EU projects and is supported by several EU initiatives, such as the ongoing ESONET-NoE, aimed at strengthening the ocean observatories' scientific and technological community. The EMSO development relies on the synergy between the scientific community and industry to improve European competitiveness with respect to countries such as the USA, Canada and Japan. Within the FP7 Programme launched in 2006, a call for a Preparatory Phase (PP) was issued in order to support the foundation of the legal and organisational entity in charge of building up and managing the infrastructure, and of coordinating the financial effort among the countries. The EMSO-PP project, coordinated by the Italian INGV with the participation of 11 institutions from as many European countries, started in April 2008 and will last four years.

  8. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.

    2013-01-01

    Soil moisture is an essential climate variable influencing land–atmosphere interactions, an essential hydrologic variable impacting rainfall–runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  9. Probing Dark Energy via Neutrino and Supernova Observatories

    Energy Technology Data Exchange (ETDEWEB)

    Hall, Lawrence; Hall, Lawrence J.; Murayama, Hitoshi; Papucci, Michele; Perez, Gilad

    2006-07-10

    A novel method for extracting cosmological evolution parameters is proposed, using a probe other than light: future observations of the diffuse anti-neutrino flux emitted from core-collapse supernovae (SNe), combined with the SN rate extracted from future SN surveys. The relic SN neutrino differential flux can be extracted by using future neutrino detectors such as Gadolinium-enriched, megaton, water detectors or 100-kiloton detectors of liquid Argon or liquid scintillator. The core-collapse SN rate can be reconstructed from direct observation of SN explosions using future precision observatories. Our method, by itself, cannot compete with the accuracy of the optical-based measurements but may serve as an important consistency check as well as a source of complementary information. The proposal does not require construction of a dedicated experiment, but rather relies on future experiments proposed for other purposes.

  10. Probing Dark Energy via Neutrino and Supernova Observatories

    International Nuclear Information System (INIS)

    Hall, Lawrence; Hall, Lawrence J.; Murayama, Hitoshi; Papucci, Michele; Perez, Gilad

    2006-01-01

    A novel method for extracting cosmological evolution parameters is proposed, using a probe other than light: future observations of the diffuse anti-neutrino flux emitted from core-collapse supernovae (SNe), combined with the SN rate extracted from future SN surveys. The relic SN neutrino differential flux can be extracted by using future neutrino detectors such as Gadolinium-enriched, megaton, water detectors or 100-kiloton detectors of liquid Argon or liquid scintillator. The core-collapse SN rate can be reconstructed from direct observation of SN explosions using future precision observatories. Our method, by itself, cannot compete with the accuracy of the optical-based measurements but may serve as an important consistency check as well as a source of complementary information. The proposal does not require construction of a dedicated experiment, but rather relies on future experiments proposed for other purposes.

  11. New seismic instrumentation packaged for all terrestrial environments (including the quietest observatories!).

    Science.gov (United States)

    Parker, Tim; Devanney, Peter; Bainbridge, Geoff; Townsend, Bruce

    2017-04-01

    The march to make every type of seismometer, weak- to strong-motion, reliable and economically deployable in any terrestrial environment continues with the availability of three new sensors and seismic systems, including ones with over 200 dB of dynamic range. Until recently there were probably 100 pier-type broadband sensors for every observatory-type pier, not the types of deployments geoscientists need to advance science and monitoring capability. Deeper boreholes are now recognized as the quieter environments for the best observatory-class instruments, and these same instruments can now be deployed in direct-burial environments, which is unprecedented. The experience of facilities with large deployments of broadband seismometers in continental-scale rolling arrays proves the utility of packaging new sensors in corrosion-resistant casings and designing in the robustness needed to work reliably in temporary deployments. Integrating digitizers and other sensors decreases deployment complexity, decreases acquisition and deployment costs, and increases reliability and utility. We'll discuss the informed evolution of broadband pier instruments into modern integrated field tools that enable economic densification of monitoring arrays and support new ways to approach geoscience research in a field environment.

  12. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo = Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio = Ti⊥/Ti||). Electron anisotropy effects are known to be helpless in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for large-scale reconnection to be achieved. We then investigate whether the introduction of electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each of lateral length 1.5–3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially shortens the triggering time scale but does not enhance the saturation level of the reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and leads to even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  13. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing, with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, the Earth sciences need large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  14. Heating of large format filters in sub-mm and fir space optics

    Science.gov (United States)

    Baccichet, N.; Savini, G.

    2017-11-01

    Most FIR and sub-mm space-borne observatories use polymer-based quasi-optical elements such as filters and lenses, due to their high transparency and low absorption in these wavelength ranges. Nevertheless, data from those missions have shown that thermal imbalances in the instrument (not caused by filters) can complicate the data analysis. Consequently, for future, higher-precision instrumentation, further investigation is required into any thermal imbalances embedded in such polymer-based filters. In particular, this paper studies the heating of polymers operating at cryogenic temperatures in space. This phenomenon is an important aspect of their functioning, since the transient emission of unwanted thermal radiation may affect the scientific measurements. To assess this effect, a computer model was developed for polypropylene-based filters and PTFE-based coatings. Specifically, a theoretical model of their thermal properties was created and used in a multi-physics simulation that accounts for conductive and radiative heating effects of large optical elements, the geometry of which was suggested by the large format array instruments designed for future space missions. It was found that in the simulated conditions, the filter temperature exhibited a time-dependent behaviour modulated by a small-scale fluctuation. Moreover, thermalization was reached only when a low power input was present.

  15. Initial Technology Assessment for the Large UV-Optical-Infrared (LUVOIR) Mission Concept Study

    Science.gov (United States)

    Bolcar, Matthew R.; Feinberg, Lee D.; France, Kevin; Rauscher, Bernard J.; Redding, David; Schiminovich, David

    2016-01-01

    The NASA Astrophysics Division's 30-Year Roadmap prioritized a future large-aperture space telescope operating in the ultraviolet-optical-infrared wavelength regime. The Association of Universities for Research in Astronomy envisioned a similar observatory, the High Definition Space Telescope, and a multi-institution group also studied the Advanced Technology Large Aperture Space Telescope. In all three cases, a broad science case is outlined, combining general astrophysics with the search for bio-signatures via direct imaging and spectroscopic characterization of habitable exoplanets. We present an initial assessment of the technologies that enable such an observatory, which is currently being studied for the 2020 Decadal Survey by the Large UV-Optical-Infrared (LUVOIR) Surveyor Science and Technology Definition Team. We present here the technology prioritization for the 2016 technology cycle and define the required technology capabilities and current state-of-the-art performance. Current, planned, and recommended technology development efforts are also reported.

  16. Improvements in geomagnetic observatory data quality

    DEFF Research Database (Denmark)

    Reda, Jan; Fouassier, Danielle; Isac, Anca

    2011-01-01

    between observatories and the establishment of observatory networks has harmonized standards and practices across the world, improving the quality of the data product available to the user. Nonetheless, operating a high-quality geomagnetic observatory is non-trivial. This article gives a record...... of the current state of observatory instrumentation and methods, citing some of the general problems in the complex operation of geomagnetic observatories. It further gives an overview of recent improvements in observatory data quality based on presentations during the 11th IAGA Assembly at Sopron and INTERMAGNET...

  17. The Convergence of High Performance Computing and Large Scale Data Analytics

    Science.gov (United States)

    Duffy, D.; Bowen, M. K.; Thompson, J. H.; Yang, C. P.; Hu, F.; Wills, B.

    2015-12-01

    As the combinations of remote sensing observations and model outputs have grown, scientists are increasingly burdened with both the necessity and complexity of large-scale data analysis. Scientists are increasingly applying traditional high performance computing (HPC) solutions to solve their "Big Data" problems. While this approach has the benefit of limiting data movement, the HPC system is not optimized to run analytics, which can create problems that permeate the HPC environment. To solve these issues and to alleviate some of the strain on the HPC environment, the NASA Center for Climate Simulation (NCCS) has created the Advanced Data Analytics Platform (ADAPT), which combines both HPC and cloud technologies to create an agile system designed for analytics. Large, commonly used data sets, such as Landsat, MODIS, MERRA, and NGA, are stored in this system in a write-once/read-many file system. High performance virtual machines are deployed and scaled according to the individual scientist's requirements specifically for data analysis. On the software side, the NCCS and GMU are working with emerging commercial technologies and applying them to structured, binary scientific data in order to expose the data in new ways. Native NetCDF data is stored within a Hadoop Distributed File System (HDFS), enabling storage-proximal processing through MapReduce while continuing to provide accessibility of the data to traditional applications. Once the data is stored within HDFS, an additional indexing scheme is built on top of the data and placed into a relational database. This spatiotemporal index enables extremely fast mappings of queries to data locations, dramatically speeding up analytics. These are some of the first steps toward a single unified platform that optimizes for both HPC and large-scale data analysis, and this presentation will elucidate the resulting and necessary exascale architectures required for future systems.
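The spatiotemporal indexing idea described above, mapping queries to data locations instead of scanning whole files, can be sketched with a toy in-memory table. The tile size, key layout, and file locations below are illustrative assumptions, not the NCCS schema:

```python
# Hedged sketch: a spatiotemporal index mapping (lat/lon tile, time step)
# keys to data-block locations, so a box query touches only relevant
# blocks. Tile size, key layout, and filenames are assumed for illustration.
from collections import defaultdict

TILE = 10.0  # degrees per tile (assumed)

def tile_key(lat, lon, t):
    """Quantize a point into a (lat-tile, lon-tile, time) key."""
    return (int(lat // TILE), int(lon // TILE), t)

class SpatioTemporalIndex:
    def __init__(self):
        self._table = defaultdict(list)  # key -> [(file, offset), ...]

    def add(self, lat, lon, t, location):
        self._table[tile_key(lat, lon, t)].append(location)

    def query(self, lat_range, lon_range, t):
        """Return candidate block locations overlapping the query box."""
        hits = []
        for i in range(int(lat_range[0] // TILE), int(lat_range[1] // TILE) + 1):
            for j in range(int(lon_range[0] // TILE), int(lon_range[1] // TILE) + 1):
                hits.extend(self._table.get((i, j, t), []))
        return hits

idx = SpatioTemporalIndex()
idx.add(38.9, -77.0, "2015-06-01", ("merra_001.nc4", 4096))
idx.add(51.5, -0.1, "2015-06-01", ("merra_002.nc4", 0))
print(idx.query((35.0, 45.0), (-80.0, -70.0), "2015-06-01"))
```

A real deployment would keep this table in a relational database, as the abstract describes, but the lookup pattern is the same.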

  18. Virtual Observatories, Data Mining, and Astroinformatics

    Science.gov (United States)

    Borne, Kirk

    The historical, current, and future trends in knowledge discovery from data in astronomy are presented here. The story begins with a brief history of data gathering and data organization. A description of the development of new information science technologies for astronomical discovery is then presented. Among these are e-Science and the virtual observatory, with its data discovery, access, display, and integration protocols; astroinformatics and data mining for exploratory data analysis, information extraction, and knowledge discovery from distributed data collections; new sky surveys' databases, including rich multivariate observational parameter sets for large numbers of objects; and the emerging discipline of data-oriented astronomical research, called astroinformatics. Astroinformatics is described as the fourth paradigm of astronomical research, following the three traditional research methodologies: observation, theory, and computation/modeling. Astroinformatics research areas include machine learning, data mining, visualization, statistics, semantic science, and scientific data management. Each of these areas is now an active research discipline, with significant science-enabling applications in astronomy. Research challenges and sample research scenarios are presented in these areas, in addition to sample algorithms for data-oriented research. These information science technologies enable scientific knowledge discovery from the increasingly large and complex data collections in astronomy. The education and training of the modern astronomy student must consequently include skill development in these areas, whose practitioners have traditionally been limited to applied mathematicians, computer scientists, and statisticians. Modern astronomical researchers must cross these traditional discipline boundaries, thereby borrowing the best-of-breed methodologies from multiple disciplines. In the era of large sky surveys and numerous large telescopes, the potential

  19. A European collaboration research programme to study and test large scale base isolated structures

    International Nuclear Information System (INIS)

    Renda, V.; Verzeletti, G.; Papa, L.

    1995-01-01

    The improvement of the technology of innovative anti-seismic mechanisms, such as those for base isolation and energy dissipation, requires the capability to test large-scale models of structures integrated with these mechanisms. These kinds of experimental tests are of primary importance for the validation of design rules and for establishing an advanced earthquake engineering practice for civil constructions of relevant interest. The Joint Research Centre of the European Commission offers the European Laboratory for Structural Assessment, located at Ispra, Italy, as a focal point for an international European collaborative research programme to test large-scale models of structures making use of innovative anti-seismic mechanisms. A collaboration contract, open to other future contributions, has been signed with the Italian national working group on seismic isolation (Gruppo di Lavoro sull'Isolamento Sismico, GLIS), which includes the national research centre ENEA, the national electricity board ENEL, the industrial research centre ISMES and the isolator producer ALGA. (author). 3 figs

  20. Large-Scale Spacecraft Fire Safety Tests

    Science.gov (United States)

    Urban, David; Ruff, Gary A.; Ferkul, Paul V.; Olson, Sandra; Fernandez-Pello, A. Carlos; T'ien, James S.; Torero, Jose L.; Cowlard, Adam J.; Rouvreau, Sebastien; Minster, Olivier; hide

    2014-01-01

    An international collaborative program is underway to address open issues in spacecraft fire safety. Because of limited access to long-term low-gravity conditions and the small volume generally allotted for these experiments, there have been relatively few experiments that directly study spacecraft fire safety under low-gravity conditions. Furthermore, none of these experiments have studied sample sizes and environment conditions typical of those expected in a spacecraft fire. The major constraint has been the size of the sample, with prior experiments limited to samples of the order of 10 cm in length and width or smaller. This lack of experimental data forces spacecraft designers to base their designs and safety precautions on a 1-g understanding of flame spread, fire detection, and suppression. However, low-gravity combustion research has demonstrated substantial differences in flame behavior in low gravity. This, combined with the differences caused by the confined spacecraft environment, necessitates practical-scale spacecraft fire safety research to mitigate risks for future space missions. To address this issue, a large-scale spacecraft fire experiment is under development by NASA and an international team of investigators. This poster presents the objectives, status, and concept of this collaborative international project (Saffire). The project plan is to conduct fire safety experiments on three sequential flights of an unmanned ISS re-supply spacecraft (the Orbital Cygnus vehicle) after they have completed their delivery of cargo to the ISS and have begun their return journeys to Earth. On two flights (Saffire-1 and Saffire-3), the experiment will consist of a flame spread test involving a meter-scale sample ignited in the pressurized volume of the spacecraft and allowed to burn to completion while measurements are made. On one of the flights (Saffire-2), 9 smaller (5 x 30 cm) samples will be tested to evaluate NASA's material flammability screening tests

  1. The Solar Connections Observatory for Planetary Environments

    Science.gov (United States)

    Oliversen, Ronald J.; Harris, Walter M.; Oegerle, William R. (Technical Monitor)

    2002-01-01

    The NASA Sun-Earth Connection theme roadmap calls for comparative study of how the planets, comets, and local interstellar medium (LISM) interact with the Sun and respond to solar variability. Through such a study we advance our understanding of basic physical plasma and gas dynamic processes, thus increasing our predictive capabilities for the terrestrial, planetary, and interplanetary environments where future remote and human exploration will occur. Because the other planets have lacked study initiatives comparable to the terrestrial ITM, LWS, and EOS programs, our understanding of the upper atmospheres and near-space environments of these worlds is far less detailed than our knowledge of the Earth. To close this gap we propose a mission to study all of the solar-interacting bodies in our planetary system out to the heliopause with a single remote sensing space observatory, the Solar Connections Observatory for Planetary Environments (SCOPE). SCOPE consists of a binocular EUV/FUV telescope operating from a remote, drift-away orbit that provides sub-arcsecond imaging and broadband medium-resolution spectro-imaging over the 55-290 nm bandpass, and high-resolution (R > 10^5) H Lyman-α emission line profile measurements of small-scale planetary and wide-field diffuse solar system structures. A key to the SCOPE approach is to include Earth as a primary science target. From its remote vantage point SCOPE will be able to observe auroral emission to and beyond the rotational pole. The other planets and comets will be monitored in long-duration campaigns centered when possible on solar opposition, when interleaved terrestrial-planet observations can be used to directly compare the response of both worlds to the same solar wind stream and UV radiation field. Using a combination of observations and MHD models, SCOPE will isolate the different controlling parameters in each planet system and gain insight into the underlying physical processes that define the

  2. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  3. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. A finite element model of the top flange connection system is established with the finite element analysis software MSC Marc/Mentat, and its fatigue strain is analysed. Load simulation of the flange fatigue working condition is carried out with the Bladed software, and the flange fatigue load spectrum is acquired with the rain-flow counting method. Finally, fatigue analysis of the top flange is performed with the fatigue analysis software MSC Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide a new approach to flange fatigue analysis of large-scale wind turbine generators and possess practical engineering value.
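The Palmgren-Miner rule used above can be sketched in a few lines: damage contributions n_i/N_i from each rain-flow-counted stress range are summed, with failure predicted when the total reaches 1. The S-N curve constants and the load spectrum below are illustrative assumptions, not values from the paper:

```python
# Hedged sketch of Palmgren-Miner linear cumulative damage applied to a
# rain-flow load spectrum. S-N constants C, m and the spectrum are
# illustrative, not the paper's values.

def cycles_to_failure(stress_range, C=1e12, m=3.0):
    """Basquin-type S-N curve: N = C / S^m (constants assumed)."""
    return C / stress_range ** m

def miner_damage(spectrum):
    """spectrum: list of (stress_range_MPa, applied_cycles) pairs from
    rain-flow counting. Damage >= 1 implies predicted fatigue failure."""
    return sum(n / cycles_to_failure(s) for s, n in spectrum)

# Toy spectrum: (stress range in MPa, number of cycles at that range)
spectrum = [(80.0, 2.0e5), (120.0, 5.0e4), (200.0, 1.0e3)]
print(f"cumulative damage D = {miner_damage(spectrum):.3f}")
```

In practice the spectrum would come from a rain-flow count of simulated load histories, as described in the abstract.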

  4. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    According to the complex real-time water situation, the real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposed a robust, two-dimensional, shallow water model based on the unstructured Godunov- type finite volume method. A robust wet/dry front method is used to enhance the numerical stability. An adaptive method is proposed to improve the running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
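The model above is two-dimensional and unstructured with an adaptive method; as a rough illustration of the Godunov-type finite-volume idea with a wet/dry threshold, here is a minimal 1D shallow water step using a Rusanov flux (all parameters are assumptions for illustration, not the paper's scheme):

```python
# Hedged 1D sketch of a Godunov-type finite-volume step for the shallow
# water equations: Rusanov (local Lax-Friedrichs) numerical flux plus a
# crude wet/dry depth threshold. Grid, dt, and threshold are assumed.
import numpy as np

G = 9.81       # gravitational acceleration (m/s^2)
H_DRY = 1e-6   # wet/dry depth threshold (assumed)

def rusanov_flux(hL, huL, hR, huR):
    """Interface flux for state vectors q = (h, hu)."""
    def phys(h, hu):
        u = hu / h if h > H_DRY else 0.0
        return np.array([hu, hu * u + 0.5 * G * h * h])
    uL = huL / hL if hL > H_DRY else 0.0
    uR = huR / hR if hR > H_DRY else 0.0
    smax = max(abs(uL) + np.sqrt(G * max(hL, 0.0)),
               abs(uR) + np.sqrt(G * max(hR, 0.0)))
    qL, qR = np.array([hL, huL]), np.array([hR, huR])
    return 0.5 * (phys(hL, huL) + phys(hR, huR)) - 0.5 * smax * (qR - qL)

def step(h, hu, dx, dt):
    """One explicit finite-volume update; boundary cells held fixed."""
    n = len(h)
    F = np.array([rusanov_flux(h[i], hu[i], h[i + 1], hu[i + 1])
                  for i in range(n - 1)])
    h_new, hu_new = h.copy(), hu.copy()
    h_new[1:-1] -= dt / dx * (F[1:, 0] - F[:-1, 0])
    hu_new[1:-1] -= dt / dx * (F[1:, 1] - F[:-1, 1])
    h_new[h_new < H_DRY] = 0.0  # clamp tiny depths at dry cells
    return h_new, hu_new

# Dam-break initial condition on a 100-cell grid.
h = np.where(np.arange(100) < 50, 2.0, 1.0)
hu = np.zeros(100)
h1, hu1 = step(h, hu, dx=1.0, dt=0.01)
```

The real model adds unstructured 2D geometry, a robust wet/dry front treatment, and adaptivity, but the flux-difference update above is the core of any Godunov-type scheme.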

  5. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend of large scales simulations of fusion plasma and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas and some of these techniques are now applied to analyses of processing plasmas. (author)

  6. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale linearly with Mach number, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number.
Inhomogeneous nearly

  7. ASSOCIATION OF {sup 3}He-RICH SOLAR ENERGETIC PARTICLES WITH LARGE-SCALE CORONAL WAVES

    Energy Technology Data Exchange (ETDEWEB)

    Bučík, Radoslav [Institut für Astrophysik, Georg-August-Universität Göttingen, D-37077, Göttingen (Germany); Innes, Davina E. [Max-Planck-Institut für Sonnensystemforschung, D-37077, Göttingen (Germany); Mason, Glenn M. [Applied Physics Laboratory, Johns Hopkins University, Laurel, MD 20723 (United States); Wiedenbeck, Mark E., E-mail: bucik@mps.mpg.de [Jet Propulsion Laboratory, California Institute of Technology, Pasadena, CA 91109 (United States)

    2016-12-10

    Small, {sup 3}He-rich solar energetic particle (SEP) events have been commonly associated with extreme-ultraviolet (EUV) jets and narrow coronal mass ejections (CMEs) that are believed to be the signatures of magnetic reconnection involving field lines open to interplanetary space. The elemental and isotopic fractionation in these events is thought to be caused by processes confined to the flare sites. In this study, we identify 32 {sup 3}He-rich SEP events observed by the Advanced Composition Explorer, near the Earth, during the solar minimum period 2007–2010, and we examine their solar sources with high-resolution Solar Terrestrial Relations Observatory (STEREO) EUV images. Leading the Earth, STEREO-A has provided, for the first time, a direct view of {sup 3}He-rich flares, which are generally located on the Sun's western hemisphere. Surprisingly, we find that about half of the {sup 3}He-rich SEP events in this survey are associated with large-scale EUV coronal waves. An examination of the wave front propagation, the source-flare distribution, and the coronal magnetic field connections suggests that the EUV waves may affect the injection of {sup 3}He-rich SEPs into interplanetary space.

  8. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  9. Auroral electrojet dynamics during magnetic storms, connection with plasma precipitation and large-scale structure of the magnetospheric magnetic field

    Directory of Open Access Journals (Sweden)

    Y. I. Feldstein

    1999-04-01

    magnetospheric magnetic field paraboloid model the influence of the ring current and magnetospheric tail plasma sheet currents on large-scale magnetosphere structure is considered.Key words. Ionosphere (particle precipitation · Magnetospheric physics (current systems; magnetospheric configuration and dynamics.

  10. Energy System Analysis of Large-Scale Integration of Wind Power

    International Nuclear Information System (INIS)

    Lund, Henrik

    2003-11-01

    The paper presents the results of two research projects conducted by Aalborg University and financed by the Danish Energy Research Programme. Both projects include the development of models and system analyses focused on large-scale integration of wind power into different energy systems. Market reactions and the ability to exploit exchange on the international market for electricity by locating exports in hours of high prices are included in the analyses. This paper focuses on results which are valid for energy systems in general. The paper presents the ability of different energy systems and regulation strategies to integrate wind power. This ability is expressed by three factors: the first is the degree of excess electricity production caused by fluctuations in wind and CHP heat demands; the second is the ability to utilise wind power to reduce CO2 emissions in the system; and the third is the ability to benefit from exchange of electricity on the market. Energy systems and regulation strategies are analysed over a range of wind power inputs from 0 to 100% of the electricity demand. Based on the Danish energy system, in which 50 per cent of the electricity demand is produced by CHP, a number of future energy systems with CO2 reduction potentials are analysed, i.e. systems with more CHP, systems using electricity for transportation (battery or hydrogen vehicles) and systems with fuel-cell technologies. For the present and such potential future energy systems, different regulation strategies have been analysed, i.e. the inclusion of small CHP plants in the regulation task of electricity balancing and grid stability, and investments in electric heating, heat pumps and heat storage capacity. The potential of energy management has also been analysed. The results of the analyses make it possible to compare short-term and long-term potentials of different strategies of large-scale integration of wind power
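The first of the three factors, the degree of excess electricity production, can be illustrated with a toy hourly balance: excess arises in hours when fluctuating wind output plus heat-driven CHP output exceeds demand. The numbers below are invented for illustration only:

```python
# Hedged illustration of the excess-production factor: fraction of total
# demand that appears as excess when wind + CHP exceed demand in some
# hours. All numbers are toy values, not results from the paper.

def excess_share(wind, chp, demand):
    """Fraction of total demand appearing as excess production."""
    excess = sum(max(0.0, w + c - d) for w, c, d in zip(wind, chp, demand))
    return excess / sum(demand)

wind   = [300, 800, 1200, 200]   # MW in each hour (toy values)
chp    = [500, 500,  600, 400]   # heat-driven CHP output (toy values)
demand = [900, 1000, 1100, 800]  # electricity demand (toy values)
print(f"excess share = {excess_share(wind, chp, demand):.3f}")
```

Regulation strategies such as heat pumps or heat storage reduce this share by shifting CHP output away from hours of high wind.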

  11. TrackingNet: A Large-Scale Dataset and Benchmark for Object Tracking in the Wild

    KAUST Repository

    Müller, Matthias; Bibi, Adel Aamer; Giancola, Silvio; Al-Subaihi, Salman; Ghanem, Bernard

    2018-01-01

    Despite the numerous developments in object tracking, further development of current tracking algorithms is limited by small and mostly saturated datasets. As a matter of fact, data-hungry trackers based on deep-learning currently rely on object detection datasets due to the scarcity of dedicated large-scale tracking datasets. In this work, we present TrackingNet, the first large-scale dataset and benchmark for object tracking in the wild. We provide more than 30K videos with more than 14 million dense bounding box annotations. Our dataset covers a wide selection of object classes in broad and diverse context. By releasing such a large-scale dataset, we expect deep trackers to further improve and generalize. In addition, we introduce a new benchmark composed of 500 novel videos, modeled with a distribution similar to our training dataset. By sequestering the annotation of the test set and providing an online evaluation server, we provide a fair benchmark for future development of object trackers. Deep trackers fine-tuned on a fraction of our dataset improve their performance by up to 1.6% on OTB100 and up to 1.7% on TrackingNet Test. We provide an extensive benchmark on TrackingNet by evaluating more than 20 trackers. Our results suggest that object tracking in the wild is far from being solved.
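Benchmarks like the one described above typically score trackers by bounding-box overlap per frame. A minimal sketch of the standard intersection-over-union measure and a success-rate summary follows; the (x, y, w, h) box format and the threshold are assumptions, not TrackingNet's exact evaluation code:

```python
# Hedged sketch of overlap-based tracker evaluation: per-frame IoU between
# predicted and ground-truth boxes, summarized as a success rate at a
# threshold. Box format (x, y, w, h) and threshold 0.5 are assumptions.

def iou(a, b):
    """Intersection-over-union of two boxes given as (x, y, w, h)."""
    ax1, ay1, ax2, ay2 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx1, by1, bx2, by2 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

def success_rate(preds, gts, threshold=0.5):
    """Fraction of frames where predicted box overlaps ground truth."""
    return sum(iou(p, g) >= threshold for p, g in zip(preds, gts)) / len(gts)

print(iou((0, 0, 10, 10), (5, 5, 10, 10)))
```

Sequestering the test-set annotations, as TrackingNet does, means this scoring runs server-side so trackers cannot overfit the ground truth.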

  12. TrackingNet: A Large-Scale Dataset and Benchmark for Object Tracking in the Wild

    KAUST Repository

    Müller, Matthias

    2018-03-28

    Despite the numerous developments in object tracking, further development of current tracking algorithms is limited by small and mostly saturated datasets. As a matter of fact, data-hungry trackers based on deep-learning currently rely on object detection datasets due to the scarcity of dedicated large-scale tracking datasets. In this work, we present TrackingNet, the first large-scale dataset and benchmark for object tracking in the wild. We provide more than 30K videos with more than 14 million dense bounding box annotations. Our dataset covers a wide selection of object classes in broad and diverse context. By releasing such a large-scale dataset, we expect deep trackers to further improve and generalize. In addition, we introduce a new benchmark composed of 500 novel videos, modeled with a distribution similar to our training dataset. By sequestering the annotation of the test set and providing an online evaluation server, we provide a fair benchmark for future development of object trackers. Deep trackers fine-tuned on a fraction of our dataset improve their performance by up to 1.6% on OTB100 and up to 1.7% on TrackingNet Test. We provide an extensive benchmark on TrackingNet by evaluating more than 20 trackers. Our results suggest that object tracking in the wild is far from being solved.

  13. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  14. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves, and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^-3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^-3 power-law, and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^-11/5 form at large scales to a steeper approximate k^-3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^-1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^-2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^-1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.
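Spectral slopes like those quoted above are typically estimated by a least-squares fit of log E(k) against log k over a chosen inertial range. A minimal sketch on a synthetic spectrum follows (the fit range and amplitude are assumptions, not the paper's diagnostics):

```python
# Hedged sketch: estimating a power-law spectral index by least-squares
# fitting log E(k) vs log k. Spectrum, amplitude, and fit range are
# synthetic, chosen only to illustrate the procedure.
import numpy as np

def spectral_slope(k, E):
    """Least-squares power-law exponent of E(k) ~ k^slope."""
    slope, _intercept = np.polyfit(np.log(k), np.log(E), 1)
    return slope

k = np.arange(10.0, 200.0)   # wavenumbers in an assumed inertial range
E = 2.5 * k ** -3.0          # ideal spectrum with a k^-3 scaling
print(f"fitted slope = {spectral_slope(k, E):.2f}")
```

On real simulation output one would first bin the 2D FFT of the field into shell-averaged E(k) before fitting.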

  15. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small-scale turbulence. The VSHF consists of internal gravity waves, and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k^-3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k^-3 power-law, and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k^-11/5 form at large scales to a steeper approximate k^-3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k^-1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k^-2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k^-1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  16. Recent results from the Compton Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Michelson, P.F.; Hansen, W.W. [Stanford Univ., CA (United States)]

    1994-12-01

    The Compton Observatory is an orbiting astronomical observatory for gamma-ray astronomy that covers the energy range from about 30 keV to 30 GeV. The Energetic Gamma Ray Experiment Telescope (EGRET), one of four instruments on-board, is capable of detecting and imaging gamma radiation from cosmic sources in the energy range from approximately 20 MeV to 30 GeV. After about one month of tests and calibration following the April 1991 launch, a 15-month all sky survey was begun. This survey is now complete and the Compton Observatory is well into Phase II of its observing program which includes guest investigator observations. Among the highlights from the all-sky survey discussed in this presentation are the following: detection of five pulsars with emission above 100 MeV; detection of more than 24 active galaxies, the most distant at redshift greater than two; detection of many high latitude, unidentified gamma-ray sources, some showing significant time variability; detection of at least two high energy gamma-ray bursts, with emission in one case extending to at least 1 GeV. EGRET has also detected gamma-ray emission from solar flares up to energies of at least 2 GeV and has observed gamma-rays from the Large Magellanic Cloud.

  17. Technology Development for a Neutrino Astrophysical Observatory

    International Nuclear Information System (INIS)

    Chaloupka, V.; Cole, T.; Crawford, H.J.; He, Y.D.; Jackson, S.; Kleinfelder, S.; Lai, K.W.; Learned, J.; Ling, J.; Liu, D.; Lowder, D.; Moorhead, M.; Morookian, J.M.; Nygren, D.R.; Price, P.B.; Richards, A.; Shapiro, G.; Shen, B.; Smoot, George F.; Stokstad, R.G.; VanDalen, G.; Wilkes, J.; Wright, F.; Young, K.

    1996-01-01

    We propose a set of technology developments relevant to the design of an optimized Cerenkov detector for the study of neutrino interactions of astrophysical interest. Emphasis is placed on signal processing innovations that significantly enhance the quality of the primary data. These technical advances, combined with field experience from a follow-on test deployment, are intended to provide a basis for the engineering design of a kilometer-scale Neutrino Astrophysical Observatory.

  18. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor–Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = −0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, "over-damped" LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential of using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
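
    For reference, the Smagorinsky closure with constant c_s mentioned above models the subgrid-scale stress through an eddy viscosity; in standard notation (definitions not spelled out in the record),

```latex
\nu_t = (c_s\,\Delta)^2\,\lvert\bar{S}\rvert ,
\qquad
\lvert\bar{S}\rvert = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}} ,
```

    where Δ is the filter width and \bar{S}_{ij} the resolved strain-rate tensor. The "over-damped" runs correspond to deliberately enlarging c_s, and hence ν_t.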

  19. Modelling high Reynolds number wall-turbulence interactions in laboratory experiments using large-scale free-stream turbulence.

    Science.gov (United States)

    Dogan, Eda; Hearst, R Jason; Ganapathisubramani, Bharathram

    2017-03-13

    A turbulent boundary layer subjected to free-stream turbulence is investigated in order to ascertain the scale interactions that dominate the near-wall region. The results are discussed in relation to a canonical high Reynolds number turbulent boundary layer because previous studies have reported considerable similarities between these two flows. Measurements were acquired simultaneously from four hot wires mounted on a rake which was traversed through the boundary layer. Particular focus is given to two main features of both canonical high Reynolds number boundary layers and boundary layers subjected to free-stream turbulence: (i) the footprint of the large scales in the logarithmic region on the near-wall small scales, specifically the modulating interaction between these scales, and (ii) the phase difference in amplitude modulation. The potential for a turbulent boundary layer subjected to free-stream turbulence to 'simulate' high Reynolds number wall-turbulence interactions is discussed. The results of this study have encouraging implications for future investigations of the fundamental scale interactions that take place in high Reynolds number flows, as they demonstrate that these can be achieved at typical laboratory scales. This article is part of the themed issue 'Toward the development of high-fidelity models of wall turbulence at large Reynolds number'. © 2017 The Author(s).
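
    The amplitude-modulation diagnostic in (i)–(ii) is typically quantified by correlating the large-scale velocity with the filtered envelope of the small-scale fluctuations. Below is a minimal single-point sketch in pure NumPy (FFT-based Hilbert envelope and sharp spectral filters); the cutoff frequency and the Mathis-style correlation coefficient are illustrative choices, not the authors' exact processing:

```python
import numpy as np

def hilbert_envelope(x):
    # Analytic-signal envelope via FFT: zero out negative frequencies,
    # double the positive ones, then take the magnitude.
    n = len(x)
    X = np.fft.fft(x)
    h = np.zeros(n)
    h[0] = 1.0
    if n % 2 == 0:
        h[n // 2] = 1.0
        h[1:n // 2] = 2.0
    else:
        h[1:(n + 1) // 2] = 2.0
    return np.abs(np.fft.ifft(X * h))

def modulation_coefficient(u, cutoff, fs=1.0):
    # Split u at `cutoff` (Hz) with a sharp spectral filter, then correlate
    # the low-passed envelope of the small scales with the large-scale signal.
    n = len(u)
    f = np.fft.fftfreq(n, d=1.0 / fs)
    U = np.fft.fft(u - u.mean())
    u_large = np.fft.ifft(np.where(np.abs(f) <= cutoff, U, 0.0)).real
    u_small = (u - u.mean()) - u_large
    env = hilbert_envelope(u_small)
    E = np.fft.fft(env - env.mean())
    env_large = np.fft.ifft(np.where(np.abs(f) <= cutoff, E, 0.0)).real
    return np.corrcoef(u_large, env_large)[0, 1]
```

    For a synthetic signal whose high-frequency carrier is amplitude-modulated by its own low-frequency component, the coefficient approaches 1; for uncorrelated scales it is near zero.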

  20. Projection Effects of Large-scale Structures on Weak-lensing Peak Abundances

    Science.gov (United States)

    Yuan, Shuo; Liu, Xiangkun; Pan, Chuzhong; Wang, Qiao; Fan, Zuhui

    2018-04-01

    High peaks in weak lensing (WL) maps originate dominantly from the lensing effects of single massive halos. Their abundance is therefore closely related to the halo mass function, and is thus a powerful cosmological probe. However, besides individual massive halos, large-scale structures (LSS) along lines of sight also contribute to the peak signals. In this paper, with ray-tracing simulations, we investigate the LSS projection effects. We show that for current surveys with a large shape noise, the stochastic LSS effects are subdominant. For future WL surveys with source galaxies having a median redshift z_med ∼ 1 or higher, however, they are significant. For the cosmological constraints derived from observed WL high-peak counts, severe biases can occur if the LSS effects are not taken into account properly. We extend the model of Fan et al. by incorporating the LSS projection effects into the theoretical considerations. By comparing with simulation results, we demonstrate the good performance of the improved model and its applicability in cosmological studies.

  1. Application of parallel computing techniques to a large-scale reservoir simulation

    International Nuclear Information System (INIS)

    Zhang, Keni; Wu, Yu-Shu; Ding, Chris; Pruess, Karsten

    2001-01-01

    Even with the continual advances made in both computational algorithms and the computer hardware used in reservoir modeling studies, large-scale simulation of fluid and heat flow in heterogeneous reservoirs remains a challenge. The problem commonly arises from the intensive computational requirements of detailed modeling investigations of real-world reservoirs. This paper presents the application of a massively parallel version of the TOUGH2 code developed for performing large-scale field simulations. As an application example, the parallelized TOUGH2 code is applied to develop a three-dimensional unsaturated-zone numerical model simulating flow of moisture, gas, and heat in the unsaturated zone of Yucca Mountain, Nevada, a potential repository for high-level radioactive waste. The modeling approach employs refined spatial discretization to represent the heterogeneous fractured tuffs of the system, using more than a million 3-D gridblocks. The problem of two-phase flow and heat transfer within the model domain leads to a total of 3,226,566 linear equations to be solved per Newton iteration. The simulation is conducted on a Cray T3E-900, a distributed-memory massively parallel computer. Simulation results indicate that the parallel computing technique, as implemented in the TOUGH2 code, is very efficient. The reliability and accuracy of the model results have been demonstrated by comparing them to those of small-scale (coarse-grid) models. These comparisons show that simulation results obtained with the refined grid provide more detailed predictions of future flow conditions at the site, aiding in the assessment of proposed repository performance.

  2. Plasmonic resonances of nanoparticles from large-scale quantum mechanical simulations

    Science.gov (United States)

    Zhang, Xu; Xiang, Hongping; Zhang, Mingliang; Lu, Gang

    2017-09-01

    Plasmonic resonance of metallic nanoparticles results from the coherent motion of their conduction electrons, driven by incident light. For nanoparticles less than 10 nm in diameter, localized surface plasmonic resonances become sensitive to the quantum nature of the conduction electrons. Unfortunately, quantum mechanical simulations based on time-dependent Kohn-Sham density functional theory are computationally too expensive to tackle metal particles larger than 2 nm. Herein, we introduce the recently developed time-dependent orbital-free density functional theory (TD-OFDFT) approach, which enables large-scale quantum mechanical simulations of plasmonic responses of metallic nanostructures. Using TD-OFDFT, we have performed quantum mechanical simulations to understand the size-dependent plasmonic response of Na nanoparticles and the plasmonic responses of Na nanoparticle dimers and trimers. An outlook on future development of the TD-OFDFT method is also presented.

  3. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  4. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interactions were each solved with the algorithm running on the GPU to test the performance of the package. Comparing the calculation results between the solver executed on a single CPU and the one on the GPU shows that the GPU version runs about 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
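
    The semi-implicit Fourier method underlying the package treats the stiff gradient (Laplacian) term implicitly in spectral space and the nonlinear bulk term explicitly, which removes the severe explicit time-step restriction. A minimal 1-D Allen-Cahn sketch in NumPy follows (a CPU stand-in for the CUDA kernels; the mobility M, gradient coefficient κ, grid and time step are illustrative, not the package's values):

```python
import numpy as np

def allen_cahn_step(phi, dt, M=1.0, kappa=0.01, L=2 * np.pi):
    # One semi-implicit Fourier step for d(phi)/dt = -M*(phi**3 - phi) + M*kappa*lap(phi):
    # the stiff Laplacian is treated implicitly in spectral space, the bulk term explicitly.
    n = phi.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=L / n)
    phi_hat = np.fft.fft(phi)
    nonlin_hat = np.fft.fft(phi**3 - phi)
    phi_hat = (phi_hat - dt * M * nonlin_hat) / (1.0 + dt * M * kappa * k**2)
    return np.fft.ifft(phi_hat).real

# Relax a small random field toward the +/-1 wells of the double-well potential.
rng = np.random.default_rng(0)
phi = 0.1 * rng.standard_normal(256)
for _ in range(2000):
    phi = allen_cahn_step(phi, dt=0.01)
```

    After relaxation the field settles into domains near ±1 separated by diffuse interfaces, the qualitative behaviour the GPU package accelerates at much larger scale.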

  5. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is manifested not only by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first-mile candidate to accommodate the data tsunami to be generated by the IoT. However, in the cellular paradigm IoT devices are required to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.
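
    The random-access bottleneck can be seen in a simple model: each device picks one contention preamble uniformly at random, and only a preamble chosen by exactly one device succeeds. A Monte-Carlo sketch follows (54 preambles is a typical LTE figure; the single-shot model without retransmissions is a simplification, not drawn from the article's experiments):

```python
import random

def rach_success_fraction(n_devices, n_preambles=54, trials=2000, seed=1):
    # Fraction of devices whose randomly chosen preamble collides with nobody,
    # averaged over Monte-Carlo trials (single-shot model, no retransmissions).
    rng = random.Random(seed)
    successes = 0
    for _ in range(trials):
        counts = [0] * n_preambles
        for _ in range(n_devices):
            counts[rng.randrange(n_preambles)] += 1
        successes += sum(1 for c in counts if c == 1)
    return successes / (trials * n_devices)
```

    Analytically the per-device success probability is (1 − 1/54)^(n−1), so it collapses as the number of contending devices grows, which is the bottleneck the article describes.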

  6. Wind Observatory 2017. Analysis of the wind power market, wind jobs and future of the wind industry in France

    International Nuclear Information System (INIS)

    2017-09-01

    Two years after the enactment of the Energy Transition for Green Growth Act, wind power capacity continues to grow in France, exceeding 12 GW at the end of 2016 and soon to account for 5% of France's electric power consumption. This vitality, which is set to continue in 2017, will help France achieve its objectives of an installed capacity of 15,000 MW in onshore wind by 2018 and 21,800 to 26,000 MW by 2023. The current pace will nevertheless have to be accelerated in order to reach the realistic objective of 26 GW by 2023 mentioned in the multi-annual energy plan (PPE). With 1,400 jobs created in one year and more than 3,300 over the last two years, the relevance of wind power as a driving force of sustainable job creation throughout the country is unequivocally confirmed: the increase in wind power capacity continues to contribute to the growth of employment in the country. Prepared in collaboration with the consulting firm BearingPoint, the 2017 edition of the Observatory aims to give the reader an overview of employment in the wind industry and the wind power market over the period under consideration. Any changes from the three previous editions are highlighted. It is based on a comprehensive census of all market participants on three themes: employment, the market and the future of wind power. The Observatory thereby gives an accurate picture of how the wind energy industry is structured and of all its components.

  7. The Virtual Solar Observatory and the Heliophysics Meta-Virtual Observatory

    Science.gov (United States)

    Gurman, Joseph B.

    2007-01-01

    The Virtual Solar Observatory (VSO) is now able to search for solar data ranging from the radio to gamma rays, obtained from space- and ground-based observatories, from 26 sources at 12 data providers, and from 1915 to the present. The solar physics community can use a Web interface or an Application Programming Interface (API) that allows integrating VSO searches into other software, including other Web services. Over the next few years, this integration will be especially visible as the NASA Heliophysics Division sponsors the development of a heliophysics-wide virtual observatory (VO), based on existing VOs in heliospheric, magnetospheric, and ionospheric physics as well as the VSO. We examine some of the challenges and potential of such a "meta-VO."

  8. Large-scale demonstration of D ampersand D technologies

    International Nuclear Information System (INIS)

    Bhattacharyya, S.K.; Black, D.B.; Rose, R.W.

    1997-01-01

    It is becoming increasingly evident that new technologies will need to be utilized for decontamination and decommissioning (D&D) activities in order to assure safe and cost-effective operations. The magnitude of the international D&D problem is sufficiently large in anticipated cost (hundreds of billions of dollars) and in elapsed time (decades) that the utilization of new technologies should lead to substantial improvements in cost and safety performance. Adoption of new technologies in the generally highly contaminated D&D environments requires assurances that the technology will perform as advertised. Such assurances can be obtained from demonstrations of the technology in environments that are similar to the actual environments without being quite as contaminated and hazardous. The Large Scale Demonstration Project (LSDP) concept was designed to provide such a function. The first LSDP funded by the U.S. Department of Energy's Environmental Management Office (EM) was on the Chicago Pile 5 (CP-5) Reactor at Argonne National Laboratory. The project, conducted by a Strategic Alliance for Environmental Restoration, has completed demonstrations of 10 D&D technologies and is in the process of comparing their performance to baseline technologies. At the conclusion of the project, a catalog of performance comparisons of these technologies will be developed that will be suitable for use by future D&D planners.

  9. Thermal power generation projects "Large Scale Solar Heating"; EU-Thermie-Projekte "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme, under which the technology is to be developed further Europe-wide. The demonstration programme derived from it was judged favourably by the reviewers but was not immediately (1996) accepted for financial support. In November 1997 the EU Commission then granted 1.5 million ECU at short notice, with which an updated project proposal can be realised. A smaller project, applied for under the lead of Chalmers Industriteknik (CIT) in Sweden and serving mainly technology transfer, had already been approved by mid-1997. (orig.)

  10. Initial Technology Assessment for the Large-Aperture UV-Optical-Infrared (LUVOIR) Mission Concept Study

    Science.gov (United States)

    Bolcar, Matthew R.; Feinberg, Lee; France, Kevin; Rauscher, Bernard J.; Redding, David; Schiminovich, David

    2016-01-01

    The NASA Astrophysics Division's 30-Year Roadmap prioritized a future large-aperture space telescope operating in the ultra-violet/optical/infrared wavelength regime. The Association of Universities for Research in Astronomy envisioned a similar observatory, the High Definition Space Telescope. And a multi-institution group also studied the Advanced Technology Large Aperture Space Telescope. In all three cases, a broad science case is outlined, combining general astrophysics with the search for biosignatures via direct-imaging and spectroscopic characterization of habitable exoplanets. We present an initial technology assessment that enables such an observatory that is currently being studied for the 2020 Decadal Survey by the Large UV/Optical/Infrared (LUVOIR) surveyor Science and Technology Definition Team. We present here the technology prioritization for the 2016 technology cycle and define the required technology capabilities and current state-of-the-art performance. Current, planned, and recommended technology development efforts are also reported.

  11. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  12. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform an effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
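
    The low-rank kernel approximation that such prototypes provide can be illustrated with a Nyström-style factorization over m landmark points. The sketch below uses an RBF kernel and randomly chosen prototypes for simplicity; the PVM's actual prototype selection is more refined than random sampling:

```python
import numpy as np

def rbf(X, Y, gamma=0.5):
    # Gaussian (RBF) kernel matrix between the rows of X and Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def nystrom_approx(X, m=20, gamma=0.5, seed=0):
    # Low-rank approximation K ~= K_nm @ pinv(K_mm) @ K_nm.T built from
    # m randomly chosen prototype (landmark) points.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(X), size=m, replace=False)
    prototypes = X[idx]
    K_nm = rbf(X, prototypes, gamma)
    K_mm = rbf(prototypes, prototypes, gamma)
    return K_nm @ np.linalg.pinv(K_mm) @ K_nm.T
```

    With m ≪ n prototypes the n × n kernel matrix never has to be formed for downstream computations, which is what makes graph-based SSL scalable in this style of method.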

  13. Towards Agent-Based Simulation of Emerging and Large-Scale Social Networks. Examples of the Migrant Crisis and MMORPGs

    Directory of Open Access Journals (Sweden)

    Schatten, Markus

    2016-10-01

    Large-scale agent-based simulation of social networks is described in the context of the migrant crisis in Syria and the EU, as well as massively multiplayer online role-playing games (MMORPGs). The recipeWorld system by Terna and Fontana is proposed as a possible solution for simulating large-scale social networks. The initial system has been re-implemented using the Smart Python multi-Agent Development Environment (SPADE), and Pyinteractive was used for visualization. We present initial simulation models that we plan to develop further in future studies. This paper thus reports research in progress that will, we hope, establish a novel agent-based modelling system in the context of the ModelMMORPG project.
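
    For intuition only, the core loop of an agent-based diffusion model on a network can be sketched in a few lines of pure Python. This toy stand-in (ring-lattice topology, binary adoption state, invented parameters) is unrelated to the recipeWorld/SPADE implementation:

```python
import random

def simulate_spread(n=200, k=4, p_adopt=0.5, steps=30, seed=7):
    # Agents sit on a ring lattice, each linked to its k nearest neighbours.
    # A non-adopter with at least one adopting neighbour adopts with
    # probability p_adopt per step; returns the final number of adopters.
    rng = random.Random(seed)
    nbrs = [[(i + d) % n for d in range(-k // 2, k // 2 + 1) if d != 0]
            for i in range(n)]
    state = [0] * n
    state[0] = 1  # a single seed agent
    for _ in range(steps):
        nxt = state[:]
        for i in range(n):
            if not state[i] and any(state[j] for j in nbrs[i]) and rng.random() < p_adopt:
                nxt[i] = 1
        state = nxt
    return sum(state)
```

    Frameworks such as SPADE wrap each agent in its own message-driven process; the synchronous loop above only illustrates the state-update logic at the heart of such simulations.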

  14. The Wide Field Imager of the International X-ray Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Stefanescu, A., E-mail: astefan@hll.mpg.d [Max-Planck-Institut Halbleiterlabor, Otto-Hahn-Ring 6, 81739 Muenchen (Germany); Johannes Gutenberg-Universitaet, Inst. f. anorganische und analytische Chemie, 55099 Mainz (Germany); Bautz, M.W. [Kavli Institute for Astrophysics and Space Research, Massachusetts Institute of Technology, Cambridge, MA 02139-4307 (United States); Burrows, D.N. [Department of Astronomy and Astrophysics, Pennsylvania State University, University Park, PA 16802 (United States); Bombelli, L.; Fiorini, C. [Politecnico di Milano, Dipartimento di Elettronica e Informazione, Milano (Italy); INFN Sezione di Milano, Milano (Italy); Fraser, G. [Space Research Centre, Department of Physics and Astronomy, University of Leicester, University Road, Leicester LE1 7RH (United Kingdom); Heinzinger, K. [PNSensor GmbH, Roemerstr. 28, 80803 Muenchen (Germany); Herrmann, S. [Max-Planck-Institut Halbleiterlabor, Otto-Hahn-Ring 6, 81739 Muenchen (Germany); Max-Planck-Institut fuer extraterrestrische Physik, Giessenbachstr., 85748 Garching (Germany); Kuster, M. [Technische Universitaet Darmstadt, Institut fuer Kernphysik, Schlossgartenstr. 9, 64289 Darmstadt (Germany); Lauf, T. [Max-Planck-Institut Halbleiterlabor, Otto-Hahn-Ring 6, 81739 Muenchen (Germany); Max-Planck-Institut fuer extraterrestrische Physik, Giessenbachstr., 85748 Garching (Germany); Lechner, P. [PNSensor GmbH, Roemerstr. 28, 80803 Muenchen (Germany); Lutz, G. [Max-Planck-Institut Halbleiterlabor, Otto-Hahn-Ring 6, 81739 Muenchen (Germany); Max-Planck-Institut fuer Physik, Foehringer Ring 6, 80805 Muenchen (Germany); Majewski, P. [PNSensor GmbH, Roemerstr. 28, 80803 Muenchen (Germany); Meuris, A. [Max-Planck-Institut Halbleiterlabor, Otto-Hahn-Ring 6, 81739 Muenchen (Germany); Max-Planck-Institut fuer extraterrestrische Physik, Giessenbachstr., 85748 Garching (Germany); Murray, S.S. [Harvard/Smithsonian Center for Astrophysics, 60 Garden St., Cambridge, MA 02138 (United States)

    2010-12-11

    The International X-ray Observatory (IXO) will be a joint X-ray observatory mission by ESA, NASA and JAXA. It will have a large effective area (3 m² at 1.25 keV) grazing-incidence mirror system with good angular resolution (5 arcsec at 0.1-10 keV) and will feature a comprehensive suite of scientific instruments: an X-ray Microcalorimeter Spectrometer, a High Time Resolution Spectrometer, an X-ray Polarimeter, an X-ray Grating Spectrometer, a Hard X-ray Imager and a Wide Field Imager. The Wide Field Imager (WFI) has a field of view of 18 arcmin × 18 arcmin. It will be sensitive between 0.1 and 15 keV, offer the full angular resolution of the mirrors and good energy resolution. The WFI will be implemented as a 6 in. wafer-scale monolithic array of 1024 × 1024 pixels of 100 × 100 μm² size. The DEpleted P-channel Field-Effect Transistors (DEPFETs) forming the individual pixels are devices combining the functionalities of both detector and amplifier. Signal electrons are collected in a potential well below the transistor's gate, modulating the transistor current. Even when the device is powered off, the signal charge is collected and kept in the potential well below the gate until it is explicitly cleared. This makes flexible and fast readout modes possible.

  15. Optimizing fixed observational assets in a coastal observatory

    Science.gov (United States)

    Frolov, Sergey; Baptista, António; Wilkin, Michael

    2008-11-01

    The proliferation of coastal observatories necessitates an objective approach to managing observational assets. In this article, we used our experience in the coastal observatory for the Columbia River estuary and plume to identify and address common problems in managing fixed observational assets, such as salinity, temperature, and water level sensors attached to pilings and moorings. Specifically, we addressed the following problems: assessing the quality of an existing array, adding stations to an existing array, removing stations from an existing array, validating an array design, and targeting an array toward data assimilation or monitoring. Our analysis was based on a combination of methods from the oceanographic and statistical literature, mainly on the statistical machinery of the best linear unbiased estimator. The key information required for our analysis was the covariance structure for a field of interest, which was computed from the output of assimilated and non-assimilated models of the Columbia River estuary and plume. The network optimization experiments in the Columbia River estuary and plume proved successful, largely withstanding the scrutiny of sensitivity and validation studies, and hence providing valuable insight into the optimization and operation of the existing observational network. Our success in the Columbia River estuary and plume suggests that algorithms for optimal placement of sensors are reaching maturity and are likely to play a significant role in the design of emerging ocean observatories, such as the United States' Ocean Observatories Initiative (OOI) and Integrated Ocean Observing System (IOOS) observatories, and smaller regional observatories.
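
    The best-linear-unbiased-estimator machinery referred to above can be made concrete: given a field covariance C and observations at sensor set S with noise variance σ², the BLUE (kriging-type) posterior covariance is C − C[:,S](C[S,S] + σ²I)⁻¹C[S,:], and a greedy pass adds the station that most reduces the total error variance. The sketch below uses a synthetic 1-D exponential covariance, not the Columbia River model output:

```python
import numpy as np

def posterior_trace(C, sensors, noise_var=0.01):
    # Total BLUE error variance over the field after observing `sensors`:
    # trace(C - C[:,S] @ inv(C[S,S] + noise*I) @ C[S,:]).
    if not sensors:
        return np.trace(C)
    S = list(sensors)
    Css = C[np.ix_(S, S)] + noise_var * np.eye(len(S))
    Cxs = C[:, S]
    return np.trace(C - Cxs @ np.linalg.solve(Css, Cxs.T))

def greedy_placement(C, n_sensors, noise_var=0.01):
    # Greedily add the station that most reduces the total error variance.
    chosen = []
    for _ in range(n_sensors):
        candidates = [i for i in range(len(C)) if i not in chosen]
        best = min(candidates,
                   key=lambda i: posterior_trace(C, chosen + [i], noise_var))
        chosen.append(best)
    return chosen
```

    The same trace criterion supports the other tasks the article lists: removing a station is scored by how much the trace grows without it, and an existing array is assessed by its trace relative to the greedy optimum.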

  16. Survey and research for the enhancement of large-scale technology development 2. How large-scale technology development should be in the future; Ogata gijutsu kaihatsu suishin no tame no chosa kenkyu. 2. Kongo no ogata gijutsu kaihatsu no arikata

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1981-03-01

    A survey was conducted by interviewing people engaged in industrial technology development at the contractor companies participating in the large-scale industrial technology development system, and people of experience or academic background involved in promoting the projects. Several needs for improvement are pointed out: that the competition principle, based for example on parallel development, be introduced; that research-on-research be practiced so that tasks are instituted effectively; that midway evaluation be strengthened, since evaluation in advance is difficult; that efforts be made to organize new industries utilizing the fruits of large-scale industrial technology for the creation of markets, without inducing economic conflicts; and that the transfer of technologies from the private sector to the public sector be enhanced. Studies are also made of the review of research management systems, the utilization of private-sector research and development capabilities, education about industrial property rights, and the diffusion of the large-scale project system. In this connection, problems are pointed out, requests are submitted, and remedial measures and suggestions are presented. (NEDO)

  17. Findings and Challenges in Fine-Resolution Large-Scale Hydrological Modeling

    Science.gov (United States)

    Her, Y. G.

    2017-12-01

    Fine-resolution large-scale (FL) modeling can provide the overall picture of the hydrological cycle and transport while taking into account unique local conditions in the simulation. It can also help develop water resources management plans that are consistent across spatial scales by describing the spatial consequences of decisions and hydrological events extensively. FL modeling is expected to become common in the near future, as global-scale remotely sensed data are emerging and computing resources have advanced rapidly. Several spatially distributed models are available for hydrological analyses. Some of them rely on numerical methods such as finite difference/element methods (FDM/FEM), which require either excessive computing resources to manipulate large matrices (implicit schemes) or small simulation time intervals to maintain the stability of the solution (explicit schemes) in order to describe two-dimensional overland processes. Others make unrealistic assumptions, such as a constant overland flow velocity, to reduce the computational load of the simulation. Thus, simulation efficiency often comes at the expense of precision and reliability in FL modeling. Here, we introduce a new FL continuous hydrological model and its application to four watersheds of different landscapes and sizes, from 3.5 km² to 2,800 km², at a spatial resolution of 30 m on an hourly basis. The model provided acceptable accuracy statistics in reproducing hydrological observations made in the watersheds. The modeling outputs, including maps of simulated travel time, runoff depth, soil water content, and groundwater recharge, were animated, visualizing the dynamics of hydrological processes occurring in the watersheds during and between storm events. Findings and challenges were discussed in the context of modeling efficiency, accuracy, and reproducibility, which we found can be improved by employing advanced computing techniques and hydrological understandings, by using remotely sensed hydrological
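
    The stability constraint on explicit schemes mentioned above is concrete: forward-Euler central-difference diffusion in 1-D is stable only when r = D·Δt/Δx² ≤ 1/2, which is why fine grids force tiny time steps. A small demonstration with illustrative parameters:

```python
import numpy as np

def diffuse_explicit(u0, D, dx, dt, steps):
    # Forward-Euler, central-difference diffusion with fixed (Dirichlet) ends.
    # Stable only if r = D*dt/dx**2 <= 0.5.
    u = u0.copy()
    r = D * dt / dx**2
    for _ in range(steps):
        u[1:-1] = u[1:-1] + r * (u[2:] - 2.0 * u[1:-1] + u[:-2])
    return u

x = np.linspace(0.0, 1.0, 51)
u0 = np.exp(-100.0 * (x - 0.5) ** 2)
dx = x[1] - x[0]
stable = diffuse_explicit(u0, D=1.0, dx=dx, dt=0.4 * dx**2, steps=500)    # r = 0.4
unstable = diffuse_explicit(u0, D=1.0, dx=dx, dt=0.6 * dx**2, steps=500)  # r = 0.6
```

    Halving Δx for a finer grid quarters the admissible Δt, so at 30 m resolution over thousands of square kilometres this constraint dominates the cost of explicit overland-flow schemes.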

  18. Research on unit commitment with large-scale wind power connected power system

    Science.gov (United States)

    Jiao, Ran; Zhang, Baoqun; Chi, Zhongjun; Gong, Cheng; Ma, Longfei; Yang, Bing

    2017-01-01

    Large-scale integration of wind power generators into the power grid brings severe challenges to power system economic dispatch due to their stochastic volatility. Unit commitment including wind farms is analyzed from the two aspects of modeling and solving methods. The structures and characteristics are summarized after classification according to different objective functions and constraints. Finally, the issues to be solved and possible directions of future research and development are discussed, which can adapt to the requirements of the electricity market, energy-saving generation dispatch and the smart grid, providing a reference for the research and practice of workers in this field.

  19. Contributions of the "Great" X-Ray Observatories (XMM-Newton and Chandra) to Astronomy and Astrophysics

    Science.gov (United States)

    Weisskopf, Martin

    2011-01-01

    NASA's Chandra X-ray Observatory and ESA's XMM-Newton made their first observations over a decade ago. The unprecedented and complementary capabilities of these observatories to detect, image, and measure the energy of cosmic X-rays, achieved less than 50 years after the first detection of an extra-solar X-ray source, represent an increase in sensitivity comparable to going from naked-eye observations to the most powerful optical telescopes over the past 400 years. In this presentation we highlight some of the many discoveries made using these powerful X-ray observatories that have transformed 21st century astronomy. We briefly discuss future prospects for this truly exciting field.

  20. OSoMe: the IUNI observatory on social media

    Directory of Open Access Journals (Sweden)

    Clayton A. Davis

    2016-10-01

    Full Text Available The study of social phenomena is becoming increasingly reliant on big data from online social networks. Broad access to social media data, however, requires software development skills that not all researchers possess. Here we present the IUNI Observatory on Social Media, an open analytics platform designed to facilitate computational social science. The system leverages a historical, ongoing collection of over 70 billion public messages from Twitter. We illustrate a number of interactive open-source tools to retrieve, visualize, and analyze derived data from this collection. The Observatory, now available at osome.iuni.iu.edu, is the result of a large, six-year collaborative effort coordinated by the Indiana University Network Science Institute.

  1. Red Geyser: A New Class of Galaxy with Large-scale AGN-driven Winds

    Science.gov (United States)

    Roy, Namrata; Bundy, Kevin; Cheung, Edmond; MaNGA Team

    2018-01-01

    A new class of quiescent (non-star-forming) galaxies harboring possible AGN-driven winds has been discovered using the spatially resolved optical spectroscopy from the ongoing SDSS-IV MaNGA (Sloan Digital Sky Survey-IV Mapping Nearby Galaxies at Apache Point Observatory) survey. These galaxies, named "red geysers", constitute 5%-10% of the local quiescent galaxy population and are characterized by narrow bisymmetric ionized gas emission patterns. These enhanced patterns are seen in equivalent width maps of Hα, [OIII] and other strong emission lines. They are co-aligned with the ionized gas velocity gradients but significantly misaligned with stellar velocity gradients. They also show very high gas velocity dispersions (~200 km/s). Considering these observations in light of models of the gravitational potential, Cheung et al. argued that red geysers host large-scale AGN-driven winds of ionized gas that may play a role in suppressing star formation at late times. In this work, we test the hypothesis that AGN activity is ultimately responsible for the red geyser phenomenon. We compare the nuclear radio activity of the red geysers to a matched control sample of galaxies of similar stellar mass, redshift, rest-frame NUV–r color and axis ratio, and additionally control for the presence of ionized gas. We have used 1.4 GHz radio continuum data from the VLA FIRST Survey to stack the radio flux from the red geyser sample and the control sample. We find that the red geysers have a higher average radio flux than the control galaxies at > 3σ significance. Our sample is restricted to rest-frame NUV–r color > 5, thus ruling out possible radio emission due to star formation activity. We conclude that red geysers are associated with more active AGN, supporting a feedback picture in which episodic AGN activity drives large-scale but relatively weak ionized winds in many early-type galaxies.

  2. Future-oriented maintenance strategy based on automated processes is finding its way into large astronomical facilities at remote observing sites

    Science.gov (United States)

    Silber, Armin; Gonzalez, Christian; Pino, Francisco; Escarate, Patricio; Gairing, Stefan

    2014-08-01

    With expanding sizes and increasing complexity of large astronomical observatories at remote observing sites, the call for an efficient and resource-saving maintenance concept becomes louder. The increasing number of subsystems on telescopes and instruments forces large observatories, as in industry, to rethink conventional maintenance strategies to reach this demanding goal. The implementation of fully or semi-automatic processes for standard service activities can help to keep the number of operating staff at an efficient level and to significantly reduce the consumption of valuable consumables and equipment. In this contribution we demonstrate, using the example of the 80 cryogenic subsystems of the ALMA Front End instrument, how an implemented automatic service process increases the availability of spare parts and Line Replaceable Units, and how valuable staff resources can be freed from continuous repetitive maintenance activities to allow more focus on system diagnostic tasks, troubleshooting and the interchange of Line Replaceable Units. The required service activities are decoupled from the day-to-day work, eliminating dependencies on workload peaks or logistical constraints. The automatic refurbishing processes run in parallel to the operational tasks with constant quality and without compromising the performance of the serviced system components. Consequently, this results in an efficiency increase and less downtime, and keeps the observing schedule on track. Automatic service processes in combination with proactive maintenance concepts provide the necessary flexibility for the complex operational work structures of large observatories. The gained planning flexibility allows an optimization of operational procedures and sequences by considering the required cost efficiency.

  3. Recent Ultra High Energy neutrino bounds and multimessenger observations with the Pierre Auger Observatory

    Science.gov (United States)

    Zas, Enrique

    2018-01-01

    The overall picture of the highest energy particles produced in the Universe is changing because of measurements made with the Pierre Auger Observatory. Composition studies of cosmic rays point towards an unexpected mixed composition of intermediate mass nuclei, more isotropic than anticipated, which is reshaping the future of the field and underlining the priority to understand composition at the highest energies. The Observatory is competitive in the search for neutrinos of all flavors above about 100 PeV by looking for very inclined showers produced deep in the atmosphere by neutrinos interacting either in the atmosphere or in the Earth's crust. It covers a large field of view between -85° and 60° declination in equatorial coordinates. Neutrinos are expected because of the existence of ultra high energy cosmic rays. They provide valuable complementary information, their fluxes being sensitive to the primary cosmic ray masses and their directions reflecting the source positions. We report the results of the neutrino search providing competitive bounds to neutrino production and strong constraints to a number of production models including cosmogenic neutrinos due to ultra high energy protons. We also report on two recent contributions of the Observatory to multimessenger studies by searching for correlations of neutrinos both with cosmic rays and with gravitational waves. The correlations of the directions of the highest energy astrophysical neutrinos discovered with IceCube with the highest energy cosmic rays detected with the Auger Observatory and the Telescope Array revealed an excess that is not statistically significant and is being monitored. The targeted search for neutrinos correlated with the discovery of the gravitational wave events GW150914 and GW151226 with advanced LIGO has led to the first bounds on the energy emitted by black hole mergers in Ultra-High Energy Neutrinos.

  4. Recent Ultra High Energy neutrino bounds and multimessenger observations with the Pierre Auger Observatory

    Directory of Open Access Journals (Sweden)

    Zas Enrique

    2017-01-01

    Full Text Available The overall picture of the highest energy particles produced in the Universe is changing because of measurements made with the Pierre Auger Observatory. Composition studies of cosmic rays point towards an unexpected mixed composition of intermediate mass nuclei, more isotropic than anticipated, which is reshaping the future of the field and underlining the priority to understand composition at the highest energies. The Observatory is competitive in the search for neutrinos of all flavors above about 100 PeV by looking for very inclined showers produced deep in the atmosphere by neutrinos interacting either in the atmosphere or in the Earth's crust. It covers a large field of view between −85° and 60° declination in equatorial coordinates. Neutrinos are expected because of the existence of ultra high energy cosmic rays. They provide valuable complementary information, their fluxes being sensitive to the primary cosmic ray masses and their directions reflecting the source positions. We report the results of the neutrino search providing competitive bounds to neutrino production and strong constraints to a number of production models including cosmogenic neutrinos due to ultra high energy protons. We also report on two recent contributions of the Observatory to multimessenger studies by searching for correlations of neutrinos both with cosmic rays and with gravitational waves. The correlations of the directions of the highest energy astrophysical neutrinos discovered with IceCube with the highest energy cosmic rays detected with the Auger Observatory and the Telescope Array revealed an excess that is not statistically significant and is being monitored. The targeted search for neutrinos correlated with the discovery of the gravitational wave events GW150914 and GW151226 with advanced LIGO has led to the first bounds on the energy emitted by black hole mergers in Ultra-High Energy Neutrinos.

  5. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Full Text Available Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical applications, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning in RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the sufficient training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way of obtaining weak classifiers. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are available.
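    The Discrete AdaBoost step that DAB-RVM builds on can be sketched generically. The helper below is a hypothetical illustration of one boosting round with an arbitrary weak learner; it is not the authors' RVM-specific code:

```python
import math

def adaboost_round(weights, errors):
    """One Discrete AdaBoost reweighting step (generic sketch).
    weights[i]: current weight of sample i (positive, any scale).
    errors[i]:  1 if the current weak learner misclassified sample i, else 0.
    Assumes 0 < weighted error < 1."""
    eps = sum(w for w, e in zip(weights, errors) if e) / sum(weights)
    alpha = 0.5 * math.log((1 - eps) / eps)      # weak learner's vote weight
    # up-weight misclassified samples, down-weight correct ones
    new_w = [w * math.exp(alpha if e else -alpha)
             for w, e in zip(weights, errors)]
    z = sum(new_w)                               # renormalise to sum to 1
    return alpha, [w / z for w in new_w]

# Four equally weighted samples, one misclassified:
alpha, w = adaboost_round([0.25, 0.25, 0.25, 0.25], [1, 0, 0, 0])
```

    After one round the single misclassified sample carries half the total weight, which is what forces the next weak learner to attend to it.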

  6. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into a Bayesian hierarchical model framework and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over traditional approaches using simulations and OMICS data analysis.
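    The variance-for-bias trade that regularized covariance estimation makes can be illustrated with a simple shrinkage-toward-diagonal sketch (a generic regularizer in the same spirit, not the authors' Bayesian hierarchical model):

```python
def shrink_covariance(sample_cov, shrinkage):
    """Convex combination of a sample covariance matrix with a
    diagonal target that keeps the variances and zeroes the
    covariances. shrinkage in [0, 1]: 0 = raw sample estimate,
    1 = fully diagonal. Generic illustration only."""
    p = len(sample_cov)
    return [[(1 - shrinkage) * sample_cov[i][j] +
             (shrinkage * sample_cov[i][i] if i == j else 0.0)
             for j in range(p)] for i in range(p)]

# 2x2 example: variances are preserved, covariances are damped.
S = [[2.0, 0.8], [0.8, 1.0]]
S_hat = shrink_covariance(S, shrinkage=0.5)
```

    Off-diagonal entries, the noisiest part of a high-dimensional sample covariance, are scaled by (1 - shrinkage), reducing estimator variance at the cost of some bias.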

  7. A pilot Virtual Observatory (pVO) for integrated catchment science - Demonstration of national scale modelling of hydrology and biogeochemistry (Invited)

    Science.gov (United States)

    Freer, J. E.; Bloomfield, J. P.; Johnes, P. J.; MacLeod, C.; Reaney, S.

    2010-12-01

    There are many challenges in developing effective and integrated catchment management solutions for hydrology and water quality issues. Such solutions should ideally build on current scientific evidence to inform policy makers and regulators, and additionally allow stakeholders to take ownership of local and/or national issues, in effect bringing together ‘communities of practice’. A strategy being piloted in the UK as the Pilot Virtual Observatory (pVO), funded by NERC, is to demonstrate the use of cyber-infrastructure and cloud computing resources to investigate better methods of linking data and models and to demonstrate scenario analysis for research, policy and operational needs. The research will provide new ways for the scientific and stakeholder communities to come together to exploit current environmental information, knowledge and experience in an open framework. This poster presents the project scope and methodologies for the pVO work dealing with national modelling of hydrology and macro-nutrient biogeochemistry. We evaluate the strategies needed to robustly benchmark our current predictive capability for these resources through ensemble modelling. We explore the use of catchment similarity concepts to understand whether national monitoring programmes can inform us about the behaviour of catchments. We discuss the challenges of applying these strategies in an open-access and integrated framework, and finally we consider the future of such virtual observatory platforms for improving the way we iteratively refine our understanding of catchment science.

  8. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  9. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  10. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities which lead to the formation of large eddies is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  11. The influence of large-scale electricity generation on man and environment. Ch. 2

    International Nuclear Information System (INIS)

    Brugghen, F.W. van der; Koops, F.B.J.; Steen, J. van der.

    1990-01-01

    In order to make a reliable assessment of the pros and cons of nuclear power, this chapter considers the risks that large-scale generation of electricity in general poses for man and environment. The treatment of this question is limited to the use of fossil fuel on the one hand and nuclear power on the other. Renewable energy sources such as sun and wind are not considered. These are important energy sources, but their planned further application will not replace the contribution of large power plants to the Dutch electricity need in the near future. For the time being, renewable energy sources therefore do not play a role in the choice of the type of nuclear reactors to be built in the near future. This limitation, however, does not alter the fact that each contribution to decreasing the burden on the environment, whether it concerns energy saving or the application of renewable energy sources, should be encouraged. When such alternatives can be realized technically and economically, they will certainly form a welcome supplement to the existing power-generation techniques. (author). 6 figs

  12. VULCANO: a large-scale UO2 program to study corium behaviour and cooling for future reactors

    International Nuclear Information System (INIS)

    Cognet, G.; Bouchter, J.C.

    1994-01-01

    The CEA has launched the VULCANO project, a large experimental facility whose objectives are the understanding of corium behaviour from core melting up to vessel melt-through, and the qualification of core-catcher concepts. This paper deals with the strategy adopted to overcome the difficulties of such experiments (use of real materials such as UO2, controlled temperature and flowrate...); in particular, it describes the feasibility studies undertaken on corium production and on sustained heating within the melt (microwaves). Some indications are also given on scaling studies for experiments devoted to vessel integrity. 7 figs., 3 refs

  13. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  14. The Very Large Array Data Processing Pipeline

    Science.gov (United States)

    Kent, Brian R.; Masters, Joseph S.; Chandler, Claire J.; Davis, Lindsey E.; Kern, Jeffrey S.; Ott, Juergen; Schinzel, Frank K.; Medlin, Drew; Muders, Dirk; Williams, Stewart; Geers, Vincent C.; Momjian, Emmanuel; Butler, Bryan J.; Nakazato, Takeshi; Sugimoto, Kanako

    2018-01-01

    We present the VLA Pipeline, software that is part of the larger pipeline processing framework used for the Karl G. Jansky Very Large Array (VLA) and the Atacama Large Millimeter/submillimeter Array (ALMA) for both interferometric and single-dish observations. Through a collection of base code jointly used by the VLA and ALMA, the pipeline builds a hierarchy of classes to execute individual atomic pipeline tasks within the Common Astronomy Software Applications (CASA) package. Each pipeline task contains heuristics designed by the team to actively decide the best processing path and execution parameters for calibration and imaging. The pipeline code is developed and written in Python and uses a "context" structure for tracking the heuristic decisions and processing results. The pipeline "weblog" acts as the user interface for verifying the quality assurance of each calibration and imaging stage. The majority of VLA scheduling blocks above 1 GHz are now processed with the standard continuum recipe of the pipeline and offer a calibrated measurement set as a basic data product to observatory users. In addition, the pipeline is used for processing data from the VLA Sky Survey (VLASS), a seven-year community-driven endeavor started in September 2017 to survey the entire sky down to a declination of -40 degrees at S-band (2-4 GHz). This 5500-hour next-generation large radio survey will explore the time and spectral domains, relying on pipeline processing to generate calibrated measurement sets, polarimetry, and imaging data products that are available to the astronomical community with no proprietary period. Here we present an overview of the pipeline design philosophy, heuristics, and calibration and imaging results produced by the pipeline. Future development will include the testing of spectral-line recipes, low signal-to-noise heuristics, and serving as a testing platform for science-ready data products. The pipeline is developed as part of the CASA software package by an

  15. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  16. Statistics and Dynamics in the Large-scale Structure of the Universe

    International Nuclear Information System (INIS)

    Matsubara, Takahiko

    2006-01-01

    In cosmology, observations and theories are in most cases related to each other by statistics. Statistical methods play central roles especially in analyzing fluctuations in the universe, which are the seeds of its present structure. The confrontation of statistics and dynamics is one of the key methods to unveil the structure and evolution of the universe. I will review some of the major statistical methods in cosmology, in connection with linear and nonlinear dynamics of the large-scale structure of the universe. The present status of analyses of observational data such as the Sloan Digital Sky Survey, and the future prospects for constraining the nature of exotic components of the universe such as dark energy, will be presented.

  17. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Full Text Available Scaled-down models are widely used for experimental investigations of large structures due to limitations in the capacities of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e. geometry, loading and properties) between the model and a large structural element such as those present in huge existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, representing the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is made, and the results are compared with those obtained from full-scale structure numerical computations. The effect of scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
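    Once the three fundamental scale factors are chosen, dimensional analysis fixes every derived factor. A sketch of the standard static-elastic similitude relations (the specific factors used in the paper are not given in the abstract; this is a generic textbook illustration):

```python
import math

def similitude_factors(length_scale, density_scale=1.0, stress_scale=1.0):
    """Derived scale factors (model/prototype) from dimensional analysis.
    For static elastic similitude with the same gravity field:
      force ~ stress * length^2
      mass  ~ density * length^3
      time  ~ length * sqrt(density / stress)
    Generic sketch, not the paper's specific factor set."""
    return {
        "force": stress_scale * length_scale ** 2,
        "mass": density_scale * length_scale ** 3,
        "time": length_scale * math.sqrt(density_scale / stress_scale),
    }

# A 1:10 model built from the prototype material
# (density_scale = stress_scale = 1):
f = similitude_factors(length_scale=0.1)
```

    With the prototype material, a 1:10 model carries 1/100 of the prototype forces and 1/1000 of its mass, which is why model loading rigs can be far smaller than the full-scale structure.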

  18. Preparing the Plate Boundary Observatory GNSS Network for the Future

    Science.gov (United States)

    Austin, K. E.; Walls, C. P.; Dittman, T.; Mann, D.; Boyce, E. S.; Basset, A.; Woolace, A. C.; Turner, R.; Lawrence, S.; Rhoades, S.; Pyatt, C.; Willoughby, H.; Feaux, K.; Mattioli, G. S.

    2017-12-01

    The EarthScope Plate Boundary Observatory (PBO) GNSS network, funded by the NSF and operated by UNAVCO, comprises 1100 permanent GPS and GNSS stations spanning three principal tectonic regimes and administered under distinct management. The GPS-only network was initially designed for daily data file downloads, primarily for tectonic analysis. This low data volume requirement and circa-2004 IP-based cellular/VSat modems provided significant freedom for station placement and enabled science-targeted installation of stations in some of the most remote and geologically interesting areas. Community requests for high-rate data downloads for GNSS seismology, airborne LiDAR surveys, meteorological/GNSS/seismic real-time data flow and other demands, however, require significantly increased bandwidth beyond the 5-20 kB/s transfer rates of the original design. Since the close of construction in September 2008, PBO enhancements have been implemented through additional funding by the NSF (ARRA/Cascadia), NOAA, and NASA, and in collaboration with stakeholders such as Caltrans, ODOT, Scripps, and the USGS. Today, only 18 of the original cell modems remain, with 601 upgraded cell modems providing 3G/4G/LTE data communications that support transfer rates from 80-400 kB/s. Radio network expansion and upgrades continue to harden communications using both 2.4 GHz and 5.8 GHz radios; 78 VSAT and 5 manual download sites remain. PBO-wide, the network capabilities for 1 Hz and 5 Hz downloads or low-latency 1 Hz streaming cover 85%, 80% and 65% of PBO stations, respectively, with 708 active 1 Hz streams. Vaisala meteorological instruments are located at 140 sites, most of which stream GPS/Met data in real time. GPS-only receivers are being replaced with GNSS receivers and antennas. Today, there are 279 stations in the PBO network with either GLONASS-enabled Trimble NetR9 or full-GNSS-constellation Septentrio PolaRx5 receivers. Just as the scale and

  19. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C; they exhibit superior electrochemical performance to graphite. Highlights: • Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. • The preparation is simple, effective and eco-friendly. • The in situ yielded MgO nanocrystals promote the graphitization. • The HGCNSs exhibit superior electrochemical performance to graphite.

  20. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. There is abundant evidence not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crucial in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Though no longer occurring with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  1. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.
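The claim that global routing tables and spanning-tree algorithms can be avoided can be illustrated with a toy address-digit forwarding rule (an illustrative stand-in, not Dress's actual message-flow scheme): each distribution module forwards on the output port named by one base-radix digit of the destination address, so no module needs network-wide state.

```python
def route(dest, radix=8, levels=4):
    """Port choices for a message to endpoint `dest` in a hypothetical network
    of `levels` stages of `radix`-way distribution modules: at stage i, forward
    on the port named by base-`radix` digit i of the address.  Every module can
    forward using only the digits carried in the message header."""
    ports = []
    a = dest
    for _ in range(levels):
        ports.append(a % radix)
        a //= radix
    if a:
        raise ValueError("address out of range for this network size")
    return ports
```

With radix 8 and four levels this toy network already addresses 8^4 = 4096 endpoints; growing the radix or depth reaches the "thousands to millions" of hosts mentioned, still without any module holding global state.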

  2. Providing Undergraduate Research Opportunities Through the World Rivers Observatory Collaborative Network

    Science.gov (United States)

    Gillies, S. L.; Marsh, S. J.; Janmaat, A.; Peucker-Ehrenbrink, B.; Voss, B.; Holmes, R. M.

    2013-12-01

    Successful research collaboration exists between the University of the Fraser Valley (UFV), a primarily undergraduate-serving university located on the Fraser River in British Columbia, and the World Rivers Observatory that is coordinated through the Woods Hole Oceanographic Institution (WHOI) and the Woods Hole Research Center (WHRC). The World Rivers Observatory coordinates time-series sampling of 15 large rivers, with particular focus on the large Arctic rivers, the Ganges-Brahmaputra, Congo, Fraser, Yangtze (Changjiang), Amazon, and Mackenzie River systems. The success of this international observatory critically depends on the participation of local collaborators, such as UFV, that are necessary in order to collect temporally resolved data from these rivers. Several faculty members and undergraduate students from the Biology and Geography Departments of UFV received on-site training from the lead-PIs of the Global Rivers Observatory. To share information and ensure good quality control of sampling methods, WHOI and WHRC hosted two international workshops at Woods Hole for collaborators. For the past four years, faculty and students from UFV have been collecting a variety of bi-monthly water samples from the Fraser River for the World Rivers Observatory. UFV undergraduate students who become involved learn proper sampling techniques and are given the opportunity to design and conduct their own research. Students have collected, analyzed and presented data from this project at regional, national, and international scientific meetings. UFV undergraduate students have also been hosted by WHOI and WHRC as guest students to work on independent research projects. While at WHOI and WHRC, students are able to conduct research using state-of-the-art specialized research facilities not available at UFV.

  3. Morro Azul Observatory: A New Center for Teaching and Popularization of Astronomy.

    Science.gov (United States)

    Bretones, Paulo Sergio; Cardoso de Oliveira, Vladimir

    2002-08-01

    In 1999, the Instituto Superior de Ciências Aplicadas (ISCA Faculdades de Limeira) started a project to build an observatory and initiate several astronomy related activities in the city of Limeira and region (São Paulo state) with the aim of teaching and popularizing astronomy. After hiring teachers, a technician and an intern, the Morro Azul Observatory was inaugurated in March 2000 as a part of the geosciences department of ISCA Faculdades. This poster describes the development phases of the Observatory, the activities initiated by the Observatory, and assesses the impact of the project. Several issues will be discussed such as the criteria for choosing the site, buildings, instruments, group visits, and particularly the goals that were reached. The Observatory, as described here, serves as a model for other centers with the same purpose in the country. The achievements of this project include the creation of two astronomical disciplines for the geography course and liaisons with other courses such as tourism, pedagogy, social communication and engineering. New activities were initiated, educational materials created, and the Observatory is now part of the region's teaching network and is in contact with other Brazilian and foreign centers. This poster presents the results from report analyses, visitor records, the local media, goal strategy assessment, and the current state of the project. It concludes with an evaluation of the social commitment of the Observatory, its initiatives for the constant renewal and growth of the project, its policy of maintaining the activities and interchange with other national and international astronomy centers, and the future perspectives in terms of its contribution for the research in science education.

  4. Inventing a space mission the story of the Herschel space observatory

    CERN Document Server

    Minier, Vincent; Bontems, Vincent; de Graauw, Thijs; Griffin, Matt; Helmich, Frank; Pilbratt, Göran; Volonte, Sergio

    2017-01-01

    This book describes prominent technological achievements within a very successful space science mission: the Herschel space observatory. Focusing on the various processes of innovation it offers an analysis and discussion of the social, technological and scientific context of the mission that paved the way to its development. It addresses the key question raised by these processes in our modern society, i.e. how does knowledge management of innovation set the conditions for inventing the future? In that respect the book is based on a transdisciplinary analysis of the programmatic complexity of Herschel, with inputs from space scientists, managers, philosophers, and engineers. This book is addressed to decision makers, not only in space science, but also in other industries and sciences using or building large machines. It is also addressed to space engineers and scientists as well as students in science and management.

  5. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, time pressure is enormous and is compounded by adverse weather conditions or darkness. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are sensible, as is the analysis of previous large-scale Alpine accidents.

  6. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large scale structures is considered within a string model on a toroidal space-time. First, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large scale distribution and with the theory of a Cantorian space-time.

  7. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  8. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  9. Promotion orientation explains why future-oriented people exercise and eat healthy: evidence from the two-factor consideration of future consequences-14 scale.

    Science.gov (United States)

    Joireman, Jeff; Shaffer, Monte J; Balliet, Daniel; Strathman, Alan

    2012-10-01

    The authors extended research linking individual differences in consideration of future consequences (CFC) with health behaviors by (a) testing whether individual differences in regulatory focus would mediate that link and (b) highlighting the value of a revised, two-factor CFC-14 scale with subscales assessing concern with future consequences (CFC-Future) and concern with immediate consequences (CFC-Immediate) proper. Exploratory and confirmatory factor analyses of the revised CFC-14 scale supported the presence of two highly reliable factors (CFC-Future and CFC-Immediate; αs from .80 to .84). Moreover, structural equation modeling showed that those high in CFC-Future engage in exercise and healthy eating because they adopt a promotion orientation. Future use of the two-factor CFC-14 scale is encouraged to shed additional light on how concern with future and concern with immediate consequences (proper) differentially impact the way people resolve a host of intertemporal dilemmas (e.g., health, financial, and environmental behavior).
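The subscale reliabilities quoted above (αs from .80 to .84) are Cronbach's alpha coefficients. A minimal sketch of the computation, on hypothetical item scores rather than the actual CFC-14 data:

```python
def cronbach_alpha(items):
    """Cronbach's alpha for a scale: `items` is a list of item-score columns
    (one inner list per item, one entry per respondent).
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total scores))"""
    k = len(items)
    n = len(items[0])

    def var(xs):  # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)

    totals = [sum(col[i] for col in items) for i in range(n)]
    return k / (k - 1) * (1.0 - sum(var(col) for col in items) / var(totals))
```

Perfectly consistent items give alpha = 1; values of .80 to .84 indicate high but not perfect internal consistency of the CFC-Future and CFC-Immediate subscales.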

  10. Implementing effect of energy efficiency supervision system for government office buildings and large-scale public buildings in China

    International Nuclear Information System (INIS)

    Zhao Jing; Wu Yong; Zhu Neng

    2009-01-01

    The Chinese central government released a document to initiate a task of energy efficiency supervision system construction for government office buildings and large-scale public buildings in 2007, which marks the overall start of existing buildings energy efficiency management in China with the government office buildings and large-scale public buildings as a breakthrough. This paper focused on the implementing effect in the demonstration region all over China for less than one year, firstly introduced the target and path of energy efficiency supervision system, then described the achievements and problems during the implementing process in the first demonstration provinces and cities. A certain data from the energy efficiency public notice in some typical demonstration provinces and cities were analyzed statistically. It can be concluded that different functional buildings have different energy consumption and the average energy consumption of large-scale public buildings is too high in China compared with the common public buildings and residential buildings. The obstacles need to be overcome afterward were summarized and the prospects for the future work were also put forward in the end.

  11. Implementing effect of energy efficiency supervision system for government office buildings and large-scale public buildings in China

    Energy Technology Data Exchange (ETDEWEB)

    Zhao Jing [School of Environmental Science and Engineering, Tianjin University, Tianjin 300072 (China)], E-mail: zhaojing@tju.edu.cn; Wu Yong [Department of Science and Technology, Ministry of Housing and Urban-Rural Development of the People' s Republic of China, Beijing 100835 (China); Zhu Neng [School of Environmental Science and Engineering, Tianjin University, Tianjin 300072 (China)

    2009-06-15

    The Chinese central government released a document to initiate a task of energy efficiency supervision system construction for government office buildings and large-scale public buildings in 2007, which marks the overall start of existing buildings energy efficiency management in China with the government office buildings and large-scale public buildings as a breakthrough. This paper focused on the implementing effect in the demonstration region all over China for less than one year, firstly introduced the target and path of energy efficiency supervision system, then described the achievements and problems during the implementing process in the first demonstration provinces and cities. A certain data from the energy efficiency public notice in some typical demonstration provinces and cities were analyzed statistically. It can be concluded that different functional buildings have different energy consumption and the average energy consumption of large-scale public buildings is too high in China compared with the common public buildings and residential buildings. The obstacles need to be overcome afterward were summarized and the prospects for the future work were also put forward in the end.

  13. The architecture of Hamburg-Bergedorf Observatory 1906 - 1912, compared with other observatories (German Title: Die Architektur der Hamburg-Bergedorfer Sternwarte 1906 - 1912 im Vergleich mit anderen Observatorien)

    Science.gov (United States)

    Müller, Peter

    The foundation of the astrophysical observatories in Potsdam-Telegrafenberg in 1874, in Meudon near Paris in 1875 and on Mount Hamilton in California in 1875 resulted in a complete change of observatory architecture. Astrometry had become irrelevant; meridian halls, i.e. an exact north-south orientation, were no longer necessary. A location in the centre of a (university) town was disadvantageous, due to vibrations caused by traffic and artificial light at night. New principles were defined: considerable distance from the city center, a secluded and exposed position (on a mountain) and construction of pavilions: inside a park, a pavilion was built for each instrument. Other observatories of this type are: Pic du Midi in the French Pyrenees, built from 1878 as the first permanent observatory in the high mountains; Nice, Mont Gros (1879); Brussels, Uccle (1883); Edinburgh, Blackford Hill (1892); Heidelberg, Königstuhl (1896); Barcelona, Tibidabo (1902). The original Hamburg Observatory was a modest rectangular building near the Millerntor; in 1833 it became a State institute. From 1906, a spacious complex was erected in Bergedorf, 20 km southeast of the city center. Except for the unavailable position on a mountain, this complex fulfilled all principles of a modern observatory: pavilion architecture in a park, in an elegant neo-baroque style designed by Albert Erbe (architect of the new Hamburger Kunsthalle with its cupola). At the Hamburg Observatory the domed structures were cleverly hierarchised, leaving an open view to the south. At the beginning, astrometry and astrophysics were equally important; there was still a meridian circle. Apart from that, the instruments were manifold: a large refractor of 0.60 m (installed by Repsold/Hamburg, 9 m focal length) and a large reflector of 1 m (Zeiss/Jena, 3 m focal length). Both were the largest instruments of their kind in the German Empire. In addition, there was the Lippert Astrograph on an elegant polar

  14. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; et al.

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  15. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value of occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used, showing an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale performed as well as other published scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
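The scoring rule described in this abstract is simple enough to state as code: one point per abnormal item among the three NIHSS-derived findings, with the reported cut point of ≥2 flagging a possible large vessel occlusion.

```python
def pass_score(loc_abnormal, gaze_abnormal, arm_weak):
    """PASS = number of abnormal findings among the three NIHSS-derived items:
    level of consciousness (month/age questions), gaze palsy/deviation,
    and arm weakness."""
    return sum(map(bool, (loc_abnormal, gaze_abnormal, arm_weak)))

def suggests_elvo(score, cut=2):
    """The abstract's optimal cut point: PASS >= 2 flags a possible ELVO."""
    return score >= cut
```

At this cut point the abstract reports sensitivity 0.66 and specificity 0.83, i.e. the rule trades some missed occlusions for few false alarms.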

  16. Climatology at the Roque de LOS Muchachos Observatory

    Science.gov (United States)

    Varela, Antonia M.; Muñoz-Tuñón, Casiana

    2009-09-01

    The Roque de los Muchachos Observatory (ORM) at La Palma (Canary Islands) is one of the top pre-selected sites for hosting the future European Extremely Large Telescope (E-ELT), the others being Ventarrones (Chile), Macón (Argentina) and Aklim (Morocco). Meteorological and seeing conditions are crucial both for the site selection and for telescope design and feasibility studies for adaptive optics. The ELTs will be very sensitive to wind behaviour when operating in open air; ground-level wind velocity and wind gusts are therefore also required for assessing the feasibility of the telescope construction. Here we analyze the wind speed and direction, air temperature, relative humidity and barometric pressure statistics obtained from data recorded at different sites at the ORM by several Automatic Weather Stations (AWS) since 1985, for day- and night-time separately. Ground wind speed regimes (775 mbar) are compared with those provided by satellites from 200 to 700 mbar. There is also observational evidence of a correlation between the seeing and the wind speed and direction, which will be discussed in this work.

  17. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  18. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  19. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. The memory-shared architecture is used to construct the similarity matrix, and the distributed system is taken for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partition and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
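The data-partition idea described here, in which each worker fills a private block of the similarity matrix so that no communication is needed during construction, can be sketched as follows. The similarity is the negative squared Euclidean distance conventionally used with affinity propagation; the partitioning helper is a hypothetical illustration, not the paper's code:

```python
def similarity(a, b):
    """Similarity conventionally used with affinity propagation:
    negative squared Euclidean distance."""
    return -sum((x - y) ** 2 for x, y in zip(a, b))

def partition_rows(n_rows, n_workers):
    """Contiguous row blocks whose sizes differ by at most one, so each
    process writes a private slice of the shared matrix and no communication
    is needed while the blocks are being filled."""
    base, extra = divmod(n_rows, n_workers)
    blocks, start = [], 0
    for w in range(n_workers):
        size = base + (1 if w < extra else 0)
        blocks.append(range(start, start + size))
        start += size
    return blocks

def similarity_block(data, rows):
    """The work of a single process: its assigned rows of the similarity matrix."""
    return {i: [similarity(data[i], x) for x in data] for i in rows}
```

On a shared-memory machine each worker would call `similarity_block` on its own row range in parallel; the reduction step then only has to concatenate disjoint blocks.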

  20. Analyzing the cosmic variance limit of remote dipole measurements of the cosmic microwave background using the large-scale kinetic Sunyaev Zel'dovich effect

    Energy Technology Data Exchange (ETDEWEB)

    Terrana, Alexandra; Johnson, Matthew C. [Department of Physics and Astronomy, York University, Toronto, Ontario, M3J 1P3 (Canada); Harris, Mary-Jean, E-mail: aterrana@perimeterinstitute.ca, E-mail: mharris8@perimeterinstitute.ca, E-mail: mjohnson@perimeterinstitute.ca [Perimeter Institute for Theoretical Physics, Waterloo, Ontario N2L 2Y5 (Canada)

    2017-02-01

    Due to cosmic variance we cannot learn any more about large-scale inhomogeneities from the primary cosmic microwave background (CMB) alone. More information on large scales is essential for resolving large angular scale anomalies in the CMB. Here we consider cross correlating the large-scale kinetic Sunyaev Zel'dovich (kSZ) effect and probes of large-scale structure, a technique known as kSZ tomography. The statistically anisotropic component of the cross correlation encodes the CMB dipole as seen by free electrons throughout the observable Universe, providing information about long wavelength inhomogeneities. We compute the large angular scale power asymmetry, constructing the appropriate transfer functions, and estimate the cosmic variance limited signal to noise for a variety of redshift bin configurations. The signal to noise is significant over a large range of power multipoles and numbers of bins. We present a simple mode counting argument indicating that kSZ tomography can be used to estimate more modes than the primary CMB on comparable scales. A basic forecast indicates that a first detection could be made with next-generation CMB experiments and galaxy surveys. This paper motivates a more systematic investigation of how close to the cosmic variance limit it will be possible to get with future observations.
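The paper's mode-counting argument can be made concrete with a back-of-the-envelope count: a single full-sky map supplies Σ(2ℓ+1) harmonic modes up to ℓ_max, while kSZ tomography with N redshift bins supplies roughly N such remote-dipole maps. The sketch below assumes the bins are independent, which is only approximately true:

```python
def n_modes(l_max):
    """Harmonic modes in one full-sky map up to l_max, dropping the
    monopole and dipole: sum over l of (2l + 1)."""
    return sum(2 * l + 1 for l in range(2, l_max + 1))

def ksz_tomography_modes(l_max, n_bins):
    """Under the simplifying assumption that each redshift bin yields an
    independent remote-dipole map, the mode counts add linearly in n_bins."""
    return n_bins * n_modes(l_max)
```

Already with a handful of bins the tomographic count exceeds the single primary-CMB map on comparable scales, which is the qualitative point of the argument.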

  1. Assessing large-scale weekly cycles in meteorological variables: a review

    Directory of Open Access Journals (Sweden)

    A. Sanchez-Lorenzo

    2012-07-01

    Several studies have claimed to have found significant weekly cycles of meteorological variables appearing over large domains, which can hardly be related to urban effects exclusively. Nevertheless, there is still an ongoing scientific debate about whether these large-scale weekly cycles exist or not, and some other studies fail to reproduce them with statistical significance. In addition to the lack of positive proof for the existence of these cycles, their possible physical explanations have been controversially discussed during the last years. In this work we review the main results on this topic published during the past two decades, including a summary of the existence or non-existence of significant weekly weather cycles across different regions of the world, mainly over the US, Europe and Asia. In addition, some shortcomings of common statistical methods for analyzing weekly cycles are listed. Finally, we present a brief summary of proposed causes of the weekly cycles, focusing on aerosol-cloud-radiation interactions and their impact on meteorological variables as a result of the weekly cycles of anthropogenic activities, and possible directions for future research.
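One standard way to screen a meteorological series for such a cycle (a generic illustration, not a method from any specific reviewed paper) is to measure the amplitude of the 7-day Fourier component of the daily series:

```python
import math

def weekly_harmonic_amplitude(daily_series):
    """Amplitude of the 7-day Fourier component of a daily time series.
    Values well above the noise level hint at a weekly cycle; this is a
    screening statistic only -- the review stresses that proper significance
    testing is the hard part."""
    n = len(daily_series)
    mean = sum(daily_series) / n
    w = 2.0 * math.pi / 7.0
    re = sum((x - mean) * math.cos(w * t) for t, x in enumerate(daily_series))
    im = sum((x - mean) * math.sin(w * t) for t, x in enumerate(daily_series))
    return 2.0 * math.hypot(re, im) / n
```

A pure 7-day sinusoid of unit amplitude returns 1, and a constant series returns 0; real series fall in between and must be compared against a null distribution.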

  2. Quantum Monte Carlo for large chemical systems: implementing efficient strategies for peta scale platforms and beyond

    International Nuclear Information System (INIS)

    Scemama, Anthony; Caffarel, Michel; Oseret, Emmanuel; Jalby, William

    2013-01-01

    Various strategies for efficiently implementing quantum Monte Carlo (QMC) simulations for large chemical systems are presented. These include: (i) the introduction of an efficient algorithm to calculate the computationally expensive Slater matrices. This novel scheme is based on the use of the highly localized character of atomic Gaussian basis functions (not the molecular orbitals as usually done), (ii) the possibility of keeping the memory footprint minimal, (iii) the important enhancement of single-core performance when efficient optimization tools are used, and (iv) the definition of a universal, dynamic, fault-tolerant, and load-balanced framework adapted to all kinds of computational platforms (massively parallel machines, clusters, or distributed grids). These strategies have been implemented in the QMC-Chem code developed at Toulouse and illustrated with numerical applications on small peptides of increasing sizes (158, 434, 1056, and 1731 electrons). Using 10k-80k computing cores of the Curie machine (GENCI-TGCC-CEA, France), QMC-Chem has been shown to be capable of running at the petascale level, thus demonstrating that for this machine a large part of the peak performance can be achieved. Implementation of large-scale QMC simulations for future exascale platforms with a comparable level of efficiency is expected to be feasible. (authors)
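Strategy (i) exploits the locality of atomic Gaussians: a basis function centered far from an electron contributes negligibly, so its matrix entry can be skipped. A deliberately simplified sketch with toy s-type Gaussians (the cutoff value and functions are hypothetical, and a real code builds molecular-orbital Slater matrices from these AO values):

```python
import math

def gaussian_s(r2, alpha=1.0):
    """Value of a toy s-type Gaussian basis function at squared distance r2."""
    return math.exp(-alpha * r2)

def ao_matrix(electrons, centers, cutoff=6.0, alpha=1.0):
    """Value of every atomic basis function at every electron position,
    skipping (setting to zero) centers farther than `cutoff` from the
    electron.  This locality is what keeps the cost per matrix entry
    independent of the overall system size."""
    cut2 = cutoff * cutoff
    mat = []
    for e in electrons:
        row = []
        for c in centers:
            r2 = sum((a - b) ** 2 for a, b in zip(e, c))
            row.append(gaussian_s(r2, alpha) if r2 <= cut2 else 0.0)
        mat.append(row)
    return mat
```

Because each electron sees only a bounded number of nearby centers, the fraction of entries actually computed shrinks as the molecule grows, which is the essence of the linear-scaling gain claimed for point (i).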

  3. Search for ultra high energy primary photons at the Pierre Auger Observatory

    Directory of Open Access Journals (Sweden)

    Colalillo Roberta

    2016-01-01

    The Pierre Auger Observatory, located in Argentina, provides an unprecedented integrated aperture in the search for primary photons with energy above 10^17 eV over a large portion of the southern sky. Such photons can in principle be detected via the air showers they initiate at such energies, using the complement of Auger Observatory detectors. We discuss the results obtained in diffuse and directional searches for primary photons in the EeV energy range.

  4. Large Scale Relationship between Aquatic Insect Traits and Climate.

    Science.gov (United States)

    Bhowmik, Avit Kumar; Schäfer, Ralf B

    2015-01-01

    Climate is the predominant environmental driver of freshwater assemblage pattern on large spatial scales, and traits of freshwater organisms have shown considerable potential to identify impacts of climate change. Although several studies suggest traits that may indicate vulnerability to climate change, the empirical relationship between freshwater assemblage trait composition and climate has been rarely examined on large scales. We compared the responses of the assumed climate-associated traits from six grouping features to 35 bioclimatic indices (~18 km resolution) for five insect orders (Diptera, Ephemeroptera, Odonata, Plecoptera and Trichoptera), evaluated their potential for changing distribution pattern under future climate change and identified the most influential bioclimatic indices. The data comprised 782 species and 395 genera sampled in 4,752 stream sites during 2006 and 2007 in Germany (~357,000 km² spatial extent). We quantified the variability and spatial autocorrelation in the traits and orders that are associated with the combined and individual bioclimatic indices. Traits of temperature preference grouping feature that are the products of several other underlying climate-associated traits, and the insect order Ephemeroptera exhibited the strongest response to the bioclimatic indices as well as the highest potential for changing distribution pattern. Regarding individual traits, insects in general and ephemeropterans preferring very cold temperature showed the highest response, and the insects preferring cold and trichopterans preferring moderate temperature showed the highest potential for changing distribution. We showed that the seasonal radiation and moisture are the most influential bioclimatic aspects, and thus changes in these aspects may affect the most responsive traits and orders and drive a change in their spatial distribution pattern. Our findings support the development of trait-based metrics to predict and detect climate

  5. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives are overlapped within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
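    The abstract does not give the actual selection rule; as a minimal sketch of density-driven style switching in the spirit described (the thresholds and style names below are hypothetical, not from the paper):

```python
def choose_style(num_nodes, region_area, dense=0.01, sparse=0.001):
    """Pick a graph rendering style from the nodal density (nodes per unit
    of display area): dense regions collapse into aggregate glyphs, while
    sparse regions can afford a full node-link drawing with labels."""
    density = num_nodes / region_area
    if density >= dense:
        return "aggregated"   # collapse the subtree into a single glyph
    if density >= sparse:
        return "compact"      # draw nodes and links, but no labels
    return "full"             # draw nodes, links and labels

# 500 nodes crammed into a 100x100-pixel region vs. 5 nodes in the same area
print(choose_style(500, 100 * 100))  # -> aggregated
print(choose_style(5, 100 * 100))    # -> full
```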

  6. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...
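    The thesis's own algorithms are only alluded to in this truncated abstract; purely to illustrate the central idea that the iteration count of a Krylov method can act as a regularization parameter, a minimal CGLS (CG on the normal equations) sketch, with an ill-conditioned test problem chosen for illustration:

```python
import numpy as np

def cgls(A, b, n_iter):
    """CGLS: conjugate gradients applied to A^T A x = A^T b.
    Stopping after a few iterations regularizes the solution
    (the well-known semi-convergence behaviour)."""
    x = np.zeros(A.shape[1])
    r = b - A @ x
    s = A.T @ r
    p = s.copy()
    gamma = s @ s
    for _ in range(n_iter):
        q = A @ p
        alpha = gamma / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = A.T @ r
        gamma_new = s @ s
        p = s + (gamma_new / gamma) * p
        gamma = gamma_new
    return x

# Ill-conditioned test problem with slightly noisy data
rng = np.random.default_rng(0)
n = 50
A = np.vander(np.linspace(0, 1, n), n, increasing=True)  # severely ill-conditioned
x_true = np.ones(n)
b = A @ x_true + 1e-6 * rng.standard_normal(n)
x_reg = cgls(A, b, n_iter=10)  # early stopping -> regularized solution
```

In practice the stopping index would be chosen by a parameter-choice heuristic such as the L-curve mentioned in the abstract.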

  7. LASSIE: the large analogue signal and scaling information environment for FAIR

    International Nuclear Information System (INIS)

    Hoffmann, T.; Braeuning, H.; Haseitl, R.

    2012-01-01

    At FAIR, the Facility for Antiproton and Ion Research, several new accelerators and storage rings such as the SIS-100, HESR, CR, the inter-connecting HEBT beam lines, S-FRS and experiments will be built. All of these installations are equipped with beam diagnostic devices and other components, which deliver time-resolved analogue signals to show status, quality and performance of the accelerators. These signals can originate from particle detectors such as ionization chambers and plastic scintillators, but also from adapted output signals of transformers, collimators, magnet functions, RF cavities and others. To visualize and precisely correlate the time axes of all input signals, a dedicated FESA-based data acquisition and analysis system named LASSIE, the Large Analogue Signal and Scaling Information Environment, is currently being developed. The main operation mode of LASSIE is currently pulse counting with latching VME scaler boards; enhancements for ADC, QDC, or TDC digitization are foreseen. The concept, features and challenges of this large distributed data acquisition system are presented. (authors)

  8. Low energy response calibration of the BATSE large area detectors onboard the Compton Observatory

    Energy Technology Data Exchange (ETDEWEB)

    Laird, C.E. [Dept. of Physics and Astronomy, Eastern Kentucky University, Moore 351, 521 Lancaster Avenue, Richmond, KY 40475-3124 (United States)]. E-mail: Chris.Laird@eku.edu; Harmon, B.A. [XD12 NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States); Wilson, Colleen A. [XD12 NASA/Marshall Space Flight Center, Huntsville, AL 35812 (United States); Hunter, David [Dept. of Physics and Astronomy, Eastern Kentucky University, Moore 351, 521 Lancaster Avenue, Richmond, KY 40475-3124 (United States); Isaacs, Jason [Dept. of Physics and Astronomy, Eastern Kentucky University, Moore 351, 521 Lancaster Avenue, Richmond, KY 40475-3124 (United States)

    2006-10-15

    The low-energy attenuation of the covering material of the Burst and Transient Source Experiment (BATSE) large area detectors (LADs) on the Compton Gamma Ray Observatory as well as the small-angle response of the LADs have been studied. These effects are shown to be more significant than previously assumed. The LAD entrance window included layers of an aluminum-epoxy composite (hexel) that acted as a collimator for the lowest energy photons entering the detector just above threshold (20-50 keV). Simplifying assumptions made concerning the entrance window materials and the angular response at incident angles near normal to the detector face in the original BATSE response matrix formalism had little effect on {gamma}-ray burst measurements; however, these assumptions created serious errors in measured fluxes of galactic sources, whose emission is strongest near the LAD energy threshold. Careful measurements of the angular and low-energy dependence of the attenuation due to the hexel plates only partially improved the response. A systematic study of Crab Nebula spectra showed the need for additional corrections: an angular-dependent correction for all detectors and an angular-independent correction for each detector. These corrections have been applied as part of an overall energy and angular-dependent correction to the BATSE response matrices.

  9. Sensitivity of tree ring growth to local and large-scale climate variability in a region of Southeastern Brazil

    Science.gov (United States)

    Venegas-González, Alejandro; Chagas, Matheus Peres; Anholetto Júnior, Claudio Roberto; Alvares, Clayton Alcarde; Roig, Fidel Alejandro; Tomazello Filho, Mario

    2016-01-01

    We explored the relationship between tree growth in two tropical species and local and large-scale climate variability in Southeastern Brazil. Tree ring width chronologies of Tectona grandis (teak) and Pinus caribaea (Caribbean pine) trees were compared with local (Water Requirement Satisfaction Index—WRSI, Standardized Precipitation Index—SPI, and Palmer Drought Severity Index—PDSI) and large-scale climate indices that analyze the equatorial pacific sea surface temperature (Trans-Niño Index-TNI and Niño-3.4-N3.4) and atmospheric circulation variations in the Southern Hemisphere (Antarctic Oscillation-AAO). Teak trees showed positive correlation with three indices in the current summer and fall. A significant correlation between WRSI index and Caribbean pine was observed in the dry season preceding tree ring formation. The influence of large-scale climate patterns was observed only for TNI and AAO, where there was a radial growth reduction in months preceding the growing season with positive values of the TNI in teak trees and radial growth increase (decrease) during December (March) to February (May) of the previous (current) growing season with positive phase of the AAO in teak (Caribbean pine) trees. The development of a new dendroclimatological study in Southeastern Brazil sheds light to local and large-scale climate influence on tree growth in recent decades, contributing in future climate change studies.

  10. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop-type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2, the design has been improved to further reduce costs and establish the plant concept. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics identified in the previous fiscal year were examined and the plant concept was modified. Furthermore, fundamental specifications of the main systems and components were set and the economics were evaluated. In addition, for the interim evaluation of the candidate concepts of the FBR fuel cycle, cost effectiveness and achievability of the development goal were evaluated, and data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed that promises to satisfy the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and to resolve the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the narrowing down of candidate concepts at the end of Phase 2. (author)

  11. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

    In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop-type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop-type reactor. Through the design study, it is intended to construct a plant concept that demonstrates its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop-type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed that promises to satisfy the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and to resolve the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the narrowing down of candidate concepts at the end of Phase 2. (author)

  12. Large scale CMB anomalies from thawing cosmic strings

    Energy Technology Data Exchange (ETDEWEB)

    Ringeval, Christophe [Centre for Cosmology, Particle Physics and Phenomenology, Institute of Mathematics and Physics, Louvain University, 2 Chemin du Cyclotron, 1348 Louvain-la-Neuve (Belgium); Yamauchi, Daisuke; Yokoyama, Jun'ichi [Research Center for the Early Universe (RESCEU), Graduate School of Science, The University of Tokyo, Tokyo 113-0033 (Japan); Bouchet, François R., E-mail: christophe.ringeval@uclouvain.be, E-mail: yamauchi@resceu.s.u-tokyo.ac.jp, E-mail: yokoyama@resceu.s.u-tokyo.ac.jp, E-mail: bouchet@iap.fr [Institut d'Astrophysique de Paris, UMR 7095-CNRS, Université Pierre et Marie Curie, 98bis boulevard Arago, 75014 Paris (France)

    2016-02-01

    Cosmic strings formed during inflation are expected to be either diluted over super-Hubble distances, i.e., invisible today, or to have crossed our past light cone very recently. We discuss the latter situation, in which a few strings imprint their signature in the Cosmic Microwave Background (CMB) anisotropies after recombination. Being almost frozen in the Hubble flow, these strings are quasi-static and evade almost all of the previously derived constraints on their tension while still being able to source large-scale anisotropies in the CMB sky. Using a local variance estimator on thousands of numerically simulated Nambu-Goto all-sky maps, we compute the expected signal and show that it can mimic a dipole modulation at large angular scales while being negligible at small angles. Interestingly, such a scenario generically produces one cold spot from the thawing of a cosmic string loop. Mixed with anisotropies of inflationary origin, we find that a few strings of tension Gμ = O(1) × 10⁻⁶ match the amplitude of the dipole modulation reported in the Planck satellite measurements and could be at the origin of other large-scale anomalies.

  13. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-class lasers and underdense targets, promises the production of high-quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large-scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large-scale modelling of LWFA, demonstrating speedups of over one order of magnitude on the same hardware. Finally, scalability to over ∼10⁶ cores and sustained performance over ∼2 PFlops are demonstrated, opening the way for large-scale modelling of LWFA scenarios. (paper)

  14. Griffith Observatory: Hollywood's Celestial Theater

    Science.gov (United States)

    Margolis, Emily A.; Dr. Stuart W. Leslie

    2018-01-01

    The Griffith Observatory, perched atop the Hollywood Hills, is perhaps the most recognizable observatory in the world. Since opening in 1935, this Los Angeles icon has brought millions of visitors closer to the heavens. Through an analysis of planning documentation, internal newsletters, media coverage, programming and exhibition design, I demonstrate how the Observatory’s Southern California location shaped its form and function. The astronomical community at nearby Mt. Wilson Observatory and Caltech informed the selection of instrumentation and programming, especially for presentations with the Observatory’s Zeiss Planetarium, the second installed in the United States. Meanwhile, the Observatory staff called upon some of Hollywood’s best artists, model makers, and scriptwriters to translate the latest astronomical discoveries into spectacular audiovisual experiences, which were enhanced with Space Age technological displays on loan from Southern California’s aerospace companies. The influences of these three communities (professional astronomy, entertainment, and aerospace) persist today and continue to make Griffith Observatory one of the premier sites of public astronomy in the country.

  15. Application of bamboo laminates in large-scale wind turbine blade design?

    Institute of Scientific and Technical Information of China (English)

    Long WANG; Hui LI; Tongguang WANG

    2016-01-01

    From the viewpoint of material and structure in the design of bamboo blades for large-scale wind turbines, a series of mechanical property tests of bamboo laminates, the major enhancement materials for blades, is presented. The basic mechanical characteristics needed in the design of bamboo blades are briefly introduced. Based on these data, the aerodynamic-structural integrated design of a 1.5 MW wind turbine bamboo blade relying on a conventional platform of upwind, variable speed, variable pitch, and doubly-fed generator is carried out. The process of the structural layer design of bamboo blades is documented in detail. The structural strength and fatigue life of the designed wind turbine blades are certified. The technical issues raised from the design are discussed. Key problems and directions for future study are also summarized.

  16. Orbiting Carbon Observatory-2 (OCO-2): Science Overview and A-Train Synergy

    Science.gov (United States)

    Crisp, David

    2011-01-01

    NASA's Orbiting Carbon Observatory (OCO) was designed to provide global estimates of atmospheric carbon dioxide (CO2) with the sensitivity, accuracy and sampling density needed to quantify regional scale carbon sources and sinks and characterize their behavior over the annual cycle.

  17. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.
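    Giraph programs are written in Java against its vertex-centric API; purely to illustrate the Pregel-style "think like a vertex" superstep and message model the book describes, here is a toy single-process sketch in Python (propagating the maximum value through a directed graph; the function and its details are illustrative, not the book's code):

```python
def pregel_max(adjacency, values):
    """Toy synchronous (Pregel-style) computation: in superstep 0 every
    vertex sends its value to its out-neighbours; in later supersteps a
    vertex updates and re-sends only when an incoming message improves
    its value. Computation halts when no messages remain in flight."""
    values = dict(values)
    # superstep 0: every vertex messages its neighbours
    messages = {v: [] for v in adjacency}
    for v in adjacency:
        for nbr in adjacency[v]:
            messages[nbr].append(values[v])
    # subsequent supersteps: update on improvement, then re-send
    while any(messages.values()):
        next_messages = {v: [] for v in adjacency}
        for v, inbox in messages.items():
            if inbox and max(inbox) > values[v]:
                values[v] = max(inbox)
                for nbr in adjacency[v]:
                    next_messages[nbr].append(values[v])
        messages = next_messages
    return values

graph = {"a": ["b"], "b": ["c"], "c": ["a"], "d": []}
print(pregel_max(graph, {"a": 3, "b": 6, "c": 2, "d": 1}))
# -> {'a': 6, 'b': 6, 'c': 6, 'd': 1}
```

In Giraph the same logic would be a `compute()` method executed per vertex per superstep, with messaging and barrier synchronization handled by the distributed runtime.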

  18. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  19. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

    With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult for common 3D display software, such as MeshLab, to achieve real-time display of and interaction with large-scale 3D models. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance, and then use an out-of-core, view-dependent multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming through the reconstructed scene, and the 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large-scale reconstructed scene with millions of 3D points or triangular meshes on a regular PC with only 4GB RAM.
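    The paper's actual LOD selection criterion is not given in the abstract; a common view-dependent scheme picks, for each node, the coarsest level whose projected geometric error stays under a screen-space tolerance. A minimal sketch of that idea (the error model, focal length and tolerance below are hypothetical):

```python
def select_lod(distance, base_error=0.5, screen_tolerance=2.0):
    """Pick a mesh level of detail: each finer level halves the geometric
    error, and the error projected onto the screen falls off with viewing
    distance, so distant geometry can use coarser levels."""
    level = 0
    error = base_error
    # 1000 plays the role of a focal length in pixels (illustrative value)
    while error / max(distance, 1e-6) * 1000 > screen_tolerance:
        error /= 2.0   # refine: the next level halves the geometric error
        level += 1
    return level

# nearby geometry needs a much finer level than distant geometry
for d in (1.0, 10.0, 100.0):
    print(d, select_lod(d))
```

An out-of-core renderer would then stream only the node data for the selected levels into RAM, which is what keeps the memory footprint bounded.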

  20. Diagnosing indicators of large-scale forcing of east-coast cyclogenesis

    International Nuclear Information System (INIS)

    Dowdy, Andrew J; Mills, Graham A; Timbal, Bertrand

    2010-01-01

    Extra-tropical cyclones that develop near the east coast of Australia often have severe consequences such as flash flooding and damaging winds and seas, as well as beneficial consequences such as being responsible for heavy rainfall events that contribute significantly to total rainfall and runoff. There is subjective evidence that the development of most major events, commonly known as East Coast Lows, is associated with the movement of a high amplitude upper-tropospheric trough system over eastern Australia. This paper examines a number of upper-tropospheric diagnostic quantities that might provide a basis for preparing a climatology of the large-scale drivers of east-coast cyclogenesis. A preliminary climatology of these diagnostic quantities, based on ECMWF interim reanalyses, is compared with a database of observed East Coast Low events. The potential application of these diagnostics to global climate model simulations of past and future climates is also discussed.

  1. Fostering Collaboration Across the U.S. Critical Zone Observatories Network

    Science.gov (United States)

    Sharkey, S.; White, T. S.

    2017-12-01

    The Critical Zone (CZ) is defined as the permeable layer from the top of the vegetation canopy to the bottom of freely circulating groundwater where rock, soil, water, air and life meet. The study of the CZ is motivated by an overall lack of understanding of the coupled physical, chemical, and biological processes in this zone at differing spatial and temporal scales. Critical Zone Observatories (CZOs), supported by the U.S. National Science Foundation's Geosciences Directorate, are natural laboratories that aim to provide infrastructure, data and models to gain understanding of the evolution and function of the CZ from grain-to-watershed scales. The nine U.S. observatories span a range of climatic, ecologic, geologic, and physiographic environments from California to Puerto Rico, working on site-specific hypotheses and network-scale goals. CZO research infrastructure allows for teams of cross-disciplinary scientists at each site to further CZ science using field and theoretical approaches, education and outreach, and cross-CZO science. Cross-CZO science emerges from a set of common CZ science questions and hypotheses focused on CZ structure and evolution, event-based and continuous fluxes across CZ interfaces, and changes in storage of major CZ reservoirs at the catchment scale. CZO research seeks to understand coupled processes across all timescales using quantitative models parameterized from observations of meteorological variables, streams, and groundwater, and sampling and analyzing landforms, bedrock, soils, and ecosystems. Each observatory strives to apply common infrastructure, protocols and measurements that help quantify the composition and fluxes of water, solutes, sediments, energy, and mass across boundaries of the CZ system through both space and time. This type of approach enables researchers to access and integrate data in a way that allows for the isolation of environmental variables and comparison of processes and responses across

  2. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff therefore provides a foundation for approaching European hydrology with respect to observed large-scale patterns and the ability of models to capture them. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of a model's strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also help detect shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems.
The ability of a multi-model ensemble of nine large-scale

  3. Large-scale perturbations from the waterfall field in hybrid inflation

    International Nuclear Information System (INIS)

    Fonseca, José; Wands, David; Sasaki, Misao

    2010-01-01

    We estimate large-scale curvature perturbations from isocurvature fluctuations in the waterfall field during hybrid inflation, in addition to the usual inflaton field perturbations. The tachyonic instability at the end of inflation leads to an explosive growth of super-Hubble scale perturbations, but they retain the steep blue spectrum characteristic of vacuum fluctuations in a massive field during inflation. The power spectrum thus peaks around the Hubble-horizon scale at the end of inflation. We extend the usual δN formalism to include the essential role of these small fluctuations when estimating the large-scale curvature perturbation. The resulting curvature perturbation due to fluctuations in the waterfall field is second-order and the spectrum is expected to be of order 10⁻⁵⁴ on cosmological scales
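    The "steep blue spectrum" has a standard schematic form: super-Hubble vacuum fluctuations of a field with a large effective mass are not frozen, so their dimensionless power spectrum rises steeply toward the Hubble scale at the end of inflation. A sketch of the scaling (a textbook vacuum-fluctuation result, not a formula quoted from this record):

```latex
% Fluctuations \delta\chi of the massive waterfall field keep the steep
% blue slope of vacuum modes on super-Hubble scales,
\mathcal{P}_{\delta\chi}(k) \;\propto\; \left(\frac{k}{k_*}\right)^{3},
\qquad k \ll k_* \sim a_{\mathrm{end}} H_{\mathrm{end}},
% so the spectrum peaks near the Hubble-horizon scale at the end of
% inflation and is strongly suppressed on the much larger cosmological
% scales, consistent with the tiny quoted amplitude.
```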

  4. Decoupling local mechanics from large-scale structure in modular metamaterials

    Science.gov (United States)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  5. Linking genes to ecosystem trace gas fluxes in a large-scale model system

    Science.gov (United States)

    Meredith, L. K.; Cueva, A.; Volkmann, T. H. M.; Sengupta, A.; Troch, P. A.

    2017-12-01

    Soil microorganisms mediate biogeochemical cycles through biosphere-atmosphere gas exchange with significant impact on atmospheric trace gas composition. Improving process-based understanding of these microbial populations and linking their genomic potential to the ecosystem-scale is a challenge, particularly in soil systems, which are heterogeneous in biodiversity, chemistry, and structure. In oligotrophic systems, such as the Landscape Evolution Observatory (LEO) at Biosphere 2, atmospheric trace gas scavenging may supply critical metabolic needs to microbial communities, thereby promoting tight linkages between microbial genomics and trace gas utilization. This large-scale model system of three initially homogenous and highly instrumented hillslopes facilitates high temporal resolution characterization of subsurface trace gas fluxes at hundreds of sampling points, making LEO an ideal location to study microbe-mediated trace gas fluxes from the gene to ecosystem scales. Specifically, we focus on the metabolism of ubiquitous atmospheric reduced trace gases hydrogen (H2), carbon monoxide (CO), and methane (CH4), which may have wide-reaching impacts on microbial community establishment, survival, and function. Additionally, microbial activity on LEO may facilitate weathering of the basalt matrix, which can be studied with trace gas measurements of carbonyl sulfide (COS/OCS) and carbon dioxide (O-isotopes in CO2), and presents an additional opportunity for gene to ecosystem study. This work will present initial measurements of this suite of trace gases to characterize soil microbial metabolic activity, as well as links between spatial and temporal variability of microbe-mediated trace gas fluxes in LEO and their relation to genomic-based characterization of microbial community structure (phylogenetic amplicons) and genetic potential (metagenomics). Results from the LEO model system will help build understanding of the importance of atmospheric inputs to

  6. Punctuated Evolution of Volcanology: An Observatory Perspective

    Science.gov (United States)

    Burton, W. C.; Eichelberger, J. C.

    2010-12-01

    Volcanology from the perspective of crisis prediction and response-the primary function of volcano observatories-is influenced both by steady technological advances and singular events that lead to rapid changes in methodology and procedure. The former can be extrapolated somewhat, while the latter are surprises or shocks. Predictable advances include the conversion from analog to digital systems and the exponential growth of computing capacity and data storage. Surprises include eruptions such as 1980 Mount St Helens, 1985 Nevado del Ruiz, 1989-1990 Redoubt, 1991 Pinatubo, and 2010 Eyjafjallajokull; the opening of GPS to civilian applications, and the advent of an open Russia. Mount St Helens switched the rationale for volcanology in the USGS from geothermal energy to volcano hazards, Ruiz and Pinatubo emphasized the need for international cooperation for effective early warning, Redoubt launched the effort to monitor even remote volcanoes for purposes of aviation safety, and Eyjafjallajokull hammered home the need for improved ash-dispersion and engine-tolerance models; better GPS led to a revolution in volcano geodesy, and the new Russian Federation sparked an Alaska-Kamchatka scientific exchange. The pattern has been that major funding increases for volcano hazards occur after these unpredictable events, which suddenly expose a gap in capabilities, rather than out of a calculated need to exploit technological advances or meet a future goal of risk mitigation. It is up to the observatory and national volcano hazard program to leverage these sudden funding increases into a long-term, sustainable business model that incorporates both the steadily increasing costs of staff and new technology and prepares for the next volcano crisis. Elements of the future will also include the immediate availability on the internet of all publically-funded volcano data, and subscribable, sophisticated hazard alert systems that run computational, fluid dynamic eruption models. 

  7. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biased galaxy formation are all discussed. (UK)

  8. Recent Results from the Pierre Auger Observatory

    International Nuclear Information System (INIS)

    Kampert, Karl-Heinz

    2010-01-01

    The Pierre Auger Observatory is a hybrid air shower experiment which uses multiple detection techniques to investigate the origin, spectrum, and composition of ultrahigh energy cosmic rays. We present recent results on these topics and discuss their implications for understanding the origin of the most energetic particles in nature, as well as for physics beyond the Standard Model, such as violation of Lorentz invariance and 'top-down' models of cosmic ray production. Future plans, including enhancements underway at the southern site in Argentina, will be presented. (author)

  9. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  10. Large-scale anisotropy in the extragalactic gamma-ray background as a probe for cosmological antimatter

    Science.gov (United States)

    Gao, Yi-Tian; Stecker, Floyd W.; Gleiser, Marcelo; Cline, David B.

    1990-01-01

    Intrinsic anisotropies in the extragalactic gamma-ray background (EGB), which should be detectable with the forthcoming Gamma Ray Observatory, can be used to examine some of the mechanisms proposed to explain its origin, one of which, the baryon-symmetric big bang (BSBB) model, is investigated here. In this simulation, large domains containing matter and antimatter galaxies produce gamma rays by annihilation at the domain boundaries. This mechanism can produce mountain-chain-shaped angular fluctuations in the EGB flux.

  11. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computational complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the antenna selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computational complexity of the conventional exhaustive-search method increases significantly when large-scale antennas are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
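    A greedy, capacity-oriented selection is the kind of low-complexity alternative to exhaustive subset search that such methods are benchmarked against. The sketch below is illustrative only: the function name, SNR value, and equal power split are our assumptions, not the paper's joint interactive multiple-parameter method.

```python
import numpy as np

def greedy_antenna_selection(H, n_select, snr=10.0):
    """Greedy capacity-based transmit antenna selection (illustrative sketch).

    H: (n_rx, n_tx) channel matrix; columns are transmit antennas.
    At each step, add the antenna that most increases
    log2 det(I + (snr/k) * H_S H_S^H), avoiding exhaustive search over
    all C(n_tx, n_select) subsets.
    """
    n_rx, n_tx = H.shape
    selected, remaining = [], list(range(n_tx))
    for _ in range(n_select):
        best_j, best_cap = None, -np.inf
        for j in remaining:
            Hs = H[:, selected + [j]]
            p = snr / Hs.shape[1]  # equal power split over chosen antennas
            cap = np.log2(np.linalg.det(np.eye(n_rx) + p * Hs @ Hs.conj().T).real)
            if cap > best_cap:
                best_j, best_cap = j, cap
        selected.append(best_j)
        remaining.remove(best_j)
    return sorted(selected), best_cap

rng = np.random.default_rng(1)
# Rayleigh-fading channel: 4 receive antennas, 16 candidate transmit antennas
H = (rng.standard_normal((4, 16)) + 1j * rng.standard_normal((4, 16))) / np.sqrt(2)
antennas, capacity = greedy_antenna_selection(H, 4)
```

    The greedy pass costs n_select × n_tx capacity evaluations, versus C(16, 4) = 1820 determinant evaluations for an exhaustive search over the same example.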

  12. CONSORT to community: translation of an RCT to a large-scale community intervention and learnings from evaluation of the upscaled program.

    Science.gov (United States)

    Moores, Carly Jane; Miller, Jacqueline; Perry, Rebecca Anne; Chan, Lily Lai Hang; Daniels, Lynne Allison; Vidgen, Helen Anna; Magarey, Anthea Margaret

    2017-11-29

    Translation encompasses the continuum from clinical efficacy to widespread adoption within the healthcare service and, ultimately, routine clinical practice. The Parenting, Eating and Activity for Child Health (PEACH™) program has previously demonstrated clinical effectiveness in the management of child obesity and has recently been implemented as a large-scale community intervention in Queensland, Australia. This paper aims to describe the translation of the evaluation framework from a randomised controlled trial (RCT) to a large-scale community intervention (PEACH™ QLD). Tensions between the RCT paradigm and implementation research are discussed, along with lived evaluation challenges, the responses developed to overcome them, and key learnings for future evaluation conducted at scale. The translation of evaluation from the PEACH™ RCT to the large-scale community intervention PEACH™ QLD is described. While the CONSORT Statement was used to report findings from two previous RCTs, the RE-AIM framework was more suitable for the evaluation of upscaled delivery of the PEACH™ program. Evaluation of PEACH™ QLD was undertaken during the project delivery period from 2013 to 2016. Experiential learnings from conducting the evaluation of PEACH™ QLD according to the described evaluation framework are presented to inform the future evaluation of upscaled programs. Evaluation changes in response to real-time changes in the delivery of the PEACH™ QLD Project were necessary at stages during the project term. Key evaluation challenges included the collection of complete evaluation data from a diverse and geographically dispersed workforce and the systematic collection of process evaluation data in real time to support program changes during the project. Evaluation of large-scale community interventions in the real world is challenging and divergent from RCTs, which are rigorously evaluated within a more tightly controlled clinical research setting.

  13. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s^2 ≈ 10^-6 c^2 and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)^4. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc^-1.
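    The basic measurement underlying such comparisons, an isotropic matter power spectrum P(k) estimated from a gridded overdensity field, can be sketched as follows. This is a minimal illustrative estimator; the FFT convention, binning, and function name are our assumptions, not the authors' N-body analysis pipeline.

```python
import numpy as np

def power_spectrum(delta, box_size):
    """Estimate the isotropic power spectrum P(k) of a periodic
    overdensity grid delta of shape (n, n, n), by spherically
    averaging |delta_k|^2 in shells of the fundamental mode kf."""
    n = delta.shape[0]
    kf = 2 * np.pi / box_size                      # fundamental mode
    dk = np.fft.fftn(delta)
    # P(k) estimator: |delta_k|^2 * V / N^2 with V = box^3, N = n^3
    pk3d = np.abs(dk) ** 2 * box_size ** 3 / n ** 6
    kx = np.fft.fftfreq(n) * n * kf                # physical wavenumbers per axis
    kxg, kyg, kzg = np.meshgrid(kx, kx, kx, indexing="ij")
    kmag = np.sqrt(kxg ** 2 + kyg ** 2 + kzg ** 2)
    edges = kf * np.arange(0.5, n // 2)            # shells centered on integer modes
    k_centers = 0.5 * (edges[:-1] + edges[1:])
    counts, _ = np.histogram(kmag, bins=edges)
    power, _ = np.histogram(kmag, bins=edges, weights=pk3d)
    return k_centers, power / np.maximum(counts, 1)

rng = np.random.default_rng(0)
delta = rng.standard_normal((16, 16, 16))          # toy white-noise field
k_vals, p_vals = power_spectrum(delta, box_size=1.0)
```

    For a white-noise field the estimator returns an approximately flat spectrum; applied to an N-body overdensity grid, the same shell averaging yields the P(k) against which effective-fluid parameters are fit.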

  14. Temporal flexibility and careers: The role of large-scale organizations for physicians

    OpenAIRE

    Forrest Briscoe

    2006-01-01

    This study investigates how employment in large-scale organizations affects the work lives of practicing physicians. Well-established theory associates larger organizations with bureaucratic constraint, loss of workplace control, and dissatisfaction, but this author finds that large scale is also associated with greater schedule and career flexibility. Ironically, the bureaucratic p...

  15. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study a direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large-scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large-scale features of the turbulent channel flow is obtained using two-point correlations of the velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of the large-scale features of turbulence and the temperature field.
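    The two-point correlation at the heart of such an analysis can be illustrated in one dimension via the Wiener-Khinchin theorem (a minimal periodic sketch; the DNS study itself correlates full 3-D velocity and temperature fields):

```python
import numpy as np

def two_point_correlation(u):
    """Normalized two-point autocorrelation R(r) = <u'(x) u'(x+r)> / <u'^2>
    of a periodic 1-D signal, computed via FFT (Wiener-Khinchin)."""
    up = u - u.mean()                            # fluctuating part u'
    spec = np.abs(np.fft.rfft(up)) ** 2          # power spectrum of u'
    cov = np.fft.irfft(spec, up.size) / up.size  # circular autocovariance
    return cov / cov[0]                          # normalize so R(0) = 1

n = 256
x = np.arange(n)
u = np.sin(2 * np.pi * x / n)   # a single large-scale mode
R = two_point_correlation(u)    # R(r) = cos(2*pi*r/n) for this signal
```

    A slowly decaying R(r) signals energetic large-scale motions; applying the same estimator to the temperature fluctuations, and a cross-spectrum version to velocity-temperature pairs, gives the similarity diagnostics described in the abstract.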

  16. Back to the future: virtualization of the computing environment at the W. M. Keck Observatory

    Science.gov (United States)

    McCann, Kevin L.; Birch, Denny A.; Holt, Jennifer M.; Randolph, William B.; Ward, Josephine A.

    2014-07-01

    Over its two decades of science operations, the W.M. Keck Observatory computing environment has evolved to contain a distributed hybrid mix of hundreds of servers, desktops and laptops spanning multiple hardware platforms, O/S versions and vintages. Supporting the growing computing capabilities needed to meet the observatory's diverse, evolving computing demands within fixed budget constraints presents many challenges. This paper describes the significant role that virtualization is playing in addressing these challenges while improving the level and quality of service as well as realizing significant savings across many cost areas. Starting in December 2012, the observatory embarked on an ambitious plan to incrementally test and deploy a migration to virtualized platforms to address a broad range of specific opportunities. Implementation to date has been surprisingly glitch-free, is progressing well and is yielding tangible benefits much faster than many expected. We describe here the general approach, starting with the initial identification of some low-hanging fruit, which also provided an opportunity to gain experience and build confidence among both the implementation team and the user community. We describe the range of challenges, opportunities and cost-savings potential. Very significant among these was the substantial power savings, which resulted in strong, broad support for moving forward. We go on to describe the phasing plan, the evolving scalable architecture, some of the specific technical choices, as well as some of the individual technical issues encountered along the way. The phased implementation spans Windows and Unix servers for scientific, engineering and business operations, and virtualized desktops for typical office users as well as the more demanding graphics-intensive CAD users. Other areas discussed in this paper include staff training, load balancing, redundancy, scalability, remote access, disaster readiness and recovery.

  17. Peering into space with the Morocco Oukaïmeden Observatory

    Science.gov (United States)

    Benkhaldoun, Zouhair

    2018-05-01

    Moroccan scientific production in astronomy and astrophysics has shown sustained growth since the late 1980s. This growth is largely due to the dynamism of an increasingly entrepreneurial community and to the creation of an astronomical observatory in the Moroccan Atlas Mountains.

  18. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF6, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
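    The conditioning procedure can be sketched in one dimension as follows. This is illustrative only: the running-mean definition of the large-scale velocity, the window size, and the quantile binning are our assumptions, not the paper's experimental method.

```python
import numpy as np

def conditional_structure_function(u, r, nbins=5, window=None):
    """Second-order structure function D2(r | U) = <(u(x+r) - u(x))^2>
    conditioned on a local large-scale velocity U, for a periodic 1-D
    signal. U is estimated with a running mean over a window >> r."""
    du2 = (np.roll(u, -r) - u) ** 2                 # squared increments at lag r
    window = window or 16 * r
    U = np.convolve(u, np.ones(window) / window, mode="same")  # large-scale velocity
    edges = np.quantile(U, np.linspace(0, 1, nbins + 1))       # equal-count bins of U
    idx = np.clip(np.digitize(U, edges) - 1, 0, nbins - 1)
    return np.array([du2[idx == b].mean() for b in range(nbins)])

rng = np.random.default_rng(2)
# synthetic signal: slow large-scale wander plus small-scale noise
u = np.cumsum(rng.standard_normal(4096)) * 0.01 + rng.standard_normal(4096)
D2 = conditional_structure_function(u, r=4)  # one D2 value per large-scale bin
```

    A D2 that rises in the outer bins, where U deviates most from its mean, is the signature of large-scale dependence the study reports.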

  19. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication.

  20. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    NARCIS (Netherlands)

    Loon, van A.F.; Huijgevoort, van M.H.J.; Lanen, van H.A.J.

    2012-01-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well large-scale models simulate the propagation from meteorological to hydrological
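    The threshold-level drought identification that such propagation studies rely on can be sketched as follows. This is a minimal illustration; the 20th-percentile choice, the monthly structure, and all names are our assumptions, not taken from the study.

```python
import numpy as np

def ensemble_mean(models):
    """Ensemble mean of runoff simulations, each shaped (n_years, 12)."""
    return np.mean(np.stack(models), axis=0)

def drought_flags(runoff, q=0.2):
    """Flag drought months with a monthly varying threshold: a month is in
    drought when runoff falls below that calendar month's q-th percentile."""
    thresh = np.quantile(runoff, q, axis=0)  # one threshold per calendar month
    return runoff < thresh

rng = np.random.default_rng(3)
# five toy "models", each 30 years of monthly runoff
models = [rng.gamma(2.0, 5.0, size=(30, 12)) for _ in range(5)]
flags = drought_flags(ensemble_mean(models))
```

    Applying the same threshold rule separately to precipitation and to simulated runoff, and comparing the timing and duration of the flagged events, is one simple way to examine the meteorological-to-hydrological propagation the abstract raises.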