WorldWideScience

Sample records for replicating large-scale planetary

  1. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  2. Reconsidering Replication: New Perspectives on Large-Scale School Improvement

    Science.gov (United States)

    Peurach, Donald J.; Glazer, Joshua L.

    2012-01-01

    The purpose of this analysis is to reconsider organizational replication as a strategy for large-scale school improvement: a strategy that features a "hub" organization collaborating with "outlet" schools to enact school-wide designs for improvement. To do so, we synthesize a leading line of research on commercial replication to construct a…

  3. Planetary Structures And Simulations Of Large-scale Impacts On Mars

    Science.gov (United States)

    Swift, Damian; El-Dasher, B.

    2009-09-01

    The impact of large meteoroids is a possible cause of isolated orogeny on bodies devoid of tectonic activity. On Mars, there is a significant, but not perfect, correlation between large, isolated volcanoes and antipodal impact craters. On Mercury and the Moon, brecciated terrain and other unusual surface features can be found at the antipodes of large impact sites. On Earth, there is a moderate correlation between the locations of long-lived mantle hotspots on opposite sides of the planet, with meteoroid impact suggested as a possible cause. If induced by impacts, the mechanisms of orogeny and volcanism thus appear to vary between these bodies, presumably because of differences in internal structure. Continuum mechanics (hydrocode) simulations have been used to investigate the response of planetary bodies to impacts; these require assumptions about the structure of the body: its composition and temperature profile, and the constitutive properties (equation of state, strength, viscosity) of its components. We are able to predict theoretically and test experimentally the constitutive properties of matter under planetary conditions with reasonable accuracy. To provide a reference series of simulations, we have constructed self-consistent planetary structures using simplified compositions (Fe core and basalt-like mantle), which turn out to agree surprisingly well with the observed moments of inertia. We have performed simulations of large-scale impacts, studying the transmission of energy to the antipodes. For Mars, significant antipodal heating to depths of a few tens of kilometers was predicted from compression waves transmitted through the mantle. Such heating is a mechanism for volcanism on Mars, possibly in conjunction with crustal cracking induced by surface waves. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory under Contract DE-AC52-07NA27344.

  4. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crustal in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though they no longer occur with the frequency and magnitude of early solar system history, large-scale impact events continue to affect the local geology of the planets. 92 references

  5. Sustainable, Full-Scope Nuclear Fission Energy at Planetary Scale

    OpenAIRE

    Robert Petroski; Lowell Wood

    2012-01-01

    A nuclear fission-based energy system is described that is capable of supplying the energy needs of all of human civilization for a full range of human energy use scenarios, including both very high rates of energy use and strikingly large amounts of total energy utilized. To achieve such “planetary scale sustainability”, this nuclear energy system integrates three nascent technologies: uranium extraction from seawater, manifestly safe breeder reactors, and deep borehole d...

  6. Sustainable, Full-Scope Nuclear Fission Energy at Planetary Scale

    Directory of Open Access Journals (Sweden)

    Robert Petroski

    2012-11-01

    Full Text Available A nuclear fission-based energy system is described that is capable of supplying the energy needs of all of human civilization for a full range of human energy use scenarios, including both very high rates of energy use and strikingly large amounts of total energy utilized. To achieve such “planetary scale sustainability”, this nuclear energy system integrates three nascent technologies: uranium extraction from seawater, manifestly safe breeder reactors, and deep borehole disposal of nuclear waste. In addition to these technological components, it also possesses the sociopolitical quality of manifest safety, which involves engineering to a very high degree of safety in a straightforward manner, while concurrently making the safety characteristics of the resulting nuclear systems continually manifest to society as a whole. Near-term aspects of this nuclear system are outlined, and representative parameters given for a system of global scale capable of supplying energy to a planetary population of 10 billion people at a per capita level enjoyed by contemporary Americans, i.e., of a type which might be seen a half-century hence. In addition to being sustainable from a resource standpoint, the described nuclear system is also sustainable with respect to environmental and human health impacts, including those resulting from severe accidents.
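
    As a rough illustration of what "planetary scale" implies here, the following back-of-the-envelope sketch works out the total demand; the per-capita figure of about 10 kW of primary power is an assumption of this sketch (an order-of-magnitude value for the contemporary US), not a number taken from the article.

    population = 10e9            # people, the population level quoted above
    per_capita_power_kw = 10.0   # assumed primary power per person, kW (US order of magnitude)

    total_power_tw = population * per_capita_power_kw * 1e3 / 1e12   # kW -> W -> TW
    annual_energy_ej = total_power_tw * 1e12 * 3.156e7 / 1e18        # W times s/yr -> EJ/yr

    print(f"total power   ~ {total_power_tw:.0f} TW")     # roughly 100 TW
    print(f"annual energy ~ {annual_energy_ej:.0f} EJ/yr") # a few thousand EJ per year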

  7. Large-scale replication study reveals a limit on probabilistic prediction in language comprehension.

    Science.gov (United States)

    Nieuwland, Mante S; Politzer-Ahles, Stephen; Heyselaar, Evelien; Segaert, Katrien; Darley, Emily; Kazanina, Nina; Von Grebmer Zu Wolfsthurn, Sarah; Bartolozzi, Federica; Kogan, Vita; Ito, Aine; Mézière, Diane; Barr, Dale J; Rousselet, Guillaume A; Ferguson, Heather J; Busch-Moreno, Simon; Fu, Xiao; Tuomainen, Jyrki; Kulakova, Eugenia; Husband, E Matthew; Donaldson, David I; Kohút, Zdenko; Rueschemeyer, Shirley-Ann; Huettig, Falk

    2018-04-03

    Do people routinely pre-activate the meaning and even the phonological form of upcoming words? The most acclaimed evidence for phonological prediction comes from a 2005 Nature Neuroscience publication by DeLong, Urbach and Kutas, who observed a graded modulation of electrical brain potentials (N400) to nouns and preceding articles by the probability that people use a word to continue the sentence fragment ('cloze'). In our direct replication study spanning 9 laboratories (N = 334), pre-registered replication analyses and exploratory Bayes factor analyses successfully replicated the noun results but, crucially, not the article results. Pre-registered single-trial analyses also yielded a statistically significant effect for the nouns but not the articles. Exploratory Bayesian single-trial analyses showed that the article effect may be non-zero but is likely far smaller than originally reported and too small to observe without very large sample sizes. Our results do not support the view that readers routinely pre-activate the phonological form of predictable words. © 2018, Nieuwland et al.

  8. United States Temperature and Precipitation Extremes: Phenomenology, Large-Scale Organization, Physical Mechanisms and Model Representation

    Science.gov (United States)

    Black, R. X.

    2017-12-01

    We summarize results from a project focusing on regional temperature and precipitation extremes over the continental United States. Our project introduces a new framework for evaluating these extremes emphasizing their (a) large-scale organization, (b) underlying physical sources (including remote-excitation and scale-interaction) and (c) representation in climate models. Results to be reported include the synoptic-dynamic behavior, seasonality and secular variability of cold waves, dry spells and heavy rainfall events in the observational record. We also study how the characteristics of such extremes are systematically related to Northern Hemisphere planetary wave structures and thus planetary- and hemispheric-scale forcing (e.g., those associated with major El Nino events and Arctic sea ice change). The underlying physics of event onset are diagnostically quantified for different categories of events. Finally, the representation of these extremes in historical coupled climate model simulations is studied and the origins of model biases are traced using new metrics designed to assess the large-scale atmospheric forcing of local extremes.

  9. A large planetary body inferred from diamond inclusions in a ureilite meteorite.

    Science.gov (United States)

    Nabiei, Farhang; Badro, James; Dennenwaldt, Teresa; Oveisi, Emad; Cantoni, Marco; Hébert, Cécile; El Goresy, Ahmed; Barrat, Jean-Alix; Gillet, Philippe

    2018-04-17

    Planetary formation models show that terrestrial planets are formed by the accretion of tens of Moon- to Mars-sized planetary embryos through energetic giant impacts. However, relics of these large proto-planets are yet to be found. Ureilites are one of the main families of achondritic meteorites, and their parent body is believed to have been catastrophically disrupted by an impact during the first 10 million years of the solar system. Here we studied, using transmission electron microscopy, a section of the Almahata Sitta ureilite in which large diamonds formed at high pressure inside the parent body. We discovered chromite, phosphate, and (Fe,Ni)-sulfide inclusions embedded in diamond. The composition and morphology of the inclusions can only be explained if the formation pressure was higher than 20 GPa. Such pressures suggest that the ureilite parent body was a Mercury- to Mars-sized planetary embryo.

  10. Wintertime westward-traveling planetary-scale perturbations over the Euro-Atlantic region

    Energy Technology Data Exchange (ETDEWEB)

    Doblas-Reyes, F.J. [Centro de Astrobiologia, INTA, Madrid (Spain); Pastor, M.A.; Casado, M.J. [Instituto Nacional de Meteorologia, Madrid (Spain); Deque, M. [CNRM, Meteo-France, Toulouse (France)

    2001-07-01

    The features of the wintertime westward-traveling planetary-scale perturbations over the Euro-Atlantic region are examined through the use of space-time spectral analysis applied to the 500 hPa geopotential height field. The intention is to understand the nature of these phenomena and the performance of climate models. Data from both ECMWF re-analyses and a simulation from the ARPEGE general circulation model are used. Westward-traveling planetary-scale transients are found over the region as local perturbations resembling Rossby normal modes, with a maximum power over the Eastern Atlantic. The westward-traveling planetary-scale transients north of 40°N have periods larger than 20 days. South of this latitude, wave periods are shifted to a band around 10 days, so that they can be related to subtropical transient waves. The atmospheric model used, like other models which exhibit reasonable mean climatic properties, tends to have less overall intraseasonal variability than observed. Nevertheless, the model is able to capture most of the features of the westward-traveling low-frequency transients. The differences in basic state, partially produced by scale interactions, would lead to the generation of westward-traveling waves in the model distinct from those observed. However, it is suggested that the improvement of the present model version with regard to previous model versions is due to a better simulation of the time-mean state. The reasonable simulation of the synoptic-scale variability south of 50°N, and thus of its barotropic forcing on the basic state, may also help to explain the realistic westward-traveling transients in the model. (orig.)
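
    The space-time spectral analysis mentioned above can be sketched as follows. This is a minimal illustration, not the authors' code: a 2-D FFT of a longitude-time section of geopotential height separates variance by zonal wavenumber and frequency, and the sign relation between the two identifies westward- versus eastward-travelling power. The function name, array shapes and the synthetic test signal are assumptions of the sketch.

    import numpy as np

    def space_time_power(z, dt_days=1.0):
        """Split wavenumber-frequency power of z(t, lon) by propagation direction."""
        ntime, nlon = z.shape
        spec = np.fft.fft2(z - z.mean())            # axis 0 -> frequency, axis 1 -> zonal wavenumber
        power = np.abs(spec) ** 2 / (ntime * nlon)
        freq = np.fft.fftfreq(ntime, d=dt_days)     # cycles per day
        wnum = np.fft.fftfreq(nlon, d=1.0 / nlon)   # integer zonal wavenumber k
        k, om = wnum[None, :], freq[:, None]
        # For a signal cos(k*lon + omega*t), i.e. a westward-moving wave, the FFT places
        # power where the fftfreq-mapped wavenumber and frequency carry the same sign.
        westward = power[(k * om) > 0].sum()
        eastward = power[(k * om) < 0].sum()
        return westward, eastward

    # Synthetic check: a zonal wavenumber-1 wave with a 20-day period moving westward.
    t = np.arange(120)[:, None]                                    # days
    lon = np.linspace(0.0, 2.0 * np.pi, 144, endpoint=False)[None, :]
    z = np.cos(lon + 2.0 * np.pi * t / 20.0)
    print(space_time_power(z))   # westward power should dominate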

  11. The problem of scale in planetary geomorphology

    Science.gov (United States)

    Rossbacher, L. A.

    1985-01-01

    Recent planetary exploration has shown that specific landforms exhibit a significant range in size between planets. Similar features on Earth and Mars offer some of the best examples of this scale difference. The difference in heights of volcanic features between the two planets has been cited often; the Martian volcano Olympus Mons stands approximately 26 km high, but Mauna Loa rises only 11 km above the Pacific Ocean floor. Polygonally fractured ground in the northern plains of Mars forms polygons up to 20 km across; the largest terrestrial polygons are only 500 m in diameter. Mars also has landslides, aeolian features, and apparent rift valleys larger than any known on Earth. No single factor can explain the variations in landform size between planets. Controls on variation on Earth, related to climate, lithology, or elevation, have seldom been considered in detail. The size differences between features on Earth and other planets seem to be caused by a complex group of interacting relationships. The major planetary parameters that may affect landform size are discussed.

  12. Scaling properties of planetary calderas and terrestrial volcanic eruptions

    Directory of Open Access Journals (Sweden)

    L. Sanchez

    2012-11-01

    Full Text Available Volcanism plays an important role in transporting internal heat of planetary bodies to their surface. Therefore, volcanoes are a manifestation of the planet's past and present internal dynamics. Volcanic eruptions as well as caldera forming processes are the direct manifestation of complex interactions between the rising magma and the surrounding host rock in the crust of terrestrial planetary bodies. Attempts have been made to compare volcanic landforms throughout the solar system. Different stochastic models have been proposed to describe the temporal sequences of eruptions on individual or groups of volcanoes. However, comprehensive understanding of the physical mechanisms responsible for volcano formation and eruption and more specifically caldera formation remains elusive. In this work, we propose a scaling law to quantify the distribution of caldera sizes on Earth, Mars, Venus, and Io, as well as the distribution of calderas on Earth depending on their surrounding crustal properties. We also apply the same scaling analysis to the distribution of interevent times between eruptions for volcanoes that have the largest eruptive history as well as groups of volcanoes on Earth. We find that when rescaled with their respective sample averages, the distributions considered show a similar functional form. This result implies that similar processes are responsible for caldera formation throughout the solar system and for different crustal settings on Earth. This result emphasizes the importance of comparative planetology to understand planetary volcanism. Similarly, the processes responsible for volcanic eruptions are independent of the type of volcanism or geographical location.
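
    The rescaling step described above can be illustrated with a short sketch. The samples below are placeholders (random lognormal draws), not real caldera or eruption catalogues; the point is only that dividing each sample by its own mean lets distributions of very different absolute size be compared on a common axis.

    import numpy as np

    def rescaled_survival(sample):
        """Empirical survival function of a sample rescaled by its sample mean."""
        x = np.sort(np.asarray(sample, dtype=float)) / np.mean(sample)
        sf = 1.0 - np.arange(1, x.size + 1) / x.size
        return x, sf

    rng = np.random.default_rng(0)
    # Placeholder "caldera diameter" samples (km) for two bodies; the Mars sample is drawn five times larger.
    samples = {"Earth": rng.lognormal(1.5, 0.8, 200), "Mars": 5.0 * rng.lognormal(1.5, 0.8, 60)}

    for name, sample in samples.items():
        x, _ = rescaled_survival(sample)
        # If a common scaling law holds, the rescaled distributions collapse onto one
        # functional form despite the factor-of-five difference in absolute size.
        print(name, "rescaled median:", round(float(np.median(x)), 2))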

  13. Lack of replication of thirteen single-nucleotide polymorphisms implicated in Parkinson’s disease: a large-scale international study

    Science.gov (United States)

    Elbaz, Alexis; Nelson, Lorene M; Payami, Haydeh; Ioannidis, John P A; Fiske, Brian K; Annesi, Grazia; Belin, Andrea Carmine; Factor, Stewart A; Ferrarese, Carlo; Hadjigeorgiou, Georgios M; Higgins, Donald S; Kawakami, Hideshi; Krüger, Rejko; Marder, Karen S; Mayeux, Richard P; Mellick, George D; Nutt, John G; Ritz, Beate; Samii, Ali; Tanner, Caroline M; Van Broeckhoven, Christine; Van Den Eeden, Stephen K; Wirdefeldt, Karin; Zabetian, Cyrus P; Dehem, Marie; Montimurro, Jennifer S; Southwick, Audrey; Myers, Richard M; Trikalinos, Thomas A

    2013-01-01

    Background: A genome-wide association study identified 13 single-nucleotide polymorphisms (SNPs) significantly associated with Parkinson’s disease. Small-scale replication studies were largely non-confirmatory, but a meta-analysis that included data from the original study could not exclude all SNP associations, leaving the relevance of several markers uncertain. Methods: Investigators from three Michael J Fox Foundation for Parkinson’s Research-funded genetics consortia—comprising 14 teams—contributed DNA samples from 5526 patients with Parkinson’s disease and 6682 controls, which were genotyped for the 13 SNPs. Most (88%) participants were of white, non-Hispanic descent. We assessed log-additive genetic effects using fixed and random effects models stratified by team and ethnic origin, and tested for heterogeneity across strata. A meta-analysis was undertaken that incorporated data from the original genome-wide study as well as subsequent replication studies. Findings: In fixed and random-effects models no associations with any of the 13 SNPs were identified (odds ratios 0.89 to 1.09). Heterogeneity between studies and between ethnic groups was low for all SNPs. Subgroup analyses by age at study entry, ethnic origin, sex, and family history did not show any consistent associations. In our meta-analysis, no SNP showed significant association (summary odds ratios 0.95 to 1.08); there was little heterogeneity except for SNP rs7520966. Interpretation: Our results do not lend support to the finding that the 13 SNPs reported in the original genome-wide association study are genetic susceptibility factors for Parkinson’s disease. PMID:17052658
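
    The per-SNP summary odds ratios quoted above are the kind of quantity produced by inverse-variance pooling of per-study log odds ratios. The sketch below shows the fixed-effect version with placeholder numbers; it illustrates the general method, not the consortium's analysis code or data.

    import math

    # (odds ratio, standard error of the log OR) for one SNP across three hypothetical studies
    studies = [(1.05, 0.08), (0.97, 0.06), (1.02, 0.10)]

    weights = [1.0 / se ** 2 for _, se in studies]                    # inverse-variance weights
    log_ors = [math.log(or_) for or_, _ in studies]
    pooled = sum(w * b for w, b in zip(weights, log_ors)) / sum(weights)
    se_pooled = math.sqrt(1.0 / sum(weights))

    lo = math.exp(pooled - 1.96 * se_pooled)
    hi = math.exp(pooled + 1.96 * se_pooled)
    print(f"pooled OR = {math.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f})")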

  14. Planetary-Scale Inertio Gravity Waves in the Numerical Spectral Model

    Science.gov (United States)

    Mayr, H. G.; Mengel, J. R.; Talaat, E. R.; Porter, H. S.

    2004-01-01

    In the polar region of the upper mesosphere, horizontal wind oscillations have been observed with periods around 10 hours. Waves with such a period are generated in our Numerical Spectral Model (NSM), and they are identified as planetary-scale inertio gravity waves (IGW). These IGWs have periods between 9 and 11 hours and appear above 60 km in the zonal mean (m = 0), as well as in zonal wavenumbers m = 1 to 4. The waves can propagate eastward and westward and have vertical wavelengths around 25 km. The amplitudes in the wind field are typically between 10 and 20 m/s and can reach 30 m/s in the westward propagating component for m = 1 at the poles. In the temperature perturbations, the wave amplitudes above 100 km are typically 5 K and as large as 10 K for m = 0 at the poles. The IGWs are intermittent but reveal systematic seasonal variations, with the largest amplitudes occurring generally in late winter and spring. In the NSM, the IGWs are generated like the planetary waves (PWs); they are apparently produced by the instabilities that arise in the zonal mean circulation. Relative to the PWs, however, the IGWs propagate zonally with much larger velocities, such that they are not affected much by interactions with the background zonal winds. Since the IGWs can propagate through the mesosphere without much interaction, except for viscous dissipation, one should then expect that they reach the thermosphere with significant and measurable amplitudes.

  15. Properties of internal planetary-scale inertio gravity waves in the mesosphere

    Directory of Open Access Journals (Sweden)

    H. G. Mayr

    2004-11-01

    Full Text Available At high latitudes in the upper mesosphere, horizontal wind oscillations have been observed with periods around 10 h. Waves with such a period are generated in our Numerical Spectral Model (NSM), and they are identified as planetary-scale inertio gravity waves (IGW). These IGWs have periods between 9 and 11 h and appear above 60 km in the zonal mean (m=0), as well as in m=1 to 4, propagating eastward and westward. Under the influence of the Coriolis force, the amplitudes of the waves propagating westward are larger at high latitudes than those propagating eastward. The waves grow in magnitude at least up to about 100 km and have vertical wavelengths around 25 km. Applying a running window of 15 days for spectral analysis, the amplitudes in the wind field are typically between 10 and 20 m/s and can reach 30 m/s in the westward propagating component for m=1 at the poles. In the temperature perturbations, the wave amplitudes above 100 km are typically 5 K and as large as 10 K for m=0 at the poles. The IGWs are intermittent but reveal systematic seasonal variations, with the largest amplitudes occurring generally in late winter and spring. Numerical experiments show that such waves are also generated without excitation of the migrating tides. The amplitudes and periods then are similar, indicating that the tides are not essential to generate the waves. However, the seasonal variations without tides are significantly different, which leads to the conclusion that nonlinear interactions between the semidiurnal tide and planetary waves must contribute to the excitation of the IGWs. Directly or indirectly through the planetary waves, the IGWs are apparently excited by the instabilities that arise in the zonal mean circulation. When the solar heating is turned off for m=0, both the PWs and IGWs essentially disappear. That the IGWs and PWs have common roots in their excitation mechanism is also indicated by the striking similarity of their seasonal variations in the…

  16. On the distance scale of planetary nebulae and white dwarf birth rates

    International Nuclear Information System (INIS)

    Weidemann, V.

    1977-01-01

    Arguments are presented which favor an increase of the distance scale of planetary nebulae by 30% compared to the Seaton-Webster scale. The consequences for evolutionary tracks, PN and white dwarf relations, and birth rates are discussed. It is concluded that, contrary to Smith Jr. (1976), white dwarf birth rates have not been underestimated, and that the proposed change in the distance scale of PN brings white dwarf and PN birth rates into almost complete agreement. (orig.) [de]

  17. An ecological compass for planetary engineering.

    Science.gov (United States)

    Haqq-Misra, Jacob

    2012-10-01

    Proposals to address present-day global warming through the large-scale application of technology to the climate system, known as geoengineering, raise questions of environmental ethics relevant to the broader issue of planetary engineering. These questions have also arisen in the scientific literature as discussions of how to terraform a planet such as Mars or Venus in order to make it more Earth-like and habitable. Here we draw on insights from terraforming and environmental ethics to develop a two-axis comparative tool for ethical frameworks that considers the intrinsic or instrumental value placed upon organisms, environments, planetary systems, or space. We apply this analysis to the realm of planetary engineering, such as terraforming on Mars or geoengineering on present-day Earth, as well as to questions of planetary protection and space exploration.

  18. Titius-Bode law and the possibility of recent large-scale evolution in the solar system

    International Nuclear Information System (INIS)

    Nieto, M.M.

    1974-01-01

    Although it is by no means clear that the Titius-Bode law of planetary distances is indeed a "law" (even though there are enticing indications), it is proposed that if one assumes that the law is a "law" and that the planets obey it, then this argues against recent large-scale evolution in the solar system. Put another way: one can believe in the Titius-Bode law or in recent large-scale evolution or in neither of them. But it appears difficult to believe in both of them.

  19. Chromosome Replication in Escherichia coli: Life on the Scales

    Science.gov (United States)

    Norris, Vic; Amar, Patrick

    2012-01-01

    At all levels of Life, systems evolve on the 'scales of equilibria'. At the level of bacteria, the individual cell must favor one of two opposing strategies and either take risks to grow or avoid risks to survive. It has been proposed in the Dualism hypothesis that the growth and survival strategies depend on non-equilibrium and equilibrium hyperstructures, respectively. It has been further proposed that the cell cycle itself is the way cells manage to balance the ratios of these types of hyperstructure so as to achieve the compromise solution of living on the two scales. Here, we attempt to re-interpret a major event, the initiation of chromosome replication in Escherichia coli, in the light of scales of equilibria. This entails thinking in terms of hyperstructures as responsible for intensity sensing and quantity sensing and how this sensing might help explain the role of the DnaA protein in initiation of replication. We outline experiments and an automaton approach to the cell cycle that should test and refine the scales concept. PMID:25371267

  20. Secure File Allocation and Caching in Large-scale Distributed Systems

    DEFF Research Database (Denmark)

    Di Mauro, Alessio; Mei, Alessandro; Jajodia, Sushil

    2012-01-01

    In this paper, we present a file allocation and caching scheme that guarantees high assurance, availability, and load balancing in a large-scale distributed file system that can support dynamic updates of authorization policies. The scheme uses fragmentation and replication to store files with high security requirements in a system composed of a majority of low-security servers. We develop mechanisms to fragment files, to allocate them into multiple servers, and to cache them as close as possible to their readers while preserving the security requirement of the files, providing load-balancing, and reducing delay of read operations. The system offers a trade-off between performance and security that is dynamically tunable according to the current level of threat. We validate our mechanisms with extensive simulations in an Internet-like network.
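
    A toy sketch of the fragmentation-plus-replication idea follows. The placement policy (replicas on distinct, currently least-loaded servers), the function names and all values are illustrative assumptions, not the paper's actual allocation or caching algorithm.

    import heapq

    def fragment(data: bytes, n_fragments: int) -> list:
        """Split data into n_fragments roughly equal pieces."""
        size = -(-len(data) // n_fragments)          # ceiling division
        return [data[i:i + size] for i in range(0, len(data), size)]

    def place_replicas(fragments, servers, replication=2):
        """Assign each fragment's replicas to distinct, currently least-loaded servers."""
        load = {s: 0 for s in servers}
        placement = {}
        for i, frag in enumerate(fragments):
            chosen = heapq.nsmallest(replication, servers, key=lambda s: load[s])
            for s in chosen:
                load[s] += len(frag)
            placement[i] = chosen
        return placement, load

    frags = fragment(b"x" * 10_000, n_fragments=4)
    placement, load = place_replicas(frags, servers=["s1", "s2", "s3", "s4", "s5"])
    print(placement)   # fragment index -> servers holding its replicas
    print(load)        # bytes stored per server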

  1. From Planetary Boundaries to national fair shares of the global safe operating space - How can the scales be bridged?

    Science.gov (United States)

    Häyhä, Tiina; Cornell, Sarah; Lucas, Paul; van Vuuren, Detlef; Hoff, Holger

    2016-04-01

    The planetary boundaries framework proposes precautionary quantitative global limits to the anthropogenic perturbation of crucial Earth system processes. In this way, it marks out a planetary 'safe operating space' for human activities. However, decisions regarding resource use and emissions are mostly made at much smaller scales, typically by (sub-)national and regional governments, businesses, and other local actors. To operationalize the planetary boundaries, they need to be translated into and aligned with targets that are relevant at these smaller scales. In this paper, we develop a framework that addresses the three dimensions of bridging across scales (biophysical, socio-economic and ethical) to provide a consistent, universally applicable approach for translating the planetary boundaries into national-level, context-specific and fair shares of the safe operating space. We discuss our findings in the context of previous studies and their implications for future analyses and policymaking. In this way, we help link the planetary boundaries framework to widely applied operational and policy concepts for more robust strong-sustainability decision-making.
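
    One of the simplest allocation rules discussed in this literature, equal per-capita sharing, can be written down directly. The budget and population figures below are placeholders, and this rule is only one of several sharing principles such studies weigh, not the paper's prescription.

    # Equal per-capita downscaling of a global budget to national shares (illustrative only).
    global_budget = 1000.0                       # e.g. a remaining global CO2 budget in Gt (assumed)
    world_population = 8.0e9

    national_population = {"Country A": 50e6, "Country B": 1.4e9}   # hypothetical countries

    for country, pop in national_population.items():
        share = global_budget * pop / world_population
        print(f"{country}: {share:.1f} Gt under equal per-capita sharing")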

  2. Electrochemically replicated smooth aluminum foils for anodic alumina nanochannel arrays

    International Nuclear Information System (INIS)

    Biring, Sajal; Tsai, K-T; Sur, Ujjal Kumar; Wang, Y-L

    2008-01-01

    A fast electrochemical replication technique has been developed to fabricate large-scale ultra-smooth aluminum foils by exploiting readily available large-scale smooth silicon wafers as the masters. Since the adhesion of aluminum on silicon depends on the time of surface pretreatment in water, it is possible either to detach the replicated aluminum from the silicon master without damaging the replica or the master, or to integrate the aluminum film with the silicon substrate. Replicated ultra-smooth aluminum foils are used for the growth of both self-organized and lithographically guided long-range-ordered arrays of anodic alumina nanochannels without any polishing pretreatment.

  3. The role of nonlinear self-interaction in the dynamics of planetary-scale atmospheric fluctuations

    International Nuclear Information System (INIS)

    Saffioti, C; Malguzzi, P; Speranza, A

    2016-01-01

    A central role in the general circulation of the atmosphere is played by planetary-scale inertial fluctuations with zonal wavenumber in the range k = 1–4. Geopotential variance in this range is markedly non-Gaussian and a great fraction of it is non-propagating, in contrast with the normal distribution of amplitudes and the basically propagating character of fluctuations in the baroclinic range (3 < k < 15). While a wave dispersion relationship can be identified in the baroclinic range, no clear relationship between time and space scales emerges in the ultra-long regime (k < 5, period > 10 days). We investigate the hypothesis that nonlinear self-interaction of planetary waves influences the mobility (and, therefore, the dispersion) of ultra-long planetary fluctuations. By means of a perturbation expansion of the barotropic vorticity equation we derive a minimal analytic description of the impact of self-nonlinearity on mobility and we show that this is responsible for a correction term to phase speed, with the prevalent effect of slowing down the propagation of waves. The intensity of nonlinear self-interaction is shown to increase with the complexity of the flow, depending on both its zonal and meridional modulations. Reanalysis data of geopotential height and zonal wind are analysed in order to test the effect of self-nonlinearity on observed planetary flows. (paper)

  4. Extended general relativity: Large-scale antigravity and short-scale gravity with ω=-1 from five-dimensional vacuum

    International Nuclear Information System (INIS)

    Madriz Aguilar, Jose Edgar; Bellini, Mauricio

    2009-01-01

    Considering a five-dimensional (5D) Riemannian spacetime with a particular stationary Ricci-flat metric, we obtain in the framework of the induced matter theory an effective 4D static and spherically symmetric metric which gives us ordinary gravitational solutions on small (planetary and astrophysical) scales, but repulsive (antigravitational) forces on very large (cosmological) scales with ω=-1. Our approach is a unified way to describe dark energy, dark matter and ordinary matter. We illustrate the theory with two examples, the solar system and the great attractor. From the geometrical point of view, these results follow from the assumption that there exists a confining force that makes it possible for test particles to move on a given 4D hypersurface.

  5. Extended general relativity: Large-scale antigravity and short-scale gravity with ω=-1 from five-dimensional vacuum

    Science.gov (United States)

    Madriz Aguilar, José Edgar; Bellini, Mauricio

    2009-08-01

    Considering a five-dimensional (5D) Riemannian spacetime with a particular stationary Ricci-flat metric, we obtain in the framework of the induced matter theory an effective 4D static and spherically symmetric metric which gives us ordinary gravitational solutions on small (planetary and astrophysical) scales, but repulsive (antigravitational) forces on very large (cosmological) scales with ω=-1. Our approach is a unified way to describe dark energy, dark matter and ordinary matter. We illustrate the theory with two examples, the solar system and the great attractor. From the geometrical point of view, these results follow from the assumption that there exists a confining force that makes it possible for test particles to move on a given 4D hypersurface.

  6. Extended general relativity: Large-scale antigravity and short-scale gravity with ω=-1 from five-dimensional vacuum

    Energy Technology Data Exchange (ETDEWEB)

    Madriz Aguilar, Jose Edgar [Instituto de Fisica de la Universidad de Guanajuato, C.P. 37150, Leon Guanajuato (Mexico); Departamento de Fisica, Facultad de Ciencias Exactas y Naturales, Universidad Nacional de Mar del Plata, Funes 3350, C.P. 7600, Mar del Plata (Argentina)], E-mail: madriz@mdp.edu.ar; Bellini, Mauricio [Departamento de Fisica, Facultad de Ciencias Exactas y Naturales, Universidad Nacional de Mar del Plata, Funes 3350, C.P. 7600, Mar del Plata (Argentina); Consejo Nacional de Investigaciones Cientificas y Tecnicas (CONICET) (Argentina)], E-mail: mbellini@mdp.edu.ar

    2009-08-31

    Considering a five-dimensional (5D) Riemannian spacetime with a particular stationary Ricci-flat metric, we obtain in the framework of the induced matter theory an effective 4D static and spherically symmetric metric which gives us ordinary gravitational solutions on small (planetary and astrophysical) scales, but repulsive (antigravitational) forces on very large (cosmological) scales with ω=-1. Our approach is a unified way to describe dark energy, dark matter and ordinary matter. We illustrate the theory with two examples, the solar system and the great attractor. From the geometrical point of view, these results follow from the assumption that there exists a confining force that makes it possible for test particles to move on a given 4D hypersurface.

  7. Large-proportional shrunken bio-replication of shark skin based on UV-curing shrinkage

    International Nuclear Information System (INIS)

    Chen, Huawei; Che, Da; Zhang, Xin; Yue, Yue; Zhang, Deyuan

    2015-01-01

    The shark skin effect has attracted worldwide attention because of its superior drag reduction. As the product of natural selection, the maximum drag reduction of shark skin is found in its normal living environment. Large-proportional shrinkage of shark skin morphology is greatly anticipated for its adaptation to faster fluid flow. One novel approach, large-proportional shrunken bio-replication, is proposed as a method to adjust the optimal drag reduction region of shark skin based on the shrinkage of UV-cured material. The shark skin is taken as a replica template to allow large-proportional shrinking in the drag reduction morphology by taking advantage of the shrinkage of UV-curable material. The accuracy of the large-proportional shrunken bio-replication approach is verified by a comparison between original and shrunken bio-replicated shark skin, which shows that the shrinking ratio can reach 23% and the bio-replication accuracy is higher than 95%. In addition, the translation of the optimum drag reduction peak of natural surface function to various applications and environments is proved by drag reduction experiments. (technical note)

  8. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so installation of large-scale hydrogen production plants will be needed. In this context, development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and center of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state of the art of the electrolysis modules currently available was made. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the hydrogen production cost by large-scale electrolysis was evaluated. (authors)

  9. Influence of large-scale zonal flows on the evolution of stellar and planetary magnetic fields

    Science.gov (United States)

    Petitdemange, Ludovic; Schrinner, Martin; Dormy, Emmanuel; ENS Collaboration

    2011-10-01

    Zonal flows and magnetic fields are present in various objects such as accretion discs, stars and planets. Observations show a huge variety of stellar and planetary magnetic fields. Of particular interest is the understanding of cyclic field variations, as known from the Sun. They are often explained by an important Ω-effect, i.e., by the stretching of field lines because of strong differential rotation. We computed the dynamo coefficients for an oscillatory dynamo model with the help of the test-field method. We argue that this model is of α²Ω-type and that here the Ω-effect alone is not responsible for its cyclic time variation. More general conditions which lead to dynamo waves in global direct numerical simulations are presented. Zonal flows driven by convection in planetary interiors may lead to secondary instabilities. We showed that a simple, modified version of the MagnetoRotational Instability, the MS-MRI, can develop in planetary interiors. The weak shear yields an instability by its constructive interaction with the much larger rotation rate of planets. We present results from 3D simulations and show that 3D MS-MRI modes can generate wave patterns at the surface of the spherical numerical domain.

  10. Three Conceptual Replication Studies in Group Theory

    Science.gov (United States)

    Melhuish, Kathleen

    2018-01-01

    Many studies in mathematics education research occur with a nonrepresentative sample and are never replicated. To challenge this paradigm, I designed a large-scale study evaluating student conceptions in group theory that surveyed a national, representative sample of students. By replicating questions previously used to build theory around student…

  11. Laying a Solid Foundation: Strategies for Effective Program Replication

    Science.gov (United States)

    Summerville, Geri

    2009-01-01

    The replication of proven social programs is a cost-effective and efficient way to achieve large-scale, positive social change. Yet there has been little guidance available about how to approach program replication and limited development of systems--at local, state or federal levels--to support replication efforts. "Laying a Solid Foundation:…

  12. Phosphorylation of Large T Antigen Regulates Merkel Cell Polyomavirus Replication

    International Nuclear Information System (INIS)

    Diaz, Jason; Wang, Xin; Tsang, Sabrina H.; Jiao, Jing; You, Jianxin

    2014-01-01

    Merkel Cell Polyomavirus (MCPyV) was recently discovered as a novel human polyomavirus that is associated with ~80% of Merkel Cell Carcinomas. The Large Tumor antigen (LT) is an early viral protein which has a variety of functions, including manipulation of the cell cycle and initiating viral DNA replication. Phosphorylation plays a critical regulatory role for polyomavirus LT proteins, but no investigation of MCPyV LT phosphorylation has been performed to date. In this report mass spectrometry analysis reveals three unique phosphorylation sites: T271, T297 and T299. In vivo replication assays confirm that phosphorylation of T271 does not play a role in viral replication, while modification at T297 and T299 have dramatic and opposing effects on LT’s ability to initiate replication from the viral origin. We test these mutants for their ability to bind, unwind, and act as a functional helicase at the viral origin. These studies provide a framework for understanding how phosphorylation of LT may dynamically regulate viral replication. Although the natural host cell of MCPyV has not yet been established, this work provides a foundation for understanding how LT activity is regulated and provides tools for better exploring this regulation in both natural host cells and Merkel cells

  13. Phosphorylation of Large T Antigen Regulates Merkel Cell Polyomavirus Replication

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, Jason; Wang, Xin; Tsang, Sabrina H. [Department of Microbiology, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA 19104 (United States); Jiao, Jing [Department of Pathology and Laboratory Medicine, Children’s Hospital of Philadelphia, Philadelphia, PA 19104 (United States); You, Jianxin, E-mail: jianyou@mail.med.upenn.edu [Department of Microbiology, University of Pennsylvania Perelman School of Medicine, Philadelphia, PA 19104 (United States)

    2014-07-08

    Merkel Cell Polyomavirus (MCPyV) was recently discovered as a novel human polyomavirus that is associated with ~80% of Merkel Cell Carcinomas. The Large Tumor antigen (LT) is an early viral protein which has a variety of functions, including manipulation of the cell cycle and initiating viral DNA replication. Phosphorylation plays a critical regulatory role for polyomavirus LT proteins, but no investigation of MCPyV LT phosphorylation has been performed to date. In this report mass spectrometry analysis reveals three unique phosphorylation sites: T271, T297 and T299. In vivo replication assays confirm that phosphorylation of T271 does not play a role in viral replication, while modification at T297 and T299 have dramatic and opposing effects on LT’s ability to initiate replication from the viral origin. We test these mutants for their ability to bind, unwind, and act as a functional helicase at the viral origin. These studies provide a framework for understanding how phosphorylation of LT may dynamically regulate viral replication. Although the natural host cell of MCPyV has not yet been established, this work provides a foundation for understanding how LT activity is regulated and provides tools for better exploring this regulation in both natural host cells and Merkel cells.

  14. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    DEFF Research Database (Denmark)

    Jensen, Tue Vissing; Pinson, Pierre

    2017-01-01

    …we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven… …to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  15. Network-scale spatial and temporal variation in Chinook salmon (Oncorhynchus tshawytscha) redd distributions: patterns inferred from spatially continuous replicate surveys

    Science.gov (United States)

    Daniel J. Isaak; Russell F. Thurow

    2006-01-01

    Spatially continuous sampling designs, when temporally replicated, provide analytical flexibility and are unmatched in their ability to provide a dynamic system view. We have compiled such a data set by georeferencing the network-scale distribution of Chinook salmon (Oncorhynchus tshawytscha) redds across a large wilderness basin (7330 km²) in...

  16. Planetary Magnetism

    Science.gov (United States)

    Connerney, J. E. P.

    2007-01-01

    The chapter on Planetary Magnetism by Connerney describes the magnetic fields of the planets, from Mercury to Neptune, including the large satellites (Moon, Ganymede) that have or once had active dynamos. The chapter describes the spacecraft missions and observations that, along with select remote observations, form the basis of our knowledge of planetary magnetic fields. Connerney describes the methods of analysis used to characterize planetary magnetic fields, and the models used to represent the main field (due to dynamo action in the planet's interior) and/or remnant magnetic fields locked in the planet's crust, where appropriate. These observations provide valuable insights into dynamo generation of magnetic fields, the structure and composition of planetary interiors, and the evolution of planets.

  17. Influence of grid aspect ratio on planetary boundary layer turbulence in large-eddy simulations

    Directory of Open Access Journals (Sweden)

    S. Nishizawa

    2015-10-01

    Full Text Available We examine the influence of the grid aspect ratio of horizontal to vertical grid spacing on turbulence in the planetary boundary layer (PBL in a large-eddy simulation (LES. In order to clarify and distinguish them from other artificial effects caused by numerical schemes, we used a fully compressible meteorological LES model with a fully explicit scheme of temporal integration. The influences are investigated with a series of sensitivity tests with parameter sweeps of spatial resolution and grid aspect ratio. We confirmed that the mixing length of the eddy viscosity and diffusion due to sub-grid-scale turbulence plays an essential role in reproducing the theoretical −5/3 slope of the energy spectrum. If we define the filter length in LES modeling based on consideration of the numerical scheme, and introduce a corrective factor for the grid aspect ratio into the mixing length, the theoretical slope of the energy spectrum can be obtained; otherwise, spurious energy piling appears at high wave numbers. We also found that the grid aspect ratio has influence on the turbulent statistics, especially the skewness of the vertical velocity near the top of the PBL, which becomes spuriously large with large aspect ratio, even if a reasonable spectrum is obtained.
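
    The aspect-ratio correction to the mixing length can be sketched as below. The correction factor used here is the widely cited Scotti-Meneveau-Lilly form for anisotropic grids; whether the paper's model uses exactly this factor, along with the function names, the Smagorinsky constant cs=0.18 and the example grid spacings, is an assumption of this sketch.

    import math

    def filter_length(dx, dy, dz):
        """Geometric-mean filter width with an anisotropy correction factor."""
        delta = (dx * dy * dz) ** (1.0 / 3.0)
        # a1 <= a2 <= 1 are the two smaller grid spacings divided by the largest one
        a1, a2, _ = sorted(d / max(dx, dy, dz) for d in (dx, dy, dz))
        f = math.cosh(math.sqrt((4.0 / 27.0) * (math.log(a1) ** 2
                                                - math.log(a1) * math.log(a2)
                                                + math.log(a2) ** 2)))
        return f * delta

    def smagorinsky_length(dx, dy, dz, cs=0.18):
        """Mixing length l = Cs * Delta entering the sub-grid-scale eddy viscosity."""
        return cs * filter_length(dx, dy, dz)

    # Example: horizontal spacing eight times the vertical spacing (grid aspect ratio 8).
    print(smagorinsky_length(dx=80.0, dy=80.0, dz=10.0))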

  18. Establishing a coherent and replicable measurement model of the Edinburgh Postnatal Depression Scale.

    Science.gov (United States)

    Martin, Colin R; Redshaw, Maggie

    2018-06-01

    The 10-item Edinburgh Postnatal Depression Scale (EPDS) is an established screening tool for postnatal depression. Inconsistent findings in factor structure and replication difficulties have limited the scope of development of the measure as a multi-dimensional tool. The current investigation sought to robustly determine the underlying factor structure of the EPDS and the replicability and stability of the most plausible model identified. A between-subjects design was used. EPDS data were collected postpartum from two independent cohorts using identical data capture methods. Datasets were examined with confirmatory factor analysis, model invariance testing and systematic evaluation of relational and internal aspects of the measure. Participants were two samples of postpartum women in England assessed at three months (n = 245) and six months (n = 217). The findings showed a three-factor seven-item model of the EPDS offered an excellent fit to the data, and was observed to be replicable in both datasets and invariant as a function of time point of assessment. Some EPDS sub-scale scores were significantly higher at six months. The EPDS is multi-dimensional and a robust measurement model comprises three factors that are replicable. The potential utility of the sub-scale components identified requires further research to identify a role in contemporary screening practice. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.

  19. Replication of engine block cylinder bridge microstructure and mechanical properties with lab scale 319 Al alloy billet castings

    International Nuclear Information System (INIS)

    Lombardi, A.; D'Elia, F.; Ravindran, C.; MacKay, R.

    2014-01-01

    In recent years, aluminum alloy gasoline engine blocks have in large part successfully replaced nodular cast iron engine blocks, resulting in improved vehicle fuel efficiency. However, because of the inadequate wear resistance properties of hypoeutectic Al–Si alloys, gray iron cylinder liners are required. These liners cause the development of large tensile residual stress along the cylinder bores and necessitate the maximization of mechanical properties in this region to prevent premature engine failure. The aim of this study was to replicate the engine cylinder bridge microstructure and mechanical properties following TSR treatment (which removes the sand binder to enable easy casting retrieval) using lab scale billet castings of the same alloy composition with varying cooling rates. Comparisons in microstructure between the engine block and the billet castings were carried out using optical and scanning electron microscopy, while mechanical properties were assessed using tensile testing. The results suggest that the microstructure at the top and middle of the engine block cylinder bridge was successfully replicated by the billet castings. However, the microstructure at the bottom of the cylinder was not completely replicated due to variations in secondary phase morphology and distribution. The successful replication of engine block microstructure will enable the future optimization of heat treatment parameters. - Highlights: • A method to replicate engine block microstructure was developed. • Billet castings will allow cost effective optimization of heat treatment process. • The replication of microstructure in the cylinder region was mostly successful. • Porosity was more clustered in the billet castings compared to the engine block. • Mechanical properties were lower in billet castings due to porosity and inclusions

  20. Replication of engine block cylinder bridge microstructure and mechanical properties with lab scale 319 Al alloy billet castings

    Energy Technology Data Exchange (ETDEWEB)

    Lombardi, A., E-mail: a2lombar@ryerson.ca [Centre for Near-net-shape Processing of Materials, Ryerson University, 101 Gerrard Street East, Toronto, Ontario M5B2K3 (Canada); D'Elia, F.; Ravindran, C. [Centre for Near-net-shape Processing of Materials, Ryerson University, 101 Gerrard Street East, Toronto, Ontario M5B2K3 (Canada); MacKay, R. [Nemak of Canada Corporation, 4600 G.N. Booth Drive, Windsor, Ontario N9C4G8 (Canada)

    2014-01-15

    In recent years, aluminum alloy gasoline engine blocks have in large part successfully replaced nodular cast iron engine blocks, resulting in improved vehicle fuel efficiency. However, because of the inadequate wear resistance properties of hypoeutectic Al–Si alloys, gray iron cylinder liners are required. These liners cause the development of large tensile residual stress along the cylinder bores and necessitate the maximization of mechanical properties in this region to prevent premature engine failure. The aim of this study was to replicate the engine cylinder bridge microstructure and mechanical properties following TSR treatment (which removes the sand binder to enable easy casting retrieval) using lab scale billet castings of the same alloy composition with varying cooling rates. Comparisons in microstructure between the engine block and the billet castings were carried out using optical and scanning electron microscopy, while mechanical properties were assessed using tensile testing. The results suggest that the microstructure at the top and middle of the engine block cylinder bridge was successfully replicated by the billet castings. However, the microstructure at the bottom of the cylinder was not completely replicated due to variations in secondary phase morphology and distribution. The successful replication of engine block microstructure will enable the future optimization of heat treatment parameters. - Highlights: • A method to replicate engine block microstructure was developed. • Billet castings will allow cost effective optimization of heat treatment process. • The replication of microstructure in the cylinder region was mostly successful. • Porosity was more clustered in the billet castings compared to the engine block. • Mechanical properties were lower in billet castings due to porosity and inclusions.

  1. Adaptive scaling of reward in episodic memory: a replication study.

    Science.gov (United States)

    Mason, Alice; Ludwig, Casimir; Farrell, Simon

    2017-11-01

    Reward is thought to enhance episodic memory formation via dopaminergic consolidation. Bunzeck, Dayan, Dolan, and Duzel [(2010). A common mechanism for adaptive scaling of reward and novelty. Human Brain Mapping, 31, 1380-1394] provided functional magnetic resonance imaging (fMRI) and behavioural evidence that reward and episodic memory systems are sensitive to the contextual value of a reward (whether it is relatively higher or lower) as opposed to absolute value or prediction error. We carried out a direct replication of their behavioural study and did not replicate their finding that memory performance associated with reward follows this pattern of adaptive scaling. An effect of reward outcome was in the opposite direction to that in the original study, with lower reward outcomes leading to better memory than higher outcomes. There was a marginal effect of reward context, suggesting that expected value affected memory performance. We discuss the robustness of the reward-memory relationship to variations in reward context, and whether other reward-related factors have a more reliable influence on episodic memory.

  2. Large-Scale Flows and Magnetic Fields Produced by Rotating Convection in a Quasi-Geostrophic Model of Planetary Cores

    Science.gov (United States)

    Guervilly, C.; Cardin, P.

    2017-12-01

    Convection is the main heat transport process in the liquid cores of planets. The convective flows are thought to be turbulent and constrained by rotation (corresponding to high Reynolds numbers Re and low Rossby numbers Ro). Under these conditions, and in the absence of magnetic fields, the convective flows can produce coherent Reynolds stresses that drive persistent large-scale zonal flows. The formation of large-scale flows has crucial implications for the thermal evolution of planets and the generation of large-scale magnetic fields. In this work, we explore this problem with numerical simulations using a quasi-geostrophic approximation to model convective and zonal flows at Re ≈ 10^4 and Ro ≈ 10^-4 for Prandtl numbers relevant for liquid metals (Pr ≈ 0.1). The formation of intense multiple zonal jets strongly affects the convective heat transport, leading to the formation of a mean temperature staircase. We also study the generation of magnetic fields by the quasi-geostrophic flows at low magnetic Prandtl numbers.
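
    For orientation, the dimensionless numbers quoted above are simple ratios of physical scales. The short Python sketch below spells out their definitions; the input values are arbitrary illustrative numbers, not parameters taken from the study.

      # Definitions of the dimensionless numbers quoted in the abstract.
      # The input values below are arbitrary illustrative numbers, not the
      # study's parameters.

      def reynolds(U, L, nu):
          """Re = U * L / nu: inertial versus viscous forces."""
          return U * L / nu

      def rossby(U, L, omega):
          """Ro = U / (2 * omega * L): inertia versus the Coriolis force."""
          return U / (2.0 * omega * L)

      def prandtl(nu, kappa):
          """Pr = nu / kappa: momentum versus thermal diffusivity."""
          return nu / kappa

      if __name__ == "__main__":
          U, L = 1.0e-3, 1.0e5          # velocity (m/s) and length (m), assumed
          omega = 7.3e-5                # rotation rate (rad/s), Earth-like
          nu, kappa = 1.0e-6, 1.0e-5    # diffusivities (m^2/s), assumed
          print(f"Re ~ {reynolds(U, L, nu):.1e}")
          print(f"Ro ~ {rossby(U, L, omega):.1e}")
          print(f"Pr ~ {prandtl(nu, kappa):.1e}")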

  3. The PHD Domain of Np95 (mUHRF1) Is Involved in Large-Scale Reorganization of Pericentromeric Heterochromatin

    Science.gov (United States)

    Papait, Roberto; Pistore, Christian; Grazini, Ursula; Babbio, Federica; Cogliati, Sara; Pecoraro, Daniela; Brino, Laurent; Morand, Anne-Laure; Dechampesme, Anne-Marie; Spada, Fabio; Leonhardt, Heinrich; McBlane, Fraser; Oudet, Pierre

    2008-01-01

    Heterochromatic chromosomal regions undergo large-scale reorganization and progressively aggregate, forming chromocenters. These are dynamic structures that rapidly adapt to various stimuli that influence gene expression patterns, cell cycle progression, and differentiation. Np95-ICBP90 (m- and h-UHRF1) is a histone-binding protein expressed only in proliferating cells. During pericentromeric heterochromatin (PH) replication, Np95 specifically relocalizes to chromocenters where it highly concentrates in the replication factories that correspond to less compacted DNA. Np95 recruits HDAC and DNMT1 to PH and depletion of Np95 impairs PH replication. Here we show that Np95 causes large-scale modifications of chromocenters independently from the H3:K9 and H4:K20 trimethylation pathways, from the expression levels of HP1, from DNA methylation and from the cell cycle. The PHD domain is essential to induce this effect. The PHD domain is also required in vitro to increase access of a restriction enzyme to DNA packaged into nucleosomal arrays. We propose that the PHD domain of Np95-ICBP90 contributes to the opening and/or stabilization of dense chromocenter structures to support the recruitment of modifying enzymes, like HDAC and DNMT1, required for the replication and formation of PH. PMID:18508923

  4. Large-scale and synoptic meteorology in the south-east Pacific during the observations campaign VOCALS-REx in austral Spring 2008

    Directory of Open Access Journals (Sweden)

    T. Toniazzo

    2011-05-01

    We present a descriptive overview of the meteorology in the south-eastern subtropical Pacific (SEP) during the VOCALS-REx intensive observations campaign, which was carried out between October and November 2008. Based mainly on data from operational analyses, forecasts, reanalysis, and satellite observations, we focus on spatio-temporal scales from synoptic to planetary. A climatological context is given within which the specific conditions observed during the campaign are placed, with particular reference to the relationships between the large-scale and the regional circulations. The mean circulations associated with the diurnal breeze systems are also discussed. We then provide a summary of the day-to-day synoptic-scale circulation, air-parcel trajectories, and cloud cover in the SEP during VOCALS-REx. Three meteorologically distinct periods of time are identified and the large-scale causes for their different character are discussed. The first period was characterised by significant variability associated with synoptic-scale systems affecting the SEP, while the two subsequent phases were affected by planetary-scale disturbances with a slower evolution. The changes between the initial and later periods can be partly explained by the regular march of the annual cycle, but contributions from subseasonal variability and its teleconnections were important. Across the whole of the two months under consideration we find a significant correlation between the depth of the inversion-capped marine boundary layer (MBL) and the amount of low cloud in the area of study. We discuss this correlation and argue that, at least as a crude approximation, a typical scaling may be applied relating MBL and cloud properties to the large-scale parameters of SSTs and tropospheric temperatures. These results are consistent with previously found empirical relationships involving lower-tropospheric stability.

  5. Planetary Magnetism

    International Nuclear Information System (INIS)

    Russell, C.T.

    1980-01-01

    Planetary spacecraft have now probed the magnetic fields of all the terrestrial planets, the moon, Jupiter, and Saturn. These measurements reveal that dynamos are active in at least four of the planets (Mercury, the Earth, Jupiter, and Saturn), but that Venus and Mars appear to have at most only very weak planetary magnetic fields. The moon may once have possessed an internal dynamo, for the surface rocks are magnetized. The large satellites of the outer solar system are candidates for dynamo action in addition to the large planets themselves. Of these satellites, the one most likely to generate its own internal magnetic field is Io.

  6. Utilizing a scale model solar system project to visualize important planetary science concepts and develop technology and spatial reasoning skills

    Science.gov (United States)

    Kortenkamp, Stephen J.; Brock, Laci

    2016-10-01

    Scale model solar systems have been used for centuries to help educate young students and the public about the vastness of space and the relative sizes of objects. We have adapted the classic scale model solar system activity into a student-driven project for an undergraduate general education astronomy course at the University of Arizona. Students are challenged to construct and use their three dimensional models to demonstrate an understanding of numerous concepts in planetary science, including: 1) planetary obliquities, eccentricities, inclinations; 2) phases and eclipses; 3) planetary transits; 4) asteroid sizes, numbers, and distributions; 5) giant planet satellite and ring systems; 6) the Pluto system and Kuiper belt; 7) the extent of space travel by humans and robotic spacecraft; 8) the diversity of extrasolar planetary systems. Secondary objectives of the project allow students to develop better spatial reasoning skills and gain familiarity with technology such as Excel formulas, smart-phone photography, and audio/video editing. During our presentation we will distribute a formal description of the project and discuss our expectations of the students as well as present selected highlights from preliminary submissions.
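
    As a concrete illustration of the arithmetic behind such a project, the sketch below scales planet diameters and orbital distances by a single factor set by a chosen model Sun; the 30 cm model Sun and the rounded solar-system values are assumptions for illustration, not the course's actual numbers.

      # Scale-model arithmetic sketch: pick a model Sun diameter and scale
      # every size and orbital distance by the same factor. Real-world values
      # are approximate round numbers; the 30 cm model Sun is an assumption.
      SUN_DIAMETER_KM = 1.39e6
      MODEL_SUN_DIAMETER_M = 0.30
      SCALE = MODEL_SUN_DIAMETER_M / (SUN_DIAMETER_KM * 1000.0)

      AU_KM = 1.496e8
      bodies = {
          # name: (diameter in km, mean orbital distance in AU)
          "Earth":   (12_756, 1.0),
          "Jupiter": (142_984, 5.2),
          "Neptune": (49_528, 30.1),
      }

      for name, (diameter_km, dist_au) in bodies.items():
          model_diameter_mm = diameter_km * 1000.0 * SCALE * 1000.0
          model_distance_m = dist_au * AU_KM * 1000.0 * SCALE
          print(f"{name}: {model_diameter_mm:.1f} mm across, {model_distance_m:.0f} m away")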

  7. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large scale solar purchasing amongst potential large scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large scale active solar heating purchasing activity within the UK. (author)

  8. THE EFFECT OF LARGE-SCALE MAGNETIC TURBULENCE ON THE ACCELERATION OF ELECTRONS BY PERPENDICULAR COLLISIONLESS SHOCKS

    International Nuclear Information System (INIS)

    Guo Fan; Giacalone, Joe

    2010-01-01

    We study the physics of electron acceleration at collisionless shocks that move through a plasma containing large-scale magnetic fluctuations. We numerically integrate the trajectories of a large number of electrons, which are treated as test particles moving in the time-dependent electric and magnetic fields determined from two-dimensional hybrid simulations (kinetic ions and fluid electrons). The large-scale magnetic fluctuations affect the electrons in a number of ways and lead to efficient and rapid energization at the shock front. Since the electrons mainly follow along magnetic lines of force, the large-scale braiding of field lines in space allows the fast-moving electrons to cross the shock front several times, leading to efficient acceleration. Ripples in the shock front occurring at various scales will also contribute to the acceleration by mirroring the electrons. Our calculation shows that this process favors electron acceleration at perpendicular shocks. The current study is also helpful in understanding the injection problem for electron acceleration by collisionless shocks. It is also shown that the spatial distribution of energetic electrons is similar to in situ observations. The process may be important to our understanding of energetic electrons in planetary bow shocks and interplanetary shocks, and explaining herringbone structures seen in some type II solar radio bursts.

  9. HiQuant: Rapid Postquantification Analysis of Large-Scale MS-Generated Proteomics Data.

    Science.gov (United States)

    Bryan, Kenneth; Jarboui, Mohamed-Ali; Raso, Cinzia; Bernal-Llinares, Manuel; McCann, Brendan; Rauch, Jens; Boldt, Karsten; Lynn, David J

    2016-06-03

    Recent advances in mass-spectrometry-based proteomics are now facilitating ambitious large-scale investigations of the spatial and temporal dynamics of the proteome; however, the increasing size and complexity of these data sets is overwhelming current downstream computational methods, specifically those that support the postquantification analysis pipeline. Here we present HiQuant, a novel application that enables the design and execution of a postquantification workflow, including common data-processing steps, such as assay normalization and grouping, and experimental replicate quality control and statistical analysis. HiQuant also enables the interpretation of results generated from large-scale data sets by supporting interactive heatmap analysis and also the direct export to Cytoscape and Gephi, two leading network analysis platforms. HiQuant may be run via a user-friendly graphical interface and also supports complete one-touch automation via a command-line mode. We evaluate HiQuant's performance by analyzing a large-scale, complex interactome mapping data set and demonstrate a 200-fold improvement in the execution time over current methods. We also demonstrate HiQuant's general utility by analyzing proteome-wide quantification data generated from both a large-scale public tyrosine kinase siRNA knock-down study and an in-house investigation into the temporal dynamics of the KSR1 and KSR2 interactomes. Download HiQuant, sample data sets, and supporting documentation at http://hiquant.primesdb.eu .
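
    The post-quantification steps named above (normalization, replicate grouping, and statistical analysis) are generic; the sketch below illustrates one minimal version of such a workflow in Python. It is not the HiQuant API: the column names, the median normalization, and the Welch t-test are assumptions made for the example.

      # Generic post-quantification sketch (NOT the HiQuant API): median
      # normalization, replicate grouping, and a per-protein Welch t-test
      # between two assumed conditions. Column names are hypothetical.
      import numpy as np
      import pandas as pd
      from scipy import stats

      def normalize_median(df):
          """Divide each sample column by its median so samples are comparable."""
          return df / df.median(axis=0)

      def test_conditions(df, group_a, group_b):
          """Welch t-test per protein between two lists of replicate columns."""
          t, p = stats.ttest_ind(df[group_a], df[group_b], axis=1, equal_var=False)
          out = pd.DataFrame({
              "log2_fold_change": np.log2(df[group_a].mean(axis=1) /
                                          df[group_b].mean(axis=1)),
              "p_value": p,
          }, index=df.index)
          return out.sort_values("p_value")

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          proteins = [f"P{i:04d}" for i in range(100)]
          cols = ["ctrl_1", "ctrl_2", "ctrl_3", "treat_1", "treat_2", "treat_3"]
          data = pd.DataFrame(rng.lognormal(size=(100, 6)), index=proteins, columns=cols)
          norm = normalize_median(data)
          results = test_conditions(norm, ["ctrl_1", "ctrl_2", "ctrl_3"],
                                    ["treat_1", "treat_2", "treat_3"])
          print(results.head())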

  10. What Should Researchers Expect When They Replicate Studies? A Statistical View of Replicability in Psychological Science.

    Science.gov (United States)

    Patil, Prasad; Peng, Roger D; Leek, Jeffrey T

    2016-07-01

    A recent study of the replicability of key psychological findings is a major contribution toward understanding the human side of the scientific process. Despite the careful and nuanced analysis reported, the simple narrative disseminated by the mass, social, and scientific media was that in only 36% of the studies were the original results replicated. In the current study, however, we showed that 77% of the replication effect sizes reported were within a 95% prediction interval calculated using the original effect size. Our analysis suggests two critical issues in understanding replication of psychological studies. First, researchers' intuitive expectations for what a replication should show do not always match with statistical estimates of replication. Second, when the results of original studies are very imprecise, they create wide prediction intervals, and a broad range of replication effects that are consistent with the original estimates. This may lead to effects that replicate successfully, in that replication results are consistent with statistical expectations, but do not provide much information about the size (or existence) of the true effect. In this light, the results of the Reproducibility Project: Psychology can be viewed as statistically consistent with what one might expect when performing a large-scale replication experiment. © The Author(s) 2016.
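
    The key statistical idea (a prediction interval for a replication effect) can be written in a few lines. The sketch below uses Fisher's z transform for a correlation-type effect size; the sample sizes and effect size are invented for illustration and are not drawn from the Reproducibility Project data.

      # Illustrative 95% prediction interval for a replication correlation,
      # via Fisher's z transform. The numbers are hypothetical, not the
      # Reproducibility Project data.
      import math

      def replication_prediction_interval(r_orig, n_orig, n_rep):
          """Range in which a replication's correlation is expected to fall."""
          z = 0.5 * math.log((1.0 + r_orig) / (1.0 - r_orig))      # Fisher z
          se = math.sqrt(1.0 / (n_orig - 3) + 1.0 / (n_rep - 3))   # combined SE
          lo, hi = z - 1.96 * se, z + 1.96 * se                    # 95% level
          inv = lambda x: (math.exp(2.0 * x) - 1.0) / (math.exp(2.0 * x) + 1.0)
          return inv(lo), inv(hi)

      if __name__ == "__main__":
          # Hypothetical original study: r = 0.30, n = 40; replication n = 80.
          lo, hi = replication_prediction_interval(0.30, 40, 80)
          print(f"95% prediction interval for the replication r: [{lo:.2f}, {hi:.2f}]")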

  11. A fault diagnosis scheme for planetary gearboxes using adaptive multi-scale morphology filter and modified hierarchical permutation entropy

    Science.gov (United States)

    Li, Yongbo; Li, Guoyan; Yang, Yuantao; Liang, Xihui; Xu, Minqiang

    2018-05-01

    The fault diagnosis of planetary gearboxes is crucial to reduce the maintenance costs and economic losses. This paper proposes a novel fault diagnosis method based on adaptive multi-scale morphological filter (AMMF) and modified hierarchical permutation entropy (MHPE) to identify the different health conditions of planetary gearboxes. In this method, AMMF is firstly adopted to remove the fault-unrelated components and enhance the fault characteristics. Second, MHPE is utilized to extract the fault features from the denoised vibration signals. Third, Laplacian score (LS) approach is employed to refine the fault features. In the end, the obtained features are fed into the binary tree support vector machine (BT-SVM) to accomplish the fault pattern identification. The proposed method is numerically and experimentally demonstrated to be able to recognize the different fault categories of planetary gearboxes.
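
    To make one building block of the feature-extraction step concrete, the sketch below computes ordinary (single-scale) permutation entropy from a vibration-like signal. It is a generic textbook version, not the authors' modified hierarchical permutation entropy, and the test signals are synthetic.

      # Minimal permutation entropy sketch (the basic building block behind
      # hierarchical/multi-scale variants); not the paper's MHPE code.
      import math
      import numpy as np

      def permutation_entropy(signal, order=3, delay=1, normalize=True):
          """Shannon entropy of ordinal patterns of length `order` in `signal`."""
          x = np.asarray(signal, dtype=float)
          n_patterns = len(x) - (order - 1) * delay
          counts = {}
          for i in range(n_patterns):
              window = x[i:i + order * delay:delay]
              pattern = tuple(np.argsort(window))        # ordinal pattern
              counts[pattern] = counts.get(pattern, 0) + 1
          probs = np.array(list(counts.values()), dtype=float) / n_patterns
          h = -np.sum(probs * np.log2(probs))
          if normalize:
              h /= math.log2(math.factorial(order))      # maximum possible entropy
          return h

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          t = np.linspace(0.0, 10.0, 2000)
          regular = np.sin(2 * np.pi * 5 * t)                      # smooth signal
          noisy = regular + 0.8 * rng.standard_normal(t.size)      # fault-like noise
          print("PE regular:", round(permutation_entropy(regular), 3))
          print("PE noisy  :", round(permutation_entropy(noisy), 3))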

  12. The Effect of Large Scale Salinity Gradient on Langmuir Turbulence

    Science.gov (United States)

    Fan, Y.; Jarosz, E.; Yu, Z.; Jensen, T.; Sullivan, P. P.; Liang, J.

    2017-12-01

    Langmuir circulation (LC) is believed to be one of the leading-order causes of turbulent mixing in the upper ocean. It is important for momentum and heat exchange across the mixed layer (ML) and directly impacts the dynamics and thermodynamics in the upper ocean and lower atmosphere, including the vertical distributions of chemical, biological, optical, and acoustic properties. Based on Craik and Leibovich (1976) theory, large eddy simulation (LES) models have been developed to simulate LC in the upper ocean, yielding new insights that could not be obtained from field observations and turbulent closure models. Due to its high computational cost, LES models are usually limited to small domain sizes and cannot resolve large-scale flows. Furthermore, most LES models used in LC simulations use periodic boundary conditions in the horizontal direction, which assumes that the physical properties (i.e. temperature and salinity) and expected flow patterns in the area of interest are of a periodically repeating nature, so that the limited small LES domain is representative of the larger area. Using periodic boundary conditions can significantly reduce computational effort, and it is a good assumption for isotropic shear turbulence. However, LC is anisotropic (McWilliams et al 1997) and was observed to be modulated by crosswind tidal currents (Kukulka et al 2011). Using symmetrical domains, idealized LES studies also indicate LC could interact with oceanic fronts (Hamlington et al 2014) and standing internal waves (Chini and Leibovich, 2005). The present study expands our previous LES modeling investigations of Langmuir turbulence to real ocean conditions with large-scale environmental motion that features fresh water inflow into the study region. Large-scale gradient forcing is introduced to the NCAR LES model through scale separation analysis. The model is applied to a field observation in the Gulf of Mexico in July, 2016 when the measurement site was impacted by

  13. Planetary Geologic Mapping Handbook - 2009

    Science.gov (United States)

    Tanaka, K. L.; Skinner, J. A.; Hare, T. M.

    2009-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces (e.g., Varnes, 1974). Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962 (Hackman, 1962). Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete

  14. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024^3 grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic fields at the large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy flux, similar to SSD.

  15. Recovery of arrested replication forks by homologous recombination is error-prone.

    Directory of Open Access Journals (Sweden)

    Ismail Iraqui

    Homologous recombination is a universal mechanism that allows repair of DNA and provides support for DNA replication. Homologous recombination is therefore a major pathway that suppresses non-homology-mediated genome instability. Here, we report that recovery of impeded replication forks by homologous recombination is error-prone. Using a fork-arrest-based assay in fission yeast, we demonstrate that a single collapsed fork can cause mutations and large-scale genomic changes, including deletions and translocations. Fork-arrest-induced gross chromosomal rearrangements are mediated by inappropriate ectopic recombination events at the site of collapsed forks. Inverted repeats near the site of fork collapse stimulate large-scale genomic changes up to 1,500 times over spontaneous events. We also show that the high accuracy of DNA replication during S-phase is impaired by impediments to fork progression, since fork-arrest-induced mutation is due to erroneous DNA synthesis during recovery of replication forks. The mutations caused are small insertions/duplications between short tandem repeats (micro-homologies), indicative of replication slippage. Our data establish that collapsed forks, but not stalled forks, recovered by homologous recombination are prone to replication slippage. The inaccuracy of DNA synthesis does not rely on PCNA ubiquitination or trans-lesion-synthesis DNA polymerases, and it is not counteracted by mismatch repair. We propose that deletions/insertions, mediated by micro-homology, leading to copy number variations during replication stress may arise by progression of error-prone replication forks restarted by homologous recombination.

  16. Planetary Geologic Mapping Handbook - 2010. Appendix

    Science.gov (United States)

    Tanaka, K. L.; Skinner, J. A., Jr.; Hare, T. M.

    2010-01-01

    Geologic maps present, in an historical context, fundamental syntheses of interpretations of the materials, landforms, structures, and processes that characterize planetary surfaces and shallow subsurfaces. Such maps also provide a contextual framework for summarizing and evaluating thematic research for a given region or body. In planetary exploration, for example, geologic maps are used for specialized investigations such as targeting regions of interest for data collection and for characterizing sites for landed missions. Whereas most modern terrestrial geologic maps are constructed from regional views provided by remote sensing data and supplemented in detail by field-based observations and measurements, planetary maps have been largely based on analyses of orbital photography. For planetary bodies in particular, geologic maps commonly represent a snapshot of a surface, because they are based on available information at a time when new data are still being acquired. Thus the field of planetary geologic mapping has been evolving rapidly to embrace the use of new data and modern technology and to accommodate the growing needs of planetary exploration. Planetary geologic maps have been published by the U.S. Geological Survey (USGS) since 1962. Over this time, numerous maps of several planetary bodies have been prepared at a variety of scales and projections using the best available image and topographic bases. Early geologic map bases commonly consisted of hand-mosaicked photographs or airbrushed shaded-relief views and geologic linework was manually drafted using mylar bases and ink drafting pens. Map publishing required a tedious process of scribing, color peel-coat preparation, typesetting, and photo-laboratory work. Beginning in the 1990s, inexpensive computing, display capability and user-friendly illustration software allowed maps to be drawn using digital tools rather than pen and ink, and mylar bases became obsolete. Terrestrial geologic maps published by

  17. Replication assessment of surface texture at sub-micrometre scale

    DEFF Research Database (Denmark)

    Quagliotti, Danilo; Tosello, Guido; Hansen, Hans Nørgaard

    2017-01-01

    [2]. A replication process requires reproducing a master geometry by conveying it to a substrate material. It is typically induced by means of different energy sources (usually heat and force) and a direct physical contact between the master and the substrate. Furthermore, concepts of advanced......, because of the replication nature of molding processes, the required specifications for the manufacture of micro molded components must be ensured by means of a metrological approach to surface replication and dimensional control of both master geometry and replicated substrate [3]-[4]. Therefore...... replication was assessed by the replication fidelity, i.e., comparing the produced parts with the tool used to replicate the geometry. Furthermore, the uncertainty of the replication fidelity was achieved by propagating the uncertainties evaluated for both masters and replicas. Finally, despite the specimens...

  18. Luminosity function for planetary nebulae and the number of planetary nebulae in local group galaxies

    International Nuclear Information System (INIS)

    Jacoby, G.H.

    1980-01-01

    Identifications of 19 and 34 faint planetary nebulae have been made in the central regions of the SMC and LMC, respectively, using on-line/off-line filter photography at [O III] and Hα. The previously known brighter planetary nebulae in these fields, eight in both the SMC and the LMC, were also identified. On the basis of the ratio of the numbers of faint to bright planetary nebulae in these fields and the numbers of bright planetary nebulae in the surrounding fields, the total numbers of planetary nebulae in the SMC and LMC are estimated to be 285 ± 78 and 996 ± 253, respectively. Corrections have been applied to account for omissions due to crowding confusion in previous surveys, spatial and detectability incompleteness, and obscuration by dust. Equatorial coordinates and finding charts are presented for all the identified planetary nebulae. The coordinates have uncertainties smaller than 0.6 arcsec relative to nearby bright stars, thereby allowing acquisition of the planetary nebulae by blind offsetting. Monochromatic fluxes are derived photographically and used to determine the luminosity function for Magellanic Cloud planetary nebulae as faint as 6 mag below the brightest. The luminosity function is used to estimate the total numbers of planetary nebulae in eight Local Group galaxies in which only bright planetary nebulae have been identified. The derived luminosity-specific number of planetary nebulae is nearly constant for all eight galaxies, having a value of 6.1 × 10^-7 planetary nebulae per solar luminosity (L_sun^-1). The mass-specific number, based on the three galaxies with well-determined masses, is 2.1 × 10^-7 planetary nebulae per solar mass (M_sun^-1). With estimates for the luminosity and mass of our Galaxy, its total number of planetary nebulae is calculated to be 10,000 ± 4000, in support of the Cudworth distance scale.
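
    The final estimate is a one-line scaling: multiply the specific number of planetary nebulae by the galaxy's luminosity or mass. The sketch below uses the specific numbers quoted in the abstract together with assumed round values for the Galaxy's luminosity and mass.

      # Worked example of the scaling used above. The specific numbers are
      # those quoted in the abstract; the Galactic luminosity and mass are
      # assumed round values for illustration.
      PN_PER_LSUN = 6.1e-7     # planetary nebulae per solar luminosity
      PN_PER_MSUN = 2.1e-7     # planetary nebulae per solar mass

      milky_way_luminosity = 1.6e10   # L_sun (assumed)
      milky_way_mass = 5.0e10         # M_sun (assumed)

      print("N(PN) from luminosity:", round(PN_PER_LSUN * milky_way_luminosity))
      print("N(PN) from mass:      ", round(PN_PER_MSUN * milky_way_mass))
      # Both give of order 10^4, consistent with the quoted 10,000 +/- 4000.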

  19. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project

    Science.gov (United States)

    Ewers, Robert M.; Didham, Raphael K.; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D.; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L.; Turner, Edgar C.

    2011-01-01

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification. PMID:22006969

  20. A large-scale forest fragmentation experiment: the Stability of Altered Forest Ecosystems Project.

    Science.gov (United States)

    Ewers, Robert M; Didham, Raphael K; Fahrig, Lenore; Ferraz, Gonçalo; Hector, Andy; Holt, Robert D; Kapos, Valerie; Reynolds, Glen; Sinun, Waidi; Snaddon, Jake L; Turner, Edgar C

    2011-11-27

    Opportunities to conduct large-scale field experiments are rare, but provide a unique opportunity to reveal the complex processes that operate within natural ecosystems. Here, we review the design of existing, large-scale forest fragmentation experiments. Based on this review, we develop a design for the Stability of Altered Forest Ecosystems (SAFE) Project, a new forest fragmentation experiment to be located in the lowland tropical forests of Borneo (Sabah, Malaysia). The SAFE Project represents an advance on existing experiments in that it: (i) allows discrimination of the effects of landscape-level forest cover from patch-level processes; (ii) is designed to facilitate the unification of a wide range of data types on ecological patterns and processes that operate over a wide range of spatial scales; (iii) has greater replication than existing experiments; (iv) incorporates an experimental manipulation of riparian corridors; and (v) embeds the experimentally fragmented landscape within a wider gradient of land-use intensity than do existing projects. The SAFE Project represents an opportunity for ecologists across disciplines to participate in a large initiative designed to generate a broad understanding of the ecological impacts of tropical forest modification.

  1. A modeling study of the thermosphere-ionosphere interactions during the boreal winter and spring 2015-2016: Tidal and planetary-scale waves effect on the ionospheric structure.

    Science.gov (United States)

    Sassi, F.; McDonald, S. E.; McCormack, J. P.; Tate, J.; Liu, H.; Kuhl, D.

    2017-12-01

    The 2015-2016 boreal winter and spring was a dynamically very interesting time in the lower atmosphere: a minor high-latitude stratospheric warming occurred in February 2016; an interrupted descent of the QBO was found in the tropical stratosphere; and a large warm ENSO event took place in the tropical Pacific Ocean. The stratospheric warming, the QBO, and ENSO are known to affect the meteorology of the upper atmosphere in different ways: low-latitude solar tides and high-latitude planetary-scale waves have potentially important implications for the structure of the ionosphere. In this study, we use global atmospheric analyses from a high-altitude version of the Navy Global Environmental Model (HA-NAVGEM) to constrain the meteorology of numerical simulations of the Specified Dynamics Whole Atmosphere Community Climate Model, extended version (SD-WACCM-X). We describe the large-scale behavior of tropical tides and mid-latitude planetary waves that emerge in the lower thermosphere. The effect on the ionosphere is captured by numerical simulations of the Navy Highly Integrated Thermosphere Ionosphere Demonstration System (Navy-HITIDES), which uses the meteorology generated by SD-WACCM-X to drive ionospheric simulations during this time period. We will analyze the impact of various dynamical fields on the zonal behavior of the ionosphere by selectively filtering the relevant dynamical modes.

  2. Multilevel method for modeling large-scale networks.

    Energy Technology Data Exchange (ETDEWEB)

    Safro, I. M. (Mathematics and Computer Science)

    2012-02-24

    Understanding the behavior of real complex networks is of great theoretical and practical significance. It includes developing accurate artificial models whose topological properties are similar to those of real networks, generating artificial networks at different scales under special conditions, investigating network dynamics, reconstructing missing data, predicting network response, detecting anomalies, and other tasks. Network generation, reconstruction, and prediction of future topology are central issues of this field. In this project, we address questions related to understanding network modeling, investigating network structure and properties, and generating artificial networks. Most modern network generation methods are based either on various random graph models (reinforced by a set of properties such as a power-law distribution of node degrees, graph diameter, and number of triangles) or on the principle of replicating an existing model with elements of randomization, such as the R-MAT generator and Kronecker product modeling. Hierarchical models operate at different levels of the network hierarchy but with the same finest elements of the network. However, in many cases methods that include randomization and replication of the finest relationships between network nodes, and modeling that addresses the problem of preserving a set of simplified properties, do not fit the real networks accurately enough. Among the unsatisfactory features are numerically inadequate results, instability of algorithms on real (artificial) data that have been tested on artificial (real) data, and incorrect behavior at different scales. One reason is that randomization and replication of existing structures can create conflicts between fine and coarse scales of the real network geometry. Moreover, randomization and the satisfying of some attribute at the same time can abolish those topological attributes that have been undefined or hidden from
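
    For context on the replication-style generators mentioned above (R-MAT, Kronecker product modeling), here is a minimal stochastic Kronecker graph sketch. It is a generic illustration, not the project's generator, and the initiator matrix values are assumptions.

      # Minimal stochastic Kronecker graph sketch: build an edge-probability
      # matrix as a repeated Kronecker product of a small initiator, then
      # sample edges. Generic illustration, not the project's code.
      import numpy as np

      def kronecker_graph(initiator, power, rng=None):
          """Sample an adjacency matrix from the power-fold Kronecker product."""
          rng = rng or np.random.default_rng()
          probs = np.asarray(initiator, dtype=float)
          for _ in range(power - 1):
              probs = np.kron(probs, initiator)      # probability matrix grows
          return (rng.random(probs.shape) < probs).astype(int)

      if __name__ == "__main__":
          initiator = [[0.9, 0.5],                   # assumed 2x2 initiator
                       [0.5, 0.1]]
          adj = kronecker_graph(initiator, power=8)  # 2^8 = 256 nodes
          degrees = adj.sum(axis=1)
          print("nodes:", adj.shape[0], "edges:", int(adj.sum()))
          print("max degree:", int(degrees.max()),
                "mean degree:", round(float(degrees.mean()), 2))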

  3. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas. Surveys varied subject areas and reports on individual results of research in the field. Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  4. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (Broad Band, Multi Utility, ...) and (2) bigger units with large networks and more customers. Research done so far by SINTEF Energy Research shows that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  5. The real-time control of planetary rovers through behavior modification

    Science.gov (United States)

    Miller, David P.

    1991-01-01

    It is not yet clear what type, and how much, intelligence is needed for a planetary rover to function semi-autonomously on a planetary surface. Current designs assume an advanced AI system that maintains a detailed map of its journeys and the surroundings, and that carefully calculates and tests every move in advance. To achieve these abilities, and because of the limitations of space-qualified electronics, the supporting rover is quite sizable, massing a large fraction of a ton, and requiring technology advances in everything from power to ground operations. An alternative approach is to use a behavior-driven control scheme. Recent research has shown that many complex tasks may be achieved by programming a robot with a set of behaviors and activating or deactivating a subset of those behaviors as required by the specific situation in which the robot finds itself. Behavior control requires much less computation than traditional AI planning techniques. The reduced computation requirements allow the entire rover to be scaled down as appropriate (only down-link communications and payload do not scale under these circumstances). The missions that can be handled by the real-time control and operation of a set of small, semi-autonomous, interacting, behavior-controlled planetary rovers are discussed.
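
    A minimal sketch of the behavior-control idea described above: a small set of prioritized behaviors, only one of which issues a command on each control cycle. The behavior names, sensor fields, and commands are invented for illustration.

      # Minimal behavior-based control loop sketch (subsumption-style priorities).
      # Behavior names, sensor fields, and commands are invented for illustration.
      from dataclasses import dataclass

      @dataclass
      class Sensors:
          obstacle_ahead: bool
          tilt_deg: float
          at_goal: bool

      def limit_tilt(s):
          return "back_up" if s.tilt_deg > 20.0 else None

      def avoid_obstacle(s):
          return "turn_left" if s.obstacle_ahead else None

      def seek_goal(s):
          return "stop" if s.at_goal else "drive_forward"

      # Highest-priority behavior first; the first one that fires wins this cycle.
      BEHAVIORS = [limit_tilt, avoid_obstacle, seek_goal]

      def control_step(sensors):
          for behavior in BEHAVIORS:
              command = behavior(sensors)
              if command is not None:
                  return command
          return "idle"

      if __name__ == "__main__":
          readings = [
              Sensors(obstacle_ahead=False, tilt_deg=5.0,  at_goal=False),
              Sensors(obstacle_ahead=True,  tilt_deg=5.0,  at_goal=False),
              Sensors(obstacle_ahead=False, tilt_deg=25.0, at_goal=False),
              Sensors(obstacle_ahead=False, tilt_deg=3.0,  at_goal=True),
          ]
          for s in readings:
              print(s, "->", control_step(s))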

  6. Challenges in Managing Trustworthy Large-scale Digital Science

    Science.gov (United States)

    Evans, B. J. K.

    2017-12-01

    The increased use of large-scale international digital science has opened a number of challenges for managing, handling, using and preserving scientific information. The large volumes of information are driven by three main categories: model outputs including coupled models and ensembles, data products that have been processed to a level of usability, and increasingly heuristically driven data analysis. These data products are increasingly the ones that are usable by the broad communities, and far exceed the raw instrument data outputs. The data, software and workflows are then shared and replicated to allow broad use at an international scale, which places further demands on infrastructure to support how the information is managed reliably across distributed resources. Users necessarily rely on these underlying "black boxes" so that they are productive in producing new scientific outcomes. The software for these systems depends on computational infrastructure, software interconnected systems, and information capture systems. This ranges from the fundamentals of the reliability of the compute hardware, through system software stacks and libraries, to the model software. Due to these complexities and the capacity of the infrastructure, there is an increased emphasis on transparency of the approach and robustness of the methods over full reproducibility. Furthermore, with large-volume data management, it is increasingly difficult to store the historical versions of all model and derived data. Instead, the emphasis is on the ability to access the updated products and the reliability with which both previous outcomes are still relevant and can be updated for the new information. We will discuss these challenges and some of the approaches underway that are being used to address these issues.

  7. Relations between overturning length scales at the Spanish planetary boundary layer

    Science.gov (United States)

    López, Pilar; Cano, José L.

    2016-04-01

    We analyze the behavior of the maximum Thorpe displacement (d_T)max and the Thorpe scale L_T in the atmospheric boundary layer (ABL), extending previous research with new data and improving our studies related to the novel use of the Thorpe method applied to the ABL. The maximum Thorpe displacements vary between -900 m and 950 m for the different field campaigns. The maximum Thorpe displacement is always greater under convective conditions than under stable ones, independently of its sign. The Thorpe scale L_T ranges between 0.2 m and 680 m for the different data sets, which cover different stratified mixing conditions (turbulence shear-driven and convective regions). The Thorpe scale does not exceed several tens of meters under stable and neutral stratification conditions related to instantaneous density gradients. In contrast, under convective conditions, Thorpe scales are relatively large; they exceed hundreds of meters, which may be related to convective bursts. We analyze the relation between (d_T)max and the Thorpe scale L_T and we deduce that they verify a power law. We also deduce that there is a difference in the exponents of the power laws for convective conditions and shear-driven conditions. These different power laws could identify overturns created under different mechanisms.
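
    In practice, Thorpe displacements and the Thorpe scale come from re-sorting a single profile into a statically stable order. The sketch below does this for a synthetic potential-temperature profile; the numbers are made up and the sign convention for the displacement varies between authors.

      # Thorpe-scale sketch: sort a potential-temperature profile into stable
      # (monotonically increasing with height) order, record how far each
      # sample moved, and take the RMS. The profile is synthetic.
      import numpy as np

      def thorpe_displacements(z, theta):
          """Displacements between the measured and the sorted (stable) profile."""
          order = np.argsort(theta)     # indices that make theta increase with z
          return z[order] - z           # d_T (sign convention varies by author)

      def thorpe_scale(z, theta):
          d = thorpe_displacements(z, theta)
          return float(np.sqrt(np.mean(d ** 2)))

      if __name__ == "__main__":
          rng = np.random.default_rng(2)
          z = np.linspace(0.0, 1000.0, 500)                         # height (m)
          stable = 290.0 + 0.005 * z                                # stable theta (K)
          overturned = stable + 0.5 * rng.standard_normal(z.size)  # add overturns
          d = thorpe_displacements(z, overturned)
          print("max |d_T| (m):", round(float(np.abs(d).max()), 1))
          print("Thorpe scale L_T (m):", round(thorpe_scale(z, overturned), 1))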

  8. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed as to which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  9. Meta-replication reveals nonstationarity in multi-scale habitat selection of Mexican Spotted Owl

    Science.gov (United States)

    Ho Yi Wan; Kevin McGarigal; Joseph L. Ganey; Valentin Lauret; Brad C. Timm; Samuel A. Cushman

    2017-01-01

    Anthropogenic environmental changes are leading to habitat loss and degradation, driving many species to extinction. In this context, habitat models become increasingly important for effective species management and conservation. However, most habitat studies lack replicated study areas and do not properly address the role of nonstationarity and spatial scales in...

  10. The restructuring of analogical reasoning in planetary science

    Science.gov (United States)

    Soare, Richard J.

    Despite its ubiquity in planetary science, analogue-based reasoning largely has geomorphology and posit rules of use that facilitate the evaluation of Q y, I present four hypotheses concerning aeolian, fluvial and periglacial processes on Mars. Each of these hypotheses is evaluated in terms of the analogical rules presented. The fourth hypothesis is original to this thesis and suggests that a periglacial landscape comprising pingos and small-scale polygonal ground exists in an impact crater located in northwest Utopia Planitia.

  11. Replicating Data for Better Performances in X10

    DEFF Research Database (Denmark)

    Andrić, Marina; De Nicola, Rocco; Lluch Lafuente, Alberto

    2015-01-01

    used to experiment with X10, a parallel programming language primarily targeting clusters of multi-core processors linked in a large-scale system via high-performance networks. Our approach aims at allowing the programmer to specify and coordinate the replication of shared data items by taking...

  12. Planetary Data Archiving Plan at JAXA

    Science.gov (United States)

    Shinohara, Iku; Kasaba, Yasumasa; Yamamoto, Yukio; Abe, Masanao; Okada, Tatsuaki; Imamura, Takeshi; Sobue, Shinichi; Takashima, Takeshi; Terazono, Jun-Ya

    After the successful rendezvous of Hayabusa with the small body Itokawa, and the successful launch of Kaguya to the Moon, the Japanese planetary community has obtained its own full-scale data. However, at this moment, these datasets are only available from the data sites managed by each mission team. The databases are individually constructed in different formats, and the user interface of these data sites is not compatible with foreign databases. To improve the usability of the planetary archives at JAXA and to enable smooth international data exchange, we are investigating the creation of a new planetary database. Within the coming decade, Japan will have fruitful datasets in the planetary science field: Venus (Planet-C), Mercury (BepiColombo), and several missions in the planning phase (small bodies). In order to strongly assist international scientific collaboration using these mission archive data, the planned planetary data archive at JAXA should be managed in a unified manner and the database should be constructed in the international planetary database standard style. In this presentation, we will show the current status and future plans of planetary data archiving at JAXA.

  13. Lunar and Planetary Geology

    Science.gov (United States)

    Basilevsky, Alexander T.

    2018-05-01

    Lunar and planetary geology can be described using examples such as the geology of Earth (as the reference case) and the geologies of the Earth's satellite the Moon; the planets Mercury, Mars and Venus; the satellite of Saturn Enceladus; the small stony asteroid Eros; and the nucleus of the comet 67P Churyumov-Gerasimenko. Each body considered is illustrated by its global view, with information given as to its position in the solar system, size, surface, environment including gravity acceleration and properties of its atmosphere if present, typical landforms and the processes forming them, materials composing these landforms, information on the internal structure of the body, stages of its geologic evolution in the form of a stratigraphic scale, and estimates of the absolute ages of the stratigraphic units. Information about one body may be applied to another body, and this, in particular, has led to the discovery of the existence of heavy "meteoritic" bombardment in the early history of the solar system, which should also have significantly affected Earth. It has been shown that volcanism and large-scale tectonics may have had not only an internal source of energy, in the form of radiogenic decay of potassium, uranium and thorium, but also an external source in the form of gravity tugging caused by the attraction of neighboring bodies. The knowledge gained by lunar and planetary geology is important for planning and managing space missions and for the practical exploration of other bodies of the solar system and establishing manned outposts on them.

  14. A Dynamic Analysis of the Role of the Planetary- and Synoptic-Scale in the Summer of 2010 Blocking Episodes over the European Part of Russia

    Directory of Open Access Journals (Sweden)

    Anthony R. Lupo

    2012-01-01

    During the summer of 2010, an unusually persistent blocking episode resulted in anomalously warm, dry weather over the European part of Russia. The excessive heat resulted in forest and peat fires, impacted terrestrial ecosystems, greatly increased pollution in urban areas, and increased mortality rates in the region. Using the National Centers for Atmospheric Research (NCAR) / National Centers for Environmental Prediction (NCEP) reanalysis datasets, the climatological and dynamic character of blocking events for summer 2010 and a precursor May blocking event were examined. We found that these events were stronger and longer lived than typical warm-season events. Using dynamic methods, we demonstrate that the July 2010 event was a synoptic-scale dominant blocking event, unusual in the summer season. An analysis of phase diagrams demonstrated that the planetary scale did not become stable until almost one week after block onset. For all other blocking events studied here and previously, the planetary scale became stable around onset. Analysis using area-integrated regional enstrophy (IRE) demonstrated that for the July 2010 event, synoptic-scale IRE increased at block onset. This was similar for the May 2010 event, but different from case studies examined previously, which demonstrated that the planetary-scale IRE was prominent at block onset.
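
    The enstrophy diagnostic mentioned above is, in essence, an area integral of squared relative vorticity. The sketch below shows a generic version of such an integral on a regular latitude-longitude grid; the domain and vorticity field are synthetic and the code is not the authors' implementation.

      # Generic area-integrated enstrophy on a regular lat-lon grid:
      # sum of (vorticity^2 * cell area) with cos(latitude) weighting.
      # Synthetic field; not the authors' implementation.
      import numpy as np

      EARTH_RADIUS = 6.371e6  # m

      def integrated_enstrophy(vorticity, lat_deg, lon_deg):
          """Approximate integral of vorticity**2 over the grid area."""
          lat = np.deg2rad(lat_deg)
          dlat = abs(lat[1] - lat[0])
          dlon = np.deg2rad(abs(lon_deg[1] - lon_deg[0]))
          cell_area = (EARTH_RADIUS ** 2) * np.cos(lat)[:, None] * dlat * dlon
          return float(np.sum(vorticity ** 2 * cell_area))

      if __name__ == "__main__":
          lat = np.arange(40.0, 72.5, 2.5)    # a mid/high-latitude box (deg N)
          lon = np.arange(20.0, 62.5, 2.5)    # (deg E)
          rng = np.random.default_rng(3)
          zeta = 1.0e-5 * rng.standard_normal((lat.size, lon.size))  # s^-1
          print(f"IRE ~ {integrated_enstrophy(zeta, lat, lon):.3e}  (m^2 s^-2)")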

  15. HESS Opinions: A planetary boundary on freshwater use is misleading

    Science.gov (United States)

    Heistermann, Maik

    2017-07-01

    In 2009, a group of prominent Earth scientists introduced the planetary boundaries (PB) framework: they suggested nine global control variables, and defined corresponding thresholds which, if crossed, could generate unacceptable environmental change. The concept builds on systems theory, and views Earth as a complex adaptive system in which anthropogenic disturbances may trigger non-linear, abrupt, and irreversible changes at the global scale, and push the Earth system outside the stable environmental state of the Holocene. While the idea has been remarkably successful in both science and policy circles, it has also raised fundamental concerns, as the majority of suggested processes and their corresponding planetary boundaries do not operate at the global scale, and thus apparently lack the potential to trigger abrupt planetary changes. This paper picks up the debate with specific regard to the planetary boundary on global freshwater use. While the bio-physical impacts of excessive water consumption are typically confined to the river basin scale, the PB proponents argue that water-induced environmental disasters could build up to planetary-scale feedbacks and system failures. So far, however, no evidence has been presented to corroborate that hypothesis. Furthermore, no coherent approach has been presented to what extent a planetary threshold value could reflect the risk of regional environmental disaster. To be sure, the PB framework was revised in 2015, extending the planetary freshwater boundary with a set of basin-level boundaries inferred from environmental water flow assumptions. Yet, no new evidence was presented, either with respect to the ability of those basin-level boundaries to reflect the risk of regional regime shifts or with respect to a potential mechanism linking river basins to the planetary scale. So while the idea of a planetary boundary on freshwater use appears intriguing, the line of arguments presented so far remains speculative and

  16. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting have an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy, and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background guarantees which are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither holds the political system's competence to make decisions, nor can it judge successfully by the critical standards of the established social sciences, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three points, the thesis that this is a new form of institutionalization of science: (1) external control, (2) the organization form, and (3) the theoretical conception of large-scale research and policy consulting. (orig.)

  17. PSYM-WIDE: A Survey for Large-separation Planetary-mass Companions to Late Spectral Type Members of Young Moving Groups

    Science.gov (United States)

    Naud, Marie-Eve; Artigau, Étienne; Doyon, René; Malo, Lison; Gagné, Jonathan; Lafrenière, David; Wolf, Christian; Magnier, Eugene A.

    2017-09-01

    We present the results of a direct imaging survey for very large separation (>100 au), low-mass companions around 95 nearby young K5-L5 stars and brown dwarfs. They are high-likelihood candidates or confirmed members of the young (≲150 Myr) β Pictoris and AB Doradus moving groups (ABDMG) and the TW Hya, Tucana-Horologium, Columba, Carina, and Argus associations. Images in I′ and z′ filters were obtained with the Gemini Multi-Object Spectrograph (GMOS) on Gemini South to search for companions down to an apparent magnitude of z′ ≈ 22-24 at separations ≳20″ from the targets and in the remainder of the wide 5.5′ × 5.5′ GMOS field of view. This allowed us to probe the most distant region where planetary-mass companions could be gravitationally bound to the targets. This region was left largely unstudied by past high-contrast imaging surveys, which probed much closer-in separations. This survey led to the discovery of a planetary-mass (9-13 M_Jup) companion at 2000 au from the M3V star GU Psc, a highly probable member of ABDMG. No other substellar companions were identified. These results allowed us to constrain the frequency of distant planetary-mass companions (5-13 M_Jup) to 0.84 (+6.73, -0.66)% (95% confidence) at semimajor axes between 500 and 5000 au around young K5-L5 stars and brown dwarfs. This is consistent with other studies suggesting that gravitationally bound planetary-mass companions at wide separations from low-mass stars are relatively rare.

  18. The Climate Potentials and Side-Effects of Large-Scale terrestrial CO2 Removal - Insights from Quantitative Model Assessments

    Science.gov (United States)

    Boysen, L.; Heck, V.; Lucht, W.; Gerten, D.

    2015-12-01

    Terrestrial carbon dioxide removal (tCDR) through dedicated biomass plantations is considered as one climate engineering (CE) option if implemented at large scale. While the risks and costs are supposed to be small, the effectiveness depends strongly on the spatial and temporal scales of implementation. Based on simulations with a dynamic global vegetation model (LPJmL), we comprehensively assess the effectiveness, biogeochemical side-effects and tradeoffs from an earth system-analytic perspective. We analyzed systematic land-use scenarios in which all, 25%, or 10% of natural and/or agricultural areas are converted to tCDR plantations, including the assumption that biomass plantations are established once the 2°C target is crossed in a business-as-usual climate change trajectory. The resulting tCDR potentials in year 2100 include the net accumulated annual biomass harvests and changes in all land carbon pools. We find that only the most spatially excessive, and thus undesirable, scenario would be capable of restoring the 2°C target by 2100 under continuing high emissions (with a cooling of 3.02°C). Large-scale biomass plantations covering areas between 1.1 - 4.2 Gha would produce a climate reduction potential of 0.8 - 1.4°C. tCDR plantations at smaller scales do not build up enough biomass over the considered period and the potentials to achieve global warming reductions are substantially lowered to no more than 0.5-0.6°C. Finally, we demonstrate that the (non-economic) costs for the Earth system include negative impacts on the water cycle and on ecosystems, which are already under pressure due to both land use change and climate change. Overall, tCDR may lead to a further transgression of land- and water-related planetary boundaries while not being able to set back the crossing of the planetary boundary for climate change. tCDR could still be considered in the near-future mitigation portfolio if implemented on small scales on wisely chosen areas.

  19. On the evolution of central stars of planetary nebulae

    International Nuclear Information System (INIS)

    Yahel, R.Z.

    1977-01-01

    The evolution of nuclei of planetary nebulae has been calculated from the end of the ejection stage that produces the nebulae to the white dwarf stage. The structure of the central star is in agreement with the general picture of Finzi (1973) about mass ejection from the progenitors of planetary nebulae. It has been found that in order to obtain an evolutionary track consistent with the Harman-Seaton track (O'Dell, 1968) one has to assume that the masses of the central stars are less than approximately 0.7 solar masses. The calculated evolutionary time scale of the central stars of planetary nebulae is approximately 2 × 10^4 yr. This time scale is negatively correlated with the stellar mass: the heavier the stellar mass, the shorter the evolutionary time scale. (Auth.)

  20. Unprecedented large inverted repeats at the replication terminus of circular bacterial chromosomes suggest a novel mode of chromosome rescue

    Science.gov (United States)

    El Kafsi, Hela; Loux, Valentin; Mariadassou, Mahendra; Blin, Camille; Chiapello, Hélène; Abraham, Anne-Laure; Maguin, Emmanuelle; van de Guchte, Maarten

    2017-01-01

    The first Lactobacillus delbrueckii ssp. bulgaricus genome sequence revealed the presence of a very large inverted repeat (IR), a DNA sequence arrangement which thus far seemed inconceivable in a non-manipulated circular bacterial chromosome, at the replication terminus. This intriguing observation prompted us to investigate if similar IRs could be found in other bacteria. IRs with sizes varying from 38 to 76 kbp were found at the replication terminus of all 5 L. delbrueckii ssp. bulgaricus chromosomes analysed, but in none of 1373 other chromosomes. They represent the first naturally occurring very large IRs detected in circular bacterial genomes. A comparison of the L. bulgaricus replication terminus regions and the corresponding regions without IR in 5 L. delbrueckii ssp. lactis genomes leads us to propose a model for the formation and evolution of the IRs. The DNA sequence data are consistent with a novel model of chromosome rescue after premature replication termination or irreversible chromosome damage near the replication terminus, involving mechanisms analogous to those proposed in the formation of very large IRs in human cancer cells. We postulate that the L. delbrueckii ssp. bulgaricus-specific IRs in different strains derive from a single ancestral IR of at least 93 kbp. PMID:28281695

  1. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  2. Large Eddy simulation of turbulence: A subgrid scale model including shear, vorticity, rotation, and buoyancy

    Science.gov (United States)

    Canuto, V. M.

    1994-01-01

    The Reynolds numbers that characterize geophysical and astrophysical turbulence (Re ≈ 10^8 for the planetary boundary layer and Re ≈ 10^14 for the Sun's interior) are too large to allow a direct numerical simulation (DNS) of the fundamental Navier-Stokes and temperature equations. In fact, the required number of spatial grid points, N ~ Re^(9/4), exceeds the computational capability of today's supercomputers. Alternative treatments are the ensemble-time average approach and/or the volume average approach. Since the first method (Reynolds stress approach) is largely analytical, the resulting turbulence equations entail manageable computational requirements and can thus be linked to a stellar evolutionary code or, in the geophysical case, to general circulation models. In the volume average approach, one carries out a large eddy simulation (LES) which resolves numerically the largest scales, while the unresolved scales must be treated theoretically with a subgrid scale model (SGS). Contrary to the ensemble average approach, the LES+SGS approach has considerable computational requirements. Even if this prevents (for the time being) a LES+SGS model from being linked to stellar or geophysical codes, it is still of the greatest relevance as an 'experimental tool' to be used, inter alia, to improve the parameterizations needed in the ensemble average approach. Such a methodology has been successfully adopted in studies of the convective planetary boundary layer. Experience with the LES+SGS approach from different fields has shown that its reliability depends on the soundness of the SGS model, for numerical stability as well as for physical completeness. At present, the most widely used SGS model, the Smagorinsky model, accounts for the effect of the shear induced by the large resolved scales on the unresolved scales but does not account for the effects of buoyancy, anisotropy, rotation, and stable stratification.
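    The Smagorinsky closure mentioned above models the unresolved stresses through an eddy viscosity nu_t = (C_s Δ)² |S| built from the resolved strain-rate tensor. The sketch below is a minimal illustration of that formula on a uniform grid; the grid spacing, the value C_s = 0.17 and the (x, y, z) ordering of the array axes are assumptions of this example, not anything specified in the abstract.

    ```python
    import numpy as np

    def smagorinsky_eddy_viscosity(u, v, w, dx, cs=0.17):
        """nu_t = (cs * dx)**2 * |S| with |S| = sqrt(2 S_ij S_ij), computed
        from a resolved velocity field on a uniform grid whose array axes
        are assumed to be ordered (x, y, z)."""
        # Resolved velocity gradients (second-order central differences).
        dudx, dudy, dudz = np.gradient(u, dx)
        dvdx, dvdy, dvdz = np.gradient(v, dx)
        dwdx, dwdy, dwdz = np.gradient(w, dx)

        # Strain-rate tensor S_ij = 0.5 * (du_i/dx_j + du_j/dx_i).
        s11, s22, s33 = dudx, dvdy, dwdz
        s12 = 0.5 * (dudy + dvdx)
        s13 = 0.5 * (dudz + dwdx)
        s23 = 0.5 * (dvdz + dwdy)

        s_norm = np.sqrt(2.0 * (s11**2 + s22**2 + s33**2
                                + 2.0 * (s12**2 + s13**2 + s23**2)))
        return (cs * dx) ** 2 * s_norm

    # Illustrative use on a random field (a real LES would supply the
    # resolved velocity components from the simulation itself).
    rng = np.random.default_rng(0)
    u, v, w = (rng.standard_normal((32, 32, 32)) for _ in range(3))
    print(smagorinsky_eddy_viscosity(u, v, w, dx=0.1).mean())
    ```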

  3. Replication fidelity assessment of large area sub-μm structured polymer surfaces using scatterometry

    International Nuclear Information System (INIS)

    Calaon, M; Hansen, H N; Tosello, G; Madsen, M H; Weirich, J; Hansen, P E; Garnaes, J; Tang, P T

    2015-01-01

    The present study addresses one of the key challenges in the product quality control of transparent structured polymer substrates: the replication fidelity of sub-μm structures over a large area. Additionally, the work contributes to the development of new techniques focused on in-line characterization of large nanostructured surfaces using scatterometry. In particular, an approach to quantify the replication fidelity of high volume manufacturing processes such as polymer injection moulding is presented. Both periodic channels and semi-spherical structures were fabricated on nickel shims used for subsequent injection moulding of cyclic olefin copolymer (COC) substrates, to which the sub-μm features were ultimately transferred. The scatterometry system was validated using calibrated atomic force microscopy measurements, and a model based on scalar diffraction theory was employed to calculate the expected angular distribution of the reflected and transmitted intensity for the nickel and structured COC surfaces, respectively. (paper)
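    For periodic structures such as the channels mentioned above, the most basic scatterometry signature is the set of propagating diffraction orders fixed by the grating equation; a scalar-diffraction model then distributes intensity among those orders. The sketch below computes only the order angles, and the pitch, wavelength and normal incidence used in the example are placeholder values, not parameters from the study.

    ```python
    import numpy as np

    def diffraction_order_angles(pitch_nm, wavelength_nm, theta_inc_deg=0.0):
        """Angles (degrees) of the propagating orders of a 1-D grating in air,
        from the grating equation sin(theta_m) = sin(theta_i) + m*lambda/pitch."""
        theta_i = np.radians(theta_inc_deg)
        # Largest order index that could still satisfy |sin(theta_m)| <= 1.
        m_max = int(np.floor((1.0 + abs(np.sin(theta_i))) * pitch_nm / wavelength_nm))
        angles = {}
        for m in range(-m_max, m_max + 1):
            s = np.sin(theta_i) + m * wavelength_nm / pitch_nm
            if abs(s) <= 1.0:
                angles[m] = float(np.degrees(np.arcsin(s)))
        return angles

    # Example: a 700 nm pitch grating probed at 633 nm, normal incidence
    # (values chosen for illustration only).
    print(diffraction_order_angles(pitch_nm=700.0, wavelength_nm=633.0))
    ```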

  4. Planetary climates (princeton primers in climate)

    CERN Document Server

    Ingersoll, Andrew

    2013-01-01

    This concise, sophisticated introduction to planetary climates explains the global physical and chemical processes that determine climate on any planet or major planetary satellite--from Mercury to Neptune and even large moons such as Saturn's Titan. Although the climates of other worlds are extremely diverse, the chemical and physical processes that shape their dynamics are the same. As this book makes clear, the better we can understand how various planetary climates formed and evolved, the better we can understand Earth's climate history and future.

  5. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  6. Gene organization inside replication domains in mammalian genomes

    Science.gov (United States)

    Zaghloul, Lamia; Baker, Antoine; Audit, Benjamin; Arneodo, Alain

    2012-11-01

    We investigate the large-scale organization of human genes with respect to "master" replication origins that were previously identified as bordering nucleotide compositional skew domains. We separate genes into two categories depending on their CpG enrichment at the promoter, which can be considered a marker of germline DNA methylation. Using expression data in mouse, we confirm that CpG-rich genes are highly expressed in germline whereas CpG-poor genes are in a silent state. We further show that, whether tissue-specific or broadly expressed (housekeeping genes), the CpG-rich genes are over-represented close to the replication skew domain borders, suggesting some coordination of replication and transcription. We also reveal that the transcription of the longest CpG-rich genes is co-oriented with replication fork progression, so that the promoters of these transcriptionally active genes are located in the accessible open chromatin environment surrounding the master replication origins that border the replication skew domains. The observation of a similar gene organization in the mouse genome confirms the interplay of replication, transcription and chromatin structure as the cornerstone of mammalian genome architecture.

  7. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  8. Fluvial geomorphology on Earth-like planetary surfaces: A review.

    Science.gov (United States)

    Baker, Victor R; Hamilton, Christopher W; Burr, Devon M; Gulick, Virginia C; Komatsu, Goro; Luo, Wei; Rice, James W; Rodriguez, J A P

    2015-09-15

    Morphological evidence for ancient channelized flows (fluvial and fluvial-like landforms) exists on the surfaces of all of the inner planets and on some of the satellites of the Solar System. In some cases, the relevant fluid flows are related to a planetary evolution that involves the global cycling of a volatile component (water for Earth and Mars; methane for Saturn's moon Titan). In other cases, as on Mercury, Venus, Earth's moon, and Jupiter's moon Io, the flows were of highly fluid lava. The discovery, in 1972, of what are now known to be fluvial channels and valleys on Mars sparked a major controversy over the role of water in shaping the surface of that planet. The recognition of the fluvial character of these features has opened unresolved fundamental questions about the geological history of water on Mars, including the presence of an ancient ocean and the operation of a hydrological cycle during the earliest phases of planetary history. Other fundamental questions posed by fluvial and fluvial-like features on planetary bodies include the possible erosive action of large-scale outpourings of very fluid lavas, such as those that may have produced the remarkable canali forms on Venus; the ability of exotic fluids, such as methane, to create fluvial-like landforms, as observed on Saturn's moon, Titan; and the nature of sedimentation and erosion under different conditions of planetary surface gravity. Planetary fluvial geomorphology also illustrates fundamental epistemological and methodological issues, including the role of analogy in geomorphological/geological inquiry.

  9. Replication fidelity assessment of polymer large area sub-μm structured surfaces using fast angular intensity distribution measurements

    DEFF Research Database (Denmark)

    Calaon, M.; Hansen, H. N.; Tosello, G.

    The present investigation addresses one of the key challenges in the product quality control of transparent polymer substrates, identified in the replication fidelity of sub-μm structures over large area. Additionally the work contributes to the development of new techniques focused on in-line characterization of large nanostructured surfaces. In particular the aim of the present paper is to introduce initial development of a metrology approach to quantify the replication fidelity of produced 500 nm diameter semi-spheres via anodizing of aluminum (Al) and subsequent nickel electroforming to COC...

  10. Virtual Machine Replication on Achieving Energy-Efficiency in a Cloud

    Directory of Open Access Journals (Sweden)

    Subrota K. Mondal

    2016-07-01

    Full Text Available The rapid growth in cloud service demand has led to the establishment of large-scale virtualized data centers in which virtual machines (VMs) are used to handle user requests for service. A user's request cannot be completed if the VM fails. Replication mechanisms can be used to mitigate the impact of failures. Further, data centers consume a large amount of energy, resulting in high operating costs and contributing to significant greenhouse gas (GHG) emissions. In this paper, we focus on Infrastructure as a Service (IaaS) clouds where user job requests are processed by VMs, and analyze the effectiveness of VM replication in terms of job completion time performance as well as energy consumption. Three different schemes: cold, warm, and hot replication are considered. The trade-offs between job completion time and energy consumption in the different replication schemes are characterized through comprehensive analytical models which capture VM state transitions and associated power consumption patterns. The effectiveness of the replication schemes is demonstrated through experimental results. To verify the validity of the proposed analytical models, we extend the widely used cloud simulator CloudSim and compare the simulation results with the analytical solutions.
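    The cold/warm/hot trade-off can be illustrated with a deliberately simplified expected-value calculation: a hotter standby shortens recovery after a failure but draws standby power for the whole job. All numbers below (recovery delays, power draws, failure probability) are invented for illustration and are not taken from the paper's analytical models.

    ```python
    def expected_cost(scheme, job_time_h=1.0, p_fail=0.05, active_power_w=200.0):
        """Rough expected completion time (h) and energy (Wh) for one job,
        assuming at most one failure and scheme-dependent recovery delay
        and replica standby power (all values hypothetical)."""
        recovery_h, standby_w = {
            "cold": (0.50, 0.0),    # replica provisioned and booted on demand
            "warm": (0.10, 60.0),   # replica booted but idle
            "hot":  (0.01, 200.0),  # replica actively mirrors the primary
        }[scheme]
        time = job_time_h + p_fail * recovery_h
        energy = active_power_w * time + standby_w * job_time_h
        return time, energy

    for s in ("cold", "warm", "hot"):
        t, e = expected_cost(s)
        print(f"{s:>4}: E[time] = {t:.3f} h, E[energy] = {e:.1f} Wh")
    ```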

  11. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes an evaluation method for the faultless function of large scale integration circuits (LSI) and very large scale integration circuits (VLSI). The article provides a comparative analysis of the factors which determine the faultless operation of integrated circuits, an analysis of already existing methods, and a model for evaluating the faultless function of LSI and VLSI. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.

  12. Rocky Planetary Debris Around Young WDs

    Science.gov (United States)

    Gaensicke, B.

    2014-04-01

    The vast majority of all known planet host stars, including the Sun, will eventually evolve into red giants and finally end their lives as white dwarfs: extremely dense Earth-sized stellar embers. Only close-in planets will be devoured during the red-giant phase. In the solar system, Mars, the asteroid belt, and all the giant planets will escape evaporation, and the same is true for many of the known exo-planets. It is hence certain that a significant fraction of the known white dwarfs were once host stars to planets, and it is very likely that many of them still have remnants of planetary systems. The detection of metals in the atmospheres of white dwarfs is the unmistakable signpost of such evolved planetary systems. The strong surface gravity of white dwarfs causes metals to sink out of the atmosphere on time-scales much shorter than their cooling ages, leading unavoidably to pristine H/He atmospheres. Therefore any metals detected in the atmosphere of a white dwarf imply recent or ongoing accretion of planetary debris. In fact, planetary debris is also detected as circumstellar dust and gas around a number of white dwarfs. These debris disks are formed from the tidal disruption of asteroids or Kuiper belt-like objects, stirred up by left-over planets, and are subsequently accreted onto the white dwarf, imprinting their abundance pattern into its atmosphere. Determining the photospheric abundances of debris-polluted white dwarfs is hence entirely analogue to the use of meteorites, "rocks that fell from the sky", for measuring the abundances of planetary material in the solar system. I will briefly review this new field of exo-planet science, and then focus on the results of a large, unbiased COS snapshot survey of relatively young (~20-100 Myr) white dwarfs that we carried out in Cycle 18/19. * At least 30% of all white dwarfs in our sample are accreting planetary debris, and that fraction may be as high as 50%. * In most cases where debris pollution is detected

  13. The diversity of planetary system architectures: contrasting theory with observations

    Science.gov (United States)

    Miguel, Y.; Guilera, O. M.; Brunini, A.

    2011-10-01

    In order to explain the observed diversity of planetary system architectures and relate this primordial diversity to the initial properties of the discs where they were born, we develop a semi-analytical model for computing planetary system formation. The model is based on the core instability model for the gas accretion of the embryos and the oligarchic growth regime for the accretion of the solid cores. Two regimes of planetary migration are also included. With this model, we consider different initial conditions based on recent results of protoplanetary disc observations to generate a variety of planetary systems. These systems are analysed statistically, exploring the importance of several factors that define the planetary system birth environment. We explore the relevance of the mass and size of the disc, metallicity, mass of the central star and time-scale of gaseous disc dissipation in defining the architecture of the planetary system. We also test different values of some key parameters of our model to find out which factors best reproduce the diverse sample of observed planetary systems. We assume different migration rates and initial disc profiles, in the context of a surface density profile motivated by similarity solutions. According to this, and based on recent protoplanetary disc observational data, we predict which systems are the most common in the solar neighbourhood. We intend to unveil whether our Solar system is a rarity or whether more planetary systems like our own are expected to be found in the near future. We also analyse which is the more favourable environment for the formation of habitable planets. Our results show that planetary systems with only terrestrial planets are the most common, being the only planetary systems formed when considering low-metallicity discs, which also represent the best environment for the development of rocky, potentially habitable planets. We also found that planetary systems like our own are not rare in the

  14. The vertical structure of Jupiter and Saturn zonal winds from nonlinear simulations of major vortices and planetary-scale disturbances

    Science.gov (United States)

    Garcia-Melendo, E.; Legarreta, J.; Sanchez-Lavega, A.

    2012-12-01

    Direct measurements of the structure of the zonal winds of Jupiter and Saturn below the upper cloud layer are very difficult to retrieve. Except from the vertical profile at a Jupiter hot spot obtained from the Galileo probe in 1995 and measurements from cloud tracking by Cassini instruments just below the upper cloud, no other data are available. We present here our inferences of the vertical structure of Jupiter and Saturn zonal wind across the upper troposphere (deep down to about 10 bar level) obtained from nonlinear simulations using the EPIC code of the stability and interactions of large-scale vortices and planetary-scale disturbances in both planets. Acknowledgements: This work has been funded by Spanish MICIIN AYA2009-10701 with FEDER support, Grupos Gobierno Vasco IT-464-07 and UPV/EHU UFI11/55. [1] García-Melendo E., Sánchez-Lavega A., Dowling T.., Icarus, 176, 272-282 (2005). [2] García-Melendo E., Sánchez-Lavega A., Hueso R., Icarus, 191, 665-677 (2007). [3] Sánchez-Lavega A., et al., Nature, 451, 437- 440 (2008). [4] Sánchez-Lavega A., et al., Nature, 475, 71-74 (2011).

  15. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Full Text Available Abstract Background The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
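    The scaling exponent referred to above comes from Detrended Fluctuation Analysis: the sequence is mapped to a numeric signal, integrated, detrended locally in windows of size s, and the r.m.s. fluctuation F(s) is fitted as F(s) ~ s^alpha. The following is a generic sketch of the standard global DFA on a synthetic ±1 sequence, not the authors' windowed local-exponent variant.

    ```python
    import numpy as np

    def dfa_exponent(signal, scales=None):
        """Slope alpha of log F(s) versus log s, where F(s) is the r.m.s.
        fluctuation of the integrated, linearly detrended profile in
        non-overlapping windows of size s."""
        x = np.asarray(signal, dtype=float)
        profile = np.cumsum(x - x.mean())
        if scales is None:
            scales = np.unique(np.logspace(1, np.log10(len(x) // 4), 20).astype(int))
        flucts = []
        for s in scales:
            f2 = []
            for i in range(len(profile) // s):
                seg = profile[i * s:(i + 1) * s]
                t = np.arange(s)
                trend = np.polyval(np.polyfit(t, seg, 1), t)  # linear detrending
                f2.append(np.mean((seg - trend) ** 2))
            flucts.append(np.sqrt(np.mean(f2)))
        return np.polyfit(np.log(scales), np.log(flucts), 1)[0]

    # An uncorrelated +/-1 sequence gives alpha near 0.5; compositional
    # patchiness (long-range correlations) pushes alpha above 0.5.
    seq = np.random.default_rng(0).choice([1, -1], size=20000)
    print(round(dfa_exponent(seq), 2))
    ```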

  16. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Based on experience in operating and developing a large scale computerized system, the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified, then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases.

  17. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  18. Watchdog - a workflow management system for the distributed analysis of large-scale experimental data.

    Science.gov (United States)

    Kluge, Michael; Friedel, Caroline C

    2018-03-13

    The development of high-throughput experimental technologies, such as next-generation sequencing, has led to new challenges for handling, analyzing and integrating the resulting large and diverse datasets. Bioinformatical analysis of these data commonly requires a number of mutually dependent steps applied to numerous samples for multiple conditions and replicates. To support these analyses, a number of workflow management systems (WMSs) have been developed to allow automated execution of corresponding analysis workflows. Major advantages of WMSs are the easy reproducibility of results as well as the reusability of workflows or their components. In this article, we present Watchdog, a WMS for the automated analysis of large-scale experimental data. Main features include straightforward processing of replicate data, support for distributed computer systems, customizable error detection and manual intervention into workflow execution. Watchdog is implemented in Java and thus platform-independent and allows easy sharing of workflows and corresponding program modules. It provides a graphical user interface (GUI) for workflow construction using pre-defined modules as well as a helper script for creating new module definitions. Execution of workflows is possible using either the GUI or a command-line interface and a web-interface is provided for monitoring the execution status and intervening in case of errors. To illustrate its potential on a real-life example, a comprehensive workflow and modules for the analysis of RNA-seq experiments were implemented and are provided with the software in addition to simple test examples. Watchdog is a powerful and flexible WMS for the analysis of large-scale high-throughput experiments. We believe it will greatly benefit both users with and without programming skills who want to develop and apply bioinformatical workflows with reasonable overhead. The software, example workflows and a comprehensive documentation are freely
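    The core idea of such a WMS — running mutually dependent steps in the correct order for every sample or replicate — can be sketched generically. The example below is not Watchdog's XML module format or API; the task names and commands are placeholders, and a real pipeline would execute the commands instead of printing them.

    ```python
    from graphlib import TopologicalSorter

    def run_workflow(tasks, replicates):
        """Execute tasks in dependency order, once per replicate.
        tasks maps a task name to (command template, set of dependencies)."""
        order = list(TopologicalSorter(
            {name: deps for name, (_, deps) in tasks.items()}).static_order())
        for rep in replicates:
            for name in order:
                cmd, _ = tasks[name]
                # A real pipeline would run the command (e.g. via subprocess).
                print(f"[{rep}] {name}: {cmd.format(rep=rep)}")

    # Hypothetical RNA-seq-like workflow with placeholder commands.
    tasks = {
        "align":    ("align_reads --sample {rep}", set()),
        "count":    ("count_features --sample {rep}", {"align"}),
        "diffexpr": ("run_de --sample {rep}", {"count"}),
    }
    run_workflow(tasks, replicates=["cond1_rep1", "cond1_rep2"])
    ```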

  19. The effects of ultrasonic scaling duration and replication on caspase-3 expression of Sprague Dawley rat's pulp cells

    Directory of Open Access Journals (Sweden)

    Archadian Nuryanti

    2015-03-01

    Full Text Available Background: Ultrasonic scaling has been commonly used for stain and calculus removal in dental clinics for over 60 years. Previous research has even shown that ultrasonic scaling may affect the surface of the tooth root. Ultrasonic wave exposure for 20 seconds or more can increase caspase-3 activity as an indicator of increased apoptotic cells associated with tissue damage. Purpose: This research was aimed at investigating the effects of ultrasonic scaling duration and replication on caspase-3 expression in dental pulp cells. Methods: The samples of this research were 54 male Sprague Dawley rats aged 2 months, divided into 2 groups of 27 animals each. The first group was induced with stain, while the second group was not. Each group was divided into 3 subgroups for ultrasonic scaling repeated 1, 3, and 5 times. Each subgroup was further divided into 3 sub-subgroups for procedure durations of 15, 30 and 60 seconds, respectively. During the scaling process, the rats were anesthetized using 0.1 ml of ketamine and 0.1 ml of xylol added to 2 ml of distilled water, injected intramuscularly into their right thigh at a dose of 0.4 ml. Scaling was done on the buccal surface of the right first maxillary molar from cervical to occlusal. The teeth were decalcified and embedded in paraffin, then cut sagittally into 3 µm sections and stained immunohistochemically to detect caspase-3 expression in cells within the dental pulp. Results: The results showed that the duration and replication of the ultrasonic scaling procedure affected the expression of caspase-3, as analyzed with a Univariate Analysis of Variance test (p<0.05). Conclusion: It can be concluded that the duration and replication of the ultrasonic scaling procedure, on teeth with and without stain, enhanced the expression of caspase-3 in dental pulp cells.

  20. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h^-1 Mpc in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale 120 h^-1 Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  1. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the administrator's workload is large, much time must be spent on management and maintenance, and the nodes easily fall into disorder. Thousands of nodes are housed in large rooms, so administrators can easily confuse machines. How can accurate management be carried out effectively on a large-scale cluster system? The article introduces ELFms for large-scale cluster systems and, furthermore, proposes how to realize automatic management of such systems. (authors)

  2. Is Planetary-Scale High Tech Civilization Climatically Sustainable?: The Geophysics v Economics Paradigm War

    Science.gov (United States)

    Hoffert, M.

    2012-12-01

    Climate/energy policy is gridlocked between (1) a geophysics perspective revealing long-term instabilities from continued energy consumption growth, of which the fossil fuel greenhouse effect is an early symptom; and (2) short-term, fossil-fuel-energized, rapid-economic-growth-driven policies, likely adaptive for hunter-gatherers competing for scarce food but climatically fatal to planetary-scale economies dependent on agriculture and "energy slaves." Incorporating social science into climate/energy policy formulation has focused on integrated assessment models (IAMs) exploring scenarios (parallel universes making different social choices) depicting the evolution of GDP, energy consumed, the energy technology mixture, land use, greenhouse gas and aerosol emissions, and radiative forcing. Representative concentration pathway (RCP) scenarios developed for the IPCC AR5 report imply 5-10 degree C warming from fossil fuel burning unless unprecedentedly fast decarbonization rates of ~7%/yr are implemented from 2020 to 2100. A massive transition to carbon neutrality by midcentury is needed to limit warming; if energy use continues growing at 2%/year, warming at the level of the fossil-fuel greenhouse would be generated by heat rejection alone in only 200-300 years, underscoring that sustainability implies a steady-state planetary economy (FIG. 2). Evolutionary psychology and neuroeconomics are emergent disciplines that may illuminate the physical v social science paradigm conflict threatening human survivability.
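    As a rough illustration of what a sustained ~7%/yr decarbonization rate implies, the arithmetic below compares cumulative emissions over 2020-2100 under an exponential decline with the constant-emissions case. The starting emission rate is an assumed round number, not a figure from the abstract or from the RCP scenarios.

    ```python
    # Illustrative arithmetic only; the initial emission rate is assumed.
    e0 = 40.0        # assumed initial emissions, GtCO2 per year
    rate = 0.07      # fractional decline per year (~7 %/yr decarbonization)
    years = range(2020, 2101)

    declining = sum(e0 * (1.0 - rate) ** (y - 2020) for y in years)
    constant = e0 * len(years)
    print(f"cumulative 2020-2100, 7%/yr decline: {declining:6.0f} GtCO2")
    print(f"cumulative 2020-2100, constant rate: {constant:6.0f} GtCO2")
    ```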

  3. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  4. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (≳1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in ≳1/4 of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  5. Optical spectra of radio planetary nebulae in the large Magellanic Cloud

    Directory of Open Access Journals (Sweden)

    Payne J.L.

    2008-01-01

    Full Text Available We present 11 spectra from 12 candidate radio sources co-identified with known planetary nebulae (PNe) in the Large Magellanic Cloud (LMC). Originally found in Australia Telescope Compact Array (ATCA) LMC surveys at 1.4, 4.8 and 8.64 GHz and confirmed by new high resolution ATCA images at 6 and 3 cm (4 arcsec/2 arcsec), these complement data recently presented for candidate radio PNe in the Small Magellanic Cloud (SMC). Their spectra were obtained using the Radcliffe 1.9-meter telescope in Sutherland (South Africa). All of the optical PNe and radio candidates are within 2 arcsec and may represent a selected radio-bright sample only. Nebular ionized masses of these objects are estimated to be as high as 1.8 solar masses, supporting the idea that the central stars of massive PNe progenitors lose much of their mass in the asymptotic giant branch (AGB) phase or before. We also identify a sub-population (33%) of radio PNe candidates with prominent ionized iron emission lines.

  6. The Kinematics of the Permitted C ii λ 6578 Line in a Large Sample of Planetary Nebulae

    Energy Technology Data Exchange (ETDEWEB)

    Richer, Michael G.; Suárez, Genaro; López, José Alberto; García Díaz, María Teresa, E-mail: richer@astrosen.unam.mx, E-mail: gsuarez@astro.unam.mx, E-mail: jal@astrosen.unam.mx, E-mail: tere@astro.unam.mx [Instituto de Astronomía, Universidad Nacional Autónoma de México, Ensenada, Baja California (Mexico)

    2017-03-01

    We present spectroscopic observations of the C ii λ 6578 permitted line for 83 lines of sight in 76 planetary nebulae at high spectral resolution, most of them obtained with the Manchester Echelle Spectrograph on the 2.1 m telescope at the Observatorio Astronómico Nacional on the Sierra San Pedro Mártir. We study the kinematics of the C ii λ 6578 permitted line with respect to other permitted and collisionally excited lines. Statistically, we find that the kinematics of the C ii λ 6578 line are not those expected if this line arises from the recombination of C^2+ ions or the fluorescence of C^+ ions in ionization equilibrium in a chemically homogeneous nebular plasma, but instead its kinematics are those appropriate for a volume more internal than expected. The planetary nebulae in this sample have well-defined morphology and are restricted to a limited range in H α line widths (no large values) compared to their counterparts in the Milky Way bulge; both these features could be interpreted as the result of young nebular shells, an inference that is also supported by nebular modeling. Concerning the long-standing discrepancy between chemical abundances inferred from permitted and collisionally excited emission lines in photoionized nebulae, our results imply that multiple plasma components occur commonly in planetary nebulae.

  7. Equations of State: Gateway to Planetary Origin and Evolution (Invited)

    Science.gov (United States)

    Melosh, J.

    2013-12-01

    Research over the past decades has shown that collisions between solid bodies govern many crucial phases of planetary origin and evolution. The accretion of the terrestrial planets was punctuated by planetary-scale impacts that generated deep magma oceans, ejected primary atmospheres and probably created the moons of Earth and Pluto. Several extrasolar planetary systems are filled with silicate vapor and condensed 'tektites', probably attesting to recent giant collisions. Even now, long after the solar system settled down from its violent birth, a large asteroid impact wiped out the dinosaurs, while other impacts may have played a role in the origin of life on Earth and perhaps Mars, while maintaining a steady exchange of small meteorites between the terrestrial planets and our moon. Most of these events are beyond the scale at which experiments are possible, so that our main research tool is computer simulation, constrained by the laws of physics and the behavior of materials during high-speed impact. Typical solar system impact velocities range from a few km/s in the outer solar system to 10s of km/s in the inner system. Extrasolar planetary systems expand that range to 100s of km/sec typical of the tightly clustered planetary systems now observed. Although computer codes themselves are currently reaching a high degree of sophistication, we still rely on experimental studies to determine the Equations of State (EoS) of materials critical for the correct simulation of impact processes. The recent expansion of the range of pressures available for study, from a few 100 GPa accessible with light gas guns up to a few TPa from current high energy accelerators now opens experimental access to the full velocity range of interest in our solar system. The results are a surprise: several groups in both the USA and Japan have found that silicates and even iron melt and vaporize much more easily in an impact than previously anticipated. The importance of these findings is

  8. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low speed-side of the mixing layer, and a reduced activity on the high speed-side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  9. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system

    Science.gov (United States)

    Jensen, Tue V.; Pinson, Pierre

    2017-11-01

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset, open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.
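    The statement that generation "may be scaled according to the envisioned degrees of renewable penetration" amounts to multiplying the weather-driven time series by a common factor so that renewable energy covers a chosen share of demand. The sketch below applies that idea to synthetic hourly series; the series and numbers are illustrative and do not reflect the dataset's actual schema.

    ```python
    import numpy as np

    def scale_to_penetration(wind, solar, demand, target_share):
        """Scale renewable time series by one factor so their total energy
        equals target_share of total demand energy (illustrative only)."""
        wind, solar, demand = map(np.asarray, (wind, solar, demand))
        factor = target_share * demand.sum() / (wind.sum() + solar.sum())
        return factor * wind, factor * solar

    # Synthetic hourly data for one node over a week.
    rng = np.random.default_rng(0)
    hours = 24 * 7
    demand = 50.0 + 10.0 * rng.random(hours)
    wind = 20.0 * rng.random(hours)
    solar = np.clip(15.0 * np.sin(np.linspace(0.0, 14.0 * np.pi, hours)), 0.0, None)
    w, s = scale_to_penetration(wind, solar, demand, target_share=0.5)
    print(round((w.sum() + s.sum()) / demand.sum(), 2))   # -> 0.5
    ```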

  10. RE-Europe, a large-scale dataset for modeling a highly renewable European electricity system.

    Science.gov (United States)

    Jensen, Tue V; Pinson, Pierre

    2017-11-28

    Future highly renewable energy systems will couple to complex weather and climate dynamics. This coupling is generally not captured in detail by the open models developed in the power and energy system communities, where such open models exist. To enable modeling such a future energy system, we describe a dedicated large-scale dataset for a renewable electric power system. The dataset combines a transmission network model, as well as information for generation and demand. Generation includes conventional generators with their technical and economic characteristics, as well as weather-driven forecasts and corresponding realizations for renewable energy generation for a period of 3 years. These may be scaled according to the envisioned degrees of renewable penetration in a future European energy system. The spatial coverage, completeness and resolution of this dataset, open the door to the evaluation, scaling analysis and replicability check of a wealth of proposals in, e.g., market design, network actor coordination and forecasting of renewable power generation.

  11. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is an observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, further witnessed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other ("two-halo conformity" or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  12. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  13. Collisional stripping of planetary crusts

    Science.gov (United States)

    Carter, Philip J.; Leinhardt, Zoë M.; Elliott, Tim; Stewart, Sarah T.; Walter, Michael J.

    2018-02-01

    Geochemical studies of planetary accretion and evolution have invoked various degrees of collisional erosion to explain differences in bulk composition between planets and chondrites. Here we undertake a full, dynamical evaluation of 'crustal stripping' during accretion and its key geochemical consequences. Crusts are expected to contain a significant fraction of planetary budgets of incompatible elements, which include the major heat producing nuclides. We present smoothed particle hydrodynamics simulations of collisions between differentiated rocky planetesimals and planetary embryos. We find that the crust is preferentially lost relative to the mantle during impacts, and we have developed a scaling law based on these simulations that approximates the mass of crust that remains in the largest remnant. Using this scaling law and a recent set of N-body simulations of terrestrial planet formation, we have estimated the maximum effect of crustal stripping on incompatible element abundances during the accretion of planetary embryos. We find that on average approximately one third of the initial crust is stripped from embryos as they accrete, which leads to a reduction of ∼20% in the budgets of the heat producing elements if the stripped crust does not reaccrete. Erosion of crusts can lead to non-chondritic ratios of incompatible elements, but the magnitude of this effect depends sensitively on the details of the crust-forming melting process on the planetesimals. The Lu/Hf system is fractionated for a wide range of crustal formation scenarios. Using eucrites (the products of planetesimal silicate melting, thought to represent the crust of Vesta) as a guide to the Lu/Hf of planetesimal crust partially lost during accretion, we predict the Earth could evolve to a superchondritic 176Hf/177Hf (3-5 parts per ten thousand) at present day. Such values are in keeping with compositional estimates of the bulk Earth. Stripping of planetary crusts during accretion can lead to
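    A crude reading of the numbers quoted above links the stripped crust fraction to the loss of heat-producing elements (HPE): if the crust carries a fraction f of an embryo's HPE budget and a fraction s of the crust is stripped without reaccreting, the budget falls by roughly s·f. The snippet below only performs that back-of-envelope inversion; it ignores mantle stripping, reaccretion and mass-weighting across embryos, and is not a calculation from the paper.

    ```python
    # Back-of-envelope only; both input numbers are quoted in the abstract.
    s = 1.0 / 3.0       # fraction of crust stripped on average
    reduction = 0.20    # resulting reduction in the HPE budget
    f = reduction / s   # implied crustal share of the HPE budget
    print(f"implied crustal share of the HPE budget: about {f:.0%}")
    ```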

  14. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  15. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
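    The analysis idea — measuring local small-scale activity and correlating it with the large-scale signal versus its gradient — can be illustrated in one dimension. The toy construction below deliberately modulates the small-scale amplitude with the large-scale gradient and then recovers that link; the filter widths and amplitudes are arbitrary, and this is not the DNS analysis of the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 2**14
    wide = np.ones(1025) / 1025     # kernel used to build the large-scale signal
    box = np.ones(257) / 257        # kernel used in the analysis

    # Large-scale signal plus small-scale noise whose amplitude is modulated
    # by the magnitude of the large-scale gradient.
    large = np.convolve(rng.standard_normal(n), wide, mode="same") * 1025**0.5
    mod = np.abs(np.gradient(large)) / np.abs(np.gradient(large)).std()
    signal = large + 0.2 * (1.0 + mod) * rng.standard_normal(n)

    # Analysis: split scales with a moving average, measure windowed activity.
    sig_large = np.convolve(signal, box, mode="same")
    activity = np.sqrt(np.convolve((signal - sig_large) ** 2, box, mode="same"))

    print("corr with large-scale fluctuation:",
          round(np.corrcoef(activity, sig_large)[0, 1], 2))
    print("corr with |large-scale gradient| :",
          round(np.corrcoef(activity, np.abs(np.gradient(sig_large)))[0, 1], 2))
    ```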

  16. Planetary boundaries: exploring the safe operating space for humanity

    Science.gov (United States)

    Johan Rockström; Will Steffen; Kevin Noone; Asa Persson; F. Stuart Chapin; Eric Lambin; Timothy M. Lenton; Marten Scheffer; Carl Folke; Hans Joachim Schellnhuber; Björn Nykvist; Cynthia A. de Wit; Terry Hughes; Sander van der Leeuw; Henning Rodhe; Sverker Sörlin; Peter K. Snyder; Robert Costanza; Uno Svedin; Malin Falkenmark; Louise Karlberg; Robert W. Corell; Victoria J. Fabry; James Hansen; Brian Walker; Diana Liverman; Katherine Richardson; Paul Crutzen; Jonathan Foley

    2009-01-01

    Anthropogenic pressures on the Earth System have reached a scale where abrupt global environmental change can no longer be excluded. We propose a new approach to global sustainability in which we define planetary boundaries within which we expect that humanity can operate safely. Transgressing one or more planetary boundaries may be deleterious or even catastrophic due...

  17. On the parametrization of the planetary boundary layer of the atmosphere

    Energy Technology Data Exchange (ETDEWEB)

    Yordanov, D. [Bulgarian Academy of Sciences, Geophysical Inst., Sofia (Bulgaria); Syrakov, D.; Kolarova, M. [Bulgarian Academy of Sciences, National Inst. of Meteorology and Hydrology, Sofia (Bulgaria)

    1997-10-01

    The investigation of the dynamic processes in the planetary boundary layer presents a definite theoretical challenge and plays a growing role in the solution of a number of practical tasks. The improvement of large-scale atmospheric weather forecasts depends, to a certain degree, on the proper inclusion of planetary boundary layer dynamics in the numerical models. The modeling of the transport and diffusion of air pollutants requires estimation of the different processes in the Planetary Boundary Layer (PBL) and also needs a proper PBL parametrization. For the solution of these practical tasks the following PBL models were developed: (i) a baroclinic PBL model with its barotropic version, and (ii) a convective PBL model. Both models are one dimensional and are based on similarity theory and resistance laws extended to the whole PBL. Two different PBL parametrizations, for stable and for convective conditions, are proposed, on the basis of which the turbulent surface heat and momentum fluxes are estimated using generalized similarity theory. In the proposed parametrizations the internal parameters are calculated from the synoptic scale parameters, such as the geostrophic wind, potential temperature and humidity given at two levels (ground level and 850 hPa), and from them the PBL profiles. The models consist of two layers: a surface layer (SL) with a variable height and a second (Ekman) layer above it with a height-constant turbulent exchange coefficient. (au) 14 refs.
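    In surface-layer similarity theory of the kind used above, the simplest (neutral) relation between the wind profile and the momentum flux is the logarithmic law U(z) = (u*/κ) ln(z/z0), from which the friction velocity u* follows once the wind speed at one reference height is known. The sketch below covers only this neutral case with an assumed roughness length; the paper's parametrization additionally treats stability effects and the Ekman layer.

    ```python
    import numpy as np

    KAPPA = 0.4  # von Karman constant

    def friction_velocity(u_ref, z_ref, z0):
        """u* from a wind speed u_ref at height z_ref over roughness length z0,
        assuming a neutral surface layer: U(z) = (u*/kappa) * ln(z/z0)."""
        return KAPPA * u_ref / np.log(z_ref / z0)

    def log_wind_profile(z, u_star, z0):
        return (u_star / KAPPA) * np.log(np.asarray(z, dtype=float) / z0)

    # Example: 8 m/s measured at 10 m over grassland (z0 = 0.03 m assumed).
    u_star = friction_velocity(u_ref=8.0, z_ref=10.0, z0=0.03)
    print(round(u_star, 2), np.round(log_wind_profile([2.0, 10.0, 50.0], u_star, 0.03), 1))
    ```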

  18. DNA replication and cancer: From dysfunctional replication origin activities to therapeutic opportunities.

    Science.gov (United States)

    Boyer, Anne-Sophie; Walter, David; Sørensen, Claus Storgaard

    2016-06-01

    A dividing cell has to duplicate its DNA precisely once during the cell cycle to preserve genome integrity avoiding the accumulation of genetic aberrations that promote diseases such as cancer. A large number of endogenous impacts can challenge DNA replication and cells harbor a battery of pathways to promote genome integrity during DNA replication. This includes suppressing new replication origin firing, stabilization of replicating forks, and the safe restart of forks to prevent any loss of genetic information. Here, we describe mechanisms by which oncogenes can interfere with DNA replication thereby causing DNA replication stress and genome instability. Further, we describe cellular and systemic responses to these insults with a focus on DNA replication restart pathways. Finally, we discuss the therapeutic potential of exploiting intrinsic replicative stress in cancer cells for targeted therapy. Copyright © 2016 Elsevier Ltd. All rights reserved.

  19. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package ''ATLAS'' has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  20. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  1. Proactive replica checking to assure reliability of data in cloud storage with minimum replication

    Science.gov (United States)

    Murarka, Damini; Maheswari, G. Uma

    2017-11-01

    The two major issues for cloud storage systems are data reliability and storage costs. For data reliability protection, the multi-replica replication strategy mostly used in current clouds incurs huge storage consumption, leading to a large storage cost for applications within the cloud. This paper presents a cost-efficient data reliability mechanism named PRCR to cut back cloud storage consumption. PRCR ensures the reliability of large cloud datasets with minimum replication, which can also serve as a cost-effective benchmark for replication. Evaluation shows that, compared to the conventional three-replica approach, PRCR can reduce cloud storage consumption to between one third and a small fraction of that storage, hence considerably minimizing the cloud storage cost.
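    The storage argument can be made concrete with simple arithmetic: raw consumption scales linearly with the replication factor, so moving from three replicas towards minimum replication bounds the possible savings. The numbers below are illustrative placeholders, not figures from the paper.

    ```python
    # Back-of-envelope raw storage under a fixed replication factor.
    def raw_storage_pb(n_objects, avg_size_gb, replicas):
        return n_objects * avg_size_gb * replicas / 1e6   # petabytes

    three_way = raw_storage_pb(2_000_000_000, 0.1, replicas=3)
    minimum = raw_storage_pb(2_000_000_000, 0.1, replicas=1)  # PRCR-style lower bound
    print(f"3-replica: {three_way:.0f} PB, minimum replication: {minimum:.0f} PB "
          f"({minimum / three_way:.0%} of 3-replica)")
    ```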

  2. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  3. Planetary nebulae

    International Nuclear Information System (INIS)

    Amnuehl', P.R.

    1985-01-01

    The history of the discovery of planetary nebulae and studies of their origin and evolution are discussed in a popular way. The problem of the planetary nebula central star is considered, and the connection between white dwarf stars and planetary nebula formation is shown. The available data support the hypothesis of the red giant - planetary nebula nucleus - white dwarf transition process. The masses of planetary nebula white dwarfs and central stars are distributed practically identically: the mean mass is close to 0.6 M_Sun, where M_Sun is the mass of the Sun.

  4. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish a large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and out sourcing. The article is part of a planned series.

  5. Large-scale chromosome folding versus genomic DNA sequences: A discrete double Fourier transform technique.

    Science.gov (United States)

    Chechetkin, V R; Lobzin, V V

    2017-08-07

    The use of state-of-the-art techniques combining imaging methods and high-throughput genomic mapping tools has led to significant progress in detailing the chromosome architecture of various organisms. However, a gap still remains between the rapidly growing structural data on chromosome folding and large-scale genome organization. Could part of the information on chromosome folding be obtained directly from the underlying genomic DNA sequences abundantly stored in the databanks? To answer this question, we developed an original discrete double Fourier transform (DDFT). DDFT serves for the detection of large-scale genome regularities associated with domains/units at the different levels of hierarchical chromosome folding. The method is versatile and can be applied both to genomic DNA sequences and to corresponding physico-chemical parameters such as base-pairing free energy. The latter characteristic is closely related to replication and transcription and can also be used for the assessment of temperature or supercoiling effects on chromosome folding. We tested the method on the genome of E. coli K-12 and found good correspondence with the annotated domains/units established experimentally. As a brief illustration of further abilities of DDFT, a study of large-scale genome organization for bacteriophage PHIX174 and the bacterium Caulobacter crescentus was also added. The combined experimental, modeling, and bioinformatic DDFT analysis should yield more complete knowledge of chromosome architecture and genome organization. Copyright © 2017 Elsevier Ltd. All rights reserved.
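    The flavour of a "double" Fourier analysis can be conveyed with a generic sketch: take the power spectrum of a sequence-derived signal, then Fourier-transform that spectrum again, so that regularly spaced spectral peaks (i.e. a repeated large-scale unit) collapse into a single strong component. This illustrates the general idea only and is not the authors' exact DDFT formulation; the signal and unit size are synthetic.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n, unit = 2**16, 4096                 # signal length, repeated-unit size
    signal = rng.standard_normal(n)
    signal += np.tile(rng.standard_normal(unit), n // unit)  # hidden periodic units

    spectrum = np.abs(np.fft.rfft(signal - signal.mean())) ** 2
    second = np.abs(np.fft.rfft(spectrum - spectrum.mean()))

    peak = int(np.argmax(second[8:])) + 8   # skip the lowest bins (slow trend)
    print(f"second-spectrum peak at index {peak}; multiples of "
          f"~{len(spectrum) * unit // n} are expected for a unit of {unit}")
    ```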

  6. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  7. Optical Spectra of Radio Planetary Nebulae in the Large Magellanic Cloud

    Directory of Open Access Journals (Sweden)

    Payne, J. L.

    2008-12-01

    We present 11 spectra from 12 candidate radio sources co-identified with known planetary nebulae (PNe) in the Large Magellanic Cloud (LMC). Originally found in Australia Telescope Compact Array (ATCA) LMC surveys at 1.4, 4.8 and 8.64 GHz and confirmed by new high-resolution ATCA images at 6 and 3 cm (4 arcsec and 2 arcsec, respectively), these complement data recently presented for candidate radio PNe in the Small Magellanic Cloud (SMC). Their spectra were obtained using the Radcliffe 1.9-metre telescope in Sutherland (South Africa). All of the optical PNe and radio candidates agree in position to within 2 arcsec but may represent only a selected, radio-bright sample. Nebular ionized masses of these objects are estimated to be as high as 1.8 $M_\odot$, supporting the idea that the central stars of massive PN progenitors lose much of their mass in the asymptotic giant branch (AGB) phase or earlier. We also identify a sub-population (33%) of radio PN candidates with prominent ionized iron emission lines.

  8. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large-scale model testing performed using the large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. Results are described from tests of the resistance of the materials to non-ductile fracture. The testing included the base materials and welded joints. The rated specimen thickness was 150 mm, with defects of a depth between 15 and 100 mm. Results are also presented for nozzles of 850 mm inner diameter at a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During the cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  9. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large scale and 35 on a small scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high workload and a high risk of detection, and thus demand a higher level of organizational skills than small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed.

  10. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into, and challenges of, distributed large-scale dimensional metrology. Enables practitioners to study distributed large-scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  11. SHAPING THE GLOWING EYE PLANETARY NEBULA, NGC 6751

    International Nuclear Information System (INIS)

    Clark, D. M.; Garcia-Diaz, Ma. T.; Lopez, J. A.; Steffen, W. G.; Richer, M. G.

    2010-01-01

    NGC 6751 is a highly structured multiple-shell planetary nebula (PN) with a bipolar outflow. In this work, we present a comprehensive set of spatially resolved, high spectral resolution, long-slit spectra and deep imaging from San Pedro Martir, Gemini, the Hα composite full sky survey and archive images from the Hubble Space Telescope and Spitzer. This material allows us to identify all the main morphological components and study their detailed kinematics. We find a thick equatorial structure fragmented into multiple knots that enclose a fast expanding bubble with a filamentary surface structure. The knotty ring is surrounded by faint emission from a disk-like envelope. Lobes with embedded filaments form a bipolar outflow. The equatorial ring is tilted with respect to the line of sight and with respect to the bipolar outflow. A spherical halo surrounds the PN and there is material further out identified as a fragmented outer halo. This information is used to derive a three-dimensional morpho-kinematic model using the code SHAPE that closely replicates the observed image and long-slit spectra of the nebula, providing a fair representation of its complex structure. NGC 6751 is located close to the galactic plane and its large-scale surrounding environment is shown to be a gas-rich region. We find indications that the PN is interacting with the interstellar medium. Emission components from an extended nebulosity located a couple of arcminutes away from the nebula have radial velocities that are inconsistent with the rest of NGC 6751 and are confirmed as originating from the ambient material, not related to the PN, in agreement with a previous suggestion.

  12. An Ion-Propelled Cubesat for Planetary Defense and Planetary Science

    Science.gov (United States)

    Russell, Christopher T.; Wirz, Richard; Lai, Hairong; Li, Jian-Yang; Connors, Martin

    2017-04-01

    Small satellites can reduce the cost of launch by riding along with other payloads on a large rocket or being launched on a small rocket, but are perceived as having limited capabilities. This perception can be at least partially overcome by innovative design, including ample in-flight propulsion, which allows multiple targets to be reached and adaptive exploration. Ion propulsion has been pioneered on Deep Space 1 and honed on the long-duration, multiple-planetary-body mission Dawn. Most importantly, the operation of such a mission is now well understood, including navigation, communication, and science operations for remote sensing. We examined different mission concepts that can be used for both planetary defense and planetary science near 1 AU. Such a spacecraft would travel in the region between Venus and Mars, allowing a complete inventory of the material there, down to objects of about 10 m diameter. The ion engines could be used to approach these bodies slowly and carefully, allowing the spacecraft to map debris and follow its collisional evolution throughout its orbit around the Sun, if so desired. The heritage of Dawn operations experience enables the mission to be operated inexpensively, and the engineering heritage will allow it to be operated for many trips around the Sun.

  13. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  14. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Large-scale ringing schemes (e.g. Peach et al., 1998; DeSante et al., 2001) are generally co-ordinated by ringing centres such as those that make up the membership of EURING. In some countries volunteer census work (often called Breeding Bird Surveys) is undertaken by the same organizations, while in others different bodies may co-ordinate this aspect of the work. This session was concerned with the analysis of such extensive data sets and the approaches that are being developed to address the key theoretical and applied issues outlined above. The papers reflect the development of more spatially explicit approaches to analyses of data gathered at large spatial scales. They show that while the statistical tools that have been developed in recent years can be used to derive useful biological conclusions from such data, there is a need for further developments. Future work should also consider how best to implement such analytical developments within future study designs. In his plenary paper Andy Royle (Royle, 2004) addresses this theme directly by describing a general framework for modelling spatially replicated abundance data, as sketched below. The approach is based on the idea that a set of spatially referenced local populations constitutes a metapopulation, within which local abundance is determined as a random process. This provides an elegant and general approach in which the metapopulation model described above is combined with a data-generating model specific to the type of data being analysed, to define a simple hierarchical model that can be analysed using conventional methods. It should be noted, however, that further software development will be needed if the approach is to be made readily available to biologists. The approach is well suited to dealing with sparse data and avoids the need for data aggregation prior to analysis. Spatial synchrony has received most attention in studies of species whose populations show cyclic fluctuations, particularly certain game birds and small mammals. However
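
    As a rough, hedged sketch of the kind of hierarchical model referred to above (not Royle's own code; the site and visit numbers, the Poisson abundance model and the binomial detection model are illustrative assumptions), the Python example below simulates spatially replicated counts and recovers abundance and detection parameters by marginalising over the latent local abundances.

        # Hedged sketch of a simple N-mixture-style hierarchical model:
        # latent abundance N_i ~ Poisson(lambda), replicated counts y_ij ~ Binomial(N_i, p).
        import numpy as np
        from scipy.optimize import minimize
        from scipy.stats import binom, poisson

        rng = np.random.default_rng(1)
        n_sites, n_visits, lam_true, p_true = 100, 3, 4.0, 0.5
        N = rng.poisson(lam_true, n_sites)                          # latent local abundances
        y = rng.binomial(N[:, None], p_true, (n_sites, n_visits))   # replicated counts

        def neg_log_lik(theta, y, n_max=50):
            lam, p = np.exp(theta[0]), 1 / (1 + np.exp(-theta[1]))
            n_grid = np.arange(n_max + 1)
            prior = poisson.pmf(n_grid, lam)                        # P(N = n)
            ll = 0.0
            for counts in y:                                        # marginalise N per site
                like_n = np.prod(binom.pmf(counts[:, None], n_grid, p), axis=0)
                ll += np.log(np.sum(like_n * prior) + 1e-300)
            return -ll

        fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(y,), method="Nelder-Mead")
        print(np.exp(fit.x[0]), 1 / (1 + np.exp(-fit.x[1])))        # estimates of lambda, p

    In practice, models of this class are usually fitted with dedicated packages (for example the R package unmarked); the sketch only exposes the likelihood structure.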

  15. DNA replication and cancer

    DEFF Research Database (Denmark)

    Boyer, Anne-Sophie; Walter, David; Sørensen, Claus Storgaard

    2016-01-01

    A dividing cell has to duplicate its DNA precisely once during the cell cycle to preserve genome integrity avoiding the accumulation of genetic aberrations that promote diseases such as cancer. A large number of endogenous impacts can challenge DNA replication and cells harbor a battery of pathways...... causing DNA replication stress and genome instability. Further, we describe cellular and systemic responses to these insults with a focus on DNA replication restart pathways. Finally, we discuss the therapeutic potential of exploiting intrinsic replicative stress in cancer cells for targeted therapy....

  16. A new planetary nebula in the outer reaches of the Galaxy

    DEFF Research Database (Denmark)

    Viironen, K.; Mampaso, A.; L. M. Corradi, R.

    2011-01-01

    of a new planetary nebula towards the Anticentre direction, IPHASX J052531.19+281945.1 (PNG 178.1-04.0), is presented. The planetary nebula was discovered from the IPHAS survey. Long-slit follow-up spectroscopy was carried out to confirm its planetary nebula nature and to calculate its physical...... and chemical characteristics. The newly discovered planetary nebula turned out to be located at a very large galactocentric distance (D_GC=20.8+-3.8 kpc), larger than any previously known planetary nebula with measured abundances. Its relatively high oxygen abundance (12+log(O/H) = 8.36+-0.03) supports...

  17. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (e.g., in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and the underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  18. Large-scale functional purification of recombinant HIV-1 capsid.

    Directory of Open Access Journals (Sweden)

    Magdeleine Hung

    During human immunodeficiency virus type-1 (HIV-1) virion maturation, capsid proteins undergo a major rearrangement to form a conical core that protects the viral nucleoprotein complexes. Mutations in the capsid sequence that alter the stability of the capsid core are deleterious to viral infectivity and replication. Recently, capsid assembly has become an attractive target for the development of a new generation of anti-retroviral agents. Drug screening efforts and subsequent structural and mechanistic studies require gram quantities of active, homogeneous and pure protein. Conventional means of laboratory purification of Escherichia coli expressed recombinant capsid protein rely on column chromatography steps that are not amenable to large-scale production. Here we present a function-based purification of wild-type and quadruple-mutant capsid proteins, which relies on the inherent propensity of capsid protein to polymerize and depolymerize. This method does not require the packing of sizable chromatography columns and can generate double-digit gram quantities of functionally and biochemically well-behaved proteins with greater than 98% purity. We have used the purified capsid protein to characterize two known assembly inhibitors in our in-house developed polymerization assay and to measure their binding affinities. Our capsid purification procedure provides a robust method for purifying large quantities of a key protein in the HIV-1 life cycle, facilitating the identification of next-generation anti-HIV agents.

  19. Entanglement replication in driven dissipative many-body systems.

    Science.gov (United States)

    Zippilli, S; Paternostro, M; Adesso, G; Illuminati, F

    2013-01-25

    We study the dissipative dynamics of two independent arrays of many-body systems, locally driven by a common entangled field. We show that in the steady state the entanglement of the driving field is reproduced in an arbitrarily large series of inter-array entangled pairs over all distances. Local nonclassical driving thus realizes a scale-free entanglement replication and long-distance entanglement distribution mechanism that has immediate bearing on the implementation of quantum communication networks.

  20. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  1. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, where numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed to large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends in large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  2. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  3. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large-scale scientific infrastructures held in Lund, Sweden, on 13-14 October. (Photo caption: participants at the energy management for large-scale scientific infrastructures workshop.) The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation to science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  4. PLANETARY-SCALE STRONTIUM ISOTOPIC HETEROGENEITY AND THE AGE OF VOLATILE DEPLETION OF EARLY SOLAR SYSTEM MATERIALS

    Energy Technology Data Exchange (ETDEWEB)

    Moynier, Frederic; Podosek, Frank A. [Department of Earth and Planetary Science and McDonnell Center for Space Sciences, Washington University, St. Louis, MO 63130 (United States); Day, James M. D. [Geosciences Research Division, Scripps Institution of Oceanography, La Jolla, CA 92093-0244 (United States); Okui, Wataru; Yokoyama, Tetsuya [Department of Earth and Planetary Sciences, Tokyo Institute of Technology, Tokyo 152-8551 (Japan); Bouvier, Audrey [Department of Earth Sciences, University of Minnesota, Minneapolis, MN 55455-0231 (United States); Walker, Richard J., E-mail: moynier@levee.wustl.edu, E-mail: fap@levee.wustl.edu, E-mail: jmdday@ucsd.edu, E-mail: rjwalker@umd.edu, E-mail: okui.w.aa@m.titech.ac.jp, E-mail: tetsuya.yoko@geo.titech.ac.jp, E-mail: abouvier@umn.edu [Department of Geology, University of Maryland, College Park, MD 20742 (United States)

    2012-10-10

    Isotopic anomalies in planetary materials reflect both early solar nebular heterogeneity inherited from presolar stellar sources and processes that generated non-mass-dependent isotopic fractionations. The characterization of isotopic variations in heavy elements among early solar system materials yields important insight into the stellar environment and formation of the solar system, and into initial isotopic ratios relevant to long-term chronological applications. One such heavy element, strontium, is central in the geosciences due to the wide application of the long-lived {sup 87}Rb-{sup 87}Sr radioactive decay system as a chronometer. We show that the stable isotopes of Sr were heterogeneously distributed at both the mineral scale and the planetary scale in the early solar system, and also that the Sr isotopic heterogeneities correlate with mass-independent oxygen isotope variations, with only CI chondrites plotting outside of this correlation. The correlation implies that most solar system material formed by mixing of at least two isotopically distinct components: a CV-chondrite-like component and an O-chondrite-like component, and possibly a distinct CI-chondrite-like component. The heterogeneous distribution of Sr isotopes may indicate that variations in the initial {sup 87}Sr/{sup 86}Sr of early solar system materials reflect isotopic heterogeneity instead of having chronological significance, as interpreted previously. For example, given the differences in {sup 84}Sr/{sup 86}Sr between calcium-aluminum inclusions and eucrites ({epsilon}{sup 84}Sr > 2), the difference in age between these materials would be {approx}6 Ma shorter than previously interpreted, placing the Sr chronology in agreement with other long- and short-lived isotope systems, such as U-Pb and Mn-Cr.

  5. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, aimed at ensuring the safety of light water reactors, was started in fiscal 1976 under the special account act for power source development promotion measures, entrusted by the Science and Technology Agency. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents by joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests: one using a cylindrical core testing apparatus for examining the overall system effect, and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  6. Lessons learned from planetary science archiving

    Science.gov (United States)

    Zender, J.; Grayzeck, E.

    2006-01-01

    The need for scientific archiving of past, current, and future planetary scientific missions, laboratory data, and modeling efforts is indisputable. To quote the words of G. Santayana carved over the entrance of the US National Archives in Washington DC: “Those who can not remember the past are doomed to repeat it.” The design, implementation, maintenance, and validation of planetary science archives are, however, disputed by the involved parties. The integration of the archives into the scientific heritage is problematic. For example, there is an imbalance between space agency requirements and institutional and national interests. The disparity between long-term archive requirements and immediate data analysis requests is significant. The discrepancy between a space mission's archive budget and the effort required to design and build the data archive is large. An imbalance exists between new instrument development and existing, well-proven archive standards. The authors present their view of the problems and risk areas in the archiving concepts, based on their experience acquired within NASA’s Planetary Data System (PDS) and ESA’s Planetary Science Archive (PSA). Individual risks and potential problem areas are discussed based on a model derived from a system analysis done upfront. The major risk for a planetary mission science archive is seen in the combination of minimal involvement by Mission Scientists and inadequate funding. The authors outline how the risks can be reduced. The paper ends with the authors' view on future planetary archive implementations, including the archive interoperability aspect.

  7. Conformal Ablative Thermal Protection System for Small and Large Scale Missions: Approaching TRL 6 for Planetary and Human Exploration Missions and TRL 9 for Small Probe Missions

    Science.gov (United States)

    Beck, R. A. S.; Gasch, M. J.; Milos, F. S.; Stackpoole, M. M.; Smith, B. P.; Switzer, M. R.; Venkatapathy, E.; Wilder, M. C.; Boghhozian, T.; Chavez-Garcia, J. F.

    2015-01-01

    In 2011, NASA's Aeronautics Research Mission Directorate (ARMD) funded an effort to develop an ablative thermal protection system (TPS) material that would have improved properties when compared to Phenolic Impregnated Carbon Ablator (PICA) and AVCOAT. Their goal was a conformal material, processed with a flexible reinforcement, that would result in similar or better thermal characteristics and higher strain-to-failure characteristics, allowing easier integration on flight aeroshells than then-current rigid ablative TPS materials. In 2012, NASA's Space Technology Mission Directorate (STMD) began funding the maturation of the best formulation of the game-changing conformal ablator, C-PICA. Progress has been reported at IPPW over the past three years, describing C-PICA with a density and recession rates similar to PICA, but with a higher strain-to-failure, which allows for direct bonding and no gap fillers, and, even more important, with thermal characteristics resulting in half the temperature rise of PICA. Overall, C-PICA should be able to replace PICA with a thinner, lighter-weight, less complicated design. These characteristics should be particularly attractive for use as backshell TPS on high-energy planetary entry vehicles. At the end of this year, the material should be ready for missions to consider including in their design; in fact, NASA's Science Mission Directorate (SMD) is considering incentivizing the use of C-PICA in the next Discovery proposal call. This year both scale-up of the material to large (1-m) sized pieces and the design and build of small probe heatshields for flight tests will be completed. NASA, with an industry partner, will build a 1-m long manufacturing demonstration unit (MDU) with a shape based on a mid-L/D lifting body. In addition, in an effort to fly as you test and test as you fly, NASA, with a second industry partner, will build a small probe to test in the Interactive Heating Facility (IHF) arc jet and, using nearly the

  8. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    A wide range of large-scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is strongly constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble-scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large-scale cosmic anomalies.

  9. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied

  10. Proto-planetary nebulae

    International Nuclear Information System (INIS)

    Zuckerman, B.

    1978-01-01

    A 'proto-planetary nebula' or 'planetary nebula progenitor' is the term used to describe those objects that are losing mass at a rate of approximately 10^-5 solar masses/year or more (i.e. comparable to mass-loss rates in planetary nebulae with ionized masses of approximately 0.2 solar masses or more) and which, it is believed, will become planetary nebulae themselves within 5 years. It is shown that most proto-planetary nebulae appear as very red objects, although a few have been 'caught' near the middle of the Hertzsprung-Russell diagram. The precursors of these proto-planetaries are the general red giant population, more specifically probably Mira and semi-regular variables. (Auth.)

  11. Agriculture production as a major driver of the earth system exceeding planetary boundaries

    DEFF Research Database (Denmark)

    Campbell, Bruce Morgan; Beare, Douglas J.; Bennett, Elena M.

    2017-01-01

    We explore the role of agriculture in destabilizing the Earth system at the planetary scale, through examining nine planetary boundaries, or “safe limits”: land-system change, freshwater use, biogeochemical flows, biosphere integrity, climate change, ocean acidification, stratospheric ozone...

  12. Planetary Boundaries: Exploring the Safe Operating Space for Humanity

    DEFF Research Database (Denmark)

    Richardson, Katherine; Rockström, Johan; Steffen, Will

    2009-01-01

    boundaries are rough, first estimates only, surrounded by large uncertainties and knowledge gaps. Filling these gaps will require major advancements in Earth System and resilience science. The proposed concept of "planetary boundaries" lays the groundwork for shifting our approach to governance...... and management, away from the essentially sectoral analyses of limits to growth aimed at minimizing negative externalities, toward the estimation of the safe space for human development. Planetary boundaries define, as it were, the boundaries of the "planetary playing field" for humanity if we want to be sure...

  13. Extinction of planetary nebulae and the turbulent structure of the galaxy

    International Nuclear Information System (INIS)

    Lerche, I.; Milne, D.K.

    1980-01-01

    Fluctuations in the extinction of planetary nebulae provide strong support for the concept of a turbulent interstellar medium. We have analyzed theoretically the mean extinction and its variance as a function of height, z, above the galactic plane. The mean increases monotonically, and exponentially, to a saturation level. The variance increases as z^2 for small z and has damped oscillations for intermediate z, before levelling off at large z. The observed mean extinction and the observed variance are found to be in excellent agreement with these theoretical deductions. The spatial scale of the mean extinction is estimated to be 100 pc; the oscillation scale of the variance and the damping scale of the oscillations are estimated to be about 200 ± 100 pc. The rms level of density fluctuations in the absorbing material causing the extinction is about equal to the mean value
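
    The expressions below are an illustrative reading of the behaviour described in this abstract, not the authors' exact formulas: a mean extinction that rises exponentially to a saturation value A_inf over a scale height h of order 100 pc, and a variance growing quadratically for small heights,

        \langle A(z) \rangle \simeq A_{\infty}\left[1 - e^{-|z|/h}\right], \qquad \sigma_A^2(z) \propto z^2 \quad \text{for } |z| \ll h .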

  14. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    The mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw materials powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long-length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) Manufacturing of a large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length has been successfully carried out using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and the microstructure of the manufactured mother tube are similar to those of the existing mother tube manufactured with a small-scale can, and no remarkable difference between the bottom and top ends of the manufactured mother tube has been observed. (3) Long-length cladding has been successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, the process of manufacturing mother tubes using large-scale hollow capsules is promising. (author)

  15. Remote Sensing Data Analytics for Planetary Science with PlanetServer/EarthServer

    Science.gov (United States)

    Rossi, Angelo Pio; Figuera, Ramiro Marco; Flahaut, Jessica; Martinot, Melissa; Misev, Dimitar; Baumann, Peter; Pham Huu, Bang; Besse, Sebastien

    2016-04-01

    Planetary Science datasets, beyond the change in the last two decades from physical volumes to internet-accessible archives, still face the problem of large-scale processing and analytics (e.g. Rossi et al., 2014; Gaddis and Hare, 2015). PlanetServer, the Planetary Science Data Service of the EC-funded EarthServer-2 project (#654367), tackles the planetary Big Data analytics problem with an array database approach (Baumann et al., 2014). It is developed to serve a large amount of calibrated, map-projected planetary data online, mainly through the Open Geospatial Consortium (OGC) Web Coverage Processing Service (WCPS) (e.g. Rossi et al., 2014; Oosthoek et al., 2013; Cantini et al., 2014). The focus of the H2020 evolution of PlanetServer is still on complex multidimensional data, particularly hyperspectral imaging and topographic cubes and imagery. In addition to hyperspectral and topographic data from Mars (Rossi et al., 2014), WCPS is applied to diverse datasets on the Moon, as well as Mercury; other Solar System bodies are going to be progressively available. Derived parameters such as summary products and indices can be produced through WCPS queries, as well as derived imagery colour-combination products, dynamically generated and accessed also through the OGC Web Coverage Service (WCS). Scientific questions translated into queries can be posed to a large number of individual coverages (data products), locally, regionally or globally. The new PlanetServer system uses the Open Source NASA WorldWind (e.g. Hogan, 2011) virtual globe as visualisation engine, and the array database Rasdaman Community Edition as core server component. Analytical tools and client components of relevance for multiple communities and disciplines are shared across services such as the Earth Observation and Marine Data Services of EarthServer. The Planetary Science Data Service of EarthServer is accessible on http://planetserver.eu. All its code base is going to be available on GitHub, on
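
    As a hedged illustration of how such a query-based workflow can look from a client, the Python sketch below posts a WCPS query over HTTP. The endpoint URL, the coverage name and the band identifiers are hypothetical placeholders; only the general OGC WCS/WCPS ProcessCoverages request pattern is assumed here.

        # Hedged sketch: posing a WCPS query over HTTP. Endpoint, coverage and
        # band names are hypothetical placeholders, not PlanetServer's actual IDs.
        import requests

        ENDPOINT = "https://planetserver.example/rasdaman/ows"   # hypothetical URL

        # Example WCPS query: a band-ratio "index" computed server-side over a
        # hypothetical hyperspectral coverage and returned as CSV.
        wcps_query = """
        for $c in (crism_cube_example)
        return encode(
            ($c.band_233 - $c.band_13) / ($c.band_233 + $c.band_13),
            "csv")
        """

        resp = requests.get(ENDPOINT, params={
            "service": "WCS", "version": "2.0.1",
            "request": "ProcessCoverages", "query": wcps_query})
        resp.raise_for_status()
        print(resp.text[:200])   # first part of the server's reply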

  16. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of a large-scale dynamo. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of 1/10 the box size. The energy fluxes and shell-to-shell transfer rates computed using the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.
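
    Shell-based diagnostics of this kind start from the spectral energy content of the fields. As a rough, hedged sketch (synthetic random data standing in for a simulation snapshot; grid size and normalisation are arbitrary assumptions), the Python example below bins the magnetic energy of a periodic 3D field into spherical wavenumber shells, the basic building block of energy-flux and shell-to-shell transfer calculations.

        # Hedged sketch: shell-binned magnetic energy spectrum E_b(k) from a
        # periodic 3D field snapshot; field data here are synthetic placeholders.
        import numpy as np

        n = 64
        rng = np.random.default_rng(2)
        b = rng.standard_normal((3, n, n, n))          # stand-in for (bx, by, bz)

        bk = np.fft.fftn(b, axes=(1, 2, 3)) / n**3     # normalised Fourier amplitudes
        energy_density = 0.5 * np.sum(np.abs(bk)**2, axis=0)

        k = np.fft.fftfreq(n, d=1.0 / n)               # integer wavenumbers
        kx, ky, kz = np.meshgrid(k, k, k, indexing="ij")
        k_mag = np.sqrt(kx**2 + ky**2 + kz**2)

        shells = np.arange(0, n // 2 + 1)
        spectrum = np.array([energy_density[(k_mag >= s - 0.5) & (k_mag < s + 0.5)].sum()
                             for s in shells])
        print(spectrum[:8])                            # energy content of the largest scales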

  17. Planetary Radar

    Science.gov (United States)

    Neish, Catherine D.; Carter, Lynn M.

    2015-01-01

    This chapter describes the principles of planetary radar, and the primary scientific discoveries that have been made using this technique. The chapter starts by describing the different types of radar systems and how they are used to acquire images and accurate topography of planetary surfaces and probe their subsurface structure. It then explains how these products can be used to understand the properties of the target being investigated. Several examples of discoveries made with planetary radar are then summarized, covering solar system objects from Mercury to Saturn. Finally, opportunities for future discoveries in planetary radar are outlined and discussed.

  18. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. For a better understanding of hydrological changes, it is of crucial importance to determine how, and to what extent, trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and of the North Atlantic sea level pressure (SLP), in order to gain additional insight into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) defining those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
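
    The sketch below is a hedged, minimal Python illustration of that idea under stated assumptions: synthetic monthly series stand in for the SLP predictor and the streamflow predictand, the 'db4' wavelet and the decomposition level are arbitrary choices, and a per-scale linear regression stands in for the full ESD scheme described by the authors.

        # Hedged sketch: scale-wise regression after a discrete wavelet
        # multiresolution decomposition; all data here are synthetic.
        import numpy as np
        import pywt

        def mra_components(x, wavelet="db4", level=4):
            """Split x into additive detail/approximation components, one per scale."""
            coeffs = pywt.wavedec(x, wavelet, level=level)
            comps = []
            for i in range(len(coeffs)):
                kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
                comps.append(pywt.waverec(kept, wavelet)[: len(x)])
            return comps                                   # components sum back to ~x

        rng = np.random.default_rng(3)
        months = 600
        slp = np.sin(2 * np.pi * np.arange(months) / 12) + 0.5 * rng.standard_normal(months)
        flow = 0.8 * slp + 0.3 * rng.standard_normal(months)    # toy predictand

        pred_comps, obs_comps = mra_components(slp), mra_components(flow)
        recon = np.zeros(months)
        for xs, ys in zip(pred_comps, obs_comps):
            a, b = np.polyfit(xs, ys, 1)                   # per-scale linear link
            recon += a * xs + b
        print(np.corrcoef(recon, flow)[0, 1])              # skill of the scale-wise model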

  19. From Planetary Mapping to Map Production: Planetary Cartography as integral discipline in Planetary Sciences

    Science.gov (United States)

    Nass, Andrea; van Gasselt, Stephan; Hargitai, Hendrik; Hare, Trent; Manaud, Nicolas; Karachevtseva, Irina; Kersten, Elke; Roatsch, Thomas; Wählisch, Marita; Kereszturi, Akos

    2016-04-01

    Cartography is one of the most important communication channels between users of spatial information and laymen as well as the open public alike. This applies to all known real-world objects located either here on Earth or on any other object in our Solar System. In planetary sciences, however, the main use of cartography resides in a concept called planetary mapping, with all its various attached meanings: it can be (1) systematic spacecraft observation from orbit, i.e. the retrieval of physical information, (2) the interpretation of discrete planetary surface units and their abstraction, or (3) planetary cartography sensu stricto, i.e., the technical and artistic creation of map products. As the concept of planetary mapping covers a wide range of different information and knowledge levels, the aims associated with the concept of mapping consequently range from a technical and engineering focus to a scientific distillation process. Among others, scientific centers focusing on planetary cartography are the United States Geological Survey (USGS, Flagstaff), the Moscow State University of Geodesy and Cartography (MIIGAiK, Moscow), Eötvös Loránd University (ELTE, Hungary), and the German Aerospace Center (DLR, Berlin). The International Astronomical Union (IAU), the Commission on Planetary Cartography within the International Cartographic Association (ICA), the Open Geospatial Consortium (OGC), the WG IV/8 Planetary Mapping and Spatial Databases within the International Society for Photogrammetry and Remote Sensing (ISPRS) and a range of other institutions contribute to definition frameworks in planetary cartography. Classical cartography is nowadays often (mis-)understood mainly as a tool rather than as a scientific discipline and an art of communication. Consequently, concepts of information systems, mapping tools and cartographic frameworks are used interchangeably, and cartographic workflows and the visualization of spatial information in thematic maps have often been

  20. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  1. Multiscale regime shifts and planetary boundaries

    NARCIS (Netherlands)

    Hughes, T.P.; Carpenter, S.; Rockstrom, J.; Scheffer, M.; Walker, B.

    2013-01-01

    Life on Earth has repeatedly displayed abrupt and massive changes in the past, and there is no reason to expect that comparable planetary-scale regime shifts will not continue in the future. Different lines of evidence indicate that regime shifts occur when the climate or biosphere transgresses a

  2. Antibodies, synthetic peptides and related constructs for planetary health based on green chemistry in the Anthropocene.

    Science.gov (United States)

    C Caoili, Salvador Eugenio

    2018-03-01

    The contemporary Anthropocene is characterized by rapidly evolving complex global challenges to planetary health vis-a-vis sustainable development, yet innovation is constrained under the prevailing precautionary regime that regulates technological change. Small-molecule xenobiotic drugs are amenable to efficient large-scale industrial synthesis; but their pharmacokinetics, pharmacodynamics, interactions and ultimate ecological impact are difficult to predict, raising concerns over initial testing and environmental contamination. Antibodies and similar agents can serve as antidotes and drug buffers or vehicles to address patient safety and decrease dosing requirements. More generally, peptidic agents including synthetic peptide-based constructs exemplified by vaccines can be used together with or instead of nonpeptidic xenobiotics, thus enabling advances in planetary health based on principles of green chemistry from manufacturing through final disposition.

  3. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.
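
    A hedged sketch of one common way to quantify such a modulating influence is given below (a synthetic signal and an arbitrary spectral cutoff are assumed; this is not the authors' processing chain): the velocity signal is split into large- and small-scale parts by a Fourier cutoff, the envelope of the small-scale part is obtained with a Hilbert transform, and its large-scale content is correlated with the large-scale signal.

        # Hedged sketch: amplitude-modulation coefficient between large-scale
        # motions and the envelope of the small-scale fluctuations.
        import numpy as np
        from scipy.signal import hilbert

        def lowpass(x, cutoff_idx):
            X = np.fft.rfft(x)
            X[cutoff_idx:] = 0.0
            return np.fft.irfft(X, n=len(x))

        rng = np.random.default_rng(4)
        n, cutoff = 2**14, 40
        t = np.arange(n)
        large = lowpass(rng.standard_normal(n), cutoff)            # toy large-scale signal
        small = (1 + 0.5 * large / large.std()) * np.sin(0.5 * t)  # modulated small scales
        u = large + small

        u_large = lowpass(u, cutoff)
        u_small = u - u_large
        envelope = np.abs(hilbert(u_small))
        env_large = lowpass(envelope, cutoff)                      # large-scale part of envelope

        R = np.corrcoef(u_large, env_large)[0, 1]                  # modulation coefficient
        print(R)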

  4. On unravelling mechanism of interplay between cloud and large scale circulation: a grey area in climate science

    Science.gov (United States)

    De, S.; Agarwal, N. K.; Hazra, Anupam; Chaudhari, Hemantkumar S.; Sahai, A. K.

    2018-04-01

    The interaction between cloud and large-scale circulation is a much less explored area in climate science. Unfolding the mechanism of coupling between these two parameters is imperative for improved simulation of the Indian summer monsoon (ISM) and for reducing imprecision in the climate sensitivity of global climate models. This work makes an effort to explore this mechanism with CFSv2 climate model experiments in which the cloud has been modified by changing the critical relative humidity (CRH) profile of the model during the ISM. The study reveals that the variable CRH in CFSv2 improves the nonlinear interactions between high- and low-frequency oscillations in the wind field (revealed as the internal dynamics of the monsoon) and realistically modulates the spatial distribution of interactions over the Indian landmass during contrasting monsoon seasons, compared to the existing CRH profile of CFSv2. The lower-tropospheric wind error energy in the variable-CRH simulation of CFSv2 appears to be minimal due to the reduced nonlinear convergence of error to the planetary-scale range from the long and synoptic scales (another facet of internal dynamics), compared to the other CRH experiments, in normal and deficient monsoons. Hence, the interplay between cloud and large-scale circulation through CRH may be manifested as a change in the internal dynamics of the ISM, revealed from scale-interactive quasi-linear and nonlinear kinetic energy exchanges in the frequency as well as the wavenumber domain during the monsoon period, which eventually modify the internal variance of the CFSv2 model. Conversely, the reduced wind bias and proper modulation of the spatial distribution of scale interaction between the synoptic and low-frequency oscillations improve the eastward and northward extent of water vapour flux over the Indian landmass, which in turn feeds back into a realistic simulation of cloud condensates, contributing to improved ISM rainfall in CFSv2.

  5. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, many PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and the exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure facilitates handling the trust issues that arise in a large-scale healthcare network, including multi-domain PKI infrastructures.

  6. Large-scale assessment of olfactory preferences and learning in Drosophila melanogaster: behavioral and genetic components

    Directory of Open Access Journals (Sweden)

    Elisabetta Versace

    2015-09-01

    In the Evolve and Resequence method (E&R), experimental evolution and genomics are combined to investigate evolutionary dynamics and the genotype-phenotype link. As with other genomic approaches, this method requires many replicates with large population sizes, which imposes severe restrictions on the analysis of behavioral phenotypes. Aiming to use E&R for investigating the evolution of behavior in Drosophila, we have developed a simple and effective method to assess spontaneous olfactory preferences and learning in large samples of fruit flies using a T-maze. We tested this procedure on (a) a large wild-caught population and (b) 11 isofemale lines of Drosophila melanogaster. Compared to previous methods, this procedure reduces the environmental noise and allows for the analysis of large population samples. Consistent with previous results, we show that flies have a preference for orange vs. apple odor. With our procedure, wild-derived flies exhibit olfactory learning in the absence of previous laboratory selection. Furthermore, we find genetic differences in olfactory learning with relatively high heritability. We propose this large-scale method as an effective tool for E&R and genome-wide association studies on olfactory preferences and learning.

  7. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  8. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  9. The Making of a Pre-Planetary Nebula

    Science.gov (United States)

    Kohler, Susanna

    2017-07-01

    nebula, OH231, which lies 4,200 light-years away and is about 1.4 light-years long. This is a well-studied nebula, so the team had many observations that their model needed to successfully replicate: the nebula's shapes, dimensions, overall geometry, locations of shocks, timescales, and even velocity gradients are known. The authors' model included mass injection from the central source into the ambient gas in three different ways: clumps (spherical knots injected all at once), cylindrical jets (thin outflows with parallel streamlines), and sprays (conical outflows with diverging streamlines). Explanation from a Champagne Bottle. (Figure caption: Panel A: best-fitting simulations of OH231 200, 400, and 800 yr after the clump and spray are launched. Panel B: example from the same family of solutions, in which the mass is reduced by a factor of 10. [Balick et al. 2017]) Balick and collaborators found that by injecting the mass in these three ways with a specific order and spacing, they were able to find a family of solutions that very well replicated observations of OH231. In the best-fitting model, combinations of pairs of clumps are embedded within sprays of brief duration and launched into static ancient AGB winds. The authors compare the setup to the ejection of the cork and the spray of high-pressure fluid when a bottle of champagne is opened. These simulations successfully map out all but perhaps the first century of the nebula's evolution and give us some of the best insight yet into how these short-lived objects are formed. The authors are now working to reproduce these simulations for other pre-planetary nebulae, with the goal of piecing together common attributes of their ejection histories. Citation: Bruce Balick et al 2017 ApJ 843 108. doi:10.3847/1538-4357/aa77f0

  10. The MIND PALACE: A Multi-Spectral Imaging and Spectroscopy Database for Planetary Science

    Science.gov (United States)

    Eshelman, E.; Doloboff, I.; Hara, E. K.; Uckert, K.; Sapers, H. M.; Abbey, W.; Beegle, L. W.; Bhartia, R.

    2017-12-01

    The Multi-Instrument Database (MIND) is the web-based home to a well-characterized set of analytical data collected by a suite of deep-UV fluorescence/Raman instruments built at the Jet Propulsion Laboratory (JPL). Samples derive from a growing body of planetary surface analogs, mineral and microbial standards, meteorites, spacecraft materials, and other astrobiologically relevant materials. In addition to deep-UV spectroscopy, datasets stored in MIND are obtained from a variety of analytical techniques applied over multiple spatial and spectral scales, including electron microscopy, optical microscopy, infrared spectroscopy, X-ray fluorescence, and direct fluorescence imaging. Multivariate statistical analysis techniques, primarily Principal Component Analysis (PCA), are used to guide interpretation of these large multi-analytical spectral datasets. Spatial co-referencing of integrated spectral/visual maps is performed using QGIS (geographic information system software). Georeferencing techniques transform individual instrument data maps into a layered co-registered data cube for analysis across spectral and spatial scales. The body of data in MIND is intended to serve as a permanent, reliable, and expanding database of deep-UV spectroscopy datasets generated by this unique suite of JPL-based instruments on samples of broad planetary science interest.

  11. Characterization of laser-induced plasmas as a complement to high-explosive large-scale detonations

    Directory of Open Access Journals (Sweden)

    Clare Kimblin

    2017-09-01

    Full Text Available Experimental investigations into the characteristics of laser-induced plasmas indicate that LIBS provides a relatively inexpensive and easily replicable laboratory technique to isolate and measure reactions germane to understanding aspects of high-explosive detonations under controlled conditions. Spectral signatures and derived physical parameters following laser ablation of aluminum, graphite and laser-sparked air are examined as they relate to those observed following detonation of high explosives and as they relate to shocked air. Laser-induced breakdown spectroscopy (LIBS) reliably correlates reactions involving atomic Al and aluminum monoxide (AlO) with respect to both emission spectra and temperatures, as compared to small- and large-scale high-explosive detonations. Atomic Al and AlO resulting from laser ablation and a cited small-scale study decay within ∼10⁻⁵ s, roughly 100 times faster than the Al and AlO decay rates (∼10⁻³ s) observed following the large-scale detonation of an Al-encased explosive. Temperatures and species produced in laser-sparked air are compared to those produced with laser-ablated graphite in air. With graphite present, CN is dominant relative to N2+. In studies where the height of the ablating laser’s focus was altered relative to the surface of the graphite substrate, CN concentration was found to decrease with laser focus below the graphite surface, indicating that laser intensity is a critical factor in the production of CN via reactive nitrogen.

  12. Replication of micro and nano surface geometries

    DEFF Research Database (Denmark)

    Hansen, Hans Nørgaard; Hocken, R.J.; Tosello, Guido

    2011-01-01

    The paper describes the state-of-the-art in replication of surface texture and topography at micro and nano scale. The description includes replication of surfaces in polymers, metals and glass. Three different main technological areas enabled by surface replication processes are presented: manufacture of net-shape micro/nano surfaces, tooling (i.e. master making), and surface quality control (metrology, inspection). Replication processes and methods as well as the metrology of surfaces to determine the degree of replication are presented and classified. Examples from various application areas are given, including replication for surface texture measurements, surface roughness standards, manufacture of micro and nano structured functional surfaces, replicated surfaces for optical applications (e.g. optical gratings), and process chains based on combinations of repeated surface replication steps.

  13. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  14. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  15. Impact of Data Placement on Resilience in Large-Scale Object Storage Systems

    Energy Technology Data Exchange (ETDEWEB)

    Carns, Philip; Harms, Kevin; Jenkins, John; Mubarak, Misbah; Ross, Robert; Carothers, Christopher

    2016-05-02

    Distributed object storage architectures have become the de facto standard for high-performance storage in big data, cloud, and HPC computing. Object storage deployments using commodity hardware to reduce costs often employ object replication as a method to achieve data resilience. Repairing object replicas after failure is a daunting task for systems with thousands of servers and billions of objects, however, and it is increasingly difficult to evaluate such scenarios at scale on real-world systems. Resilience and availability are both compromised if objects are not repaired in a timely manner. In this work we leverage a high-fidelity discrete-event simulation model to investigate replica reconstruction on large-scale object storage systems with thousands of servers, billions of objects, and petabytes of data. We evaluate the behavior of CRUSH, a well-known object placement algorithm, and identify configuration scenarios in which aggregate rebuild performance is constrained by object placement policies. After determining the root cause of this bottleneck, we then propose enhancements to CRUSH and the usage policies atop it to enable scalable replica reconstruction. We use these methods to demonstrate a simulated aggregate rebuild rate of 410 GiB/s (within 5% of projected ideal linear scaling) on a 1,024-node commodity storage system. We also uncover an unexpected phenomenon in rebuild performance based on the characteristics of the data stored on the system.
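
    The central point of this record, that the placement policy bounds how many servers can take part in a rebuild, can be illustrated with a small self-contained sketch. The rendezvous-hashing placement below is only an assumed stand-in for CRUSH, and the cluster size, replication factor and object count are arbitrary illustrative values, not figures from the paper.

    ```python
    # Minimal placement/rebuild sketch (assumed stand-in, not CRUSH or the paper's
    # simulator): place each object's replicas with rendezvous hashing, then count
    # how many distinct servers participate in repairing the replicas lost when a
    # single server fails.
    import hashlib

    N_SERVERS = 64        # assumed cluster size
    REPLICAS = 3          # assumed replication factor
    N_OBJECTS = 20_000    # assumed object count

    def score(obj_id: int, server: int) -> int:
        """Deterministic pseudo-random weight for an (object, server) pair."""
        digest = hashlib.sha256(f"{obj_id}:{server}".encode()).hexdigest()
        return int(digest, 16)

    def place(obj_id: int, n_replicas: int) -> list:
        """Return the n_replicas highest-scoring servers for this object."""
        return sorted(range(N_SERVERS), key=lambda s: score(obj_id, s), reverse=True)[:n_replicas]

    placement = {obj: place(obj, REPLICAS) for obj in range(N_OBJECTS)}

    failed = 0  # pretend server 0 fails
    sources, targets, lost = set(), set(), 0
    for obj, servers in placement.items():
        if failed in servers:
            lost += 1
            sources.update(s for s in servers if s != failed)   # surviving replica holders
            targets.add(place(obj, REPLICAS + 1)[-1])           # next-ranked server receives the new copy

    print(f"objects needing repair: {lost}")
    print(f"distinct rebuild sources: {len(sources)}, distinct rebuild targets: {len(targets)}")
    ```

    The wider these source and target sets, the more the rebuild can proceed in parallel; placement rules that funnel repairs through a few servers are exactly the bottleneck scenario the record identifies at much larger scale.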

  16. Validation of the Body Concealment Scale for Scleroderma (BCSS): Replication in the Scleroderma Patient-centered Intervention Network (SPIN) Cohort

    NARCIS (Netherlands)

    Jewett, L.R.; Kwakkenbos, C.M.C.; Carrier, M.E.; Malcarne, V.L.; Harcourt, D.; Rumsey, N.; Mayes, M.D.; Assassi, S.; Körner, A.; Fox, R.S.; Gholizadeh, S.; Mills, S.D.; Fortune, C.; Thombs, B.D.

    2017-01-01

    Body concealment is an important component of appearance distress for individuals with disfiguring conditions, including scleroderma. The objective was to replicate the validation study of the Body Concealment Scale for Scleroderma (BCSS) among 897 scleroderma patients. The factor structure of the

  17. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS

    NARCIS (Netherlands)

    Kuai, Moshen; Cheng, Gang; Pang, Y.; Li, Yong

    2018-01-01

    Because planetary gears have the characteristics of small volume, light weight and large transmission ratio, they are widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. A method is proposed for diagnosing faults in planetary gear
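
    The permutation entropy feature named in the title can be sketched in a few lines. The fragment below is a generic Bandt-Pompe permutation entropy applied to a one-dimensional signal, offered only as an illustration; the paper's CEEMDAN decomposition and ANFIS classifier are not reproduced, and the test signals are invented.

    ```python
    # Generic permutation entropy (Bandt-Pompe) of a 1-D vibration-like signal.
    # Illustrative only: the record's method applies it to CEEMDAN components and
    # feeds the values to an ANFIS classifier, neither of which is shown here.
    import math
    from itertools import permutations
    import numpy as np

    def permutation_entropy(x: np.ndarray, order: int = 3, delay: int = 1) -> float:
        """Normalized permutation entropy in [0, 1]."""
        n = len(x) - (order - 1) * delay
        counts = {p: 0 for p in permutations(range(order))}
        for i in range(n):
            window = x[i:i + order * delay:delay]
            counts[tuple(np.argsort(window))] += 1
        probs = np.array([c for c in counts.values() if c > 0], dtype=float) / n
        return float(-(probs * np.log(probs)).sum() / math.log(math.factorial(order)))

    rng = np.random.default_rng(0)
    t = np.linspace(0.0, 1.0, 2048)
    healthy = np.sin(2 * np.pi * 50 * t) + 0.1 * rng.standard_normal(t.size)
    faulty = healthy + 0.5 * rng.standard_normal(t.size)   # assumed noisier "fault" signal
    print(permutation_entropy(healthy), permutation_entropy(faulty))
    ```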

  18. Best Practices in the Evaluation of Large-scale STEM-focused Events: A Review of Recent Literature

    Science.gov (United States)

    Shebby, S.; Cobb, W. H.; Buxner, S.; Shipp, S. S.

    2015-12-01

    Each year, the National Aeronautics and Space Administration (NASA) sponsors a variety of educational events to share information with educators, students, and the general public. Intended outcomes of these events include increased interest in and awareness of the mission and goals of NASA. Events range in size from relatively small family science nights at a local school to large-scale mission and celestial event celebrations involving thousands of members of the general public. To support community members in designing event evaluations, the Science Mission Directorate (SMD) Planetary Science Forum sponsored the creation of a Best Practices Guide. The guide was generated by reviewing published large-scale event evaluation reports; however, the best practices described within are pertinent for all event organizers and evaluators regardless of event size. Each source included in the guide identified numerous challenges to conducting their event evaluation. These included difficulty in identifying extant instruments or items, collecting representative data, and disaggregating data to inform different evaluation questions. Overall, the guide demonstrates that evaluations of the large-scale events are generally done at a very basic level, with the types of data collected limited to observable demographic information and participant reactions collected via online survey. In addition to these findings, this presentation will describe evaluation best practices that will help practitioners move beyond these basic indicators and examine how to make the evaluation process an integral—and valuable—element of event planning, ultimately informing event outcomes and impacts. It will provide detailed information on five recommendations presented in the guide: 1) consider evaluation methodology, including data analysis, in advance; 2) design data collection instruments well in advance of the event; 3) collect data at different times and from multiple sources; 4) use

  19. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  20. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32, The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987; approved for public release. Only fragments of the report's documentation page survive in this record, including a note that the aim is to reduce the spread in the LSGT 50% gap value and that the worst charges are those with the highest or lowest densities or the largest re-pressed ...

  1. Shaping of planetary nebulae

    International Nuclear Information System (INIS)

    Balick, B.

    1987-01-01

    The phases of stellar evolution and the development of planetary nebulae are examined. The relation between planetary nebulae and red giants is studied. Spherical and nonspherical cases of shaping planetaries with stellar winds are described. CCD images of nebulae are analyzed, and it is determined that the shape of planetary nebulae depends on ionization levels. Consideration is given to calculating the distances of planetaries using radio images, and molecular hydrogen envelopes which support the wind-shaping model of planetary nebulae

  2. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments.The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  3. Multi-Scale Models for the Scale Interaction of Organized Tropical Convection

    Science.gov (United States)

    Yang, Qiu

    Assessing the upscale impact of organized tropical convection from small spatial and temporal scales is a research imperative, not only for having a better understanding of the multi-scale structures of dynamical and convective fields in the tropics, but also for eventually helping in the design of new parameterization strategies to improve the next-generation global climate models. Here self-consistent multi-scale models are derived systematically by following the multi-scale asymptotic methods and used to describe the hierarchical structures of tropical atmospheric flows. The advantages of using these multi-scale models lie in isolating the essential components of multi-scale interaction and providing assessment of the upscale impact of the small-scale fluctuations onto the large-scale mean flow through eddy flux divergences of momentum and temperature in a transparent fashion. Specifically, this thesis includes three research projects about multi-scale interaction of organized tropical convection, involving tropical flows at different scaling regimes and utilizing different multi-scale models correspondingly. Inspired by the observed variability of tropical convection on multiple temporal scales, including daily and intraseasonal time scales, the goal of the first project is to assess the intraseasonal impact of the diurnal cycle on the planetary-scale circulation such as the Hadley cell. As an extension of the first project, the goal of the second project is to assess the intraseasonal impact of the diurnal cycle over the Maritime Continent on the Madden-Julian Oscillation. In the third project, the goals are to simulate the baroclinic aspects of the ITCZ breakdown and assess its upscale impact on the planetary-scale circulation over the eastern Pacific. These simple multi-scale models should be useful to understand the scale interaction of organized tropical convection and help improve the parameterization of unresolved processes in global climate models.

  4. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters of scale 1000s of processors and be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and by implication to identify areas where some investment of money or effort is likely to be needed. (2) To compare and record experiences gained with such tools. (3) To produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP. (4) To identify and connect groups with similar interest within HENP and the larger clustering community

  5. LBT observations of the HR8799 planetary system

    Science.gov (United States)

    Mesa, D.; Arcidiacono, C.; Claudi, R. U.; Desidera, S.; Esposito, S.; Gratton, R.; Masciadri, E.

    2013-09-01

    We present here observations of the HR8799 planetary system performed in H and Ks band exploiting the AO system at the Large Binocular Telescope and the PISCES camera. Thanks to the excellent performance of the instrument we were able to detect for the first time the innermost known planet of the system (HR8799e) in the H band. Precise photometric and astrometric measurements have been taken for all four planets. Further, exploiting ours and previous astrometric results, we were able to put some limits on the orbits of the four planets. The analysis of the dynamical stability of the system seems to indicate lower planetary masses than those adopted until now.

  6. Non-planetary Science from Planetary Missions

    Science.gov (United States)

    Elvis, M.; Rabe, K.; Daniels, K.

    2015-12-01

    Planetary science is naturally focussed on the issues of the origin and history of solar systems, especially our own. The implications of an early turbulent history of our solar system reach into many areas including the origin of Earth's oceans, of ores in the Earth's crust and possibly the seeding of life. There are however other areas of science that stand to be developed greatly by planetary missions, primarily to small solar system bodies. The physics of granular materials has been well-studied in Earth's gravity, but lacks a general theory. Because of the compacting effects of gravity, some experiments desired for testing these theories remain impossible on Earth. Studying the behavior of a micro-gravity rubble pile -- such as many asteroids are believed to be -- could provide a new route towards exploring general principles of granular physics. These same studies would also prove valuable for planning missions to sample these same bodies, as techniques for anchoring and deep sampling are difficult to plan in the absence of such knowledge. In materials physics, first-principles total-energy calculations for compounds of a given stoichiometry have identified metastable, or even stable, structures distinct from known structures obtained by synthesis under laboratory conditions. The conditions in the proto-planetary nebula, in the slowly cooling cores of planetesimals, and in the high speed collisions of planetesimals and their derivatives, are all conditions that cannot be achieved in the laboratory. Large samples from comets and asteroids offer the chance to find crystals with these as-yet unobserved structures as well as more exotic materials. Some of these could have unusual properties important for materials science. Meteorites give us a glimpse of these exotic materials, several dozen of which are known that are unique to meteorites. But samples retrieved directly from small bodies in space will not have been affected by atmospheric entry, warmth or

  7. Relation between radius and expansion velocity in planetary nebulae

    International Nuclear Information System (INIS)

    Chu, Y.H.; Kwitter, K.B.; Kaler, J.B.

    1984-01-01

    The expansion velocity-radius (R-V) relation for planetary nebulae is examined using the existing measurements of expansion velocities and recent calculations of radii. It is found that some of the previously alleged R-V relations for PN are not convincingly established. The scatter in the R-V plots may be due largely to stratification of ions in individual nebulae and to heterogeneity in the planetary nebula population. In addition, from new echelle/CCD observations of planetary nebulae, it is found that spatial information is essential in deriving the internal kinematic properties. Future investigations of R-V relations should be pursued separately for groups of planetaries with similar physical properties, and they should employ observations of appropriate low excitation lines in order to measure the expansion velocity at the surface of the nebula. 26 references

  8. Extremal dynamics in random replicator ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    Kärenlampi, Petri P., E-mail: petri.karenlampi@uef.fi

    2015-10-02

    The seminal numerical experiment by Bak and Sneppen (BS) is repeated, along with computations with replicator models including a greater number of features. Both types of models do self-organize, and do obey power-law scaling for the size distribution of activity cycles. However, species extinction within the replicator models interferes with the BS self-organized critical (SOC) activity. Speciation–extinction dynamics ruins any stationary state which might contain a steady size distribution of activity cycles. The BS-type activity appears as a dissimilar phenomenon in comparison to speciation–extinction dynamics in the replicator system. No criticality is found from the speciation–extinction dynamics. Neither are speciations and extinctions in real biological macroevolution known to contain any diverging distributions, or self-organization towards any critical state. Consequently, biological macroevolution probably is not a self-organized critical phenomenon. - Highlights: • Extremal Dynamics organizes random replicator ecosystems to two phases in fitness space. • Replicator systems show power-law scaling of activity. • Species extinction interferes with Bak–Sneppen type mutation activity. • Speciation–extinction dynamics does not show any critical phase transition. • Biological macroevolution probably is not a self-organized critical phenomenon.

  9. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  10. Genome-scale cluster analysis of replicated microarrays using shrinkage correlation coefficient.

    Science.gov (United States)

    Yao, Jianchao; Chang, Chunqi; Salmi, Mari L; Hung, Yeung Sam; Loraine, Ann; Roux, Stanley J

    2008-06-18

    Currently, clustering with some form of correlation coefficient as the gene similarity metric has become a popular method for profiling genomic data. The Pearson correlation coefficient and the standard deviation (SD)-weighted correlation coefficient are the two most widely-used correlations as the similarity metrics in clustering microarray data. However, these two correlations are not optimal for analyzing replicated microarray data generated by most laboratories. An effective correlation coefficient is needed to provide statistically sufficient analysis of replicated microarray data. In this study, we describe a novel correlation coefficient, shrinkage correlation coefficient (SCC), that fully exploits the similarity between the replicated microarray experimental samples. The methodology considers both the number of replicates and the variance within each experimental group in clustering expression data, and provides a robust statistical estimation of the error of replicated microarray data. The value of SCC is revealed by its comparison with two other correlation coefficients that are currently the most widely-used (Pearson correlation coefficient and SD-weighted correlation coefficient) using statistical measures on both synthetic expression data as well as real gene expression data from Saccharomyces cerevisiae. Two leading clustering methods, hierarchical and k-means clustering were applied for the comparison. The comparison indicated that using SCC achieves better clustering performance. Applying SCC-based hierarchical clustering to the replicated microarray data obtained from germinating spores of the fern Ceratopteris richardii, we discovered two clusters of genes with shared expression patterns during spore germination. Functional analysis suggested that some of the genetic mechanisms that control germination in such diverse plant lineages as mosses and angiosperms are also conserved among ferns. This study shows that SCC is an alternative to the Pearson
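
    As a generic illustration of the kind of correlation-based clustering this record describes, the sketch below averages replicated measurements with a crude inverse-variance weighting and clusters genes hierarchically on a correlation distance. The shrinkage correlation coefficient itself is not reproduced; the weighting scheme, the synthetic data, and the cluster count are assumptions made purely for illustration.

    ```python
    # Illustrative sketch only: hierarchical clustering of genes from replicated
    # expression data using a correlation-based distance. The paper's shrinkage
    # correlation coefficient (SCC) is not reproduced; the simple variance-weighted
    # averaging of replicates below is an assumed stand-in.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(1)
    n_genes, n_conditions, n_replicates = 200, 6, 3
    # synthetic data: genes x conditions x replicates
    data = rng.normal(size=(n_genes, n_conditions, n_replicates))

    # weight each condition's replicate mean by the inverse of its replicate variance
    rep_mean = data.mean(axis=2)
    rep_var = data.var(axis=2, ddof=1) + 1e-6
    profiles = rep_mean / np.sqrt(rep_var)        # crude down-weighting of noisy conditions

    dist = pdist(profiles, metric="correlation")  # 1 - Pearson correlation between genes
    tree = linkage(dist, method="average")
    labels = fcluster(tree, t=10, criterion="maxclust")
    print("cluster sizes:", np.bincount(labels)[1:])
    ```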

  11. Replicating the microbial community and water quality performance of full-scale slow sand filters in laboratory-scale filters.

    Science.gov (United States)

    Haig, Sarah-Jane; Quince, Christopher; Davies, Robert L; Dorea, Caetano C; Collins, Gavin

    2014-09-15

    Previous laboratory-scale studies to characterise the functional microbial ecology of slow sand filters have suffered from methodological limitations that could compromise their relevance to full-scale systems. Therefore, to ascertain if laboratory-scale slow sand filters (L-SSFs) can replicate the microbial community and water quality production of industrially operated full-scale slow sand filters (I-SSFs), eight cylindrical L-SSFs were constructed and used to treat water from the same source as the I-SSFs. Half of the L-SSF sand beds were composed of sterilized sand (sterile) from the industrial filters and the other half of sand taken directly from the same industrial filter (non-sterile). All filters were operated for 10 weeks, with the microbial community and water quality parameters sampled and analysed weekly. To characterize the microbial community, phyla-specific qPCR assays and 454 pyrosequencing of the 16S rRNA gene were used in conjunction with an array of statistical techniques. The results demonstrate that it is possible to mimic both the water quality production and the structure of the microbial community of full-scale filters in the laboratory - at all levels of taxonomic classification except OTU - thus allowing comparison of L-SSF experiments with full-scale units. Further, it was found that the sand type composing the filter bed (non-sterile or sterile), the water quality produced, the age of the filters and the depth of sand samples were all significant factors in explaining observed differences in the structure of the microbial consortia. This study is the first to the authors' knowledge that demonstrates that scaled-down slow sand filters can accurately reproduce the water quality and microbial consortia of full-scale slow sand filters. Copyright © 2014 Elsevier Ltd. All rights reserved.

  12. Foundational Principles for Large-Scale Inference: Illustrations Through Correlation Mining

    Science.gov (United States)

    Hero, Alfred O.; Rajaratnam, Bala

    2015-01-01

    When can reliable inference be drawn in the “Big Data” context? This paper presents a framework for answering this fundamental question in the context of correlation mining, with implications for general large-scale inference. In large-scale data applications like genomics, connectomics, and eco-informatics the dataset is often variable-rich but sample-starved: a regime where the number n of acquired samples (statistical replicates) is far fewer than the number p of observed variables (genes, neurons, voxels, or chemical constituents). Much of recent work has focused on understanding the computational complexity of proposed methods for “Big Data”. Sample complexity however has received relatively less attention, especially in the setting when the sample size n is fixed, and the dimension p grows without bound. To address this gap, we develop a unified statistical framework that explicitly quantifies the sample complexity of various inferential tasks. Sampling regimes can be divided into several categories: 1) the classical asymptotic regime where the variable dimension is fixed and the sample size goes to infinity; 2) the mixed asymptotic regime where both variable dimension and sample size go to infinity at comparable rates; 3) the purely high dimensional asymptotic regime where the variable dimension goes to infinity and the sample size is fixed. Each regime has its niche but only the latter regime applies to exascale data dimension. We illustrate this high dimensional framework for the problem of correlation mining, where it is the matrix of pairwise and partial correlations among the variables that are of interest. Correlation mining arises in numerous applications and subsumes the regression context as a special case. We demonstrate various regimes of correlation mining based on the unifying perspective of high dimensional learning rates and sample complexity for different structured covariance models and different inference tasks. PMID:27087700

  13. Foundational Principles for Large-Scale Inference: Illustrations Through Correlation Mining.

    Science.gov (United States)

    Hero, Alfred O; Rajaratnam, Bala

    2016-01-01

    When can reliable inference be drawn in the "Big Data" context? This paper presents a framework for answering this fundamental question in the context of correlation mining, with implications for general large scale inference. In large scale data applications like genomics, connectomics, and eco-informatics the dataset is often variable-rich but sample-starved: a regime where the number n of acquired samples (statistical replicates) is far fewer than the number p of observed variables (genes, neurons, voxels, or chemical constituents). Much of recent work has focused on understanding the computational complexity of proposed methods for "Big Data". Sample complexity however has received relatively less attention, especially in the setting when the sample size n is fixed, and the dimension p grows without bound. To address this gap, we develop a unified statistical framework that explicitly quantifies the sample complexity of various inferential tasks. Sampling regimes can be divided into several categories: 1) the classical asymptotic regime where the variable dimension is fixed and the sample size goes to infinity; 2) the mixed asymptotic regime where both variable dimension and sample size go to infinity at comparable rates; 3) the purely high dimensional asymptotic regime where the variable dimension goes to infinity and the sample size is fixed. Each regime has its niche but only the latter regime applies to exascale data dimension. We illustrate this high dimensional framework for the problem of correlation mining, where it is the matrix of pairwise and partial correlations among the variables that are of interest. Correlation mining arises in numerous applications and subsumes the regression context as a special case. We demonstrate various regimes of correlation mining based on the unifying perspective of high dimensional learning rates and sample complexity for different structured covariance models and different inference tasks.
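
    A minimal sketch of the sample-starved regime these two records describe: with n fixed and p large, screening a sample correlation matrix at a fixed threshold turns up spurious hits, which is the behavior the sample-complexity framework quantifies. The dimensions, the planted correlated block, and the threshold below are arbitrary illustrative choices, not values from the paper.

    ```python
    # Sketch of correlation mining in the sample-starved regime (p >> n):
    # estimate the p x p sample correlation matrix from n observations and keep
    # only entries above a threshold.
    import numpy as np

    rng = np.random.default_rng(0)
    n, p = 30, 2000                    # few samples, many variables
    X = rng.standard_normal((n, p))
    # plant a small correlated block so that some true structure survives screening
    X[:, :5] += rng.standard_normal((n, 1))

    R = np.corrcoef(X, rowvar=False)   # p x p sample correlation matrix
    np.fill_diagonal(R, 0.0)
    rho = 0.8                          # screening threshold (illustrative)
    hits = np.argwhere(np.triu(np.abs(R) > rho, k=1))
    print(f"variable pairs with |corr| > {rho}: {len(hits)}")
    ```

    With n held fixed and p growing, the number of spuriously large sample correlations grows as well; characterizing when the true hits can still be separated from this noise floor is exactly the sample-complexity question posed above.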

  14. Development and Validation of Chemical Kinetic Mechanism Reduction Scheme for Large-Scale Mechanisms

    DEFF Research Database (Denmark)

    Poon, Hiew Mun; Ng, Hoon Kiat; Gan, Suyin

    2014-01-01

    This work is an extension to a previously reported work on chemical kinetic mechanism reduction scheme for large-scale mechanisms. Here, Perfectly Stirred Reactor (PSR) was added as a criterion of data source for mechanism reduction instead of using only auto-ignition condition. As a result......) simulations were performed to study the spray combustion phenomena within a constant volume bomb. Both non-reacting and reacting conditions were applied in this study. Liquid and vapor penetration lengths were replicated for non-reacting diesel spray. For reacting diesel spray, both ignition delay and lift......-off length were simulated. The simulation results were then compared to the experimental data of Sandia National Laboratories and No. 2 Diesel Fuel (D2) was designated as the reference fuel. Both liquid and vapor penetrations for non-reacting condition were well-matched, while ignition delay was advanced...

  15. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  16. DNA Replication in Engineered Escherichia coli Genomes with Extra Replication Origins.

    Science.gov (United States)

    Milbredt, Sarah; Farmani, Neda; Sobetzko, Patrick; Waldminghaus, Torsten

    2016-10-21

    The standard outline of bacterial genomes is a single circular chromosome with a single replication origin. From the bioengineering perspective, it appears attractive to extend this basic setup. Bacteria with split chromosomes or multiple replication origins have been successfully constructed in the last few years. The characteristics of these engineered strains will largely depend on the respective DNA replication patterns. However, the DNA replication has not been investigated systematically in engineered bacteria with multiple origins or split replicons. Here we fill this gap by studying a set of strains consisting of (i) E. coli strains with an extra copy of the native replication origin (oriC), (ii) E. coli strains with an extra copy of the replication origin from the secondary chromosome of Vibrio cholerae (oriII), and (iii) a strain in which the E. coli chromosome is split into two linear replicons. A combination of flow cytometry, microarray-based comparative genomic hybridization (CGH), and modeling revealed silencing of extra oriC copies and differential timing of ectopic oriII copies compared to the native oriC. The results were used to derive construction rules for future multiorigin and multireplicon projects.

  17. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  18. How many bootstrap replicates are necessary?

    Science.gov (United States)

    Pattengale, Nicholas D; Alipour, Masoud; Bininda-Emonds, Olaf R P; Moret, Bernard M E; Stamatakis, Alexandros

    2010-03-01

    Phylogenetic bootstrapping (BS) is a standard technique for inferring confidence values on phylogenetic trees that is based on reconstructing many trees from minor variations of the input data, trees called replicates. BS is used with all phylogenetic reconstruction approaches, but we focus here on one of the most popular, maximum likelihood (ML). Because ML inference is so computationally demanding, it has proved too expensive to date to assess the impact of the number of replicates used in BS on the relative accuracy of the support values. For the same reason, a rather small number (typically 100) of BS replicates are computed in real-world studies. Stamatakis et al. recently introduced a BS algorithm that is 1 to 2 orders of magnitude faster than previous techniques, while yielding qualitatively comparable support values, making an experimental study possible. In this article, we propose stopping criteria--that is, thresholds computed at runtime to determine when enough replicates have been generated--and we report on the first large-scale experimental study to assess the effect of the number of replicates on the quality of support values, including the performance of our proposed criteria. We run our tests on 17 diverse real-world DNA--single-gene as well as multi-gene--datasets, which include 125-2,554 taxa. We find that our stopping criteria typically stop computations after 100-500 replicates (although the most conservative criterion may continue for several thousand replicates) while producing support values that correlate at better than 99.5% with the reference values on the best ML trees. Significantly, we also find that the stopping criteria can recommend very different numbers of replicates for different datasets of comparable sizes. Our results are thus twofold: (i) they give the first experimental assessment of the effect of the number of BS replicates on the quality of support values returned through BS, and (ii) they validate our proposals for
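
    The idea of a runtime stopping criterion can be shown with a much simpler analogue than phylogenetic bootstrapping: draw bootstrap replicates of an ordinary statistic in batches and stop once successive batches agree. The data, batch size, and 1% agreement rule below are assumptions for illustration and are not the bootstopping criteria proposed in the paper.

    ```python
    # Simplified analogue of a bootstrap stopping criterion: keep drawing bootstrap
    # replicates of a statistic and stop once successive batches of estimates agree
    # closely. Not the phylogenetic procedure from the record, just the idea.
    import numpy as np

    rng = np.random.default_rng(42)
    data = rng.gamma(shape=2.0, scale=3.0, size=500)   # assumed observed sample

    def bootstrap_means(n_reps: int) -> np.ndarray:
        idx = rng.integers(0, len(data), size=(n_reps, len(data)))
        return data[idx].mean(axis=1)

    batch, max_reps = 50, 10_000
    estimates = bootstrap_means(batch)
    while len(estimates) < max_reps:
        previous_se = estimates.std()
        estimates = np.concatenate([estimates, bootstrap_means(batch)])
        if abs(estimates.std() - previous_se) / estimates.std() < 0.01:  # 1% agreement rule
            break

    print(f"stopped after {len(estimates)} replicates; bootstrap SE ≈ {estimates.std():.3f}")
    ```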

  19. A radio search for planetary nebulae near the galactic center

    International Nuclear Information System (INIS)

    Isaacman, R.B.

    1980-01-01

    Because the galactic center is a hostile environment, and because planetaries are weak radio emitters, it is not clear a priori that one expects to detect any planetary nebulae at all in the nuclear region of the Galaxy. Therefore the expected lifetime and flux density distribution of galactic center nebulae is considered. The principal observational results from the Westerbork data, and the results of some pilot observations with the Very Large Array, which were intended to distinguish planetaries from other radio sources on an individual basis, are given. (Auth.)

  20. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in the last years as a new tool to improve the traditional, stationary based approach in flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space as obtained by machine learning techniques, particularly supervised kernel principal component analysis. In such reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convection activities. We investigate for individual sites the exceedance probability in which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).

  1. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  2. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  3. Techniques for Large-Scale Bacterial Genome Manipulation and Characterization of the Mutants with Respect to In Silico Metabolic Reconstructions.

    Science.gov (United States)

    diCenzo, George C; Finan, Turlough M

    2018-01-01

    The rate at which all genes within a bacterial genome can be identified far exceeds the ability to characterize these genes. To assist in associating genes with cellular functions, a large-scale bacterial genome deletion approach can be employed to rapidly screen tens to thousands of genes for desired phenotypes. Here, we provide a detailed protocol for the generation of deletions of large segments of bacterial genomes that relies on the activity of a site-specific recombinase. In this procedure, two recombinase recognition target sequences are introduced into known positions of a bacterial genome through single cross-over plasmid integration. Subsequent expression of the site-specific recombinase mediates recombination between the two target sequences, resulting in the excision of the intervening region and its loss from the genome. We further illustrate how this deletion system can be readily adapted to function as a large-scale in vivo cloning procedure, in which the region excised from the genome is captured as a replicative plasmid. We next provide a procedure for the metabolic analysis of bacterial large-scale genome deletion mutants using the Biolog Phenotype MicroArray™ system. Finally, a pipeline is described, and a sample Matlab script is provided, for the integration of the obtained data with a draft metabolic reconstruction for the refinement of the reactions and gene-protein-reaction relationships in a metabolic reconstruction.
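
    As a toy illustration of the final integration step described here, reconciling phenotype data with a draft metabolic reconstruction, the snippet below compares hypothetical Biolog-style growth calls for a deletion mutant against equally hypothetical model predictions and flags the disagreements that would drive refinement. Substrate names, calls, and decision rules are invented; the chapter's own pipeline and Matlab script are not reproduced.

    ```python
    # Toy sketch (not the chapter's Matlab pipeline): compare observed Biolog-style
    # growth calls for a deletion mutant against growth predictions from a draft
    # metabolic reconstruction, and flag disagreements that point at reactions or
    # gene-protein-reaction (GPR) rules needing refinement.
    observed_growth = {            # from Phenotype MicroArray plates (hypothetical)
        "D-glucose": True,
        "L-arabinose": False,
        "succinate": True,
        "citrate": False,
    }
    model_predicts_growth = {      # from simulations on the draft reconstruction (hypothetical)
        "D-glucose": True,
        "L-arabinose": True,
        "succinate": True,
        "citrate": True,
    }

    for substrate, observed in observed_growth.items():
        predicted = model_predicts_growth.get(substrate)
        if predicted is None:
            print(f"{substrate}: no corresponding exchange reaction in the model")
        elif predicted and not observed:
            print(f"{substrate}: model over-predicts growth -> check GPRs in the deleted region")
        elif observed and not predicted:
            print(f"{substrate}: model under-predicts growth -> candidate missing reaction")
    ```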

  4. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challenges...

  5. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  6. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  7. Histone hypoacetylation is required to maintain late replication timing of constitutive heterochromatin.

    Science.gov (United States)

    Casas-Delucchi, Corella S; van Bemmel, Joke G; Haase, Sebastian; Herce, Henry D; Nowak, Danny; Meilinger, Daniela; Stear, Jeffrey H; Leonhardt, Heinrich; Cardoso, M Cristina

    2012-01-01

    The replication of the genome is a spatio-temporally highly organized process. Yet, its flexibility throughout development suggests that this process is not genetically regulated. However, the mechanisms and chromatin modifications controlling replication timing are still unclear. We made use of the prominent structure and defined heterochromatic landscape of pericentric regions as an example of late replicating constitutive heterochromatin. We manipulated the major chromatin markers of these regions, namely histone acetylation, DNA and histone methylation, as well as chromatin condensation and determined the effects of these altered chromatin states on replication timing. Here, we show that manipulation of DNA and histone methylation as well as acetylation levels caused large-scale heterochromatin decondensation. Histone demethylation and the concomitant decondensation, however, did not affect replication timing. In contrast, immuno-FISH and time-lapse analyses showed that lowering DNA methylation, as well as increasing histone acetylation, advanced the onset of heterochromatin replication. While dnmt1−/− cells showed increased histone acetylation at chromocenters, histone hyperacetylation did not induce DNA demethylation. Hence, we propose that histone hypoacetylation is required to maintain normal heterochromatin duplication dynamics. We speculate that a high histone acetylation level might increase the firing efficiency of origins and, concomitantly, advances the replication timing of distinct genomic regions.

  8. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power systems with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  9. Machine Learning Algorithms For Predicting the Instability Timescales of Compact Planetary Systems

    Science.gov (United States)

    Tamayo, Daniel; Ali-Dib, Mohamad; Cloutier, Ryan; Huang, Chelsea; Van Laerhoven, Christa L.; Leblanc, Rejean; Menou, Kristen; Murray, Norman; Obertas, Alysa; Paradise, Adiv; Petrovich, Cristobal; Rachkov, Aleksandar; Rein, Hanno; Silburt, Ari; Tacik, Nick; Valencia, Diana

    2016-10-01

    The Kepler mission has uncovered hundreds of compact multi-planet systems. The dynamical pathways to instability in these compact systems and their associated timescales are not well understood theoretically. However, long-term stability is often used as a constraint to narrow down the space of orbital solutions from the transit data. This requires a large suite of N-body integrations that can each take several weeks to complete. This computational bottleneck is therefore an important limitation in our ability to characterize compact multi-planet systems.From suites of numerical simulations, previous studies have fit simple scaling relations between the instability timescale and various system parameters. However, the numerically simulated systems can deviate strongly from these empirical fits.We present a new approach to the problem using machine learning algorithms that have enjoyed success across a broad range of high-dimensional industry applications. In particular, we have generated large training sets of direct N-body integrations of synthetic compact planetary systems to train several regression models (support vector machine, gradient boost) that predict the instability timescale. We find that ensembling these models predicts the instability timescale of planetary systems better than previous approaches using the simple scaling relations mentioned above.Finally, we will discuss how these models provide a powerful tool for not only understanding the current Kepler multi-planet sample, but also for characterizing and shaping the radial-velocity follow-up strategies of multi-planet systems from the upcoming Transiting Exoplanet Survey Satellite (TESS) mission, given its shorter observation baselines.
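
    A small sketch of the regression approach described here, using scikit-learn's gradient boosting, one of the model families the record names. The features, the synthetic labels, and all parameter ranges are invented stand-ins; the authors' training sets come from direct N-body integrations, which are not reproduced here.

    ```python
    # Illustrative sketch: train a gradient-boosted regressor to predict a (log)
    # instability timescale from summary features of a compact planetary system.
    # Features and "labels" are synthetic stand-ins, not N-body results.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_systems = 5000
    # assumed features: minimum separation in mutual Hill radii, eccentricity, planet/star mass ratio
    hill_sep = rng.uniform(4, 12, n_systems)
    ecc = rng.uniform(0, 0.1, n_systems)
    mass_ratio = 10 ** rng.uniform(-6, -4, n_systems)
    X = np.column_stack([hill_sep, ecc, mass_ratio])

    # synthetic log10(instability time): roughly linear in Hill-radius separation plus noise
    log_t_inst = 1.0 * hill_sep - 20 * ecc + rng.normal(0, 0.5, n_systems)

    X_train, X_test, y_train, y_test = train_test_split(X, log_t_inst, random_state=0)
    model = GradientBoostingRegressor(n_estimators=300, max_depth=3, learning_rate=0.05)
    model.fit(X_train, y_train)
    print("test R^2:", round(model.score(X_test, y_test), 3))
    ```

    The practical appeal, as the record argues, is that a trained regressor answers in milliseconds a question that otherwise requires a multi-week N-body integration per orbital solution.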

  10. Planetary Data Archiving Activities of ISRO

    Science.gov (United States)

    Gopala Krishna, Barla; D, Rao J.; Thakkar, Navita; Prashar, Ajay; Manthira Moorthi, S.

    ISRO launched its first planetary mission, to the Moon, viz. Chandrayaan-1, on October 22, 2008. This mission carried eleven instruments; a wealth of science data was collected during its mission life (November 2008 to August 2009), which is archived at the Indian Space Science Data Centre (ISSDC). ISSDC is responsible for the ingest, storage, processing, archiving, and dissemination of the payload and related ancillary data, in addition to real-time spacecraft operations support. ISSDC is designed to provide high computational power, large storage, and hosting for a variety of applications necessary to support all the planetary and space science missions of ISRO. The state-of-the-art architecture of ISSDC provides the facility to ingest the raw payload data of all the science payloads of the science satellites in an automated manner, processes raw data and generates payload-specific outputs, generates higher-level products, and disseminates the data sets to principal investigators, guest observers, payload operations centres (POCs) and the general public. The data archive makes use of the well-proven archive standards of the Planetary Data System (PDS). The long-term archive (LTA) for five payloads of Chandrayaan-1 data, viz. TMC, HySI, SARA, M3 and MiniSAR, was released from ISSDC on 19 April 2013 (http://www.issdc.gov.in) to the users. Additionally, DEMs generated from possible passes of Chandrayaan-1 TMC stereo data and sample map sheets of the Lunar Atlas are also archived and released from ISSDC along with the LTA. The Mars Orbiter Mission (MOM) is the most recent planetary mission, launched on November 5, 2013, and currently en route to Mars, carrying five instruments (http://www.isro.org), viz. the Mars Color Camera (MCC) to map various morphological features on Mars at varying resolutions and scales using the unique elliptical orbit, the Methane Sensor for Mars (MSM) to measure the total column of methane in the Martian atmosphere, the Thermal Infrared Imaging Spectrometer (TIS) to map surface

  11. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts, which play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at later times, before reaching a stable state in which Heaps' law still holds despite the disappearance of strict Zipf's law. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing the United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early time of a pandemic disease.
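
    For readers unfamiliar with the two scalings, the sketch below estimates a Zipf exponent (rank-frequency slope) and a Heaps exponent (growth of distinct items with stream length) from a synthetic token stream. The generating process and parameters are arbitrary; the record's analysis uses metapopulation epidemic simulations driven by U.S. air-transportation and demographic data.

    ```python
    # Illustrative sketch: estimate Zipf and Heaps exponents from a synthetic
    # heavy-tailed token stream (any stream of "infected location" labels works).
    import numpy as np

    rng = np.random.default_rng(7)
    tokens = rng.zipf(a=2.0, size=50_000)

    # Zipf: slope of the rank-frequency curve in log-log space
    _, counts = np.unique(tokens, return_counts=True)
    freq = np.sort(counts)[::-1]
    ranks = np.arange(1, len(freq) + 1)
    zipf_slope = np.polyfit(np.log(ranks), np.log(freq), 1)[0]

    # Heaps: growth of the number of distinct tokens with stream length
    sizes = np.unique(np.logspace(2, np.log10(len(tokens)), 20).astype(int))
    distinct = [len(np.unique(tokens[:s])) for s in sizes]
    heaps_slope = np.polyfit(np.log(sizes), np.log(distinct), 1)[0]

    print(f"Zipf exponent ≈ {-zipf_slope:.2f}, Heaps exponent ≈ {heaps_slope:.2f}")
    ```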

  12. Magnetic Fields of Extrasolar Planets: Planetary Interiors and Habitability

    Science.gov (United States)

    Lazio, T. Joseph

    2018-06-01

    Ground-based observations showed that Jupiter's radio emission is linked to its planetary-scale magnetic field, and subsequent spacecraft observations have shown that most planets, and some moons, have or had a global magnetic field. Generated by internal dynamos, magnetic fields are one of the few remote sensing means of constraining the properties of planetary interiors. For the Earth, its magnetic field has been speculated to be partially responsible for its habitability, and knowledge of an extrasolar planet's magnetic field may be necessary to assess its habitability. The radio emission from Jupiter and other solar system planets is produced by an electron cyclotron maser, and detections of extrasolar planetary electron cyclotron masers will enable measurements of extrasolar planetary magnetic fields. Based on experience from the solar system, such observations will almost certainly require space-based observations, but they will also be guided by on-going and near-future ground-based observations.This work has benefited from the discussion and participants of the W. M. Keck Institute of Space Studies "Planetary Magnetic Fields: Planetary Interiors and Habitability" and content within a white paper submitted to the National Academy of Science Committee on Exoplanet Science Strategy. Part of this research was carried out at the Jet Propulsion Laboratory, California Institute of Technology, under a contract with the National Aeronautics and Space Administration.

  13. Miniaturisation of imaging spectrometer for planetary exploration

    Science.gov (United States)

    Drossart, Pierre; Sémery, Alain; Réess, Jean-Michel; Combes, Michel

    2017-11-01

    Future planetary exploration on telluric or giant planets will need a new kind of instrumentation combining imaging and spectroscopy at high spectral resolution to achieve new scientific measurements, in particular for atmospheric studies in nadir configuration. We present here a study of a Fourier Transform heterodyne spectrometer, which can achieve these objectives, in the visible or infrared. The system is composed of a Michelson interferometer, whose mirrors have been replaced by gratings, a configuration studied in the early days of Fourier Transform spectroscopy, but only recently reused for space instrumentation, with the availability of large infrared mosaics. A complete study of an instrument is underway, with optical and electronic tests, as well as data processing analysis. This instrument will be proposed for future planetary missions, including ESA/Bepi Colombo Mercury Planetary Orbiter or Earth orbiting platforms.

  14. AutoCNet: A Python library for sparse multi-image correspondence identification for planetary data

    Science.gov (United States)

    Laura, Jason; Rodriguez, Kelvin; Paquette, Adam C.; Dunn, Evin

    2018-01-01

    In this work we describe the AutoCNet library, written in Python, to support the application of computer vision techniques for n-image correspondence identification in remotely sensed planetary images and subsequent bundle adjustment. The library is designed to support exploratory data analysis, algorithm and processing pipeline development, and application at scale in High Performance Computing (HPC) environments for processing large data sets and generating foundational data products. We also present a brief case study illustrating high level usage for the Apollo 15 Metric camera.
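
    The library itself exposes its own pipeline; as a rough, hedged illustration of the underlying technique (feature detection plus descriptor matching between image pairs), an OpenCV sketch is given below. The file names are placeholders and nothing here is the AutoCNet API.

        # Hedged sketch of sparse correspondence identification between two images,
        # illustrating the general technique, not AutoCNet itself. Requires
        # opencv-python and two local image files (paths are placeholders).
        import cv2

        img1 = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)  # placeholder path
        img2 = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)  # placeholder path

        orb = cv2.ORB_create(nfeatures=2000)          # interest-point detector/descriptor
        kp1, des1 = orb.detectAndCompute(img1, None)
        kp2, des2 = orb.detectAndCompute(img2, None)

        # Brute-force Hamming matching with cross-checking keeps mutual best matches.
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

        # The resulting (x, y) pairs are the sparse control points that a bundle
        # adjustment would subsequently refine.
        points = [(kp1[m.queryIdx].pt, kp2[m.trainIdx].pt) for m in matches[:200]]
        print(f"kept {len(points)} candidate correspondences")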

  15. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  16. NASA Johnson Space Center's Planetary Sample Analysis and Mission Science (PSAMS) Laboratory: A National Facility for Planetary Research

    Science.gov (United States)

    Draper, D. S.

    2016-01-01

    NASA Johnson Space Center's (JSC's) Astromaterials Research and Exploration Science (ARES) Division, part of the Exploration Integration and Science Directorate, houses a unique combination of laboratories and other assets for conducting cutting-edge planetary research. These facilities have been accessed for decades by outside scientists, most at no cost and on an informal basis. ARES has thus provided substantial leverage to many past and ongoing science projects at the national and international level. Here we propose to formalize that support via an ARES/JSC Planetary Sample Analysis and Mission Science Laboratory (PSAMS Lab). We maintain three major research capabilities: astromaterial sample analysis, planetary process simulation, and robotic-mission analog research. ARES scientists also support planning for eventual human exploration missions, including astronaut geological training. We outline our facility's capabilities and its potential service to the community at large which, taken together with longstanding ARES experience and expertise in curation and in applied mission science, enable multi-disciplinary planetary research that is possible at no other institution. Comprehensive campaigns incorporating sample data, experimental constraints, and mission science data can be conducted under one roof.

  17. Robots and Humans in Planetary Exploration: Working Together?

    Science.gov (United States)

    Landis, Geoffrey A.; Lyons, Valerie (Technical Monitor)

    2002-01-01

    Today's approach to human-robotic cooperation in planetary exploration focuses on using robotic probes as precursors to human exploration. A large portion of current NASA planetary surface exploration is focused on Mars, and robotic probes are seen as precursors to human exploration in: learning about operation and mobility on Mars; learning about the environment of Mars; mapping the planet and selecting landing sites for human missions; demonstrating critical technology; manufacturing fuel before human presence; and emplacing elements of human-support infrastructure.

  18. Shadow Replication: An Energy-Aware, Fault-Tolerant Computational Model for Green Cloud Computing

    Directory of Open Access Journals (Sweden)

    Xiaolong Cui

    2014-08-01

    As the demand for cloud computing continues to increase, cloud service providers face the daunting challenge of meeting negotiated service-level agreements (SLAs), in terms of reliability and timely performance, while achieving cost-effectiveness. This challenge is compounded by the growing likelihood of failure in large-scale clouds and the rising impact of energy consumption and CO2 emission on the environment. This paper proposes Shadow Replication, a novel fault-tolerance model for cloud computing, which seamlessly addresses failure at scale while minimizing energy consumption and reducing its impact on the environment. The basic tenet of the model is to associate a suite of shadow processes that execute concurrently with the main process, but initially at a much reduced execution speed, to overcome failures as they occur. Two computationally feasible schemes are proposed to achieve Shadow Replication. A performance evaluation framework is developed to analyze these schemes and compare their performance to traditional replication-based fault tolerance methods, focusing on the inherent tradeoff between fault tolerance, the specified SLA and profit maximization. The results show that Shadow Replication leads to significant energy reduction, and is better suited for compute-intensive execution models, where up to 30% more profit can be achieved due to reduced energy consumption.
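
    A minimal sketch of the basic tenet, under an assumed cubic power-speed model and invented numbers (not the paper's evaluation framework): the shadow runs slowly while the main process is healthy and accelerates only after a failure.

        # Hedged sketch of the shadow-replication idea versus classical replication.
        # Power is modeled as proportional to speed cubed; all values are illustrative.
        def energy(work, speed):
            """Energy = power * time, with dynamic power proportional to speed cubed."""
            return (speed ** 3) * (work / speed)

        def traditional_replication(work):
            # Two identical replicas running at full speed from start to finish.
            return 2 * energy(work, 1.0)

        def shadow_replication(work, shadow_speed=0.4, fail_fraction=None):
            if fail_fraction is None:
                # No failure: the shadow only trails along at reduced speed.
                return energy(work, 1.0) + energy(work, shadow_speed)
            # Main fails after completing fail_fraction of the work; the shadow,
            # which has progressed more slowly, speeds up and finishes alone.
            done = work * fail_fraction
            shadow_done = shadow_speed * (done / 1.0)
            return (energy(done, 1.0)                    # main before it fails
                    + energy(shadow_done, shadow_speed)  # shadow while shadowing
                    + energy(work - shadow_done, 1.0))   # shadow after taking over

        print("traditional:", traditional_replication(100.0))
        print("shadow, no failure:", shadow_replication(100.0))
        print("shadow, failure at 50%:", shadow_replication(100.0, fail_fraction=0.5))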

  19. A Synoptic- and Planetary-Scale Analysis of Widespread North American Ice Storms

    Science.gov (United States)

    McCray, C.; Gyakum, J. R.; Atallah, E.

    2017-12-01

    Freezing rain can have devastating impacts, particularly when it persists for many hours. Predicting the precise temperature stratification necessary for long duration freezing rain events remains an important forecast challenge. To better elucidate the conditions responsible for the most severe events, we concentrate on surface observations of long-duration (6 or more hours) freezing rain events over North America from 1979-2016. Furthermore, we analyze cases in which multiple stations observe long-duration events simultaneously. Following these cases over successive days allows us to generate maps of freezing rain "tracks." We then categorize recurring geographic patterns to examine the meteorological conditions leading to these events. While freezing rain is most frequently observed in the northeastern United States and southeastern Canada, long-duration events have affected areas as far south as the Gulf Coast. Notably, a disproportionately large number of very long duration (18 or more hours) events have occurred in the Southern Plains states relative to the climatological annual frequency of freezing rain there. Classification of individual cases shows that most of these very long duration events are associated with a recurring pattern which produces freezing rain along a southwest-northeast swath from Texas/Oklahoma into the northeastern U.S. and eastern Canada. Storms classified within this pattern include the January 1998 and December 2013 ice storms. While this pattern is the most widespread, additional spatially extensive patterns occur. One of these areas extends from the Southern Plains eastward along the Gulf Coast to Georgia and the Carolinas. A third category of events extends from the Upper Midwest into the northeastern U.S. and southeastern Canada. The expansive areal extent and long duration of these events make them especially problematic. An analysis of the planetary- to synoptic-scale settings responsible for these cases and the differences

  20. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  1. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced in cases of elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradient changes through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and to the requirements for a good transfer of the results to an actual vessel. At the same time, an analysis of the possibilities of small-scale model experiments is also presented, mostly in connection with the transfer of results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  2. Planetary Sciences, Geodynamics, Impacts, Mass Extinctions, and Evolution: Developments and Interconnections

    Directory of Open Access Journals (Sweden)

    Jaime Urrutia-Fucugauchi

    2016-01-01

    Research frontiers in geophysics are being expanded, with new fields resulting from technological advances such as the Earth observation satellite network, the global positioning system, high pressure-temperature physics, tomographic methods, and big data computing. Planetary missions and enhanced exoplanet detection capabilities, with the discovery of a wide range of exoplanets and multiple systems, have renewed attention to models of planetary system formation and planet characteristics, Earth's interior, and geodynamics, highlighting the need to better understand the Earth system, its processes, and their spatio-temporal scales. Here we review the emerging interconnections resulting from advances in planetary sciences, geodynamics, high pressure-temperature physics, meteorite impacts, and mass extinctions.

  3. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    The paper discusses which kind of attitude is appropriate when dealing with large-scale changes from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  4. The signatures of the parental cluster on field planetary systems

    Science.gov (United States)

    Cai, Maxwell Xu; Portegies Zwart, Simon; van Elteren, Arjen

    2018-03-01

    Due to the high stellar densities in young clusters, planetary systems formed in these environments are likely to have experienced perturbations from encounters with other stars. We carry out direct N-body simulations of multiplanet systems in star clusters to study the combined effects of stellar encounters and internal planetary dynamics. These planetary systems eventually become part of the Galactic field population as the parental cluster dissolves, which is where most presently known exoplanets are observed. We show that perturbations induced by stellar encounters lead to distinct signatures in the field planetary systems, most prominently, the excited orbital inclinations and eccentricities. Planetary systems that form within the cluster's half-mass radius are more prone to such perturbations. The orbital elements are most strongly excited in the outermost orbit, but the effect propagates to the entire planetary system through secular evolution. Planet ejections may occur long after a stellar encounter. The surviving planets in these reduced systems tend to have, on average, higher inclinations and larger eccentricities compared to systems that were perturbed less strongly. As soon as the parental star cluster dissolves, external perturbations stop affecting the escaped planetary systems, and further evolution proceeds on a relaxation time-scale. The outer regions of these ejected planetary systems tend to relax so slowly that their state carries the memory of their last strong encounter in the star cluster. Regardless of the stellar density, we observe a robust anticorrelation between multiplicity and mean inclination/eccentricity. We speculate that the `Kepler dichotomy' observed in field planetary systems is a natural consequence of their early evolution in the parental cluster.

  5. Galactic planetary nebulae and evolution of their nuclei

    International Nuclear Information System (INIS)

    Khromov, G.S.

    1980-01-01

    The galactic system of planetary nebulae is investigated using a previously constructed distance scale and kinematics data. A strong effect of observational selection is established, with the consequence that, with increasing distance, ever brighter and younger objects are observed. More accurate determinations of the spatial and surface densities of the planetary nebula system are obtained, as well as a new estimate of their total number in the Galaxy, which is approximately 200,000. New estimates are also made of the masses of the nebulae, the absolute magnitudes of the nebulae and their nuclei, and other physical parameters of these objects. The spatial and kinematic characteristics of the planetary nebulae indicate that they are objects of the old type I population. It is possible that their remote ancestors are main-sequence stars of type B8-A5-F or as yet unidentified objects of the same galactic subsystem.

  6. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small-scale model tests performed at the Hydraulics & Coastal Engineering Laboratory, Aalborg University, Denmark, and large-scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from small- and large-scale model tests shows no clear evidence of scale effects for overtopping above a threshold value. In the large-scale model no overtopping was measured for wave heights below Hs = 0.5 m, as the water sank into the voids between the stones on the crest. For low overtopping, scale effects...

  7. Planetary magnetospheres

    International Nuclear Information System (INIS)

    Hill, T.W.; Michel, F.C.

    1975-01-01

    Recent planetary probes have resulted in the realization of the generality of magnetospheric interactions between the solar wind and the planets. The three categories of planetary magnetospheres are discussed: intrinsic slowly rotating magnetospheres, intrinsic rapidly rotating magnetospheres, and induced magnetospheres. (BJG)

  8. Nature of global large-scale sea level variability in relation to atmospheric forcing: A modeling study

    Science.gov (United States)

    Fukumori, Ichiro; Raghunath, Ramanujam; Fu, Lee-Lueng

    1998-03-01

    The relation between large-scale sea level variability and ocean circulation is studied using a numerical model. A global primitive equation model of the ocean is forced by daily winds and climatological heat fluxes corresponding to the period from January 1992 to January 1994. The physical nature of sea level's temporal variability from periods of days to a year is examined on the basis of spectral analyses of model results and comparisons with satellite altimetry and tide gauge measurements. The study elucidates and diagnoses the inhomogeneous physics of sea level change in space and frequency domain. At midlatitudes, large-scale sea level variability is primarily due to steric changes associated with the seasonal heating and cooling cycle of the surface layer. In comparison, changes in the tropics and high latitudes are mainly wind driven. Wind-driven variability exhibits a strong latitudinal dependence in itself. Wind-driven changes are largely baroclinic in the tropics but barotropic at higher latitudes. Baroclinic changes are dominated by the annual harmonic of the first baroclinic mode and are largest off the equator; variabilities associated with equatorial waves are smaller in comparison. Wind-driven barotropic changes exhibit a notable enhancement over several abyssal plains in the Southern Ocean, which is likely due to resonant planetary wave modes in basins semienclosed by discontinuities in potential vorticity. Otherwise, barotropic sea level changes are typically dominated by high frequencies with as much as half the total variance in periods shorter than 20 days, reflecting the frequency spectra of wind stress curl. Implications of the findings with regard to analyzing observations and data assimilation are discussed.

  9. Evolution of the large-scale atmospheric circulation in response to changing ice sheets over the last glacial cycle

    Directory of Open Access Journals (Sweden)

    M. Löfverström

    2014-07-01

    We present modelling results of the atmospheric circulation at the cold periods of marine isotope stage 5b (MIS 5b), MIS 4 and the Last Glacial Maximum (LGM), as well as at the interglacial. The palaeosimulations are forced by ice-sheet reconstructions consistent with geological evidence and by appropriate insolation and greenhouse gas concentrations. The results suggest that the large-scale atmospheric winter circulation remained largely similar to the interglacial for a significant part of the glacial cycle. The proposed explanation is that the ice sheets were located in areas where their interaction with the mean flow is limited. However, the LGM Laurentide Ice Sheet induces a much larger planetary wave that leads to a zonalisation of the Atlantic jet. In summer, the ice-sheet topography dynamically induces warm temperatures in Alaska and central Asia that inhibit the expansion of the ice sheets into these regions. The warm temperatures may also serve as an explanation for the westward propagation of the Eurasian Ice Sheet from MIS 4 to the LGM.

  10. Using Primary Literature for Teaching Undergraduate Planetary Sciences

    Science.gov (United States)

    Levine, J.

    2013-05-01

    Articles from the primary scientific literature can be a valuable teaching tool in undergraduate classrooms. At Colgate University, I emphasize selected research articles in an upper-level undergraduate course in planetary sciences. In addition to their value for conveying specific scientific content, I find that they also impart larger lessons which are especially apt in planetary sciences and allied fields. First, because of the interdisciplinary nature of planetary sciences, students discover that contributions to outstanding problems may arrive from unexpected directions, so they need to be aware of the multi-faceted nature of scientific problems. For instance, after millennia of astrometric attempts, the scale of the Solar System was determined with extraordinary precision with emerging radar technology in the 1960's. Second, students learn the importance of careful work, with due attention to detail. After all, the timescales of planetary formation are encoded in systematic isotopic variations of a few parts in 10,000; in students' own experiences with laboratory data they might well overlook such a small effect. Third, students identify the often-tortuous connections between measured and inferred quantities, which corrects a common student misconception that all quantities of interest (e.g., the age of a meteorite) can be measured directly. Fourth, research articles provide opportunities for students to practice the interpretation of graphical data, since figures often represent a large volume of data in succinct form. Fifth, and perhaps of greatest importance, by considering the uncertainties inherent in reported data, students come to recognize the limits of scientific understanding, the extent to which scientific conclusions are justified (or not), and the lengths to which working scientists go to mitigate their uncertainties. These larger lessons are best mediated by students' own encounters with the articles they read, but require instructors to make

  11. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  12. Immersive Interaction, Manipulation and Analysis of Large 3D Datasets for Planetary and Earth Sciences

    Science.gov (United States)

    Pariser, O.; Calef, F.; Manning, E. M.; Ardulov, V.

    2017-12-01

    We will present the implementation and study of several use cases of utilizing Virtual Reality (VR) for immersive display, interaction and analysis of large and complex 3D datasets. These datasets have been acquired by instruments across several Earth, planetary and solar space robotics missions. First, we will describe the architecture of the common application framework that was developed to input data, interface with VR display devices and program input controllers in various computing environments. Tethered and portable VR technologies will be contrasted and the advantages of each highlighted. We will then present experimental immersive analytics visual constructs that enable augmentation of 3D datasets with 2D ones such as images and statistical and abstract data. We will conclude by presenting a comparative analysis with traditional visualization applications and sharing the feedback provided by our users: scientists and engineers.

  13. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the "Large-scale Structure of the Universe" symposium are considered at a popular level. Described are the cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of the galaxy distribution. Images of cell structures obtained after reprocessing with a computer are given. Three hypotheses - vortical, entropic and adiabatic - suggesting various processes for the origin of galaxies and galaxy clusters are discussed, and a considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a method of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the properties of the disturbances at the pre-galaxy stage. The discussion of problems pertaining to studying the hot gas contained in galaxy clusters and the interactions within galaxy clusters and with the intergalactic medium is recognized to be a notable contribution to the development of theoretical and observational cosmology.

  14. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises, a drilling and blasting method is used to prepare hard rock for excavation. As mining operations approach settlements, the negative effect of large-scale blasts increases. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter, with recording on a laptop. The results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety evaluation of the seismic effect was carried out according to the permissible value of vibration velocity. For cases exceeding permissible values, recommendations were developed to reduce the level of seismic impact.
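
    A hedged sketch of the kind of safety evaluation described, comparing a peak particle velocity derived from three-component velocity samples against an assumed permissible value; all numbers are illustrative.

        # Hedged sketch: evaluating blast-induced ground vibration against a permissible
        # peak particle velocity (PPV). The threshold and waveform are illustrative only.
        import math

        def peak_particle_velocity(vx, vy, vz):
            """PPV as the maximum vector magnitude over the three velocity components."""
            return max(math.sqrt(x * x + y * y + z * z) for x, y, z in zip(vx, vy, vz))

        # Synthetic velocity samples (cm/s) from a three-component seismic receiver.
        vx = [0.02, 0.31, 0.55, 0.12]
        vy = [0.01, 0.18, 0.40, 0.05]
        vz = [0.03, 0.25, 0.61, 0.08]

        permissible = 0.5  # cm/s, assumed illustrative limit, not a regulatory value
        ppv = peak_particle_velocity(vx, vy, vz)
        print(f"PPV = {ppv:.2f} cm/s ->", "exceeds limit" if ppv > permissible else "within limit")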

  15. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI to visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improve the performance and scalability of our approach.
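
    A hedged, CPU-side sketch of the per-pixel linked-list layout described above (the thesis builds this structure on the GPU); the names and payload fields are illustrative.

        # Hedged sketch: every screen pixel stores the head of a list of projected
        # pathline segments, so segments can later be filtered or re-colored without
        # touching the original flow data. Only the data layout is shown here.
        W, H = 4, 3
        head = [[-1] * W for _ in range(H)]   # head[y][x]: index of first node for that pixel, -1 = empty
        nodes = []                            # each node: (segment_payload, next_index)

        def insert(x, y, payload):
            """Prepend a segment record to the list owned by pixel (x, y)."""
            nodes.append((payload, head[y][x]))
            head[y][x] = len(nodes) - 1

        def segments_at(x, y):
            """Walk the linked list for one pixel, e.g. to filter or color-code on the fly."""
            i = head[y][x]
            while i != -1:
                payload, i = nodes[i]
                yield payload

        insert(1, 2, {"pathline_id": 7, "depth": 0.42})
        insert(1, 2, {"pathline_id": 3, "depth": 0.10})
        print(list(segments_at(1, 2)))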

  16. A Replication of the Internal Validity Structure of Three Major Teaching Rating Scales

    Science.gov (United States)

    Peters, Scott J.; Pereira, Nielsen

    2017-01-01

    Even as the importance of replication research has become more widely understood, the field of gifted education is almost completely devoid of replication studies. An area in which replication is a particular problem is in student identification research, since instrument validity is a necessary prerequisite for any sound psychometric decision. To…

  17. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large-scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small-scale (10-100 m) habitat variability on large-scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
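
    A hedged sketch of the distinction the abstract relies on, in assumed standard notation rather than the paper's own: ecological diffusion places the habitat-dependent motility inside the Laplacian, Fickian diffusion inside the divergence, and homogenization replaces the rapidly varying coefficient with an effective large-scale value.

        % Hedged sketch; mu(x) is a habitat-dependent motility, u a population density.
        \begin{align}
          \text{ecological:} \quad & \partial_t u = \nabla^2\!\left[\mu(x)\,u\right], \\
          \text{Fickian:}    \quad & \partial_t u = \nabla\!\cdot\!\left[\mu(x)\,\nabla u\right].
        \end{align}
        % On the large scale, homogenization yields a constant effective coefficient set
        % by the small-scale variation of mu through a harmonic-type average (assumed form):
        \begin{equation}
          \bar{\mu} = \left( \frac{1}{|\Omega|} \int_{\Omega} \frac{dx}{\mu(x)} \right)^{-1},
          \qquad \partial_t c \approx \bar{\mu}\, \nabla^2 c .
        \end{equation}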

  18. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  19. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  20. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8 K motivated the development and construction of four large (300 W) 1.8 K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved niobium-titanium superconductors have once again created interest in large-scale 1.8 K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 W, 1.8 K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low-temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0 K. Magnetic refrigerators of 10 W capacity or higher at 1.8 K are now being developed. The state of the art of large-scale refrigeration in the range under 4 K will be reviewed. 28 refs., 4 figs., 7 tabs

  1. Earth and planetary sciences

    International Nuclear Information System (INIS)

    Wetherill, G.W.; Drake, C.L.

    1980-01-01

    The earth is a dynamic body. The major surface manifestation of this dynamism has been fragmentation of the earth's outer shell and subsequent relative movement of the pieces on a large scale. Evidence for continental movement came from studies of geomagnetism. As the sea floor spreads and new crust is formed, it is magnetized with the polarity of the field at the time of its formation. The plate tectonics model explains the history, nature, and topography of the oceanic crust. When a lithospheric plate surmounted by continental crust collides with an oceanic lithosphere, it is the denser oceanic lithosphere that is subducted. Hence the ancient oceans have vanished and the knowledge of ancient earth will require deciphering the complex continental geological record. Geochemical investigation shows that the source region of continental rocks is not simply the depleted mantle that is characteristic of the source region of basalts produced at the oceanic ridges. The driving force of plate tectonics is convection within the earth, but much remains to be learned about the convection and interior of the earth. A brief discussion of planetary exploration is given

  2. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose an online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve detection precision which outperforms previous results by a large margin and is competitive with the supervised deformable part model (DPM) 5.0 baseline on both data sets.
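
    A hedged sketch of the latent semantic analysis step on a bag-of-visual-words matrix, using scikit-learn's truncated SVD; the random data and the simple variance-based ranking stand in for the authors' LCL pipeline and category selection strategy.

        # Hedged sketch: LSA via truncated SVD on image descriptors so that recurring
        # object/background themes emerge as latent dimensions. Random counts stand in
        # for real visual-word histograms; this is not the authors' implementation.
        import numpy as np
        from sklearn.decomposition import TruncatedSVD

        rng = np.random.default_rng(0)
        X = rng.poisson(1.0, size=(500, 1000)).astype(float)  # 500 images x 1000 visual words

        lsa = TruncatedSVD(n_components=20, random_state=0)    # 20 latent "categories"
        Z = lsa.fit_transform(X)                               # image loadings per latent category

        # Stand-in for category selection: rank latent dimensions by explained variance,
        # then inspect the top-loading images of each candidate category.
        order = np.argsort(lsa.explained_variance_ratio_)[::-1]
        for k in order[:3]:
            top_images = np.argsort(Z[:, k])[::-1][:5]
            print(f"latent category {k}: top images {top_images.tolist()}")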

  3. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  4. Formation of Planetary Populations I: Metallicity & Envelope Opacity Effects

    Science.gov (United States)

    Alessi, Matthew; Pudritz, Ralph E.

    2018-05-01

    We present a comprehensive body of simulations of the formation of exoplanetary populations that incorporate the role of planet traps in slowing planetary migration. The traps we include in our model are the water ice line, the disk heat transition, and the dead zone outer edge. We reduce our model parameter set to two physical parameters: the opacity of the accreting planetary atmospheres (κenv) and a measure of the efficiency of planetary accretion after gap opening (fmax). We perform planet population synthesis calculations based on the initial observed distributions of host star and disk properties - their disk masses, lifetimes, and stellar metallicities. We find the frequency of giant planet formation scales with disk metallicity, in agreement with the observed Jovian planet frequency-metallicity relation. We consider both X-ray and cosmic ray disk ionization models, whose differing ionization rates lead to different dead zone trap locations. In both cases, Jovian planets form in our model out to 2-3 AU, with a distribution at smaller radii dependent on the disk ionization source and the setting of envelope opacity. We find that low values of κenv (0.001-0.002 cm2 g-1) and X-ray disk ionization are necessary to obtain a separation between hot Jupiters near 0.1 AU, and warm Jupiters outside 0.6 AU, a feature present in the data. Our model also produces a large number of super Earths, but the majority are outside of 2 AU. As our model assumes a constant dust to gas ratio, we suggest that radial dust evolution must be taken into account to reproduce the observed super Earth population.

  5. A Successful Replication of the River Visitor Inventory and Monitoring Process for Capacity Management

    Science.gov (United States)

    Kenneth Chilman; James Vogel; Greg Brown; John H. Burde

    2004-01-01

    This paper has three purposes: to discuss (1) case study research and its utility for recreation management decision-making, (2) the recreation visitor inventory and monitoring process developed from case study research, and (3) a successful replication of the process in a large-scale, multi-year application. Although case study research is discussed in research textbooks as...

  6. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have provided a detailed view of the large-scale communications architecture in the IOT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communications architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communications in the IOT.

  7. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  8. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large-scale measurement channel allows the processing of the signal coming from a single neutron sensor in three different operating modes: pulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large-scale channel and a brief description of it are given, together with the results obtained so far in that domain. - The fluctuation mode is studied thoroughly and the improvements to be made are defined. The study of a linear fluctuation channel with automatic switching of scales is described and the results of the tests are given. In this large-scale channel, the data processing method is analog. - To become independent of the problems generated by analog processing of the fluctuation signal, a digital data processing method is tested and the validity of that method is demonstrated. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr]
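
    A hedged sketch of what a digital treatment of the fluctuation (Campbelling-type) signal can look like: the variance of the sampled detector signal grows with the interaction rate. The simulated signal statistics are assumptions for illustration, not the note's electronics.

        # Hedged sketch: estimating the fluctuation-mode observable in software. Here the
        # detector output is modeled as shot-noise-like samples whose variance scales
        # with the event rate; all parameters and statistics are illustrative assumptions.
        import numpy as np

        rng = np.random.default_rng(1)

        def simulated_detector_signal(rate_hz, fs_hz=1e6, duration_s=0.1, pulse_charge=1.0):
            """Poisson pulse counts per sample times a nominal pulse charge."""
            n = int(fs_hz * duration_s)
            return rng.poisson(rate_hz / fs_hz, size=n) * pulse_charge

        for rate in (1e4, 1e5, 1e6):
            x = simulated_detector_signal(rate)
            # Digital fluctuation estimate: sample variance about the mean, which
            # increases with the event rate in this simple model.
            print(f"rate {rate:>9.0f} /s  ->  sample variance {np.var(x):.4f}")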

  9. Statistical and physical study of one-sided planetary nebulae.

    Science.gov (United States)

    Ali, A.; El-Nawawy, M. S.; Pfleiderer, J.

    The authors have investigated the spatial orientation of one-sided planetary nebulae. Most of them, if not all, are interacting with the interstellar medium. Seventy percent of the nebulae in the sample have inclination angles larger than 45° to the Galactic plane and 30% have inclination angles less than 45°. Most of the selected objects are old, evolved planetary nebulae with large dimensions, not far from the Galactic plane; seventy-five percent of the objects are within 160 pc of the Galactic plane. The enhanced concavity arc can be explained physically as a result of the 'planetary nebula-interstellar matter' interaction. The authors discuss the possible effect of the interstellar magnetic field in the concavity regions.

  10. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016; approved for public release, distribution unlimited) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to its measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase... A detailed discussion of the original LSTF features and capabilities can be...

  11. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied due to their application potential in human behavior prediction and recommendation, and in the control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a mobility model with scale-free features and two essential ingredients, i.e., preferential return and exploration, together with a Gaussian distribution assumption on the exploration tendency parameter, is proposed; it outperforms existing human mobility models under scenarios of large geographical scales.
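
    A hedged sketch of a walker with the two named ingredients, exploration and preferential return, with the exploration-tendency parameter drawn from a Gaussian across walkers; the functional forms and parameter values are illustrative assumptions, not the paper's calibrated model.

        # Hedged sketch: exploration / preferential-return mobility. With a probability
        # governed by an exploration-tendency parameter, the walker visits a new place;
        # otherwise it returns to an old one with probability proportional to past visits.
        import random
        from collections import Counter

        def simulate_walker(steps=5000, rho=0.6, gamma=0.21, seed=0):
            rng = random.Random(seed)
            visits = Counter({0: 1})       # location id -> number of visits
            next_new = 1
            for _ in range(steps):
                s = len(visits)            # number of distinct locations seen so far
                if rng.random() < rho * s ** (-gamma):
                    visits[next_new] += 1  # exploration: jump to a brand-new location
                    next_new += 1
                else:
                    # preferential return: pick an old location weighted by visit counts
                    locs, weights = zip(*visits.items())
                    visits[rng.choices(locs, weights=weights)[0]] += 1
            return visits

        # Drawing the exploration tendency from a Gaussian across the population mirrors
        # the assumption highlighted in the abstract (values are illustrative).
        rhos = [max(0.05, min(0.95, random.gauss(0.6, 0.1))) for _ in range(20)]
        print([len(simulate_walker(rho=r, seed=i)) for i, r in enumerate(rhos)])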

  12. Miniaturized Environmental Scanning Electron Microscope for In Situ Planetary Studies

    Science.gov (United States)

    Gaskin, Jessica; Abbott, Terry; Medley, Stephanie; Gregory, Don; Thaisen, Kevin; Taylor, Lawrence; Ramsey, Brian; Jerman, Gregory; Sampson, Allen; Harvey, Ralph

    2010-01-01

    The exploration of remote planetary surfaces calls for the advancement of low-power, highly miniaturized instrumentation. Instruments of this nature that are capable of multiple types of analyses will prove to be particularly useful as we prepare for human return to the Moon, and as we continue to explore increasingly remote locations in our Solar System. To this end, our group has been developing a miniaturized Environmental Scanning Electron Microscope (mESEM) capable of remote investigations of mineralogical samples through in-situ topographical and chemical analysis on a fine scale. The functioning of an SEM is well known: an electron beam is focused to nanometer scale onto a given sample where resulting emissions such as backscattered and secondary electrons, X-rays, and visible light are registered. Raster scanning the primary electron beam across the sample then gives a fine-scale image of the surface topography (texture), crystalline structure and orientation, with accompanying elemental composition. The flexibility in the types of measurements the mESEM is capable of makes it ideally suited for a variety of applications. The mESEM is appropriate for use on multiple planetary surfaces and for a variety of mission goals (from science to non-destructive analysis to ISRU). We will identify potential applications and the range of potential uses related to planetary exploration. Over the past few years we have initiated fabrication and testing of a proof-of-concept assembly, consisting of a cold-field-emission electron gun and custom high-voltage power supply, an electrostatic electron-beam focusing column, and scanning-imaging electronics plus a backscatter detector. Current project status will be discussed. This effort is funded through the NASA Research Opportunities in Space and Earth Sciences - Planetary Instrument Definition and Development Program.

  13. The Role of NASA's Planetary Data System in the Planetary Spatial Data Infrastructure Initiative

    Science.gov (United States)

    Arvidson, R. E.; Gaddis, L. R.

    2017-12-01

    An effort underway in NASA's planetary science community is the Mapping and Planetary Spatial Infrastructure Team (MAPSIT, http://www.lpi.usra.edu/mapsit/). MAPSIT is a community assessment group organized to address a lack of strategic spatial data planning for space science and exploration. Working with MAPSIT, a new initiative of NASA and USGS is the development of a Planetary Spatial Data Infrastructure (PSDI) that builds on extensive knowledge on storing, accessing, and working with terrestrial spatial data. PSDI is a knowledge and technology framework that enables the efficient discovery, access, and exploitation of planetary spatial data to facilitate data analysis, knowledge synthesis, and decision-making. NASA's Planetary Data System (PDS) archives >1.2 petabytes of digital data resulting from decades of planetary exploration and research. The PDS charter focuses on the efficient collection, archiving, and accessibility of these data. The PDS emphasis on data preservation and archiving is complementary to that of the PSDI initiative because the latter utilizes and extends available data to address user needs in the areas of emerging technologies, rapid development of tailored delivery systems, and development of online collaborative research environments. The PDS plays an essential PSDI role because it provides expertise to help NASA missions and other data providers to organize and document their planetary data, to collect and maintain the archives with complete, well-documented and peer-reviewed planetary data, to make planetary data accessible by providing online data delivery tools and search services, and ultimately to ensure the long-term preservation and usability of planetary data. The current PDS4 information model extends and expands PDS metadata and relationships between and among elements of the collections. The PDS supports data delivery through several node services, including the Planetary Image Atlas (https

  14. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high-risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  15. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  16. The Internal Consistency and Validity of the Vaccination Attitudes Examination Scale: A Replication Study.

    Science.gov (United States)

    Wood, Louise; Smith, Michael; Miller, Christopher B; O'Carroll, Ronan E

    2018-06-19

    Vaccinations are important preventative health behaviors. The recently developed Vaccination Attitudes Examination (VAX) Scale aims to measure the reasons behind refusal/hesitancy regarding vaccinations. The aim of this replication study is to conduct an independent test of the newly developed VAX Scale in the UK. We tested (a) internal consistency (Cronbach's α); (b) convergent validity by assessing its relationships with beliefs about medication, medical mistrust, and perceived sensitivity to medicines; and (c) construct validity by testing how well the VAX Scale discriminated between vaccinators and nonvaccinators. A sample of 243 UK adults completed the VAX Scale, the Beliefs About Medicines Questionnaire, the Perceived Sensitivity to Medicines Scale, and the Medical Mistrust Index, in addition to demographics of age, gender, education levels, and social deprivation. Participants were asked (a) whether they received an influenza vaccination in the past year and (b) if they had a young child, whether they had vaccinated the young child against influenza in the past year. The VAX (a) demonstrated high internal consistency (α = .92); (b) was positively correlated with medical mistrust and beliefs about medicines, and less strongly correlated with perceived sensitivity to medicines; and (c) successfully differentiated parental influenza vaccinators from nonvaccinators. The VAX demonstrated good internal consistency, convergent validity, and construct validity in an independent UK sample. It appears to be a useful measure to help us understand the health beliefs that promote or deter vaccination behavior.
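
    A hedged sketch of the internal-consistency statistic reported above, Cronbach's alpha, computed on fabricated Likert-like responses rather than the study's data.

        # Hedged sketch: Cronbach's alpha from an items-by-respondents matrix.
        # The response matrix below is fabricated illustrative data only.
        import numpy as np

        def cronbach_alpha(items):
            """items: 2-D array, rows = respondents, columns = scale items."""
            items = np.asarray(items, dtype=float)
            k = items.shape[1]
            item_variances = items.var(axis=0, ddof=1)
            total_variance = items.sum(axis=1).var(ddof=1)
            return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

        rng = np.random.default_rng(2)
        latent = rng.normal(size=(243, 1))                           # one underlying attitude factor
        responses = latent + rng.normal(scale=0.8, size=(243, 12))   # 12 correlated Likert-like items
        print(f"alpha = {cronbach_alpha(responses):.2f}")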

  17. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Background: Large-scale molecular evolutionary analyses of protein coding sequences require a number of inter-related preparatory steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can present significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods: We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results: We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion: Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.
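
    For readers unfamiliar with the quantity codeML estimates and VESPA automates, the standard selective-pressure ratio is summarized below (the notation is the conventional one, not taken from the VESPA paper itself).

        % Hedged reminder of the standard selective-pressure measure on protein-coding genes.
        \begin{equation}
          \omega = \frac{d_N}{d_S}, \qquad
          \omega < 1 \;\text{(purifying selection)}, \quad
          \omega = 1 \;\text{(neutral evolution)}, \quad
          \omega > 1 \;\text{(positive selection)} .
        \end{equation}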

  18. Influence of Planetary Protection Guidelines on Waste Management Operations

    Science.gov (United States)

    Hogan, John A.; Fisher, John W.; Levri, Julie A.; Wignarajah, Kanapathipi; Race, Margaret S.; Stabekis, Perry D.; Rummel, John D.

    2005-01-01

    Newly outlined missions in the Space Exploration Initiative include extended human habitation on Mars. During these missions, large amounts of waste materials will be generated in solid, liquid and gaseous form. Returning these wastes to Earth will be extremely costly, and the wastes will therefore likely remain on Mars. Untreated, these wastes are a reservoir of live/dead organisms and molecules considered to be "biomarkers" (i.e., indicators of life). If released to the planetary surface, these materials can potentially confound exobiology experiments and disrupt Martian ecology indefinitely (if existent). Waste management systems must therefore be specifically designed to control release of problematic materials both during the active phase of the mission, and for any specified post-mission duration. To effectively develop waste management requirements for Mars missions, planetary protection guidelines must first be established. While previous policies for Apollo lunar missions exist, it is anticipated that the increased probability of finding evidence of life on Mars, as well as the lengthy mission durations, will initially lead to more conservative planetary protection measures. To facilitate the development of overall requirements for both waste management and planetary protection for future missions, a workshop was conducted to identify how these two areas interface, and to establish a preliminary set of planetary protection guidelines that address waste management operations. This paper provides background regarding past and current planetary protection and waste management issues, and their interactions. A summary of the recommended planetary protection guidelines, anticipated ramifications and research needs for waste management system design for both forward (Mars) and backward (Earth) contamination is also provided.

  19. Meta-analysis of genome-wide association data and large-scale replication identifies additional susceptibility loci for type 2 diabetes

    DEFF Research Database (Denmark)

    Zeggini, Eleftheria; Scott, Laura J; Saxena, Richa

    2008-01-01

    Genome-wide association (GWA) studies have identified multiple loci at which common variants modestly but reproducibly influence risk of type 2 diabetes (T2D). Established associations to common and rare variants explain only a small proportion of the heritability of T2D. As previously published analyses had limited power to identify variants with modest effects, we carried out meta-analysis of three T2D GWA scans comprising 10,128 individuals of European descent and approximately 2.2 million SNPs (directly genotyped and imputed), followed by replication testing in an independent sample...

  20. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of shortages of precipitation is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province; the average size of a system reached 95 ha. In 1989 there were 98 systems, covering more than 10 130 ha. The study was conducted in 1986-1998 on 7 large sprinkler systems with areas ranging from 230 to 520 hectares. After the introduction of the market economy in the early 1990s and the ownership changes in agriculture, the large-scale sprinkler systems underwent significant or total devastation. Land of the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinkler systems to a very small extent. This involved a change in crop structure, a change in demand structure and an increase in operating costs. There has also been a threefold increase in electricity prices. In practice, operation of large-scale irrigation encountered all kinds of barriers and limitations: system design shortcomings, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinkler systems. A survey of the local area showed the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of the assets previously invested in the sprinkler systems.

  1. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99⋅1% by inductively coupled plasma optical emission spectroscopy on a large scale. Keywords. Sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method. 1. Introduction. Zirconia has attracted the attention of many scientists because of its tremendous thermal, mechanical ...

  2. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential risk to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data are much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards of a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) were conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate, and therefore the physics and hazards of large LNG spills and fires.
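
    The quantity driving the reduced-scale tests, the nondimensional heat release rate, and its link to flame height can be illustrated with textbook relations. The sketch below uses the standard definition of Q* and the widely used Heskestad flame-height correlation as stand-ins; these are not necessarily the correlations fitted in the Sandia report, and the numbers are illustrative rather than the Phoenix test conditions.

```python
import math

def q_star(Q_dot_kW, D_m, rho=1.2, cp=1.005, T_inf=293.0, g=9.81):
    """Nondimensional heat release rate Q* (heat release in kW, diameter in m)."""
    return Q_dot_kW / (rho * cp * T_inf * math.sqrt(g * D_m) * D_m**2)

def heskestad_flame_height(Q_dot_kW, D_m):
    """Mean flame height (m) from the Heskestad correlation L = 0.235 Q^0.4 - 1.02 D."""
    return 0.235 * Q_dot_kW**0.4 - 1.02 * D_m

# Illustrative numbers only (not the Phoenix test conditions).
D = 21.0        # pool diameter, m
Q = 2.0e6       # total heat release rate, kW
print(f"Q* = {q_star(Q, D):.2f}, L/D = {heskestad_flame_height(Q, D) / D:.2f}")
```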

  3. Modeling inhomogeneous DNA replication kinetics.

    Directory of Open Access Journals (Sweden)

    Michel G Gauthier

    Full Text Available In eukaryotic organisms, DNA replication is initiated at a series of chromosomal locations called origins, where replication forks are assembled and proceed bidirectionally to replicate the genome. The distribution and firing rate of these origins, in conjunction with the velocity at which forks progress, dictate the program of the replication process. Previous attempts at modeling DNA replication in eukaryotes have focused on cases where the firing rate and the velocity of replication forks are homogeneous, or uniform, across the genome. However, it is now known that there are large variations in origin activity along the genome, and variations in fork velocities can also take place. Here, we generalize previous approaches to modeling replication to allow for arbitrary spatial variation of initiation rates and fork velocities. We derive rate equations for left- and right-moving forks and for replication probability over time that can be solved numerically to obtain the mean-field replication program. This method accurately reproduces the results of DNA replication simulations. We also successfully adapted our approach to the inverse problem of fitting measurements of DNA replication performed on single DNA molecules. Since such measurements are performed on a specified portion of the genome, the examined DNA molecules may be replicated by forks that originate either within the studied molecule or outside of it. This problem was solved by using an effective flux of incoming replication forks at the model boundaries to represent the origin activity outside the studied region. Using this approach, we show that reliable inferences can be made about the replication of specific portions of the genome even if the amount of data that can be obtained from single-molecule experiments is generally limited.
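
    To make the modelling idea concrete, the following is a minimal stochastic sketch of inhomogeneous replication kinetics, not the authors' mean-field code: origin firing times are drawn from a position-dependent rate, forks travel at a position-dependent velocity, and each site is replicated by the earliest arriving fork. Segment length, rates and velocities are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretised genomic segment; the rates and velocities below are illustrative only.
L = 1000                                                  # number of sites
x = np.arange(L)
I = 2e-4 * (1 + np.exp(-((x - 300.0) ** 2) / 2.0e4))      # firing rate per site per unit time
v = 1.0 + 0.5 * np.sin(2 * np.pi * x / L)                 # fork velocity, sites per unit time

# Sample a potential firing time for every site (exponential waiting time).
t_fire = rng.exponential(1.0 / I)

# Cumulated fork travel time along the segment (piecewise-constant velocity).
tau = np.concatenate(([0.0], np.cumsum(1.0 / v[:-1])))    # time to reach site i from site 0

# Each site is replicated by the earliest fork arriving from any origin, left or right;
# the arrival time from origin j to site i is |tau[i] - tau[j]| plus the firing time.
arrival = np.abs(tau[:, None] - tau[None, :]) + t_fire[None, :]
t_rep = arrival.min(axis=1)

# Replication fraction over time, the mean-field quantity modelled in the paper.
t_grid = np.linspace(0, t_rep.max(), 200)
f = (t_rep[None, :] <= t_grid[:, None]).mean(axis=1)
print(f"half of the segment replicated by t ≈ {t_grid[np.searchsorted(f, 0.5)]:.1f}")
```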

  4. Using Intel Xeon Phi to accelerate the WRF TEMF planetary boundary layer scheme

    Science.gov (United States)

    Mielikainen, Jarno; Huang, Bormin; Huang, Allen

    2014-05-01

    The Weather Research and Forecasting (WRF) model is designed for numerical weather prediction and atmospheric research. The WRF software infrastructure consists of several components such as dynamic solvers and physics schemes. Numerical models are used to resolve the large-scale flow. However, subgrid-scale parameterizations are needed to estimate small-scale properties (e.g., boundary layer turbulence and convection, clouds, radiation), which have a significant influence on the resolved scale due to the complex nonlinear nature of the atmosphere. For the cloudy planetary boundary layer (PBL), it is fundamental to parameterize vertical turbulent fluxes and subgrid-scale condensation in a realistic manner. A parameterization based on the Total Energy - Mass Flux (TEMF) approach that unifies turbulence and moist convection components produces better results than the other PBL schemes. For that reason, the TEMF scheme was chosen as the PBL scheme to optimize for the Intel Many Integrated Core (MIC) architecture, which ushers in a new era of supercomputing speed, performance, and compatibility. It allows developers to run code at trillions of calculations per second using a familiar programming model. In this paper, we present our optimization results for the TEMF planetary boundary layer scheme. The optimizations that were performed were quite generic in nature. They included vectorization of the code to utilize the vector units inside each CPU. Furthermore, memory access was improved by scalarizing some of the intermediate arrays. The results show that the optimizations improved MIC performance by 14.8x. Furthermore, the optimizations increased CPU performance by 2.6x compared to the original multi-threaded code on a quad-core Intel Xeon E5-2603 running at 1.8 GHz. Compared to the optimized code running on a single CPU socket, the optimized MIC code is 6.2x faster.

  5. Agriculture production as a major driver of the Earth system exceeding planetary boundaries

    Directory of Open Access Journals (Sweden)

    Bruce M. Campbell

    2017-12-01

    Full Text Available We explore the role of agriculture in destabilizing the Earth system at the planetary scale, through examining nine planetary boundaries, or "safe limits": land-system change, freshwater use, biogeochemical flows, biosphere integrity, climate change, ocean acidification, stratospheric ozone depletion, atmospheric aerosol loading, and introduction of novel entities. Two planetary boundaries, biosphere integrity and biogeochemical flows, have been fully transgressed (i.e., are at high risk), and agriculture has been the major driver of the transgression. Three are in a zone of uncertainty (i.e., at increasing risk), with agriculture the major driver of two of those, land-system change and freshwater use, and a significant contributor to the third, climate change. Agriculture is also a significant or major contributor to change for many of those planetary boundaries still in the safe zone. To reduce the role of agriculture in transgressing planetary boundaries, many interventions will be needed, including those in broader food systems.

  6. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.

  7. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries: Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  8. The Impact of the Implementation Cost of Replication in Data Grid Job Scheduling

    Directory of Open Access Journals (Sweden)

    Babar Nazir

    2018-05-01

    Full Text Available Data Grids deal with geographically-distributed large-scale data-intensive applications. Scheduling schemes for data grids attempt not only to improve data access time, but also to improve the availability of data at the nodes where the data requests are generated. Data replication techniques manage large data by storing a number of data files efficiently. In this paper, we propose a centralized dynamic scheduling strategy with replica placement strategies (CDSS-RPS). CDSS-RPS schedules data and tasks so as to minimize the implementation cost and the data transfer time. CDSS-RPS consists of two algorithms, namely (a) centralized dynamic scheduling (CDS) and (b) replica placement strategy (RPS). CDS considers the computing capacity of a node and finds an appropriate location for the job. RPS attempts to improve file access time by using replication on the basis of the number of accesses, the storage capacity of a computing node, and the response time of a requested file. Extensive simulations are carried out to demonstrate the effectiveness of the proposed strategy. Simulation results demonstrate that the replication and scheduling strategies improve the implementation cost and the average access time significantly.
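
    As a rough illustration of the kind of replica-placement decision attributed to RPS, the sketch below scores candidate nodes on access count, free storage and response time. The node attributes, weights and scoring function are hypothetical and are not the paper's actual formulation.

```python
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    free_storage_gb: float      # remaining storage capacity
    response_time_ms: float     # measured response time for the requested file
    access_count: int           # number of accesses to the file from this node

def placement_score(node: Node, file_size_gb: float,
                    w_access=0.5, w_storage=0.3, w_latency=0.2) -> float:
    """Higher score = better candidate for hosting a replica (weights are hypothetical)."""
    if node.free_storage_gb < file_size_gb:
        return float("-inf")                      # cannot hold the replica at all
    return (w_access * node.access_count
            + w_storage * node.free_storage_gb
            - w_latency * node.response_time_ms)

# Illustrative grid nodes.
nodes = [Node("grid-a", 120.0, 40.0, 85),
         Node("grid-b", 15.0, 12.0, 150),
         Node("grid-c", 300.0, 95.0, 30)]
best = max(nodes, key=lambda n: placement_score(n, file_size_gb=20.0))
print(f"place replica on {best.name}")
```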

  9. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  10. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; hide

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  11. The planetary water drama: Dual task of feeding humanity and curbing climate change

    Science.gov (United States)

    Rockström, J.; Falkenmark, M.; Lannerstad, M.; Karlberg, L.

    2012-08-01

    This paper analyses the potential conflict between resilience of the Earth system and global freshwater requirements for the dual task of carbon sequestration to reduce CO2 in the atmosphere, and food production to feed humanity by 2050. It makes an attempt to assess the order of magnitude of the increased consumptive water use involved and analyses the implications from two parallel perspectives: the global perspective of human development within a “safe operating space” with regard to the definition of the Planetary Boundary for freshwater, and the social-ecological implications at the regional river basin scale in terms of sharpening water shortages and threats to aquatic ecosystems. The paper shows that the consumptive water use involved in the dual task would both transgress the proposed planetary boundary range for global consumptive freshwater use and further exacerbate already severe river depletion, causing societal problems related to water shortage and water allocation. Thus, strategies that rely on sequestration of CO2 as a mitigation strategy must recognize the high freshwater costs involved, implying that the key climate mitigation strategy must be to reduce emissions. The paper finally highlights the need to analyze both water and carbon tradeoffs of anticipated large-scale biofuel production as a climate change mitigation strategy, to reveal the gains and impacts of this approach in contrast to carbon sequestration strategies.

  12. Pyrimidine dimers block simian virus 40 replication forks

    International Nuclear Information System (INIS)

    Berger, C.A.; Edenberg, H.J.

    1986-01-01

    UV light produces lesions, predominantly pyrimidine dimers, which inhibit DNA replication in mammalian cells. The mechanism of inhibition is controversial: is synthesis of a daughter strand halted at a lesion while the replication fork moves on and reinitiates downstream, or is fork progression itself blocked for some time at the site of a lesion? We directly addressed this question by using electron microscopy to examine the distances of replication forks from the origin in unirradiated and UV-irradiated simian virus 40 chromosomes. If UV lesions block replication fork progression, the forks should be asymmetrically located in a large fraction of the irradiated molecules; if replication forks move rapidly past lesions, the forks should be symmetrically located. A large fraction of the simian virus 40 replication forks in irradiated molecules were asymmetrically located, demonstrating that UV lesions present at the frequency of pyrimidine dimers block replication forks. As a mechanism for this fork blockage, we propose that polymerization of the leading strand makes a significant contribution to the energetics of fork movement, so any lesion in the template for the leading strand which blocks polymerization should also block fork movement

  13. The Strength Analysis of Differential Planetary Gears of Gearbox for Concrete Mixer Truck

    Science.gov (United States)

    Bae, M. H.; Bae, T. Y.; Kim, D. J.

    2018-03-01

    The power train of the mixer gearbox for a concrete mixer truck includes differential planetary gears to obtain the large reduction ratio needed for operating the mixer drum with a simple structure. The planetary gears are a very important part of the mixer gearbox, where strength problems, namely gear bending stress, gear compressive stress and scoring failure, are the main concern. In the present study, the specifications of the differential planetary gears are calculated, and the gear bending and compressive stresses as well as the scoring factor of the differential planetary gears are analyzed for an optimal design of the mixer gearbox with respect to cost and reliability. The actual gear bending and compressive stresses of the differential planetary gears are analyzed using the Lewis and Hertz equations, and the calculated specifications of the differential planetary gears are verified by evaluating the results against the allowable bending and compressive stresses from the stress versus number of cycles (S-N) curves of the gears. In addition, we also analyze the actual gear scoring factor and evaluate the possibility of scoring failure of the differential planetary gears.
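
    For orientation, the Lewis tooth-bending check mentioned above can be sketched as follows. The formula is the standard Lewis equation; the load, module, face width and form factor are illustrative values, not the mixer-gearbox specification.

```python
def lewis_bending_stress(W_t_N: float, face_width_mm: float,
                         module_mm: float, lewis_form_factor: float) -> float:
    """Nominal tooth-root bending stress (MPa) from the Lewis equation:
    sigma = W_t / (b * m * Y), with the tangential tooth load W_t in newtons."""
    return W_t_N / (face_width_mm * module_mm * lewis_form_factor)

# Illustrative values only, not the gearbox specification from the paper.
sigma = lewis_bending_stress(W_t_N=4500.0, face_width_mm=30.0,
                             module_mm=3.0, lewis_form_factor=0.32)
print(f"bending stress ≈ {sigma:.0f} MPa")   # compare against the allowable S-N value
```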

  14. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (αeo = Te⊥/Te||) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (αio = Ti⊥/Ti||). Electron anisotropy effects are known to be ineffective in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for a large-scale reconnection to be achieved. We then investigate whether introduction of electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each lateral length is 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island to enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This substantially accelerates the triggering time scale but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and makes even quicker triggering happen when the LHDI effects set in. Furthermore, the saturation level is seen to be elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  15. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as the ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences for creating large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications where we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  16. 3He Abundances in Planetary Nebulae

    Science.gov (United States)

    Guzman-Ramirez, Lizette

    2017-10-01

    Determination of the 3He isotope is important to many fields of astrophysics, including stellar evolution, chemical evolution, and cosmology. The isotope is produced in stars which evolve through the planetary nebula phase. Planetary nebulae are the final evolutionary phase of low- and intermediate-mass stars, where the extensive mass lost by the star on the asymptotic giant branch is ionised by the emerging white dwarf. This ejecta quickly disperses and merges with the surrounding ISM. 3He abundances in planetary nebulae have been derived from the hyperfine transition of the ionised 3He, 3He+, at the radio rest frequency 8.665 GHz. 3He abundances in PNe can help test models of the chemical evolution of the Galaxy. Many hours have been put into trying to detect this line, using telescopes like the Effelsberg 100m dish of the Max Planck Institute for Radio Astronomy, the National Radio Astronomy Observatory (NRAO) 140-foot telescope, the NRAO Very Large Array, the Arecibo antenna, the Green Bank Telescope, and only just recently, the Deep Space Station 63 antenna from the Madrid Deep Space Communications Complex.

  17. Identifying gene-environment interactions in schizophrenia: contemporary challenges for integrated, large-scale investigations.

    Science.gov (United States)

    van Os, Jim; Rutten, Bart P; Myin-Germeys, Inez; Delespaul, Philippe; Viechtbauer, Wolfgang; van Zelst, Catherine; Bruggeman, Richard; Reininghaus, Ulrich; Morgan, Craig; Murray, Robin M; Di Forti, Marta; McGuire, Philip; Valmaggia, Lucia R; Kempton, Matthew J; Gayer-Anderson, Charlotte; Hubbard, Kathryn; Beards, Stephanie; Stilo, Simona A; Onyejiaka, Adanna; Bourque, Francois; Modinos, Gemma; Tognin, Stefania; Calem, Maria; O'Donovan, Michael C; Owen, Michael J; Holmans, Peter; Williams, Nigel; Craddock, Nicholas; Richards, Alexander; Humphreys, Isla; Meyer-Lindenberg, Andreas; Leweke, F Markus; Tost, Heike; Akdeniz, Ceren; Rohleder, Cathrin; Bumb, J Malte; Schwarz, Emanuel; Alptekin, Köksal; Üçok, Alp; Saka, Meram Can; Atbaşoğlu, E Cem; Gülöksüz, Sinan; Gumus-Akay, Guvem; Cihan, Burçin; Karadağ, Hasan; Soygür, Haldan; Cankurtaran, Eylem Şahin; Ulusoy, Semra; Akdede, Berna; Binbay, Tolga; Ayer, Ahmet; Noyan, Handan; Karadayı, Gülşah; Akturan, Elçin; Ulaş, Halis; Arango, Celso; Parellada, Mara; Bernardo, Miguel; Sanjuán, Julio; Bobes, Julio; Arrojo, Manuel; Santos, Jose Luis; Cuadrado, Pedro; Rodríguez Solano, José Juan; Carracedo, Angel; García Bernardo, Enrique; Roldán, Laura; López, Gonzalo; Cabrera, Bibiana; Cruz, Sabrina; Díaz Mesa, Eva Ma; Pouso, María; Jiménez, Estela; Sánchez, Teresa; Rapado, Marta; González, Emiliano; Martínez, Covadonga; Sánchez, Emilio; Olmeda, Ma Soledad; de Haan, Lieuwe; Velthorst, Eva; van der Gaag, Mark; Selten, Jean-Paul; van Dam, Daniella; van der Ven, Elsje; van der Meer, Floor; Messchaert, Elles; Kraan, Tamar; Burger, Nadine; Leboyer, Marion; Szoke, Andrei; Schürhoff, Franck; Llorca, Pierre-Michel; Jamain, Stéphane; Tortelli, Andrea; Frijda, Flora; Vilain, Jeanne; Galliot, Anne-Marie; Baudin, Grégoire; Ferchiou, Aziz; Richard, Jean-Romain; Bulzacka, Ewa; Charpeaud, Thomas; Tronche, Anne-Marie; De Hert, Marc; van Winkel, Ruud; Decoster, Jeroen; Derom, Catherine; Thiery, Evert; Stefanis, Nikos C; Sachs, Gabriele; Aschauer, Harald; Lasser, Iris; Winklbaur, Bernadette; Schlögelhofer, Monika; Riecher-Rössler, Anita; Borgwardt, Stefan; Walter, Anna; Harrisberger, Fabienne; Smieskova, Renata; Rapp, Charlotte; Ittig, Sarah; Soguel-dit-Piquard, Fabienne; Studerus, Erich; Klosterkötter, Joachim; Ruhrmann, Stephan; Paruch, Julia; Julkowski, Dominika; Hilboll, Desiree; Sham, Pak C; Cherny, Stacey S; Chen, Eric Y H; Campbell, Desmond D; Li, Miaoxin; Romeo-Casabona, Carlos María; Emaldi Cirión, Aitziber; Urruela Mora, Asier; Jones, Peter; Kirkbride, James; Cannon, Mary; Rujescu, Dan; Tarricone, Ilaria; Berardi, Domenico; Bonora, Elena; Seri, Marco; Marcacci, Thomas; Chiri, Luigi; Chierzi, Federico; Storbini, Viviana; Braca, Mauro; Minenna, Maria Gabriella; Donegani, Ivonne; Fioritti, Angelo; La Barbera, Daniele; La Cascia, Caterina Erika; Mulè, Alice; Sideli, Lucia; Sartorio, Rachele; Ferraro, Laura; Tripoli, Giada; Seminerio, Fabio; Marinaro, Anna Maria; McGorry, Patrick; Nelson, Barnaby; Amminger, G Paul; Pantelis, Christos; Menezes, Paulo R; Del-Ben, Cristina M; Gallo Tenan, Silvia H; Shuhama, Rosana; Ruggeri, Mirella; Tosato, Sarah; Lasalvia, Antonio; Bonetto, Chiara; Ira, Elisa; Nordentoft, Merete; Krebs, Marie-Odile; Barrantes-Vidal, Neus; Cristóbal, Paula; Kwapil, Thomas R; Brietzke, Elisa; Bressan, Rodrigo A; Gadelha, Ary; Maric, Nadja P; Andric, Sanja; Mihaljevic, Marina; Mirjanic, Tijana

    2014-07-01

    Recent years have seen considerable progress in epidemiological and molecular genetic research into environmental and genetic factors in schizophrenia, but methodological uncertainties remain with regard to validating environmental exposures, and the population risk conferred by individual molecular genetic variants is small. There are now also a limited number of studies that have investigated molecular genetic candidate gene-environment interactions (G × E), however, so far, thorough replication of findings is rare and G × E research still faces several conceptual and methodological challenges. In this article, we aim to review these recent developments and illustrate how integrated, large-scale investigations may overcome contemporary challenges in G × E research, drawing on the example of a large, international, multi-center study into the identification and translational application of G × E in schizophrenia. While such investigations are now well underway, new challenges emerge for G × E research from late-breaking evidence that genetic variation and environmental exposures are, to a significant degree, shared across a range of psychiatric disorders, with potential overlap in phenotype. © The Author 2014. Published by Oxford University Press on behalf of the Maryland Psychiatric Research Center. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  18. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)

  19. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available The paper investigates top-flange fatigue damage of a large-scale wind turbine generator. It establishes a finite element model of the top-flange connection system with the finite element analysis software MSC Marc/Mentat, analyzes its fatigue strain, implements load simulation of the flange fatigue working condition with the Bladed software, acquires the flange fatigue load spectrum with the rain-flow counting method and, finally, realizes the fatigue analysis of the top flange with the fatigue analysis software MSC Fatigue and the Palmgren-Miner linear cumulative damage theory. The analysis provides a new approach to flange fatigue analysis of large-scale wind turbine generators and possesses practical engineering value.
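
    The damage-summation step described above, rainflow-counted cycles combined through the Palmgren-Miner rule, can be sketched generically. The Basquin-type S-N constants and the cycle histogram below are placeholders, not the turbine's data.

```python
def cycles_to_failure(stress_range_MPa: float, C: float = 1.0e12, m: float = 3.0) -> float:
    """Basquin-type S-N curve N = C / S^m (C and m are illustrative placeholders)."""
    return C / stress_range_MPa ** m

def miner_damage(cycle_histogram: dict) -> float:
    """Palmgren-Miner cumulative damage D = sum(n_i / N_i) from a
    {stress_range: counted_cycles} histogram, e.g. from rainflow counting."""
    return sum(n / cycles_to_failure(s) for s, n in cycle_histogram.items())

# Illustrative rainflow output: stress range (MPa) -> number of counted cycles.
histogram = {40.0: 2.0e6, 80.0: 3.0e5, 160.0: 1.0e4}
D = miner_damage(histogram)
print(f"cumulative damage D = {D:.3f} ({'fails' if D >= 1 else 'survives'} the design life)")
```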

  20. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, real-time simulation of large-scale floods is very important for flood prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional, shallow water model based on the unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance the numerical stability. An adaptive method is proposed to improve the running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
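
    To illustrate a Godunov-type finite volume scheme with a simple wet/dry treatment in the most reduced setting, here is a one-dimensional toy dam-break using a Rusanov (local Lax-Friedrichs) flux and a depth-clipping dry threshold. It is not the paper's unstructured 2D model, and all parameters are illustrative.

```python
import numpy as np

g, h_dry = 9.81, 1e-6                       # gravity (m/s^2); wet/dry depth threshold (m)
N, dx, t_end = 200, 5.0, 40.0               # number of cells, cell size (m), simulated time (s)

# Dam-break style initial condition on a flat, frictionless bed (illustrative only).
h = np.where(np.arange(N) < N // 2, 2.0, 0.0)    # water depth; right half starts dry
hu = np.zeros(N)                                  # discharge per unit width

def physical_flux(h, hu):
    """Flux vector (mass, momentum) of the 1D shallow water equations."""
    u = np.where(h > h_dry, hu / np.maximum(h, h_dry), 0.0)
    return np.stack([hu, hu * u + 0.5 * g * h**2])

t = 0.0
while t < t_end:
    u = np.where(h > h_dry, hu / np.maximum(h, h_dry), 0.0)
    c = np.abs(u) + np.sqrt(g * np.maximum(h, 0.0))          # wave speed per cell
    dt = min(0.4 * dx / max(c.max(), 1e-12), t_end - t)      # CFL-limited time step

    # Rusanov (local Lax-Friedrichs) flux at each interior cell interface.
    UL = np.stack([h[:-1], hu[:-1]])
    UR = np.stack([h[1:], hu[1:]])
    a = np.maximum(c[:-1], c[1:])
    F = 0.5 * (physical_flux(*UL) + physical_flux(*UR)) - 0.5 * a * (UR - UL)

    # Finite-volume update; the first and last cells are held fixed as crude boundaries.
    h[1:-1]  -= dt / dx * (F[0, 1:] - F[0, :-1])
    hu[1:-1] -= dt / dx * (F[1, 1:] - F[1, :-1])
    h = np.maximum(h, 0.0)                   # simple wet/dry fix: clip negative depths
    hu[h <= h_dry] = 0.0                     # no momentum in (nearly) dry cells
    t += dt

print(f"wet cells after {t_end:.0f} s: {(h > h_dry).sum()} of {N}")
```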

  1. The Immersive Virtual Reality Lab: Possibilities for Remote Experimental Manipulations of Autonomic Activity on a Large Scale

    Directory of Open Access Journals (Sweden)

    Joshua Juvrud

    2018-05-01

    Full Text Available There is a need for large-scale remote data collection in a controlled environment, and the in-home availability of virtual reality (VR and the commercial availability of eye tracking for VR present unique and exciting opportunities for researchers. We propose and provide a proof-of-concept assessment of a robust system for large-scale in-home testing using consumer products that combines psychophysiological measures and VR, here referred to as a Virtual Lab. For the first time, this method is validated by correlating autonomic responses, skin conductance response (SCR, and pupillary dilation, in response to a spider, a beetle, and a ball using commercially available VR. Participants demonstrated greater SCR and pupillary responses to the spider, and the effect was dependent on the proximity of the stimuli to the participant, with a stronger response when the spider was close to the virtual self. We replicated these effects across two experiments and in separate physical room contexts to mimic variability in home environment. Together, these findings demonstrate the utility of pupil dilation as a marker of autonomic arousal and the feasibility to assess this in commercially available VR hardware and support a robust Virtual Lab tool for massive remote testing.

  2. The Immersive Virtual Reality Lab: Possibilities for Remote Experimental Manipulations of Autonomic Activity on a Large Scale.

    Science.gov (United States)

    Juvrud, Joshua; Gredebäck, Gustaf; Åhs, Fredrik; Lerin, Nils; Nyström, Pär; Kastrati, Granit; Rosén, Jörgen

    2018-01-01

    There is a need for large-scale remote data collection in a controlled environment, and the in-home availability of virtual reality (VR) and the commercial availability of eye tracking for VR present unique and exciting opportunities for researchers. We propose and provide a proof-of-concept assessment of a robust system for large-scale in-home testing using consumer products that combines psychophysiological measures and VR, here referred to as a Virtual Lab. For the first time, this method is validated by correlating autonomic responses, skin conductance response (SCR), and pupillary dilation, in response to a spider, a beetle, and a ball using commercially available VR. Participants demonstrated greater SCR and pupillary responses to the spider, and the effect was dependent on the proximity of the stimuli to the participant, with a stronger response when the spider was close to the virtual self. We replicated these effects across two experiments and in separate physical room contexts to mimic variability in home environment. Together, these findings demonstrate the utility of pupil dilation as a marker of autonomic arousal and the feasibility to assess this in commercially available VR hardware and support a robust Virtual Lab tool for massive remote testing.

  3. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend towards large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  4. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind, and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small lengthscales), this system of equations converges to the usual incompressible equations, and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. Inhomogeneous nearly
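
    The central scaling contrast stated in this abstract can be restated compactly, with M_t denoting the turbulent Mach number (a restatement of the abstract only, not a derivation):

```latex
\[
\frac{\delta\rho}{\rho_0} \sim
\begin{cases}
\mathcal{O}(M_t^{2}), & \text{homogeneous nearly incompressible theory,}\\
\mathcal{O}(M_t),     & \text{with large-scale (radially symmetric) inhomogeneity,}
\end{cases}
\qquad M_t \ll 1 .
\]
```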

  5. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control system then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  6. Assessment and improvement of statistical tools for comparative proteomics analysis of sparse data sets with few experimental replicates

    DEFF Research Database (Denmark)

    Schwämmle, Veit; León, Ileana R.; Jensen, Ole Nørregaard

    2013-01-01

    Large-scale quantitative analyses of biological systems are often performed with few replicate experiments, leading to multiple nonidentical data sets due to missing values. For example, mass spectrometry driven proteomics experiments are frequently performed with few biological or technical...... replicates due to sample-scarcity or due to duty-cycle or sensitivity constraints, or limited capacity of the available instrumentation, leading to incomplete results where detection of significant feature changes becomes a challenge. This problem is further exacerbated for the detection of significant...... as a novel and optimal way to detect significantly changing features in these data sets. This approach is suitable for large quantitative data sets from stable isotope labeling and mass spectrometry experiments and should be applicable to large data sets of any type. An R script that implements the improved...
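
    As a concrete baseline for the missing-value problem described above, the sketch below runs per-feature Welch t-tests on replicate matrices containing NaNs and skips features with too few observed replicates. This is not the authors' improved rank-combination approach; the data are invented for illustration.

```python
import numpy as np
from scipy import stats

def test_features(group_a: np.ndarray, group_b: np.ndarray, min_obs: int = 2):
    """Per-feature Welch t-tests on (features x replicates) matrices that may
    contain NaNs for missing values; features with too few observed replicates
    in either group are skipped (returned as NaN) rather than tested."""
    pvals = np.full(group_a.shape[0], np.nan)
    for i in range(group_a.shape[0]):
        a = group_a[i][~np.isnan(group_a[i])]
        b = group_b[i][~np.isnan(group_b[i])]
        if len(a) >= min_obs and len(b) >= min_obs:
            pvals[i] = stats.ttest_ind(a, b, equal_var=False).pvalue
    return pvals

# Illustrative data: 4 features, 3 replicates per condition, with missing values.
nan = np.nan
A = np.array([[10.1, 10.3, nan], [5.0, 5.2, 5.1], [8.0, nan, nan], [7.0, 7.1, 6.9]])
B = np.array([[12.0, 11.8, 12.1], [5.1, 5.0, 5.2], [9.5, 9.4, nan], [7.2, 7.0, 7.1]])
print(test_features(A, B))
```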

  7. Surface topography analysis for dimensional quality control of replication at the micrometre scale

    DEFF Research Database (Denmark)

    Balcon, M.; Marinello, F.; Tosello, Guido

    2011-01-01

    Replication of geometrical features and surfaces is present at different production levels, from realization of moulds to the final product. Geometrical features must be reproduced within specification limits to ensure product functionality. In order to control the replication quality, mould...... and replica surfaces must be quantitatively analysed and compared. In the present work, reference simulated surfaces were considered and studied in order to evaluate the effectiveness and traceability of different analysis tools for replication quality control. Topographies were analysed simulating different
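
    One simple quantitative comparison of mould and replica topographies is an amplitude-transfer ratio based on the areal roughness parameter Sa. The sketch below applies it to simulated height maps in the spirit of the reference surfaces mentioned above; the specific parameters and procedure used in the paper may differ.

```python
import numpy as np

def Sa(z: np.ndarray) -> float:
    """Areal arithmetic-mean roughness Sa: mean absolute deviation from the mean plane."""
    return float(np.mean(np.abs(z - z.mean())))

def replication_fidelity(mould: np.ndarray, replica: np.ndarray) -> float:
    """A simple replication-quality indicator: ratio of replica to mould Sa
    (1.0 would mean the surface amplitude is fully transferred)."""
    return Sa(replica) / Sa(mould)

# Simulated topographies standing in for reference surfaces (heights in micrometres).
rng = np.random.default_rng(1)
mould = rng.normal(0.0, 0.5, size=(256, 256))
replica = 0.85 * mould + rng.normal(0.0, 0.02, size=(256, 256))   # partial transfer + noise
print(f"replication fidelity ≈ {replication_fidelity(mould, replica):.2f}")
```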

  8. Atmospheric Rivers across Multi-scales of the Hydrologic cycle

    Science.gov (United States)

    Hu, H.

    2017-12-01

    Atmospheric Rivers (ARs) are defined as filamentary structures with strong water vapor transport in the atmosphere, moving as much water as is discharged by the Amazon River. As a large-scale phenomenon, ARs are embedded in the planetary-scale Rossby waves and account for the majority of poleward moisture transport in the midlatitudes. On the other hand, the AR is the fundamental physical mechanism leading to extreme basin-scale precipitation and flooding over the U.S. West Coast in the winter season. The moisture transported by ARs is forced to rise and generate precipitation when it impinges on the mountainous coastal lands. My goal is to build the connection between the multi-scale features associated with ARs and their impacts on local hydrology, with particular focus on the U.S. West Coast. Moving across the different scales I have: (1) examined the planetary-scale dynamics in the upper troposphere, and established a robust relationship between the two regimes of Rossby wave breaking and AR-precipitation and streamflow along the West Coast; (2) quantified the contribution from the tropics/subtropics to AR-related precipitation intensity and found a significant modulation from the large-scale thermodynamics; (3) developed a water tracer tool in a land surface model to track the lifecycle of the water collected from AR precipitation over the terrestrial system, so that the role of catchment-scale factors in modulating ARs' hydrological consequences could be examined. Ultimately, the information gathered from these studies will indicate how dynamic and thermodynamic changes in response to climate change could affect local flooding and water resources, which will be helpful in decision making.

  9. Planetary Defense

    Science.gov (United States)

    2016-05-01

    Planetary defense against asteroids should be a major concern for every government in the world. Millions of asteroids and... helps make Planetary Defense viable because defending the Earth against asteroids benefits from all the above technologies. So if our planet security... information about their physical characteristics so we can employ the right strategies. It is a crucial difference if asteroids are made up of metal

  10. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  11. Bioremediation at a global scale: from the test tube to planet Earth.

    Science.gov (United States)

    de Lorenzo, Víctor; Marlière, Philippe; Solé, Ricard

    2016-09-01

    Planet Earth's biosphere has evolved over billions of years as a balanced bio-geological system, ultimately sustained by the power of the Sun and the large-scale cycling of elements largely run by the global environmental microbiome. Humans have been part of this picture for much of their existence. But the industrial revolution started in the XIX century, and the subsequent advances in medicine, chemistry, agriculture and communications have impacted such balances to an unprecedented degree, a problem that has only been exacerbated in the last 20 years. Human overpopulation and industrial growth, along with unsustainable use of natural resources, have driven many sites, and perhaps the planetary ecosystem as a whole, beyond recovery by spontaneous natural means, even if the immediate causes could be stopped. The most conspicuous indications of such a state of affairs include the massive change in land use, the accelerated increase in the levels of greenhouse gases, the frequent natural disasters associated with climate change and the growing non-recyclable waste (e.g. plastics and recalcitrant chemicals) that we release to the environment. While the whole planet is afflicted at a global scale by chemical pollution and anthropogenic emissions, the ongoing development of systems and synthetic biology, metagenomics, modern chemistry and some key concepts from ecological theory allow us to tackle this phenomenal challenge and propose large-scale interventions aimed at reversing and even improving the situation. This involves (i) identification of key reactions or processes that need to be re-established (or altogether created) for ecosystem reinstallation, (ii) implementation of such reactions in natural or designer hosts able to self-replicate and deliver the corresponding activities when/where needed in a fashion guided by sound ecological modelling, (iii) dispersal of niche-creating agents at a global scale and (iv) containment, monitoring and risk assessment of the whole process

  12. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  13. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.
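
    The spectral-slope diagnostics quoted in the two records above can be illustrated on synthetic data: the sketch below builds a 2D field with a prescribed k−3 energy spectrum, computes a shell-averaged spectrum and fits its power-law slope. The field is synthetic and the fitting range is arbitrary, so this only illustrates the diagnostic, not the simulations of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 256

# Synthetic 2D field with |u_k| ~ k^-2, i.e. a shell-summed spectrum E(k) ~ k^-3.
kx = np.fft.fftfreq(N) * N
KX, KY = np.meshgrid(kx, kx, indexing="ij")
K = np.sqrt(KX**2 + KY**2)
amp = np.zeros_like(K)
amp[K > 0] = K[K > 0] ** -2.0
field_hat = amp * np.exp(2j * np.pi * rng.random((N, N)))   # random phases
field = np.fft.ifft2(field_hat).real

# Shell-averaged 1D energy spectrum E(k) of the real field.
f_hat = np.fft.fft2(field)
energy2d = 0.5 * np.abs(f_hat) ** 2
kbins = np.arange(1, N // 2)
E = np.array([energy2d[(K >= k) & (K < k + 1)].sum() for k in kbins])

# Least-squares power-law fit over an inertial-like range (range chosen arbitrarily).
sel = (kbins >= 5) & (kbins <= 60)
slope = np.polyfit(np.log(kbins[sel]), np.log(E[sel]), 1)[0]
print(f"fitted spectral slope ≈ {slope:.2f}")   # expected near -3, up to sampling noise
```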

  14. Europlanet Research Infrastructure: Planetary Simulation Facilities

    Science.gov (United States)

    Davies, G. R.; Mason, N. J.; Green, S.; Gómez, F.; Prieto, O.; Helbert, J.; Colangeli, L.; Srama, R.; Grande, M.; Merrison, J.

    2008-09-01

    EuroPlanet The Europlanet Research Infrastructure consortium, funded under FP7, aims to provide the EU Planetary Science community greater access to research infrastructure. A series of networking and outreach initiatives will be complemented by joint research activities and the formation of three Trans-National Access distributed service laboratories (TNAs) to provide a unique and comprehensive set of analogue field sites, laboratory simulation facilities, and extraterrestrial sample analysis tools. Here we report on the infrastructure that comprises the second TNA: Planetary Simulation Facilities. Eleven laboratory-based facilities are able to recreate the conditions found in the atmospheres and on the surfaces of planetary systems, with specific emphasis on Martian, Titan and Europa analogues. The strategy has been to offer some overlap in capabilities to ensure access to the highest number of users and to allow for progressive and efficient development strategies; for example, initial testing of mobility capability prior to the stepwise development within planetary atmospheres that can be made progressively more hostile through the introduction of extreme temperatures, radiation, wind and dust. Europlanet Research Infrastructure Facilities: Mars atmosphere simulation chambers at VUA and OU These relatively large chambers (up to 1 x 0.5 x 0.5 m) simulate Martian atmospheric conditions, and the dual cooling options at VUA allow stabilised instrument temperatures while the remainder of the sample chamber can be varied between 220 K and 350 K. Researchers can therefore assess analytical protocols for instruments operating on Mars, e.g. the effect of pCO2, temperature and material (e.g., ± ice) on spectroscopic and laser ablation techniques, while monitoring the performance of detection technologies such as CCDs at low T and variable pH2O and pCO2. Titan atmosphere and surface simulation chamber at OU The chamber simulates Titan's atmospheric composition under a range of

  15. Updating the planetary time scale: focus on Mars

    Science.gov (United States)

    Tanaka, Kenneth L.; Quantin-Nataf, Cathy

    2013-01-01

    Formal stratigraphic systems have been developed for the surface materials of the Moon, Mars, Mercury, and the Galilean satellite Ganymede. These systems are based on geologic mapping, which establishes relative ages of surfaces delineated by superposition, morphology, impact crater densities, and other relations and features. Referent units selected from the mapping determine time-stratigraphic bases and/or representative materials characteristic of events and periods for definition of chronologic units. Absolute ages of these units in some cases can be estimated using crater size-frequency data. For the Moon, the chronologic units and cratering record are calibrated by radiometric ages measured from samples collected from the lunar surface. Model ages for other cratered planetary surfaces are constructed primarily by estimating cratering rates relative to that of the Moon. Other cratered bodies with estimated surface ages include Venus and the Galilean satellites of Jupiter. New global geologic mapping and crater dating studies of Mars are resulting in more accurate and detailed reconstructions of its geologic history.

  16. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4 · 10^4, and the radius ratio η = ri/ro is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = −0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of cs = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase with increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, "over-damped" LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
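    For context, the static Smagorinsky closure referred to above models the subgrid-scale stresses with an eddy viscosity built from the resolved strain rate; the standard form is stated below for reference rather than quoted from the record, with "over-damped" LES corresponding simply to a larger constant C_s.

```latex
% Standard (static) Smagorinsky subgrid model, with filter width \Delta and
% model constant C_s (the study uses C_s = 0.1 for the static runs):
\nu_t = (C_s \Delta)^2 \, |\bar{S}|, \qquad
|\bar{S}| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}, \qquad
\bar{S}_{ij} = \tfrac{1}{2}\left(\partial_j \bar{u}_i + \partial_i \bar{u}_j\right)
```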

  17. Solar planetary systems stardust to terrestrial and extraterrestrial planetary sciences

    CERN Document Server

    Bhattacharya, Asit B

    2017-01-01

    The authors have put forth great effort in gathering present-day knowledge about different objects within our solar system and universe. This book features current information on the subject, acquired from noted scientists in this area. The main objective is to convey the importance of the subject and provide detailed information on the physical makeup of our planetary system and the technologies used for research. Information on educational projects has also been included in the Radio Astronomy chapters. This information is a real plus for students and educators considering a career in Planetary Science or looking to increase their knowledge about our planetary system.

  18. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on a large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g⁻¹ after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on a large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on a large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  19. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using algorithms specific to the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interaction were each solved with the GPU implementation to test the performance of the package. A comparison of the calculation results between the solver executed on a single CPU and the one on the GPU shows that the GPU version is up to 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
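    To make the scheme concrete, the sketch below implements a semi-implicit Fourier-spectral time step for the Allen-Cahn equation on a CPU with NumPy; it illustrates the split between an explicit nonlinear term and an implicit linear term that the GPU package accelerates. All parameter values and function names are illustrative and are not taken from the paper.

```python
import numpy as np

# Minimal semi-implicit Fourier-spectral solver for the Allen-Cahn equation
#   d(phi)/dt = -M * ( f'(phi) - kappa * laplacian(phi) ),  f(phi) = (phi^2 - 1)^2 / 4
# The stiff linear term is treated implicitly in Fourier space, the nonlinear
# term explicitly; this mirrors the type of scheme a GPU package would accelerate.

def allen_cahn_step(phi, dt, M=1.0, kappa=0.5):
    n = phi.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n)           # angular wavenumbers (unit grid spacing assumed)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    fprime = phi**3 - phi                         # derivative of the double-well potential
    phi_hat = np.fft.fft2(phi) - dt * M * np.fft.fft2(fprime)
    phi_hat /= 1.0 + dt * M * kappa * k2          # implicit treatment of the gradient (Laplacian) term
    return np.real(np.fft.ifft2(phi_hat))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    phi = 0.01 * rng.standard_normal((128, 128))  # small random initial condition
    for _ in range(200):
        phi = allen_cahn_step(phi, dt=0.1)
    print("phi range after 200 steps:", phi.min(), phi.max())
```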

  20. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.

  1. A Distributed OpenCL Framework using Redundant Computation and Data Replication

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Junghyun [Seoul National University, Korea]; Gangwon, Jo [Seoul National University, Korea]; Jaehoon, Jung [Seoul National University, Korea]; Lee, Jaejin [Seoul National University, Korea]

    2016-01-01

    Applications written solely in OpenCL or CUDA cannot execute on a cluster as a whole. Most previous approaches that extend these programming models to clusters are based on a common idea: designating a centralized host node and coordinating the other nodes with the host for computation. However, the centralized host node is a serious performance bottleneck when the number of nodes is large. In this paper, we propose a scalable and distributed OpenCL framework called SnuCL-D for large-scale clusters. SnuCL-D's remote device virtualization provides an OpenCL application with an illusion that all compute devices in a cluster are confined in a single node. To reduce the amount of control-message and data communication between nodes, SnuCL-D replicates the OpenCL host program execution and data in each node. We also propose a new OpenCL host API function and a queueing optimization technique that significantly reduce the overhead incurred by the previous centralized approaches. To show the effectiveness of SnuCL-D, we evaluate SnuCL-D with a microbenchmark and eleven benchmark applications on a large-scale CPU cluster and a medium-scale GPU cluster.

  2. Exploring the planetary boundary for chemical pollution.

    Science.gov (United States)

    Diamond, Miriam L; de Wit, Cynthia A; Molander, Sverker; Scheringer, Martin; Backhaus, Thomas; Lohmann, Rainer; Arvidsson, Rickard; Bergman, Åke; Hauschild, Michael; Holoubek, Ivan; Persson, Linn; Suzuki, Noriyuki; Vighi, Marco; Zetzsch, Cornelius

    2015-05-01

    Rockström et al. (2009a, 2009b) have warned that humanity must reduce anthropogenic impacts defined by nine planetary boundaries if "unacceptable global change" is to be avoided. Chemical pollution was identified as one of those boundaries for which continued impacts could erode the resilience of ecosystems and humanity. The central concept of the planetary boundary (or boundaries) for chemical pollution (PBCP or PBCPs) is that the Earth has a finite assimilative capacity for chemical pollution, which includes persistent, as well as readily degradable chemicals released at local to regional scales, which in aggregate threaten ecosystem and human viability. The PBCP allows humanity to explicitly address the increasingly global aspects of chemical pollution throughout a chemical's life cycle and the need for a global response of internationally coordinated control measures. We submit that sufficient evidence shows stresses on ecosystem and human health at local to global scales, suggesting that conditions are transgressing the safe operating space delimited by a PBCP. As such, current local to global pollution control measures are insufficient. However, while the PBCP is an important conceptual step forward, at this point single or multiple PBCPs are challenging to operationalize due to the extremely large number of commercial chemicals or mixtures of chemicals that cause myriad adverse effects to innumerable species and ecosystems, and the complex linkages between emissions, environmental concentrations, exposures and adverse effects. As well, the normative nature of a PBCP presents challenges of negotiating pollution limits amongst societal groups with differing viewpoints. Thus, a combination of approaches is recommended as follows: develop indicators of chemical pollution, for both control and response variables, that will aid in quantifying a PBCP(s) and gauging progress towards reducing chemical pollution; develop new technologies and technical and social

  3. Annual review of earth and planetary sciences. Volume 16

    International Nuclear Information System (INIS)

    Wetherill, G.W.; Albee, A.L.; Stehli, F.G.

    1988-01-01

    Various papers on earth and planetary science topics are presented. The subjects addressed include: role and status of earth science field work; phase relations of prealuminous granitic rocks and their petrogenetic implications; chondritic meteorites and the solar nebula; volcanic winters; mass wasting on continental margins; earthquake ground motions; ore deposits as guides to geologic history of the earth; geology of high-level nuclear waste disposal; and tectonic evolution of the Caribbean. Also discussed are: the earth's rotation; the geophysics of a restless caldera (Long Valley, California); observations of cometary nuclei; geology of Venus; seismic stratigraphy; in situ-produced cosmogenic isotopes in terrestrial rocks; time variations of the earth's magnetic field; deep slabs, geochemical heterogeneity, and the large-scale structure of mantle convection; early proterozoic assembly and growth of Laurentia; concepts and methods of high-resolution event stratigraphy

  4. Thermal power generation projects "Large Scale Solar Heating"; EU-Thermie-Projekte "Large Scale Solar Heating"

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the "Large Scale Solar Heating" programme for the Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the experts but was not immediately (1996) accepted for funding. In November 1997 the EU Commission provided 1.5 million ECU, which allowed an updated project proposal to be realised. A smaller project had already been approved by mid-1997; it had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves technology transfer. (orig.)

  5. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling such huge amounts of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
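    As an illustration of the pipeline stages named above (feature representation, indexing, searching), the following toy sketch indexes random stand-in feature vectors with sign random projections and re-ranks a shortlist by cosine similarity. The feature extractor, hashing scheme and dataset sizes are placeholders chosen for illustration, not methods drawn from the review.

```python
import numpy as np

# Toy sketch of a large-scale retrieval pipeline:
# (1) feature representation, (2) indexing, (3) searching with re-ranking.

rng = np.random.default_rng(42)
database = rng.standard_normal((10_000, 512))   # stand-in for features of archived images
query = rng.standard_normal(512)                # stand-in for the query image's feature

# Indexing: sign random projections (a simple locality-sensitive hash).
planes = rng.standard_normal((64, 512))
db_codes = (database @ planes.T > 0)            # 64-bit binary code per image
q_code = (planes @ query > 0)

# Searching: Hamming distance on codes to shortlist, then exact cosine re-ranking.
hamming = (db_codes != q_code).sum(axis=1)
shortlist = np.argsort(hamming)[:100]
cand = database[shortlist]
cos = (cand @ query) / (np.linalg.norm(cand, axis=1) * np.linalg.norm(query))
top10 = shortlist[np.argsort(-cos)[:10]]
print("top-10 candidate ids:", top10)
```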

  6. Exploring the planetary boundary for chemical pollution

    DEFF Research Database (Denmark)

    Diamond, Miriam L.; de Wit, Cynthia A.; Molander, Sverker

    2015-01-01

    Rockström et al. (2009a, 2009b) have warned that humanity must reduce anthropogenic impacts defined by nine planetary boundaries if "unacceptable global change" is to be avoided. Chemical pollution was identified as one of those boundaries for which continued impacts could erode the resilience of ecosystems and humanity. The central concept of the planetary boundary (or boundaries) for chemical pollution (PBCP or PBCPs) is that the Earth has a finite assimilative capacity for chemical pollution, which includes persistent, as well as readily degradable chemicals released at local to regional scales, which in aggregate threaten ecosystem and human viability. The PBCP allows humanity to explicitly address the increasingly global aspects of chemical pollution throughout a chemical's life cycle and the need for a global response of internationally coordinated control measures. We submit that sufficient...

  7. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  8. Effect of rotation on fingering convection in stellar and planetary interiors

    Science.gov (United States)

    Sengupta, Sutirtha; Garaud, Pascale

    2018-01-01

    We study the effects of global rotation on the growth and saturation of the fingering (double-diffusive) instability at low Prandtl numbers, and estimate the compositional transport rates as a function of the relevant non-dimensional parameters, namely the Taylor number Ta* (defined in terms of the rotation rate Ω, the thermal diffusivity κ_T and the associated finger length scale d) and the density ratio, through direct numerical simulations. Within our explored range of parameters, we find rotation to have very little effect on vertical transport, apart from an exceptional case where a cyclonic large-scale vortex (LSV) is observed at low density ratio and fairly high Taylor number. The LSV leads to a significant enhancement of the fingering transport rates by concentrating high-composition fluid at its core, which moves downward. The formation of such LSVs is of particular interest for solving the missing-mixing problem in the astrophysical context of RGB stars, though the parameter regime in which we observe the emergence of this LSV seems to be quite far from the stellar scenario. However, understanding the basic mechanism driving such large-scale structures, as observed frequently in polar regions of planets (e.g. those seen by Juno near the poles of Jupiter), is important in general for studies of rotating turbulence and its applications to stellar and planetary interiors, and will be investigated in further detail in a forthcoming work.
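    The record does not spell out the definition of the modified Taylor number, but on dimensional grounds a Taylor-number-like parameter built from the three quantities it names (Ω, κ_T and the finger scale d) must take the form below, up to an order-one prefactor; treat this as a hedged reconstruction rather than the paper's exact definition.

```latex
% Hedged dimensional-analysis reconstruction (prefactor not specified in the abstract):
\mathrm{Ta}^{*} \;\propto\; \frac{\Omega^{2}\, d^{4}}{\kappa_T^{2}}
```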

  9. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we propose the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
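    The low-rank kernel approximation that a small set of prototypes makes possible can be sketched generically as follows; this is an illustrative Nyström-style construction with arbitrary data and parameters, not code or notation from the paper.

```python
import numpy as np

# Nystrom-style low-rank kernel approximation through a small set of "prototype"
# points, the core trick behind prototype-based scalable SSL. Illustrative only.

def rbf(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(0)
X = rng.standard_normal((5000, 10))                      # labeled + unlabeled points
prototypes = X[rng.choice(len(X), 50, replace=False)]    # m << n prototype vectors

K_nm = rbf(X, prototypes)                                # n x m cross-kernel
K_mm = rbf(prototypes, prototypes)                       # m x m prototype kernel

# The full n x n kernel is approximated as K ~= K_nm K_mm^{-1} K_nm^T without ever
# forming it; downstream graph regularization then works on the low-rank factor L.
C = np.linalg.cholesky(np.linalg.inv(K_mm + 1e-8 * np.eye(50)))
L = K_nm @ C                                             # L @ L.T approximates K
print("low-rank factor shape:", L.shape)                 # (5000, 50) instead of (5000, 5000)
```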

  10. Lack of association between digit ratio (2D:4D) and assertiveness: replication in a large sample.

    Science.gov (United States)

    Voracek, Martin

    2009-12-01

    Findings regarding within-sex associations of digit ratio (2D:4D), a putative pointer to long-lasting effects of prenatal androgen action, and sexually differentiated personality traits have generally been inconsistent or unreplicable, suggesting that effects in this domain, if any, are likely small. In contrast to evidence from Wilson's important 1983 study, a forerunner of modern 2D:4D research, two recent studies in 2005 and 2008 by Freeman, et al. and Hampson, et al. showed assertiveness, a presumably male-typed personality trait, was not associated with 2D:4D; however, these studies were clearly statistically underpowered. Hence this study examined this question anew, based on a large sample of 491 men and 627 women. Assertiveness was only modestly sexually differentiated, favoring men, and a positive correlate of age and education and a negative correlate of weight and Body Mass Index among women, but not men. Replicating the two prior studies, 2D:4D was throughout unrelated to assertiveness scores. This null finding was preserved with controls for correlates of assertiveness, also in nonparametric analysis and with tests for curvilinear relations. Discussed are implications of this specific null finding, now replicated in a large sample, for studies of 2D:4D and personality in general and novel research approaches to proceed in this field.

  11. G25.5 + 0.2: a very young supernova remnant or a galactic planetary nebula?

    International Nuclear Information System (INIS)

    White, R.L.; Becker, R.H.

    1990-01-01

    G25.5 + 0.2, a radio source suggested by previous authors to be a very young galactic supernova remnant, is more likely to be a planetary nebula. Its IRAS colours and fluxes and its radio spectrum and morphology are all consistent with the properties of planetary nebulae; its radio flux and distance imply a large mass of ionized gas, which is expected from a Type I planetary nebula lying in the galactic plane. We suggest some definitive observations which should be able to determine whether this interesting object is a planetary nebula or a supernova remnant. (author)

  12. The final fate of planetary systems

    Science.gov (United States)

    Gaensicke, Boris

    2015-12-01

    The discovery of the first extra-solar planet around a main-sequence star in 1995 has changed the way we think about the Universe: our solar system is not unique. Twenty years later, we know that planetary systems are ubiquitous, orbit stars spanning a wide range in mass, and form in an astonishing variety of architectures. Yet, one fascinating aspect of planetary systems has received relatively little attention so far: their ultimate fate. Most planet hosts will eventually evolve into white dwarfs, Earth-sized stellar embers, and the outer parts of their planetary systems (in the solar system, Mars and beyond) can survive largely intact for billions of years. While scattered and tidally disrupted planetesimals are directly detected at a small number of white dwarfs in the form of infrared excess, the most powerful probe for detecting evolved planetary systems is metal pollution of the otherwise pristine H/He atmospheres. I will present the results of a multi-cycle HST survey that has obtained COS observations of 136 white dwarfs. These ultraviolet spectra are exquisitely sensitive to the presence of metals contaminating the white dwarf atmosphere. Our sophisticated model atmosphere analysis demonstrates that at least 27% of all targets are currently accreting planetary debris, and an additional 29% have very likely done so in the past. These numbers suggest that planet formation around A-stars (the dominant progenitors of today's white dwarf population) is similarly efficient as around FGK stars. In addition to post-main-sequence planetary system demographics, spectroscopy of the debris-polluted white dwarf atmospheres provides a direct window into the bulk composition of exo-planetesimals, analogous to the way we use meteorites to determine solar-system abundances. Our ultraviolet spectroscopy is particularly sensitive to the detection of Si, a dominant rock-forming species, and we identify up to ten additional volatile and refractory elements in the most strongly

  13. Same but Different. Scaling through partnership replication: The case of the Sustainable Development Programme for Coffee Growing Families in Nariño

    NARCIS (Netherlands)

    S.M. Pfisterer (Stella); N. Payandeh (Nasim)

    2014-01-01

    Cross-sector development partnerships (CSDPs) often start as 'pilots', testing the potential of the partnering approach to deliver desired results within a pre-defined time. As projects, CSDPs have potential for replication because they test innovative approaches which can be scaled

  14. New and misclassified planetary nebulae

    International Nuclear Information System (INIS)

    Kohoutek, L.

    1978-01-01

    Since the 'Catalogue of Galactic Planetary Nebulae', 226 new objects have been classified as planetary nebulae. They are summarized in the form of designations, names, coordinates and references to their discovery. A further 9 new objects have been added and called 'proto-planetary nebulae', but their status is still uncertain. Only 34 objects have been included in the present list of misclassified planetary nebulae, although the number of doubtful cases is much larger. (Auth.)

  15. The Landscape Evolution Observatory: a large-scale controllable infrastructure to study coupled Earth-surface processes

    Science.gov (United States)

    Pangle, Luke A.; DeLong, Stephen B.; Abramson, Nate; Adams, John; Barron-Gafford, Greg A.; Breshears, David D.; Brooks, Paul D.; Chorover, Jon; Dietrich, William E.; Dontsova, Katerina; Durcik, Matej; Espeleta, Javier; Ferré, T.P.A.; Ferriere, Regis; Henderson, Whitney; Hunt, Edward A.; Huxman, Travis E.; Millar, David; Murphy, Brendan; Niu, Guo-Yue; Pavao-Zuckerman, Mitch; Pelletier, Jon D.; Rasmussen, Craig; Ruiz, Joaquin; Saleska, Scott; Schaap, Marcel; Sibayan, Michael; Troch, Peter A.; Tuller, Markus; van Haren, Joost; Zeng, Xubin

    2015-01-01

    Zero-order drainage basins, and their constituent hillslopes, are the fundamental geomorphic unit comprising much of Earth's uplands. The convergent topography of these landscapes generates spatially variable substrate and moisture content, facilitating biological diversity and influencing how the landscape filters precipitation and sequesters atmospheric carbon dioxide. In light of these significant ecosystem services, refining our understanding of how these functions are affected by landscape evolution, weather variability, and long-term climate change is imperative. In this paper we introduce the Landscape Evolution Observatory (LEO): a large-scale controllable infrastructure consisting of three replicated artificial landscapes (each 330 m² surface area) within the climate-controlled Biosphere 2 facility in Arizona, USA. At LEO, experimental manipulation of rainfall, air temperature, relative humidity, and wind speed are possible at unprecedented scale. The Landscape Evolution Observatory was designed as a community resource to advance understanding of how topography, physical and chemical properties of soil, and biological communities coevolve, and how this coevolution affects water, carbon, and energy cycles at multiple spatial scales. With well-defined boundary conditions and an extensive network of sensors and samplers, LEO enables an iterative scientific approach that includes numerical model development and virtual experimentation, physical experimentation, data analysis, and model refinement. We plan to engage the broader scientific community through public dissemination of data from LEO, collaborative experimental design, and community-based model development.

  16. The Impact of Teacher Study Groups in Vocabulary on Teaching Practice, Teacher Knowledge, and Student Vocabulary Knowledge: A Large-Scale Replication Study

    Science.gov (United States)

    Jayanthi, Madhavi; Dimino, Joseph; Gersten, Russell; Taylor, Mary Jo; Haymond, Kelly; Smolkowski, Keith; Newman-Gonchar, Rebecca

    2018-01-01

    The purpose of this replication study was to examine the impact of the Teacher Study Group (TSG) professional development in vocabulary on first-grade teachers' knowledge of vocabulary instruction and observed teaching practice, and on students' vocabulary knowledge. Sixty-two schools from 16 districts in four states were randomly assigned to…

  17. Dust in planetary nebulae

    International Nuclear Information System (INIS)

    Kwok, S.

    1980-01-01

    A two-component dust model is suggested to explain the infrared emission from planetary nebulae. A cold dust component located in the extensive remnant of the red-giant envelope exterior to the visible nebula is responsible for the far-infrared emission. A warm dust component, which condenses after the formation of the planetary nebula and is confined within the ionized gas shell, emits most of the near- and mid-infrared radiation. The observations of NGC 7027 are shown to be consistent with such a model. The correlation of silicate emission in several planetary nebulae with an approximately +1 spectral index at low radio frequencies suggests that both the silicate and radio emissions originate from the remnant of the circumstellar envelope of the precursor star and are observable only while the planetary nebula is young. It is argued that oxygen-rich stars as well as carbon-rich stars can be progenitors of planetary nebulae.

  18. Modulation of surface meteorological parameters by extratropical planetary-scale Rossby waves

    Directory of Open Access Journals (Sweden)

    K. Niranjan Kumar

    2016-01-01

    This study examines the link between upper-tropospheric planetary-scale Rossby waves and surface meteorological parameters, based on observations made in association with the Ganges Valley Aerosol Experiment (GVAX) campaign at an extratropical site at the Aryabhatta Research Institute of Observational Sciences, Nainital (29.45° N, 79.5° E), during November–December 2011. Spectral analysis of the tropospheric wind field from radiosonde measurements indicates predominant spectral power at periods of around 8 days in the upper troposphere during the observational period. An analysis of the 200 hPa meridional wind (v200 hPa) anomalies from the Modern-Era Retrospective Analysis for Research and Applications (MERRA) reanalysis shows distinct Rossby-wave-like structures over this high-altitude site in the central Himalayan region. Furthermore, spectral analysis of the global v200 hPa anomalies indicates that the Rossby waves are characterized by zonal wavenumber 6. The amplification of the Rossby wave packets over the site leads to persistent subtropical jet stream (STJ) patterns, which further affect the surface weather conditions. The propagating Rossby waves in the upper troposphere, along with the undulations in the STJ, create convergence and divergence regions in the mid-troposphere. Therefore, surface meteorological parameters such as relative humidity, wind speed, and temperature are synchronized with the phase of the propagating Rossby waves. Moreover, the present study has important implications for medium-range forecasting through the upper-level Rossby waves over the study region.
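    The zonal-wavenumber decomposition mentioned above amounts to a Fourier transform around a latitude circle; the toy sketch below recovers the dominant wavenumber from a synthetic wave-6 signal standing in for the v200 hPa anomalies (grid size, amplitude and noise level are illustrative, not values from the study).

```python
import numpy as np

# Illustrative zonal-wavenumber spectrum of a latitude circle of meridional wind,
# the kind of analysis used to identify a wavenumber-6 Rossby wave pattern.

nlon = 144                                         # e.g. a 2.5-degree longitude grid
lon = np.linspace(0.0, 2.0 * np.pi, nlon, endpoint=False)
v200 = 8.0 * np.cos(6 * lon + 0.3) + np.random.default_rng(1).normal(0, 1, nlon)

power = np.abs(np.fft.rfft(v200)) ** 2 / nlon      # power per zonal wavenumber
dominant = int(np.argmax(power[1:]) + 1)           # skip the zonal mean (wavenumber 0)
print("dominant zonal wavenumber:", dominant)      # expected: 6
```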

  19. Large scale dynamics of protoplanetary discs

    Science.gov (United States)

    Béthune, William

    2017-08-01

    Planets form in the gaseous and dusty disks orbiting young stars. These protoplanetary disks are dispersed in a few million years, being accreted onto the central star or evaporated into the interstellar medium. To explain the observed accretion rates, it is commonly assumed that matter is transported through the disk by turbulence, although the mechanism sustaining turbulence is uncertain. On the other hand, irradiation by the central star could heat up the disk surface and trigger a photoevaporative wind, but thermal effects cannot account for the observed acceleration and collimation of the wind into a narrow jet perpendicular to the disk plane. Both issues can be solved if the disk is sensitive to magnetic fields. Weak fields lead to the magnetorotational instability, whose outcome is a state of sustained turbulence. Strong fields can slow down the disk, causing it to accrete while launching a collimated wind. However, the coupling between the magnetic field and the disk gas is mediated by electric charges, each of which is outnumbered by several billion neutral molecules. The imperfect coupling between the magnetic field and the neutral gas is described in terms of "non-ideal" effects, introducing new dynamical behaviors. This thesis is devoted to the transport processes happening inside weakly ionized and weakly magnetized accretion disks; the role of microphysical effects on the large-scale dynamics of the disk is of primary importance. As a first step, I exclude the wind and examine the impact of non-ideal effects on the turbulent properties near the disk midplane. I show that the flow can spontaneously organize itself if the ionization fraction is low enough; in this case, accretion is halted and the disk exhibits axisymmetric structures, with possible consequences for planetary formation. As a second step, I study the launching of disk winds via a global model of a stratified disk embedded in a warm atmosphere. This model is the first to compute non-ideal effects from

  20. Symmetry of interactions rules in incompletely connected random replicator ecosystems.

    Science.gov (United States)

    Kärenlampi, Petri P

    2014-06-01

    The evolution of an incompletely connected system of species with speciation and extinction is investigated in terms of random replicators. It is found that evolving random replicator systems with speciation do become large and complex, depending on the speciation parameters. Antisymmetric interactions result in large systems, whereas systems with symmetric interactions remain small. A co-dominating feature is within-species interaction pressure: large within-species interaction increases species diversity. Average fitness evolves in all systems; however, symmetry and connectivity evolve in small systems only. Newcomers become extinct almost immediately in symmetric systems. The distribution of species lifetimes is determined for antisymmetric systems. The replicator systems investigated do not show any sign of self-organized criticality. The generalized Lotka-Volterra system is shown to be a tedious way of implementing the replicator system.
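    For readers unfamiliar with replicator dynamics, the sketch below integrates the standard replicator equation with a random antisymmetric interaction matrix and a uniform within-species interaction term; the matrix size, time step and diagonal value are illustrative choices rather than parameters from the paper, and the speciation/extinction events studied there are omitted.

```python
import numpy as np

# Minimal random replicator simulation (a sketch of the model class discussed above).

rng = np.random.default_rng(0)
n = 20
A = rng.standard_normal((n, n))
A = 0.5 * (A - A.T)                          # antisymmetric interaction matrix
np.fill_diagonal(A, -1.0)                    # illustrative within-species interaction pressure

x = np.full(n, 1.0 / n)                      # initial species frequencies
dt = 0.01
for _ in range(20000):
    fitness = A @ x
    mean_fitness = x @ fitness
    x += dt * x * (fitness - mean_fitness)   # replicator equation: dx_i/dt = x_i (f_i - <f>)
    x = np.clip(x, 0.0, None)
    x /= x.sum()

print("surviving species:", int((x > 1e-6).sum()), "of", n)
```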

  1. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical application, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. To solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this, we exploit the abundant training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques combined with RVM, demonstrating the performance of the proposed approaches on Spark. As a result, two approaches on Spark for different types of large-scale datasets are made available.
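    The ensemble wrapper referred to above is the classical Discrete AdaBoost loop; a generic sketch is given below with a depth-1 threshold "stump" standing in for the RVM weak learner. The Spark distribution layer and RVM training itself are omitted, and all parameters and data are illustrative.

```python
import numpy as np

# Generic Discrete AdaBoost with a threshold stump as the weak learner (illustrative).

def fit_stump(X, y, w):
    best = (0, 0.0, 1, np.inf)                      # (feature, threshold, polarity, weighted error)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(s * (X[:, j] - t) > 0, 1, -1)
                err = w[pred != y].sum()
                if err < best[3]:
                    best = (j, t, s, err)
    return best

def adaboost(X, y, rounds=10):
    w = np.full(len(y), 1.0 / len(y))
    ensemble = []
    for _ in range(rounds):
        j, t, s, err = fit_stump(X, y, w)
        err = max(err, 1e-12)
        alpha = 0.5 * np.log((1.0 - err) / err)     # weight of this weak learner
        pred = np.where(s * (X[:, j] - t) > 0, 1, -1)
        w *= np.exp(-alpha * y * pred)              # up-weight misclassified samples
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    score = sum(a * np.where(s * (X[:, j] - t) > 0, 1, -1) for a, j, t, s in ensemble)
    return np.sign(score)

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
model = adaboost(X, y)
print("training accuracy:", (predict(model, X) == y).mean())
```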

  2. Is Implicit Theory of Mind a Real and Robust Phenomenon? Results From a Systematic Replication Study.

    Science.gov (United States)

    Kulke, Louisa; von Duhn, Britta; Schneider, Dana; Rakoczy, Hannes

    2018-06-01

    Recently, theory-of-mind research has been revolutionized by findings from novel implicit tasks suggesting that at least some aspects of false-belief reasoning develop earlier in ontogeny than previously assumed and operate automatically throughout adulthood. Although these findings are the empirical basis for far-reaching theories, systematic replications are still missing. This article reports a preregistered large-scale attempt to replicate four influential anticipatory-looking implicit theory-of-mind tasks using original stimuli and procedures. Results showed that only one of the four paradigms was reliably replicated. A second set of studies revealed, further, that this one paradigm was no longer replicated once confounds were removed, which calls its validity into question. There were also no correlations between paradigms, and thus, no evidence for their convergent validity. In conclusion, findings from anticipatory-looking false-belief paradigms seem less reliable and valid than previously assumed, thus limiting the conclusions that can be drawn from them.

  3. UNSTABLE PLANETARY SYSTEMS EMERGING OUT OF GAS DISKS

    International Nuclear Information System (INIS)

    Matsumura, Soko; Thommes, Edward W.; Chatterjee, Sourav; Rasio, Frederic A.

    2010-01-01

    The discovery of over 400 extrasolar planets allows us to statistically test our understanding of the formation and dynamics of planetary systems via numerical simulations. Traditional N-body simulations of multiple-planet systems without gas disks have successfully reproduced the eccentricity (e) distribution of the observed systems by assuming that the planetary systems are relatively closely packed when the gas disk dissipates, so that they become dynamically unstable within the stellar lifetime. However, such studies cannot explain the small semimajor axes a of extrasolar planetary systems, if planets are formed, as the standard planet formation theory suggests, beyond the ice line. In this paper, we numerically study the evolution of three-planet systems in dissipating gas disks, and constrain the initial conditions that reproduce the observed a and e distributions simultaneously. We adopt initial conditions that are motivated by the standard planet formation theory, and self-consistently simulate the disk evolution and planet migration, by using a hybrid N-body and one-dimensional gas disk code. We also take into account eccentricity damping, and investigate the effect of saturation of corotation resonances on the evolution of planetary systems. We find that the a distribution is largely determined in a gas disk, while the e distribution is determined after the disk dissipation. We also find that there may be an optimum disk mass which leads to the observed a-e distribution. Our simulations generate a larger fraction of planetary systems trapped in mean-motion resonances (MMRs) than the observations, indicating that the disk's perturbation to the planetary orbits may be important to explain the observed rate of MMRs. We also find a much lower occurrence of planets on retrograde orbits than the current observations of close-in planets suggest.

  4. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. The traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework, and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
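    The overfitting problem mentioned above is easiest to see when the number of variables exceeds the number of samples; the sketch below contrasts the raw sample covariance with a simple shrinkage-to-diagonal estimator. It is only a minimal illustration of why regularizing large covariance estimates helps, not the paper's Bayesian hierarchical model.

```python
import numpy as np

# Simple shrinkage covariance estimator (illustration only; a basic stand-in for
# more principled regularization such as the Bayesian hierarchical approach above).

def shrunk_covariance(X, alpha=0.2):
    S = np.cov(X, rowvar=False)                    # p x p sample covariance
    target = np.diag(np.diag(S))                   # structured target: diagonal of S
    return (1.0 - alpha) * S + alpha * target

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 500))                 # n << p, the typical OMICS regime
S_hat = shrunk_covariance(X)
print("condition number, sample vs shrunk:",
      np.linalg.cond(np.cov(X, rowvar=False)), np.linalg.cond(S_hat))
```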

  5. River food web response to large-scale riparian zone manipulations.

    Directory of Open Access Journals (Sweden)

    J Timothy Wootton

    Full Text Available Conservation programs often focus on select species, leading to management plans based on the autecology of the focal species, but multiple ecosystem components can be affected both by the environmental factors impacting, and the management targeting, focal species. These broader effects can have indirect impacts on target species through the web of interactions within ecosystems. For example, human activity can strongly alter riparian vegetation, potentially impacting both economically-important salmonids and their associated river food web. In an Olympic Peninsula river, Washington state, USA, replicated large-scale riparian vegetation manipulations implemented with the long-term (>40 yr goal of improving salmon habitat did not affect water temperature, nutrient limitation or habitat characteristics, but reduced canopy cover, causing reduced energy input via leaf litter, increased incident solar radiation (UV and PAR and increased algal production compared to controls. In response, benthic algae, most insect taxa, and juvenile salmonids increased in manipulated areas. Stable isotope analysis revealed a predominant contribution of algal-derived energy to salmonid diets in manipulated reaches. The experiment demonstrates that riparian management targeting salmonids strongly affects river food webs via changes in the energy base, illustrates how species-based management strategies can have unanticipated indirect effects on the target species via the associated food web, and supports ecosystem-based management approaches for restoring depleted salmonid stocks.

  6. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  7. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  8. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction. Whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities which lead to the formation of large eddies is also key to understanding the behavior of large-scale fires. Here, modeling tools, including RANS- and LES-based computational fluid dynamics codes, can be effectively exploited to investigate the fluid flow phenomena. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively parallel computational resources are likely to be necessary in order to adequately address the complex coupled phenomena to the level of detail that is necessary.

  9. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  10. From red giants to planetary nebulae

    International Nuclear Information System (INIS)

    Kwok, S.

    1982-01-01

    The transition from red giants to planetary nebulae is studied by comparing the spectral characteristics of red giant envelopes and planetary nebulae. Observational and theoretical evidence both suggest that remnants of red giant envelopes may still be present in planetary nebula systems and should have significant effects on their formation. The dynamical effects of the interaction of stellar winds from central stars of planetary nebulae with the remnant red giant envelopes are evaluated and the mechanism found to be capable of producing the observed masses and momenta of planetary nebulae. The observed mass-radii relation of planetary nebulae may also be best explained by the interacting winds model. The possibility that red giant mass loss, and therefore the production of planetary nebulae, is different between Population I and II systems is also discussed

  11. The ENIGMA Consortium: large-scale collaborative analyses of neuroimaging and genetic data.

    Science.gov (United States)

    Thompson, Paul M; Stein, Jason L; Medland, Sarah E; Hibar, Derrek P; Vasquez, Alejandro Arias; Renteria, Miguel E; Toro, Roberto; Jahanshad, Neda; Schumann, Gunter; Franke, Barbara; Wright, Margaret J; Martin, Nicholas G; Agartz, Ingrid; Alda, Martin; Alhusaini, Saud; Almasy, Laura; Almeida, Jorge; Alpert, Kathryn; Andreasen, Nancy C; Andreassen, Ole A; Apostolova, Liana G; Appel, Katja; Armstrong, Nicola J; Aribisala, Benjamin; Bastin, Mark E; Bauer, Michael; Bearden, Carrie E; Bergmann, Orjan; Binder, Elisabeth B; Blangero, John; Bockholt, Henry J; Bøen, Erlend; Bois, Catherine; Boomsma, Dorret I; Booth, Tom; Bowman, Ian J; Bralten, Janita; Brouwer, Rachel M; Brunner, Han G; Brohawn, David G; Buckner, Randy L; Buitelaar, Jan; Bulayeva, Kazima; Bustillo, Juan R; Calhoun, Vince D; Cannon, Dara M; Cantor, Rita M; Carless, Melanie A; Caseras, Xavier; Cavalleri, Gianpiero L; Chakravarty, M Mallar; Chang, Kiki D; Ching, Christopher R K; Christoforou, Andrea; Cichon, Sven; Clark, Vincent P; Conrod, Patricia; Coppola, Giovanni; Crespo-Facorro, Benedicto; Curran, Joanne E; Czisch, Michael; Deary, Ian J; de Geus, Eco J C; den Braber, Anouk; Delvecchio, Giuseppe; Depondt, Chantal; de Haan, Lieuwe; de Zubicaray, Greig I; Dima, Danai; Dimitrova, Rali; Djurovic, Srdjan; Dong, Hongwei; Donohoe, Gary; Duggirala, Ravindranath; Dyer, Thomas D; Ehrlich, Stefan; Ekman, Carl Johan; Elvsåshagen, Torbjørn; Emsell, Louise; Erk, Susanne; Espeseth, Thomas; Fagerness, Jesen; Fears, Scott; Fedko, Iryna; Fernández, Guillén; Fisher, Simon E; Foroud, Tatiana; Fox, Peter T; Francks, Clyde; Frangou, Sophia; Frey, Eva Maria; Frodl, Thomas; Frouin, Vincent; Garavan, Hugh; Giddaluru, Sudheer; Glahn, David C; Godlewska, Beata; Goldstein, Rita Z; Gollub, Randy L; Grabe, Hans J; Grimm, Oliver; Gruber, Oliver; Guadalupe, Tulio; Gur, Raquel E; Gur, Ruben C; Göring, Harald H H; Hagenaars, Saskia; Hajek, Tomas; Hall, Geoffrey B; Hall, Jeremy; Hardy, John; Hartman, Catharina A; Hass, Johanna; Hatton, Sean N; Haukvik, Unn K; Hegenscheid, Katrin; Heinz, Andreas; Hickie, Ian B; Ho, Beng-Choon; Hoehn, David; Hoekstra, Pieter J; Hollinshead, Marisa; Holmes, Avram J; Homuth, Georg; Hoogman, Martine; Hong, L Elliot; Hosten, Norbert; Hottenga, Jouke-Jan; Hulshoff Pol, Hilleke E; Hwang, Kristy S; Jack, Clifford R; Jenkinson, Mark; Johnston, Caroline; Jönsson, Erik G; Kahn, René S; Kasperaviciute, Dalia; Kelly, Sinead; Kim, Sungeun; Kochunov, Peter; Koenders, Laura; Krämer, Bernd; Kwok, John B J; Lagopoulos, Jim; Laje, Gonzalo; Landen, Mikael; Landman, Bennett A; Lauriello, John; Lawrie, Stephen M; Lee, Phil H; Le Hellard, Stephanie; Lemaître, Herve; Leonardo, Cassandra D; Li, Chiang-Shan; Liberg, Benny; Liewald, David C; Liu, Xinmin; Lopez, Lorna M; Loth, Eva; Lourdusamy, Anbarasu; Luciano, Michelle; Macciardi, Fabio; Machielsen, Marise W J; Macqueen, Glenda M; Malt, Ulrik F; Mandl, René; Manoach, Dara S; Martinot, Jean-Luc; Matarin, Mar; Mather, Karen A; Mattheisen, Manuel; Mattingsdal, Morten; Meyer-Lindenberg, Andreas; McDonald, Colm; McIntosh, Andrew M; McMahon, Francis J; McMahon, Katie L; Meisenzahl, Eva; Melle, Ingrid; Milaneschi, Yuri; Mohnke, Sebastian; Montgomery, Grant W; Morris, Derek W; Moses, Eric K; Mueller, Bryon A; Muñoz Maniega, Susana; Mühleisen, Thomas W; Müller-Myhsok, Bertram; Mwangi, Benson; Nauck, Matthias; Nho, Kwangsik; Nichols, Thomas E; Nilsson, Lars-Göran; Nugent, Allison C; Nyberg, Lars; Olvera, Rene L; Oosterlaan, Jaap; Ophoff, Roel A; Pandolfo, Massimo; Papalampropoulou-Tsiridou, Melina; Papmeyer, 
Martina; Paus, Tomas; Pausova, Zdenka; Pearlson, Godfrey D; Penninx, Brenda W; Peterson, Charles P; Pfennig, Andrea; Phillips, Mary; Pike, G Bruce; Poline, Jean-Baptiste; Potkin, Steven G; Pütz, Benno; Ramasamy, Adaikalavan; Rasmussen, Jerod; Rietschel, Marcella; Rijpkema, Mark; Risacher, Shannon L; Roffman, Joshua L; Roiz-Santiañez, Roberto; Romanczuk-Seiferth, Nina; Rose, Emma J; Royle, Natalie A; Rujescu, Dan; Ryten, Mina; Sachdev, Perminder S; Salami, Alireza; Satterthwaite, Theodore D; Savitz, Jonathan; Saykin, Andrew J; Scanlon, Cathy; Schmaal, Lianne; Schnack, Hugo G; Schork, Andrew J; Schulz, S Charles; Schür, Remmelt; Seidman, Larry; Shen, Li; Shoemaker, Jody M; Simmons, Andrew; Sisodiya, Sanjay M; Smith, Colin; Smoller, Jordan W; Soares, Jair C; Sponheim, Scott R; Sprooten, Emma; Starr, John M; Steen, Vidar M; Strakowski, Stephen; Strike, Lachlan; Sussmann, Jessika; Sämann, Philipp G; Teumer, Alexander; Toga, Arthur W; Tordesillas-Gutierrez, Diana; Trabzuni, Daniah; Trost, Sarah; Turner, Jessica; Van den Heuvel, Martijn; van der Wee, Nic J; van Eijk, Kristel; van Erp, Theo G M; van Haren, Neeltje E M; van 't Ent, Dennis; van Tol, Marie-Jose; Valdés Hernández, Maria C; Veltman, Dick J; Versace, Amelia; Völzke, Henry; Walker, Robert; Walter, Henrik; Wang, Lei; Wardlaw, Joanna M; Weale, Michael E; Weiner, Michael W; Wen, Wei; Westlye, Lars T; Whalley, Heather C; Whelan, Christopher D; White, Tonya; Winkler, Anderson M; Wittfeld, Katharina; Woldehawariat, Girma; Wolf, Christiane; Zilles, David; Zwiers, Marcel P; Thalamuthu, Anbupalam; Schofield, Peter R; Freimer, Nelson B; Lawrence, Natalia S; Drevets, Wayne

    2014-06-01

    The Enhancing NeuroImaging Genetics through Meta-Analysis (ENIGMA) Consortium is a collaborative network of researchers working together on a range of large-scale studies that integrate data from 70 institutions worldwide. Organized into Working Groups that tackle questions in neuroscience, genetics, and medicine, ENIGMA studies have analyzed neuroimaging data from over 12,826 subjects. In addition, data from 12,171 individuals were provided by the CHARGE consortium for replication of findings, in a total of 24,997 subjects. By meta-analyzing results from many sites, ENIGMA has detected factors that affect the brain that no individual site could detect on its own, and that require larger numbers of subjects than any individual neuroimaging study has currently collected. ENIGMA's first project was a genome-wide association study identifying common variants in the genome associated with hippocampal volume or intracranial volume. Continuing work is exploring genetic associations with subcortical volumes (ENIGMA2) and white matter microstructure (ENIGMA-DTI). Working groups also focus on understanding how schizophrenia, bipolar illness, major depression and attention deficit/hyperactivity disorder (ADHD) affect the brain. We review the current progress of the ENIGMA Consortium, along with challenges and unexpected discoveries made on the way.

  12. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  13. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Scaled-down models are widely used for experimental investigations of large structures, due to the limited capacities of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e., geometry, loading and properties) between the model and a large structural element such as those present in existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, which represent the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and of its elastic deformation under steady loading is then made. The results are compared with those obtained from numerical computations on the full-scale structure. The effect of scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
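    Once the three independent scale factors are fixed, every derived quantity follows from its dimensions; the short sketch below shows that bookkeeping for a few quantities relevant to stress analysis. The numerical scale factors are illustrative and are not those of the case study.

```python
# Derived similitude scale factors from the three independent ones (mass, length, time),
# as used in Buckingham-pi based model design. Numbers are illustrative only.

S_L = 1.0 / 10.0       # length scale factor (model / prototype)
S_M = 1.0 / 1000.0     # mass scale factor
S_T = 1.0 / 3.0        # time scale factor

S_force  = S_M * S_L / S_T**2      # F = m * a          -> dimensions [M L T^-2]
S_stress = S_force / S_L**2        # sigma = F / A      -> dimensions [M L^-1 T^-2]
S_E      = S_stress                # elastic modulus scales like stress
S_accel  = S_L / S_T**2            # acceleration       -> dimensions [L T^-2]

print(f"force scale:   {S_force:.3e}")
print(f"stress scale:  {S_stress:.3e}")
print(f"modulus scale: {S_E:.3e}")
print(f"accel. scale:  {S_accel:.3e}")
```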

  14. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of 10 nm or so and a wall thickness of a few graphenes. The HGCNSs exhibit a reversible capacity of 391 mAh g^-1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on large scale by the simple reaction between glucose and Mg at 550 °C, and exhibit superior electrochemical performance to graphite. Highlights: Hollow graphitic carbon nanospheres (HGCNSs) were prepared on large scale at 550 °C; the preparation is simple, effective and eco-friendly; the in situ yielded MgO nanocrystals promote the graphitization; the HGCNSs exhibit superior electrochemical performance to graphite.

  15. Measurement of replication structures at the nanometer scale using super-resolution light microscopy.

    Science.gov (United States)

    Baddeley, D; Chagin, V O; Schermelleh, L; Martin, S; Pombo, A; Carlton, P M; Gahl, A; Domaing, P; Birk, U; Leonhardt, H; Cremer, C; Cardoso, M C

    2010-01-01

    DNA replication, similar to other cellular processes, occurs within dynamic macromolecular structures. Any comprehensive understanding ultimately requires quantitative data to establish and test models of genome duplication. We used two different super-resolution light microscopy techniques to directly measure and compare the size and numbers of replication foci in mammalian cells. This analysis showed that replication foci vary in size from 210 nm down to 40 nm. Remarkably, spatially modulated illumination (SMI) and 3D-structured illumination microscopy (3D-SIM) both showed an average size of 125 nm that was conserved throughout S-phase and independent of the labeling method, suggesting a basic unit of genome duplication. Interestingly, the improved optical 3D resolution identified 3- to 5-fold more distinct replication foci than previously reported. These results show that optical nanoscopy techniques enable accurate measurements of cellular structures at a level previously achieved only by electron microscopy and highlight the possibility of high-throughput, multispectral 3D analyses.

  16. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  17. Planetary Radio Interferometry and Doppler Experiment (PRIDE) for Planetary Atmospheric Studies

    Science.gov (United States)

    Bocanegra Bahamon, Tatiana; Cimo, Giuseppe; Duev, Dmitry; Gurvits, Leonid; Molera Calves, Guifre; Pogrebenko, Sergei

    2015-04-01

    The Planetary Radio Interferometry and Doppler Experiment (PRIDE) is a technique that allows the determination of the radial velocity and lateral coordinates of planetary spacecraft with very high accuracy (Duev, 2012). The setup of the experiment consists of several ground stations from the European VLBI Network (EVN) located around the globe, which simultaneously perform Doppler tracking of a spacecraft carrier radio signal; the recordings are subsequently processed in VLBI style in phase-referencing mode. Through accurate examination of the changes in phase and amplitude of the radio signal propagating from the spacecraft to the multiple stations on Earth, the PRIDE technique can be used for several fields of planetary research, among them planetary atmospheric studies, gravimetry and ultra-precise celestial mechanics of planetary systems. In the study at hand the application of this technique to planetary atmospheric investigations is demonstrated. As a test case, radio occultation experiments were conducted with PRIDE targeting ESA's Venus Express during different observing sessions with multiple ground stations in April 2012 and March 2014. Once each of the stations has conducted the observation, the raw data are delivered to the correlation center at the Joint Institute for VLBI in Europe (JIVE) located in the Netherlands. The signals are processed with a high-spectral-resolution phase detection software package from which Doppler observables for each station are derived. Subsequently the Doppler-corrected signals are correlated to derive the VLBI observables. These two sets of observables are used for precise orbit determination. The reconstructed orbit, along with the Doppler observables, is used as input for the radio occultation processing software, which consists mainly of two modules, the geometrical optics module and the ray tracing inversion module, from which vertical density profiles, and subsequently temperature and pressure profiles, of Venus
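
    The last step described above, turning a retrieved vertical density profile into temperature and pressure, usually assumes hydrostatic equilibrium and the ideal gas law. The sketch below shows only that final step; the geometrical-optics and ray-tracing inversion modules of the actual pipeline are not reproduced here, and the Venus-like constants and function name are illustrative assumptions.

```python
import numpy as np

R_GAS = 8.314  # universal gas constant, J mol^-1 K^-1

def temperature_pressure_from_density(z, rho, g=8.87, mu=0.04345, p_top=0.0):
    """Illustrative final step of an occultation retrieval: integrate the
    hydrostatic equation dp/dz = -rho*g downward from the top of the profile,
    then apply the ideal gas law T = p*mu/(rho*R_GAS).  z and rho are ordered
    bottom-to-top; g and mu are rough Venus values (m s^-2, kg mol^-1)."""
    p = np.zeros_like(rho)
    p[-1] = p_top
    for i in range(len(z) - 2, -1, -1):
        dz = z[i + 1] - z[i]
        p[i] = p[i + 1] + 0.5 * (rho[i] + rho[i + 1]) * g * dz  # trapezoidal step
    T = p * mu / (rho * R_GAS)
    return T, p
```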

  18. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, confounded by adverse weather conditions or darkness, represents enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For the evacuation of casualties, a helicopter should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  19. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, 'Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

    The formation of large scale structures is considered within a model with a string on a toroidal space-time. Firstly, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large scale distribution and with the theory of a Cantorian space-time.

  20. Planet gaps in the dust layer of 3D proto-planetary disks: Observability with ALMA

    OpenAIRE

    Gonzalez, Jean-François; Pinte, Christophe; Maddison, Sarah T.; Ménard, François

    2013-01-01

    2 pages, 2 figures, to appear in the Proceedings of IAU Symp. 299: Exploring the Formation and Evolution of Planetary Systems (Victoria, Canada); International audience; Among the numerous known extrasolar planets, only a handful have been imaged directly so far, at large orbital radii and in rather evolved systems. The Atacama Large Millimeter/submillimeter Array (ALMA) will have the capacity to observe these wide planetary systems at a younger age, thus bringing a better understanding of th...

  1. Planetary Atmospheres and Evolution of Complex Life

    Science.gov (United States)

    Catling, D.

    2014-04-01

    Let us define "complex life" as actively mobile organisms exceeding tens of centimeter size scale with specialized, differentiated anatomy comparable to advanced metazoans. Such organisms on any planet will need considerable energy for growth and metabolism, and an atmosphere is likely to play a key role. The history of life on Earth suggests that there were at least two major hurdles to overcome before complex life developed. The first was biological. Large, three-dimensional multicellular animals and plants are made only of eukaryotic cells, which are the only type that can develop into a large, diverse range of cell types unlike the cells of microbes. Exactly how eukaryotes allow 3D multicellularity and how they originated are matters of debate. But the internal structure and bigger and more modular genomes of eukaryotes are important factors. The second obstacle for complex life was having sufficient free, diatomic oxygen (O2). Aerobic metabolism provides about an order of magnitude more energy for a given intake of food than anaerobic metabolism, so anaerobes don't grow multicellular beyond filaments because of prohibitive growth efficiencies. A precursor to a 2.4 Ga rise of oxygen was the evolution of water-splitting, oxygen-producing photosynthesis. But although the atmosphere became oxidizing at 2.4 Ga, sufficient atmospheric O2 did not occur until about 0.6 Ga. Earth-system factors were involved including planetary outgassing (as affected by size and composition), hydrogen escape, and processing of organic carbon. An atmosphere rich in O2 provides the largest feasible energy source per electron transfer in the Periodic Table, which suggests that O2 would be important for complex life on exoplanets. But plentiful O2 is unusual in a planetary atmosphere because O2 is easily consumed in chemical reactions with reducing gases or surface materials. Even with aerobic metabolism, the partial pressure of O2 (pO2) must exceed 10^3 Pa to allow organisms that rely on

  2. Trends in Planetary Data Analysis. Executive summary of the Planetary Data Workshop

    Science.gov (United States)

    Evans, N.

    1984-09-01

    Planetary data include non-imaging remote sensing data, which comprises spectrometric, radiometric, and polarimetric remote sensing observations. Also included are in-situ data, radio/radar data, and Earth-based observations. Also discussed is the development of a planetary data system. A catalog to identify observations will be the initial entry point for all levels of users into the data system. There are seven distinct data support services: encyclopedia, data index, data inventory, browse, search, sample, and acquire. Data systems for planetary science users must provide access to data and must process, store, and display data. Two standards will be incorporated into the planetary data system: a standard communications protocol and the Standard Format Data Unit. The data system configuration must combine features of a distributed system with those of a centralized system. Fiscal constraints have made prioritization important. Activities include saving previous mission data, planning/cost analysis, and publishing of proceedings.

  3. Planetary Data System (PDS)

    Data.gov (United States)

    National Aeronautics and Space Administration — The Planetary Data System (PDS) is an archive of data products from NASA planetary missions, which is sponsored by NASA's Science Mission Directorate. We actively...

  4. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)


    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  5. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.
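
    For a linear static model K(p)u = f(p) and for the free-vibration problem (K - λM)φ = 0, sensitivity derivatives follow from differentiating the governing equations, which is the core of most direct-differentiation procedures. The dense-matrix sketch below illustrates those two relations; it is not the paper's reanalysis code, and the NumPy formulation and function names are assumptions.

```python
import numpy as np

def displacement_sensitivity(K, dK_dp, u, df_dp=None):
    """du/dp for K(p) u = f(p):  K du/dp = df/dp - (dK/dp) u."""
    rhs = -dK_dp @ u if df_dp is None else df_dp - dK_dp @ u
    return np.linalg.solve(K, rhs)

def eigenvalue_sensitivity(dK_dp, dM_dp, lam, phi):
    """d(lambda)/dp for (K - lambda M) phi = 0, with phi mass-normalised."""
    return phi @ (dK_dp - lam * dM_dp) @ phi
```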

  6. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

    This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  7. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  8. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

    We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value for occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS items: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used and showed an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography and PASS ≥2 had a median NIHSS score of 17 (interquartile range=6), as opposed to a median NIHSS score of 6 (interquartile range=5) for PASS <2. The PASS scale showed performance equal to that of other scales predicting ELVO while being simpler. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
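
    As a rough illustration of how the three-item scale could be scored in software (the precise item definitions and scoring rules are given in the paper, so the sketch below is hypothetical):

```python
def pass_score(loc_month_age_abnormal: bool, gaze_palsy_or_deviation: bool,
               arm_weakness: bool) -> int:
    """Prehospital Acute Stroke Severity (PASS): one point per abnormal item."""
    return sum([loc_month_age_abnormal, gaze_palsy_or_deviation, arm_weakness])

def suspect_elvo(score: int) -> bool:
    """Cut point from the abstract: two or more abnormal items suggests ELVO."""
    return score >= 2
```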

  9. Planetary Science Training for NASA's Astronauts: Preparing for Future Human Planetary Exploration

    Science.gov (United States)

    Bleacher, J. E.; Evans, C. A.; Graff, T. G.; Young, K. E.; Zeigler, R.

    2017-02-01

    Astronauts selected in 2017 and in future years will carry out in situ planetary science research during exploration of the solar system. Training to enable this goal is underway and is flexible to accommodate an evolving planetary science vision.

  10. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...
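
    The building block of such a Godunov-type finite-volume model is an approximate Riemann solver applied at every cell interface; in a GPU implementation each interface or cell update maps naturally to one thread. The sketch below illustrates the scheme in one dimension with a simple Rusanov (local Lax-Friedrichs) flux. It is only an illustration under stated assumptions (the paper's model is two-dimensional, unstructured and GPU-parallel), and all function names and constants here are chosen for the example.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m s^-2

def physical_flux(h, hu):
    """Flux of the 1D shallow-water equations for the state (h, hu)."""
    u = hu / h
    return np.array([hu, hu * u + 0.5 * G * h * h])

def rusanov_flux(hL, huL, hR, huR):
    """Approximate Riemann solver (local Lax-Friedrichs) at one interface."""
    s = max(abs(huL / hL) + np.sqrt(G * hL), abs(huR / hR) + np.sqrt(G * hR))
    return 0.5 * (physical_flux(hL, huL) + physical_flux(hR, huR)) \
         - 0.5 * s * np.array([hR - hL, huR - huL])

def step(h, hu, dx, cfl=0.45):
    """One explicit finite-volume update of the interior cells."""
    dt = cfl * dx / np.max(np.abs(hu / h) + np.sqrt(G * h))   # CFL time step
    F = np.array([rusanov_flux(h[i], hu[i], h[i + 1], hu[i + 1])
                  for i in range(len(h) - 1)])                # interface fluxes
    h[1:-1]  -= dt / dx * (F[1:, 0] - F[:-1, 0])
    hu[1:-1] -= dt / dx * (F[1:, 1] - F[:-1, 1])
    return h, hu, dt
```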

  11. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  12. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

    The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, its time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. A shared-memory architecture is used to construct the similarity matrix, and a distributed system is used for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate scheme of data partition and reduction is designed in our method in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
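
    A minimal way to mirror the two stages described above (a parallel similarity-matrix build followed by affinity propagation on the precomputed similarities) is sketched below using Python's multiprocessing module and scikit-learn. This is an illustration of the workflow rather than the authors' shared-memory/distributed implementation; the helper names and the random test data are invented here.

```python
import numpy as np
from multiprocessing import Pool
from sklearn.cluster import AffinityPropagation

def _similarity_rows(args):
    X, rows = args
    # negative squared Euclidean distance: the usual AP similarity
    return -((X[rows, None, :] - X[None, :, :]) ** 2).sum(axis=2)

def parallel_similarity(X, n_workers=4):
    """Build the dense similarity matrix in row blocks across processes."""
    blocks = np.array_split(np.arange(len(X)), n_workers)
    with Pool(n_workers) as pool:
        parts = pool.map(_similarity_rows, [(X, b) for b in blocks])
    return np.vstack(parts)

if __name__ == "__main__":
    X = np.random.rand(500, 20)                 # stand-in for expression data
    S = parallel_similarity(X)
    labels = AffinityPropagation(affinity="precomputed",
                                 random_state=0).fit(S).labels_
    print(len(set(labels)), "clusters found")
```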

  13. Large-scale purification and crystallization of the endoribonuclease XendoU: troubleshooting with His-tagged proteins

    International Nuclear Information System (INIS)

    Renzi, Fabiana; Panetta, Gianna; Vallone, Beatrice; Brunori, Maurizio; Arceci, Massimo; Bozzoni, Irene; Laneve, Pietro; Caffarelli, Elisa

    2006-01-01

    Recombinant His-tagged XendoU, a eukaryotic endoribonuclease, appeared to aggregate in the presence of divalent cations. Monodisperse protein which yielded crystals diffracting to 2.2 Å was obtained by addition of EDTA. XendoU is the first endoribonuclease described in higher eukaryotes as being involved in the endonucleolytic processing of intron-encoded small nucleolar RNAs. It is conserved among eukaryotes and its viral homologue is essential in SARS replication and transcription. The large-scale purification and crystallization of recombinant XendoU are reported. The tendency of the recombinant enzyme to aggregate could be reversed upon the addition of chelating agents (EDTA, imidazole): aggregation is a potential drawback when purifying and crystallizing His-tagged proteins, which are widely used, especially in high-throughput structural studies. Purified monodisperse XendoU crystallized in two different space groups: trigonal P3121, diffracting to low resolution, and monoclinic C2, diffracting to higher resolution.

  14. A large replication study and meta-analysis in European samples provides further support for association of AHI1 markers with schizophrenia

    DEFF Research Database (Denmark)

    Ingason, Andrés; Giegling, Ina; Cichon, Sven

    2010-01-01

    The Abelson helper integration site 1 (AHI1) gene locus on chromosome 6q23 is among a group of candidate loci for schizophrenia susceptibility that were initially identified by linkage followed by linkage disequilibrium mapping, and subsequent replication of the association in an independent sample. Here, we present results of a replication study of AHI1 locus markers, previously implicated in schizophrenia, in a large European sample (in total 3907 affected and 7429 controls). Furthermore, we perform a meta-analysis of the implicated markers in 4496 affected and 18,920 controls. Both AHI1 as well as the neighbouring phosphodiesterase 7B (PDE7B) may be considered candidates for involvement in the genetic aetiology of schizophrenia.

  15. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

    We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationships between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives overlap within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)

  16. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  17. A facile and cost-effective approach to engineer surface roughness for preparation of large-scale superhydrophobic substrate with high adhesive force

    Science.gov (United States)

    Zhou, Bingpu; Tian, Jingxuan; Wang, Cong; Gao, Yibo; Wen, Weijia

    2016-12-01

    This study presents a convenient avenue to fabricate polydimethylsiloxane (PDMS) with controllable surface morphologies and wetting characteristics via a standard molding technique. Templates with engineered surface roughness were prepared simply by combining microfluidics with photo-polymerization of N-Isopropylacrylamide (NIPAM). The surface morphology of the mold could be adjusted via the ultraviolet-curing duration or the grafting density, which means that the surface of a PDMS sample replicated from the mold could also be easily controlled with the proposed method. Furthermore, via multiple grafting and replication processes, we have demonstrated that the hydrophobicity of the prepared PDMS samples could be swiftly enhanced to a contact angle of ∼154°, with high adhesive force for resident water droplets. The obtained PDMS samples exhibited good resistance to external mechanical deformation over up to 100 cycles. The proposed scheme is time-saving, cost-effective and suitable for large-scale production of superhydrophobic PDMS substrates. We believe that the presented approach provides a promising method for preparing superhydrophobic surfaces with high adhesive force for on-chip liquid transport, localized reactions, etc.

  18. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

    In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2, design improvement for further cost reduction and establishment of the plant concept has been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics that were identified in the previous fiscal year were examined and the plant concept was modified accordingly. Furthermore, fundamental specifications of the main systems and components were set and the economics were evaluated. In addition, as the interim evaluation of the candidate concepts of the FBR fuel cycle is to be conducted, cost effectiveness and achievability of the development goal were evaluated and the data for the three large-scale reactor candidate concepts were prepared. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has a prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection and narrowing down of candidate concepts at the end of Phase 2. (author)

  19. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

    In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor was selected as a promising concept for a sodium-cooled large-scale reactor with the potential to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop type reactor. Through the design study, it is intended to construct a plant concept that can show its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, the first year of Phase 2. In the JFY2001 design study, a plant concept was constructed based on the design of the advanced loop type reactor, and fundamental specifications of the main systems and components were set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy were examined and evaluated. As a result of this study, a plant concept for the sodium-cooled large-scale reactor has been constructed which has a prospect of satisfying the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and of resolving the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection and narrowing down of candidate concepts at the end of Phase 2. (author)

  20. Preparing Planetary Scientists to Engage Audiences

    Science.gov (United States)

    Shupla, C. B.; Shaner, A. J.; Hackler, A. S.

    2017-12-01

    While some planetary scientists have extensive experience sharing their science with audiences, many can benefit from guidance on giving presentations or conducting activities for students. The Lunar and Planetary Institute (LPI) provides resources and trainings to support planetary scientists in their communication efforts. Trainings have included sessions for students and early career scientists at conferences (providing opportunities for them to practice their delivery and receive feedback for their poster and oral presentations), as well as separate communication workshops on how to engage various audiences. LPI has similarly begun coaching planetary scientists to help them prepare their public presentations. LPI is also helping to connect different audiences and their requests for speakers to planetary scientists. Scientists have been key contributors in developing and conducting activities in LPI education and public events. LPI is currently working with scientists to identify and redesign short planetary science activities for scientists to use with different audiences. The activities will be tied to fundamental planetary science concepts, with basic materials and simple modifications to engage different ages and audience size and background. Input from the planetary science community on these efforts is welcome. Current results and resources, as well as future opportunities will be shared.

  1. Large scale CMB anomalies from thawing cosmic strings

    Energy Technology Data Exchange (ETDEWEB)

    Ringeval, Christophe [Centre for Cosmology, Particle Physics and Phenomenology, Institute of Mathematics and Physics, Louvain University, 2 Chemin du Cyclotron, 1348 Louvain-la-Neuve (Belgium); Yamauchi, Daisuke; Yokoyama, Jun'ichi [Research Center for the Early Universe (RESCEU), Graduate School of Science, The University of Tokyo, Tokyo 113-0033 (Japan); Bouchet, François R., E-mail: christophe.ringeval@uclouvain.be, E-mail: yamauchi@resceu.s.u-tokyo.ac.jp, E-mail: yokoyama@resceu.s.u-tokyo.ac.jp, E-mail: bouchet@iap.fr [Institut d'Astrophysique de Paris, UMR 7095-CNRS, Université Pierre et Marie Curie, 98bis boulevard Arago, 75014 Paris (France)

    2016-02-01

    Cosmic strings formed during inflation are expected to be either diluted over super-Hubble distances, i.e., invisible today, or to have crossed our past light cone very recently. We discuss the latter situation, in which a few strings imprint their signature in the Cosmic Microwave Background (CMB) anisotropies after recombination. Being almost frozen in the Hubble flow, these strings are quasi-static and evade almost all of the previously derived constraints on their tension while being able to source large scale anisotropies in the CMB sky. Using a local variance estimator on thousands of numerically simulated Nambu-Goto all-sky maps, we compute the expected signal and show that it can mimic a dipole modulation at large angular scales while being negligible at small angles. Interestingly, such a scenario generically produces one cold spot from the thawing of a cosmic string loop. Mixed with anisotropies of inflationary origin, we find that a few strings of tension GU = O(1) × 10^-6 match the amplitude of the dipole modulation reported in the Planck satellite measurements and could be at the origin of other large scale anomalies.

  2. Practical experience from the Office of Adolescent Health's large scale implementation of an evidence-based Teen Pregnancy Prevention Program.

    Science.gov (United States)

    Margolis, Amy Lynn; Roper, Allison Yvonne

    2014-03-01

    After 3 years of experience overseeing the implementation and evaluation of evidence-based teen pregnancy prevention programs in a diversity of populations and settings across the country, the Office of Adolescent Health (OAH) has learned numerous lessons through practical application and new experiences. These lessons and experiences are applicable to those working to implement evidence-based programs on a large scale. The lessons described in this paper focus on what it means for a program to be implementation ready, the role of the program developer in replicating evidence-based programs, the importance of a planning period to ensure quality implementation, the need to define and measure fidelity, and the conditions necessary to support rigorous grantee-level evaluation. Published by Elsevier Inc.

  3. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

    A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-Class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10^6 cores and sustained performance over ∼2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios. (paper)

  4. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

    Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of the power system balance and thus cause deviations in the power system frequency in small or islanded power systems, or in tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time

  5. From Planetary Boundaries to national fair shares of the global safe operating space — How can the scales be bridged?

    NARCIS (Netherlands)

    Häyhä, Tiina; Lucas, Paul L.|info:eu-repo/dai/nl/272607444; van Vuuren, Detlef P.|info:eu-repo/dai/nl/11522016X; Cornell, Sarah E.; Hoff, Holger

    2016-01-01

    The planetary boundaries framework proposes quantitative global limits to the anthropogenic perturbation of crucial Earth system processes, and thus marks out a planetary safe operating space for human activities. Yet, decisions regarding resource use and emissions are mostly made at less aggregated

  6. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  7. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  8. Biomass-based negative emissions difficult to reconcile with planetary boundaries

    Science.gov (United States)

    Heck, Vera; Gerten, Dieter; Lucht, Wolfgang; Popp, Alexander

    2018-01-01

    Under the Paris Agreement, 195 nations have committed to holding the increase in the global average temperature to well below 2 °C above pre-industrial levels and to strive to limit the increase to 1.5 °C (ref. 1). It is noted that this requires "a balance between anthropogenic emissions by sources and removals by sinks of greenhouse gases in the second half of the century"1. This either calls for zero greenhouse gas (GHG) emissions or a balance between positive and negative emissions (NE)2,3. Roadmaps and socio-economic scenarios compatible with a 2 °C or 1.5 °C goal depend upon NE via bioenergy with carbon capture and storage (BECCS) to balance remaining GHG emissions4-7. However, large-scale deployment of BECCS would imply significant impacts on many Earth system components besides atmospheric CO2 concentrations8,9. Here we explore the feasibility of NE via BECCS from dedicated plantations and potential trade-offs with planetary boundaries (PBs)10,11 for multiple socio-economic pathways. We show that while large-scale BECCS is intended to lower the pressure on the PB for climate change, it would most likely steer the Earth system closer to the PB for freshwater use and lead to further transgression of the PBs for land-system change, biosphere integrity and biogeochemical flows.

  9. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

    With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, reconstructed 3D models are growing in scale and complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult for common 3D display software, such as MeshLab, to achieve real-time display of and interaction with large scale 3D models. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance, and then use an out-of-core, view-dependent multi-resolution rendering scheme to realize real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large scale reconstructed scene with millions of 3D points or triangular meshes on a regular PC with only 4 GB of RAM.
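
    View-dependent multi-resolution rendering of this kind typically refines an LOD node only while its geometric error, projected into screen space, exceeds a pixel tolerance; otherwise the coarser node is kept and its children stay on disk. The small test below sketches that criterion; the function name and the one-pixel default are assumptions made for illustration.

```python
import math

def needs_refinement(geometric_error, distance, fov_y_rad, viewport_height_px,
                     tolerance_px=1.0):
    """Project a node's geometric error onto the screen and refine (load its
    higher-resolution children from external memory) only while the error is
    still visible at the current viewing distance."""
    projected_px = (geometric_error * viewport_height_px
                    / (2.0 * distance * math.tan(fov_y_rad / 2.0)))
    return projected_px > tolerance_px
```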

  10. Persistence of the planetary wave type oscillations in foF2 over Europe

    Directory of Open Access Journals (Sweden)

    J. Laštovička

    2003-07-01

    Full Text Available Planetary waves are oscillations of very predominantly tropospheric origin with typical periods of about 2–30 days. Their dominant zonal wave numbers are 1, 2 and 3, i.e. the waves are of large-scale (global) character. The planetary wave type oscillations have been observed in the lower and middle atmosphere but also in the ionosphere, including the ionospheric F2-layer. Here, we deal only with the oscillations analyzed for four European stations over a solar cycle with the use of the Meyer and Morlet wavelet transforms. Waves with periods near 5, 10 and 16 days are studied. Only events with a duration of three wave-cycles and more are considered. The 5-day period wave events display a typical duration of 4 cycles, while 10- and 16-day wave events are less persistent, with a typical duration of about 3.5 cycles and 3 cycles, respectively. The persistence pattern in terms of number of cycles and in terms of number of days is different. In terms of number of cycles, the typical persistence of oscillations decreases with increasing period. On the other hand, in terms of number of days the typical persistence evidently increases with increasing period. The spectral distribution of event duration is too broad to allow for a reasonable prediction of event duration. Thus, the predictability of the planetary wave type oscillations in foF2 seems to be very questionable. Key words. Ionosphere (ionosphere-atmosphere interaction, mid-latitude ionosphere, ionospheric disturbances) – Meteorology and atmospheric dynamics (waves and tides)
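
    In outline, this kind of analysis can be reproduced with a complex Morlet continuous wavelet transform evaluated at the periods of interest (here roughly 5, 10 and 16 days). The NumPy-only sketch below loosely follows common wavelet-analysis conventions; it is an illustration rather than the authors' processing code, and the sampling step, normalisation and omega0 value are assumptions.

```python
import numpy as np

def morlet_power(x, dt, period, omega0=6.0):
    """Wavelet power of a uniformly sampled series x at one target period,
    using a complex Morlet mother wavelet with central frequency omega0."""
    # scale corresponding (approximately) to the requested Fourier period
    s = period * (omega0 + np.sqrt(2.0 + omega0 ** 2)) / (4.0 * np.pi)
    t = np.arange(-4.0 * s, 4.0 * s + dt, dt)          # truncated wavelet support
    psi = np.pi ** -0.25 * np.exp(1j * omega0 * t / s - 0.5 * (t / s) ** 2)
    psi *= dt / np.sqrt(s)                              # simple normalisation
    w = np.convolve(x - x.mean(), psi, mode="same")     # CWT at this single scale
    return np.abs(w) ** 2                               # power as a function of time

# e.g. hourly foF2 values: dt = 1/24 day, target periods of 5, 10 and 16 days
# powers = {T: morlet_power(foF2, 1/24, T) for T in (5, 10, 16)}
```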

  11. Self-Replication of Localized Vegetation Patches in Scarce Environments

    Science.gov (United States)

    Bordeu, Ignacio; Clerc, Marcel G.; Couteron, Piere; Lefever, René; Tlidi, Mustapha

    2016-09-01

    Desertification due to climate change and increasing drought periods is a worldwide problem for both ecology and economy. Our ability to understand how vegetation manages to survive and propagate through arid and semiarid ecosystems may be useful in the development of future strategies to prevent desertification, preserve flora and the fauna within it, or even make use of scarce-resource soils. In this paper, we study a robust phenomenon observed in semi-arid ecosystems, by which localized vegetation patches split in a process called self-replication. Localized patches of vegetation are visible in nature at various spatial scales. Even though they have been described in the literature, their growth mechanisms remain largely unexplored. Here, we develop an innovative statistical analysis based on real field observations to show that patches may exhibit deformation and splitting. This growth mechanism is the opposite of desertification, since it allows the repopulation of territories devoid of vegetation. We investigate these aspects by characterizing quantitatively, with a simple mathematical model, a new class of instabilities that leads to the observed self-replication phenomenon.

  12. Nonequilibrium Entropic Bounds for Darwinian Replicators

    Directory of Open Access Journals (Sweden)

    Jordi Piñero

    2018-01-01

    Full Text Available Life evolved on our planet by means of a combination of Darwinian selection and innovations leading to higher levels of complexity. The emergence and selection of replicating entities is a central problem in prebiotic evolution. Theoretical models have shown how populations of different types of replicating entities exclude or coexist with other classes of replicators. Models are typically kinetic, based on standard replicator equations. On the other hand, the presence of thermodynamic constraints for these systems remains an open question. This is largely due to the lack of a general theory of statistical methods for systems far from equilibrium. Nonetheless, a first approach to this problem has been put forward in a series of novel developments falling under the rubric of the extended second law of thermodynamics. The work presented here is twofold: firstly, we review this theoretical framework and provide a brief description of the three fundamental replicator types in prebiotic evolution: parabolic, Malthusian and hyperbolic. Secondly, we employ these previously mentioned techniques to explore how replicators are constrained by thermodynamics. Finally, we comment and discuss where further research should be focused.

  13. Changing Use of Seventh Chords: A Replication of Mauch et al. (2015)

    Directory of Open Access Journals (Sweden)

    Hubert Léveillé Gauvin

    2016-07-01

    Full Text Available Mauch, MacCallum, Levy, and Leroi (2015) carried out a large-scale study of changes in American popular music between 1960 and 2010. Using signal processing methods, they found evidence suggesting a decreasing use of the dominant seventh chord and increasing use of the minor-minor seventh chord. While signal analysis methods have improved substantially in recent years, the accuracy of signal-based analysis remains imperfect. Using a contrasting method and independent musical sample, this paper reports converging evidence replicating these findings.

  14. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

    In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural, catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff therefore provides a foundation to approach European hydrology with respect to observed patterns on large scales and with regard to the ability of models to capture these. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but are also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large-scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is a prerequisite for a reliable interpretation of simulation results. Model evaluations may also help detect shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. The ability of a multi-model ensemble of nine large-scale

  15. Vaccinia virus as a subhelper for AAV replication and packaging

    Directory of Open Access Journals (Sweden)

    Andrea R Moore

    Full Text Available Adeno-associated virus (AAV) has been widely used as a gene therapy vector to treat a variety of disorders. While these vectors are increasingly popular and successful in the clinic, there is still much to learn about the viruses. Understanding the biology of these viruses is essential for engineering better vectors and generating vectors more efficiently for large-scale use. AAV requires a helper for production and replication, making this aspect of the viral life cycle crucial. Vaccinia virus (VV) has been widely cited as a helper virus for AAV. However, to date, there are no detailed analyses of its helper function. Here, the helper role of VV was studied in detail. In contrast to common belief, we demonstrated that VV is not a sufficient helper virus for AAV replication: Vaccinia failed to produce rAAV and to activate AAV promoters. While this virus could not support rAAV production, Vaccinia could initiate AAV replication and packaging when AAV promoter activation is not necessary. This activity is due to the ability of Vaccinia-driven Rep78 to be transcribed in the cytoplasm, subsequently translated, and then to carry out its typical functions in the nucleus during the AAV life cycle. As such, VV is a subhelper for AAV, in contrast to the complete helper functions of adenovirus.

  16. Large-scale perturbations from the waterfall field in hybrid inflation

    International Nuclear Information System (INIS)

    Fonseca, José; Wands, David; Sasaki, Misao

    2010-01-01

    We estimate large-scale curvature perturbations from isocurvature fluctuations in the waterfall field during hybrid inflation, in addition to the usual inflaton field perturbations. The tachyonic instability at the end of inflation leads to an explosive growth of super-Hubble scale perturbations, but they retain the steep blue spectrum characteristic of vacuum fluctuations in a massive field during inflation. The power spectrum thus peaks around the Hubble-horizon scale at the end of inflation. We extend the usual δN formalism to include the essential role of these small fluctuations when estimating the large-scale curvature perturbation. The resulting curvature perturbation due to fluctuations in the waterfall field is second-order and the spectrum is expected to be of order 10^-54 on cosmological scales

  17. Decoupling local mechanics from large-scale structure in modular metamaterials

    Science.gov (United States)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  18. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)

  19. Large palindromes in the lambda phage genome are preserved in a rec/sup +/ host by inhibiting lambda DNA replication

    Energy Technology Data Exchange (ETDEWEB)

    Shurvinton, C.E.; Stahl, M.M.; Stahl, F.W.

    1987-03-01

    A large palindrome carried by phage lambda has been shown to prevent growth of the phage on a rec+ strain of Escherichia coli. The phage do form plaques on recBC sbcB strains, but the palindrome is not stable - deletions that either destroy the palindrome or diminish its size overgrow the original engineered palindrome-containing phage. The authors have prepared stocks of lambda carrying a palindrome that is 2 x 4200 base pairs long. Lambda phage were density labeled by UV induction of lysogens grown in minimal medium containing (13C)glucose and 15NH4Cl. These phage stocks are produced by induction of a lysogen in which the two halves of the palindrome are stored at opposite ends of the prophage and are of sufficient titer (10^9 phage per ml) to enable one-step growth experiments with replication-blocked phage. They find that the large palindrome as well as a lesser palindrome of 2 x 265 base pairs are recovered intact among particles carrying unreplicated chromosomes following such an infection of a rec+ host. They propose that DNA replication drives the extrusion of palindromic sequences in vivo, forming secondary structures that are substrates for the recBC and sbcB gene products.

  20. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  1. Extrasolar Planetary Imaging Coronagraph (EPIC)

    Science.gov (United States)

    Clampin, Mark

    2009-01-01

    The Extrasolar Planetary Imaging Coronagraph (EPIC) is a proposed NASA Exoplanet Probe mission to image and characterize extrasolar giant planets. EPIC will provide insights into the physical nature and architecture of a variety of planets in other solar systems. Initially, it will detect and characterize the atmospheres of planets identified by radial velocity surveys, determine their orbital inclinations and masses, and characterize the atmospheres of planets around A and F type stars, which cannot be found with RV techniques. It will also observe the inner spatial structure of exozodiacal disks. EPIC has a heliocentric Earth-trailing drift-away orbit, with a 5 year mission lifetime. The robust mission design is simple and flexible, ensuring mission success while minimizing cost and risk. The science payload consists of a heritage optical telescope assembly (OTA) and a visible nulling coronagraph (VNC) instrument. The instrument achieves a contrast ratio of 10^9 over a 5 arcsecond field-of-view with an unprecedented inner working angle of 0.13 arcseconds over the spectral range of 440-880 nm. The telescope is a 1.65 meter off-axis Cassegrain with an OTA wavefront error of λ/9, which when coupled to the VNC greatly reduces the requirements on the large scale optics.

  2. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

    The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore wind power.

  3. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

    Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computation complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna selection processing time of the proposed method does not increase with the number of selected antennas, whereas the computation complexity of the conventional exhaustive search method increases significantly when large-scale antennas are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
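
    As a rough illustration of the ingredients involved (norm-based antenna selection plus MVDR weights), and not the authors' optimization method, the following sketch selects the strongest antennas and forms an MVDR beamformer; the array sizes, channel model, and interference covariance are assumptions made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

N_TX, N_SEL = 64, 8                     # assumed: 64 candidate antennas, keep 8
# Flat Rayleigh channels for the desired user and one interferer (assumptions).
h = (rng.standard_normal(N_TX) + 1j * rng.standard_normal(N_TX)) / np.sqrt(2)
g = (rng.standard_normal(N_TX) + 1j * rng.standard_normal(N_TX)) / np.sqrt(2)

# Low-complexity (norm-based) selection: keep the N_SEL strongest channel entries,
# instead of exhaustively searching all C(64, 8) antenna subsets.
sel = np.argsort(np.abs(h))[-N_SEL:]
h_s, g_s = h[sel], g[sel]

# MVDR beamformer on the selected antennas: w = R^{-1} h / (h^H R^{-1} h),
# with R the interference-plus-noise covariance (unit noise power assumed).
R = np.eye(N_SEL, dtype=complex) + 10.0 * np.outer(g_s, g_s.conj())
Rinv_h = np.linalg.solve(R, h_s)
w = Rinv_h / (h_s.conj() @ Rinv_h)

sinr = np.abs(w.conj() @ h_s) ** 2 / np.real(w.conj() @ R @ w)
print("selected antennas:", np.sort(sel))
print(f"output SINR on selected subarray: {10 * np.log10(sinr):.1f} dB")
```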

  4. Biological evolution of replicator systems: towards a quantitative approach.

    Science.gov (United States)

    Martin, Osmel; Horvath, J E

    2013-04-01

    The aim of this work is to study, in a simple replicator chemical model, the relation between kinetic stability and entropy production under the action of external perturbations. We quantitatively explore the different paths leading to evolution in a toy model where two independent replicators compete for the same substrate. To do that, the scenario described originally by Pross (J Phys Org Chem 17:312-316, 2004) is revised and new criteria for defining kinetic stability are proposed. Our results suggest that fast replicator populations are continually favored by the effects of strong stochastic environmental fluctuations capable of determining the global population, these fluctuations being assumed to be the only evolutionary force acting. We demonstrate that the process is driven by strong perturbations only, and that population crashes may be useful proxies for these catastrophic environmental fluctuations. As expected, such behavior is particularly enhanced under very large scale perturbations, suggesting a likely dynamical footprint in the recovery patterns of new species after mass extinction events in the Earth's geological past. Furthermore, the hypothesis that natural selection always favors the faster processes may give theoretical support to different studies that claim the applicability of maximum principles like the Maximum Metabolic Flux (MMF) or Maximum Entropy Production Principle (MEPP), seen as the main goal of biological evolution.
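
    A minimal numerical sketch of a toy model of this kind, two replicators competing for one substrate with rare catastrophic crashes, is given below; the rate constants, crash probability, and crash severity are illustrative assumptions, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed toy parameters: replicator B copies itself faster than A.
k_a, k_b = 1.0, 1.5        # replication rate constants
d = 0.1                    # common decay rate
s_in = 1.0                 # substrate inflow
dt, steps = 0.01, 20000

a, b, s = 1.0, 1.0, 1.0
for _ in range(steps):
    # Simple mass-action kinetics: X + S -> 2X for each replicator X.
    da = (k_a * a * s - d * a) * dt
    db = (k_b * b * s - d * b) * dt
    ds = (s_in - k_a * a * s - k_b * b * s) * dt
    a, b, s = a + da, b + db, s + ds
    # Rare catastrophic perturbation: crash both populations to a small remnant.
    if rng.random() < 1e-3:
        a *= 0.01
        b *= 0.01

print(f"final populations: fast replicator = {b:.3f}, slow replicator = {a:.3f}")
```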

  5. PLANETARY NEBULAE IN FACE-ON SPIRAL GALAXIES. II. PLANETARY NEBULA SPECTROSCOPY

    International Nuclear Information System (INIS)

    Herrmann, Kimberly A.; Ciardullo, Robin

    2009-01-01

    As the second step in our investigation of the mass-to-light ratio of spiral disks, we present the results of a spectroscopic survey of planetary nebulae (PNe) in five nearby, low-inclination galaxies: IC 342, M74 (NGC 628), M83 (NGC 5236), M94 (NGC 4736), and M101 (NGC 5457). Using 50 setups of the WIYN/Hydra and Blanco/Hydra spectrographs, and 25 observations with the Hobby-Eberly Telescope's Medium Resolution Spectrograph, we determine the radial velocities of 99, 102, 162, 127, and 48 PNe, respectively, to a precision better than 15 km s⁻¹. Although the main purpose of this data set is to facilitate dynamical mass measurements throughout the inner and outer disks of large spiral galaxies, our spectroscopy has other uses as well. Here, we co-add these spectra to show that, to first order, the [O III] and Balmer line ratios of PNe vary little over the top ∼1.5 mag of the PN luminosity function. The only obvious spectral change occurs with [N II], which increases in strength as one proceeds down the luminosity function. We also show that typical [O III]-bright planetaries have E(B - V) ∼ 0.2 of circumstellar extinction, and that this value is virtually independent of [O III] luminosity. We discuss the implications this has for understanding the population of PN progenitors.

  6. Planetary gyre, time-dependent eddies, torsional waves, and equatorial jets at the Earth's core surface

    DEFF Research Database (Denmark)

    Gillet, N.; Jault, D.; Finlay, Chris

    2015-01-01

    We report a calculation of time-dependent quasi-geostrophic core flows for 1940–2010. Inverting recursively for an ensemble of solutions, we evaluate the main source of uncertainties, namely, the model errors arising from interactions between unresolved core surface motions and magnetic fields. Temporal correlations of these uncertainties are accounted for. The covariance matrix for the flow coefficients is also obtained recursively from the dispersion of an ensemble of solutions. Maps of the flow at the core surface show, upon a planetary-scale gyre, time-dependent large-scale eddies... between the magnetic field and subdecadal nonzonal motions within the fluid outer core. Both the zonal and the more energetic nonzonal interannual motions were particularly intense close to the equator (below 10∘ latitude) between 1995 and 2010. We revise down the amplitude of the decade fluctuations...

  7. Generation of Zonal Flow and Magnetic Field by Electromagnetic Planetary Waves in the Ionospheric E-Layer

    Science.gov (United States)

    Kahlon, L. Z.; Kaladze, T. D.

    2017-12-01

    We review the excitation of zonal flows and magnetic fields by coupled electromagnetic (EM) ULF planetary waves in the Earth's ionospheric E layer. Coupling of different planetary low-frequency electromagnetic waves under typical ionospheric E-layer conditions is revealed. Propagation of coupled internal-gravity-Alfvén (CIGA), coupled Rossby-Khantadze (CRK) and coupled Rossby-Alfvén-Khantadze (CRAK) waves is demonstrated and studied. A set of appropriate nonlinear equations describing the interaction of such waves with a sheared zonal flow is derived. We conclude that short-wavelength turbulence of such coupled waves is unstable with respect to the excitation of low-frequency, large-scale perturbations of the sheared zonal flow and sheared magnetic field. The mechanism of this nonlinear instability depends on the parametric excitation of triple finite-amplitude coupled waves, leading to an inverse energy cascade towards longer wavelengths. The possibility of generation of an intense mean magnetic field is shown. The growth rates obtained are discussed for each of the considered coupled waves.

  8. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s² ≈ 10⁻⁶ c² and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)⁴. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc⁻¹.

  9. Temporal flexibility and careers: The role of large-scale organizations for physicians

    OpenAIRE

    Forrest Briscoe

    2006-01-01

    This study investigates how employment in large-scale organizations affects the work lives of practicing physicians. Well-established theory associates larger organizations with bureaucratic constraint, loss of workplace control, and dissatisfaction, but this author finds that large scale is also associated with greater schedule and career flexibility. Ironically, the bureaucratic p...

  10. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large scale features of the turbulent channel flow is obtained using two-point correlations of the velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between the velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of large scale features of turbulence and the temperature field.
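
    Snapshot POD of this kind is commonly computed from an SVD of the mean-subtracted snapshot matrix; the sketch below does this on synthetic data standing in for the DNS fields, so the array shapes and the two-mode signal are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "snapshots": n_t time samples of a field with n_x points,
# built from two large-scale modes plus noise (stand-in for DNS data).
n_x, n_t = 256, 400
x = np.linspace(0.0, 2.0 * np.pi, n_x)
t = np.linspace(0.0, 10.0, n_t)
field = (np.outer(np.sin(x), np.cos(2.0 * t))
         + 0.5 * np.outer(np.sin(3.0 * x), np.sin(5.0 * t))
         + 0.05 * rng.standard_normal((n_x, n_t)))

# Proper orthogonal decomposition: subtract the temporal mean, then SVD.
mean = field.mean(axis=1, keepdims=True)
U, s, Vt = np.linalg.svd(field - mean, full_matrices=False)

energy = s**2 / np.sum(s**2)
print("energy captured by first 4 POD modes:", np.round(energy[:4], 3))
# U[:, k] is the k-th spatial mode; s[k] * Vt[k] is its temporal coefficient.
```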

  11. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF₆, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
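
    The basic operation behind such comparisons, conditioning a second-order structure function on the instantaneous large-scale velocity, can be sketched for a one-dimensional synthetic record as follows; the separation, the running-mean definition of the large-scale velocity, and the bin edges are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic 1D velocity signal standing in for a measured time series.
n = 200_000
u = np.cumsum(rng.standard_normal(n)) * 0.01      # correlated, random-walk-like
u -= u.mean()

r = 50                                             # separation in samples (assumed)
du2 = (u[r:] - u[:-r]) ** 2                        # squared longitudinal increments

# "Large-scale" velocity: running mean over a window much larger than r.
win = 2000
kernel = np.ones(win) / win
u_ls = np.convolve(u, kernel, mode="same")[:-r]

# Conditional second-order structure function in bins of u_ls (in std units).
sigma = u_ls.std()
bins = np.array([-3, -2, -1, 0, 1, 2, 3]) * sigma
idx = np.digitize(u_ls, bins)
for b in range(1, len(bins)):
    mask = idx == b
    if mask.any():
        print(f"u_ls in [{bins[b-1]/sigma:+.0f},{bins[b]/sigma:+.0f}) sigma: "
              f"D2 = {du2[mask].mean():.4f}  (n={mask.sum()})")
```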

  12. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    NARCIS (Netherlands)

    Loon, van A.F.; Huijgevoort, van M.H.J.; Lanen, van H.A.J.

    2012-01-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well large-scale models simulate the propagation from meteorological to hydrological drought.

  13. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, rail-ways and other civil engineering (water)works is tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  14. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

    Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation...
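
    The general idea of dual decomposition (though not necessarily the paper's exact formulation) can be sketched in a few lines: each unit solves a small local problem given a price, and the coordinator updates the price until total consumption meets the target; the quadratic local costs, bounds, and step size below are assumptions.

```python
import numpy as np

# Assumed setup: N units, each with local cost 0.5*a_i*(p_i - p_ref_i)^2
# and bounds 0 <= p_i <= p_max_i; the grid-level constraint is sum(p) = target.
a = np.array([1.0, 2.0, 0.5, 1.5])
p_ref = np.array([2.0, 1.0, 3.0, 1.5])
p_max = np.array([4.0, 3.0, 5.0, 4.0])
target = 5.0

lam = 0.0                                   # dual variable (price)
for _ in range(200):
    # Each unit minimizes its local cost plus lam * p_i; closed form here.
    p = np.clip(p_ref - lam / a, 0.0, p_max)
    # Coordinator: subgradient ascent step on the coupling constraint sum(p) = target.
    lam += 0.2 * (p.sum() - target)

print("unit set-points:", np.round(p, 3), " total:", round(float(p.sum()), 3))
```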

  15. Generation of large-scale vortices in compressible helical turbulence

    International Nuclear Information System (INIS)

    Chkhetiani, O.G.; Gvaramadze, V.V.

    1989-01-01

    We consider the generation of large-scale vortices in a compressible, self-gravitating turbulent medium. A closed equation describing the evolution of large-scale vortices in helical turbulence with finite correlation time is obtained. This equation has a form similar to the hydromagnetic dynamo equation, which allows us to call the vortex generation effect a vortex dynamo. It is possible that essentially the same mechanism is responsible both for the amplification and maintenance of density waves and of magnetic fields in the gaseous disks of spiral galaxies. (author). 29 refs

  16. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). These observations have led to a great deal of consensus on the cosmological model, so-called LambdaCDM, and tight constraints on the cosmological parameters comprising the model. On the other hand, the advancement in cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, which is quantified by its direction and amplitude. A large dipolar modulation has been measured in the CMB, mainly originating from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a large number of well-identified objects. In this thesis, we explore the measurement of dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE sources matched with 2MASS. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for detection of the kinematic dipole in future surveys.
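
    Estimating a dipolar modulation in number counts typically amounts to fitting N(n̂) ∝ 1 + d·n̂ over the sky; the sketch below does this by linear least squares on synthetic counts at random sky positions (the pixelization, injected amplitude, and noise level are assumptions, not the thesis data).

```python
import numpy as np

rng = np.random.default_rng(4)

# Random sky directions standing in for pixel centres (no healpy dependency).
n_pix = 3000
z = rng.uniform(-1.0, 1.0, n_pix)
phi = rng.uniform(0.0, 2.0 * np.pi, n_pix)
nhat = np.column_stack([np.sqrt(1 - z**2) * np.cos(phi),
                        np.sqrt(1 - z**2) * np.sin(phi), z])

# Synthetic counts with an injected dipole of amplitude 0.02 toward +z.
d_true = np.array([0.0, 0.0, 0.02])
mean_counts = 500.0
counts = rng.poisson(mean_counts * (1.0 + nhat @ d_true))

# Least-squares fit of counts = N0 * (1 + d . nhat): design matrix [1, nx, ny, nz].
A = np.column_stack([np.ones(n_pix), nhat])
coef, *_ = np.linalg.lstsq(A, counts.astype(float), rcond=None)
N0, d_fit = coef[0], coef[1:] / coef[0]
print("recovered dipole amplitude:", round(float(np.linalg.norm(d_fit)), 4),
      "direction:", np.round(d_fit / np.linalg.norm(d_fit), 2))
```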

  17. Impact of large-scale tides on cosmological distortions via redshift-space power spectrum

    Science.gov (United States)

    Akitsu, Kazuyuki; Takada, Masahiro

    2018-03-01

    Although large-scale perturbations beyond a finite-volume survey region are not direct observables, they affect measurements of clustering statistics of small-scale (subsurvey) perturbations in large-scale structure, compared with the ensemble average, via the mode-coupling effect. In this paper we show that a large-scale tide induced by scalar perturbations causes apparent anisotropic distortions in the redshift-space power spectrum of galaxies in a way that depends on the alignment between the tide, the wave vector of the small-scale modes, and the line-of-sight direction. Using the perturbation theory of structure formation, we derive a response function of the redshift-space power spectrum to the large-scale tide. We then investigate the impact of the large-scale tide on the estimation of cosmological distances and the redshift-space distortion parameter via the measured redshift-space power spectrum for a hypothetical large-volume survey, based on the Fisher matrix formalism. To do this, we treat the large-scale tide as a signal, rather than an additional source of statistical error, and show that the degradation in parameter precision is recovered if we can employ the prior on the rms amplitude expected for the standard cold dark matter (CDM) model. We also discuss whether the large-scale tide can be constrained to an accuracy better than the CDM prediction if the effects up to larger wave numbers in the nonlinear regime can be included.
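
    The Fisher-matrix step referred to can be sketched generically: F_ij = (∂μ/∂θ_i)ᵀ C⁻¹ (∂μ/∂θ_j) with numerical derivatives of a model observable; the toy band-power model, parameter values, and errors below are assumptions, not the paper's survey setup.

```python
import numpy as np

# Toy observable: band powers depending on an amplitude A and a distortion
# parameter beta, purely for illustration of the Fisher forecast machinery.
k = np.linspace(0.02, 0.2, 20)

def model(theta):
    A, beta = theta
    return A * k ** -1.5 * (1.0 + beta * k)

theta0 = np.array([1.0, 0.4])
cov = np.diag((0.05 * model(theta0)) ** 2)     # assumed 5% diagonal errors

def fisher(theta, eps=1e-4):
    """Fisher matrix via central finite differences of the model."""
    derivs = []
    for i in range(len(theta)):
        tp, tm = theta.copy(), theta.copy()
        tp[i] += eps
        tm[i] -= eps
        derivs.append((model(tp) - model(tm)) / (2 * eps))
    D = np.array(derivs)
    return D @ np.linalg.solve(cov, D.T)

F = fisher(theta0)
errors = np.sqrt(np.diag(np.linalg.inv(F)))
print("1-sigma forecast errors on (A, beta):", np.round(errors, 4))
```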

  18. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning, and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles are represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  19. Planetary-Scale Geospatial Data Analysis Techniques in Google's Earth Engine Platform (Invited)

    Science.gov (United States)

    Hancher, M.

    2013-12-01

    Geoscientists have more and more access to new tools for large-scale computing. With any tool, some tasks are easy and other tasks hard. It is natural to look to new computing platforms to increase the scale and efficiency of existing techniques, but there is a more exciting opportunity to discover and develop a new vocabulary of fundamental analysis idioms that are made easy and effective by these new tools. Google's Earth Engine platform is a cloud computing environment for earth data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog includes a nearly complete archive of scenes from Landsat 4, 5, 7, and 8 that have been processed by the USGS, as well as a wide variety of other remotely-sensed and ancillary data products. Earth Engine supports a just-in-time computation model that enables real-time preview during algorithm development and debugging as well as during experimental data analysis and open-ended data exploration. Data processing operations are performed in parallel across many computers in Google's datacenters. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, resampling, and associating image metadata with pixel data. Early applications of Earth Engine have included the development of Google's global cloud-free fifteen-meter base map and global multi-decadal time-lapse animations, as well as numerous large and small experimental analyses by scientists from a range of academic, government, and non-governmental institutions, working in a wide variety of application areas including forestry, agriculture, urban mapping, and species habitat modeling. Patterns in the successes and failures of these early efforts have begun to emerge, sketching the outlines of a new set of simple and effective approaches to geospatial data analysis.
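
    For flavour, a minimal example of this deferred, server-side computation model using the Earth Engine Python API might look like the following; the collection ID, band, date range, and region are assumptions for illustration, and running it requires an authenticated Earth Engine account.

```python
import ee

ee.Initialize()  # assumes prior ee.Authenticate() with a valid account

# Assumed inputs: a Landsat 8 TOA collection, one year of scenes, a small region.
region = ee.Geometry.Rectangle([-122.6, 37.6, -122.3, 37.9])
collection = (ee.ImageCollection('LANDSAT/LC08/C02/T1_TOA')
              .filterDate('2020-01-01', '2021-01-01')
              .filterBounds(region))

# Per-pixel median over the year; nothing is computed until a result is requested.
composite = collection.median()

# Request a small statistic to trigger the (server-side, parallel) evaluation.
stats = composite.select('B4').reduceRegion(
    reducer=ee.Reducer.mean(), geometry=region, scale=500, maxPixels=1e8)
print(stats.getInfo())
```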

  20. The Hamburg large scale geostrophic ocean general circulation model. Cycle 1

    International Nuclear Information System (INIS)

    Maier-Reimer, E.; Mikolajewicz, U.

    1992-02-01

    The rationale for the Large Scale Geostrophic ocean circulation model (LSG-OGCM) is based on the observations that for a large scale ocean circulation model designed for climate studies, the relevant characteristic spatial scales are large compared with the internal Rossby radius throughout most of the ocean, while the characteristic time scales are large compared with the periods of gravity modes and barotropic Rossby wave modes. In the present version of the model, the fast modes have been filtered out by a conventional technique of integrating the full primitive equations, including all terms except the nonlinear advection of momentum, by an implicit time integration method. The free surface is also treated prognostically, without invoking a rigid lid approximation. The numerical scheme is unconditionally stable and has the additional advantage that it can be applied uniformly to the entire globe, including the equatorial and coastal current regions. (orig.)
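
    The unconditional stability gained from implicit time integration can be illustrated on a much simpler problem: the sketch below takes backward-Euler steps of 1D diffusion at fifty times the explicit stability limit (the grid, diffusivity, and step size are illustrative assumptions, not LSG-OGCM settings).

```python
import numpy as np

# 1D diffusion u_t = kappa * u_xx on a periodic domain, backward Euler in time.
n, kappa, dx = 200, 1.0, 1.0
dt = 50.0 * dx**2 / (2 * kappa)      # 50x the explicit stability limit

x = np.arange(n) * dx
u = np.exp(-((x - n * dx / 2) / (5 * dx)) ** 2)   # initial Gaussian bump

# Build (I - dt*kappa*L) with L the periodic second-difference operator.
L = (-2 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
     + np.eye(n, k=n - 1) + np.eye(n, k=-(n - 1))) / dx**2
A = np.eye(n) - dt * kappa * L

for step in range(10):
    u = np.linalg.solve(A, u)        # implicit step: stable for any dt, just damped

print("max |u| after 10 large implicit steps:", round(float(np.abs(u).max()), 6))
```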

  1. Women in Planetary Science: Career Resources and e-Mentoring on Blogs, Twitter, Facebook, Google+, and Pinterest

    Science.gov (United States)

    Niebur, S. M.; Singer, K.; Gardner-Vandy, K.

    2012-08-01

    Fifty-one interviews with women in planetary science are now available as an e-mentoring and teaching resource on WomeninPlanetaryScience.com. Each scientist was nominated and interviewed by a fellow member of the planetary science community, and each gladly shared her advice for advancement in the field. Women in Planetary Science was founded in 2008 to connect communities of current and prospective scientists, to promote proposal and award opportunities, and to stimulate discussion in the planetary science community at large. Regular articles, or posts, by nearly a dozen collaborators highlight a range of current issues for women in this field. These articles are promoted by collaborators on Twitter, Facebook, and Google+ and shared again by the collaborators' contacts, reaching a significantly wider audience. The group's latest project, on Pinterest, is a crowd-sourced photo gallery of more than 350 inspiring women in planetary science; each photo links to the scientist's CV. The interviews, the essays, and the photo gallery are available online as resources for prospective scientists, planetary scientists, parents, and educators.

  2. Soft X-ray Emission from Large-Scale Galactic Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S.; O'Dea, C.; Veilleux, S.

    1998-01-01

    Kiloparsec-scale soft X-ray nebulae extend along the galaxy minor axes in several Seyfert galaxies, including NGC 2992, NGC 4388 and NGC 5506. In these three galaxies, the extended X-ray emission observed in ROSAT HRI images has 0.2-2.4 keV X-ray luminosities of 0.4-3.5 × 10⁴⁰ erg s⁻¹. The X-ray nebulae are roughly co-spatial with the large-scale radio emission, suggesting that both are produced by large-scale galactic outflows. Assuming pressure balance between the radio and X-ray plasmas, the X-ray filling factor is ≳ 10⁴ times as large as the radio plasma filling factor, suggesting that large-scale outflows in Seyfert galaxies are predominantly winds of thermal X-ray emitting gas. We favor an interpretation in which large-scale outflows originate as AGN-driven jets that entrain and heat gas on kpc scales as they make their way out of the galaxy. AGN- and starburst-driven winds are also possible explanations if the winds are oriented along the rotation axis of the galaxy disk. Since large-scale outflows are present in at least 50 percent of Seyfert galaxies, the soft X-ray emission from the outflowing gas may, in many cases, explain the "soft excess" X-ray feature observed below 2 keV in X-ray spectra of many Seyfert 2 galaxies.

  3. XLID-causing mutations and associated genes challenged in light of data from large-scale human exome sequencing.

    Science.gov (United States)

    Piton, Amélie; Redin, Claire; Mandel, Jean-Louis

    2013-08-08

    Because of the unbalanced sex ratio (1.3-1.4 to 1) observed in intellectual disability (ID) and the identification of large ID-affected families showing X-linked segregation, much attention has been focused on the genetics of X-linked ID (XLID). Mutations causing monogenic XLID have now been reported in over 100 genes, most of which are commonly included in XLID diagnostic gene panels. Nonetheless, the boundary between true mutations and rare non-disease-causing variants often remains elusive. The sequencing of a large number of control X chromosomes, required for avoiding false-positive results, was not systematically possible in the past. Such information is now available thanks to large-scale sequencing projects such as the National Heart, Lung, and Blood Institute (NHLBI) Exome Sequencing Project, which provides variation information on 10,563 X chromosomes from the general population. We used this NHLBI cohort to systematically reassess the implication of 106 genes proposed to be involved in monogenic forms of XLID. We particularly question the implication in XLID of ten of them (AGTR2, MAGT1, ZNF674, SRPX2, ATP6AP2, ARHGEF6, NXF5, ZCCHC12, ZNF41, and ZNF81), in which truncating variants or previously published mutations are observed at a relatively high frequency within this cohort. We also highlight 15 other genes (CCDC22, CLIC2, CNKSR2, FRMPD4, HCFC1, IGBP1, KIAA2022, KLF8, MAOA, NAA10, NLGN3, RPL10, SHROOM4, ZDHHC15, and ZNF261) for which replication studies are warranted. We propose that similar reassessment of reported mutations (and genes) with the use of data from large-scale human exome sequencing would be relevant for a wide range of other genetic diseases. Copyright © 2013 The American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  4. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design-problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full scale team involving the development and operations sides of the company-two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp

  5. Optorsim: A Grid Simulator for Studying Dynamic Data Replication Strategies

    CERN Document Server

    Bell, William H; Millar, A Paul; Capozza, Luigi; Stockinger, Kurt; Zini, Floriano

    2003-01-01

    Computational grids process large, computationally intensive problems on small data sets. In contrast, data grids process large computational problems that in turn require evaluating, mining and producing large amounts of data. Replication, creating geographically disparate identical copies of data, is regarded as one of the major optimization techniques for reducing data access costs. In this paper, several replication algorithms are discussed. These algorithms were studied using the Grid simulator: OptorSim. OptorSim provides a modular framework within which optimization strategies can be studied under different Grid configurations. The goal is to explore the stability and transient behaviour of selected optimization techniques. We detail the design and implementation of OptorSim and analyze various replication algorithms based on different Grid workloads.
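
    A toy version of one such strategy (replicate a file locally on access, evict the least-recently-used replica when storage is full) can be sketched as follows; the site capacity, file population, costs, and access pattern are assumptions invented for the example, not OptorSim's configuration.

```python
import random
from collections import OrderedDict

random.seed(5)

CAPACITY = 20          # replicas a site can hold (assumed)
N_FILES = 200          # distinct files in the grid (assumed)
REMOTE_COST, LOCAL_COST = 100.0, 1.0   # arbitrary access-cost units

replicas = OrderedDict()               # local replica store, ordered by recency
total_cost = 0.0

for access in range(10_000):
    # Zipf-like workload: small file IDs are requested far more often.
    f = min(int(random.paretovariate(1.2)), N_FILES)
    if f in replicas:
        total_cost += LOCAL_COST
        replicas.move_to_end(f)        # refresh recency
    else:
        total_cost += REMOTE_COST      # fetch from a remote site...
        replicas[f] = True             # ...and replicate locally
        if len(replicas) > CAPACITY:
            replicas.popitem(last=False)   # evict least-recently-used replica

print(f"mean access cost with LRU replication: {total_cost / 10_000:.1f}")
```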

  6. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos

  7. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  8. The Solar Connections Observatory for Planetary Environments

    Science.gov (United States)

    Oliversen, Ronald J.; Harris, Walter M.; Oegerle, William R. (Technical Monitor)

    2002-01-01

    The NASA Sun-Earth Connection theme roadmap calls for comparative study of how the planets, comets, and local interstellar medium (LISM) interact with the Sun and respond to solar variability. Through such a study we advance our understanding of basic physical plasma and gas dynamic processes, thus increasing our predictive capabilities for the terrestrial, planetary, and interplanetary environments where future remote and human exploration will occur. Because the other planets have lacked study initiatives comparable to the terrestrial ITM, LWS, and EOS programs, our understanding of the upper atmospheres and near space environments on these worlds is far less detailed than our knowledge of the Earth. To close this gap we propose a mission to study all of the solar interacting bodies in our planetary system out to the heliopause with a single remote sensing space observatory, the Solar Connections Observatory for Planetary Environments (SCOPE). SCOPE consists of a binocular EUV/FUV telescope operating from a remote, driftaway orbit that provides sub-arcsecond imaging and broadband medium resolution spectro-imaging over the 55-290 nm bandpass, and high resolution (R > 10⁵) H Lyman-α emission line profile measurements of small scale planetary and wide field diffuse solar system structures. A key to the SCOPE approach is to include Earth as a primary science target. From its remote vantage point SCOPE will be able to observe auroral emission to and beyond the rotational pole. The other planets and comets will be monitored in long duration campaigns centered when possible on solar opposition, when interleaved terrestrial-planet observations can be used to directly compare the response of both worlds to the same solar wind stream and UV radiation field. Using a combination of observations and MHD models, SCOPE will isolate the different controlling parameters in each planet system and gain insight into the underlying physical processes that define the

  9. Evaluation of Large-scale Public Sector Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Hanne Foss

    2017-01-01

    and more delimited policy areas take place. In our analysis we apply four governance perspectives (rational-instrumental, rational-interest based, institutional-cultural and a chaos perspective) in a comparative analysis of the evaluations of two large-scale public sector reforms in Denmark and Norway. We...

  10. Evaluating the use of local ecological knowledge to monitor hunted tropical-forest wildlife over large spatial scales

    Directory of Open Access Journals (Sweden)

    Luke Parry

    2015-09-01

    Monitoring the distribution and abundance of hunted wildlife is critical to achieving sustainable resource use, yet adequate data are sparse for most tropical regions. Conventional methods for monitoring hunted forest-vertebrate species require intensive in situ survey effort, which severely constrains spatial and temporal replication. Integrating local ecological knowledge (LEK) into monitoring and management is appealing because it can be cost-effective, enhance community participation, and provide novel insights into sustainable resource use. We develop a technique to monitor population depletion of hunted forest wildlife in the Brazilian Amazon, based on the local ecological knowledge of rural hunters. We performed rapid interview surveys to estimate the landscape-scale depletion of ten large-bodied vertebrate species around 161 Amazonian riverine settlements. We assessed the explanatory and predictive power of settlement and landscape characteristics and were able to develop robust estimates of local faunal depletion. By identifying species-specific drivers of depletion and using secondary data on human population density, land form, and physical accessibility, we then estimated landscape- and regional-scale depletion. White-lipped peccary (Tayassu pecari), for example, were estimated to be absent from 17% of their putative range in Brazil's largest state (Amazonas), despite 98% of the original forest cover remaining intact. We found evidence that bushmeat consumption in small urban centers has far-reaching impacts on some forest species, including severe depletion well over 100 km from urban centers. We conclude that LEK-based approaches require further field validation, but have significant potential for community-based participatory monitoring as well as cost-effective, large-scale monitoring of threatened forest species.

  11. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example sharing cabs by grouping "closeby" cab requests, thus minimizing transportation cost and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...
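
    The core operation, grouping "closeby" requests, can be illustrated with a simple greedy pass over requests sorted by time; the distance threshold, time window, and cab capacity below are assumptions for the example rather than the published algorithms.

```python
import math

# Each request: (pickup_time_minutes, x_km, y_km)
requests = [(0, 0.0, 0.0), (2, 0.4, 0.1), (3, 5.0, 5.0),
            (4, 0.2, 0.3), (9, 5.2, 4.9), (15, 1.0, 1.0)]

MAX_DIST_KM = 1.0      # pickups within this distance may share a cab (assumed)
MAX_WAIT_MIN = 5       # ...and within this time window (assumed)
CAB_CAPACITY = 3

groups = []
for req in sorted(requests):
    t, x, y = req
    placed = False
    for g in groups:
        t0, x0, y0 = g[0]
        if (len(g) < CAB_CAPACITY and t - t0 <= MAX_WAIT_MIN
                and math.hypot(x - x0, y - y0) <= MAX_DIST_KM):
            g.append(req)
            placed = True
            break
    if not placed:
        groups.append([req])

print(f"{len(requests)} requests grouped into {len(groups)} cab trips")
for g in groups:
    print("  trip:", g)
```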

  12. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
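
    The array structure these models exploit can be shown directly: for a two-way array, the tensor-product design matrix X2 ⊗ X1 never needs to be formed, since (X2 ⊗ X1) vec(B) = vec(X1 B X2ᵀ); the sizes below are assumptions chosen to keep the demonstration small.

```python
import numpy as np

rng = np.random.default_rng(6)

# Marginal design matrices of a 2-way generalized linear array model (assumed sizes).
n1, p1, n2, p2 = 100, 10, 120, 8
X1 = rng.standard_normal((n1, p1))
X2 = rng.standard_normal((n2, p2))
B = rng.standard_normal((p1, p2))               # coefficient array

# Naive approach: materialize the full Kronecker design matrix (n1*n2 x p1*p2).
X_full = np.kron(X2, X1)
eta_full = X_full @ B.T.reshape(-1)             # B.T.reshape(-1) is the column-major vec(B)

# Design-matrix-free approach: the same linear predictor via two small products,
# using (X2 kron X1) vec(B) = vec(X1 @ B @ X2.T).
eta_array = (X1 @ B @ X2.T).T.reshape(-1)

print("max |difference|:", float(np.abs(eta_full - eta_array).max()))
print("entries stored: full =", X_full.size, " vs marginal =", X1.size + X2.size)
```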

  13. Large-scale coastal impact induced by a catastrophic storm

    DEFF Research Database (Denmark)

    Fruergaard, Mikkel; Andersen, Thorbjørn Joest; Johannessen, Peter N

    breaching. Our results demonstrate that violent, millennial-scale storms can trigger significant large-scale and long-term changes on barrier coasts, and that coastal changes assumed to take place over centuries or even millennia may occur in association with a single extreme storm event....

  14. Large-eddy simulation with accurate implicit subgrid-scale diffusion

    NARCIS (Netherlands)

    B. Koren (Barry); C. Beets

    1996-01-01

    A method for large-eddy simulation is presented that does not use an explicit subgrid-scale diffusion term. Subgrid-scale effects are modelled implicitly through an appropriate monotone (in the sense of Spekreijse 1987) discretization method for the advective terms. Special attention is

  15. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  16. Macroecological factors explain large-scale spatial population patterns of ancient agriculturalists

    NARCIS (Netherlands)

    Xu, C.; Chen, B.; Abades, S.; Reino, L.; Teng, S.; Ljungqvist, F.C.; Huang, Z.Y.X.; Liu, X.

    2015-01-01

    Aim: It has been well demonstrated that the large-scale distribution patterns of numerous species are driven by similar macroecological factors. However, understanding of this topic remains limited when applied to our own species. Here we take a large-scale look at ancient agriculturalist

  17. Large Scale Investments in Infrastructure : Competing Policy regimes to Control Connections

    NARCIS (Netherlands)

    Otsuki, K.; Read, M.L.; Zoomers, E.B.

    2016-01-01

    This paper proposes to analyse implications of large-scale investments in physical infrastructure for social and environmental justice. While case studies on the global land rush and climate change have advanced our understanding of how large-scale investments in land, forests and water affect

  18. Rotation invariant fast features for large-scale recognition

    Science.gov (United States)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF [1] while producing large-scale retrieval results that are comparable to SIFT [2]. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  19. Interactive investigations into planetary interiors

    Science.gov (United States)

    Rose, I.

    2015-12-01

    Many processes in Earth science are difficult to observe or visualize due to the large timescales and lengthscales over which they operate. The dynamics of planetary mantles are particularly challenging as we cannot even look at the rocks involved. As a result, much teaching material on mantle dynamics relies on static images and cartoons, many of which are decades old. Recent improvements in computing power and technology (largely driven by game and web development) have allowed for advances in real-time physics simulations and visualizations, but these have been slow to affect Earth science education. Here I demonstrate a teaching tool for mantle convection and seismology which solves the equations for conservation of mass, momentum, and energy in real time, allowing users to make changes to the simulation and immediately see the effects. The user can ask and answer questions about what happens when they add heat in one place, or take it away from another place, or increase the temperature at the base of the mantle. They can also pause the simulation, and while it is paused, create and visualize seismic waves traveling through the mantle. These allow for investigations into and discussions about plate tectonics, earthquakes, hot spot volcanism, and planetary cooling. The simulation is rendered to the screen using OpenGL, and is cross-platform. It can be run as a native application for maximum performance, but it can also be embedded in a web browser for easy deployment and portability.
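
    The real-time core of such a tool is typically a cheap per-frame update of the temperature field; a minimal diffusion-only sketch (no flow solver, all parameters assumed, and not the tool's actual code) looks like this.

```python
import numpy as np

# Temperature field on a coarse grid; one explicit diffusion update per "frame".
NX, NY = 128, 64
T = np.zeros((NY, NX))
T[-1, :] = 1.0                      # hot bottom boundary (core-mantle boundary)
T[0, :] = 0.0                       # cold top boundary (surface)

KAPPA, DX, DT = 1.0, 1.0, 0.2       # DT below the explicit limit DX^2 / (4*KAPPA)

def step(T):
    """One forward-Euler diffusion step; boundary rows held at fixed temperature."""
    lap = (np.roll(T, 1, axis=0) + np.roll(T, -1, axis=0)
           + np.roll(T, 1, axis=1) + np.roll(T, -1, axis=1) - 4 * T) / DX**2
    T_new = T + DT * KAPPA * lap
    T_new[0, :], T_new[-1, :] = 0.0, 1.0    # re-impose boundary temperatures
    return T_new

for frame in range(500):            # in the real tool this runs once per rendered frame
    T = step(T)

print("mean interior temperature after 500 frames:", round(float(T[1:-1].mean()), 4))
```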

  20. Hot-spot application of biocontrol agents to replace pesticides in large scale commercial rose farms in Kenya

    DEFF Research Database (Denmark)

    Gacheri, Catherine; Kigen, Thomas; Sigsgaard, Lene

    2015-01-01

    Rose (Rosa hybrida L.) is the most important ornamental crop in Kenya, with huge investments in pest management. We provide the first full-scale, replicated experiment comparing cost and yield of conventional two-spotted spider mite (Tetranychus urticae Koch) control with hot-spot applications of...

  1. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  2. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  3. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well do large-scale models simulate the propagation from meteorological to hydrological drought? To answer this question, we evaluated the simulation of drought propagation in an ensemble mean of ten large-scale models, both land-surface models and global hydrological models, that participated in the model intercomparison project of WATCH (WaterMIP). For a selection of case study areas, we studied drought characteristics (number of droughts, duration, severity), drought propagation features (pooling, attenuation, lag, lengthening), and hydrological drought typology (classical rainfall deficit drought, rain-to-snow-season drought, wet-to-dry-season drought, cold snow season drought, warm snow season drought, composite drought).

    Drought characteristics simulated by large-scale models clearly reflected drought propagation; i.e. drought events became fewer and longer when moving through the hydrological cycle. However, more differentiation was expected between fast and slowly responding systems, with slowly responding systems having fewer and longer droughts in runoff than fast responding systems. This was not found using large-scale models. Drought propagation features were poorly reproduced by the large-scale models, because runoff reacted immediately to precipitation, in all case study areas. This fast reaction to precipitation, even in cold climates in winter and in semi-arid climates in summer, also greatly influenced the hydrological drought typology as identified by the large-scale models. In general, the large-scale models had the correct representation of drought types, but the percentages of occurrence had some important mismatches, e.g. an overestimation of classical rainfall deficit droughts, and an

  4. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving the large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products, along with ground observations, provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin plate spline smoothing algorithms (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA shows appealing spatial coherence. In addition to the product comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.
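
    The basic verification step, comparing a gridded product against gauges at matched locations, reduces to a few summary statistics; the sketch below assumes the gridded values have already been interpolated to the gauge locations, and all numbers are synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic daily precipitation at 50 gauges (mm), and a "gridded product"
# that is biased high and noisier (stand-ins for CaPA, ANUSPLIN, etc.).
gauge = rng.gamma(shape=0.5, scale=4.0, size=(365, 50))
product = np.clip(1.1 * gauge + rng.normal(0.0, 1.0, gauge.shape), 0.0, None)

diff = product - gauge
bias = diff.mean()
rmse = np.sqrt((diff ** 2).mean())
corr = np.corrcoef(product.ravel(), gauge.ravel())[0, 1]

print(f"bias = {bias:.2f} mm/day, RMSE = {rmse:.2f} mm/day, r = {corr:.3f}")
```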

  5. Structural properties of replication origins in yeast DNA sequences

    International Nuclear Information System (INIS)

    Cao Xiaoqin; Zeng Jia; Yan Hong

    2008-01-01

    Sequence-dependent DNA flexibility is an important structural property originating from the DNA 3D structure. In this paper, we investigate the DNA flexibility of the budding yeast (S. cerevisiae) replication origins on a genome-wide scale using flexibility parameters from two different models, the trinucleotide and the tetranucleotide models. Based on analyzing average flexibility profiles of 270 replication origins, we find that yeast replication origins are significantly rigid compared with their surrounding genomic regions. To further understand the highly distinctive property of replication origins, we compare the flexibility patterns between yeast replication origins and promoters, and find that they both contain significantly rigid DNAs. Our results suggest that DNA flexibility is an important factor that helps proteins recognize and bind the target sites in order to initiate DNA replication. Inspired by the role of the rigid region in promoters, we speculate that the rigid replication origins may facilitate binding of proteins, including the origin recognition complex (ORC), Cdc6, Cdt1 and the MCM2-7 complex
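
    Computationally, a trinucleotide-model flexibility profile is a sliding-window average of a per-triplet parameter along the sequence; the parameter values in the sketch below are made-up placeholders (a real model supplies a value for every distinct tri- or tetranucleotide).

```python
import numpy as np

# Placeholder flexibility parameters for a few trinucleotides (illustrative only;
# a real trinucleotide model assigns a value to every possible triplet).
FLEX = {"AAA": 0.2, "AAT": 0.3, "ATA": 0.9, "TAT": 0.9, "GCC": 0.4,
        "CGC": 0.5, "GGC": 0.4, "TTT": 0.2, "ACG": 0.6, "CGA": 0.6}
DEFAULT = 0.5

def flexibility_profile(seq, window=50):
    """Average per-triplet flexibility in a sliding window along the sequence."""
    per_step = np.array([FLEX.get(seq[i:i + 3], DEFAULT)
                         for i in range(len(seq) - 2)])
    kernel = np.ones(window) / window
    return np.convolve(per_step, kernel, mode="valid")

seq = "ATATATAT" * 20 + "GCCGGCGCC" * 20        # toy "origin-like" then GC-rich region
profile = flexibility_profile(seq)
print("profile length:", len(profile),
      " min/max flexibility:", round(profile.min(), 2), round(profile.max(), 2))
```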

  6. Toward Instructional Leadership: Principals' Perceptions of Large-Scale Assessment in Schools

    Science.gov (United States)

    Prytula, Michelle; Noonan, Brian; Hellsten, Laurie

    2013-01-01

    This paper describes a study of the perceptions that Saskatchewan school principals have regarding large-scale assessment reform and their perceptions of how assessment reform has affected their roles as principals. The findings revealed that large-scale assessments, especially provincial assessments, have affected the principal in Saskatchewan…

  7. A large scale field experiment in the Amazon basin (LAMBADA/BATERISTA)

    NARCIS (Netherlands)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C.

    1995-01-01

    A description is given of a large-scale field experiment planned in the Amazon basin, aimed at assessing the large-scale balances of energy, water and carbon dioxide. The embedding of this experiment in global change programmes is described, viz. the Biospheric Aspects of the Hydrological Cycle

  8. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km²), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input into the catchment models. A long-term simulation of this combined system enables very long discharge series to be derived at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observational data at 528 stations covering not only all of Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km². 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation, this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large scale approach overcomes the several
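
    Deriving flood quantiles from such a long continuous simulation is, at its simplest, a matter of extracting annual maxima and reading off empirical or fitted return levels; the sketch below does this for a synthetic 10,000-year daily discharge series (the gamma generator and Gumbel fit are illustrative assumptions, not the SWIM model chain).

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic "simulated" daily discharge for 10,000 years (m^3/s).
years, days = 10_000, 365
q = rng.gamma(shape=2.0, scale=50.0, size=(years, days))

annual_max = q.max(axis=1)

# Empirical return levels straight from the simulated annual maxima.
for T in (10, 100, 1000):
    level = np.quantile(annual_max, 1.0 - 1.0 / T)
    print(f"{T:>5}-year flood (empirical): {level:7.1f} m^3/s")

# Alternative: fit a Gumbel distribution by the method of moments.
mean, std = annual_max.mean(), annual_max.std()
beta = std * np.sqrt(6) / np.pi
mu = mean - 0.5772 * beta
q100 = mu - beta * np.log(-np.log(1 - 1 / 100))
print(f"  100-year flood (Gumbel fit): {q100:7.1f} m^3/s")
```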

  9. The Planetary Data System— Archiving Planetary Data for the use of the Planetary Science Community

    Science.gov (United States)

    Morgan, Thomas H.; McLaughlin, Stephanie A.; Grayzeck, Edwin J.; Vilas, Faith; Knopf, William P.; Crichton, Daniel J.

    2014-11-01

    NASA's Planetary Data System (PDS) archives, curates, and distributes digital data from NASA's planetary missions. PDS provides the planetary science community convenient online access to data from NASA's missions so that they can continue to mine these rich data sets for new discoveries. The PDS is a federated system consisting of nodes for specific discipline areas ranging from planetary geology to space physics. Our federation includes an engineering node that provides systems engineering support to the entire PDS. In order to adequately capture complete mission data sets containing not only raw and reduced instrument data, but also the calibration, documentation and geometry data required to interpret and use these data sets both singly and together (data from multiple instruments, or from multiple missions), PDS personnel work with NASA missions from the initial AO through the end of mission to define, organize, and document the data. This process includes peer review of data sets by members of the science community to ensure that the data sets are scientifically useful, effectively organized, and well documented. The PDS makes its data easily searchable so that members of the planetary community can both query the archive to find data relevant to specific scientific investigations and easily retrieve the data for analysis. To ensure long-term preservation of data and to make data sets more easily searchable with the new capabilities in information technology now available (and as existing technologies become obsolete), the PDS (together with the COSPAR-sponsored IPDA) developed and deployed a new data archiving system known as PDS4, released in 2013. The LADEE, MAVEN, OSIRIS-REx, InSight, and Mars 2020 missions are using PDS4. ESA has adopted PDS4 for the upcoming BepiColombo mission. The PDS is actively migrating existing data records into PDS4 and developing tools to aid data providers and users. The PDS is also incorporating challenge

  10. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.
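
    For orientation (an addition for this record, not part of the paper), the conversion from a measured proper motion to a transverse velocity uses only the standard astrometric relation v_t ≈ 4.74 μ d, with μ in arcsec/yr and d in pc:

```python
def transverse_velocity_km_s(mu_mas_per_yr, distance_mpc):
    """v_t [km/s] = 4.74 * mu [arcsec/yr] * d [pc] (standard astrometric relation)."""
    mu_arcsec_per_yr = mu_mas_per_yr * 1e-3
    distance_pc = distance_mpc * 1e6
    return 4.74 * mu_arcsec_per_yr * distance_pc

# A galaxy at 10 Mpc with a 1 micro-arcsecond/yr proper motion:
print(transverse_velocity_km_s(1e-3, 10))  # ~47 km/s
```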

  11. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850 °C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100 °C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  12. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured ...

  13. Large-scale assessment of benthic communities across multiple marine protected areas using an autonomous underwater vehicle.

    Science.gov (United States)

    Ferrari, Renata; Marzinelli, Ezequiel M; Ayroza, Camila Rezende; Jordan, Alan; Figueira, Will F; Byrne, Maria; Malcolm, Hamish A; Williams, Stefan B; Steinberg, Peter D

    2018-01-01

    Marine protected areas (MPAs) are designed to reduce threats to biodiversity and ecosystem functioning from anthropogenic activities. Assessment of MPA effectiveness requires synchronous sampling of protected and non-protected areas at multiple spatial and temporal scales. We used an autonomous underwater vehicle to map benthic communities in replicate 'no-take' and 'general-use' (fishing allowed) zones within three MPAs along 7° of latitude. We recorded 92 taxa and 38 morpho-groups across three large MPAs. We found that important habitat-forming biota (e.g. massive sponges) were more prevalent and abundant in no-take zones, while short ephemeral algae were more abundant in general-use zones, suggesting potential short-term effects of zoning (5-10 years). Yet, short-term effects of zoning were not detected at the community level (community structure or composition), while community structure varied significantly among MPAs. We conclude that by allowing rapid, simultaneous assessments at multiple spatial scales, autonomous underwater vehicles are useful to document changes in marine communities and identify adequate scales to manage them. This study advanced knowledge of marine benthic communities and their conservation in three ways. First, we quantified benthic biodiversity and abundance, generating the first baseline of these benthic communities against which the effectiveness of three large MPAs can be assessed. Second, we identified the taxonomic resolution necessary to assess both short- and long-term effects of MPAs, concluding that coarse taxonomic resolution is sufficient given that analyses of community structure at different taxonomic levels were generally consistent. Yet, observed differences were taxa-specific and may not have been evident using our broader taxonomic classifications; a classification of mid to high taxonomic resolution may be necessary to determine zoning effects on key taxa. Third, we provide an example of statistical analyses and

  14. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. They can improve a network system's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of a large-scale network system is realized with a divide-and-conquer approach. First, the topology of the large-scale network is divided into a number of small-scale networks by the MLkP/CR algorithm. Second, the subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into an overall topology by an automatic distribution algorithm based on force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
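
    A minimal sketch of this divide-and-conquer pipeline, with networkx building blocks standing in for the paper's MLkP/CR partitioner and force-analysis distribution algorithm (function and parameter names are illustrative):

```python
import math
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def divide_and_conquer_layout(G, spacing=3.0):
    """Partition the large graph, lay out each small subgraph independently,
    then distribute the subgraph layouts in the plane (here on a simple grid)."""
    communities = greedy_modularity_communities(G)   # stand-in for MLkP/CR partitioning
    pos = {}
    cols = math.ceil(math.sqrt(len(communities)))
    for idx, nodes in enumerate(communities):
        sub = G.subgraph(nodes)
        sub_pos = nx.spring_layout(sub, seed=42)     # local force-directed layout
        ox, oy = spacing * (idx % cols), spacing * (idx // cols)  # offset per subgraph
        for n, (x, y) in sub_pos.items():
            pos[n] = (x + ox, y + oy)
    return pos

if __name__ == "__main__":
    G = nx.random_partition_graph([30, 30, 30], 0.2, 0.01, seed=1)
    print(len(divide_and_conquer_layout(G)))  # one 2D position per node
```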

  15. Data replicating the factor structure and reliability of commonly used measures of resilience: The Connor–Davidson Resilience Scale, Resilience Scale, and Scale of Protective Factors

    Directory of Open Access Journals (Sweden)

    A.N. Madewell

    2016-09-01

    The data presented in this article are related to the article entitled "Assessing Resilience in Emerging Adulthood: The Resilience Scale (RS), Connor-Davidson Resilience Scale (CD-RISC), and Scale of Protective Factors (SPF)" (Madewell and Ponce-Garcia, 2016 [1]). The data were collected from a sample of 451 college students from three universities located in the Southwestern region of the United States: 374 from a large public university and 67 from two smaller regional universities. The data from the three universities did not significantly differ in terms of demographics. The data represent participant responses on six measures: the Resilience Scale-25 (RS-25), Resilience Scale-14 (RS-14), Connor-Davidson Resilience Scale-25 (CD-RISC-25), Connor-Davidson Resilience Scale-10 (CD-RISC-10), Scale of Protective Factors-24 (SPF-24), and the Life Stressor Checklist Revised (LSC-R). Keywords: Scale of Protective Factors, Resilience Scale, Connor-Davidson Resilience Scale, Emerging adulthood, Confirmatory factor analysis

  16. No Large Scale Curvature Perturbations during Waterfall of Hybrid Inflation

    OpenAIRE

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2010-01-01

    In this paper the possibility of generating large scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical back-reactions to terminate inflation. If one considers only the clas...

  17. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    The concept of "large scale" depends obviously on the phenomenon we are interested in. For example, in the field of the foundations of thermodynamics from microscopic dynamics, the relevant spatial and temporal large scales are of the order of fractions of a millimetre and of microseconds, respectively, or less, and are defined in relation to the spatial and temporal scales of the microscopic systems. In large-scale oceanography or global climate dynamics, the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared with the local and daily/monthly scales of atmosphere and ocean dynamics. In all cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain a class of universal smooth "large scale" dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many-degrees-of-freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundations-of-thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems, the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach in these cases as well, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to deal analytically with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).

  18. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  19. Pseudoscalar-photon mixing and the large scale alignment of QSO ...

    Indian Academy of Sciences (India)

    physics pp. 679-682. Pseudoscalar-photon mixing and the large scale alignment of QSO optical polarizations. PANKAJ JAIN, SUKANTA PANDA and S SARALA. Physics Department, Indian Institute of Technology, Kanpur 208 016, India. Abstract. We review the observation of large scale alignment of QSO optical polarizations.

  20. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

    We review different theories of formation of the large scale structure of the Universe. Special emphasis is put on the theory of inertial instability. We show that for a large class of initial spectra the resulting two point correlation functions are similar. We discuss also the adhesion theory which uses the Burgers equation, Navier-Stokes equation or coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally we discuss the role of the velocity potential in determining the global characteristics of large scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and main successes of the theory of formation of large scale structure. (orig.)

  1. Mission-directed path planning for planetary rover exploration

    Science.gov (United States)

    Tompkins, Paul

    2005-07-01

    Robotic rovers uniquely benefit planetary exploration---they enable regional exploration with the precision of in-situ measurements, a combination impossible from an orbiting spacecraft or fixed lander. Mission planning for planetary rover exploration currently utilizes sophisticated software for activity planning and scheduling, but simplified path planning and execution approaches tailored for localized operations to individual targets. This approach is insufficient for the investigation of multiple, regionally distributed targets in a single command cycle. Path planning tailored for this task must consider the impact of large-scale terrain on power, speed and regional access; the effect of route timing on resource availability; the limitations of finite resource capacity and other operational constraints on vehicle range and timing; and the mutual influence between traverses and upstream and downstream stationary activities. Encapsulating this reasoning in an efficient autonomous planner would allow a rover to continue operating rationally despite significant deviations from an initial plan. This research presents mission-directed path planning that enables an autonomous, strategic reasoning capability for robotic explorers. Planning operates in a space of position, time and energy. Unlike previous hierarchical approaches, it treats these dimensions simultaneously to enable globally optimal solutions. The approach calls on a nearly incremental search algorithm designed for planning and re-planning under global constraints, in spaces of more than two dimensions. Solutions under this method specify routes that avoid terrain obstacles, optimize the collection and use of rechargeable energy, satisfy local and global mission constraints, and account for the time and energy of interleaved mission activities. Furthermore, the approach efficiently re-plans in response to updates in vehicle state and world models, and is well suited to online operation aboard a robot.
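
    A toy sketch of searching position and energy jointly (not Tompkins' actual planner, which also treats time and global mission constraints): a uniform-cost search over (cell, energy) states on a grid with obstacles and recharging cells:

```python
import heapq

def plan(grid, start, goal, e0, e_max, move_cost=1.0, recharge=0.5):
    """Toy uniform-cost search over (cell, energy) states on a grid.
    grid[r][c]: 1 = obstacle, 0 = free, 2 = sunlit cell (recharges energy).
    Only illustrates the idea of searching position and energy together."""
    rows, cols = len(grid), len(grid[0])
    start_state = (start, round(e0, 3))
    frontier = [(0.0, start_state)]
    best = {start_state: 0.0}
    while frontier:
        cost, ((r, c), e) = heapq.heappop(frontier)
        if (r, c) == goal:
            return cost
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols) or grid[nr][nc] == 1:
                continue
            ne = e - move_cost
            if ne < 0:              # not enough energy to make this move
                continue
            if grid[nr][nc] == 2:   # sunlit cell: recharge, capped at capacity
                ne = min(e_max, ne + recharge)
            state = ((nr, nc), round(ne, 3))
            ncost = cost + move_cost
            if ncost < best.get(state, float("inf")):
                best[state] = ncost
                heapq.heappush(frontier, (ncost, state))
    return None  # goal unreachable with the available energy

if __name__ == "__main__":
    grid = [[0, 0, 2],
            [1, 1, 0],
            [0, 0, 0]]
    print(plan(grid, (0, 0), (2, 0), e0=6.0, e_max=6.0))  # cost of the cheapest feasible route
```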

  2. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt*, Patrick Hulin†, Tim Leek†, Fredrich Ulrich†, Ryan Whelan† (Authors listed...released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag...low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  3. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  4. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

    This thesis presents my contribution to the measurement of the homogeneity scale using galaxies, together with the cosmological interpretation of the results. In physics, any model is characterized by a set of principles. Most models in cosmology are based on the Cosmological Principle, which states that the universe is statistically homogeneous and isotropic on large scales. Today, this principle is considered to be true since it is respected by those cosmological models that accurately describe the observations. However, while the isotropy of the universe is now confirmed by many experiments, this is not the case for homogeneity. To study cosmic homogeneity, we propose to not only test a model but to test directly one of the postulates of modern cosmology. Since the 1998 measurements of cosmic distances using type Ia supernovae, we have known that the universe is in a phase of accelerated expansion. This phenomenon can be explained by the addition of an unknown energy component, which is called dark energy. Since dark energy is responsible for the accelerated expansion of the universe, we can study this mysterious fluid by measuring the rate of expansion of the universe. The universe has imprinted in its matter distribution a standard ruler, the Baryon Acoustic Oscillation (BAO) scale. By measuring this scale at different times during the evolution of our universe, it is possible to measure the rate of expansion of the universe and thus characterize this dark energy. Alternatively, we can use the homogeneity scale to study this dark energy. Studying the homogeneity and the BAO scale requires the statistical study of the matter distribution of the universe at large scales, larger than tens of megaparsecs. Galaxies and quasars are formed in the vast overdensities of matter and they are very luminous: these sources trace the distribution of matter. By measuring the emission spectra of these sources using large spectroscopic surveys, such as BOSS and eBOSS, we can measure their positions
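
    One common way to quantify the transition to homogeneity (a generic estimator, not necessarily the exact one used in the thesis) is the scaled count-in-spheres: the mean number of neighbours within radius r around galaxies, divided by the same count in a random catalogue, which approaches 1 at the homogeneity scale:

```python
import numpy as np
from scipy.spatial import cKDTree

def scaled_counts_in_spheres(galaxies, randoms, radii):
    """N(<r) around galaxies divided by the equivalent count in a random
    (homogeneous) catalogue; values approaching 1 indicate homogeneity."""
    tree_g = cKDTree(galaxies)
    tree_r = cKDTree(randoms)
    ratio = []
    for r in radii:
        n_g = np.mean(tree_g.query_ball_point(galaxies, r, return_length=True))
        n_r = np.mean(tree_r.query_ball_point(galaxies, r, return_length=True))
        ratio.append(n_g / n_r * len(randoms) / len(galaxies))
    return np.array(ratio)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    box = 200.0  # toy volume, Mpc/h
    gals = rng.uniform(0, box, size=(2000, 3))
    rands = rng.uniform(0, box, size=(8000, 3))
    print(scaled_counts_in_spheres(gals, rands, radii=[10, 30, 60]))  # ~1 for a uniform toy catalogue
```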

  5. Does an Intrinsic Magnetic Field Inhibit or Enhance Planetary Ionosphere Outflow and Loss?

    Science.gov (United States)

    Strangeway, R. J.; Russell, C. T.; Luhmann, J. G.; Moore, T. E.; Foster, J. C.; Barabash, S. V.; Nilsson, H.

    2017-12-01

    A characteristic feature of the planets Earth, Venus and Mars is the observation of the outflow of ionospheric ions, most notably oxygen. The oxygen ion outflow is frequently assumed to be a proxy for the loss of water from the planetary atmosphere. In terms of global outflow rates, for the Earth the rate varies from 10²⁵ to 10²⁶ s⁻¹, depending on geomagnetic activity. For both Venus and Mars, global rates of the order of 5×10²⁴ s⁻¹ have been reported. Venus and Mars do not have a large-scale intrinsic magnetic field, and there are several pathways for atmospheric and ionospheric loss. At Mars, because of its low gravity, neutral oxygen can escape through dissociative recombination. At Venus, only processes related to the solar wind interaction with the planet, such as sputtering and direct scavenging of the ionosphere by the solar wind, can result in oxygen escape. At the Earth the intrinsic magnetic field forms a barrier to the solar wind, but reconnection of the Earth's magnetic field with the Interplanetary Magnetic Field allows solar wind energy and momentum to be transferred into the magnetosphere, resulting in ionospheric outflows. Observations of oxygen ions at the dayside magnetopause suggest that at least some of these ions escape. In terms of the evolution of planetary atmospheres, how the solar-wind-driven escape rates vary for magnetized versus unmagnetized planets is also not clear. An enhanced solar wind dynamic pressure will increase escape from the unmagnetized planets, but it may also result in enhanced reconnection at the Earth, increasing outflow and loss rates for the Earth as well. Continued improvement in our understanding of the different pathways for ionospheric and atmospheric loss will allow us to determine how effective an intrinsic planetary field is in preserving a planetary atmosphere, or whether we have to look for other explanations as to why the atmospheres of Venus and Mars have evolved to their desiccated state.

  6. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT]; Marzouk, Youssef [MIT]

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT-Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas-Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower-dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as "reduce then sample" and "sample then reduce." In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to

  7. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the radiative cooling is compensated by the heating due to large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.
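
    For reference, the bulk diagnostics referred to above are the apparent heat source Q1 and apparent moisture sink Q2; the expressions below are the standard Yanai et al. (1973) definitions as usually quoted in the literature, not equations reproduced from this paper:

```latex
Q_1 \equiv \frac{\partial \bar{s}}{\partial t}
      + \bar{\mathbf{V}} \cdot \nabla \bar{s}
      + \bar{\omega}\,\frac{\partial \bar{s}}{\partial p},
\qquad
Q_2 \equiv -L \left( \frac{\partial \bar{q}}{\partial t}
      + \bar{\mathbf{V}} \cdot \nabla \bar{q}
      + \bar{\omega}\,\frac{\partial \bar{q}}{\partial p} \right)
```

    where s = c_pT + gz is the dry static energy, q the water vapour mixing ratio, ω the pressure velocity and L the latent heat; the cloud mass flux spectrum is then the cumulus ensemble required to reproduce these large-scale budgets.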

  8. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases used for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four sets of large-scale map data. Errors in the map data were used for a risk assessment of decisions about the location of objects, e.g. for land-use planning in the realization of investments. An analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of map data).
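
    A small illustrative sketch (not the paper's exact procedure) of turning control-point shift vectors into the statistics such a risk assessment rests on, e.g. RMSE and the empirical probability that a position error exceeds a decision tolerance:

```python
import numpy as np

def position_error_stats(dx, dy, tolerance=1.0):
    """Summary statistics of control-point shift vectors (map units, e.g. metres)."""
    dx, dy = np.asarray(dx, float), np.asarray(dy, float)
    d = np.hypot(dx, dy)                       # displacement length per control point
    rmse = np.sqrt(np.mean(dx**2 + dy**2))
    exceed = np.mean(d > tolerance)            # empirical risk of exceeding the tolerance
    return {"rmse": rmse, "mean": d.mean(), "p95": np.percentile(d, 95),
            "P(d > tol)": exceed}

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    print(position_error_stats(rng.normal(0, 0.4, 500), rng.normal(0, 0.4, 500), tolerance=1.0))
```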

  9. Kinematics of galactic planetary nebulae

    International Nuclear Information System (INIS)

    Kiosa, M.I.; Khromov, G.S.

    1979-01-01

    The classical method of determining the components of the solar motion relative to the centroid of the system of planetary nebulae with known radial velocities is investigated. It is shown that this method is insensitive to random errors in the radial velocities and that the low accuracy in determining the coordinates of the solar apex and the solar motion results from the insufficient number of planetaries with measured radial velocities. The planetary nebulae are found not to follow well the law of differential galactic rotation with circular orbits. This is attributed to the elongation of their galactic orbits. A method for obtaining the statistical parallax of planetary nebulae is considered, and the parallax calculated from the tau components of their proper motion is shown to be the most reliable

  10. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  11. Large-scale Flow and Transport of Magnetic Flux in the Solar ...

    Indian Academy of Sciences (India)

    Abstract. The horizontal large-scale velocity field describes the horizontal displacement of the photospheric magnetic flux in the zonal and meridional directions. The flow systems of solar plasma, constructed according to the velocity field, create large-scale cellular-like patterns with up-flow in the center and down-flow on the ...

  12. Improving accessibility and discovery of ESA planetary data through the new planetary science archive

    Science.gov (United States)

    Macfarlane, A. J.; Docasal, R.; Rios, C.; Barbarisi, I.; Saiz, J.; Vallejo, F.; Besse, S.; Arviset, C.; Barthelemy, M.; De Marchi, G.; Fraga, D.; Grotheer, E.; Heather, D.; Lim, T.; Martinez, S.; Vallat, C.

    2018-01-01

    The Planetary Science Archive (PSA) is the European Space Agency's (ESA) repository of science data from all planetary science and exploration missions. The PSA provides access to scientific data sets through various interfaces at http://psa.esa.int. Driven mostly by the evolution of the PDS standards, which all new ESA planetary missions shall follow, and by the need to update the interfaces to the archive, the PSA has undergone an important re-engineering. In order to maximise the scientific exploitation of ESA's planetary data holdings, significant improvements have been made by utilising the latest technologies and implementing widely recognised open standards. To facilitate users in handling and visualising the many products stored in the archive which have associated spatial data, the new PSA supports Geographical Information Systems (GIS) by implementing the standards approved by the Open Geospatial Consortium (OGC). The modernised PSA also attempts to increase interoperability with the international community by implementing recognised planetary-science-specific protocols such as the PDAP (Planetary Data Access Protocol) and EPN-TAP (EuroPlanet-Table Access Protocol). In this paper we describe some of the methods by which the archive may be accessed and present the challenges that are being faced in consolidating data sets of the older PDS3 version of the standards with the new PDS4 deliveries into a single data model mapping, to ensure transparent access to the data for users and services whilst maintaining high performance.
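
    As an illustration of the interoperability layer mentioned above, an EPN-TAP service can be queried with a generic VO client such as pyvo; the service URL, schema and table name below are assumptions made for the sake of the example and should be taken from the PSA documentation:

```python
import pyvo

# Illustrative endpoint and table; check the PSA documentation for the actual values.
service = pyvo.dal.TAPService("https://archives.esac.esa.int/psa/epn-tap/tap")

# granule_uid, instrument_name, time_min/max and target_name are standard EPN-TAP columns.
query = """
SELECT TOP 10 granule_uid, instrument_name, time_min, time_max
FROM psa.epn_core
WHERE target_name = 'Mars'
"""
results = service.search(query)
print(results.to_table())
```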

  13. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large-scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large-scale data for visibility analyses at the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). A multiple Boolean viewshed analysis and a global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, an extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large-scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
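
    The Boolean visibility underlying such viewshed analyses reduces to line-of-sight tests on the surface model. A minimal sketch (illustrative only, not the tool used in the article):

```python
import numpy as np

def is_visible(dem, observer, target, obs_height=1.7):
    """Boolean line of sight between two DEM cells: the target is visible if no
    intermediate terrain sample rises above the observer-to-target sight line."""
    (r0, c0), (r1, c1) = observer, target
    z0 = dem[r0, c0] + obs_height
    z1 = dem[r1, c1]
    n = int(max(abs(r1 - r0), abs(c1 - c0)))
    if n == 0:
        return True
    for i in range(1, n):
        t = i / n
        r, c = r0 + t * (r1 - r0), c0 + t * (c1 - c0)
        terrain = dem[int(round(r)), int(round(c))]     # nearest-cell terrain sample
        sightline = z0 + t * (z1 - z0)                  # interpolated line of sight
        if terrain > sightline:
            return False
    return True

if __name__ == "__main__":
    dem = np.zeros((50, 50))
    dem[25, 10:40] = 20.0                               # a ridge between observer and target
    print(is_visible(dem, (10, 25), (40, 25)))          # False: the ridge blocks the view
```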

  14. Large-scale modeling of rain fields from a rain cell deterministic model

    Science.gov (United States)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km², the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (~20 × 20 km²), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (~150 × 150 km²), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km²) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
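
    A hedged sketch of the large-scale step described above: a Gaussian-filtered white-noise field (with direction-dependent smoothing widths standing in for the anisotropic covariance model) is thresholded at a quantile so that the prescribed rain occupation rate is reproduced exactly; all parameters are illustrative:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def large_scale_rain_mask(shape=(512, 512), corr=(40, 12), occupation_rate=0.15, seed=0):
    """Binary large-scale rain mask: correlated Gaussian field thresholded so that
    exactly `occupation_rate` of the domain is raining."""
    rng = np.random.default_rng(seed)
    field = gaussian_filter(rng.standard_normal(shape), sigma=corr)  # anisotropic correlation
    threshold = np.quantile(field, 1.0 - occupation_rate)
    return field > threshold        # True where midscale rain fields would be placed

if __name__ == "__main__":
    mask = large_scale_rain_mask()
    print(mask.mean())              # ~0.15, the prescribed rain occupation rate
```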

  15. Facile Large-scale synthesis of stable CuO nanoparticles

    Science.gov (United States)

    Nazari, P.; Abdollahi-Nejand, B.; Eskandari, M.; Kohnehpoushi, S.

    2018-04-01

    In this work, a novel approach to synthesizing CuO nanoparticles was introduced. A sequential corrosion-and-detachment process was proposed for the growth and dispersion of CuO nanoparticles at the optimum pH value of eight. The produced CuO nanoparticles were six nm (±2 nm) in diameter and spherical in shape, with high crystallinity and uniformity in size. With this method, large-scale production of CuO nanoparticles (120 grams in an experimental batch) from Cu micro-particles was achieved, which may meet the market criteria for large-scale production of CuO nanoparticles.

  16. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    Science.gov (United States)

    2012-01-01

    Large-scale cooperative task distribution on peer-to-peer networks. ...disadvantages of ML-Chord are its fixed size (two layers) and limited scalability for large-scale systems. RC-Chord extends ML-Chord... (D. Karrels et al.) ...configurable before runtime. This can be improved by incorporating a distributed learning algorithm to tune the number and range of the DLoE tracking

  17. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years, the number, complexity and size of large-scale networks have increased. The best-known example of a large-scale network is the Internet, and more recent ones are the data centers in cloud environments. Management tasks such as traffic monitoring, security and performance optimization place a large burden on the network administrator. This research report studies different protocols, i.e. conventional protocols such as the Simple Network Management Protocol and the newer Gossip-bas...

  18. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

    We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as well as a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'scaled' quantum-mechanics-like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N. Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4))

  19. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  20. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs

  1. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large-scale cross-hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous. This heterogeneity is not smoothed out even over scales of hundreds of metres. Results of the interpretation validate the hypothesis of the major fracture zones, A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path through the fractured rock causes severe problems in interpretation. Derived values of hydraulic conductivity were found to lie in a narrow range of two to three orders of magnitude. The test design did not allow fracture zones to be tested individually. This could be improved by testing the high hydraulic conductivity regions specifically. The Piezomac and single-hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response. This could either be because there is no hydraulic connection, or because there is a connection but a response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  2. Psychological type and attitude toward Christianity: a replication.

    Science.gov (United States)

    Francis, Leslie J; Robbins, Mandy; Boxer, Anna; Lewis, Christopher Alan; McGuckin, Conor; McDaid, Charles J

    2003-02-01

    A sample of 149 university students completed the Francis Psychological Type Scales together with the Francis Scale of Attitude Toward Christianity. The data indicated that university students classified as Feeling Types hold a more positive attitude toward Christianity than those classified as Thinking Types. These findings replicate the 1999 report of Jones and Francis.

  3. In the Shadow of Coal: How Large-Scale Industries Contributed to Present-Day Regional Differences in Personality and Well-Being.

    Science.gov (United States)

    Obschonka, Martin; Stuetzer, Michael; Rentfrow, Peter J; Shaw-Taylor, Leigh; Satchell, Max; Silbereisen, Rainer K; Potter, Jeff; Gosling, Samuel D

    2017-11-20

    Recent research has identified regional variation of personality traits within countries but we know little about the underlying drivers of this variation. We propose that the Industrial Revolution, as a key era in the history of industrialized nations, has led to a persistent clustering of well-being outcomes and personality traits associated with psychological adversity via processes of selective migration and socialization. Analyzing data from England and Wales, we examine relationships between the historical employment share in large-scale coal-based industries (coal mining and steam-powered manufacturing industries that used this coal as fuel for their steam engines) and today's regional variation in personality and well-being. Even after controlling for possible historical confounds (historical energy supply, education, wealth, geology, climate, population density), we find that the historical local dominance of large-scale coal-based industries predicts today's markers of psychological adversity (lower Conscientiousness [and order facet scores], higher Neuroticism [and anxiety and depression facet scores], lower activity [an Extraversion facet], and lower life satisfaction and life expectancy). An instrumental variable analysis, using the historical location of coalfields, supports the causal assumption behind these effects (with the exception of life satisfaction). Further analyses focusing on mechanisms hint at the roles of selective migration and persisting economic hardship. Finally, a robustness check in the U.S. replicates the effect of the historical concentration of large-scale industries on today's levels of psychological adversity. Taken together, the results show how today's regional patterns of personality and well-being (which shape the future trajectories of these regions) may have their roots in major societal changes underway decades or centuries earlier. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  4. Large transverse momentum processes in a non-scaling parton model

    International Nuclear Information System (INIS)

    Stirling, W.J.

    1977-01-01

    The production of large transverse momentum mesons in hadronic collisions by the quark fusion mechanism is discussed in a parton model which gives logarithmic corrections to Bjorken scaling. It is found that the moments of the large transverse momentum structure function exhibit a simple scale breaking behaviour similar to the behaviour of the Drell-Yan and deep inelastic structure functions of the model. An estimate of corresponding experimental consequences is made and the extent to which analogous results can be expected in an asymptotically free gauge theory is discussed. A simple set of rules is presented for incorporating the logarithmic corrections to scaling into all covariant parton model calculations. (Auth.)

  5. On the Renormalization of the Effective Field Theory of Large Scale Structures

    OpenAIRE

    Pajer, Enrico; Zaldarriaga, Matias

    2013-01-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory o...

  6. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    Science.gov (United States)

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
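
    A toy sketch of the modelling approach described above, with synthetic features and labels standing in for Envision's actual per-variant features and the 21,026 mutagenesis measurements:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Predict a quantitative variant effect score from numeric per-variant features
# (e.g. substitution properties, conservation, structural context). The features
# and effect scores below are synthetic placeholders.
rng = np.random.default_rng(3)
X = rng.normal(size=(5000, 8))                                        # 8 made-up per-variant features
y = 0.6 * X[:, 0] - 0.3 * X[:, 3] + rng.normal(scale=0.2, size=5000)  # synthetic effect scores

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
# subsample < 1 makes the boosting stochastic, echoing the approach named in the abstract.
model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05,
                                  max_depth=3, subsample=0.8)
model.fit(X_tr, y_tr)
print("held-out R^2:", round(model.score(X_te, y_te), 3))
```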

  7. Visualizing planetary data by using 3D engines

    Science.gov (United States)

    Elgner, S.; Adeli, S.; Gwinner, K.; Preusker, F.; Kersten, E.; Matz, K.-D.; Roatsch, T.; Jaumann, R.; Oberst, J.

    2017-09-01

    We examined 3D gaming engines for their usefulness in visualizing large planetary image data sets. These tools allow us to include recent developments in the field of computer graphics in our scientific visualization systems and to present data products interactively and in higher quality than before. We have started to set up the first applications which will make use of virtual reality (VR) equipment.

  8. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

    STANFORD, CALIFORNIA 94305. METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright...typical iteration can be partitioned so that where B is an m × m basis matrix. This partition effectively divides the variables into three classes... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library

  9. Generation of large-scale vorticity in rotating stratified turbulence with inhomogeneous helicity: mean-field theory

    Science.gov (United States)

    Kleeorin, N.

    2018-06-01

    We discuss a mean-field theory of the generation of large-scale vorticity in a rotating density stratified developed turbulence with inhomogeneous kinetic helicity. We show that the large-scale non-uniform flow is produced due to either a combined action of a density stratified rotating turbulence and uniform kinetic helicity or a combined effect of a rotating incompressible turbulence and inhomogeneous kinetic helicity. These effects result in the formation of a large-scale shear, and in turn its interaction with the small-scale turbulence causes an excitation of the large-scale instability (known as a vorticity dynamo) due to a combined effect of the large-scale shear and Reynolds stress-induced generation of the mean vorticity. The latter is due to the effect of large-scale shear on the Reynolds stress. A fast rotation suppresses this large-scale instability.

  10. Special issue on enabling open and interoperable access to Planetary Science and Heliophysics databases and tools

    Science.gov (United States)

    2018-01-01

    The large amount of data generated by modern space missions calls for a change of organization of data distribution and access procedures. Although long-term archives exist for telescopic and space-borne observations, high-level functions need to be developed on top of these repositories to make Planetary Science and Heliophysics data more accessible and to favor interoperability. Results of simulations and reference laboratory data also need to be integrated to support and interpret the observations. Interoperable software and interfaces have recently been developed in many scientific domains. The Virtual Observatory (VO) interoperable standards developed for Astronomy by the International Virtual Observatory Alliance (IVOA) can be adapted to Planetary Sciences, as demonstrated by the VESPA (Virtual European Solar and Planetary Access) team within the Europlanet-H2020-RI project. Other communities have developed their own standards: GIS (Geographic Information System) for Earth and planetary surface tools, SPASE (Space Physics Archive Search and Extract) for space plasma, PDS4 (NASA Planetary Data System, version 4) and IPDA (International Planetary Data Alliance) for planetary mission archives, etc., and an effort to make them all interoperable is starting, including automated workflows to process related data from different sources.

  11. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise, for example in LNG marine transportation accidents, or in liquid-cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small-scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large-scale situations. More recently, there has been a trend towards larger experiments and some of these have produced explosions of moderately high efficiency. Although the occurrence of such large-scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single-drop experiments. In the last two years several theoretical models for large-scale explosions have appeared which attempt a self-contained explanation of at least some stages of such high-yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full

  12. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
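
    A small dense-matrix sketch of the Broyden idea described above (the report's limited-memory variant avoids forming the Jacobian approximation explicitly for large-scale problems; this is an illustration, not the Sandia implementation):

```python
import numpy as np

def broyden(F, x0, tol=1e-10, max_iter=100):
    """'Good' Broyden method: one finite-difference Jacobian at the start, then
    rank-one secant updates, so the true Jacobian is never re-evaluated."""
    x = np.asarray(x0, float)
    n = x.size
    eps = 1e-7
    fx = F(x)
    B = np.empty((n, n))
    for j in range(n):                     # one-time finite-difference Jacobian
        e = np.zeros(n); e[j] = eps
        B[:, j] = (F(x + e) - fx) / eps
    for _ in range(max_iter):
        dx = np.linalg.solve(B, -fx)
        x_new = x + dx
        f_new = F(x_new)
        if np.linalg.norm(f_new) < tol:
            return x_new
        df = f_new - fx
        B += np.outer(df - B @ dx, dx) / (dx @ dx)   # rank-one secant update
        x, fx = x_new, f_new
    return x

if __name__ == "__main__":
    F = lambda v: np.array([v[0]**2 + v[1]**2 - 4.0, v[0] * v[1] - 1.0])
    print(broyden(F, np.array([2.0, 0.5])))  # converges to roughly (1.932, 0.518)
```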

  13. Temporal organization of cellular self-replication

    Science.gov (United States)

    Alexandrov, Victor; Pugatch, Rami

    Recent experiments demonstrate that single cells grow exponentially in time. A coarse-grained model of cellular self-replication is presented based on a novel concept: the cell is viewed as a self-replicating queue. This allows a more fundamental look at various temporal organizations and, importantly, at the inherent non-Markovianity of noise distributions. As an example, the distribution of doubling times can be inferred and compared to single-cell experiments in bacteria. We observe data collapse upon scaling by the average doubling time for different environments and present an inherent task-allocation trade-off. Support from the Simons Center for Systems Biology, IAS, Princeton.

  14. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  15. Cosmic ray acceleration by large scale galactic shocks

    International Nuclear Information System (INIS)

    Cesarsky, C.J.; Lagage, P.O.

    1987-01-01

    The mechanism of diffusive shock acceleration may account for the existence of galactic cosmic rays; detailed applications to stellar wind shocks, and especially to supernova shocks, have been developed. Existing models can usually deal with the energetics or the spectral slope, but the observed energy range of cosmic rays is not explained. Therefore it seems worthwhile to examine the effect that large-scale, long-lived galactic shocks may have on galactic cosmic rays, in the frame of the diffusive shock acceleration mechanism. Large-scale fast shocks can only be expected to exist in the galactic halo. We consider three situations where they may arise: expansion of a supernova shock in the halo, a galactic wind, and galactic infall; and we discuss the possible existence of these shocks and their role in accelerating cosmic rays.

  16. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space
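
    For orientation, the Eulerian-space consistency relation of Kehagias and Riotto and Peloso and Pietroni takes, schematically, the following squeezed-limit form (the notation and normalization here are illustrative rather than quoted from the paper):

        \[
        \lim_{q \to 0}\,
        \langle \delta_{\mathbf{q}}(\eta)\, \delta_{\mathbf{k}_1}(\eta_1) \cdots \delta_{\mathbf{k}_N}(\eta_N) \rangle'
        \;\simeq\;
        - P_\delta(q;\eta) \sum_{a=1}^{N} \frac{D(\eta_a)}{D(\eta)}\,
        \frac{\mathbf{q}\cdot\mathbf{k}_a}{q^2}\,
        \langle \delta_{\mathbf{k}_1}(\eta_1) \cdots \delta_{\mathbf{k}_N}(\eta_N) \rangle' ,
        \]

    where D is the linear growth factor and the prime denotes stripping the overall momentum-conserving delta function. For equal-time correlators the individually 1/q-enhanced terms cancel by momentum conservation, so the relation carries no information at leading order; the Lagrangian-space reformulation described above expresses the same physics as the vanishing of the suitably normalized squeezed correlation.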

  17. Electron drift in a large scale solid xenon

    International Nuclear Information System (INIS)

    Yoo, J.; Jaskierny, W.F.

    2015-01-01

    A study of charge drift in a large-scale optically transparent solid xenon is reported. A pulsed high-power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs, at 900 V/cm over 8.0 cm of uniform electric field. It is thus demonstrated that the electron drift speed in large-scale solid xenon is a factor of two faster than in the liquid phase.
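
    As a quick plausibility check on these numbers, the implied full-length drift times follow directly from the quoted speeds; the short sketch below uses only the 8.0 cm drift length and the two measured drift speeds from the abstract.

        # Drift time over the 8.0 cm uniform-field region at 900 V/cm
        drift_length_cm = 8.0
        v_liquid_cm_per_us = 0.193   # liquid xenon, 163 K
        v_solid_cm_per_us = 0.397    # solid xenon, 157 K

        t_liquid = drift_length_cm / v_liquid_cm_per_us   # ~41.5 microseconds
        t_solid = drift_length_cm / v_solid_cm_per_us     # ~20.2 microseconds
        print(f"liquid: {t_liquid:.1f} us, solid: {t_solid:.1f} us, ratio: {t_liquid / t_solid:.2f}")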

  18. Wind and Photovoltaic Large-Scale Regional Models for hourly production evaluation

    DEFF Research Database (Denmark)

    Marinelli, Mattia; Maule, Petr; Hahmann, Andrea N.

    2015-01-01

    This work presents two large-scale regional models used for the evaluation of normalized power output from wind turbines and photovoltaic power plants on a European regional scale. The models give an estimate of renewable production on a regional scale with 1 h resolution, starting from a mesoscale ... of the transmission system, especially regarding the cross-border power flows. The tuning of these regional models is done using historical meteorological data acquired on a per-country basis and using publicly available data of installed capacity.

  19. Large natural geophysical events: planetary planning

    International Nuclear Information System (INIS)

    Knox, J.B.; Smith, J.V.

    1984-09-01

    Geological and geophysical data suggest that, during the evolution of the earth and its species, there have been many mass extinctions due to large impacts from comets and large asteroids, and to major volcanic events. Today, technology has developed to the stage where we can begin to consider protective measures for the planet. Evidence of the ecological disruption and frequency of these major events is presented. The most critical need is to develop surveillance and warning systems that provide sufficient lead time for appropriate interventions to be designed. The long-term research undergirding these warning systems, their implementation, and proof testing is rich in opportunities for collaboration for peace.

  20. Energy efficiency supervision strategy selection of Chinese large-scale public buildings

    International Nuclear Information System (INIS)

    Jin Zhenxing; Wu Yong; Li Baizhan; Gao Yafeng

    2009-01-01

    This paper discusses energy consumption, building development and building energy consumption in China, and points out that energy efficiency management and maintenance of large-scale public buildings is the breakthrough point for building energy saving in China. Three obstacles (a lack of basic statistical data, a lack of a service market for building energy saving, and a lack of effective management measures) account for the necessity of energy efficiency supervision of large-scale public buildings. The paper then introduces the supervision aims, the supervision system, and the role of the five basic systems within it, and analyzes the working mechanism of these five basic systems. The energy efficiency supervision system for large-scale public buildings takes energy consumption statistics as its data basis, energy auditing as technical support, the energy consumption ration as a benchmark for energy saving, price increases beyond the ration as a price lever, and public disclosure of energy efficiency as an amplifier. The supervision system promotes energy-efficient operation and maintenance of large-scale public buildings and drives comprehensive building energy saving in China.

  3. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter.

  4. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  5. Detection of large-scale concentric gravity waves from a Chinese airglow imager network

    Science.gov (United States)

    Lai, Chang; Yue, Jia; Xu, Jiyao; Yuan, Wei; Li, Qinzeng; Liu, Xiao

    2018-06-01

    Concentric gravity waves (CGWs) contain a broad spectrum of horizontal wavelengths and periods due to their instantaneous localized sources (e.g., deep convection, volcanic eruptions, or earthquakes). However, it is difficult to observe large-scale gravity waves of >100 km wavelength from the ground because of the limited field of view of a single camera and local bad weather. Previously, complete large-scale CGW imagery could only be captured by satellite observations. In the present study, we developed a novel method that assembles separate images and applies low-pass filtering to obtain temporal and spatial information about complete large-scale CGWs from a network of all-sky airglow imagers. Coordinated observations from five all-sky airglow imagers in Northern China were assembled and processed to study large-scale CGWs over a wide area (1800 km × 1400 km), focusing on the same two CGW events as Xu et al. (2015). Our algorithms yielded images of large-scale CGWs by filtering out the small-scale CGWs. The wavelengths, wave speeds, and periods of the CGWs were measured from a sequence of consecutive assembled images. Overall, the assembling and low-pass filtering algorithms can expand the airglow imager network to its full capacity regarding the detection of large-scale gravity waves.
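
    The sketch below illustrates the kind of processing described (mosaicking the separate imagers onto a common grid, then low-pass filtering to keep only structure larger than roughly 100 km). The grid, pixel scale, cutoff, and use of a Gaussian filter are illustrative assumptions, not details taken from the paper.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        PIXEL_KM = 5.0      # assumed projected pixel size of the common grid
        CUTOFF_KM = 100.0   # suppress structure smaller than ~100 km

        def assemble_mosaic(tiles, corners, shape):
            """Place individual airglow images (already projected onto a common
            geographic grid) into one large mosaic; overlapping pixels are averaged."""
            mosaic = np.zeros(shape)
            weight = np.zeros(shape)
            for tile, (r, c) in zip(tiles, corners):
                h, w = tile.shape
                mosaic[r:r + h, c:c + w] += tile
                weight[r:r + h, c:c + w] += 1.0
            return np.divide(mosaic, weight, out=np.zeros_like(mosaic), where=weight > 0)

        def lowpass_large_scale(mosaic):
            """Gaussian low-pass filter; sigma is a rough conversion from the
            cutoff wavelength and is an assumption of this sketch."""
            sigma_px = CUTOFF_KM / (2.0 * np.pi * PIXEL_KM)
            return gaussian_filter(mosaic, sigma=sigma_px)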

  6. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Security can help increase accountability for large-scale land acquisitions in ... to build decent economic livelihoods and participate meaningfully in decisions ... its 2017 call for proposals to establish Cyber Policy Centres in the Global South.

  7. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Aamir, Ali; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  8. Measuring the topology of large-scale structure in the universe

    Science.gov (United States)

    Gott, J. Richard, III

    1988-11-01

    An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by wall of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.
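
    For reference, the genus-curve statistic used in this line of work has a known analytic form for a Gaussian (random-phase) field, against which the observed curves are compared; in the expression below, ν is the density threshold in units of the standard deviation and A is an amplitude set by the power spectrum and the smoothing length.

        \[
        g(\nu) \;=\; A \left(1 - \nu^{2}\right) e^{-\nu^{2}/2},
        \]

    so a random-phase field is sponge-like (positive genus) near the median threshold ν = 0 and breaks into isolated clusters or voids (negative genus) for |ν| > 1. Meatball or bubble topologies appear as systematic shifts of the observed genus curve away from this symmetric form, which is the diagnostic summarized in the abstract.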

  10. How the Internet Will Help Large-Scale Assessment Reinvent Itself

    Directory of Open Access Journals (Sweden)

    Randy Elliot Bennett

    2001-02-01

    Large-scale assessment in the United States is undergoing enormous pressure to change. That pressure stems from many causes. Depending upon the type of test, the issues precipitating change include an outmoded cognitive-scientific basis for test design; a mismatch with curriculum; the differential performance of population groups; a lack of information to help individuals improve; and inefficiency. These issues provide a strong motivation to reconceptualize both the substance and the business of large-scale assessment. At the same time, advances in technology, measurement, and cognitive science are providing the means to make that reconceptualization a reality. The thesis of this paper is that the largest facilitating factor will be technological, in particular the Internet. In the same way that it is already helping to revolutionize commerce, education, and even social interaction, the Internet will help revolutionize the business and substance of large-scale assessment.

  11. Planetary mass function and planetary systems

    Science.gov (United States)

    Dominik, M.

    2011-02-01

    With planets orbiting stars, a planetary mass function should not be seen as a low-mass extension of the stellar mass function; rather, a proper formalism needs to account for the fact that the statistical properties of planet populations are linked to the properties of their respective host stars. This can be done by describing planet populations by means of a differential planetary mass-radius-orbit function, which, together with the fraction of stars with given properties that are orbited by planets and the stellar mass function, allows the derivation of all statistics for any considered sample. These fundamental functions provide a framework for comparing statistics that result from different observing techniques and campaigns, each of which has its own specific selection procedures and detection efficiencies. Moreover, recent results from both gravitational microlensing campaigns and radial-velocity surveys of stars indicate that planets tend to cluster in systems rather than being the lonely child of their respective parent star. While planetary multiplicity in an observed system becomes obvious with the detection of several planets, its quantitative assessment comes with the challenge of excluding the presence of further planets. Current exoplanet samples begin to give us first hints at the population statistics, whereas pictures of planet parameter space in its full complexity call for samples that are 2-4 orders of magnitude larger. In order to derive meaningful statistics, however, planet detection campaigns need to be designed in such a way that well-defined, fully deterministic target selection, monitoring, and detection criteria are applied. The probabilistic nature of gravitational microlensing makes this technique an illustrative example of all the encountered challenges and uncertainties.
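
    One schematic way to write the formalism sketched here (the notation is illustrative and not quoted from the paper): the expected planet yield of a survey follows from the stellar mass function, the host fraction, and the differential planetary mass-radius-orbit function, weighted by the campaign's detection efficiency,

        \[
        \frac{dN_{\rm planets}}{dM_\star}
        \;=\;
        \Phi_\star(M_\star)\, f_{\rm host}(M_\star)
        \int \frac{\partial^{3} n_p}{\partial m\, \partial r\, \partial a}\,
        \varepsilon(m, r, a)\; dm\, dr\, da ,
        \]

    where Φ⋆ is the stellar mass function, f_host the fraction of stars of mass M⋆ hosting planets, ∂³n_p/∂m ∂r ∂a the differential planetary mass-radius-orbit function per host star, and ε the selection and detection efficiency of the campaign under consideration.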

  12. Bayesian evaluation of effect size after replicating an original study

    NARCIS (Netherlands)

    Van Aert, Robbie C M; Van Assen, Marcel A.L.M.

    2017-01-01

    The vast majority of published results in the literature are statistically significant, which raises concerns about their reliability. The Reproducibility Project Psychology (RPP) and Experimental Economics Replication Project (EE-RP) both replicated a large number of published studies in psychology

  13. Distributional Replication

    OpenAIRE

    Beare, Brendan K.

    2009-01-01

    Suppose that X and Y are random variables. We define a replicating function to be a function f such that f(X) and Y have the same distribution. In general, the set of replicating functions for a given pair of random variables may be infinite. Suppose we have some objective function, or cost function, defined over the set of replicating functions, and we seek to estimate the replicating function with the lowest cost. We develop an approach to estimating the cheapest replicating function that i...
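
    One textbook replicating function, useful as a sketch of the idea (this particular construction is standard probability theory and is not necessarily the estimator developed in the paper): when X is continuous with CDF F_X, the composition f = F_Y^{-1} ∘ F_X maps X to a variable with the distribution of Y.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)

        # Illustrative choice of distributions (not from the paper)
        X = rng.normal(loc=0.0, scale=1.0, size=100_000)   # X ~ N(0, 1)
        target = stats.expon(scale=2.0)                     # Y ~ Exponential with mean 2

        def f(x):
            """Replicating function f = F_Y^{-1} o F_X, so f(X) has Y's distribution."""
            return target.ppf(stats.norm.cdf(x))

        fx = f(X)
        print(np.mean(fx), np.var(fx))         # should be close to 2 and 4
        print(stats.kstest(fx, target.cdf))    # KS test against the target distribution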

  14. Contribution of large scale coherence to wind turbine power: A large eddy simulation study in periodic wind farms

    Science.gov (United States)

    Chatterjee, Tanmoy; Peet, Yulia T.

    2018-03-01

    Length scales of the eddies involved in the power generation of infinite wind farms are studied by analyzing the spectra of the turbulent flux of mean kinetic energy (MKE) from large eddy simulations (LES). Large-scale structures an order of magnitude bigger than the turbine rotor diameter (D) are shown to make a substantial contribution to wind power. Varying dynamics in the intermediate scales (D-10D) are also observed from a parametric study involving interturbine distances and hub height of the turbines. Further insight about the eddies responsible for the power generation is provided by a scaling analysis of the two-dimensional premultiplied spectra of the MKE flux. The LES code is developed in a high-Reynolds-number near-wall modeling framework, using the open-source spectral element code Nek5000, and the wind turbines have been modelled using a state-of-the-art actuator line model. The LES of infinite wind farms has been validated against statistical results from the previous literature. The study is expected to improve our understanding of the complex multiscale dynamics in the domain of large wind farms and to identify the length scales that contribute to the power. This information can be useful for the design of wind farm layout and turbine placement that take advantage of the large-scale structures contributing to wind turbine power.
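
    As a sketch of the kind of spectral diagnostic described, a premultiplied one-dimensional spectrum of a flux signal sampled along a homogeneous direction can be computed as below; the synthetic signal, domain length, and grid are illustrative assumptions, not data from the simulations.

        import numpy as np

        def premultiplied_spectrum(signal, domain_length):
            """One-sided premultiplied spectrum k*E(k) of a periodic 1-D signal,
            e.g. a turbulent MKE-flux quantity sampled along a homogeneous direction."""
            n = signal.size
            coeffs = np.fft.rfft(signal - signal.mean()) / n
            energy = 2.0 * np.abs(coeffs) ** 2              # energy per mode (one-sided)
            k = 2.0 * np.pi * np.arange(energy.size) / domain_length
            return k, k * energy                            # premultiplication highlights energetic scales

        # Synthetic flux with one large-scale and one smaller-scale component
        L = 10.0                                            # assumed domain length (in rotor diameters)
        x = np.linspace(0.0, L, 512, endpoint=False)
        flux = np.sin(2 * np.pi * x / L) + 0.3 * np.sin(2 * np.pi * 8 * x / L)
        k, kE = premultiplied_spectrum(flux, L)
        print(k[np.argmax(kE)])                             # wavenumber of the dominant contribution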

  15. uvsF RFC1, the large subunit of replication factor C in Aspergillus nidulans, is essential for DNA replication, functions in UV repair and is upregulated in response to MMS-induced DNA damage.

    Science.gov (United States)

    Kafer, Etta; Chae, Suhn-Kee

    2008-09-01

    uvsF201 was the first highly UV-sensitive, repair-defective mutation isolated in Aspergillus nidulans. It showed epistasis only with postreplication repair mutations, but caused lethal interactions with many other repair-defective strains. Unexpectedly, the closest homology of uvsF was found to be with the large subunit of the human DNA replication factor RFC, which is essential for DNA replication. Sequencing of the uvsF201 region identified changes at two close base pairs, and in the corresponding amino acids, in the 5'-region of uvsF(RFC1). This viable mutant represents a novel and possibly important type. Additional sequencing of the uvsF region confirmed a mitochondrial ribosomal protein gene, mrpA(L16), closely adjacent and head-to-head, with a 0.2 kb joint promoter region. MMS induced transcription of both genes, but especially of uvsF(RFC1), providing evidence for a function in the DNA damage response.

  16. Global profiling of DNA replication timing and efficiency reveals that efficient replication/firing occurs late during S-phase in S. pombe.

    Directory of Open Access Journals (Sweden)

    Majid Eshaghi

    BACKGROUND: During S. pombe S-phase, initiation of DNA replication occurs at multiple sites (origins) that are enriched with AT-rich sequences, at various times. Current studies of genome-wide DNA replication profiles have focused on the DNA replication timing and origin location. However, the replication and/or firing efficiency of the individual origins on the genomic scale remain unclear. METHODOLOGY/PRINCIPAL FINDINGS: Using genome-wide ORF-specific DNA microarray analysis, we show that in S. pombe, individual origins fire with varying efficiencies and at different times during S-phase. The increase in DNA copy number plotted as a function of time is approximated by a near-sigmoidal model, when considering the replication start and end timings at individual loci in cells released from HU arrest. Replication efficiencies differ from origin to origin, depending on the origin's firing efficiency. We have found that DNA replication is inefficient early in S-phase, due to inefficient firing at origins. Efficient replication occurs later, attributed to efficient but late-firing origins. Furthermore, profiles of replication timing in cds1Delta cells are abnormal, due to the failure to resume replication at the collapsed forks. The majority of the inefficient origins, but not the efficient ones, are found to fire in cds1Delta cells after HU removal, owing to the firing at the remaining unused (inefficient) origins during HU treatment. CONCLUSIONS/SIGNIFICANCE: Taken together, our results indicate that efficient DNA replication/firing occurs late in S-phase progression in cells after HU removal, due to efficient late-firing origins. Additionally, the checkpoint kinase Cds1p is required for maintaining efficient replication/firing late in S-phase. We further propose that efficient late-firing origins are essential for ensuring completion of DNA duplication by the end of S-phase.
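
    A minimal sketch of the copy-number analysis described here: fit a sigmoid to the rise of DNA copy number at a locus and read off a replication timing (midpoint) and an efficiency (plateau height). The synthetic time course, the parameterization, and the use of scipy's curve_fit are illustrative assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        def sigmoid(t, t_rep, rate, efficiency):
            """Copy number rising from 1 toward 1 + efficiency, centered at t_rep."""
            return 1.0 + efficiency / (1.0 + np.exp(-rate * (t - t_rep)))

        # Illustrative time course (minutes after HU release) for a single locus
        t = np.array([0, 10, 20, 30, 40, 50, 60, 70, 80], dtype=float)
        copy_number = np.array([1.00, 1.02, 1.05, 1.15, 1.45, 1.75, 1.90, 1.95, 1.97])

        (t_rep, rate, efficiency), _ = curve_fit(sigmoid, t, copy_number, p0=[40.0, 0.2, 1.0])
        print(f"replication timing ~{t_rep:.0f} min, efficiency ~{efficiency:.2f}")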

  17. Middle Atmosphere Dynamics with Gravity Wave Interactions in the Numerical Spectral Model: Tides and Planetary Waves

    Science.gov (United States)

    Mayr, Hans G.; Mengel, J. G.; Chan, K. L.; Huang, F. T.

    2010-01-01

    As Lindzen (1981) had shown, small-scale gravity waves (GW) produce the observed reversals of the zonal-mean circulation and temperature variations in the upper mesosphere. The waves also play a major role in modulating and amplifying the diurnal tides (DT) (e.g., Waltersheid, 1981; Fritts and Vincent, 1987; Fritts, 1995a). We summarize here the modeling studies with the mechanistic numerical spectral model (NSM) with Doppler spread parameterization for GW (Hines, 1997a, b), which describes in the middle atmosphere: (a) migrating and non-migrating DT, (b) planetary waves (PW), and (c) global-scale inertio gravity waves. Numerical experiments are discussed that illuminate the influence of GW filtering and nonlinear interactions between DT, PW, and zonal mean variations. Keywords: Theoretical modeling, Middle atmosphere dynamics, Gravity wave interactions, Migrating and non-migrating tides, Planetary waves, Global-scale inertio gravity waves.

  18. Chromatin Immunoprecipitation of Replication Factors Moving with the Replication Fork

    OpenAIRE

    Rapp, Jordan B.; Ansbach, Alison B.; Noguchi, Chiaki; Noguchi, Eishi

    2009-01-01

    Replication of chromosomes involves a variety of replication proteins including DNA polymerases, DNA helicases, and other accessory factors. Many of these proteins are known to localize at replication forks and travel with them as components of the replisome complex. Other proteins do not move with replication forks but still play an essential role in DNA replication. Therefore, in order to understand the mechanisms of DNA replication and its controls, it is important to examine localization ...

  19. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    fuel/energy, climate, and finance has occurred and one of the most ... this wave of large-scale land acquisitions. In fact, esti- ... Environmental Rights Action/Friends of the Earth, Nigeria ... map the differentiated impacts (gender, ethnicity, ...

  20. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    Science.gov (United States)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
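
    To make the structure of such a model concrete, a generic form combining a dissipative eddy-viscosity part with a nondissipative nonlinear part can be written as below; this is a schematic illustration of the idea, and the actual tensor form and coefficients proposed in the paper may differ.

        \[
        \tau_{ij}^{\mathrm{mod}}
        \;=\;
        -2\,\nu_e\,\bar{S}_{ij}
        \;+\;
        \mu_t \left( \bar{S}_{ik}\bar{\Omega}_{kj} - \bar{\Omega}_{ik}\bar{S}_{kj} \right),
        \]

    where S̄ and Ω̄ are the resolved rate-of-strain and rate-of-rotation tensors and ν_e, μ_t are model coefficients. The first term is purely dissipative; the commutator term is traceless and satisfies (S̄Ω̄ − Ω̄S̄)_{ij} S̄_{ij} = 0, so it extracts no resolved kinetic energy and can only redistribute (transport) it, which is the role the abstract assigns to the nondissipative nonlinear model term.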