WorldWideScience

Sample records for large scale population

  1. Large Scale Flood Risk Analysis using a New Hyper-resolution Population Dataset

    Science.gov (United States)

    Smith, A.; Neal, J. C.; Bates, P. D.; Quinn, N.; Wing, O.

    2017-12-01

    Here we present the first national-scale flood risk analyses using high-resolution Facebook Connectivity Lab population data together with data from a hyper-resolution flood hazard model. In recent years the field of large-scale hydraulic modelling has been transformed by new remotely sensed datasets, improved process representation, highly efficient flow algorithms and increases in computational power. These developments have allowed flood risk analysis to be undertaken in previously unmodelled territories and from continental to global scales. Flood risk analyses are typically conducted via the integration of modelled water depths with an exposure dataset. Over large scales and in data-poor areas, these exposure data typically take the form of a gridded population dataset, estimating population density using remotely sensed data and/or locally available census data. The local nature of flooding dictates that, for robust flood risk analysis to be undertaken, both hazard and exposure data should sufficiently resolve local-scale features. Global flood frameworks are enabling flood hazard data to be produced at 90 m resolution, resulting in a mismatch with available population datasets, which are typically more coarsely resolved. Moreover, these exposure data are typically focused on urban areas and struggle to represent rural populations. In this study we integrate a new population dataset with a global flood hazard model. The population dataset was produced by the Connectivity Lab at Facebook, providing gridded population data at 5 m resolution, a resolution increase over previous countrywide datasets of multiple orders of magnitude. Flood risk analyses undertaken over a number of developing countries are presented, along with a comparison with flood risk analyses undertaken using pre-existing population datasets.
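
    The core exposure calculation described above (overlaying a gridded flood-depth layer on a co-registered gridded population layer and summing the population in wet cells) can be sketched in a few lines. The array sizes, random stand-in rasters and the zero-depth threshold below are illustrative assumptions, not details taken from the study.

```python
import numpy as np

# Minimal sketch of a grid-based flood exposure calculation (illustrative only).
# Both layers are assumed to be co-registered on the same grid; in practice the
# coarser population raster would first be resampled to the hazard resolution.

rng = np.random.default_rng(0)

# Synthetic stand-ins for the real rasters (depths in metres, people per cell).
flood_depth = np.maximum(rng.normal(-1.0, 1.0, size=(1000, 1000)), 0.0)  # hazard layer
population = rng.poisson(lam=0.05, size=(1000, 1000)).astype(float)      # exposure layer

depth_threshold = 0.0  # assumed threshold: any positive depth counts as flooded

flooded = flood_depth > depth_threshold
exposed_population = population[flooded].sum()
total_population = population.sum()

print(f"Exposed population: {exposed_population:,.0f} "
      f"({100 * exposed_population / total_population:.1f}% of total)")
```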

  2. Ethics of large-scale change

    OpenAIRE

    Arler, Finn

    2006-01-01

    The subject of this paper is long-term, large-scale changes in human society. Some very significant examples of large-scale change are presented: human population growth, human appropriation of land and primary production, the human use of fossil fuels, and climate change. The question is posed of which kind of attitude is appropriate, from an ethical point of view, when dealing with large-scale changes like these. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, th...

  3. Coverage of the migrant population in large-scale assessment surveys. Experiences from PIAAC in Germany

    Directory of Open Access Journals (Sweden)

    Débora B. Maehler

    2017-03-01

    Background: European countries, and especially Germany, are currently very much affected by human migration flows, with the result that the task of integration has become a challenge. Only very little empirical evidence on topics such as labor market participation and processes of social integration of migrant subpopulations is available to date from large-scale population surveys. The present paper provides an overview of the representation of the migrant population in the German Programme for the International Assessment of Adult Competencies (PIAAC) sample and evaluates reasons for the under-coverage of this population. Methods: We examine outcome rates and reasons for nonresponse among the migrant population based on sampling frame data, and we also examine paradata from the interviewers' contact protocols to evaluate time patterns for the successful contacting of migrants. Results and Conclusions: This is the first time that results of this kind have been presented for a large-scale assessment in educational research. These results are also discussed in the context of future PIAAC cycles. Overall, they confirm the expectations in the literature that factors such as language problems result in lower contact and response rates among migrants.

  4. Transfrontier consequences to the population of Greece of large scale nuclear accidents: a preliminary assessment

    International Nuclear Information System (INIS)

    Kollas, J.G.; Catsaros, Nicolas.

    1985-06-01

    In this report the consequences to the population of Greece of hypothetical large-scale nuclear accidents at the Kozloduy (Bulgaria) nuclear power station are estimated under some simplifying assumptions. Three different hypothetical accident scenarios - the most serious for pressurized water reactors - are examined. The analysis is performed with the current Greek version of the CRAC2 code and includes health and economic consequences to the population of Greece. (author)

  5. Measuring happiness in large population

    Science.gov (United States)

    Wenas, Annabelle; Sjahputri, Smita; Takwin, Bagus; Primaldhi, Alfindra; Muhamad, Roby

    2016-01-01

    The ability to know the emotional states of large numbers of people is important, for example, to ensure the effectiveness of public policies. In this study, we propose a measure of happiness that can be used in large-scale populations and that is based on the analysis of Indonesian-language lexicons. Here, we incorporate human assessment of Indonesian words, then quantify happiness on a large scale of texts gathered from Twitter conversations. We used two psychological constructs to measure happiness: valence and arousal. We found that Indonesian words have a tendency towards positive emotions. We also identified several happiness patterns across days of the week, hours of the day, and selected conversation topics.
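
    The lexicon-based scoring the abstract describes (averaging human valence and arousal ratings over the rated words found in each text) can be illustrated with a small sketch. The word ratings and example tweets below are invented placeholders, not the Indonesian lexicon or data used in the study.

```python
# Minimal sketch of lexicon-based happiness scoring (valence/arousal averaging).
# The lexicon entries and example texts below are invented placeholders, not the
# Indonesian word ratings used in the study.

lexicon = {
    "senang": {"valence": 8.1, "arousal": 5.9},   # "happy"
    "sedih":  {"valence": 2.3, "arousal": 4.1},   # "sad"
    "marah":  {"valence": 2.0, "arousal": 7.5},   # "angry"
    "tenang": {"valence": 7.2, "arousal": 2.8},   # "calm"
}

def score_text(text, lexicon):
    """Average valence and arousal over the lexicon words found in a text."""
    hits = [lexicon[w] for w in text.lower().split() if w in lexicon]
    if not hits:
        return None
    n = len(hits)
    return {"valence": sum(h["valence"] for h in hits) / n,
            "arousal": sum(h["arousal"] for h in hits) / n,
            "matched_words": n}

tweets = ["hari ini saya senang dan tenang", "dia marah dan sedih"]
for t in tweets:
    print(t, "->", score_text(t, lexicon))
```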

  6. Macroecological factors explain large-scale spatial population patterns of ancient agriculturalists

    NARCIS (Netherlands)

    Xu, C.; Chen, B.; Abades, S.; Reino, L.; Teng, S.; Ljungqvist, F.C.; Huang, Z.Y.X.; Liu, X.

    2015-01-01

    Aim: It has been well demonstrated that the large-scale distribution patterns of numerous species are driven by similar macroecological factors. However, understanding of this topic remains limited when applied to our own species. Here we take a large-scale look at ancient agriculturalist

  7. Large-scale control site selection for population monitoring: an example assessing Sage-grouse trends

    Science.gov (United States)

    Fedy, Bradley C.; O'Donnell, Michael; Bowen, Zachary H.

    2015-01-01

    Human impacts on wildlife populations are widespread and prolific and understanding wildlife responses to human impacts is a fundamental component of wildlife management. The first step to understanding wildlife responses is the documentation of changes in wildlife population parameters, such as population size. Meaningful assessment of population changes in potentially impacted sites requires the establishment of monitoring at similar, nonimpacted, control sites. However, it is often difficult to identify appropriate control sites in wildlife populations. We demonstrated use of Geographic Information System (GIS) data across large spatial scales to select biologically relevant control sites for population monitoring. Greater sage-grouse (Centrocercus urophasianus; hereafter, sage-grouse) are negatively affected by energy development, and monitoring of sage-grouse populations within energy development areas is necessary to detect population-level responses. We used population data (1995–2012) from an energy development area in Wyoming, USA, the Atlantic Rim Project Area (ARPA), and GIS data to identify control sites that were not impacted by energy development for population monitoring. Control sites were surrounded by similar habitat and were within similar climate areas to the ARPA. We developed nonlinear trend models for both the ARPA and control sites and compared long-term trends from the 2 areas. We found little difference between the ARPA and control-site trends over time. This research demonstrated an approach for control site selection across large landscapes and can be used as a template for similar impact-monitoring studies. It is important to note that identification of changes in population parameters between control and treatment sites is only the first step in understanding the mechanisms that underlie those changes. Published 2015. This article is a U.S. Government work and is in the public domain in the USA.
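
    The abstract does not spell out the GIS workflow, but one common way to pick control sites is nearest-neighbour matching of candidate sites to the impact area on standardized habitat and climate covariates. The sketch below illustrates that generic idea; the covariates and values are invented and the study's actual selection criteria may differ.

```python
import numpy as np

# Illustrative sketch of control-site selection by covariate similarity.
# The covariates (sagebrush cover, precipitation, elevation) and values are
# invented; the actual GIS workflow in the study may differ.

rng = np.random.default_rng(1)

# Mean covariates of the impacted area (e.g., the ARPA).
impact_site = np.array([0.45, 320.0, 2100.0])   # cover, precip (mm), elev (m)

# Candidate non-impacted sites: rows = sites, cols = the same covariates.
candidates = np.column_stack([
    rng.uniform(0.1, 0.7, 200),      # sagebrush cover
    rng.uniform(200, 500, 200),      # annual precipitation
    rng.uniform(1500, 2600, 200),    # elevation
])

# Standardize each covariate so distances are comparable across units.
mu, sd = candidates.mean(axis=0), candidates.std(axis=0)
z_candidates = (candidates - mu) / sd
z_impact = (impact_site - mu) / sd

# Rank candidates by Euclidean distance in standardized covariate space.
dist = np.linalg.norm(z_candidates - z_impact, axis=1)
best_controls = np.argsort(dist)[:5]
print("Indices of the 5 most similar candidate control sites:", best_controls)
```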

  8. Analysis using large-scale ringing data

    Directory of Open Access Journals (Sweden)

    Baillie, S. R.

    2004-06-01

    Birds are highly mobile organisms and there is increasing evidence that studies at large spatial scales are needed if we are to properly understand their population dynamics. While classical metapopulation models have rarely proved useful for birds, more general metapopulation ideas involving collections of populations interacting within spatially structured landscapes are highly relevant (Harrison, 1994). There is increasing interest in understanding patterns of synchrony, or lack of synchrony, between populations and the environmental and dispersal mechanisms that bring about these patterns (Paradis et al., 2000). To investigate these processes we need to measure abundance, demographic rates and dispersal at large spatial scales, in addition to gathering data on relevant environmental variables. There is an increasing realisation that conservation needs to address rapid declines of common and widespread species (they will not remain so if such trends continue) as well as the management of small populations that are at risk of extinction. While the knowledge needed to support the management of small populations can often be obtained from intensive studies in a few restricted areas, conservation of widespread species often requires information on population trends and processes measured at regional, national and continental scales (Baillie, 2001). While management prescriptions for widespread populations may initially be developed from a small number of local studies or experiments, there is an increasing need to understand how such results will scale up when applied across wider areas. There is also a vital role for monitoring at large spatial scales, both in identifying such population declines and in assessing population recovery. Gathering data on avian abundance and demography at large spatial scales usually relies on the efforts of large numbers of skilled volunteers. Volunteer studies based on ringing (for example Constant Effort Sites [CES

  9. Response of human populations to large-scale emergencies

    Science.gov (United States)

    Bagrow, James; Wang, Dashun; Barabási, Albert-László

    2010-03-01

    Until recently, little quantitative data regarding collective human behavior during dangerous events such as bombings and riots have been available, despite their importance for emergency management, safety and urban planning. Understanding how populations react to danger is critical for prediction, detection and intervention strategies. Using a large telecommunications dataset, we study for the first time the spatiotemporal, social and demographic response properties of people during several disasters, including a bombing, a city-wide power outage, and an earthquake. Call activity rapidly increases after an event and we find that, when faced with a truly life-threatening emergency, information rapidly propagates through a population's social network. Other events, such as sports games, do not exhibit this propagation.

  10. Homogenization of Large-Scale Movement Models in Ecology

    Science.gov (United States)

    Garlick, M.J.; Powell, J.A.; Hooten, M.B.; McFarlane, L.R.

    2011-01-01

    A difficulty in using diffusion models to predict large scale animal population dispersal is that individuals move differently based on local information (as opposed to gradients) in differing habitat types. This can be accommodated by using ecological diffusion. However, real environments are often spatially complex, limiting application of a direct approach. Homogenization for partial differential equations has long been applied to Fickian diffusion (in which average individual movement is organized along gradients of habitat and population density). We derive a homogenization procedure for ecological diffusion and apply it to a simple model for chronic wasting disease in mule deer. Homogenization allows us to determine the impact of small scale (10-100 m) habitat variability on large scale (10-100 km) movement. The procedure generates asymptotic equations for solutions on the large scale with parameters defined by small-scale variation. The simplicity of this homogenization procedure is striking when compared to the multi-dimensional homogenization procedure for Fickian diffusion, and the method will be equally straightforward for more complex models. © 2010 Society for Mathematical Biology.
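
    The contrast between ecological and Fickian diffusion, and what homogenization buys, can be summarized schematically. The forms below are standard textbook expressions given for orientation; the paper's own notation and derivation may differ.

```latex
% Schematic contrast (standard textbook forms; the paper's notation may differ):
\begin{align*}
\text{Ecological diffusion:} \quad & \partial_t u = \nabla^2\!\left[\mu(x)\,u\right],\\
\text{Fickian diffusion:}    \quad & \partial_t u = \nabla\cdot\left[\mu(x)\,\nabla u\right].
\end{align*}
% Homogenization replaces the rapidly varying motility \mu(x) by an effective
% constant on the large scale. In one dimension, writing v = \mu u turns the
% ecological equation into \mu^{-1}\partial_t v = \partial_x^2 v, and averaging
% over the fast habitat variation gives a harmonic-type mean motility
\begin{equation*}
\bar{\mu} = \left(\frac{1}{|\Omega|}\int_\Omega \frac{\mathrm{d}x}{\mu(x)}\right)^{-1},
\qquad
\partial_t \bar{v} \approx \bar{\mu}\,\partial_x^2 \bar{v},
\end{equation*}
% so small-scale (10-100 m) habitat variability enters the large-scale
% (10-100 km) model only through \bar{\mu}.
```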

  11. Detecting differential protein expression in large-scale population proteomics

    Energy Technology Data Exchange (ETDEWEB)

    Ryu, Soyoung; Qian, Weijun; Camp, David G.; Smith, Richard D.; Tompkins, Ronald G.; Davis, Ronald W.; Xiao, Wenzhong

    2014-06-17

    Mass spectrometry-based high-throughput quantitative proteomics shows great potential in clinical biomarker studies, identifying and quantifying thousands of proteins in biological samples. However, methods are needed to appropriately handle issues and challenges unique to mass spectrometry data in order to detect as many biomarker proteins as possible. One issue is that different mass spectrometry experiments generate quite different total numbers of quantified peptides, which can result in more missing peptide abundances in an experiment with a smaller total number of quantified peptides. Another issue is that the quantification of peptides is sometimes absent, especially for less abundant peptides, and such missing values contain information about the peptide abundance. Here, we propose a Significance Analysis for Large-scale Proteomics Studies (SALPS) that handles missing peptide intensity values caused by the two mechanisms mentioned above. Our model has robust performance on both simulated data and proteomics data from a large clinical study. Because varying patient sample quality and deviating instrument performance are unavoidable in clinical studies performed over the course of several years, we believe that our approach will be useful for analyzing large-scale clinical proteomics data.

  12. Understanding uncertainties in non-linear population trajectories: a Bayesian semi-parametric hierarchical approach to large-scale surveys of coral cover.

    Directory of Open Access Journals (Sweden)

    Julie Vercelloni

    Recently, attempts to improve decision making in species management have focussed on uncertainties associated with modelling temporal fluctuations in populations. Reducing model uncertainty is challenging; while larger samples improve estimation of species trajectories and reduce statistical errors, they typically amplify variability in observed trajectories. In particular, traditional modelling approaches aimed at estimating population trajectories usually do not account well for nonlinearities and uncertainties associated with multi-scale observations characteristic of large spatio-temporal surveys. We present a Bayesian semi-parametric hierarchical model for simultaneously quantifying uncertainties associated with model structure and parameters, and scale-specific variability over time. We estimate uncertainty across a four-tiered spatial hierarchy of coral cover from the Great Barrier Reef. Coral variability is well described; however, our results show that, in the absence of additional model specifications, conclusions regarding coral trajectories become highly uncertain when considering multiple reefs, suggesting that management should focus more at the scale of individual reefs. The approach presented facilitates the description and estimation of population trajectories and associated uncertainties when variability cannot be attributed to specific causes and origins. We argue that our model can unlock value contained in large-scale datasets, provide guidance for understanding sources of uncertainty, and support better informed decision making.

  13. Effects of Large-Scale Releases on the Genetic Structure of Red Sea Bream (Pagrus major, Temminck et Schlegel) Populations in Japan.

    Science.gov (United States)

    Blanco Gonzalez, Enrique; Aritaki, Masato; Knutsen, Halvor; Taniguchi, Nobuhiko

    2015-01-01

    Large-scale hatchery releases are carried out for many marine fish species worldwide; nevertheless, the long-term effects of this practice on the genetic structure of natural populations remain unclear. The lack of knowledge is especially evident when independent stock enhancement programs are conducted simultaneously on the same species at different geographical locations, as occurs with red sea bream (Pagrus major, Temminck et Schlegel) in Japan. In this study, we examined the putative effects of intensive offspring releases on the genetic structure of red sea bream populations along the Japanese archipelago by genotyping 848 fish at fifteen microsatellite loci. Our results suggest weak but consistent patterns of genetic divergence (FST = 0.002, p < …). Red sea bream in Japan appeared spatially structured, with several patches of distinct allelic composition, which corresponded to areas receiving an important influx of fish of hatchery origin, either released intentionally or escaped unintentionally from aquaculture operations. In addition to impacts upon local populations inhabiting semi-enclosed embayments, large-scale releases (either intentional or from unintentional escapes) appeared also to have perturbed genetic structure in open areas. Hence, the results of the present study suggest that independent large-scale marine stock enhancement programs conducted simultaneously on one species at different geographical locations may compromise native genetic structure and lead to patchy patterns in population genetic structure.
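
    For orientation, the fixation index quoted above has the textbook form below; the study itself uses a microsatellite-based estimator whose details may differ.

```latex
% Textbook definition of the fixation index (the study uses a microsatellite-based
% estimator whose details may differ):
\begin{equation*}
F_{ST} \;=\; \frac{H_T - H_S}{H_T},
\end{equation*}
% where H_S is the mean expected heterozygosity within subpopulations and H_T is
% the expected heterozygosity of the pooled total population. The reported
% F_{ST} = 0.002 therefore indicates weak but non-zero differentiation.
```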

  14. Large scale electrolysers

    International Nuclear Information System (INIS)

    B Bello; M Junker

    2006-01-01

    Hydrogen production by water electrolysis represents nearly 4% of world hydrogen production. Future development of hydrogen vehicles will require large quantities of hydrogen, so the installation of large-scale hydrogen production plants will be needed. In this context, the development of low-cost, large-scale electrolysers that could use 'clean power' seems necessary. ALPHEA HYDROGEN, a European network and centre of expertise on hydrogen and fuel cells, performed a study for its members in 2005 to evaluate the potential of large-scale electrolysers to produce hydrogen in the future. The different electrolysis technologies were compared, and a state-of-the-art review of the electrolysis modules currently available was compiled. A review of the large-scale electrolysis plants that have been installed around the world was also carried out, and the main projects related to large-scale electrolysis were listed. The economics of large-scale electrolysers are discussed, and the influence of energy prices on the hydrogen production cost by large-scale electrolysis was evaluated. (authors)
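
    The stated influence of energy prices on hydrogen production cost reduces to a simple energy-plus-capital calculation. The sketch below uses invented parameter values (specific energy demand, capex, utilization), not figures from the ALPHEA study; it only shows how the electricity price propagates into the cost per kilogram.

```python
# Back-of-the-envelope hydrogen cost from electrolysis. All parameter values
# below are illustrative assumptions, not figures from the ALPHEA study.

def h2_cost_per_kg(electricity_price_eur_per_kwh,
                   specific_energy_kwh_per_kg=55.0,   # assumed electrolyser demand
                   capex_eur_per_kw=800.0,            # assumed installed cost
                   full_load_hours=6000.0,            # assumed yearly utilization
                   crf=0.10,                          # assumed capital recovery factor
                   opex_fraction=0.03):               # assumed O&M share of capex
    """Levelized hydrogen cost (EUR/kg) for a simple large-scale electrolyser model."""
    energy_cost = specific_energy_kwh_per_kg * electricity_price_eur_per_kwh
    # Annualized capital + O&M spread over the hydrogen produced per kW of capacity.
    kg_per_kw_year = full_load_hours / specific_energy_kwh_per_kg
    capital_cost = capex_eur_per_kw * (crf + opex_fraction) / kg_per_kw_year
    return energy_cost + capital_cost

for price in (0.02, 0.05, 0.08):  # EUR/kWh
    print(f"electricity at {price:.2f} EUR/kWh -> ~{h2_cost_per_kg(price):.2f} EUR/kg H2")
```

    With these assumed numbers the electricity term dominates the cost above a few cents per kWh, which illustrates why electricity price is treated as the key sensitivity.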

  15. Large-scale climatic anomalies affect marine predator foraging behaviour and demography

    Science.gov (United States)

    Bost, Charles A.; Cotté, Cedric; Terray, Pascal; Barbraud, Christophe; Bon, Cécile; Delord, Karine; Gimenez, Olivier; Handrich, Yves; Naito, Yasuhiko; Guinet, Christophe; Weimerskirch, Henri

    2015-10-01

    Determining the links between the behavioural and population responses of wild species to environmental variations is critical for understanding the impact of climate variability on ecosystems. Using long-term data sets, we show how large-scale climatic anomalies in the Southern Hemisphere affect the foraging behaviour and population dynamics of a key marine predator, the king penguin. When large-scale subtropical dipole events occur simultaneously in both the subtropical Southern Indian and Atlantic Oceans, they generate tropical anomalies that shift the foraging zone southward. Consequently, the distances that penguins foraged from the colony and their feeding depths increased, and the population size decreased. This represents an example of a robust and fast impact of large-scale climatic anomalies on a marine predator through changes in its at-sea behaviour and demography, despite the lack of information on prey availability. Our results highlight a possible behavioural mechanism through which climate variability may affect population processes.

  16. Large-scale Reconstructions and Independent, Unbiased Clustering Based on Morphological Metrics to Classify Neurons in Selective Populations.

    Science.gov (United States)

    Bragg, Elise M; Briggs, Farran

    2017-02-15

    This protocol outlines large-scale reconstructions of neurons combined with the use of independent and unbiased clustering analyses to create a comprehensive survey of the morphological characteristics observed among a selective neuronal population. Combination of these techniques constitutes a novel approach for the collection and analysis of neuroanatomical data. Together, these techniques enable large-scale, and therefore more comprehensive, sampling of selective neuronal populations and establish unbiased quantitative methods for describing morphologically unique neuronal classes within a population. The protocol outlines the use of modified rabies virus to selectively label neurons. G-deleted rabies virus acts like a retrograde tracer following stereotaxic injection into a target brain structure of interest and serves as a vehicle for the delivery and expression of EGFP in neurons. Large numbers of neurons are infected using this technique and express GFP throughout their dendrites, producing "Golgi-like" complete fills of individual neurons. Accordingly, the virus-mediated retrograde tracing method improves upon traditional dye-based retrograde tracing techniques by producing complete intracellular fills. Individual well-isolated neurons spanning all regions of the brain area under study are selected for reconstruction in order to obtain a representative sample of neurons. The protocol outlines procedures to reconstruct cell bodies and complete dendritic arborization patterns of labeled neurons spanning multiple tissue sections. Morphological data, including positions of each neuron within the brain structure, are extracted for further analysis. Standard programming functions were utilized to perform independent cluster analyses and cluster evaluations based on morphological metrics. To verify the utility of these analyses, statistical evaluation of a cluster analysis performed on 160 neurons reconstructed in the thalamic reticular nucleus of the thalamus
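
    The "independent, unbiased clustering" step amounts to standard clustering on a matrix of per-neuron morphological metrics followed by a cluster-quality evaluation. The sketch below illustrates one generic way to do this (k-means plus silhouette scores on synthetic features); the metric set, clustering method and evaluation used in the protocol may differ.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Minimal sketch of unbiased clustering on morphological metrics. The feature
# names and synthetic data are placeholders; the protocol's own metric set and
# clustering/evaluation functions may differ.

rng = np.random.default_rng(2)

# Rows = reconstructed neurons, columns = morphological metrics
# (e.g., soma area, total dendritic length, branch points, max branch order).
morphology = np.vstack([
    rng.normal([120, 2500, 30, 5], [15, 300, 5, 1], size=(80, 4)),
    rng.normal([200, 4000, 55, 8], [20, 400, 8, 1], size=(80, 4)),
])

X = StandardScaler().fit_transform(morphology)

# Evaluate several candidate cluster counts and keep the best silhouette score.
scores = {}
for k in range(2, 7):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
print("silhouette scores by k:", {k: round(s, 3) for k, s in scores.items()})
print("best number of morphological classes:", best_k)
```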

  17. Worldwide large-scale fluctuations of sardine and anchovy ...

    African Journals Online (AJOL)

    Worldwide large-scale fluctuations of sardine and anchovy populations. African Journal of Marine Science. http://dx.doi.org/10.2989/AJMS.2008.30.1.13.463

  18. Did Large-Scale Vaccination Drive Changes in the Circulating Rotavirus Population in Belgium?

    Science.gov (United States)

    Pitzer, Virginia E.; Bilcke, Joke; Heylen, Elisabeth; Crawford, Forrest W.; Callens, Michael; De Smet, Frank; Van Ranst, Marc; Zeller, Mark; Matthijnssens, Jelle

    2015-01-01

    Vaccination can place selective pressures on viral populations, leading to changes in the distribution of strains as viruses evolve to escape immunity from the vaccine. Vaccine-driven strain replacement is a major concern after nationwide rotavirus vaccine introductions. However, the distribution of the predominant rotavirus genotypes varies from year to year in the absence of vaccination, making it difficult to determine what changes can be attributed to the vaccines. To gain insight into the underlying dynamics driving changes in the rotavirus population, we fitted a hierarchy of mathematical models to national and local genotype-specific hospitalization data from Belgium, where large-scale vaccination was introduced in 2006. We estimated that natural- and vaccine-derived immunity was strongest against completely homotypic strains and weakest against fully heterotypic strains, with an intermediate immunity amongst partially heterotypic strains. The predominance of G2P[4] infections in Belgium after vaccine introduction can be explained by a combination of natural genotype fluctuations and weaker natural and vaccine-induced immunity against infection with strains heterotypic to the vaccine, in the absence of significant variation in strain-specific vaccine effectiveness against disease. However, the incidence of rotavirus gastroenteritis is predicted to remain low despite vaccine-driven changes in the distribution of genotypes. PMID:26687288

  19. Large scale study of tooth enamel

    International Nuclear Information System (INIS)

    Bodart, F.; Deconninck, G.; Martin, M.T.

    Human tooth enamel contains traces of foreign elements. The presence of these elements is related to the history and the environment of the human body and can be considered as the signature of perturbations which occur during the growth of a tooth. A map of the distribution of these traces in a large-scale sample of the population will constitute a reference for further investigations of environmental effects. One hundred and eighty samples of teeth were first analyzed using PIXE, backscattering and nuclear reaction techniques. The results were analyzed using statistical methods. Correlations between O, F, Na, P, Ca, Mn, Fe, Cu, Zn, Pb and Sr were observed and cluster analysis was in progress. The techniques described in the present work have been developed in order to establish a method for the exploration of very large samples of the Belgian population. (author)

  1. WKB theory of large deviations in stochastic populations

    Science.gov (United States)

    Assaf, Michael; Meerson, Baruch

    2017-06-01

    Stochasticity can play an important role in the dynamics of biologically relevant populations. These span a broad range of scales: from intra-cellular populations of molecules to populations of cells and on to groups of plants, animals and people. Large deviations in stochastic population dynamics—such as those determining population extinction, fixation or switching between different states—are presently a focus of attention of statistical physicists. We review recent progress in applying different variants of the dissipative WKB approximation (after Wentzel, Kramers and Brillouin) to this class of problems. The WKB approximation allows one to evaluate the mean time and/or probability of population extinction, fixation and switches resulting from either intrinsic (demographic) noise, or a combination of demographic noise and environmental variations, deterministic or random. We mostly cover well-mixed populations, single and multiple, but also briefly consider populations on heterogeneous networks and spatial populations. The spatial setting also allows one to study large fluctuations of the speed of biological invasions. Finally, we briefly discuss possible directions of future work.
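
    The central objects of the WKB treatment can be stated compactly for a single well-mixed population; the schematic relations below are generic illustrations of the approach reviewed here, not results quoted from the paper.

```latex
% Generic dissipative WKB (eikonal) ansatz for a single well-mixed population of
% typical size N (schematic; the review covers many variants and extensions):
\begin{equation*}
P_n \;\sim\; \exp\!\left[-N\,S(q)\right], \qquad q = n/N,
\end{equation*}
% where the action S(q) obeys a stationary Hamilton--Jacobi equation
% H(q, \mathrm{d}S/\mathrm{d}q) = 0 built from the population's birth and death
% rates. The mean extinction or switching time then scales exponentially with
% the action accumulated along the optimal path,
\begin{equation*}
\tau \;\sim\; \exp\!\left[N\,\Delta S\right],
\end{equation*}
% which is why such large deviations are rare yet dominate extinction statistics.
```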

  2. WKB theory of large deviations in stochastic populations

    International Nuclear Information System (INIS)

    Assaf, Michael; Meerson, Baruch

    2017-01-01

    Stochasticity can play an important role in the dynamics of biologically relevant populations. These span a broad range of scales: from intra-cellular populations of molecules to populations of cells and on to groups of plants, animals and people. Large deviations in stochastic population dynamics—such as those determining population extinction, fixation or switching between different states—are presently a focus of attention of statistical physicists. We review recent progress in applying different variants of the dissipative WKB approximation (after Wentzel, Kramers and Brillouin) to this class of problems. The WKB approximation allows one to evaluate the mean time and/or probability of population extinction, fixation and switches resulting from either intrinsic (demographic) noise, or a combination of demographic noise and environmental variations, deterministic or random. We mostly cover well-mixed populations, single and multiple, but also briefly consider populations on heterogeneous networks and spatial populations. The spatial setting also allows one to study large fluctuations of the speed of biological invasions. Finally, we briefly discuss possible directions of future work. (topical review)

  3. Neutrinos and large-scale structure

    International Nuclear Information System (INIS)

    Eisenstein, Daniel J.

    2015-01-01

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.
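
    As context (not stated in the abstract), the quantity that large-scale structure actually constrains is the neutrino energy density, which is tied to the summed mass by a standard relation:

```latex
% Standard relations (textbook values, not quoted from the abstract):
\begin{equation*}
\Omega_\nu h^2 \;\simeq\; \frac{\sum m_\nu}{93.14\ \mathrm{eV}},
\qquad
N_{\mathrm{eff}} \simeq 3.044 \ \text{(standard-model expectation)},
\end{equation*}
% where \Omega_\nu h^2 is the present-day neutrino energy density constrained by
% clustering data and N_{\mathrm{eff}} parameterizes any extra relic radiation.
```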

  4. Neutrinos and large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Eisenstein, Daniel J. [Daniel J. Eisenstein, Harvard-Smithsonian Center for Astrophysics, 60 Garden St., MS #20, Cambridge, MA 02138 (United States)]

    2015-07-15

    I review the use of cosmological large-scale structure to measure properties of neutrinos and other relic populations of light relativistic particles. With experiments to measure the anisotropies of the cosmic microwave background and the clustering of matter at low redshift, we now have securely measured a relativistic background with density appropriate to the cosmic neutrino background. Our limits on the mass of the neutrino continue to shrink. Experiments coming in the next decade will greatly improve the available precision on searches for the energy density of novel relativistic backgrounds and the mass of neutrinos.

  5. Large-scale solar purchasing

    International Nuclear Information System (INIS)

    1999-01-01

    The principal objective of the project was to participate in the definition of a new IEA task concerning solar procurement ("the Task") and to assess whether involvement in the Task would be in the interest of the UK active solar heating industry. The project also aimed to assess the importance of large-scale solar purchasing to UK active solar heating market development and to evaluate the level of interest in large-scale solar purchasing amongst potential large-scale purchasers (in particular housing associations and housing developers). A further aim of the project was to consider means of stimulating large-scale active solar heating purchasing activity within the UK. (author)

  6. Extension of landscape-based population viability models to ecoregional scales for conservation planning

    Science.gov (United States)

    Thomas W. Bonnot; Frank R. Thompson III; Joshua Millspaugh

    2011-01-01

    Landscape-based population models are potentially valuable tools in facilitating conservation planning and actions at large scales. However, such models have rarely been applied at ecoregional scales. We extended landscape-based population models to ecoregional scales for three species of concern in the Central Hardwoods Bird Conservation Region and compared model...

  7. The Feasibility of Using Large-Scale Text Mining to Detect Adverse Childhood Experiences in a VA-Treated Population.

    Science.gov (United States)

    Hammond, Kenric W; Ben-Ari, Alon Y; Laundry, Ryan J; Boyko, Edward J; Samore, Matthew H

    2015-12-01

    Free text in electronic health records resists large-scale analysis. Text records facts of interest not found in encoded data, and text mining enables their retrieval and quantification. The U.S. Department of Veterans Affairs (VA) clinical data repository affords an opportunity to apply text-mining methodology to study clinical questions in large populations. To assess the feasibility of text mining, an investigation of the relationship between exposure to adverse childhood experiences (ACEs) and recorded diagnoses was conducted among all VA-treated Gulf War veterans, utilizing all progress notes recorded from 2000 to 2011. Text processing extracted ACE exposures recorded among 44.7 million clinical notes belonging to 243,973 veterans. The relationship of ACE exposure to adult illnesses was analyzed using logistic regression. Bias considerations were assessed. Per unit increase in ACE score, exposure was strongly associated with suicide attempts and serious mental disorders (ORs = 1.84 to 1.97), and less so with behaviorally mediated and somatic conditions (ORs = 1.02 to 1.36). Bias adjustments did not remove persistent associations between ACE score and most illnesses. Text mining to detect ACE exposure in a large population was feasible. Analysis of the relationship between ACE score and adult health conditions yielded patterns of association consistent with prior research. Copyright © 2015 International Society for Traumatic Stress Studies.
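
    The per-unit odds ratios reported above come from logistic regressions of each outcome on the extracted ACE score. The toy sketch below reproduces that kind of analysis on simulated data; the variables, effect size and single covariate are invented, and the study's actual models include different adjustments.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy sketch of the per-unit odds-ratio analysis described above, on simulated
# data. The effect size, covariate, and sample are invented; the study used
# real VA records and additional adjustments.

rng = np.random.default_rng(3)
n = 50_000

ace_score = rng.integers(0, 9, size=n)             # ACE count extracted from text
age = rng.normal(45, 10, size=n)                   # an example covariate

# Simulate an outcome whose log-odds rise ~0.6 per ACE point (OR ~ 1.8).
logit = -4.0 + 0.6 * ace_score + 0.01 * (age - 45)
y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([ace_score, age])
model = LogisticRegression(max_iter=1000).fit(X, y)

odds_ratios = np.exp(model.coef_[0])
print(f"OR per unit ACE score: {odds_ratios[0]:.2f}")   # should be near 1.8
print(f"OR per year of age:    {odds_ratios[1]:.2f}")
```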

  8. Energy transfers in large-scale and small-scale dynamos

    Science.gov (United States)

    Samtaney, Ravi; Kumar, Rohit; Verma, Mahendra

    2015-11-01

    We present the energy transfers, mainly energy fluxes and shell-to-shell energy transfers, in small-scale dynamo (SSD) and large-scale dynamo (LSD) regimes, using numerical simulations of MHD turbulence for Pm = 20 (SSD) and Pm = 0.2 (LSD) on a 1024³ grid. For SSD, we demonstrate that the magnetic energy growth is caused by nonlocal energy transfers from the large-scale or forcing-scale velocity field to the small-scale magnetic field. The peak of these energy transfers moves towards lower wavenumbers as the dynamo evolves, which is the reason for the growth of the magnetic field at large scales. The energy transfers U2U (velocity to velocity) and B2B (magnetic to magnetic) are forward and local. For LSD, we show that the magnetic energy growth takes place via energy transfers from the large-scale velocity field to the large-scale magnetic field. We observe forward U2U and B2B energy fluxes, similar to SSD.

  9. Foundational perspectives on causality in large-scale brain networks

    Science.gov (United States)

    Mannino, Michael; Bressler, Steven L.

    2015-12-01

    A profusion of recent work in cognitive neuroscience has been concerned with the endeavor to uncover causal influences in large-scale brain networks. However, despite the fact that many papers give a nod to the important theoretical challenges posed by the concept of causality, this explosion of research has generally not been accompanied by a rigorous conceptual analysis of the nature of causality in the brain. This review provides both a descriptive and prescriptive account of the nature of causality as found within and between large-scale brain networks. In short, it seeks to clarify the concept of causality in large-scale brain networks both philosophically and scientifically. This is accomplished by briefly reviewing the rich philosophical history of work on causality, especially focusing on contributions by David Hume, Immanuel Kant, Bertrand Russell, and Christopher Hitchcock. We go on to discuss the impact that various interpretations of modern physics have had on our understanding of causality. Throughout all this, a central focus is the distinction between theories of deterministic causality (DC), whereby causes uniquely determine their effects, and probabilistic causality (PC), whereby causes change the probability of occurrence of their effects. We argue that, given the topological complexity of its large-scale connectivity, the brain should be considered as a complex system and its causal influences treated as probabilistic in nature. We conclude that PC is well suited for explaining causality in the brain for three reasons: (1) brain causality is often mutual; (2) connectional convergence dictates that only rarely is the activity of one neuronal population uniquely determined by another one; and (3) the causal influences exerted between neuronal populations may not have observable effects. A number of different techniques are currently available to characterize causal influence in the brain. Typically, these techniques quantify the statistical

  10. Large-scale data analytics

    CERN Document Server

    Gkoulalas-Divanis, Aris

    2014-01-01

    Provides cutting-edge research in large-scale data analytics from diverse scientific areas. Surveys varied subject areas and reports on individual results of research in the field. Shares many tips and insights into large-scale data analytics from authors and editors with long-term experience and specialization in the field.

  11. Large-scale grid management

    International Nuclear Information System (INIS)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-01-01

    The network companies in the Norwegian electricity industry now have to establish large-scale network management, a concept essentially characterized by (1) a broader focus (broadband, multi-utility, ...) and (2) bigger units with large networks and more customers. Research by SINTEF Energy Research shows that, so far, approaches to large-scale network management may be structured according to three main challenges: centralization, decentralization and outsourcing. The article is part of a planned series.

  12. Origin of large-scale cell structure in the universe

    International Nuclear Information System (INIS)

    Zel'dovich, Y.B.

    1982-01-01

    A qualitative explanation is offered for the characteristic global structure of the universe, wherein "black" regions devoid of galaxies are surrounded on all sides by closed, comparatively thin, "bright" layers populated by galaxies. The interpretation rests on some very general arguments regarding the growth of large-scale perturbations in a cold gas.

  13. Understanding the faint red galaxy population using large-scale clustering measurements from SDSS DR7

    OpenAIRE

    Ross, Ashley; Tojeiro, Rita; Percival, Will

    2011-01-01

    We use data from the SDSS to investigate the evolution of the large-scale galaxy bias as a function of luminosity for red galaxies. We carefully consider correlation functions of galaxies selected from both photometric and spectroscopic data, and cross-correlations between them, to obtain multiple measurements of the large-scale bias. We find, for our most robust analyses, a strong increase in bias with luminosity for the most luminous galaxies, an intermediate regime where bias does not evol...

  14. How the Internet Will Help Large-Scale Assessment Reinvent Itself

    Directory of Open Access Journals (Sweden)

    Randy Elliot Bennett

    2001-02-01

    Large-scale assessment in the United States is undergoing enormous pressure to change. That pressure stems from many causes. Depending upon the type of test, the issues precipitating change include an outmoded cognitive-scientific basis for test design; a mismatch with curriculum; the differential performance of population groups; a lack of information to help individuals improve; and inefficiency. These issues provide a strong motivation to reconceptualize both the substance and the business of large-scale assessment. At the same time, advances in technology, measurement, and cognitive science are providing the means to make that reconceptualization a reality. The thesis of this paper is that the largest facilitating factor will be technological, in particular the Internet. In the same way that it is already helping to revolutionize commerce, education, and even social interaction, the Internet will help revolutionize the business and substance of large-scale assessment.

  15. Using large-scale data analysis to assess life history and behavioural traits: the case of the reintroduced White stork Ciconia ciconia population in the Netherlands

    NARCIS (Netherlands)

    Doligez, B.; Thomson, D.L.; Van Noordwijk, A.J.

    2004-01-01

    The White stork Ciconia ciconia has been the object of several successful reintroduction programmes in the last decades. As a consequence, populations have been monitored over large spatial scales. Despite these intense efforts, very few reliable estimates of life history traits are available for

  16. Estimating large carnivore populations at global scale based on spatial predictions of density and distribution – Application to the jaguar (Panthera onca)

    Science.gov (United States)

    Robinson, Hugh S.; Abarca, Maria; Zeller, Katherine A.; Velasquez, Grisel; Paemelaere, Evi A. D.; Goldberg, Joshua F.; Payan, Esteban; Hoogesteijn, Rafael; Boede, Ernesto O.; Schmidt, Krzysztof; Lampo, Margarita; Viloria, Ángel L.; Carreño, Rafael; Robinson, Nathaniel; Lukacs, Paul M.; Nowak, J. Joshua; Salom-Pérez, Roberto; Castañeda, Franklin; Boron, Valeria; Quigley, Howard

    2018-01-01

    Broad-scale population estimates of declining species are desired for conservation efforts. However, for many secretive species, including large carnivores, such estimates are often difficult to obtain. Based on published density estimates obtained through camera trapping, presence/absence data, and globally available predictive variables derived from satellite imagery, we modelled density and occurrence of a large carnivore, the jaguar, across the species' entire range. We then combined these models in a hierarchical framework to estimate the total population. Our models indicate that potential jaguar density is best predicted by measures of primary productivity, with the highest densities in the most productive tropical habitats and a clear declining gradient with distance from the equator. Jaguar distribution, in contrast, is determined by the combined effects of human impacts and environmental factors: the probability of jaguar occurrence increased with forest cover, mean temperature, and annual precipitation and declined with increases in human footprint index and human density. The probability of occurrence was also significantly higher for protected areas than outside of them. We estimated the world's jaguar population at 173,000 (95% CI: 138,000–208,000) individuals, mostly concentrated in the Amazon Basin; elsewhere, populations tend to be small and fragmented. The high number of jaguars results from the large total area still occupied (almost 9 million km²) and low human densities (…) … conservation actions. PMID:29579129
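
    The hierarchical combination described above boils down to summing, over grid cells, the predicted density weighted by the predicted probability of occurrence. The sketch below shows that bookkeeping with invented cell values; the study's actual models, covariates and cell size differ from these placeholders.

```python
import numpy as np

# Schematic version of combining a density model with an occurrence model to
# get a range-wide total, as described above. Cell values are invented; the
# study's actual models, covariates and cell size differ.

rng = np.random.default_rng(4)
n_cells = 100_000
cell_area_km2 = 90.0  # assumed cell size

# Predicted density (individuals / 100 km^2) from a productivity-driven model.
density_per_100km2 = rng.gamma(shape=2.0, scale=1.5, size=n_cells)

# Predicted probability of occurrence from a human-impact / habitat model.
p_occurrence = rng.beta(2, 5, size=n_cells)

# Expected abundance per cell = density * area * Pr(occupied); sum over range.
expected_per_cell = density_per_100km2 * (cell_area_km2 / 100.0) * p_occurrence
total_estimate = expected_per_cell.sum()

print(f"Range-wide expected population: {total_estimate:,.0f} individuals")
```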

  17. Political consultation and large-scale research

    International Nuclear Information System (INIS)

    Bechmann, G.; Folkers, H.

    1977-01-01

    Large-scale research and policy consulting occupy an intermediary position between sociological sub-systems. While large-scale research coordinates science, policy and production, policy consulting coordinates science, policy and the political sphere. In this position, large-scale research and policy consulting lack the institutional guarantees and rational background that are characteristic of their sociological environment. Large-scale research can neither deal with the production of innovative goods under consideration of profitability, nor can it hope for full recognition by the basis-oriented scientific community. Policy consulting neither has the political system's assigned competence to make decisions, nor can it judge successfully by the critical standards of established social science, at least as far as the present situation is concerned. This intermediary position of large-scale research and policy consulting supports, in three respects, the thesis that this is a new form of institutionalization of science. These are: (1) external control, (2) the organizational form, and (3) the theoretical conception of large-scale research and policy consulting. (orig.)

  18. Achieving online consent to participation in large-scale gene-environment studies: a tangible destination

    NARCIS (Netherlands)

    Wood, F.; Kowalczuk, J.; Elwyn, G.; Mitchell, C.; Gallacher, J.

    2011-01-01

    BACKGROUND: Population based genetics studies are dependent on large numbers of individuals in the pursuit of small effect sizes. Recruiting and consenting a large number of participants is both costly and time consuming. We explored whether an online consent process for large-scale genetics studies

  19. Large-scale multimedia modeling applications

    International Nuclear Information System (INIS)

    Droppo, J.G. Jr.; Buck, J.W.; Whelan, G.; Strenge, D.L.; Castleton, K.J.; Gelston, G.M.

    1995-08-01

    Over the past decade, the US Department of Energy (DOE) and other agencies have faced increasing scrutiny for a wide range of environmental issues related to past and current practices. A number of large-scale applications have been undertaken that required analysis of large numbers of potential environmental issues over a wide range of environmental conditions and contaminants. Several of these applications, referred to here as large-scale applications, have addressed long-term public health risks using a holistic approach for assessing impacts from potential waterborne and airborne transport pathways. Multimedia models such as the Multimedia Environmental Pollutant Assessment System (MEPAS) were designed for use in such applications. MEPAS integrates radioactive and hazardous contaminants impact computations for major exposure routes via air, surface water, ground water, and overland flow transport. A number of large-scale applications of MEPAS have been conducted to assess various endpoints for environmental and human health impacts. These applications are described in terms of lessons learned in the development of an effective approach for large-scale applications

  20. Estimating demographic parameters from large-scale population genomic data using Approximate Bayesian Computation

    Directory of Open Access Journals (Sweden)

    Li Sen

    2012-03-01

    Background: The Approximate Bayesian Computation (ABC) approach has been used to infer demographic parameters for numerous species, including humans. However, most applications of ABC still use limited amounts of data, from a small number of loci, compared to the large amount of genome-wide population-genetic data which have become available in the last few years. Results: We evaluated the performance of the ABC approach for three 'population divergence' models - similar to the 'isolation with migration' model - when the data consist of several hundred thousand SNPs typed for multiple individuals, by simulating data from known demographic models. The ABC approach was used to infer demographic parameters of interest, and we compared the inferred values to the true parameter values that were used to generate hypothetical "observed" data. For all three case models, the ABC approach inferred most demographic parameters quite well with narrow credible intervals, for example, population divergence times and past population sizes, but some parameters were more difficult to infer, such as population sizes at present and migration rates. We compared the ability of different summary statistics to infer demographic parameters, including haplotype- and LD-based statistics, and found that the accuracy of the parameter estimates can be improved by combining summary statistics that capture different parts of the information in the data. Furthermore, our results suggest that poor choices of prior distributions can in some circumstances be detected using ABC. Finally, increasing the amount of data beyond some hundred loci will substantially improve the accuracy of many parameter estimates using ABC. Conclusions: We conclude that the ABC approach can accommodate realistic genome-wide population genetic data, which may be difficult to analyze with full likelihood approaches, and that ABC can provide accurate and precise inference of demographic parameters from
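
    The rejection-sampling core of ABC (draw parameters from the prior, simulate data, keep draws whose summary statistics land close to the observed ones) can be illustrated with a toy model. The simulator below is deliberately not a population-genetic one; the parameter, summaries and tolerance are all invented to show only the accept/reject logic.

```python
import numpy as np

# Minimal ABC rejection sampler. The "simulator" here is a stand-in toy model
# (it is NOT a coalescent or population-genetic simulator); it only illustrates
# the accept/reject logic: draw parameters from the prior, simulate data, keep
# draws whose summary statistics are close to the observed ones.

rng = np.random.default_rng(5)

def simulate(theta, n=500):
    """Toy data-generating model standing in for a population-genetic simulation."""
    return rng.exponential(scale=theta, size=n)

def summaries(x):
    return np.array([x.mean(), np.median(x)])

# "Observed" data generated with a known parameter, so the result can be checked.
true_theta = 2.5
observed = simulate(true_theta)
s_obs = summaries(observed)

n_draws, epsilon = 50_000, 0.1
theta_prior = rng.uniform(0.1, 10.0, size=n_draws)   # prior on the parameter

accepted = []
for theta in theta_prior:
    s_sim = summaries(simulate(theta))
    if np.linalg.norm((s_sim - s_obs) / s_obs) < epsilon:   # scaled distance
        accepted.append(theta)

accepted = np.array(accepted)
print(f"accepted {accepted.size} draws; "
      f"posterior mean ~ {accepted.mean():.2f} (true value {true_theta})")
```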

  1. Decentralized Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Poulsen, Niels Kjølstad

    2013-01-01

    problem is formulated as a centralized large-scale optimization problem but is then decomposed into smaller subproblems that are solved locally by each unit connected to an aggregator. For large-scale systems the method is faster than solving the full problem and can be distributed to include an arbitrary...

  2. Automating large-scale reactor systems

    International Nuclear Information System (INIS)

    Kisner, R.A.

    1985-01-01

    This paper conveys a philosophy for developing automated large-scale control systems that behave in an integrated, intelligent, flexible manner. Methods for operating large-scale systems under varying degrees of equipment degradation are discussed, and a design approach that separates the effort into phases is suggested. 5 refs., 1 fig

  3. Some ecological guidelines for large-scale biomass plantations

    Energy Technology Data Exchange (ETDEWEB)

    Hoffman, W.; Cook, J.H.; Beyea, J. [National Audubon Society, Tavernier, FL (United States)

    1993-12-31

    The National Audubon Society sees biomass as an appropriate and necessary source of energy to help replace fossil fuels in the near future, but is concerned that large-scale biomass plantations could displace significant natural vegetation and wildlife habitat, and reduce national and global biodiversity. We support the development of an industry large enough to provide significant portions of our energy budget, but we see a critical need to ensure that plantations are designed and sited in ways that minimize ecological disruption, or even provide environmental benefits. We have been studying the habitat value of intensively managed short-rotation tree plantations. Our results show that these plantations support large populations of some birds, but not all of the species using the surrounding landscape, and indicate that their value as habitat can be increased greatly by including small areas of mature trees within them. We believe short-rotation plantations can benefit regional biodiversity if they can be deployed as buffers for natural forests, or as corridors connecting forest tracts. To realize these benefits, and to avoid habitat degradation, regional biomass plantation complexes (e.g., the plantations supplying all the fuel for a powerplant) need to be planned, sited, and developed as large-scale units in the context of the regional landscape mosaic.

  4. Global analysis of seagrass restoration: the importance of large-scale planting

    KAUST Repository

    van Katwijk, Marieke M.; Thorhaug, Anitra; Marbà , Nú ria; Orth, Robert J.; Duarte, Carlos M.; Kendrick, Gary A.; Althuizen, Inge H. J.; Balestri, Elena; Bernard, Guillaume; Cambridge, Marion L.; Cunha, Alexandra; Durance, Cynthia; Giesen, Wim; Han, Qiuying; Hosokawa, Shinya; Kiswara, Wawan; Komatsu, Teruhisa; Lardicci, Claudio; Lee, Kun-Seop; Meinesz, Alexandre; Nakaoka, Masahiro; O'Brien, Katherine R.; Paling, Erik I.; Pickerell, Chris; Ransijn, Aryan M. A.; Verduin, Jennifer J.

    2015-01-01

    In coastal and estuarine systems, foundation species like seagrasses, mangroves, saltmarshes or corals provide important ecosystem services. Seagrasses are globally declining and their reintroduction has been shown to restore ecosystem functions. However, seagrass restoration is often challenging, given the dynamic and stressful environment that seagrasses often grow in. From our world-wide meta-analysis of seagrass restoration trials (1786 trials), we describe general features and best practice for seagrass restoration. We confirm that removal of threats is important prior to replanting. Reduced water quality (mainly eutrophication), and construction activities led to poorer restoration success than, for instance, dredging, local direct impact and natural causes. Proximity to and recovery of donor beds were positively correlated with trial performance. Planting techniques can influence restoration success. The meta-analysis shows that both trial survival and seagrass population growth rate in trials that survived are positively affected by the number of plants or seeds initially transplanted. This relationship between restoration scale and restoration success was not related to trial characteristics of the initial restoration. The majority of the seagrass restoration trials have been very small, which may explain the low overall trial survival rate (i.e. estimated 37%). Successful regrowth of the foundation seagrass species appears to require crossing a minimum threshold of reintroduced individuals. Our study provides the first global field evidence for the requirement of a critical mass for recovery, which may also hold for other foundation species showing strong positive feedback to a dynamic environment. Synthesis and applications. For effective restoration of seagrass foundation species in its typically dynamic, stressful environment, introduction of large numbers is seen to be beneficial and probably serves two purposes. First, a large-scale planting

  5. Global analysis of seagrass restoration: the importance of large-scale planting

    KAUST Repository

    van Katwijk, Marieke M.

    2015-10-28

    In coastal and estuarine systems, foundation species like seagrasses, mangroves, saltmarshes or corals provide important ecosystem services. Seagrasses are globally declining and their reintroduction has been shown to restore ecosystem functions. However, seagrass restoration is often challenging, given the dynamic and stressful environment that seagrasses often grow in. From our world-wide meta-analysis of seagrass restoration trials (1786 trials), we describe general features and best practice for seagrass restoration. We confirm that removal of threats is important prior to replanting. Reduced water quality (mainly eutrophication), and construction activities led to poorer restoration success than, for instance, dredging, local direct impact and natural causes. Proximity to and recovery of donor beds were positively correlated with trial performance. Planting techniques can influence restoration success. The meta-analysis shows that both trial survival and seagrass population growth rate in trials that survived are positively affected by the number of plants or seeds initially transplanted. This relationship between restoration scale and restoration success was not related to trial characteristics of the initial restoration. The majority of the seagrass restoration trials have been very small, which may explain the low overall trial survival rate (i.e. estimated 37%). Successful regrowth of the foundation seagrass species appears to require crossing a minimum threshold of reintroduced individuals. Our study provides the first global field evidence for the requirement of a critical mass for recovery, which may also hold for other foundation species showing strong positive feedback to a dynamic environment. Synthesis and applications. For effective restoration of seagrass foundation species in its typically dynamic, stressful environment, introduction of large numbers is seen to be beneficial and probably serves two purposes. First, a large-scale planting

  6. The Software Reliability of Large Scale Integration Circuit and Very Large Scale Integration Circuit

    OpenAIRE

    Artem Ganiyev; Jan Vitasek

    2010-01-01

    This article describes a method for evaluating the failure-free operation of large-scale integration (LSI) and very-large-scale integration (VLSI) circuits. The article provides a comparative analysis of the factors that determine the reliability of integrated circuits, an analysis of existing methods, and a model for evaluating the failure-free operation of LSI and VLSI circuits. The main part describes a proposed algorithm and program for the analysis of fault rates in LSI and VLSI circuits.
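
    One standard way to relate integration level to reliability, given here only as orientation and not as the article's own model, treats a circuit of N independently failing elements as a series system:

```latex
% Series-system reliability for N elements failing independently at constant
% rate \lambda (an illustration only; the article's model may differ):
\begin{equation*}
R(t) \;=\; e^{-N\lambda t}, \qquad \mathrm{MTTF} \;=\; \frac{1}{N\lambda},
\end{equation*}
% so the circuit failure rate grows with the element count N unless the
% per-element rate \lambda is reduced correspondingly.
```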

  7. Phylogenetic distribution of large-scale genome patchiness

    Directory of Open Access Journals (Sweden)

    Hackenberg Michael

    2008-04-01

    Background: The phylogenetic distribution of large-scale genome structure (i.e. mosaic compositional patchiness) has been explored mainly by analytical ultracentrifugation of bulk DNA. However, with the availability of large, good-quality chromosome sequences, and the recently developed computational methods to directly analyze patchiness on the genome sequence, an evolutionary comparative analysis can be carried out at the sequence level. Results: The local variations in the scaling exponent of the Detrended Fluctuation Analysis are used here to analyze large-scale genome structure and directly uncover the characteristic scales present in genome sequences. Furthermore, through shuffling experiments of selected genome regions, computationally-identified, isochore-like regions were identified as the biological source for the uncovered large-scale genome structure. The phylogenetic distribution of short- and large-scale patchiness was determined in the best-sequenced genome assemblies from eleven eukaryotic genomes: mammals (Homo sapiens, Pan troglodytes, Mus musculus, Rattus norvegicus, and Canis familiaris), birds (Gallus gallus), fishes (Danio rerio), invertebrates (Drosophila melanogaster and Caenorhabditis elegans), plants (Arabidopsis thaliana) and yeasts (Saccharomyces cerevisiae). We found large-scale patchiness of genome structure, associated with in silico determined, isochore-like regions, throughout this wide phylogenetic range. Conclusion: Large-scale genome structure is detected by directly analyzing DNA sequences in a wide range of eukaryotic chromosome sequences, from human to yeast. In all these genomes, large-scale patchiness can be associated with the isochore-like regions, as directly detected in silico at the sequence level.
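    The record above turns on the scaling exponent obtained from Detrended Fluctuation Analysis (DFA). As a rough, hedged illustration only (this is not the authors' code; the window sizes and the simple first-order detrending are assumptions chosen for brevity), the sketch below estimates a DFA exponent for a numeric track such as a windowed GC-content profile along a chromosome; values near 0.5 indicate an uncorrelated sequence, while larger values indicate the long-range correlations associated with compositional patchiness.

        import numpy as np

        def dfa_exponent(x, scales=(16, 32, 64, 128, 256)):
            """Estimate the DFA scaling exponent of a 1-D signal x."""
            x = np.asarray(x, dtype=float)
            profile = np.cumsum(x - x.mean())           # integrated, mean-centred series
            flucts = []
            for s in scales:
                n_win = len(profile) // s
                rms = []
                for i in range(n_win):
                    seg = profile[i * s:(i + 1) * s]
                    t = np.arange(s)
                    coef = np.polyfit(t, seg, 1)        # first-order (linear) detrending per window
                    rms.append(np.sqrt(np.mean((seg - np.polyval(coef, t)) ** 2)))
                flucts.append(np.mean(rms))
            # the slope of log F(s) versus log s is the scaling exponent alpha
            alpha, _ = np.polyfit(np.log(scales), np.log(flucts), 1)
            return alpha

        # toy usage: a random 0/1 track behaves like an uncorrelated sequence (alpha ~ 0.5)
        rng = np.random.default_rng(0)
        print(dfa_exponent(rng.integers(0, 2, 100_000)))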

  8. Managing large-scale models: DBS

    International Nuclear Information System (INIS)

    1981-05-01

    A set of fundamental management tools for developing and operating a large scale model and data base system is presented. Experience in operating and developing a large scale computerized system shows that the only reasonable way to gain strong management control of such a system is to implement appropriate controls and procedures. Chapter I discusses the purpose of the book. Chapter II classifies a broad range of generic management problems into three groups: documentation, operations, and maintenance. First, system problems are identified; then solutions for gaining management control are discussed. Chapters III, IV, and V present practical methods for dealing with these problems. These methods were developed for managing SEAS but have general application for large scale models and data bases.

  9. Large Scale Self-Organizing Information Distribution System

    National Research Council Canada - National Science Library

    Low, Steven

    2005-01-01

    This project investigates issues in "large-scale" networks. Here "large-scale" refers to networks with a large number of high-capacity nodes and transmission links, shared by a large number of users...

  10. Large scale structure and baryogenesis

    International Nuclear Information System (INIS)

    Kirilova, D.P.; Chizhov, M.V.

    2001-08-01

    We discuss a possible connection between large scale structure formation and baryogenesis in the universe. An updated review of the observational indications for the presence of a very large scale of 120 h⁻¹ Mpc in the distribution of the visible matter of the universe is provided. The possibility of generating a periodic distribution with the characteristic scale of 120 h⁻¹ Mpc through a mechanism producing quasi-periodic baryon density perturbations during the inflationary stage is discussed. The evolution of the baryon charge density distribution is explored in the framework of a low temperature boson condensate baryogenesis scenario. Both the observed very large scale of the visible matter distribution in the universe and the observed baryon asymmetry value could naturally appear as a result of the evolution of a complex scalar field condensate formed at the inflationary stage. Moreover, for some model parameters a natural separation of matter superclusters from antimatter ones can be achieved. (author)

  11. Automatic management software for large-scale cluster system

    International Nuclear Information System (INIS)

    Weng Yunjian; Chinese Academy of Sciences, Beijing; Sun Gongxing

    2007-01-01

    At present, large-scale cluster systems are difficult to manage: the administrator carries a heavy workload, and much time must be spent on the management and maintenance of the system. The nodes of a large-scale cluster easily fall into disorder; with thousands of nodes housed in large machine rooms, administrators can easily confuse one machine with another. How, then, can accurate management of a large-scale cluster system be carried out effectively? The article introduces ELFms in the large-scale cluster system and, furthermore, proposes how to realize automatic management of the large-scale cluster system. (authors)

  12. Large-Scale Environment Properties of Narrow-Line Seyfert 1 Galaxies at z < 0.4

    Energy Technology Data Exchange (ETDEWEB)

    Järvelä, Emilia [Metsähovi Radio Observatory, Aalto University, Espoo (Finland); Department of Electronics and Nanoengineering, Aalto University, Espoo (Finland); Lähteenmäki, A. [Metsähovi Radio Observatory, Aalto University, Espoo (Finland); Department of Electronics and Nanoengineering, Aalto University, Espoo (Finland); Tartu Observatory, Tõravere (Estonia); Lietzen, H., E-mail: emilia.jarvela@aalto.fi [Tartu Observatory, Tõravere (Estonia)

    2017-11-30

    The large-scale environment is believed to affect the evolution and intrinsic properties of galaxies. It offers a new perspective on narrow-line Seyfert 1 galaxies (NLS1), which have not been extensively studied in this context before. We study a large and diverse sample of 960 NLS1 galaxies using a luminosity-density field constructed from the Sloan Digital Sky Survey. We investigate how the large-scale environment is connected to the properties of NLS1 galaxies, especially their radio loudness. Furthermore, we compare the large-scale environment properties of NLS1 galaxies with other active galactic nuclei (AGN) classes, for example, other jetted AGN and broad-line Seyfert 1 (BLS1) galaxies, to shed light on their possible relations. In general, NLS1 galaxies reside in less dense large-scale environments than any of our comparison samples, thus supporting their young age. The average luminosity-density of NLS1 sources, and their distribution across different luminosity-density regions, differ significantly from those of BLS1 galaxies. This contradicts the simple orientation-based unification of NLS1 and BLS1 galaxies, and weakens the hypothesis that BLS1 galaxies are the parent population of NLS1 galaxies. The large-scale environment density also has an impact on the intrinsic properties of NLS1 galaxies; the radio loudness increases with increasing luminosity-density. However, our results suggest that the NLS1 population is indeed heterogeneous, and that a considerable fraction of them are misclassified. We support the suggestion that the traditional classification based on radio loudness should be replaced with a division into jetted and non-jetted sources.

  13. Large-scale immigration and political response: popular reaction in California.

    Science.gov (United States)

    Clark, W A

    1998-03-01

    Over the past 3 years, the level of political debate has grown over the nature and extent of the recent large-scale immigration to the US in general, and to California in particular. California's Proposition 187 to deny welfare benefits to illegal immigrants brought national attention to the immigration debate, and no doubt influenced recent decisions to significantly change the US's welfare program. The author studied the vote on Proposition 187 in the November 1994 California election to better understand the nature of reaction to large-scale immigration and recent arguments about anti-immigrant sentiment and nativism. The only counties which voted against the proposition were Sonoma, Marin, San Mateo, Santa Cruz, Yolo, Alameda, and Santa Clara, as well as the population of San Francisco. The vote generated political responses from across the border as well as within California. Statements from Mexican and other Central American governments reflected their concern over the possibility of returning populations, for whom there are neither jobs nor public services in their countries of origin. Findings are presented from a spatial analysis of the vote by census tracts in Los Angeles County.

  14. Large scale network-centric distributed systems

    CERN Document Server

    Sarbazi-Azad, Hamid

    2014-01-01

    A highly accessible reference offering a broad range of topics and insights on large scale network-centric distributed systems. Evolving from the fields of high-performance computing and networking, large scale network-centric distributed systems continues to grow as one of the most important topics in computing and communication and many interdisciplinary areas. Dealing with both wired and wireless networks, this book focuses on the design and performance issues of such systems. Large Scale Network-Centric Distributed Systems provides in-depth coverage ranging from ground-level hardware issu

  15. Large-Scale Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S. A.

    1995-12-01

    \\catcode`\\@=11 \\ialign{m @th#1hfil ##hfil \\crcr#2\\crcr\\sim\\crcr}}} \\catcode`\\@=12 Highly collimated outflows extend out to Mpc scales in many radio-loud active galaxies. In Seyfert galaxies, which are radio-quiet, the outflows extend out to kpc scales and do not appear to be as highly collimated. In order to study the nature of large-scale (>~1 kpc) outflows in Seyferts, we have conducted optical, radio and X-ray surveys of a distance-limited sample of 22 edge-on Seyfert galaxies. Results of the optical emission-line imaging and spectroscopic survey imply that large-scale outflows are present in >~{{1} /{4}} of all Seyferts. The radio (VLA) and X-ray (ROSAT) surveys show that large-scale radio and X-ray emission is present at about the same frequency. Kinetic luminosities of the outflows in Seyferts are comparable to those in starburst-driven superwinds. Large-scale radio sources in Seyferts appear diffuse, but do not resemble radio halos found in some edge-on starburst galaxies (e.g. M82). We discuss the feasibility of the outflows being powered by the active nucleus (e.g. a jet) or a circumnuclear starburst.

  16. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele

    2015-08-23

    The interaction between scales is investigated in a turbulent mixing layer. The large-scale amplitude modulation of the small scales already observed in other works depends on the crosswise location. Large-scale positive fluctuations correlate with a stronger activity of the small scales on the low speed-side of the mixing layer, and a reduced activity on the high speed-side. However, from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  17. The health system and population health implications of large-scale diabetes screening in India: a microsimulation model of alternative approaches.

    Directory of Open Access Journals (Sweden)

    Sanjay Basu

    2015-05-01

    Like a growing number of rapidly developing countries, India has begun to develop a system for large-scale community-based screening for diabetes. We sought to identify the implications of using alternative screening instruments to detect people with undiagnosed type 2 diabetes among diverse populations across India. We developed and validated a microsimulation model that incorporated data from 58 studies from across the country into a nationally representative sample of Indians aged 25-65 y old. We estimated the diagnostic and health system implications of three major survey-based screening instruments and random glucometer-based screening. Of the 567 million Indians eligible for screening, depending on which of four screening approaches is utilized, between 158 and 306 million would be expected to screen as "high risk" for type 2 diabetes, and be referred for confirmatory testing. Between 26 million and 37 million of these people would be expected to meet international diagnostic criteria for diabetes, but between 126 million and 273 million would be "false positives." The ratio of false positives to true positives varied from 3.9 (when using random glucose screening) to 8.2 (when using a survey-based screening instrument) in our model. The cost per case found would be expected to be from US$5.28 (when using random glucose screening) to US$17.06 (when using a survey-based screening instrument), presenting a total cost of between US$169 and US$567 million. The major limitation of our analysis is its dependence on published cohort studies that are unlikely fully to capture the poorest and most rural areas of the country. Because these areas are thought to have the lowest diabetes prevalence, this may result in overestimation of the efficacy and health benefits of screening. Large-scale community-based screening is anticipated to produce a large number of false-positive results, particularly if using currently available survey-based screening
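    As a minimal, hedged sketch of the bookkeeping behind the figures quoted above (not the authors' microsimulation; the only assumption is that confirmatory-testing cost scales with the number of people referred, and the example inputs are hypothetical round numbers), the two summary metrics reported in the record can be computed as follows.

        def screening_metrics(n_referred, n_true_positive, cost_per_confirmatory_test):
            """Summary metrics for a screening programme: false-positive-to-true-positive
            ratio, cost per case found, and total confirmatory-testing cost."""
            n_false_positive = n_referred - n_true_positive
            fp_to_tp_ratio = n_false_positive / n_true_positive
            total_cost = n_referred * cost_per_confirmatory_test
            cost_per_case_found = total_cost / n_true_positive
            return fp_to_tp_ratio, cost_per_case_found, total_cost

        # hypothetical illustration: 200 million referred, 30 million confirmed, US$1 per test
        print(screening_metrics(200e6, 30e6, 1.0))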

  18. Dissecting the large-scale galactic conformity

    Science.gov (United States)

    Seo, Seongu

    2018-01-01

    Galactic conformity is the observed phenomenon whereby galaxies located in the same region have similar properties such as star formation rate, color, gas fraction, and so on. The conformity was first observed among galaxies within the same halos (“one-halo conformity”). The one-halo conformity can be readily explained by mutual interactions among galaxies within a halo. Recent observations, however, have further revealed a puzzling connection among galaxies with no direct interaction. In particular, galaxies located within a sphere of ~5 Mpc radius tend to show similarities, even though the galaxies do not share common halos with each other (“two-halo conformity” or “large-scale conformity”). Using a cosmological hydrodynamic simulation, Illustris, we investigate the physical origin of the two-halo conformity and put forward two scenarios. First, back-splash galaxies are likely responsible for the large-scale conformity. They have evolved into red galaxies due to ram-pressure stripping in a given galaxy cluster and happen to reside now within a ~5 Mpc sphere. Second, galaxies in the strong tidal field induced by large-scale structure also seem to give rise to the large-scale conformity. The strong tides suppress star formation in the galaxies. We discuss the importance of the large-scale conformity in the context of galaxy evolution.

  19. An optimum city size? The scaling relationship for urban population and fine particulate (PM2.5) concentration

    International Nuclear Information System (INIS)

    Han, Lijian; Zhou, Weiqi; Pickett, Steward T.A.; Li, Weifeng; Li, Li

    2016-01-01

    We utilize the distribution of PM2.5 concentration and population in large cities at the global scale to illustrate the relationship between urbanization and urban air quality. We found: 1) The relationship varies greatly among continents and countries. Large cities in North America, Europe, and Latin America have better air quality than those in other continents, while those in China and India have the worst air quality. 2) The relationships between urban population size and PM2.5 concentration in large cities of different continents or countries were different. PM2.5 concentration in large cities in North America, Europe, and Latin America showed little fluctuation or a small increasing trend, but cities in Africa and India showed a “U”-type relationship and those in China an inverse “U”-type relationship. 3) The potential contribution of population to PM2.5 concentration was higher in the large cities in China and India, but lower in other large cities. - Highlights: • Urban population and PM2.5 concentration vary greatly among regions. • An increase in urban population size does not always increase PM2.5 concentration. • Population's potential contribution to PM2.5 concentration is higher in China. - We utilize the distribution of PM2.5 concentration and population in large cities at the global scale to illustrate the relationship between urbanization and urban air quality.

  20. The intertidal community in West Greenland: Large-scale patterns and small-scale variation on ecosystem dynamics along a climate gradient

    DEFF Research Database (Denmark)

    Thyrring, Jakob; Blicher, Martin; Sejr, Mikael Kristian

    are largely unknown. The West Greenland coast is north-south orientated. This provides an ideal setting to study the impact of climate change on marine species population dynamics and distribution. We investigated the latitudinal changes in the rocky intertidal community along 18° latitudes (59-77°N......) in West Greenland. Using cleared quadrats we quantified patterns in abundance, biomass and species richness in the intertidal zone. We use this data to disentangle patterns in Arctic intertidal communities at different scales. We describe the effects of different environmental drivers and species...... interactions on distribution and dynamics of intertidal species. Our results indicate that changes in distribution and abundance of foundation species can have large effects on the ecosystem. We also show that the importance of small-scale variation may be of the same magnitude as large-scale variation. Only...

  1. Spatial variability and macro‐scale drivers of growth for native and introduced Flathead Catfish populations

    Science.gov (United States)

    Massie, Danielle L.; Smith, Geoffrey; Bonvechio, Timothy F.; Bunch, Aaron J.; Lucchesi, David O.; Wagner, Tyler

    2018-01-01

    Quantifying spatial variability in fish growth and identifying large‐scale drivers of growth are fundamental to many conservation and management decisions. Although fish growth studies often focus on a single population, it is becoming increasingly clear that large‐scale studies are likely needed for addressing transboundary management needs. This is particularly true for species with high recreational value and for those with negative ecological consequences when introduced outside of their native range, such as the Flathead Catfish Pylodictis olivaris. This study quantified growth variability of the Flathead Catfish across a large portion of its contemporary range to determine whether growth differences existed between habitat types (i.e., reservoirs and rivers) and between native and introduced populations. Additionally, we investigated whether growth parameters varied as a function of latitude and time since introduction (for introduced populations). Length‐at‐age data from 26 populations across 11 states in the USA were modeled using a Bayesian hierarchical von Bertalanffy growth model. Population‐specific growth trajectories revealed large variation in Flathead Catfish growth and relatively high uncertainty in growth parameters for some populations. Relatively high uncertainty was also evident when comparing populations and when quantifying large‐scale patterns. Growth parameters (Brody growth coefficient [K] and theoretical maximum average length [L∞]) were not different (based on overlapping 90% credible intervals) between habitat types or between native and introduced populations. For populations within the introduced range of Flathead Catfish, latitude was negatively correlated with K. For native populations, we estimated an 85% probability that L∞ estimates were negatively correlated with latitude. Contrary to predictions, time since introduction was not correlated with growth parameters in introduced populations of Flathead Catfish
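    The growth analysis above is based on the von Bertalanffy growth model, whose standard form is L(a) = L_inf * (1 - exp(-K * (a - t0))). The sketch below simply evaluates that curve; the parameter values are made-up illustrations, not estimates from the study.

        import math

        def von_bertalanffy_length(age, l_inf, k, t0=0.0):
            """Mean length at age under the von Bertalanffy growth model."""
            return l_inf * (1.0 - math.exp(-k * (age - t0)))

        # hypothetical parameters for illustration only (length in mm, age in years)
        for age in range(1, 11):
            print(age, round(von_bertalanffy_length(age, l_inf=900.0, k=0.25), 1))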

  2. Using large-scale data analysis to assess life history and behavioural traits: the case of the reintroduced White stork Ciconia ciconia population in the Netherlands

    Directory of Open Access Journals (Sweden)

    Doligez, B.

    2004-06-01

    The White stork Ciconia ciconia has been the object of several successful reintroduction programmes in the last decades. As a consequence, populations have been monitored over large spatial scales. Despite these intense efforts, very few reliable estimates of life history traits are available for this species. Such general knowledge however constitutes a prerequisite for investigating the consequences of conservation measures. Using the large-scale and long-term ringing and resighting data set of White storks in the Netherlands, we investigated the variation of survival and resighting rates with age, time and previous individual resighting history, and in a second step supplementary feeding, using capture-recapture models. Providing food did not seem to affect survival directly, but may have an indirect effect via the alteration of migratory behaviour. Large-scale population monitoring is important in obtaining precise and reliable estimates of life history traits and assessing the consequences of conservation measures on these traits, which will prove useful for managers to take adequate measures in future conservation strategies.

  3. Large-scale perspective as a challenge

    NARCIS (Netherlands)

    Plomp, M.G.A.

    2012-01-01

    1. Scale forms a challenge for chain researchers: when exactly is something ‘large-scale’? What are the underlying factors (e.g. number of parties, data, objects in the chain, complexity) that determine this? It appears to be a continuum between small- and large-scale, where positioning on that

  4. Algorithm 896: LSA: Algorithms for Large-Scale Optimization

    Czech Academy of Sciences Publication Activity Database

    Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan

    2009-01-01

    Roč. 36, č. 3 (2009), 16-1-16-29 ISSN 0098-3500 R&D Projects: GA AV ČR IAA1030405; GA ČR GP201/06/P397 Institutional research plan: CEZ:AV0Z10300504 Keywords: algorithms * design * large-scale optimization * large-scale nonsmooth optimization * large-scale nonlinear least squares * large-scale nonlinear minimax * large-scale systems of nonlinear equations * sparse problems * partially separable problems * limited-memory methods * discrete Newton methods * quasi-Newton methods * primal interior-point methods Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 1.904, year: 2009

  5. Scale interactions in a mixing layer – the role of the large-scale gradients

    KAUST Repository

    Fiscaletti, D.

    2016-02-15

    © 2016 Cambridge University Press. The interaction between the large and the small scales of turbulence is investigated in a mixing layer, at a Reynolds number based on the Taylor microscale of , via direct numerical simulations. The analysis is performed in physical space, and the local vorticity root-mean-square (r.m.s.) is taken as a measure of the small-scale activity. It is found that positive large-scale velocity fluctuations correspond to large vorticity r.m.s. on the low-speed side of the mixing layer, whereas, they correspond to low vorticity r.m.s. on the high-speed side. The relationship between large and small scales thus depends on position if the vorticity r.m.s. is correlated with the large-scale velocity fluctuations. On the contrary, the correlation coefficient is nearly constant throughout the mixing layer and close to unity if the vorticity r.m.s. is correlated with the large-scale velocity gradients. Therefore, the small-scale activity appears closely related to large-scale gradients, while the correlation between the small-scale activity and the large-scale velocity fluctuations is shown to reflect a property of the large scales. Furthermore, the vorticity from unfiltered (small scales) and from low pass filtered (large scales) velocity fields tend to be aligned when examined within vortical tubes. These results provide evidence for the so-called 'scale invariance' (Meneveau & Katz, Annu. Rev. Fluid Mech., vol. 32, 2000, pp. 1-32), and suggest that some of the large-scale characteristics are not lost at the small scales, at least at the Reynolds number achieved in the present simulation.
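    The analysis described above amounts to filtering the velocity field, taking the local vorticity magnitude as a measure of small-scale activity, and correlating it with large-scale velocity fluctuations or gradients. The sketch below mimics that procedure on a synthetic 2-D field (it is not the authors' code; the field, the filter width and the use of a Gaussian low-pass filter are all assumptions for illustration, so the numbers it prints carry no physical meaning).

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(1)
        u = gaussian_filter(rng.standard_normal((256, 256)), 2)   # synthetic velocity components
        v = gaussian_filter(rng.standard_normal((256, 256)), 2)

        # small-scale activity: local vorticity magnitude of the unfiltered field
        dvdx = np.gradient(v, axis=1)
        dudy = np.gradient(u, axis=0)
        vorticity = np.abs(dvdx - dudy)

        # large-scale quantities: low-pass filtered velocity and its streamwise gradient
        sigma_large = 16.0                                        # assumed filter width (grid units)
        u_large = gaussian_filter(u, sigma_large)
        dudx_large = np.gradient(u_large, axis=1)

        def corr(a, b):
            a = a.ravel() - a.mean()
            b = b.ravel() - b.mean()
            return float(a @ b / np.sqrt((a @ a) * (b @ b)))

        print("corr(vorticity, large-scale u):    ", corr(vorticity, u_large))
        print("corr(vorticity, large-scale du/dx):", corr(vorticity, dudx_large))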

  6. Large-scale matrix-handling subroutines 'ATLAS'

    International Nuclear Information System (INIS)

    Tsunematsu, Toshihide; Takeda, Tatsuoki; Fujita, Keiichi; Matsuura, Toshihiko; Tahara, Nobuo

    1978-03-01

    Subroutine package "ATLAS" has been developed for handling large-scale matrices. The package is composed of four kinds of subroutines, i.e., basic arithmetic routines, routines for solving linear simultaneous equations and for solving general eigenvalue problems and utility routines. The subroutines are useful in large scale plasma-fluid simulations. (auth.)

  7. Large-scale solar heat

    Energy Technology Data Exchange (ETDEWEB)

    Tolonen, J.; Konttinen, P.; Lund, P. [Helsinki Univ. of Technology, Otaniemi (Finland). Dept. of Engineering Physics and Mathematics

    1998-12-31

    In this project a large domestic solar heating system was built and a solar district heating system was modelled and simulated. Objectives were to improve the performance and reduce costs of a large-scale solar heating system. As a result of the project the benefit/cost ratio can be increased by 40 % through dimensioning and optimising the system at the designing stage. (orig.)

  8. The scaling of population persistence with carrying capacity does not asymptote in populations of a fish experiencing extreme climate variability.

    Science.gov (United States)

    White, Richard S A; Wintle, Brendan A; McHugh, Peter A; Booker, Douglas J; McIntosh, Angus R

    2017-06-14

    Despite growing concerns regarding increasing frequency of extreme climate events and declining population sizes, the influence of environmental stochasticity on the relationship between population carrying capacity and time-to-extinction has received little empirical attention. While time-to-extinction increases exponentially with carrying capacity in constant environments, theoretical models suggest increasing environmental stochasticity causes asymptotic scaling, thus making minimum viable carrying capacity vastly uncertain in variable environments. Using empirical estimates of environmental stochasticity in fish metapopulations, we showed that increasing environmental stochasticity resulting from extreme droughts was insufficient to create asymptotic scaling of time-to-extinction with carrying capacity in local populations as predicted by theory. Local time-to-extinction increased with carrying capacity due to declining sensitivity to demographic stochasticity, and the slope of this relationship declined significantly as environmental stochasticity increased. However, recent 1 in 25 yr extreme droughts were insufficient to extirpate populations with large carrying capacity. Consequently, large populations may be more resilient to environmental stochasticity than previously thought. The lack of carrying capacity-related asymptotes in persistence under extreme climate variability reveals how small populations affected by habitat loss or overharvesting, may be disproportionately threatened by increases in extreme climate events with global warming. © 2017 The Author(s).
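    One way to visualise the contrast drawn above between constant and strongly variable environments is to compare an exponentially increasing time-to-extinction curve with a saturating ("asymptotic") one. Both functional forms and all constants below are illustrative placeholders, not the models fitted in the study.

        import math

        def t_ext_constant_env(k_capacity, c=0.05, t0=10.0):
            # illustrative exponential increase of time-to-extinction with carrying capacity
            return t0 * math.exp(c * k_capacity)

        def t_ext_stochastic_env(k_capacity, t_max=500.0, k_half=50.0):
            # illustrative saturating form: gains in persistence flatten out at large capacity
            return t_max * k_capacity / (k_capacity + k_half)

        for k_capacity in (10, 50, 100, 500, 1000):
            print(k_capacity,
                  round(t_ext_constant_env(k_capacity)),
                  round(t_ext_stochastic_env(k_capacity)))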

  9. Probes of large-scale structure in the Universe

    International Nuclear Information System (INIS)

    Suto, Yasushi; Gorski, K.; Juszkiewicz, R.; Silk, J.

    1988-01-01

    Recent progress in observational techniques has made it possible to confront quantitatively various models for the large-scale structure of the Universe with detailed observational data. We develop a general formalism to show that the gravitational instability theory for the origin of large-scale structure is now capable of critically confronting observational results on cosmic microwave background radiation angular anisotropies, large-scale bulk motions and large-scale clumpiness in the galaxy counts. (author)

  10. Punishment sustains large-scale cooperation in prestate warfare

    Science.gov (United States)

    Mathew, Sarah; Boyd, Robert

    2011-01-01

    Understanding cooperation and punishment in small-scale societies is crucial for explaining the origins of human cooperation. We studied warfare among the Turkana, a politically uncentralized, egalitarian, nomadic pastoral society in East Africa. Based on a representative sample of 88 recent raids, we show that the Turkana sustain costly cooperation in combat at a remarkably large scale, at least in part, through punishment of free-riders. Raiding parties comprised several hundred warriors and participants are not kin or day-to-day interactants. Warriors incur substantial risk of death and produce collective benefits. Cowardice and desertions occur, and are punished by community-imposed sanctions, including collective corporal punishment and fines. Furthermore, Turkana norms governing warfare benefit the ethnolinguistic group, a population of a half-million people, at the expense of smaller social groupings. These results challenge current views that punishment is unimportant in small-scale societies and that human cooperation evolved in small groups of kin and familiar individuals. Instead, these results suggest that cooperation at the larger scale of ethnolinguistic units enforced by third-party sanctions could have a deep evolutionary history in the human species. PMID:21670285

  11. Large-scale grid management; Storskala Nettforvaltning

    Energy Technology Data Exchange (ETDEWEB)

    Langdal, Bjoern Inge; Eggen, Arnt Ove

    2003-07-01

    The network companies in the Norwegian electricity industry now have to establish a large-scale network management, a concept essentially characterized by (1) broader focus (Broad Band, Multi Utility,...) and (2) bigger units with large networks and more customers. Research done by SINTEF Energy Research shows so far that the approaches within large-scale network management may be structured according to three main challenges: centralization, decentralization and out sourcing. The article is part of a planned series.

  12. Japanese large-scale interferometers

    CERN Document Server

    Kuroda, K; Miyoki, S; Ishizuka, H; Taylor, C T; Yamamoto, K; Miyakawa, O; Fujimoto, M K; Kawamura, S; Takahashi, R; Yamazaki, T; Arai, K; Tatsumi, D; Ueda, A; Fukushima, M; Sato, S; Shintomi, T; Yamamoto, A; Suzuki, T; Saitô, Y; Haruyama, T; Sato, N; Higashi, Y; Uchiyama, T; Tomaru, T; Tsubono, K; Ando, M; Takamori, A; Numata, K; Ueda, K I; Yoneda, H; Nakagawa, K; Musha, M; Mio, N; Moriwaki, S; Somiya, K; Araya, A; Kanda, N; Telada, S; Sasaki, M; Tagoshi, H; Nakamura, T; Tanaka, T; Ohara, K

    2002-01-01

    The objective of the TAMA 300 interferometer was to develop advanced technologies for kilometre scale interferometers and to observe gravitational wave events in nearby galaxies. It was designed as a power-recycled Fabry-Perot-Michelson interferometer and was intended as a step towards a final interferometer in Japan. The present successful status of TAMA is presented. TAMA forms a basis for LCGT (large-scale cryogenic gravitational wave telescope), a 3 km scale cryogenic interferometer to be built in the Kamioka mine in Japan, implementing cryogenic mirror techniques. The plan of LCGT is schematically described along with its associated R and D.

  13. Relation between metabolic syndrome and body compositions among Chinese adolescents and adults from a large-scale population survey

    Directory of Open Access Journals (Sweden)

    Tao Xu

    2017-04-01

    Background: Few nationally representative surveys regarding body composition and metabolic syndrome (MetS) have been done in a large-scale representative Chinese population to explore the prediction of body composition indicators for MetS. The objective of this study was to examine the relation of body composition and MetS and to determine the optimal cut-off values of body composition indicators that predict MetS in a large representative Chinese sample based on multiple provinces and ethnicities, covering a broad age range from 10 to 80 years old. Methods: The subjects came from a large-scale population survey on Chinese physiological constants and health conditions conducted in six provinces. 32,036 subjects completed all blood biochemical testing and body composition measurements. Subjects meeting at least 3 of the following 5 criteria qualify as having MetS: elevated blood pressure, lower high density lipoprotein cholesterol level, higher triglyceride level, higher fasting glucose level and abdominal obesity. Results: The total prevalence rate of MetS for males (9.29%) was lower than for females (11.58%). The prevalence rates were 12.03% for male adults and 15.57% for female adults respectively. The risk of MetS increased by 44.6% (OR = 1.446, 95%CI: 1.414–1.521) for males and by 53.4% (OR = 1.534, 95%CI: 1.472–1.598) for females with each 5% increase in percentage of body fat. The risk of MetS increased two-fold (OR = 2.020, 95%CI: 1.920–2.125 for males; OR = 2.047, 95%CI: 1.954–2.144 for females, respectively) with each 5% increase in waist-hip ratio. The risk of MetS increased three-fold (OR = 2.915, 95%CI: 2.742–3.099 for males; OR = 2.950, 95%CI: 2.784–3.127 for females, respectively) with each 5% increase in Waist-to-Height Ratio (WHtR). Areas under the receiver operating curve (AUC) of most body composition indicators were larger than 0.70 and the sensitivities and the specificities of most cut-off values were larger than 0
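    To make the reported effect sizes above more concrete: if the per-5% odds ratios are treated as multiplicative on the log-odds scale (an assumption about the underlying logistic model that the record does not state explicitly), the implied odds ratio for any other increment is a simple power of the per-increment value.

        def odds_ratio_for_increase(or_per_unit, increase, unit=5.0):
            """Implied odds ratio for a given increase, assuming a log-linear effect
            estimated per `unit` (here, per 5 percentage-point) increment."""
            return or_per_unit ** (increase / unit)

        # using the male per-5% body-fat odds ratio quoted in the abstract (OR = 1.446)
        print(odds_ratio_for_increase(1.446, 10))    # ~2.09 for a 10 percentage-point increase
        print(odds_ratio_for_increase(1.446, 2.5))   # ~1.20 for a 2.5 percentage-point increase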

  14. How Large Asexual Populations Adapt

    Science.gov (United States)

    Desai, Michael

    2007-03-01

    We often think of beneficial mutations as being rare, and of adaptation as a sequence of selected substitutions: a beneficial mutation occurs, spreads through a population in a selective sweep, then later another beneficial mutation occurs, and so on. This simple picture is the basis for much of our intuition about adaptive evolution, and underlies a number of practical techniques for analyzing sequence data. Yet many large and mostly asexual populations -- including a wide variety of unicellular organisms and viruses -- live in a very different world. In these populations, beneficial mutations are common, and frequently interfere or cooperate with one another as they all attempt to sweep simultaneously. This radically changes the way these populations adapt: rather than an orderly sequence of selective sweeps, evolution is a constant swarm of competing and interfering mutations. I will describe some aspects of these dynamics, including why large asexual populations cannot evolve very quickly and the character of the diversity they maintain. I will explain how this changes our expectations of sequence data, how sex can help a population adapt, and the potential role of "mutator" phenotypes with abnormally high mutation rates. Finally, I will discuss comparisons of these predictions with evolution experiments in laboratory yeast populations.

  15. Large scale model testing

    International Nuclear Information System (INIS)

    Brumovsky, M.; Filip, R.; Polachova, H.; Stepanek, S.

    1989-01-01

    Fracture mechanics and fatigue calculations for WWER reactor pressure vessels were checked by large scale model testing performed using large testing machine ZZ 8000 (with a maximum load of 80 MN) at the SKODA WORKS. The results are described from testing the material resistance to fracture (non-ductile). The testing included the base materials and welded joints. The rated specimen thickness was 150 mm with defects of a depth between 15 and 100 mm. The results are also presented of nozzles of 850 mm inner diameter in a scale of 1:3; static, cyclic, and dynamic tests were performed without and with surface defects (15, 30 and 45 mm deep). During cyclic tests the crack growth rate in the elastic-plastic region was also determined. (author). 6 figs., 2 tabs., 5 refs

  16. Why small-scale cannabis growers stay small: five mechanisms that prevent small-scale growers from going large scale.

    Science.gov (United States)

    Hammersvik, Eirik; Sandberg, Sveinung; Pedersen, Willy

    2012-11-01

    Over the past 15-20 years, domestic cultivation of cannabis has been established in a number of European countries. New techniques have made such cultivation easier; however, the bulk of growers remain small-scale. In this study, we explore the factors that prevent small-scale growers from increasing their production. The study is based on 1 year of ethnographic fieldwork and qualitative interviews conducted with 45 Norwegian cannabis growers, 10 of whom were growing on a large-scale and 35 on a small-scale. The study identifies five mechanisms that prevent small-scale indoor growers from going large-scale. First, large-scale operations involve a number of people, large sums of money, a high work-load and a high risk of detection, and thus demand a higher level of organizational skills than for small growing operations. Second, financial assets are needed to start a large 'grow-site'. Housing rent, electricity, equipment and nutrients are expensive. Third, to be able to sell large quantities of cannabis, growers need access to an illegal distribution network and knowledge of how to act according to black market norms and structures. Fourth, large-scale operations require advanced horticultural skills to maximize yield and quality, which demands greater skills and knowledge than does small-scale cultivation. Fifth, small-scale growers are often embedded in the 'cannabis culture', which emphasizes anti-commercialism, anti-violence and ecological and community values. Hence, starting up large-scale production will imply having to renegotiate or abandon these values. Going from small- to large-scale cannabis production is a demanding task: ideologically, technically, economically and personally. The many obstacles that small-scale growers face and the lack of interest and motivation for going large-scale suggest that the risk of a 'slippery slope' from small-scale to large-scale growing is limited. Possible political implications of the findings are discussed. Copyright

  17. Stochastic models for structured populations scaling limits and long time behavior

    CERN Document Server

    Meleard, Sylvie

    2015-01-01

    In this contribution, several probabilistic tools to study population dynamics are developed. The focus is on scaling limits of qualitatively different stochastic individual based models and the long time behavior of some classes of limiting processes. Structured population dynamics are modeled by measure-valued processes describing the individual behaviors and taking into account the demographic and mutational parameters, and possible interactions between individuals. Many quantitative parameters appear in these models and several relevant normalizations are considered, leading  to infinite-dimensional deterministic or stochastic large-population approximations. Biologically relevant questions are considered, such as extinction criteria, the effect of large birth events, the impact of  environmental catastrophes, the mutation-selection trade-off, recovery criteria in parasite infections, genealogical properties of a sample of individuals. These notes originated from a lecture series on Structured P...

  18. Distributed large-scale dimensional metrology new insights

    CERN Document Server

    Franceschini, Fiorenzo; Maisano, Domenico

    2011-01-01

    Focuses on the latest insights into and challenges of distributed large scale dimensional metrology. Enables practitioners to study distributed large scale dimensional metrology independently. Includes specific examples of the development of new system prototypes.

  19. Large-scale community echocardiographic screening reveals a major burden of undiagnosed valvular heart disease in older people: the OxVALVE Population Cohort Study†

    Science.gov (United States)

    d'Arcy, Joanna L.; Coffey, Sean; Loudon, Margaret A.; Kennedy, Andrew; Pearson-Stuttard, Jonathan; Birks, Jacqueline; Frangou, Eleni; Farmer, Andrew J.; Mant, David; Wilson, Jo; Myerson, Saul G.; Prendergast, Bernard D.

    2016-01-01

    Background: Valvular heart disease (VHD) is expected to become more common as the population ages. However, current estimates of its natural history and prevalence are based on historical studies with potential sources of bias. We conducted a cross-sectional analysis of the clinical and epidemiological characteristics of VHD identified at recruitment of a large cohort of older people. Methods and results: We enrolled 2500 individuals aged ≥65 years from a primary care population and screened for undiagnosed VHD using transthoracic echocardiography. Newly identified (predominantly mild) VHD was detected in 51% of participants. The most common abnormalities were aortic sclerosis (34%), mitral regurgitation (22%), and aortic regurgitation (15%). Aortic stenosis was present in 1.3%. The likelihood of undiagnosed VHD was two-fold higher in the two most deprived socioeconomic quintiles than in the most affluent quintile, and three-fold higher in individuals with atrial fibrillation. Clinically significant (moderate or severe) undiagnosed VHD was identified in 6.4%. In addition, 4.9% of the cohort had pre-existing VHD (a total prevalence of 11.3%). Projecting these findings using population data, we estimate that the prevalence of clinically significant VHD will double before 2050. Conclusions: Previously undetected VHD affects 1 in 2 of the elderly population and is more common in lower socioeconomic classes. These unique data demonstrate the contemporary clinical and epidemiological characteristics of VHD in a large population-based cohort of older people and confirm the scale of the emerging epidemic of VHD, with widespread implications for clinicians and healthcare resources. PMID:27354049

  20. SCALE INTERACTION IN A MIXING LAYER. THE ROLE OF THE LARGE-SCALE GRADIENTS

    KAUST Repository

    Fiscaletti, Daniele; Attili, Antonio; Bisetti, Fabrizio; Elsinga, Gerrit E.

    2015-01-01

    from physical considerations we would expect the scales to interact in a qualitatively similar way within the flow and across different turbulent flows. Therefore, instead of the large-scale fluctuations, the large-scale gradients modulation of the small scales has been additionally investigated.

  1. Large-scale, multidirectional larval connectivity among coral reef fish populations in the Great Barrier Reef Marine Park

    KAUST Repository

    Williamson, David H.; Harrison, Hugo B.; Almany, Glenn R.; Berumen, Michael L.; Bode, Michael; Bonin, Mary C.; Choukroun, Severine; Doherty, Peter J.; Frisch, Ashley J.; Saenz-Agudelo, Pablo; Jones, Geoffrey P.

    2016-01-01

    Larval dispersal is the key process by which populations of most marine fishes and invertebrates are connected and replenished. Advances in larval tagging and genetics have enhanced our capacity to track larval dispersal, assess scales of population connectivity, and quantify larval exchange among no-take marine reserves and fished areas. Recent studies have found that reserves can be a significant source of recruits for populations up to 40 km away, but the scale and direction of larval connectivity across larger seascapes remain unknown. Here, we apply genetic parentage analysis to investigate larval dispersal patterns for two exploited coral reef groupers (Plectropomus maculatus and Plectropomus leopardus) within and among three clusters of reefs separated by 60–220 km within the Great Barrier Reef Marine Park, Australia. A total of 69 juvenile P. maculatus and 17 juvenile P. leopardus (representing 6% and 9% of the total juveniles sampled, respectively) were genetically assigned to parent individuals on reefs within the study area. We identified both short-distance larval dispersal within regions (200 m to 50 km) and long-distance, multidirectional dispersal of up to ~250 km among regions. Dispersal strength declined significantly with distance, with best-fit dispersal kernels estimating median dispersal distances of ~110 km for P. maculatus and ~190 km for P. leopardus. Larval exchange among reefs demonstrates that established reserves form a highly connected network and contribute larvae for the replenishment of fished reefs at multiple spatial scales. Our findings highlight the potential for long-distance dispersal in an important group of reef fishes, and provide further evidence that effectively protected reserves can yield recruitment and sustainability benefits for exploited fish populations.
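    For intuition about the dispersal-kernel summary above: if the kernel is taken to be exponential (an assumption for illustration; the record does not state which kernel family fitted best), its decay rate follows directly from the median dispersal distance, since the median of an exponential distribution is ln(2) divided by the rate.

        import math

        def exponential_rate_from_median(median_km):
            """Decay rate of an exponential dispersal kernel with the given median distance."""
            return math.log(2) / median_km

        def fraction_dispersing_beyond(distance_km, median_km):
            """P(dispersal distance > d) under the assumed exponential kernel."""
            return math.exp(-exponential_rate_from_median(median_km) * distance_km)

        # medians quoted in the abstract: ~110 km (P. maculatus) and ~190 km (P. leopardus)
        for median_km in (110.0, 190.0):
            print(median_km, round(fraction_dispersing_beyond(250.0, median_km), 3))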

  2. Large-scale, multidirectional larval connectivity among coral reef fish populations in the Great Barrier Reef Marine Park

    KAUST Repository

    Williamson, David H.

    2016-11-15

    Larval dispersal is the key process by which populations of most marine fishes and invertebrates are connected and replenished. Advances in larval tagging and genetics have enhanced our capacity to track larval dispersal, assess scales of population connectivity, and quantify larval exchange among no-take marine reserves and fished areas. Recent studies have found that reserves can be a significant source of recruits for populations up to 40 km away, but the scale and direction of larval connectivity across larger seascapes remain unknown. Here, we apply genetic parentage analysis to investigate larval dispersal patterns for two exploited coral reef groupers (Plectropomus maculatus and Plectropomus leopardus) within and among three clusters of reefs separated by 60–220 km within the Great Barrier Reef Marine Park, Australia. A total of 69 juvenile P. maculatus and 17 juvenile P. leopardus (representing 6% and 9% of the total juveniles sampled, respectively) were genetically assigned to parent individuals on reefs within the study area. We identified both short-distance larval dispersal within regions (200 m to 50 km) and long-distance, multidirectional dispersal of up to ~250 km among regions. Dispersal strength declined significantly with distance, with best-fit dispersal kernels estimating median dispersal distances of ~110 km for P. maculatus and ~190 km for P. leopardus. Larval exchange among reefs demonstrates that established reserves form a highly connected network and contribute larvae for the replenishment of fished reefs at multiple spatial scales. Our findings highlight the potential for long-distance dispersal in an important group of reef fishes, and provide further evidence that effectively protected reserves can yield recruitment and sustainability benefits for exploited fish populations.

  3. Visual attention mitigates information loss in small- and large-scale neural codes

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-01-01

    Summary The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires processing sensory signals in a manner that protects information about relevant stimuli from degradation. Such selective processing – or selective attention – is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. PMID:25769502

  4. Trends in large-scale testing of reactor structures

    International Nuclear Information System (INIS)

    Blejwas, T.E.

    2003-01-01

    Large-scale tests of reactor structures have been conducted at Sandia National Laboratories since the late 1970s. This paper describes a number of different large-scale impact tests, pressurization tests of models of containment structures, and thermal-pressure tests of models of reactor pressure vessels. The advantages of large-scale testing are evident, but cost, in particular, limits its use. As computer models have grown in size (for example, in the number of degrees of freedom), the advent of computer graphics has made possible very realistic representations of results - results that may not accurately represent reality. A necessary condition for avoiding this pitfall is the validation of the analytical methods and underlying physical representations. Ironically, the immensely larger computer models sometimes increase the need for large-scale testing, because the modeling is applied to increasingly complex structural systems and/or more complex physical phenomena. Unfortunately, the cost of large-scale tests is a disadvantage that will likely severely limit similar testing in the future. International collaborations may provide the best mechanism for funding future programs with large-scale tests. (author)

  5. The population genomics of begomoviruses: global scale population structure and gene flow

    Directory of Open Access Journals (Sweden)

    Prasanna HC

    2010-09-01

    Background: The rapidly growing availability of diverse full genome sequences from across the world is increasing the feasibility of studying the large-scale population processes that underlie observable patterns of virus diversity. In particular, characterizing the genetic structure of virus populations could potentially reveal much about how factors such as geographical distributions, host ranges and gene flow between populations combine to produce the discontinuous patterns of genetic diversity that we perceive as distinct virus species. Among the richest and most diverse full genome datasets that are available is that for the dicotyledonous plant infecting genus, Begomovirus, in the Family Geminiviridae. The begomoviruses all share the same whitefly vector, are highly recombinogenic and are distributed throughout tropical and subtropical regions where they seriously threaten the food security of the world's poorest people. Results: We focus here on using a model-based population genetic approach to identify the genetically distinct sub-populations within the global begomovirus meta-population. We demonstrate the existence of at least seven major sub-populations that can further be sub-divided into as many as thirty four significantly differentiated and genetically cohesive minor sub-populations. Using the population structure framework revealed in the present study, we further explored the extent of gene flow and recombination between genetic populations. Conclusions: Although geographical barriers are apparently the most significant underlying cause of the seven major population sub-divisions, within the framework of these sub-divisions, we explore patterns of gene flow to reveal that both host range differences and genetic barriers to recombination have probably been major contributors to the minor population sub-divisions that we have identified. We believe that the global Begomovirus population structure revealed here could

  6. Large Scale Computations in Air Pollution Modelling

    DEFF Research Database (Denmark)

    Zlatev, Z.; Brandt, J.; Builtjes, P. J. H.

    Proceedings of the NATO Advanced Research Workshop on Large Scale Computations in Air Pollution Modelling, Sofia, Bulgaria, 6-10 July 1998.

  7. Remote sensing of the biological dynamics of large-scale salt evaporation ponds

    Science.gov (United States)

    Richardson, Laurie L.; Bachoon, Dave; Ingram-Willey, Vebbra; Chow, Colin C.; Weinstock, Kenneth

    1992-01-01

    Optical properties of salt evaporation ponds associated with Exportadora de Sal, a salt production company in Baja California Sur, Mexico, were analyzed using a combination of spectroradiometer and extracted pigment data, and Landsat-5 Thematic Mapper imagery. The optical characteristics of each pond are determined by the biota, which consists of dense populations of algae and photosynthetic bacteria containing a wide variety of photosynthetic and photoprotective pigments. Analysis has shown that spectral and image data can differentiate between taxonomic groups of the microbiota, detect changes in population distributions, and reveal large-scale seasonal dynamics.

  8. Large-Scale 3D Printing: The Way Forward

    Science.gov (United States)

    Jassmi, Hamad Al; Najjar, Fady Al; Ismail Mourad, Abdel-Hamid

    2018-03-01

    Research on small-scale 3D printing has rapidly evolved, and numerous industrial products have been tested and successfully applied. Nonetheless, research on large-scale 3D printing, directed at large-scale applications such as construction and automotive manufacturing, still demands a great deal of effort. Large-scale 3D printing is considered an interdisciplinary topic and requires establishing a blended knowledge base from numerous research fields including structural engineering, materials science, mechatronics, software engineering, artificial intelligence and architectural engineering. This review article summarizes key topics of relevance to new research trends on large-scale 3D printing, particularly pertaining to (1) technological solutions of additive construction (i.e. the 3D printers themselves), (2) materials science challenges, and (3) new design opportunities.

  9. Growth Limits in Large Scale Networks

    DEFF Research Database (Denmark)

    Knudsen, Thomas Phillip

    The Subject of large scale networks is approached from the perspective of the network planner. An analysis of the long term planning problems is presented with the main focus on the changing requirements for large scale networks and the potential problems in meeting these requirements. The problems...... the fundamental technological resources in network technologies are analysed for scalability. Here several technological limits to continued growth are presented. The third step involves a survey of major problems in managing large scale networks given the growth of user requirements and the technological... limitations. The rising complexity of network management with the convergence of communications platforms is shown as problematic for both automatic management feasibility and for manpower resource management. In the fourth step the scope is extended to include the present society with the DDN project as its......

  10. Accelerating sustainability in large-scale facilities

    CERN Multimedia

    Marina Giampietro

    2011-01-01

    Scientific research centres and large-scale facilities are intrinsically energy intensive, but how can big science improve its energy management and eventually contribute to the environmental cause with new cleantech? CERN’s commitment to providing tangible answers to these questions was sealed in the first workshop on energy management for large scale scientific infrastructures held in Lund, Sweden, on 13-14 October.   Participants at the energy management for large scale scientific infrastructures workshop. The workshop, co-organised with the European Spallation Source (ESS) and the European Association of National Research Facilities (ERF), tackled a recognised need for addressing energy issues in relation with science and technology policies. It brought together more than 150 representatives of Research Infrastructures (RIs) and energy experts from Europe and North America. “Without compromising our scientific projects, we can ...

  11. Large scale reflood test

    International Nuclear Information System (INIS)

    Hirano, Kemmei; Murao, Yoshio

    1980-01-01

    The large-scale reflood test, aimed at ensuring the safety of light water reactors, was started in fiscal 1976 under entrustment from the Science and Technology Agency, based on the special account act for power source development promotion measures. Thereafter, to establish the safety of PWRs in loss-of-coolant accidents through joint international efforts, the Japan-West Germany-U.S. research cooperation program was started in April 1980, and the large-scale reflood test is now included in this program. It consists of two tests, one using a cylindrical core testing apparatus for examining the overall system effect and one using a plate core testing apparatus for testing individual effects. Each apparatus is composed of mock-ups of the pressure vessel, primary loop, containment vessel and ECCS. The testing method, the test results and the research cooperation program are described. (J.P.N.)

  12. Large-scale community echocardiographic screening reveals a major burden of undiagnosed valvular heart disease in older people: the OxVALVE Population Cohort Study.

    Science.gov (United States)

    d'Arcy, Joanna L; Coffey, Sean; Loudon, Margaret A; Kennedy, Andrew; Pearson-Stuttard, Jonathan; Birks, Jacqueline; Frangou, Eleni; Farmer, Andrew J; Mant, David; Wilson, Jo; Myerson, Saul G; Prendergast, Bernard D

    2016-12-14

    Valvular heart disease (VHD) is expected to become more common as the population ages. However, current estimates of its natural history and prevalence are based on historical studies with potential sources of bias. We conducted a cross-sectional analysis of the clinical and epidemiological characteristics of VHD identified at recruitment of a large cohort of older people. We enrolled 2500 individuals aged ≥65 years from a primary care population and screened for undiagnosed VHD using transthoracic echocardiography. Newly identified (predominantly mild) VHD was detected in 51% of participants. The most common abnormalities were aortic sclerosis (34%), mitral regurgitation (22%), and aortic regurgitation (15%). Aortic stenosis was present in 1.3%. The likelihood of undiagnosed VHD was two-fold higher in the two most deprived socioeconomic quintiles than in the most affluent quintile, and three-fold higher in individuals with atrial fibrillation. Clinically significant (moderate or severe) undiagnosed VHD was identified in 6.4%. In addition, 4.9% of the cohort had pre-existing VHD (a total prevalence of 11.3%). Projecting these findings using population data, we estimate that the prevalence of clinically significant VHD will double before 2050. Previously undetected VHD affects 1 in 2 of the elderly population and is more common in lower socioeconomic classes. These unique data demonstrate the contemporary clinical and epidemiological characteristics of VHD in a large population-based cohort of older people and confirm the scale of the emerging epidemic of VHD, with widespread implications for clinicians and healthcare resources. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2016. For permissions please email: journals.permissions@oup.com.

  13. Large Scale Cosmological Anomalies and Inhomogeneous Dark Energy

    Directory of Open Access Journals (Sweden)

    Leandros Perivolaropoulos

    2014-01-01

    Full Text Available A wide range of large scale observations hint towards possible modifications of the standard cosmological model, which is based on a homogeneous and isotropic universe with a small cosmological constant and matter. These observations, also known as “cosmic anomalies”, include unexpected Cosmic Microwave Background perturbations on large angular scales, large dipolar peculiar velocity flows of galaxies (“bulk flows”), the measurement of inhomogeneous values of the fine structure constant on cosmological scales (“alpha dipole”) and other effects. The presence of the observational anomalies could either be a large statistical fluctuation in the context of ΛCDM or it could indicate a non-trivial departure from the cosmological principle on Hubble scales. Such a departure is very much constrained by cosmological observations for matter. For dark energy, however, there are no significant observational constraints for Hubble scale inhomogeneities. In this brief review I discuss some of the theoretical models that can naturally lead to inhomogeneous dark energy, their observational constraints and their potential to explain the large scale cosmic anomalies.

  14. Large-scale patterns in Rayleigh-Benard convection

    International Nuclear Information System (INIS)

    Hardenberg, J. von; Parodi, A.; Passoni, G.; Provenzale, A.; Spiegel, E.A.

    2008-01-01

    Rayleigh-Benard convection at large Rayleigh number is characterized by the presence of intense, vertically moving plumes. Both laboratory and numerical experiments reveal that the rising and descending plumes aggregate into separate clusters so as to produce large-scale updrafts and downdrafts. The horizontal scales of the aggregates reported so far have been comparable to the horizontal extent of the containers, but it has not been clear whether that represents a limitation imposed by domain size. In this work, we present numerical simulations of convection at sufficiently large aspect ratio to ascertain whether there is an intrinsic saturation scale for the clustering process when that ratio is large enough. From a series of simulations of Rayleigh-Benard convection with Rayleigh numbers between 10^5 and 10^8 and with aspect ratios up to 12π, we conclude that the clustering process has a finite horizontal saturation scale with at most a weak dependence on Rayleigh number in the range studied.

  15. Visual attention mitigates information loss in small- and large-scale neural codes.

    Science.gov (United States)

    Sprague, Thomas C; Saproo, Sameer; Serences, John T

    2015-04-01

    The visual system transforms complex inputs into robust and parsimonious neural codes that efficiently guide behavior. Because neural communication is stochastic, the amount of encoded visual information necessarily decreases with each synapse. This constraint requires that sensory signals are processed in a manner that protects information about relevant stimuli from degradation. Such selective processing--or selective attention--is implemented via several mechanisms, including neural gain and changes in tuning properties. However, examining each of these effects in isolation obscures their joint impact on the fidelity of stimulus feature representations by large-scale population codes. Instead, large-scale activity patterns can be used to reconstruct representations of relevant and irrelevant stimuli, thereby providing a holistic understanding about how neuron-level modulations collectively impact stimulus encoding. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Manufacturing test of large scale hollow capsule and long length cladding in the large scale oxide dispersion strengthened (ODS) martensitic steel

    International Nuclear Information System (INIS)

    Narita, Takeshi; Ukai, Shigeharu; Kaito, Takeji; Ohtsuka, Satoshi; Fujiwara, Masayuki

    2004-04-01

    Mass production capability of oxide dispersion strengthened (ODS) martensitic steel cladding (9Cr) has been evaluated in Phase II of the Feasibility Studies on Commercialized Fast Reactor Cycle System. The cost of manufacturing the mother tube (raw material powder production, mechanical alloying (MA) by ball mill, canning, hot extrusion, and machining) is a dominant factor in the total cost of manufacturing ODS ferritic steel cladding. In this study, a large-scale 9Cr-ODS martensitic steel mother tube, made with a large-scale hollow capsule, and long length claddings were manufactured, and the applicability of these processes was evaluated. The following results were obtained. (1) A large-scale mother tube with dimensions of 32 mm OD, 21 mm ID, and 2 m length was successfully manufactured using a large-scale hollow capsule. This mother tube has a high degree of dimensional accuracy. (2) The chemical composition and the microstructure of the manufactured mother tube are similar to those of existing mother tubes manufactured with small-scale cans, and no remarkable difference between the bottom and top ends of the manufactured mother tube was observed. (3) Long length cladding was successfully manufactured from the large-scale mother tube made using a large-scale hollow capsule. (4) For reducing the manufacturing cost of ODS steel claddings, the manufacturing process of mother tubes using large-scale hollow capsules is promising. (author)

  17. Amplification of large-scale magnetic field in nonhelical magnetohydrodynamics

    KAUST Repository

    Kumar, Rohit

    2017-08-11

    It is typically assumed that the kinetic and magnetic helicities play a crucial role in the growth of large-scale dynamos. In this paper, we demonstrate that helicity is not essential for the amplification of a large-scale magnetic field. For this purpose, we perform a nonhelical magnetohydrodynamic (MHD) simulation and show that the large-scale magnetic field can grow in nonhelical MHD when random external forcing is employed at a scale of one-tenth of the box size. The energy fluxes and shell-to-shell transfer rates computed from the numerical data show that the large-scale magnetic energy grows due to energy transfers from the velocity field at the forcing scales.

  18. Effects of local and large-scale climate patterns on estuarine resident fishes: The example of Pomatoschistus microps and Pomatoschistus minutus

    Science.gov (United States)

    Nyitrai, Daniel; Martinho, Filipe; Dolbeth, Marina; Rito, João; Pardal, Miguel A.

    2013-12-01

    Large-scale and local climate patterns are known to influence several aspects of the life cycle of marine fish. In this paper, we used a 9-year database (2003-2011) to analyse the populations of two estuarine resident fishes, Pomatoschistus microps and Pomatoschistus minutus, in order to determine their relationships with varying environmental stressors operating over local and large scales. This study was performed in the Mondego estuary, Portugal. Firstly, the variations in abundance, growth, population structure and secondary production were evaluated. These species appeared in high densities at the beginning of the study period, with subsequent occasional high annual density peaks, while their secondary production was lower in dry years. The relationships between yearly fish abundance and the environmental variables were evaluated separately for both species using Spearman correlation analysis, considering the yearly abundance peaks for the whole population, juveniles and adults. Among the local climate patterns, precipitation, river runoff, salinity and temperature were used in the analyses, and the North Atlantic Oscillation (NAO) index and sea surface temperature (SST) were tested as large-scale factors. For P. microps, precipitation and NAO were the significant factors explaining the abundance of the whole population, as well as of adults and juveniles. For P. minutus, river runoff was the significant predictor for the whole population, juveniles and adults. The results for both species suggest a differential influence of climate patterns on the various life cycle stages, confirming also the importance of estuarine resident fishes as indicators of changes in local and large-scale climate patterns related to global climate change.
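    A minimal sketch of the correlation step named in this record, assuming hypothetical yearly abundance peaks and climate series (the study's own data are not reproduced here):

```python
# Hypothetical illustration of Spearman rank correlations between yearly
# abundance peaks and climate series; all numbers below are invented.
import numpy as np
from scipy.stats import spearmanr

abundance_peak = np.array([41, 35, 12, 18, 9, 22, 30, 11, 15])   # toy density peaks, 2003-2011
nao_index = np.array([0.2, -0.1, 0.4, -0.6, 0.9, -0.3, 0.1, -1.2, 0.7])
precipitation = np.array([820, 640, 450, 700, 390, 610, 880, 940, 520])  # mm/year, toy

for name, series in [("NAO index", nao_index), ("precipitation", precipitation)]:
    rho, p = spearmanr(abundance_peak, series)
    print(f"{name}: Spearman rho = {rho:.2f}, p = {p:.3f}")
```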

  19. Scaling Law between Urban Electrical Consumption and Population in China

    Science.gov (United States)

    Zhu, Xiaowu; Xiong, Aimin; Li, Liangsheng; Liu, Maoxin; Chen, X. S.

    The relation between the household electrical consumption Y and population N for Chinese cities in 2006 has been investigated with the power law scaling form Y = A_0 N^{β}. It is found that the Chinese cities should be divided into three categories characterized by different scaling exponents β. The first category, which includes the biggest and coastal cities of China, has a scaling exponent β > 1. The second category, which includes mostly the cities in central China, has a scaling exponent β ≈ 1. The third category, which consists of the cities in northwestern China, has a scaling exponent β < 1. For cities with β > 1, there is also a fixed point population N_f. If the initial population N(0) > N_f, the population increases very fast with time and diverges within a finite time. If the initial population N(0) < N_f, the population decreases with time and finally collapses. The pattern of population evolution in a city is determined by its scaling exponent and initial population.
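    The scaling form above can be fitted by least squares in log-log space; the short sketch below uses invented city data purely to illustrate the fit and the classification by exponent:

```python
# Fit Y = A0 * N**beta by least squares in log-log space; city data are invented.
import numpy as np

population = np.array([2.1e6, 5.4e6, 8.9e6, 1.3e7, 2.2e7])        # N (toy)
consumption = np.array([1.9e9, 5.6e9, 1.1e10, 1.8e10, 3.5e10])    # Y in kWh (toy)

beta, log_a0 = np.polyfit(np.log(population), np.log(consumption), deg=1)
print(f"Y ~ {np.exp(log_a0):.3g} * N^{beta:.2f}")

if beta > 1.05:
    print("superlinear regime (beta > 1)")
elif beta < 0.95:
    print("sublinear regime (beta < 1)")
else:
    print("approximately linear regime (beta ~ 1)")
```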

  20. Hydrometeorological variability on a large french catchment and its relation to large-scale circulation across temporal scales

    Science.gov (United States)

    Massei, Nicolas; Dieppois, Bastien; Fritier, Nicolas; Laignel, Benoit; Debret, Maxime; Lavers, David; Hannah, David

    2015-04-01

    In the present context of global changes, considerable efforts have been deployed by the hydrological scientific community to improve our understanding of the impacts of climate fluctuations on water resources. Both observational and modeling studies have been extensively employed to characterize hydrological changes and trends, assess the impact of climate variability or provide future scenarios of water resources. With the aim of a better understanding of hydrological changes, it is of crucial importance to determine how and to what extent trends and long-term oscillations detectable in hydrological variables are linked to global climate oscillations. In this work, we develop an approach associating large-scale/local-scale correlation, empirical statistical downscaling and wavelet multiresolution decomposition of monthly precipitation and streamflow over the Seine river watershed, and the North Atlantic sea level pressure (SLP), in order to gain additional insights into the atmospheric patterns associated with the regional hydrology. We hypothesized that: i) atmospheric patterns may change according to the different temporal wavelengths defining the variability of the signals; and ii) definition of those hydrological/circulation relationships for each temporal wavelength may improve the determination of large-scale predictors of local variations. The results showed that the large-scale/local-scale links were not necessarily constant according to time-scale (i.e. for the different frequencies characterizing the signals), resulting in changing spatial patterns across scales. This was then taken into account by developing an empirical statistical downscaling (ESD) modeling approach which integrated discrete wavelet multiresolution analysis for reconstructing local hydrometeorological processes (predictand: precipitation and streamflow on the Seine river catchment) based on a large-scale predictor (SLP over the Euro-Atlantic sector) on a monthly time-step. This approach
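    A minimal sketch of the scale-by-scale idea described above, assuming a single synthetic SLP index and streamflow series in place of the study's fields, and using PyWavelets for the discrete wavelet decomposition:

```python
# Decompose a monthly series with a discrete wavelet transform and correlate
# each reconstructed component with a large-scale predictor. Series are synthetic;
# the study used SLP fields over the Euro-Atlantic sector, not a single index.
import numpy as np
import pywt

rng = np.random.default_rng(0)
n_months = 512
slp_index = rng.standard_normal(n_months)                      # toy large-scale predictor
streamflow = 0.6 * slp_index + rng.standard_normal(n_months)   # toy predictand

def mra_components(x, wavelet="db4", level=4):
    """One reconstructed series per wavelet level (approximation + details)."""
    coeffs = pywt.wavedec(x, wavelet, level=level)
    comps = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        comps.append(pywt.waverec(kept, wavelet)[: len(x)])
    return comps

flow_comps = mra_components(streamflow)
slp_comps = mra_components(slp_index)
for i, (fc, sc) in enumerate(zip(flow_comps, slp_comps)):
    label = "approximation" if i == 0 else f"detail level {len(flow_comps) - i}"
    print(f"{label}: correlation = {np.corrcoef(fc, sc)[0, 1]:.2f}")
```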

  1. Ward identities and consistency relations for the large scale structure with multiple species

    International Nuclear Information System (INIS)

    Peloso, Marco; Pietroni, Massimo

    2014-01-01

    We present fully nonlinear consistency relations for the squeezed bispectrum of Large Scale Structure. These relations hold when the matter component of the Universe is composed of one or more species, and generalize those obtained in [1,2] in the single species case. The multi-species relations apply to the standard dark matter + baryons scenario, as well as to the case in which some of the fields are auxiliary quantities describing a particular population, such as dark matter halos or a specific galaxy class. If a large scale velocity bias exists between the different populations new terms appear in the consistency relations with respect to the single species case. As an illustration, we discuss two physical cases in which such a velocity bias can exist: (1) a new long range scalar force in the dark matter sector (resulting in a violation of the equivalence principle in the dark matter-baryon system), and (2) the distribution of dark matter halos relative to that of the underlying dark matter field

  2. Superconducting materials for large scale applications

    International Nuclear Information System (INIS)

    Dew-Hughes, D.

    1975-01-01

    Applications of superconductors capable of carrying large current densities in large-scale electrical devices are examined. Discussions are included on critical current density, superconducting materials available, and future prospects for improved superconducting materials. (JRD)

  3. Large-scale influences in near-wall turbulence.

    Science.gov (United States)

    Hutchins, Nicholas; Marusic, Ivan

    2007-03-15

    Hot-wire data acquired in a high Reynolds number facility are used to illustrate the need for adequate scale separation when considering the coherent structure in wall-bounded turbulence. It is found that a large-scale motion in the log region becomes increasingly comparable in energy to the near-wall cycle as the Reynolds number increases. Through decomposition of fluctuating velocity signals, it is shown that this large-scale motion has a distinct modulating influence on the small-scale energy (akin to amplitude modulation). Reassessment of DNS data, in light of these results, shows similar trends, with the rate and intensity of production due to the near-wall cycle subject to a modulating influence from the largest-scale motions.

  4. PKI security in large-scale healthcare networks.

    Science.gov (United States)

    Mantas, Georgios; Lymberopoulos, Dimitrios; Komninos, Nikos

    2012-06-01

    During the past few years, a number of PKIs (Public Key Infrastructures) have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, especially when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a large-scale Internet-based healthcare network connecting a wide spectrum of healthcare units geographically distributed within a wide region. Furthermore, the proposed PKI infrastructure addresses the trust issues that arise in a large-scale healthcare network including multi-domain PKI infrastructures.

  5. Emerging large-scale solar heating applications

    International Nuclear Information System (INIS)

    Wong, W.P.; McClung, J.L.

    2009-01-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  6. Emerging large-scale solar heating applications

    Energy Technology Data Exchange (ETDEWEB)

    Wong, W.P.; McClung, J.L. [Science Applications International Corporation (SAIC Canada), Ottawa, Ontario (Canada)

    2009-07-01

    Currently the market for solar heating applications in Canada is dominated by outdoor swimming pool heating, make-up air pre-heating and domestic water heating in homes, commercial and institutional buildings. All of these involve relatively small systems, except for a few air pre-heating systems on very large buildings. Together these applications make up well over 90% of the solar thermal collectors installed in Canada during 2007. These three applications, along with the recent re-emergence of large-scale concentrated solar thermal for generating electricity, also dominate the world markets. This paper examines some emerging markets for large scale solar heating applications, with a focus on the Canadian climate and market. (author)

  7. Large-scale model-based assessment of deer-vehicle collision risk.

    Directory of Open Access Journals (Sweden)

    Torsten Hothorn

    Full Text Available Ungulates, in particular the Central European roe deer Capreolus capreolus and the North American white-tailed deer Odocoileus virginianus, are economically and ecologically important. The two species are risk factors for deer-vehicle collisions and as browsers of palatable trees have implications for forest regeneration. However, no large-scale management systems for ungulates have been implemented, mainly because of the high efforts and costs associated with attempts to estimate population sizes of free-living ungulates living in a complex landscape. Attempts to directly estimate population sizes of deer are problematic owing to poor data quality and lack of spatial representation on larger scales. We used data on >74,000 deer-vehicle collisions observed in 2006 and 2009 in Bavaria, Germany, to model the local risk of deer-vehicle collisions and to investigate the relationship between deer-vehicle collisions and both environmental conditions and browsing intensities. An innovative modelling approach for the number of deer-vehicle collisions, which allows nonlinear environment-deer relationships and assessment of spatial heterogeneity, was the basis for estimating the local risk of collisions for specific road types on the scale of Bavarian municipalities. Based on this risk model, we propose a new "deer-vehicle collision index" for deer management. We show that the risk of deer-vehicle collisions is positively correlated to browsing intensity and to harvest numbers. Overall, our results demonstrate that the number of deer-vehicle collisions can be predicted with high precision on the scale of municipalities. In the densely populated and intensively used landscapes of Central Europe and North America, a model-based risk assessment for deer-vehicle collisions provides a cost-efficient instrument for deer management on the landscape scale. The measures derived from our model provide valuable information for planning road protection and defining
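    The abstract describes a model that allows nonlinear environment-deer relationships for municipality-level collision counts; as a hedged stand-in only (not the authors' method), a generic count regression with a spline term and an exposure offset could look like the following, with all column names and values hypothetical:

```python
# Generic count-regression stand-in (not the authors' model): negative-binomial
# GLM with a spline term for browsing intensity and road length as exposure.
# All column names and values are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 200
df = pd.DataFrame({
    "collisions": rng.poisson(5, size=n),        # deer-vehicle collisions per municipality
    "browsing": rng.uniform(0, 1, size=n),       # browsing intensity
    "harvest": rng.uniform(0, 50, size=n),       # harvested deer per km^2
    "road_km": rng.uniform(1, 100, size=n),      # road length (exposure)
})

model = smf.glm(
    "collisions ~ bs(browsing, df=4) + harvest",
    data=df,
    family=sm.families.NegativeBinomial(),
    offset=np.log(df["road_km"]),
).fit()
print(model.summary())
```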

  8. Large-scale regions of antimatter

    International Nuclear Information System (INIS)

    Grobov, A. V.; Rubin, S. G.

    2015-01-01

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era

  9. Large-scale regions of antimatter

    Energy Technology Data Exchange (ETDEWEB)

    Grobov, A. V., E-mail: alexey.grobov@gmail.com; Rubin, S. G., E-mail: sgrubin@mephi.ru [National Research Nuclear University MEPhI (Russian Federation)

    2015-07-15

    A modified mechanism of the formation of large-scale antimatter regions is proposed. Antimatter appears owing to fluctuations of a complex scalar field that carries a baryon charge in the inflation era.

  10. Large-Scale Analysis of Art Proportions

    DEFF Research Database (Denmark)

    Jensen, Karl Kristoffer

    2014-01-01

    While literature often tries to impute mathematical constants into art, this large-scale study (11 databases of paintings and photos, around 200,000 items) shows a different truth. The analysis, consisting of the width/height proportions, shows a value of rarely if ever one (square) and with majo...

  11. The Expanded Large Scale Gap Test

    Science.gov (United States)

    1987-03-01

    NSWC TR 86-32: The Expanded Large Scale Gap Test, by T. P. Liddiard and D. Price, Research and Technology Department, March 1987. Approved for public release. ... arises, to reduce the spread in the LSGT 50% gap value. The worst charges, such as those with the highest or lowest densities, the largest re-pressed ...

  12. Large-Scale Gene-Centric Meta-Analysis across 39 Studies Identifies Type 2 Diabetes Loci

    NARCIS (Netherlands)

    Saxena, Richa; Elbers, Clara C.; Guo, Yiran; Peter, Inga; Gaunt, Tom R.; Mega, Jessica L.; Lanktree, Matthew B.; Tare, Archana; Almoguera Castillo, Berta; Li, Yun R.; Johnson, Toby; Bruinenberg, Marcel; Gilbert-Diamond, Diane; Rajagopalan, Ramakrishnan; Voight, Benjamin F.; Balasubramanyam, Ashok; Barnard, John; Bauer, Florianne; Baumert, Jens; Bhangale, Tushar; Boehm, Bernhard O.; Braund, Peter S.; Burton, Paul R.; Chandrupatla, Hareesh R.; Clarke, Robert; Cooper-DeHoff, Rhonda M.; Crook, Errol D.; Davey-Smith, George; Day, Ian N.; de Boer, Anthonius; de Groot, Mark C. H.; Drenos, Fotios; Ferguson, Jane; Fox, Caroline S.; Furlong, Clement E.; Gibson, Quince; Gieger, Christian; Gilhuijs-Pederson, Lisa A.; Glessner, Joseph T.; Goel, Anuj; Gong, Yan; Grant, Struan F. A.; Kumari, Meena; van der Harst, Pim; van Vliet-Ostaptchouk, Jana V.; Verweij, Niek; Wolffenbuttel, Bruce H. R.; Hofker, Marten H.; Asselbergs, Folkert W.; Wijmenga, Cisca

    2012-01-01

    To identify genetic factors contributing to type 2 diabetes (T2D), we performed large-scale meta-analyses by using a custom ~50,000 SNP genotyping array (the ITMAT-Broad-CARe array) with ~2000 candidate genes in 39 multiethnic population-based studies, case-control studies, and

  13. Large scale and big data processing and management

    CERN Document Server

    Sakr, Sherif

    2014-01-01

    Large Scale and Big Data: Processing and Management provides readers with a central source of reference on the data management techniques currently available for large-scale data processing. Presenting chapters written by leading researchers, academics, and practitioners, it addresses the fundamental challenges associated with Big Data processing tools and techniques across a range of computing environments. The book begins by discussing the basic concepts and tools of large-scale Big Data processing and cloud computing. It also provides an overview of different programming models and cloud-bas

  14. Scaling of Attitudes Toward Population Problems

    Science.gov (United States)

    Watkins, George A.

    1975-01-01

    This study related population problem attitudes and socioeconomic variables. Six items concerned with number of children, birth control, family, science, economic depression, and overpopulation were selected for a Guttman scalogram. Education, occupation, and number of children were correlated with population problems scale scores; marital status,…
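    A small sketch of Guttman scalogram scoring, the scaling technique named in this record, computing the coefficient of reproducibility for six dichotomous items; the response matrix below is invented:

```python
# Coefficient of reproducibility for a Guttman scalogram with six dichotomous
# items; the response matrix is invented.
import numpy as np

X = np.array([                       # rows = respondents, columns = items (1 = agree)
    [1, 1, 1, 1, 0, 0],
    [1, 1, 1, 0, 0, 0],
    [1, 1, 0, 0, 0, 0],
    [1, 1, 1, 1, 1, 0],
    [1, 0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1, 1],
])

order = np.argsort(-X.sum(axis=0))   # easiest (most endorsed) item first
Xo = X[:, order]

# Under a perfect Guttman scale, a respondent with total score k endorses the k easiest items.
scores = Xo.sum(axis=1)
predicted = (np.arange(Xo.shape[1]) < scores[:, None]).astype(int)
errors = int(np.sum(predicted != Xo))

reproducibility = 1 - errors / X.size
print(f"coefficient of reproducibility = {reproducibility:.2f} (0.90 is the usual cut-off)")
```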

  15. Large scale cluster computing workshop

    International Nuclear Information System (INIS)

    Dane Skow; Alan Silverman

    2002-01-01

    Recent revolutions in computer hardware and software technologies have paved the way for the large-scale deployment of clusters of commodity computers to address problems heretofore the domain of tightly coupled SMP processors. Near term projects within High Energy Physics and other computing communities will deploy clusters at the scale of 1000s of processors and be used by 100s to 1000s of independent users. This will expand the reach in both dimensions by an order of magnitude from the current successful production facilities. The goals of this workshop were: (1) to determine what tools exist which can scale up to the cluster sizes foreseen for the next generation of HENP experiments (several thousand nodes) and, by implication, to identify areas where some investment of money or effort is likely to be needed; (2) to compare and record experiences gained with such tools; (3) to produce a practical guide to all stages of planning, installing, building and operating a large computing cluster in HENP; and (4) to identify and connect groups with similar interests within HENP and the larger clustering community

  16. Spatiotemporal dynamics of large-scale brain activity

    Science.gov (United States)

    Neuman, Jeremy

    Understanding the dynamics of large-scale brain activity is a tough challenge. One reason for this is the presence of an incredible amount of complexity arising from having roughly 100 billion neurons connected via 100 trillion synapses. Because of the extremely high number of degrees of freedom in the nervous system, the question of how the brain manages to properly function and remain stable, yet also be adaptable, must be posed. Neuroscientists have identified many ways the nervous system makes this possible, of which synaptic plasticity is possibly the most notable one. On the other hand, it is vital to understand how the nervous system also loses stability, resulting in neuropathological diseases such as epilepsy, a disease which affects 1% of the population. In the following work, we seek to answer some of these questions from two different perspectives. The first uses mean-field theory applied to neuronal populations, where the variables of interest are the percentages of active excitatory and inhibitory neurons in a network, to consider how the nervous system responds to external stimuli, self-organizes and generates epileptiform activity. The second method uses statistical field theory, in the framework of single neurons on a lattice, to study the concept of criticality, an idea borrowed from physics which posits that in some regime the brain operates in a collectively stable or marginally stable manner. This will be examined in two different neuronal networks with self-organized criticality serving as the overarching theme for the union of both perspectives. One of the biggest problems in neuroscience is the question of to what extent certain details are significant to the functioning of the brain. These details give rise to various spatiotemporal properties that at the smallest of scales explain the interaction of single neurons and synapses and at the largest of scales describe, for example, behaviors and sensations. In what follows, we will shed some
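    As a hedged illustration of the population-level (mean-field) approach described above, the sketch below integrates a standard Wilson-Cowan-type model for the fractions of active excitatory and inhibitory neurons; the equations and parameters are a textbook stand-in, not taken from this work:

```python
# Standard Wilson-Cowan-type mean-field model for the fractions of active
# excitatory (E) and inhibitory (I) neurons; parameters are illustrative.
import numpy as np
from scipy.integrate import solve_ivp

def sigmoid(x, gain=1.0, thresh=4.0):
    return 1.0 / (1.0 + np.exp(-gain * (x - thresh)))

def wilson_cowan(t, y, w_ee=16, w_ei=12, w_ie=15, w_ii=3, p_ext=1.25, tau=1.0):
    E, I = y
    dE = (-E + (1 - E) * sigmoid(w_ee * E - w_ei * I + p_ext)) / tau
    dI = (-I + (1 - I) * sigmoid(w_ie * E - w_ii * I)) / tau
    return [dE, dI]

sol = solve_ivp(wilson_cowan, t_span=(0, 100), y0=[0.1, 0.05], max_step=0.1)
print(f"final active fractions: E = {sol.y[0, -1]:.3f}, I = {sol.y[1, -1]:.3f}")
```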

  17. Large-Scale Agriculture and Outgrower Schemes in Ethiopia

    DEFF Research Database (Denmark)

    Wendimu, Mengistu Assefa

    , the impact of large-scale agriculture and outgrower schemes on productivity, household welfare and wages in developing countries is highly contentious. Chapter 1 of this thesis provides an introduction to the study, while also reviewing the key debate in the contemporary land ‘grabbing’ and historical large...... sugarcane outgrower scheme on household income and asset stocks. Chapter 5 examines the wages and working conditions in ‘formal’ large-scale and ‘informal’ small-scale irrigated agriculture. The results in Chapter 2 show that moisture stress, the use of untested planting materials, and conflict over land...... commands a higher wage than ‘formal’ large-scale agriculture, while rather different wage determination mechanisms exist in the two sectors. Human capital characteristics (education and experience) partly explain the differences in wages within the formal sector, but play no significant role...

  18. Scale Reliability Evaluation with Heterogeneous Populations

    Science.gov (United States)

    Raykov, Tenko; Marcoulides, George A.

    2015-01-01

    A latent variable modeling approach for scale reliability evaluation in heterogeneous populations is discussed. The method can be used for point and interval estimation of reliability of multicomponent measuring instruments in populations representing mixtures of an unknown number of latent classes or subpopulations. The procedure is helpful also…

  19. Economically viable large-scale hydrogen liquefaction

    Science.gov (United States)

    Cardella, U.; Decker, L.; Klein, H.

    2017-02-01

    The liquid hydrogen demand, particularly driven by clean energy applications, will rise in the near future. As industrial large scale liquefiers will play a major role within the hydrogen supply chain, production capacity will have to increase by a multiple of today’s typical sizes. The main goal is to reduce the total cost of ownership for these plants by increasing energy efficiency with innovative and simple process designs, optimized in capital expenditure. New concepts must ensure a manageable plant complexity and flexible operability. In the phase of process development and selection, a dimensioning of key equipment for large scale liquefiers, such as turbines and compressors as well as heat exchangers, must be performed iteratively to ensure technological feasibility and maturity. Further critical aspects related to hydrogen liquefaction, e.g. fluid properties, ortho-para hydrogen conversion, and coldbox configuration, must be analysed in detail. This paper provides an overview on the approach, challenges and preliminary results in the development of efficient as well as economically viable concepts for large-scale hydrogen liquefaction.

  20. Large scale chromatographic separations using continuous displacement chromatography (CDC)

    International Nuclear Information System (INIS)

    Taniguchi, V.T.; Doty, A.W.; Byers, C.H.

    1988-01-01

    A process for large scale chromatographic separations using a continuous chromatography technique is described. The process combines the advantages of large scale batch fixed column displacement chromatography with conventional analytical or elution continuous annular chromatography (CAC) to enable large scale displacement chromatography to be performed on a continuous basis (CDC). Such large scale, continuous displacement chromatography separations have not been reported in the literature. The process is demonstrated with the ion exchange separation of a binary lanthanide (Nd/Pr) mixture. The process is, however, applicable to any displacement chromatography separation that can be performed using conventional batch, fixed column chromatography

  1. Large Scale Processes and Extreme Floods in Brazil

    Science.gov (United States)

    Ribeiro Lima, C. H.; AghaKouchak, A.; Lall, U.

    2016-12-01

    Persistent large scale anomalies in the atmospheric circulation and ocean state have been associated with heavy rainfall and extreme floods in water basins of different sizes across the world. Such studies have emerged in recent years as a new tool to improve the traditional, stationarity-based approach to flood frequency analysis and flood prediction. Here we seek to advance previous studies by evaluating the dominance of large scale processes (e.g. atmospheric rivers/moisture transport) over local processes (e.g. local convection) in producing floods. We consider flood-prone regions in Brazil as case studies, and the role of large scale climate processes in generating extreme floods in such regions is explored by means of observed streamflow, reanalysis data and machine learning methods. The dynamics of the large scale atmospheric circulation in the days prior to the flood events are evaluated based on the vertically integrated moisture flux and its divergence field, which are interpreted in a low-dimensional space obtained with machine learning techniques, particularly supervised kernel principal component analysis. In this reduced dimensional space, clusters are obtained in order to better understand the role of regional moisture recycling or teleconnected moisture in producing floods of a given magnitude. The convective available potential energy (CAPE) is also used as a measure of local convective activity. We investigate for individual sites the exceedance probability at which large scale atmospheric fluxes dominate the flood process. Finally, we analyze regional patterns of floods and how the scaling law of floods with drainage area responds to changes in the climate forcing mechanisms (e.g. local vs large scale).
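    A hedged sketch of the dimensionality-reduction and clustering steps mentioned above, using scikit-learn's unsupervised kernel PCA as a stand-in for the supervised variant cited in the abstract; the moisture-flux fields are synthetic:

```python
# Unsupervised kernel PCA + k-means as a stand-in for the supervised kernel PCA
# and clustering described in the abstract; the "moisture flux fields" are noise.
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
fields = rng.standard_normal((300, 40 * 60))   # 300 days x flattened 40x60 grid (toy)

embedding = KernelPCA(n_components=3, kernel="rbf", gamma=1e-4).fit_transform(fields)
clusters = KMeans(n_clusters=4, n_init=10, random_state=0).fit_predict(embedding)
print("cluster sizes:", np.bincount(clusters))
```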

  2. Computing in Large-Scale Dynamic Systems

    NARCIS (Netherlands)

    Pruteanu, A.S.

    2013-01-01

    Software applications for large-scale systems have always been difficult to develop due to problems caused by the large number of computing devices involved. Above a certain network size (roughly one hundred), necessary services such as code updating, topology discovery and data

  3. Fires in large scale ventilation systems

    International Nuclear Information System (INIS)

    Gregory, W.S.; Martin, R.A.; White, B.W.; Nichols, B.D.; Smith, P.R.; Leslie, I.H.; Fenton, D.L.; Gunaji, M.V.; Blythe, J.P.

    1991-01-01

    This paper summarizes the experience gained simulating fires in large scale ventilation systems patterned after ventilation systems found in nuclear fuel cycle facilities. The series of experiments discussed included: (1) combustion aerosol loading of 0.61x0.61 m HEPA filters with the combustion products of two organic fuels, polystyrene and polymethylmethacrylate; (2) gas dynamics and heat transport through a large scale ventilation system consisting of a 0.61x0.61 m duct 90 m in length, with dampers, HEPA filters, blowers, etc.; (3) gas dynamics and simultaneous transport of heat and solid particulate (consisting of glass beads with a mean aerodynamic diameter of 10 μm) through the large scale ventilation system; and (4) the transport of heat and soot, generated by kerosene pool fires, through the large scale ventilation system. The FIRAC computer code, designed to predict fire-induced transients in nuclear fuel cycle facility ventilation systems, was used to predict the results of experiments (2) through (4). In general, the results of the predictions were satisfactory. The code predictions for the gas dynamics, heat transport, and particulate transport and deposition were within 10% of the experimentally measured values. However, the code was less successful in predicting the amount of soot generation from kerosene pool fires, probably due to the fire module of the code being a one-dimensional zone model. The experiments revealed a complicated three-dimensional combustion pattern within the fire room of the ventilation system. Further refinement of the fire module within FIRAC is needed. (orig.)

  4. Reduced fine-scale spatial genetic structure in grazed populations of Dianthus carthusianorum.

    Science.gov (United States)

    Rico, Y; Wagner, H H

    2016-11-01

    Strong spatial genetic structure in plant populations can increase homozygosity, reducing genetic diversity and adaptive potential. The strength of spatial genetic structure largely depends on rates of seed dispersal and pollen flow. Seeds without dispersal adaptations are likely to be dispersed over short distances within the vicinity of the mother plant, resulting in spatial clustering of related genotypes (fine-scale spatial genetic structure, hereafter spatial genetic structure (SGS)). However, primary seed dispersal by zoochory can promote effective dispersal, increasing the mixing of seeds and influencing SGS within plant populations. In this study, we investigated the effects of seed dispersal by rotational sheep grazing on the strength of SGS and genetic diversity using 11 nuclear microsatellites for 49 populations of the calcareous grassland forb Dianthus carthusianorum. Populations connected by rotational sheep grazing showed significantly weaker SGS and higher genetic diversity than populations in ungrazed grasslands. Independent of grazing treatment, small populations showed significantly stronger SGS and lower genetic diversity than larger populations, likely due to genetic drift. A lack of significant differences in the strength of SGS and genetic diversity between populations that were recently colonized and pre-existing populations suggested that populations colonized after the reintroduction of rotational sheep grazing were likely founded by colonists from diverse source populations. We conclude that dispersal by rotational sheep grazing has the potential to considerably reduce SGS within D. carthusianorum populations. Our study highlights the effectiveness of landscape management by rotational sheep grazing to importantly reduce genetic structure at local scales within restored plant populations.

  5. Large-scale Complex IT Systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2011-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that identifies the major challen...

  6. Large-scale complex IT systems

    OpenAIRE

    Sommerville, Ian; Cliff, Dave; Calinescu, Radu; Keen, Justin; Kelly, Tim; Kwiatkowska, Marta; McDermid, John; Paige, Richard

    2012-01-01

    This paper explores the issues around the construction of large-scale complex systems which are built as 'systems of systems' and suggests that there are fundamental reasons, derived from the inherent complexity in these systems, why our current software engineering methods and techniques cannot be scaled up to cope with the engineering challenges of constructing such systems. It then goes on to propose a research and education agenda for software engineering that ident...

  7. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed; Elsawy, Hesham; Gharbieh, Mohammad; Alouini, Mohamed-Slim; Adinoyi, Abdulkareem; Alshaalan, Furaih

    2017-01-01

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end

  8. Natural Selection in Large Populations

    Science.gov (United States)

    Desai, Michael

    2011-03-01

    I will discuss theoretical and experimental approaches to the evolutionary dynamics and population genetics of natural selection in large populations. In these populations, many mutations are often present simultaneously, and because recombination is limited, selection cannot act on them all independently. Rather, it can only affect whole combinations of mutations linked together on the same chromosome. Methods common in theoretical population genetics have been of limited utility in analyzing this coupling between the fates of different mutations. In the past few years it has become increasingly clear that this is a crucial gap in our understanding, as sequence data has begun to show that selection appears to act pervasively on many linked sites in a wide range of populations, including viruses, microbes, Drosophila, and humans. I will describe approaches that combine analytical tools drawn from statistical physics and dynamical systems with traditional methods in theoretical population genetics to address this problem, and describe how experiments in budding yeast can help us directly observe these evolutionary dynamics.

  9. A large-scale study of epilepsy in Ecuador: methodological aspects.

    Science.gov (United States)

    Placencia, M; Suarez, J; Crespo, F; Sander, J W; Shorvon, S D; Ellison, R H; Cascante, S M

    1992-01-01

    We present the methodology of a large-scale study of epilepsy carried out in a highland area of northern Ecuador, South America, covering a population of 72,121 people. The study was carried out in two phases. The first, a cross-sectional phase, consisted of a house-to-house survey of all persons in this population, screening for epileptic seizures using a specially designed questionnaire. Possible cases identified in screening were assessed in a cascade diagnostic procedure applied by general doctors and neurologists. Its objectives were: to establish a comprehensive epidemiological profile of epileptic seizures; to describe the clinical phenomenology of this condition in the community; to validate methods for diagnosis and classification of epileptic seizures by a non-specialised team; and to ascertain the community's knowledge, attitudes and practices regarding epilepsy. A sample was selected in this phase in order to study the social aspects of epilepsy in this community. The second phase, which was longitudinal, assessed the capacity of non-specialist care to treat epilepsy. It consisted of a prospective clinical trial of antiepileptic therapy in untreated patients using two standard anti-epileptic drugs. Patients were followed for 12 months by a multidisciplinary team consisting of a primary health worker, rural doctor, neurologist, anthropologist, and psychologist. Standardised, reproducible instruments and methods were used. This study was carried out through co-operation between the medical profession, political agencies and the pharmaceutical industry, at an international level. We consider this a model for further large-scale studies of this type.

  10. Prospects for large scale electricity storage in Denmark

    DEFF Research Database (Denmark)

    Krog Ekman, Claus; Jensen, Søren Højgaard

    2010-01-01

    In a future power system with additional wind power capacity there will be an increased need for large scale power management as well as reliable balancing and reserve capabilities. Different technologies for large scale electricity storage provide solutions to the different challenges arising w...

  11. Evolution of scaling emergence in large-scale spatial epidemic spreading.

    Science.gov (United States)

    Wang, Lin; Li, Xiang; Zhang, Yi-Qing; Zhang, Yan; Zhang, Kan

    2011-01-01

    Zipf's law and Heaps' law are two representatives of the scaling concepts that play a significant role in the study of complexity science. The coexistence of Zipf's law and Heaps' law motivates different understandings of the dependence between these two scalings, which has still hardly been clarified. In this article, we observe an evolution process of the scalings: Zipf's law and Heaps' law are naturally shaped to coexist at the initial time, while a crossover comes with the emergence of their inconsistency at larger times before reaching a stable state, where Heaps' law still holds while strict Zipf's law disappears. Such findings are illustrated with a scenario of large-scale spatial epidemic spreading, and the empirical results of pandemic disease support a universal analysis of the relation between the two laws regardless of the biological details of the disease. Employing United States domestic air transportation and demographic data to construct a metapopulation model for simulating pandemic spread at the U.S. country level, we uncover that the broad heterogeneity of the infrastructure plays a key role in the evolution of scaling emergence. The analyses of large-scale spatial epidemic spreading help understand the temporal evolution of scalings, indicating that the coexistence of Zipf's law and Heaps' law depends on the collective dynamics of epidemic processes, and that the heterogeneity of epidemic spread indicates the significance of performing targeted containment strategies at the early stage of a pandemic disease.
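    The two scalings can be estimated from any labelled event stream; the sketch below uses a synthetic stream of location labels and crude log-log slope fits, and is not the metapopulation model of the study:

```python
# Crude estimates of the Zipf and Heaps exponents from a synthetic stream of
# "newly infected location" labels, using log-log slope fits.
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
locations = rng.zipf(a=2.0, size=20000)          # heavy-tailed toy event stream

# Zipf's law: frequency versus rank
freqs = np.array(sorted(Counter(locations).values(), reverse=True), dtype=float)
ranks = np.arange(1, len(freqs) + 1)
zipf_exponent = -np.polyfit(np.log(ranks), np.log(freqs), 1)[0]

# Heaps' law: number of distinct locations versus number of events
seen, distinct = set(), []
for loc in locations:
    seen.add(loc)
    distinct.append(len(seen))
n_events = np.arange(1, len(locations) + 1)
heaps_exponent = np.polyfit(np.log(n_events), np.log(distinct), 1)[0]

print(f"Zipf exponent ~ {zipf_exponent:.2f}, Heaps exponent ~ {heaps_exponent:.2f}")
```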

  12. Large-Scale Structure and Hyperuniformity of Amorphous Ices

    Science.gov (United States)

    Martelli, Fausto; Torquato, Salvatore; Giovambattista, Nicolas; Car, Roberto

    2017-09-01

    We investigate the large-scale structure of amorphous ices and transitions between their different forms by quantifying their large-scale density fluctuations. Specifically, we simulate the isothermal compression of low-density amorphous ice (LDA) and hexagonal ice to produce high-density amorphous ice (HDA). Both HDA and LDA are nearly hyperuniform; i.e., they are characterized by an anomalous suppression of large-scale density fluctuations. By contrast, in correspondence with the nonequilibrium phase transitions to HDA, the presence of structural heterogeneities strongly suppresses the hyperuniformity and the system becomes hyposurficial (devoid of "surface-area fluctuations"). Our investigation challenges the largely accepted "frozen-liquid" picture, which views glasses as structurally arrested liquids. Beyond implications for water, our findings enrich our understanding of pressure-induced structural transformations in glasses.

  13. Large-scale assessment of olfactory preferences and learning in Drosophila melanogaster: behavioral and genetic components

    Directory of Open Access Journals (Sweden)

    Elisabetta Versace

    2015-09-01

    Full Text Available In the Evolve and Resequence method (E&R), experimental evolution and genomics are combined to investigate evolutionary dynamics and the genotype-phenotype link. Like other genomic approaches, this method requires many replicates with large population sizes, which imposes severe restrictions on the analysis of behavioral phenotypes. Aiming to use E&R for investigating the evolution of behavior in Drosophila, we have developed a simple and effective method to assess spontaneous olfactory preferences and learning in large samples of fruit flies using a T-maze. We tested this procedure on (a) a large wild-caught population and (b) 11 isofemale lines of Drosophila melanogaster. Compared to previous methods, this procedure reduces the environmental noise and allows for the analysis of large population samples. Consistent with previous results, we show that flies have a preference for orange vs. apple odor. With our procedure wild-derived flies exhibit olfactory learning in the absence of previous laboratory selection. Furthermore, we find genetic differences in olfactory learning with relatively high heritability. We propose this large-scale method as an effective tool for E&R and genome-wide association studies on olfactory preferences and learning.
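    A small, hypothetical sketch of two quantities implied by this record, a per-run olfactory preference index and a crude broad-sense heritability across isofemale lines; counts and line structure are invented:

```python
# Per-run olfactory preference index and a crude broad-sense heritability
# estimate across isofemale lines; counts and line structure are invented.
import numpy as np

# flies choosing the orange vs. apple arm: 3 isofemale lines x 4 T-maze runs (toy)
orange = np.array([[62, 58, 70, 65], [50, 48, 55, 52], [80, 76, 83, 79]])
apple = np.array([[38, 42, 30, 35], [50, 52, 45, 48], [20, 24, 17, 21]])

pref_index = (orange - apple) / (orange + apple)   # per-run preference in [-1, 1]

var_between = pref_index.mean(axis=1).var(ddof=1)    # among lines
var_within = pref_index.var(axis=1, ddof=1).mean()   # within lines
h2_broad = var_between / (var_between + var_within)  # crude broad-sense H^2
print(f"mean preference index = {pref_index.mean():.2f}, H^2 ~ {h2_broad:.2f}")
```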

  14. Large-scale absence of sharks on reefs in the greater-Caribbean: a footprint of human pressures.

    Directory of Open Access Journals (Sweden)

    Christine A Ward-Paige

    Full Text Available BACKGROUND: In recent decades, large pelagic and coastal shark populations have declined dramatically with increased fishing; however, the status of sharks in other systems such as coral reefs remains largely unassessed despite a long history of exploitation. Here we explore the contemporary distribution and sighting frequency of sharks on reefs in the greater-Caribbean and assess the possible role of human pressures on observed patterns. METHODOLOGY/PRINCIPAL FINDINGS: We analyzed 76,340 underwater surveys carried out by trained volunteer divers between 1993 and 2008. Surveys were grouped within one km2 cells, which allowed us to determine the contemporary geographical distribution and sighting frequency of sharks. Sighting frequency was calculated as the ratio of surveys with sharks to the total number of surveys in each cell. We compared sighting frequency to the number of people in the cell vicinity and used population viability analyses to assess the effects of exploitation on population trends. Sharks, with the exception of nurse sharks, occurred mainly in areas with very low human population or strong fishing regulations and marine conservation. Population viability analysis suggests that exploitation alone could explain the large-scale absence; however, this pattern is likely to be exacerbated by additional anthropogenic stressors, such as pollution and habitat degradation, that also correlate with human population. CONCLUSIONS/SIGNIFICANCE: Human pressures in coastal zones have led to the broad-scale absence of sharks on reefs in the greater-Caribbean. Preventing further loss of sharks requires urgent management measures to curb fishing mortality and to mitigate other anthropogenic stressors to protect sites where sharks still exist. The fact that sharks still occur in some densely populated areas where strong fishing regulations are in place indicates the possibility of success and encourages the implementation of conservation measures.

  15. Large-scale absence of sharks on reefs in the greater-Caribbean: a footprint of human pressures.

    Science.gov (United States)

    Ward-Paige, Christine A; Mora, Camilo; Lotze, Heike K; Pattengill-Semmens, Christy; McClenachan, Loren; Arias-Castro, Ery; Myers, Ransom A

    2010-08-05

    In recent decades, large pelagic and coastal shark populations have declined dramatically with increased fishing; however, the status of sharks in other systems such as coral reefs remains largely unassessed despite a long history of exploitation. Here we explore the contemporary distribution and sighting frequency of sharks on reefs in the greater-Caribbean and assess the possible role of human pressures on observed patterns. We analyzed 76,340 underwater surveys carried out by trained volunteer divers between 1993 and 2008. Surveys were grouped within one km2 cells, which allowed us to determine the contemporary geographical distribution and sighting frequency of sharks. Sighting frequency was calculated as the ratio of surveys with sharks to the total number of surveys in each cell. We compared sighting frequency to the number of people in the cell vicinity and used population viability analyses to assess the effects of exploitation on population trends. Sharks, with the exception of nurse sharks, occurred mainly in areas with very low human population or strong fishing regulations and marine conservation. Population viability analysis suggests that exploitation alone could explain the large-scale absence; however, this pattern is likely to be exacerbated by additional anthropogenic stressors, such as pollution and habitat degradation, that also correlate with human population. Human pressures in coastal zones have led to the broad-scale absence of sharks on reefs in the greater-Caribbean. Preventing further loss of sharks requires urgent management measures to curb fishing mortality and to mitigate other anthropogenic stressors to protect sites where sharks still exist. The fact that sharks still occur in some densely populated areas where strong fishing regulations are in place indicates the possibility of success and encourages the implementation of conservation measures.
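    A minimal sketch of the sighting-frequency calculation described above (surveys with sharks divided by total surveys per 1-km2 cell), with hypothetical survey rows:

```python
# Sighting frequency per cell = surveys with sharks / total surveys in the cell;
# the survey rows and population figures are hypothetical.
import pandas as pd

surveys = pd.DataFrame({
    "cell_id":    [1, 1, 1, 2, 2, 3, 3, 3, 3],
    "shark_seen": [0, 1, 0, 0, 0, 1, 1, 0, 1],
    "human_pop":  [50, 50, 50, 20000, 20000, 300, 300, 300, 300],
})

per_cell = surveys.groupby("cell_id").agg(
    sighting_freq=("shark_seen", "mean"),
    human_pop=("human_pop", "first"),
)
print(per_cell)
print("Spearman correlation:",
      per_cell["sighting_freq"].corr(per_cell["human_pop"], method="spearman"))
```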

  16. Double inflation: A possible resolution of the large-scale structure problem

    International Nuclear Information System (INIS)

    Turner, M.S.; Villumsen, J.V.; Vittorio, N.; Silk, J.; Juszkiewicz, R.

    1986-11-01

    A model is presented for the large-scale structure of the universe in which two successive inflationary phases resulted in large small-scale and small large-scale density fluctuations. This bimodal density fluctuation spectrum in an Ω = 1 universe dominated by hot dark matter leads to large-scale structure of the galaxy distribution that is consistent with recent observational results. In particular, large, nearly empty voids and significant large-scale peculiar velocity fields are produced over scales of ∼100 Mpc, while the small-scale structure over ≤ 10 Mpc resembles that in a low density universe, as observed. Detailed analytical calculations and numerical simulations are given of the spatial and velocity correlations. 38 refs., 6 figs

  17. Large-scale fracture mechanics testing -- requirements and possibilities

    International Nuclear Information System (INIS)

    Brumovsky, M.

    1993-01-01

    Application of fracture mechanics to very important and/or complicated structures, like reactor pressure vessels, also raises questions about the reliability and precision of such calculations. These problems become more pronounced under elastic-plastic loading conditions and/or in parts with non-homogeneous materials (base metal and austenitic cladding, property gradients through the material thickness) or with non-homogeneous stress fields (nozzles, bolt threads, residual stresses, etc.). For such special cases some verification by large-scale testing is necessary and valuable. This paper discusses problems connected with the planning of such experiments with respect to their limitations and the requirements for a good transfer of the results to an actual vessel. An analysis of the possibilities of small-scale model experiments is also presented, mostly in connection with transferring results between standard, small-scale and large-scale experiments. Experience from 30 years of large-scale testing at SKODA is used as an example to support this analysis. 1 fig

  18. Ethics of large-scale change

    DEFF Research Database (Denmark)

    Arler, Finn

    2006-01-01

    , which kind of attitude is appropriate when dealing with large-scale changes like these from an ethical point of view. Three kinds of approaches are discussed: Aldo Leopold's mountain thinking, the neoclassical economists' approach, and finally the so-called Concentric Circle Theories approach...

  19. Behavioral responses of birds of prey to large scale energy development in southcentral Washington

    International Nuclear Information System (INIS)

    Fitzner, R.E.

    1985-02-01

    The types of raptorial and semi-raptorial birds that use the Hanford environs are discussed along with the impacts of past operations and the recent WPPSS project on their populations. These findings add to our understanding of the population dynamics of the birds-of-prey community at the Hanford Site and the expected impacts of the WPPSS energy facilities. The results may have implications for other large-scale energy facilities and may aid in the management of bird-of-prey communities throughout the grasslands of the western United States. 110 refs., 5 figs., 4 tabs.

  20. Comparison Between Overtopping Discharge in Small and Large Scale Models

    DEFF Research Database (Denmark)

    Helgason, Einar; Burcharth, Hans F.

    2006-01-01

    The present paper presents overtopping measurements from small scale model tests performed at the Hydraulic & Coastal Engineering Laboratory, Aalborg University, Denmark and large scale model tests performed at the Large Wave Channel, Hannover, Germany. Comparison between results obtained from...... small and large scale model tests show no clear evidence of scale effects for overtopping above a threshold value. In the large scale model no overtopping was measured for wave heights below Hs = 0.5m as the water sunk into the voids between the stones on the crest. For low overtopping scale effects...

  1. Multi-scale temporal and spatial variation in genotypic composition of Cladophora-borne Escherichia coli populations in Lake Michigan.

    Science.gov (United States)

    Badgley, Brian D; Ferguson, John; Vanden Heuvel, Amy; Kleinheinz, Gregory T; McDermott, Colleen M; Sandrin, Todd R; Kinzelman, Julie; Junion, Emily A; Byappanahalli, Muruleedhara N; Whitman, Richard L; Sadowsky, Michael J

    2011-01-01

    High concentrations of Escherichia coli in mats of Cladophora in the Great Lakes have raised concern over the continued use of this bacterium as an indicator of microbial water quality. Determining the impacts of these environmentally abundant E. coli, however, necessitates a better understanding of their ecology. In this study, the population structure of 4285 Cladophora-borne E. coli isolates, obtained over multiple three-day periods from Lake Michigan Cladophora mats in 2007-2009, was examined by using DNA fingerprint analyses. In contrast to previous studies that used isolates from attached Cladophora obtained over large time scales and distances, the extensive sampling done here on free-floating mats over successive days at multiple sites provided a large dataset that allowed for a detailed examination of changes in population structure over a wide range of spatial and temporal scales. While Cladophora-borne E. coli populations were highly diverse and consisted of many unique isolates, multiple clonal groups were also present and accounted for approximately 33% of all isolates examined. Patterns in population structure were also evident. At the broadest scales, E. coli populations showed some temporal clustering when examined by year, but did not show good spatial distinction among sites. E. coli population structure also showed significant patterns at much finer temporal scales. Populations were distinct on an individual mat basis at a given site, and on individual days within a single mat. Results of these studies indicate that Cladophora-borne E. coli populations consist of a mixture of stable, and possibly naturalized, strains that persist during the life of the mat, and more unique, transient strains that can change over rapid time scales. It is clear that further study of microbial processes at fine spatial and temporal scales is needed, and that caution must be taken when interpolating short-term microbial dynamics from results obtained

  2. Isolating relativistic effects in large-scale structure

    Science.gov (United States)

    Bonvin, Camille

    2014-12-01

    We present a fully relativistic calculation of the observed galaxy number counts in the linear regime. We show that besides the density fluctuations and redshift-space distortions, various relativistic effects contribute to observations at large scales. These effects all have the same physical origin: they result from the fact that our coordinate system, namely the galaxy redshift and the incoming photons’ direction, is distorted by inhomogeneities in our Universe. We then discuss the impact of the relativistic effects on the angular power spectrum and on the two-point correlation function in configuration space. We show that the latter is very well adapted to isolate the relativistic effects since it naturally makes use of the symmetries of the different contributions. In particular, we discuss how the Doppler effect and the gravitational redshift distortions can be isolated by looking for a dipole in the cross-correlation function between a bright and a faint population of galaxies.

  3. Needs, opportunities, and options for large scale systems research

    Energy Technology Data Exchange (ETDEWEB)

    Thompson, G.L.

    1984-10-01

    The Office of Energy Research was recently asked to perform a study of Large Scale Systems in order to facilitate the development of a true large systems theory. It was decided to ask experts in the fields of electrical engineering, chemical engineering and manufacturing/operations research for their ideas concerning large scale systems research. The author was asked to distribute a questionnaire among these experts to find out their opinions concerning recent accomplishments and future research directions in large scale systems research. He was also requested to convene a conference which included three experts in each area as panel members to discuss the general area of large scale systems research. The conference was held on March 26--27, 1984 in Pittsburgh with nine panel members, and 15 other attendees. The present report is a summary of the ideas presented and the recommendations proposed by the attendees.

  4. Large-scale structure of the Universe

    International Nuclear Information System (INIS)

    Doroshkevich, A.G.

    1978-01-01

    The problems discussed at the ''Large-scale Structure of the Universe'' symposium are considered at a popular level. Described are the cell structure of the galaxy distribution in the Universe and the principles of mathematical modelling of the galaxy distribution. Images of cell structures obtained after computer processing are given. Three hypotheses are discussed - vortical, entropic and adiabatic - suggesting various processes for the origin of galaxies and galaxy clusters. A considerable advantage of the adiabatic hypothesis is recognized. The relict radiation is considered as a method of directly studying the processes taking place in the Universe. The large-scale peculiarities and small-scale fluctuations of the relict radiation temperature enable one to estimate the disturbance properties at the pre-galaxy stage. The discussion of problems pertaining to the study of the hot gas contained in galaxy clusters, and the interactions within galaxy clusters and with the inter-galaxy medium, is recognized to be a notable contribution to the development of theoretical and observational cosmology

  5. Seismic safety in conducting large-scale blasts

    Science.gov (United States)

    Mashukov, I. V.; Chaplygin, V. V.; Domanov, V. P.; Semin, A. A.; Klimkin, M. A.

    2017-09-01

    In mining enterprises a drilling and blasting method is used to prepare hard rocks for excavation. As mining operations approach settlements, the negative effects of large-scale blasts increase. To assess the level of seismic impact of large-scale blasts, the scientific staff of Siberian State Industrial University carried out expert assessments for coal mines and iron ore enterprises. The magnitude of surface seismic vibrations caused by mass explosions was determined using seismic receivers and an analog-to-digital converter with recording on a laptop. The results of recording surface seismic vibrations during more than 280 large-scale blasts at 17 mining enterprises in 22 settlements are presented. The maximum velocity values of the Earth's surface vibrations are determined. The safety of the seismic effect was evaluated against the permissible value of vibration velocity. For cases exceeding the permissible values, recommendations were developed to reduce the level of seismic impact.

  6. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-05-27

    While real-time applications are nowadays routinely used in visualizing large numerical simulations and volumes, handling these large-scale datasets requires high-end graphics clusters or supercomputers to process and visualize them. However, not all users have access to powerful clusters. Therefore, it is challenging to come up with a visualization approach that provides insight into large-scale datasets on a single computer. Explorable images (EI) is one of the methods that allows users to handle large data on a single workstation. Although it is a view-dependent method, it combines both exploration and modification of visual aspects without re-accessing the original huge data. In this thesis, we propose a novel image-based method that applies the concept of EI in visualizing large flow-field pathline data. The goal of our work is to provide an optimized image-based method, which scales well with the dataset size. Our approach is based on constructing a per-pixel linked list data structure in which each pixel contains a list of pathline segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination and deferred shading are applied, which further improve the performance and scalability of our approach.
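    The per-pixel linked list underlying this approach can be illustrated with a minimal CPU-side sketch; on the GPU it would be built with atomic counters and a fragment buffer, and the structure and field names below are illustrative assumptions, not those of the thesis.

```python
import numpy as np

class PerPixelLists:
    """CPU-side sketch of a per-pixel linked list of pathline fragments.

    head[y, x] stores the index of the most recently inserted fragment for
    that pixel (-1 if empty); each fragment records its data plus the index
    of the previous fragment, forming a singly linked list per pixel.
    """
    def __init__(self, width, height):
        self.head = np.full((height, width), -1, dtype=np.int64)
        self.fragments = []          # list of (depth, pathline_id, next_index)

    def insert(self, x, y, depth, pathline_id):
        self.fragments.append((depth, pathline_id, self.head[y, x]))
        self.head[y, x] = len(self.fragments) - 1

    def pixel_fragments(self, x, y):
        """Walk the list for one pixel, e.g. to filter or color-code segments."""
        idx = self.head[y, x]
        while idx != -1:
            depth, pathline_id, idx = self.fragments[idx]
            yield depth, pathline_id

grid = PerPixelLists(4, 4)
grid.insert(1, 2, depth=0.3, pathline_id=7)
grid.insert(1, 2, depth=0.8, pathline_id=12)
print(list(grid.pixel_fragments(1, 2)))   # most recently inserted fragment first
```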

  7. The role of large-scale, extratropical dynamics in climate change

    Energy Technology Data Exchange (ETDEWEB)

    Shepherd, T.G. [ed.]

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database.

  8. The role of large-scale, extratropical dynamics in climate change

    International Nuclear Information System (INIS)

    Shepherd, T.G.

    1994-02-01

    The climate modeling community has focused recently on improving our understanding of certain processes, such as cloud feedbacks and ocean circulation, that are deemed critical to climate-change prediction. Although attention to such processes is warranted, emphasis on these areas has diminished a general appreciation of the role played by the large-scale dynamics of the extratropical atmosphere. Lack of interest in extratropical dynamics may reflect the assumption that these dynamical processes are a non-problem as far as climate modeling is concerned, since general circulation models (GCMs) calculate motions on this scale from first principles. Nevertheless, serious shortcomings in our ability to understand and simulate large-scale dynamics exist. Partly due to a paucity of standard GCM diagnostic calculations of large-scale motions and their transports of heat, momentum, potential vorticity, and moisture, a comprehensive understanding of the role of large-scale dynamics in GCM climate simulations has not been developed. Uncertainties remain in our understanding and simulation of large-scale extratropical dynamics and their interaction with other climatic processes, such as cloud feedbacks, large-scale ocean circulation, moist convection, air-sea interaction and land-surface processes. To address some of these issues, the 17th Stanstead Seminar was convened at Bishop's University in Lennoxville, Quebec. The purpose of the Seminar was to promote discussion of the role of large-scale extratropical dynamics in global climate change. Abstracts of the talks are included in this volume. On the basis of these talks, several key issues emerged concerning large-scale extratropical dynamics and their climatic role. Individual records are indexed separately for the database

  9. Large-scale volcanism associated with coronae on Venus

    Science.gov (United States)

    Roberts, K. Magee; Head, James W.

    1993-01-01

    The formation and evolution of coronae on Venus are thought to be the result of mantle upwellings against the crust and lithosphere and subsequent gravitational relaxation. A variety of other features on Venus have been linked to processes associated with mantle upwelling, including shield volcanoes on large regional rises such as Beta, Atla and Western Eistla Regiones and extensive flow fields such as Mylitta and Kaiwan Fluctus near the Lada Terra/Lavinia Planitia boundary. Of these features, coronae appear to possess the smallest amounts of associated volcanism, although volcanism associated with coronae has only been qualitatively examined. An initial survey of coronae based on recent Magellan data indicated that only 9 percent of all coronae are associated with substantial amounts of volcanism, including interior calderas or edifices greater than 50 km in diameter and extensive, exterior radial flow fields. Sixty-eight percent of all coronae were found to have lesser amounts of volcanism, including interior flooding and associated volcanic domes and small shields; the remaining coronae were considered deficient in associated volcanism. It is possible that coronae are related to mantle plumes or diapirs that are lower in volume or in partial melt than those associated with the large shields or flow fields. Regional tectonics or variations in local crustal and thermal structure may also be significant in determining the amount of volcanism produced from an upwelling. It is also possible that flow fields associated with some coronae are sheet-like in nature and may not be readily identified. If coronae are associated with volcanic flow fields, then they may be a significant contributor to plains formation on Venus, as they number over 300 and are widely distributed across the planet. As a continuation of our analysis of large-scale volcanism on Venus, we have reexamined the known population of coronae and assessed quantitatively the scale of volcanism associated

  10. Status: Large-scale subatmospheric cryogenic systems

    International Nuclear Information System (INIS)

    Peterson, T.

    1989-01-01

    In the late 1960's and early 1970's an interest in testing and operating RF cavities at 1.8K motivated the development and construction of four large (300 Watt) 1.8K refrigeration systems. In the past decade, development of successful superconducting RF cavities and interest in obtaining higher magnetic fields with the improved Niobium-Titanium superconductors has once again created interest in large-scale 1.8K refrigeration systems. The L'Air Liquide plant for Tore Supra is a recently commissioned 300 Watt 1.8K system which incorporates new technology, cold compressors, to obtain the low vapor pressure for low temperature cooling. CEBAF proposes to use cold compressors to obtain 5 kW at 2.0K. Magnetic refrigerators of 10 Watt capacity or higher at 1.8K are now being developed. The state of the art of large-scale refrigeration in the range under 4K will be reviewed. 28 refs., 4 figs., 7 tabs.

  11. Fine-scale patterns of population stratification confound rare variant association tests.

    Directory of Open Access Journals (Sweden)

    Timothy D O'Connor

    Full Text Available Advances in next-generation sequencing technology have enabled systematic exploration of the contribution of rare variation to Mendelian and complex diseases. Although it is well known that population stratification can generate spurious associations with common alleles, its impact on rare variant association methods remains poorly understood. Here, we performed exhaustive coalescent simulations with demographic parameters calibrated from exome sequence data to evaluate the performance of nine rare variant association methods in the presence of fine-scale population structure. We find that all methods have an inflated spurious association rate for parameter values that are consistent with levels of differentiation typical of European populations. For example, at a nominal significance level of 5%, some test statistics have a spurious association rate as high as 40%. Finally, we empirically assess the impact of population stratification in a large data set of 4,298 European American exomes. Our results have important implications for the design, analysis, and interpretation of rare variant genome-wide association studies.
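    The spurious association (type I error) rate reported above is simply the fraction of null-simulation replicates declared significant at the nominal level; the sketch below assumes the per-replicate p-values have already been produced by whichever rare variant test is being evaluated, and is not the authors' coalescent simulation pipeline.

```python
import numpy as np

def spurious_association_rate(pvalues, alpha=0.05):
    """Fraction of null-simulation replicates declared significant at level alpha."""
    pvalues = np.asarray(pvalues)
    return float(np.mean(pvalues < alpha))

# Toy illustration: p-values from a well-calibrated test are uniform on [0, 1],
# so the spurious rate should sit near the nominal 5%; population stratification
# is what pushes it far above that in the study's simulations.
rng = np.random.default_rng(0)
calibrated = rng.uniform(size=10_000)
print(spurious_association_rate(calibrated))   # ~0.05
```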

  12. Large-scale weakly supervised object localization via latent category learning.

    Science.gov (United States)

    Chong Wang; Kaiqi Huang; Weiqiang Ren; Junge Zhang; Maybank, Steve

    2015-04-01

    Localizing objects in cluttered backgrounds is challenging under large-scale weakly supervised conditions. Due to the cluttered image condition, objects usually have large ambiguity with backgrounds. Besides, there is also a lack of effective algorithms for large-scale weakly supervised localization in cluttered backgrounds. However, backgrounds contain useful latent information, e.g., the sky in the aeroplane class. If this latent information can be learned, object-background ambiguity can be largely reduced and background can be suppressed effectively. In this paper, we propose latent category learning (LCL) for large-scale cluttered conditions. LCL is an unsupervised learning method which requires only image-level class labels. First, we use latent semantic analysis with a semantic object representation to learn the latent categories, which represent objects, object parts or backgrounds. Second, to determine which category contains the target object, we propose a category selection strategy that evaluates each category's discrimination. Finally, we propose online LCL for use in large-scale conditions. Evaluation on the challenging PASCAL Visual Object Classes (VOC) 2007 and the large-scale ImageNet Large Scale Visual Recognition Challenge (ILSVRC) 2013 detection data sets shows that the method can improve the annotation precision by 10% over previous methods. More importantly, we achieve a detection precision that outperforms previous results by a large margin and is competitive with the supervised deformable part model 5.0 baseline on both data sets.
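    The latent semantic analysis step at the heart of LCL can be illustrated with a truncated SVD of a region-by-visual-word count matrix; this is a generic LSA sketch under assumed inputs, not the authors' full pipeline, which additionally selects the most discriminative category and runs online.

```python
import numpy as np

def latent_categories(counts, n_latent):
    """Truncated SVD of a (regions x visual-words) count matrix.

    Returns the projection of each region onto `n_latent` latent categories,
    in the spirit of the latent semantic analysis step of LCL.
    """
    # Center columns so each latent direction captures co-occurrence structure.
    X = counts - counts.mean(axis=0, keepdims=True)
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return U[:, :n_latent] * s[:n_latent]   # region loadings on latent categories

rng = np.random.default_rng(1)
toy_counts = rng.poisson(3.0, size=(50, 200)).astype(float)  # 50 regions, 200 visual words
print(latent_categories(toy_counts, n_latent=5).shape)       # (50, 5)
```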

  13. Large-scale networks in engineering and life sciences

    CERN Document Server

    Findeisen, Rolf; Flockerzi, Dietrich; Reichl, Udo; Sundmacher, Kai

    2014-01-01

    This edited volume provides insights into and tools for the modeling, analysis, optimization, and control of large-scale networks in the life sciences and in engineering. Large-scale systems are often the result of networked interactions between a large number of subsystems, and their analysis and control are becoming increasingly important. The chapters of this book present the basic concepts and theoretical foundations of network theory and discuss its applications in different scientific areas such as biochemical reactions, chemical production processes, systems biology, electrical circuits, and mobile agents. The aim is to identify common concepts, to understand the underlying mathematical ideas, and to inspire discussions across the borders of the various disciplines.  The book originates from the interdisciplinary summer school “Large Scale Networks in Engineering and Life Sciences” hosted by the International Max Planck Research School Magdeburg, September 26-30, 2011, and will therefore be of int...

  14. Large-scale recovery of an endangered amphibian despite ongoing exposure to multiple stressors

    Science.gov (United States)

    Knapp, Roland A.; Fellers, Gary M.; Kleeman, Patrick M.; Miller, David A. W.; Vrendenburg, Vance T.; Rosenblum, Erica Bree; Briggs, Cheryl J.

    2016-01-01

    Amphibians are one of the most threatened animal groups, with 32% of species at risk for extinction. Given this imperiled status, is the disappearance of a large fraction of the Earth’s amphibians inevitable, or are some declining species more resilient than is generally assumed? We address this question in a species that is emblematic of many declining amphibians, the endangered Sierra Nevada yellow-legged frog (Rana sierrae). Based on >7,000 frog surveys conducted across Yosemite National Park over a 20-y period, we show that, after decades of decline and despite ongoing exposure to multiple stressors, including introduced fish, the recently emerged disease chytridiomycosis, and pesticides, R. sierrae abundance increased sevenfold during the study and at a rate of 11% per year. These increases occurred in hundreds of populations throughout Yosemite, providing a rare example of amphibian recovery at an ecologically relevant spatial scale. Results from a laboratory experiment indicate that these increases may be in part because of reduced frog susceptibility to chytridiomycosis. The disappearance of nonnative fish from numerous water bodies after cessation of stocking also contributed to the recovery. The large-scale increases in R. sierrae abundance that we document suggest that, when habitats are relatively intact and stressors are reduced in their importance by active management or species’ adaptive responses, declines of some amphibians may be partially reversible, at least at a regional scale. Other studies conducted over similarly large temporal and spatial scales are critically needed to provide insight and generality about the reversibility of amphibian declines at a global scale.

  15. Psychometric properties of the Plutchik's Violence Risk Scale on adolescent sample of Spanish-speaking population.

    Science.gov (United States)

    Alcázar-Córcoles, Miguel Á; Verdejo-García, Antonio; Bouso-Sáiz, José C

    2016-01-01

    The objective of the present study was the validation and scaling of the Plutchik's Violence Risk Scale (EV) in an adolescent Spanish-speaking population. For this purpose, a sample of adolescents from El Salvador, Mexico and Spain was obtained. The sample consisted of 1035 participants with a mean age of 16.2 years. There were 450 adolescents from the forensic population (those who had committed a crime) and 585 adolescents from the normal population (no crime committed). The internal consistency of the EV was estimated by Cronbach's alpha coefficient, with a value of 0.782. As for validity, the factorial structures found explain a large proportion of the variance (53.385%); convergent validity was estimated by the correlation between the dimensions found, the EV, and sociodemographic, criminological and personality variables. The developed scales are presented, for the first time in a cross-cultural sample, differentiated by gender and continent. Consequently, the results obtained suggest that the EV is a valid and reliable instrument for the adolescent Spanish-speaking population. Furthermore, it is a quick, easy-to-apply scale, which is valuable in forensic assessment.
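    For reference, the Cronbach's alpha reported above (0.782) is computed from the item-score matrix as α = k/(k−1)·(1 − Σσ²_item/σ²_total); a minimal sketch with toy data follows (the scores are invented for illustration only).

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)  # variance of total scores
    return (k / (k - 1)) * (1.0 - item_variances.sum() / total_variance)

# Toy example: 5 respondents answering 4 items on a 1-5 scale.
scores = [[2, 3, 3, 2],
          [4, 4, 5, 4],
          [1, 2, 2, 1],
          [3, 3, 4, 3],
          [5, 4, 5, 5]]
print(round(cronbach_alpha(scores), 3))
```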

  16. A Novel Architecture of Large-scale Communication in IOT

    Science.gov (United States)

    Ma, Wubin; Deng, Su; Huang, Hongbin

    2018-03-01

    In recent years, many scholars have done a great deal of research on the development of the Internet of Things and networked physical systems. However, few have described in detail a large-scale communication architecture for the IOT. In fact, the non-uniform technology between IPv6 and access points has led to a lack of broad principles for large-scale communication architectures. Therefore, this paper presents the Uni-IPv6 Access and Information Exchange Method (UAIEM), a new architecture and algorithm that addresses large-scale communication in the IOT.

  17. Benefits of transactive memory systems in large-scale development

    OpenAIRE

    Aivars, Sablis

    2016-01-01

    Context. Large-scale software development projects are those consisting of a large number of teams, maybe even spread across multiple locations, and working on large and complex software tasks. That means that neither a team member individually nor an entire team holds all the knowledge about the software being developed and teams have to communicate and coordinate their knowledge. Therefore, teams and team members in large-scale software development projects must acquire and manage expertise...

  18. Study of a large scale neutron measurement channel

    International Nuclear Information System (INIS)

    Amarouayache, Anissa; Ben Hadid, Hayet.

    1982-12-01

    A large scale measurement channel allows the processing of the signal coming from a single neutronic sensor during three different running modes: impulse, fluctuation and current. The study described in this note includes three parts: - A theoretical study of the large scale channel and a brief description of it are given. The results obtained so far in that domain are presented. - The fluctuation mode is thoroughly studied and the improvements to be made are defined. The study of a linear fluctuation channel with automatic commutation of scales is described and the results of the tests are given. In this large scale channel, the data processing method is analog. - To become independent of the problems generated by the use of analog processing of the fluctuation signal, a digital method of data processing is tested. The validity of that method is confirmed. The results obtained on a test system realized according to this method are given and a preliminary plan for further research is defined [fr]

  19. Molecular computational elements encode large populations of small objects

    Science.gov (United States)

    Prasanna de Silva, A.; James, Mark R.; McKinney, Bernadine O. F.; Pears, David A.; Weir, Sheenagh M.

    2006-10-01

    Since the introduction of molecular computation, experimental molecular computational elements have grown to encompass small-scale integration, arithmetic and games, among others. However, the need for a practical application has been pressing. Here we present molecular computational identification (MCID), a demonstration that molecular logic and computation can be applied to a widely relevant issue. Examples of populations that need encoding in the microscopic world are cells in diagnostics or beads in combinatorial chemistry (tags). Taking advantage of the small size (about 1 nm) and large 'on/off' output ratios of molecular logic gates, and using the great variety of logic types, input chemical combinations, switching thresholds and even gate arrays in addition to colours, we produce unique identifiers for members of populations of small polymer beads (about 100 μm) used for the synthesis of combinatorial libraries. Many millions of distinguishable tags become available. This method should be extensible to far smaller objects, with the only requirement being a 'wash and watch' protocol. Our focus on converting molecular science into technology, previously concerned with analog sensors, turns to digital logic devices in the present work.

  20. Imprints of the large-scale structure on AGN formation and evolution

    Science.gov (United States)

    Porqueres, Natàlia; Jasche, Jens; Enßlin, Torsten A.; Lavaux, Guilhem

    2018-04-01

    Black hole masses are found to correlate with several global properties of their host galaxies, suggesting that black holes and galaxies have an intertwined evolution and that active galactic nuclei (AGN) have a significant impact on galaxy evolution. Since the large-scale environment can also affect AGN, this work studies how their formation and properties depend on the environment. We have used a reconstructed three-dimensional high-resolution density field obtained from a Bayesian large-scale structure reconstruction method applied to the 2M++ galaxy sample. A web-type classification relying on the shear tensor is used to identify different structures on the cosmic web, defining voids, sheets, filaments, and clusters. We confirm that the environmental density affects the AGN formation and their properties. We found that the AGN abundance is equivalent to the galaxy abundance, indicating that active and inactive galaxies reside in similar dark matter halos. However, occurrence rates are different for each spectral type and accretion rate. These differences are consistent with the AGN evolutionary sequence suggested by previous authors, Seyferts and Transition objects transforming into low-ionization nuclear emission line regions (LINERs), the weaker counterpart of Seyferts. We conclude that AGN properties depend on the environmental density more than on the web-type. More powerful starbursts and younger stellar populations are found in high densities, where interactions and mergers are more likely. AGN hosts show smaller masses in clusters for Seyferts and Transition objects, which might be due to gas stripping. In voids, the AGN population is dominated by the most massive galaxy hosts.

  1. Capabilities of the Large-Scale Sediment Transport Facility

    Science.gov (United States)

    2016-04-01

    This technical note (ERDC/CHL CHETN-I-88, April 2016) describes the Large-Scale Sediment Transport Facility (LSTF) and recent upgrades to the measurement systems, including pump flow meters, sediment trap weigh tanks, and beach profiling lidar. The purpose of these upgrades was to increase...

  2. Spatiotemporal property and predictability of large-scale human mobility

    Science.gov (United States)

    Zhang, Hai-Tao; Zhu, Tao; Fu, Dongfei; Xu, Bowen; Han, Xiao-Pu; Chen, Duxin

    2018-04-01

    Spatiotemporal characteristics of human mobility emerging from complexity on the individual scale have been extensively studied because of their application potential for human behavior prediction and recommendation, and for control of epidemic spreading. We collect and investigate a comprehensive data set of human activities on large geographical scales, including both website browsing and mobile tower visits. Numerical results show that the degree of activity decays as a power law, indicating that human behaviors are reminiscent of the scale-free random walks known as Lévy flights. More significantly, this study suggests that human activities on large geographical scales have specific non-Markovian characteristics, such as a two-segment power-law distribution of dwelling time and high predictability. Furthermore, a scale-free mobility model with two essential ingredients, i.e., preferential return and exploration, and a Gaussian distribution assumption on the exploration tendency parameter is proposed, which outperforms existing human mobility models under scenarios of large geographical scales.
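    A minimal sketch of the exploration/preferential-return mechanism described above follows; the exploration-probability form ρ·S^(−γ) and the parameter values are illustrative assumptions in the spirit of such models, and the Gaussian-distributed exploration tendency of the actual proposal is not reproduced here.

```python
import numpy as np

def simulate_trajectory(steps, rho=0.6, gamma=0.21, rng=None):
    """Toy exploration vs. preferential-return mobility walk.

    With probability rho * S**(-gamma), where S is the number of distinct
    locations visited so far, the walker explores a brand-new location;
    otherwise it returns to a previously visited location with probability
    proportional to its visit count.
    """
    rng = rng or np.random.default_rng()
    visits = {0: 1}                       # location id -> visit count
    trajectory = [0]
    for _ in range(steps):
        S = len(visits)
        if rng.random() < rho * S ** (-gamma):
            loc = max(visits) + 1         # explore a new location
        else:                             # preferential return
            locs, counts = zip(*visits.items())
            loc = rng.choice(locs, p=np.asarray(counts) / sum(counts))
        visits[loc] = visits.get(loc, 0) + 1
        trajectory.append(loc)
    return trajectory

traj = simulate_trajectory(1000, rng=np.random.default_rng(42))
print(len(set(traj)), "distinct locations visited")
```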

  3. Problems of large-scale vertically-integrated aquaculture

    Energy Technology Data Exchange (ETDEWEB)

    Webber, H H; Riordan, P F

    1976-01-01

    The problems of vertically-integrated aquaculture are outlined; they are concerned with: species limitations (in the market, biological and technological); site selection, feed, manpower needs, and legal, institutional and financial requirements. The gaps in understanding of, and the constraints limiting, large-scale aquaculture are listed. Future action is recommended with respect to: types and diversity of species to be cultivated, marketing, biotechnology (seed supply, disease control, water quality and concerted effort), siting, feed, manpower, legal and institutional aids (granting of water rights, grants, tax breaks, duty-free imports, etc.), and adequate financing. The lack of hard data based on experience suggests that large-scale vertically-integrated aquaculture is a high risk enterprise, and with the high capital investment required, banks and funding institutions are wary of supporting it. Investment in pilot projects is suggested to demonstrate that large-scale aquaculture can be a fully functional and successful business. Construction and operation of such pilot farms is judged to be in the interests of both the public and private sector.

  4. Large-scale computing with Quantum Espresso

    International Nuclear Information System (INIS)

    Giannozzi, P.; Cavazzoni, C.

    2009-01-01

    This paper gives a short introduction to Quantum Espresso: a distribution of software for atomistic simulations in condensed-matter physics, chemical physics, materials science, and to its usage in large-scale parallel computing.

  5. Forecasting climate change impacts on plant populations over large spatial extents

    Science.gov (United States)

    Tredennick, Andrew T.; Hooten, Mevin B.; Aldridge, Cameron L.; Homer, Collin G.; Kleinhesselink, Andrew R.; Adler, Peter B.

    2016-01-01

    Plant population models are powerful tools for predicting climate change impacts in one location, but are difficult to apply at landscape scales. We overcome this limitation by taking advantage of two recent advances: remotely sensed, species-specific estimates of plant cover and statistical models developed for spatiotemporal dynamics of animal populations. Using computationally efficient model reparameterizations, we fit a spatiotemporal population model to a 28-year time series of sagebrush (Artemisia spp.) percent cover over a 2.5 × 5 km landscape in southwestern Wyoming while formally accounting for spatial autocorrelation. We include interannual variation in precipitation and temperature as covariates in the model to investigate how climate affects the cover of sagebrush. We then use the model to forecast the future abundance of sagebrush at the landscape scale under projected climate change, generating spatially explicit estimates of sagebrush population trajectories that have, until now, been impossible to produce at this scale. Our broadscale and long-term predictions are rooted in small-scale and short-term population dynamics and provide an alternative to predictions offered by species distribution models that do not include population dynamics. Our approach, which combines several existing techniques in a novel way, demonstrates the use of remote sensing data to model population responses to environmental change that play out at spatial scales far greater than the traditional field study plot.
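    As a deliberately simplified, non-spatial stand-in for the kind of model described above, the sketch below shows how interannual climate covariates can enter a population-growth equation for log cover; the paper's actual model is Bayesian, spatially explicit, and formally accounts for spatial autocorrelation, none of which is attempted here, and the coefficients are invented for illustration.

```python
import numpy as np

def fit_growth_climate(log_cover, precip, temp):
    """Least-squares fit of a Gompertz-type growth model with climate covariates:

        log C[t+1] = a + b * log C[t] + c * precip[t] + d * temp[t]
    """
    y = log_cover[1:]
    X = np.column_stack([np.ones(len(y)), log_cover[:-1], precip[:-1], temp[:-1]])
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return dict(zip(["a", "b", "c", "d"], coeffs))

# Simulate a synthetic 28-year cover series with known coefficients, then recover them.
rng = np.random.default_rng(3)
years = 28
precip = rng.normal(0, 1, years)
temp = rng.normal(0, 1, years)
log_cover = np.empty(years)
log_cover[0] = np.log(20.0)
for t in range(years - 1):
    log_cover[t + 1] = (0.5 + 0.8 * log_cover[t] + 0.1 * precip[t]
                        - 0.05 * temp[t] + rng.normal(0, 0.05))
print(fit_growth_climate(log_cover, precip, temp))
```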

  6. VESPA: Very large-scale Evolutionary and Selective Pressure Analyses

    Directory of Open Access Journals (Sweden)

    Andrew E. Webb

    2017-06-01

    Full Text Available Background Large-scale molecular evolutionary analyses of protein coding sequences require a number of preparatory, inter-related steps, from finding gene families to generating alignments and phylogenetic trees and assessing selective pressure variation. Each phase of these analyses can represent significant challenges, particularly when working with entire proteomes (all protein coding sequences in a genome) from a large number of species. Methods We present VESPA, software capable of automating a selective pressure analysis using codeML in addition to the preparatory analyses and summary statistics. VESPA is written in Python and Perl and is designed to run within a UNIX environment. Results We have benchmarked VESPA and our results show that the method is consistent, performs well on both large-scale and smaller-scale datasets, and produces results in line with previously published datasets. Discussion Large-scale gene family identification, sequence alignment, and phylogeny reconstruction are all important aspects of large-scale molecular evolutionary analyses. VESPA provides flexible software for simplifying these processes along with downstream selective pressure variation analyses. The software automatically interprets results from codeML and produces simplified summary files to assist the user in better understanding the results. VESPA may be found at the following website: http://www.mol-evol.org/VESPA.

  7. Improving predictions of large scale soil carbon dynamics: Integration of fine-scale hydrological and biogeochemical processes, scaling, and benchmarking

    Science.gov (United States)

    Riley, W. J.; Dwivedi, D.; Ghimire, B.; Hoffman, F. M.; Pau, G. S. H.; Randerson, J. T.; Shen, C.; Tang, J.; Zhu, Q.

    2015-12-01

    Numerical model representations of decadal- to centennial-scale soil-carbon dynamics are a dominant cause of uncertainty in climate change predictions. Recent attempts by some Earth System Model (ESM) teams to integrate previously unrepresented soil processes (e.g., explicit microbial processes, abiotic interactions with mineral surfaces, vertical transport), poor performance of many ESM land models against large-scale and experimental manipulation observations, and complexities associated with spatial heterogeneity highlight the nascent nature of our community's ability to accurately predict future soil carbon dynamics. I will present recent work from our group to develop a modeling framework to integrate pore-, column-, watershed-, and global-scale soil process representations into an ESM (ACME), and apply the International Land Model Benchmarking (ILAMB) package for evaluation. At the column scale and across a wide range of sites, observed depth-resolved carbon stocks and their 14C derived turnover times can be explained by a model with explicit representation of two microbial populations, a simple representation of mineralogy, and vertical transport. Integrating soil and plant dynamics requires a 'process-scaling' approach, since all aspects of the multi-nutrient system cannot be explicitly resolved at ESM scales. I will show that one approach, the Equilibrium Chemistry Approximation, improves predictions of forest nitrogen and phosphorus experimental manipulations and leads to very different global soil carbon predictions. Translating model representations from the site- to ESM-scale requires a spatial scaling approach that either explicitly resolves the relevant processes, or more practically, accounts for fine-resolution dynamics at coarser scales. To that end, I will present recent watershed-scale modeling work that applies reduced order model methods to accurately scale fine-resolution soil carbon dynamics to coarse-resolution simulations. Finally, we

  8. Milankovitch-scale correlations between deeply buried microbial populations and biogenic ooze lithology

    Science.gov (United States)

    Aiello, I.W.; Bekins, B.A.

    2010-01-01

    The recent discoveries of large, active populations of microbes in the subseafloor of the world's oceans support the impact of the deep biosphere biota on global biogeochemical cycles and raise important questions concerning the functioning of these extreme environments for life. These investigations demonstrated that subseafloor microbes are unevenly distributed and that cell abundances and metabolic activities are often independent of sediment depth, with increased prokaryotic activity at geochemical and/or sedimentary interfaces. In this study we demonstrate that microbial populations vary at the scale of individual beds in the biogenic oozes of a drill site in the eastern equatorial Pacific (Ocean Drilling Program Leg 201, Site 1226). We relate bedding-scale changes in biogenic ooze sediment composition to organic carbon (OC) and microbial cell concentrations using high-resolution color reflectance data as a proxy for lithology. Our analyses demonstrate that microbial concentrations are an order of magnitude higher in the more organic-rich diatom oozes than in the nannofossil oozes. The variations mimic small-scale variations in diatom abundance and OC, indicating that the modern distribution of microbial biomass is ultimately controlled by Milankovitch-frequency variations in past oceanographic conditions. © 2010 Geological Society of America.

  9. RESTRUCTURING OF THE LARGE-SCALE SPRINKLERS

    Directory of Open Access Journals (Sweden)

    Paweł Kozaczyk

    2016-09-01

    Full Text Available One of the best ways for agriculture to become independent of shortages of precipitation is irrigation. In the seventies and eighties of the last century a number of large-scale sprinkler systems were built in Wielkopolska. At the end of the 1970s, 67 sprinkler systems with a total area of 6400 ha were installed in the Poznan province. The average size of a system reached 95 ha. In 1989 there were 98 sprinkler systems, and the area equipped with them was more than 10,130 ha. The study was conducted on 7 large sprinkler systems with areas ranging from 230 to 520 hectares, in 1986-1998. After the introduction of the market economy in the early 1990s and ownership changes in agriculture, the large-scale sprinkler systems underwent significant or total devastation. Land on the State Farms of the State Agricultural Property Agency was leased or sold, and the new owners used the existing sprinkler systems only to a very small extent. This involved a change in crop structure and demand structure and an increase in operating costs. There has also been a threefold increase in electricity prices. In practice, operation of large-scale irrigation encountered all kinds of barriers: limitations of system solutions, supply difficulties, and high levels of equipment failure, none of which encouraged rational use of the available sprinkler systems. An inspection of the local area showed the current status of the remaining irrigation infrastructure. The adopted scheme for the restructuring of Polish agriculture was not the best solution, causing massive destruction of assets previously invested in the sprinkler systems.

  10. Large-scale synthesis of YSZ nanopowder by Pechini method

    Indian Academy of Sciences (India)

    Administrator

    structure and chemical purity of 99.1% (by inductively coupled plasma optical emission spectroscopy) on a large scale. Keywords: sol–gel; yttria-stabilized zirconia; large scale; nanopowder; Pechini method.

  11. The Phoenix series large scale LNG pool fire experiments.

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, Richard B.; Jensen, Richard Pearson; Demosthenous, Byron; Luketa, Anay Josephine; Ricks, Allen Joseph; Hightower, Marion Michael; Blanchat, Thomas K.; Helmick, Paul H.; Tieszen, Sheldon Robert; Deola, Regina Anne; Mercier, Jeffrey Alan; Suo-Anttila, Jill Marie; Miller, Timothy J.

    2010-12-01

    The increasing demand for natural gas could increase the number and frequency of Liquefied Natural Gas (LNG) tanker deliveries to ports across the United States. Because of the increasing number of shipments and the number of possible new facilities, concerns about the potential hazards to the public and property from accidental, and even more importantly intentional, spills have increased. While improvements have been made over the past decade in assessing hazards from LNG spills, the existing experimental data is much smaller in size and scale than many postulated large accidental and intentional spills. Since the physics and hazards from a fire change with fire size, there are concerns about the adequacy of current hazard prediction techniques for large LNG spills and fires. To address these concerns, Congress funded the Department of Energy (DOE) in 2008 to conduct a series of laboratory and large-scale LNG pool fire experiments at Sandia National Laboratories (Sandia) in Albuquerque, New Mexico. This report presents the test data and results of both sets of fire experiments. A series of five reduced-scale (gas burner) tests (yielding 27 sets of data) was conducted in 2007 and 2008 at Sandia's Thermal Test Complex (TTC) to assess flame height to fire diameter ratios as a function of nondimensional heat release rates for extrapolation to large-scale LNG fires. The large-scale LNG pool fire experiments were conducted in a 120 m diameter pond specially designed and constructed in Sandia's Area III large-scale test complex. Two fire tests of LNG spills of 21 and 81 m in diameter were conducted in 2009 to improve the understanding of flame height, smoke production, and burn rate and therefore the physics and hazards of large LNG spills and fires.
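    As an illustration of the flame height to fire diameter ratios versus nondimensional heat release rate mentioned above, the classic Heskestad pool-fire correlation can be evaluated as follows. This is a textbook correlation shown only for orientation, not the relationship fitted from the Phoenix tests, and the example heat release value is a rough assumption.

```python
import math

def flame_height_to_diameter(Q_watts, D, rho=1.2, cp=1005.0, T_inf=293.0, g=9.81):
    """Heskestad correlation for the mean flame height of a pool fire:

        Q* = Q / (rho * cp * T_inf * sqrt(g * D) * D**2)
        L/D = 3.7 * Q***(2/5) - 1.02
    """
    q_star = Q_watts / (rho * cp * T_inf * math.sqrt(g * D) * D ** 2)
    return 3.7 * q_star ** 0.4 - 1.02

# Rough order-of-magnitude example for a large pool fire (illustrative values only).
print(flame_height_to_diameter(Q_watts=3.5e10, D=80.0))   # L/D of roughly 2
```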

  12. Geospatial Optimization of Siting Large-Scale Solar Projects

    Energy Technology Data Exchange (ETDEWEB)

    Macknick, Jordan [National Renewable Energy Lab. (NREL), Golden, CO (United States); Quinby, Ted [National Renewable Energy Lab. (NREL), Golden, CO (United States); Caulfield, Emmet [Stanford Univ., CA (United States); Gerritsen, Margot [Stanford Univ., CA (United States); Diffendorfer, Jay [U.S. Geological Survey, Boulder, CO (United States); Haines, Seth [U.S. Geological Survey, Boulder, CO (United States)

    2014-03-01

    Recent policy and economic conditions have encouraged a renewed interest in developing large-scale solar projects in the U.S. Southwest. However, siting large-scale solar projects is complex. In addition to the quality of the solar resource, solar developers must take into consideration many environmental, social, and economic factors when evaluating a potential site. This report describes a proof-of-concept, Web-based Geographical Information Systems (GIS) tool that evaluates multiple user-defined criteria in an optimization algorithm to inform discussions and decisions regarding the locations of utility-scale solar projects. Existing siting recommendations for large-scale solar projects from governmental and non-governmental organizations are not consistent with each other, are often not transparent in methods, and do not take into consideration the differing priorities of stakeholders. The siting assistance GIS tool we have developed improves upon the existing siting guidelines by being user-driven, transparent, interactive, capable of incorporating multiple criteria, and flexible. This work provides the foundation for a dynamic siting assistance tool that can greatly facilitate siting decisions among multiple stakeholders.
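    One common way such a tool can combine user-defined criteria is a weighted-sum suitability score over a raster grid; the sketch below is an illustrative stand-in under assumed inputs, not the optimization algorithm implemented in the NREL tool.

```python
import numpy as np

def site_suitability(criteria, weights, exclusion_mask=None):
    """Weighted-sum suitability score for each grid cell.

    `criteria` maps names to 2-D arrays normalized so that higher is better;
    `weights` maps the same names to user-defined weights. Cells flagged in
    `exclusion_mask` (True = excluded, e.g. protected land) are set to NaN.
    """
    names = list(criteria)
    w = np.array([weights[n] for n in names], dtype=float)
    w /= w.sum()                                    # normalize the weights
    stack = np.stack([criteria[n] for n in names])  # (n_criteria, rows, cols)
    score = np.tensordot(w, stack, axes=1)
    if exclusion_mask is not None:
        score = np.where(exclusion_mask, np.nan, score)
    return score

solar = np.random.rand(100, 100)            # normalized solar resource
slope = 1.0 - np.random.rand(100, 100)      # flatter terrain scores higher
lines = np.random.rand(100, 100)            # proximity to transmission
score = site_suitability({"solar": solar, "slope": slope, "transmission": lines},
                         {"solar": 0.5, "slope": 0.2, "transmission": 0.3})
print(np.unravel_index(np.nanargmax(score), score.shape))   # best candidate cell
```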

  13. Large-scale Agricultural Land Acquisitions in West Africa | IDRC ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    This project will examine large-scale agricultural land acquisitions in nine West African countries - Burkina Faso, Guinea-Bissau, Guinea, Benin, Mali, Togo, Senegal, Niger, and Côte d'Ivoire. ... They will use the results to increase public awareness and knowledge about the consequences of large-scale land acquisitions.

  14. Large-scale motions in the universe: a review

    International Nuclear Information System (INIS)

    Burstein, D.

    1990-01-01

    The expansion of the universe can be retarded in localised regions within the universe both by the presence of gravity and by non-gravitational motions generated in the post-recombination universe. The motions of galaxies thus generated are called 'peculiar motions', and the amplitudes, size scales and coherence of these peculiar motions are among the most direct records of the structure of the universe. As such, measurements of these properties of the present-day universe provide some of the severest tests of cosmological theories. This is a review of the current evidence for large-scale motions of galaxies out to a distance of ∼5000 km s⁻¹ (in an expanding universe, distance is proportional to radial velocity). 'Large-scale' in this context refers to motions that are correlated over size scales larger than the typical sizes of groups of galaxies, up to and including the size of the volume surveyed. To orient the reader into this relatively new field of study, a short modern history is given together with an explanation of the terminology. Careful consideration is given to the data used to measure the distances, and hence the peculiar motions, of galaxies. The evidence for large-scale motions is presented in a graphical fashion, using only the most reliable data for galaxies spanning a wide range in optical properties and over the complete range of galactic environments. The kinds of systematic errors that can affect this analysis are discussed, and the reliability of these motions is assessed. The predictions of two models of large-scale motion are compared to the observations, and special emphasis is placed on those motions in which our own Galaxy directly partakes. (author)

  15. State of the Art in Large-Scale Soil Moisture Monitoring

    Science.gov (United States)

    Ochsner, Tyson E.; Cosh, Michael Harold; Cuenca, Richard H.; Dorigo, Wouter; Draper, Clara S.; Hagimoto, Yutaka; Kerr, Yan H.; Larson, Kristine M.; Njoku, Eni Gerald; Small, Eric E.; et al.

    2013-01-01

    Soil moisture is an essential climate variable influencing land atmosphere interactions, an essential hydrologic variable impacting rainfall runoff processes, an essential ecological variable regulating net ecosystem exchange, and an essential agricultural variable constraining food security. Large-scale soil moisture monitoring has advanced in recent years creating opportunities to transform scientific understanding of soil moisture and related processes. These advances are being driven by researchers from a broad range of disciplines, but this complicates collaboration and communication. For some applications, the science required to utilize large-scale soil moisture data is poorly developed. In this review, we describe the state of the art in large-scale soil moisture monitoring and identify some critical needs for research to optimize the use of increasingly available soil moisture data. We review representative examples of 1) emerging in situ and proximal sensing techniques, 2) dedicated soil moisture remote sensing missions, 3) soil moisture monitoring networks, and 4) applications of large-scale soil moisture measurements. Significant near-term progress seems possible in the use of large-scale soil moisture data for drought monitoring. Assimilation of soil moisture data for meteorological or hydrologic forecasting also shows promise, but significant challenges related to model structures and model errors remain. Little progress has been made yet in the use of large-scale soil moisture observations within the context of ecological or agricultural modeling. Opportunities abound to advance the science and practice of large-scale soil moisture monitoring for the sake of improved Earth system monitoring, modeling, and forecasting.

  16. Position Paper on Jatropha curcas. State of the Art Small and Large Scale Project Development

    Energy Technology Data Exchange (ETDEWEB)

    Daey Ouwens, K.; Franken, Y.J.; Rijssenbeek, W. [Fuels from Agriculture in Communal Technology FACT, Eindhoven (Netherlands); Francis, G. [University of Hohenheim, Hohenheim (Germany); Riedacker, A. [French National Institute for Agricultural Research INRA, Paris (France); Foidl, N.; Jongschaap, R.; Bindraban, P. [Plant Research International PRI, Wageningen (Netherlands)

    2007-06-15

    Much information was collected during the Seminar on Jatropha held in Wageningen, the Netherlands, in March 2007, and is summarized in this paper. Much research is still necessary to improve yields, to allow the use of biological products such as oil cake as animal fodder, etc. Well-documented yield data are still scarce. Cooperation with research institutions is therefore recommended. At this stage it is still particularly important to distinguish between reality, promises and dangerous extrapolations. To avoid spectacular and regrettable failures, waste of money for investors, and great disappointment among local populations, promoters of large-scale plantations are invited to adopt stepwise approaches: large-scale plantations should only be considered after some 4 to 5 years of experimental data (annual seed yield and oil yield, economic viability, etc.) have been obtained from a sufficient number of small-scale experimental plots (about 1 ha) corresponding to the whole range of soil and climatic conditions of such projects.

  17. A route to explosive large-scale magnetic reconnection in a super-ion-scale current sheet

    Directory of Open Access Journals (Sweden)

    K. G. Tanaka

    2009-01-01

    Full Text Available How to trigger magnetic reconnection is one of the most interesting and important problems in space plasma physics. Recently, electron temperature anisotropy (α_eo = T_e⊥/T_e∥) at the center of a current sheet and the non-local effect of the lower-hybrid drift instability (LHDI) that develops at the current sheet edges have attracted attention in this context. In addition to these effects, here we also study the effects of ion temperature anisotropy (α_io = T_i⊥/T_i∥). Electron anisotropy effects are known to be helpless in a current sheet whose thickness is of ion scale. In this range of current sheet thickness, the LHDI effects are shown to weaken substantially with a small increase in thickness, and the obtained saturation level is too low for a large-scale reconnection to be achieved. We then investigate whether introducing electron and ion temperature anisotropies in the initial stage would couple with the LHDI effects to revive quick triggering of large-scale reconnection in a super-ion-scale current sheet. The results are as follows. (1) The initial electron temperature anisotropy is consumed very quickly when a number of minuscule magnetic islands (each lateral length is 1.5~3 times the ion inertial length) form. These minuscule islands do not coalesce into a large-scale island that would enable large-scale reconnection. (2) The subsequent LHDI effects disturb the current sheet filled with the small islands. This accelerates the triggering time scale substantially but does not enhance the saturation level of reconnected flux. (3) When the ion temperature anisotropy is added, it survives through the small-island formation stage and leads to even quicker triggering when the LHDI effects set in. Furthermore, the saturation level is elevated by a factor of ~2, and large-scale reconnection is achieved only in this case. Comparison with two-dimensional simulations that exclude the LHDI effects confirms that the saturation level

  18. Large-scale Labeled Datasets to Fuel Earth Science Deep Learning Applications

    Science.gov (United States)

    Maskey, M.; Ramachandran, R.; Miller, J.

    2017-12-01

    Deep learning has revolutionized computer vision and natural language processing with various algorithms scaled using high-performance computing. However, generic large-scale labeled datasets such as ImageNet are the fuel that drives the impressive accuracy of deep learning results. Large-scale labeled datasets already exist in domains such as medical science, but creating them in the Earth science domain is a challenge. While there are ways to apply deep learning using limited labeled datasets, there is a need in the Earth sciences to create large-scale labeled datasets for benchmarking and scaling deep learning applications. At the NASA Marshall Space Flight Center, we are using deep learning for a variety of Earth science applications in which we have encountered the need for large-scale labeled datasets. We will discuss our approaches for creating such datasets and why these datasets are just as valuable as deep learning algorithms. We will also describe successful usage of these large-scale labeled datasets with our deep learning based applications.

  19. Large-scale structure observables in general relativity

    International Nuclear Information System (INIS)

    Jeong, Donghui; Schmidt, Fabian

    2015-01-01

    We review recent studies that rigorously define several key observables of the large-scale structure of the Universe in a general relativistic context. Specifically, we consider (i) redshift perturbation of cosmic clock events; (ii) distortion of cosmic rulers, including weak lensing shear and magnification; and (iii) observed number density of tracers of the large-scale structure. We provide covariant and gauge-invariant expressions of these observables. Our expressions are given for a linearly perturbed flat Friedmann–Robertson–Walker metric including scalar, vector, and tensor metric perturbations. While we restrict ourselves to linear order in perturbation theory, the approach can be straightforwardly generalized to higher order. (paper)
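
    For reference, the redshift perturbation in (i) builds on the covariant definition of the observed redshift, quoted here as standard general relativity rather than as a result of the paper:

    $$ 1 + z \;=\; \frac{\left(g_{\mu\nu}\,u^{\mu} k^{\nu}\right)_{\rm source}}{\left(g_{\mu\nu}\,u^{\mu} k^{\nu}\right)_{\rm observer}}, $$

    where $k^{\mu}$ is the photon four-momentum and $u^{\mu}$ the four-velocity of the source or observer; perturbing the metric, $u^{\mu}$ and $k^{\mu}$ around the flat Friedmann–Robertson–Walker background yields the gauge-invariant redshift perturbation discussed in the record above.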

  20. Fatigue Analysis of Large-scale Wind turbine

    Directory of Open Access Journals (Sweden)

    Zhu Yongli

    2017-01-01

    Full Text Available This paper investigates fatigue damage of the top flange of a large-scale wind turbine generator. It establishes a finite element model of the top flange connection system with the finite element analysis software MSC.Marc/Mentat and analyzes its fatigue strain, simulates the flange fatigue loading conditions with the Bladed software, acquires the flange fatigue load spectrum with the rain-flow counting method and, finally, performs the fatigue analysis of the top flange with the fatigue analysis software MSC.Fatigue and the Palmgren-Miner linear cumulative damage theory. The results provide new thinking for the flange fatigue analysis of large-scale wind turbine generators and possess practical engineering value.

  1. Real-time simulation of large-scale floods

    Science.gov (United States)

    Liu, Q.; Qin, Y.; Li, G. D.; Liu, Z.; Cheng, D. J.; Zhao, Y. H.

    2016-08-01

    Given the complexity of real-time water conditions, the real-time simulation of large-scale floods is very important for flood-prevention practice. Model robustness and running efficiency are two critical factors in successful real-time flood simulation. This paper proposes a robust, two-dimensional shallow water model based on an unstructured Godunov-type finite volume method. A robust wet/dry front method is used to enhance numerical stability, and an adaptive method is proposed to improve running efficiency. The proposed model is used for large-scale flood simulation on real topography. Results compared to those of MIKE21 show the strong performance of the proposed model.
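
    For context (the standard formulation, not quoted from the paper), the 2D shallow water equations that Godunov-type finite volume schemes of this kind discretise can be written in conservative form as

    $$ \frac{\partial \mathbf{U}}{\partial t} + \frac{\partial \mathbf{F}(\mathbf{U})}{\partial x} + \frac{\partial \mathbf{G}(\mathbf{U})}{\partial y} = \mathbf{S}(\mathbf{U}), \qquad \mathbf{U} = \begin{pmatrix} h \\ hu \\ hv \end{pmatrix}, \quad \mathbf{F} = \begin{pmatrix} hu \\ hu^{2} + \tfrac{1}{2} g h^{2} \\ huv \end{pmatrix}, \quad \mathbf{G} = \begin{pmatrix} hv \\ huv \\ hv^{2} + \tfrac{1}{2} g h^{2} \end{pmatrix}, $$

    where $h$ is the water depth, $(u,v)$ the depth-averaged velocity, $g$ gravity, and $\mathbf{S}$ collects bed-slope and friction source terms; the wet/dry front treatment mentioned above controls cells where $h \to 0$.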

  2. Large-scale numerical simulations of plasmas

    International Nuclear Information System (INIS)

    Hamaguchi, Satoshi

    2004-01-01

    The recent trend of large-scale simulations of fusion plasmas and processing plasmas is briefly summarized. Many advanced simulation techniques have been developed for fusion plasmas, and some of these techniques are now applied to analyses of processing plasmas. (author)

  3. Nearly incompressible fluids: Hydrodynamics and large scale inhomogeneity

    International Nuclear Information System (INIS)

    Hunana, P.; Zank, G. P.; Shaikh, D.

    2006-01-01

    A system of hydrodynamic equations in the presence of large-scale inhomogeneities for a high plasma beta solar wind is derived. The theory is derived under the assumption of low turbulent Mach number and is developed for the flows where the usual incompressible description is not satisfactory and a full compressible treatment is too complex for any analytical studies. When the effects of compressibility are incorporated only weakly, a new description, referred to as 'nearly incompressible hydrodynamics', is obtained. The nearly incompressible theory was originally applied to homogeneous flows. However, large-scale gradients in density, pressure, temperature, etc., are typical in the solar wind and it was unclear how inhomogeneities would affect the usual incompressible and nearly incompressible descriptions. In the homogeneous case, the lowest order expansion of the fully compressible equations leads to the usual incompressible equations, followed at higher orders by the nearly incompressible equations, as introduced by Zank and Matthaeus. With this work we show that the inclusion of large-scale inhomogeneities (in this case a time-independent and radially symmetric background solar wind) modifies the leading-order incompressible description of solar wind flow. We find, for example, that the divergence of velocity fluctuations is nonsolenoidal and that density fluctuations can be described to leading order as a passive scalar. Locally (for small length scales), this system of equations converges to the usual incompressible equations and we therefore use the term 'locally incompressible' to describe the equations. This term should be distinguished from the term 'nearly incompressible', which is reserved for higher-order corrections. Furthermore, we find that density fluctuations scale with Mach number linearly, in contrast to the original homogeneous nearly incompressible theory, in which density fluctuations scale with the square of Mach number. Inhomogeneous nearly

  4. Monitoring carnivore populations at the landscape scale: occupancy modelling of tigers from sign surveys

    Science.gov (United States)

    Karanth, Kota Ullas; Gopalaswamy, Arjun M.; Kumar, Narayanarao Samba; Vaidyanathan, Srinivas; Nichols, James D.; MacKenzie, Darryl I.

    2011-01-01

    1. Assessing spatial distributions of threatened large carnivores at landscape scales poses formidable challenges because of their rarity and elusiveness. As a consequence of logistical constraints, investigators typically rely on sign surveys. Most survey methods, however, do not explicitly address the central problem of imperfect detection of animal signs in the field, leading to underestimates of true habitat occupancy and distribution. 2. We assessed habitat occupancy for a tiger Panthera tigris metapopulation across a c. 38 000-km2 landscape in India, employing a spatially replicated survey to explicitly address imperfect detections. Ecological predictions about tiger presence were confronted with sign detection data generated from occupancy sampling of 205 sites, each of 188 km2. 3. A recent occupancy model that considers Markovian dependency among sign detections on spatial replicates performed better than the standard occupancy model (ΔAIC = 184.9). The formulation of this model that fitted the data best showed that density of ungulate prey and levels of human disturbance were key determinants of local tiger presence. Model averaging resulted in a replicate-level sign detection probability of 0.17 (0.17) and a tiger habitat occupancy estimate of 0.665 (0.0857), or 14 076 (1814) km2 of potential habitat out of 21 167 km2. In contrast, a traditional presence-versus-absence approach underestimated occupancy by 47%. Maps of probabilities of local site occupancy clearly identified tiger source populations at higher densities and matched observed tiger density variations, suggesting their potential utility for population assessments at landscape scales. 4. Synthesis and applications. Landscape-scale sign surveys can efficiently assess large carnivore spatial distributions and elucidate the factors governing their local presence, provided ecological and observation processes are both explicitly modelled. Occupancy
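
    A minimal sketch of the standard single-season occupancy likelihood (occupancy ψ, per-replicate detection p) is given below. It illustrates why ignoring imperfect detection biases occupancy low, but it does not implement the Markovian replicate dependency used in the record above; the data, site counts and replicate counts are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit  # inverse-logit

def neg_log_lik(params, detections, n_replicates):
    """Negative log-likelihood of the standard single-season occupancy model."""
    psi, p = expit(params)                     # occupancy and per-replicate detection prob.
    d, K = detections, n_replicates
    lik_occupied = psi * p**d * (1.0 - p)**(K - d)
    lik_never_detected = (1.0 - psi) * (d == 0)  # sites with no detections may be unoccupied
    return -np.sum(np.log(lik_occupied + lik_never_detected))

# toy data: 205 sites with 10 spatial replicates each (values illustrative only)
rng = np.random.default_rng(0)
true_psi, true_p, K = 0.66, 0.17, 10
occupied = rng.random(205) < true_psi
detections = rng.binomial(K, true_p, size=205) * occupied

fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(detections, K))
psi_hat, p_hat = expit(fit.x)
print(f"psi_hat={psi_hat:.3f}, p_hat={p_hat:.3f}")
```

    With a detection probability near 0.17, many occupied sites produce all-zero detection histories, which is exactly the gap between naive presence-versus-absence maps and the model-based estimate.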

  5. Performance Health Monitoring of Large-Scale Systems

    Energy Technology Data Exchange (ETDEWEB)

    Rajamony, Ram [IBM Research, Austin, TX (United States)

    2014-11-20

    This report details the progress made on the ASCR-funded project Performance Health Monitoring for Large Scale Systems. A large-scale application may not achieve its full performance potential due to degraded performance of even a single subsystem. Detecting performance faults, isolating them, and taking remedial action is critical for the scale of systems on the horizon. PHM aims to develop techniques and tools that can be used to identify and mitigate such performance problems. We accomplish this through two main aspects. The PHM framework encompasses diagnostics, system monitoring, fault isolation, and performance evaluation capabilities that indicate when a performance fault has been detected, either due to an anomaly present in the system itself or due to contention for shared resources between concurrently executing jobs. Software components called the PHM Control System then build upon the capabilities provided by the PHM framework to mitigate degradation caused by performance problems.

  6. Measuring large-scale social networks with high resolution.

    Directory of Open Access Journals (Sweden)

    Arkadiusz Stopczynski

    Full Text Available This paper describes the deployment of a large-scale study designed to measure human interactions across a variety of communication channels, with high temporal resolution and spanning multiple years: the Copenhagen Networks Study. Specifically, we collect data on face-to-face interactions, telecommunication, social networks, location, and background information (personality, demographics, health, politics) for a densely connected population of 1000 individuals, using state-of-the-art smartphones as social sensors. Here we provide an overview of the related work and describe the motivation and research agenda driving the study. Additionally, the paper details the data types measured and the technical infrastructure, in terms of both backend and phone software, as well as an outline of the deployment procedures. We document the participant privacy procedures and their underlying principles. The paper concludes with early results from data analysis, illustrating the importance of a multi-channel, high-resolution approach to data collection.

  7. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    Large-scale neural simulations have the marks of a distinct methodology which can be fruitfully deployed to advance scientific understanding of the human brain. Computer simulation studies can be used to produce surrogate observational data for better conceptual models and new how...

  8. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek; Verma, Mahendra K.; Sukhatme, Jai

    2017-01-01

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.

  9. Phenomenology of two-dimensional stably stratified turbulence under large-scale forcing

    KAUST Repository

    Kumar, Abhishek

    2017-01-11

    In this paper, we characterise the scaling of energy spectra, and the interscale transfer of energy and enstrophy, for strongly, moderately and weakly stably stratified two-dimensional (2D) turbulence, restricted in a vertical plane, under large-scale random forcing. In the strongly stratified case, a large-scale vertically sheared horizontal flow (VSHF) coexists with small scale turbulence. The VSHF consists of internal gravity waves and the turbulent flow has a kinetic energy (KE) spectrum that follows an approximate k−3 scaling with zero KE flux and a robust positive enstrophy flux. The spectrum of the turbulent potential energy (PE) also approximately follows a k−3 power-law and its flux is directed to small scales. For moderate stratification, there is no VSHF and the KE of the turbulent flow exhibits Bolgiano–Obukhov scaling that transitions from a shallow k−11/5 form at large scales, to a steeper approximate k−3 scaling at small scales. The entire range of scales shows a strong forward enstrophy flux, and interestingly, large (small) scales show an inverse (forward) KE flux. The PE flux in this regime is directed to small scales, and the PE spectrum is characterised by an approximate k−1.64 scaling. Finally, for weak stratification, KE is transferred upscale and its spectrum closely follows a k−2.5 scaling, while PE exhibits a forward transfer and its spectrum shows an approximate k−1.6 power-law. For all stratification strengths, the total energy always flows from large to small scales and almost all the spectral indices are well explained by accounting for the scale-dependent nature of the corresponding flux.
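
    The spectral scalings quoted above are diagnosed from shell-averaged spectra. A minimal NumPy sketch of how such a kinetic energy spectrum E(k) is computed from a doubly periodic 2D velocity field is given below; it is illustrative only, with a random field standing in for simulation output.

```python
import numpy as np

def kinetic_energy_spectrum(u, v, L=2 * np.pi):
    """Shell-averaged kinetic energy spectrum E(k) of a doubly periodic 2D velocity field."""
    n = u.shape[0]
    uh = np.fft.fft2(u) / n**2
    vh = np.fft.fft2(v) / n**2
    e2d = 0.5 * (np.abs(uh)**2 + np.abs(vh)**2)       # spectral KE density
    k = np.fft.fftfreq(n, d=L / (2 * np.pi * n))       # integer wavenumbers for a box of size L
    kx, ky = np.meshgrid(k, k, indexing="ij")
    kmag = np.sqrt(kx**2 + ky**2)
    kbins = np.arange(0.5, n // 2, 1.0)
    E = np.array([e2d[(kmag >= kb - 0.5) & (kmag < kb + 0.5)].sum() for kb in kbins])
    return kbins, E

# example: spectrum of a random field (purely illustrative, not simulation data)
n = 256
u = np.random.default_rng(1).standard_normal((n, n))
v = np.random.default_rng(2).standard_normal((n, n))
k, E = kinetic_energy_spectrum(u, v)
```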

  10. Exploring the large-scale structure of Taylor–Couette turbulence through Large-Eddy Simulations

    Science.gov (United States)

    Ostilla-Mónico, Rodolfo; Zhu, Xiaojue; Verzicco, Roberto

    2018-04-01

    Large eddy simulations (LES) of Taylor-Couette (TC) flow, the flow between two co-axial and independently rotating cylinders, are performed in an attempt to explore the large-scale axially-pinned structures seen in experiments and simulations. Both static and dynamic LES models are used. The Reynolds number is kept fixed at Re = 3.4·10^4, and the radius ratio η = r_i/r_o is set to η = 0.909, limiting the effects of curvature and resulting in frictional Reynolds numbers of around Re_τ ≈ 500. Four rotation ratios from Rot = -0.0909 to Rot = 0.3 are simulated. First, the LES of TC flow is benchmarked for different rotation ratios. Both the Smagorinsky model with a constant of c_s = 0.1 and the dynamic model are found to produce reasonable results for no mean rotation and cyclonic rotation, but deviations increase for increasing rotation. This is attributed to the increasingly anisotropic character of the fluctuations. Second, “over-damped” LES, i.e. LES with a large Smagorinsky constant, is performed and is shown to reproduce some features of the large-scale structures, even when the near-wall region is not adequately modeled. This shows the potential for using over-damped LES for fast explorations of the parameter space where large-scale structures are found.
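
    For reference, the static Smagorinsky model referred to above (with constant $c_s$) closes the subgrid stresses through an eddy viscosity; this is the standard formulation, not a result of the paper:

    $$ \nu_t = \left(c_s\,\Delta\right)^{2} \left|\bar{S}\right|, \qquad \left|\bar{S}\right| = \sqrt{2\,\bar{S}_{ij}\bar{S}_{ij}}, \qquad \bar{S}_{ij} = \frac{1}{2}\left(\frac{\partial \bar{u}_i}{\partial x_j} + \frac{\partial \bar{u}_j}{\partial x_i}\right), $$

    where $\Delta$ is the filter width. The “over-damped” runs simply use a value of $c_s$ well above 0.1, while the dynamic model computes $c_s$ locally from the resolved field.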

  11. Large-scale alcohol use and socioeconomic position of origin: longitudinal study from ages 15 to 19 years

    DEFF Research Database (Denmark)

    Andersen, Anette; Holstein, Bjørn E; Due, Pernille

    2008-01-01

    AIM: To examine socioeconomic position (SEP) of origin as predictor of large-scale alcohol use in adolescence. METHODS: The study population was a random sample of 15-year-olds at baseline (n=843) with a first follow-up 4 years later (n=729). Excess alcohol intake was assessed by consumption last...

  12. Large scale Brownian dynamics of confined suspensions of rigid particles

    Science.gov (United States)

    Sprinkle, Brennan; Balboa Usabiaga, Florencio; Patankar, Neelesh A.; Donev, Aleksandar

    2017-12-01

    We introduce methods for large-scale Brownian Dynamics (BD) simulation of many rigid particles of arbitrary shape suspended in a fluctuating fluid. Our method adds Brownian motion to the rigid multiblob method [F. Balboa Usabiaga et al., Commun. Appl. Math. Comput. Sci. 11(2), 217-296 (2016)] at a cost comparable to the cost of deterministic simulations. We demonstrate that we can efficiently generate deterministic and random displacements for many particles using preconditioned Krylov iterative methods, if kernel methods to efficiently compute the action of the Rotne-Prager-Yamakawa (RPY) mobility matrix and its "square" root are available for the given boundary conditions. These kernel operations can be computed with near linear scaling for periodic domains using the positively split Ewald method. Here we study particles partially confined by gravity above a no-slip bottom wall using a graphical processing unit implementation of the mobility matrix-vector product, combined with a preconditioned Lanczos iteration for generating Brownian displacements. We address a major challenge in large-scale BD simulations, capturing the stochastic drift term that arises because of the configuration-dependent mobility. Unlike the widely used Fixman midpoint scheme, our methods utilize random finite differences and do not require the solution of resistance problems or the computation of the action of the inverse square root of the RPY mobility matrix. We construct two temporal schemes which are viable for large-scale simulations, an Euler-Maruyama traction scheme and a trapezoidal slip scheme, which minimize the number of mobility problems to be solved per time step while capturing the required stochastic drift terms. We validate and compare these schemes numerically by modeling suspensions of boomerang-shaped particles sedimented near a bottom wall. Using the trapezoidal scheme, we investigate the steady-state active motion in dense suspensions of confined microrollers, whose
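
    A minimal sketch of the Euler-Maruyama update underlying such Brownian dynamics is given below. It assumes a constant scalar mobility, so the configuration-dependent stochastic drift and the RPY hydrodynamics that the record above is really about are absent; all parameters, the wall force and the particle count are illustrative.

```python
import numpy as np

def euler_maruyama_bd(x0, force, mobility, kT, dt, n_steps, seed=0):
    """Minimal Brownian dynamics with a constant scalar mobility:
    x_{n+1} = x_n + M F(x_n) dt + sqrt(2 kT M dt) W_n.
    The stochastic drift term vanishes here because the mobility is
    configuration-independent, unlike the RPY case in the record above."""
    rng = np.random.default_rng(seed)
    x = np.array(x0, dtype=float)
    for _ in range(n_steps):
        noise = rng.standard_normal(x.shape)
        x = x + mobility * force(x) * dt + np.sqrt(2.0 * kT * mobility * dt) * noise
    return x

# example: particles sedimenting above a soft repulsive bottom wall (parameters illustrative)
g, k_wall = 1.0, 50.0
def force(x):                       # gravity in -z plus a wall spring acting for z < 0
    f = np.zeros_like(x)
    f[:, 2] = -g + k_wall * np.clip(-x[:, 2], 0.0, None)
    return f

x0 = np.random.default_rng(1).uniform(0.0, 5.0, size=(100, 3))
x_final = euler_maruyama_bd(x0, force, mobility=1.0, kT=1.0, dt=1e-3, n_steps=1000)
```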

  13. Large-scale preparation of hollow graphitic carbon nanospheres

    International Nuclear Information System (INIS)

    Feng, Jun; Li, Fu; Bai, Yu-Jun; Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning; Lu, Xi-Feng

    2013-01-01

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g−1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on large scale by the simple reaction between glucose and Mg at 550 °C, which exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  14. Accelerating large-scale phase-field simulations with GPU

    Directory of Open Access Journals (Sweden)

    Xiaoming Shi

    2017-10-01

    Full Text Available A new package for accelerating large-scale phase-field simulations was developed using GPUs, based on the semi-implicit Fourier method. The package can solve a variety of equilibrium equations with different inhomogeneities, including long-range elastic, magnetostatic, and electrostatic interactions. Using a specific algorithm in the Compute Unified Device Architecture (CUDA), the Fourier spectral iterative perturbation method was integrated into the GPU package. The Allen-Cahn equation, the Cahn-Hilliard equation, and a phase-field model with long-range interactions were solved with the GPU implementation to test the performance of the package. Comparing the results of the single-CPU solver with those of the GPU solver shows that the GPU version runs roughly 50 times faster. The present study therefore contributes to the acceleration of large-scale phase-field simulations and provides guidance for experiments to design large-scale functional devices.
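
    A CPU-only NumPy sketch of the semi-implicit Fourier update for the Allen-Cahn equation is shown below to make the scheme concrete; the package in the record runs this class of update on a GPU via CUDA, and all parameters here are illustrative.

```python
import numpy as np

def allen_cahn_semi_implicit(phi, n_steps, dt=0.05, L=1.0, kappa=1.0):
    """Semi-implicit Fourier-spectral update for the Allen-Cahn equation
    d(phi)/dt = -L * (phi**3 - phi - kappa * laplacian(phi)) on a periodic unit-spaced grid."""
    n = phi.shape[0]
    k = 2.0 * np.pi * np.fft.fftfreq(n)            # wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    denom = 1.0 + dt * L * kappa * k2              # implicit treatment of the gradient term
    for _ in range(n_steps):
        dfdphi = phi**3 - phi                      # explicit bulk driving force
        phi_hat = (np.fft.fft2(phi) - dt * L * np.fft.fft2(dfdphi)) / denom
        phi = np.real(np.fft.ifft2(phi_hat))
    return phi

# example: coarsening of a random initial microstructure (parameters illustrative)
rng = np.random.default_rng(0)
phi0 = 0.1 * rng.standard_normal((256, 256))
phi = allen_cahn_semi_implicit(phi0, n_steps=500)
```

    Treating the linear gradient term implicitly in Fourier space allows time steps far larger than an explicit scheme would permit, and the per-step work reduces to batched FFTs and pointwise operations, which is what maps naturally onto a GPU.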

  15. First Mile Challenges for Large-Scale IoT

    KAUST Repository

    Bader, Ahmed

    2017-03-16

    The Internet of Things is large-scale by nature. This is not only manifested by the large number of connected devices, but also by the sheer scale of spatial traffic intensity that must be accommodated, primarily in the uplink direction. To that end, cellular networks are indeed a strong first mile candidate to accommodate the data tsunami to be generated by the IoT. However, IoT devices are required in the cellular paradigm to undergo random access procedures as a precursor to resource allocation. Such procedures impose a major bottleneck that hinders cellular networks' ability to support large-scale IoT. In this article, we shed light on the random access dilemma and present a case study based on experimental data as well as system-level simulations. Accordingly, a case is built for the latent need to revisit random access procedures. A call for action is motivated by listing a few potential remedies and recommendations.

  16. The scale of population structure in Arabidopsis thaliana.

    Directory of Open Access Journals (Sweden)

    Alexander Platt

    2010-02-01

    Full Text Available The population structure of an organism reflects its evolutionary history and influences its evolutionary trajectory. It constrains the combination of genetic diversity and reveals patterns of past gene flow. Understanding it is a prerequisite for detecting genomic regions under selection, predicting the effect of population disturbances, or modeling gene flow. This paper examines the detailed global population structure of Arabidopsis thaliana. Using a set of 5,707 plants collected from around the globe and genotyped at 149 SNPs, we show that while A. thaliana as a species self-fertilizes 97% of the time, there is considerable variation among local groups. This level of outcrossing greatly limits observed heterozygosity but is sufficient to generate considerable local haplotypic diversity. We also find that in its native Eurasian range A. thaliana exhibits continuous isolation by distance at every geographic scale without natural breaks corresponding to classical notions of populations. By contrast, in North America, where it exists as an exotic species, A. thaliana exhibits little or no population structure at a continental scale but local isolation by distance that extends hundreds of km. This suggests a pattern for the development of isolation by distance that can establish itself shortly after an organism fills a new habitat range. It also raises questions about the general applicability of many standard population genetics models. Any model based on discrete clusters of interchangeable individuals will be an uneasy fit to organisms like A. thaliana which exhibit continuous isolation by distance on many scales.

  17. Thermal power generation projects ``Large Scale Solar Heating``; EU-Thermie-Projekte ``Large Scale Solar Heating``

    Energy Technology Data Exchange (ETDEWEB)

    Kuebler, R.; Fisch, M.N. [Steinbeis-Transferzentrum Energie-, Gebaeude- und Solartechnik, Stuttgart (Germany)

    1998-12-31

    The aim of this project is the preparation of the ``Large-Scale Solar Heating`` programme for a Europe-wide development of this technology. The demonstration programme developed from it was judged favourably by the experts but was not immediately (1996) accepted for funding. In November 1997 the EU Commission provided 1.5 million ECU, which allowed an updated project proposal to be realised. By mid-1997 a smaller project had already been approved, which had been requested under the lead of Chalmers Industriteknik (CIT) in Sweden and mainly serves technology transfer. (orig.)

  18. Scaling of the Urban Water Footprint: An Analysis of 65 Mid- to Large-Sized U.S. Metropolitan Areas

    Science.gov (United States)

    Mahjabin, T.; Garcia, S.; Grady, C.; Mejia, A.

    2017-12-01

    Scaling laws have been shown to be relevant to a range of disciplines including biology, ecology, hydrology, and physics, among others. Recently, scaling was shown to be important for understanding and characterizing cities. For instance, it was found that urban infrastructure (water supply pipes and electrical wires) tends to scale sublinearly with city population, implying that large cities are more efficient. In this study, we explore the scaling of the water footprint of cities. The water footprint is a measure of water appropriation that considers both the direct and indirect (virtual) water use of a consumer or producer. Here we compute the water footprint of 65 mid- to large-sized U.S. metropolitan areas, accounting for direct and indirect water uses associated with agricultural and industrial commodities, and residential and commercial water uses. We find that the urban water footprint, computed as the sum of the water footprint of consumption and production, exhibits sublinear scaling with an exponent of 0.89. This suggests the possibility of large cities being more water-efficient than small ones. To further assess this result, we conduct additional analysis by accounting for international flows, and the effects of green water and city boundary definition on the scaling. The analysis confirms the scaling and provides additional insight about its interpretation.
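
    The scaling exponent reported above is the slope of a log-log regression of water footprint against population; a minimal sketch, with synthetic data standing in for the 65-city dataset, is:

```python
import numpy as np

def scaling_exponent(population, water_footprint):
    """Fit W = c * P**beta by ordinary least squares in log-log space;
    beta < 1 indicates sublinear (more water-efficient) scaling."""
    beta, log_c = np.polyfit(np.log(population), np.log(water_footprint), 1)
    return beta, np.exp(log_c)

# illustrative synthetic data for 65 cities with an underlying exponent of 0.89
rng = np.random.default_rng(42)
population = rng.uniform(2e5, 1e7, size=65)
water_footprint = 3.0 * population**0.89 * np.exp(0.1 * rng.standard_normal(65))
beta, c = scaling_exponent(population, water_footprint)
print(f"estimated exponent: {beta:.2f}")
```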

  19. Large-scale retrieval for medical image analytics: A comprehensive review.

    Science.gov (United States)

    Li, Zhongyu; Zhang, Xiaofan; Müller, Henning; Zhang, Shaoting

    2018-01-01

    Over the past decades, medical image analytics was greatly facilitated by the explosion of digital imaging techniques, where huge amounts of medical images were produced with ever-increasing quality and diversity. However, conventional methods for analyzing medical images have achieved limited success, as they are not capable of tackling the huge amount of image data. In this paper, we review state-of-the-art approaches for large-scale medical image analysis, which are mainly based on recent advances in computer vision, machine learning and information retrieval. Specifically, we first present the general pipeline of large-scale retrieval and summarize the challenges and opportunities of medical image analytics at a large scale. Then, we provide a comprehensive review of algorithms and techniques relevant to the major processes in the pipeline, including feature representation, feature indexing, searching, etc. On the basis of existing work, we introduce the evaluation protocols and multiple applications of large-scale medical image retrieval, with a variety of exploratory and diagnostic scenarios. Finally, we discuss future directions of large-scale retrieval, which can further improve the performance of medical image analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
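
    As a toy illustration of the retrieval pipeline discussed above (feature representation, then indexing, then searching), the sketch below runs a nearest-neighbour search over precomputed feature vectors. The random features, collection size and the choice of scikit-learn are stand-ins, not anything taken from the reviewed systems.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors

# Images are assumed to have already been mapped to fixed-length feature vectors
# (e.g. by a CNN); here random vectors stand in for those features.
rng = np.random.default_rng(0)
database_features = rng.standard_normal((10_000, 256))   # indexed collection
query_features = rng.standard_normal((5, 256))            # incoming queries

index = NearestNeighbors(n_neighbors=10, metric="cosine")  # feature indexing step
index.fit(database_features)
distances, neighbor_ids = index.kneighbors(query_features) # searching step
print(neighbor_ids[0])   # ids of the 10 most similar database images for the first query
```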

  20. Large-Scale Analysis of Auditory Segregation Behavior Crowdsourced via a Smartphone App.

    Science.gov (United States)

    Teki, Sundeep; Kumar, Sukhbinder; Griffiths, Timothy D

    2016-01-01

    The human auditory system is adept at detecting sound sources of interest from a complex mixture of several other simultaneous sounds. The ability to selectively attend to the speech of one speaker whilst ignoring other speakers and background noise is of vital biological significance-the capacity to make sense of complex 'auditory scenes' is significantly impaired in aging populations as well as those with hearing loss. We investigated this problem by designing a synthetic signal, termed the 'stochastic figure-ground' stimulus that captures essential aspects of complex sounds in the natural environment. Previously, we showed that under controlled laboratory conditions, young listeners sampled from the university subject pool (n = 10) performed very well in detecting targets embedded in the stochastic figure-ground signal. Here, we presented a modified version of this cocktail party paradigm as a 'game' featured in a smartphone app (The Great Brain Experiment) and obtained data from a large population with diverse demographical patterns (n = 5148). Despite differences in paradigms and experimental settings, the observed target-detection performance by users of the app was robust and consistent with our previous results from the psychophysical study. Our results highlight the potential use of smartphone apps in capturing robust large-scale auditory behavioral data from normal healthy volunteers, which can also be extended to study auditory deficits in clinical populations with hearing impairments and central auditory disorders.

  1. Large-Scale Analysis of Auditory Segregation Behavior Crowdsourced via a Smartphone App.

    Directory of Open Access Journals (Sweden)

    Sundeep Teki

    Full Text Available The human auditory system is adept at detecting sound sources of interest from a complex mixture of several other simultaneous sounds. The ability to selectively attend to the speech of one speaker whilst ignoring other speakers and background noise is of vital biological significance-the capacity to make sense of complex 'auditory scenes' is significantly impaired in aging populations as well as those with hearing loss. We investigated this problem by designing a synthetic signal, termed the 'stochastic figure-ground' stimulus that captures essential aspects of complex sounds in the natural environment. Previously, we showed that under controlled laboratory conditions, young listeners sampled from the university subject pool (n = 10 performed very well in detecting targets embedded in the stochastic figure-ground signal. Here, we presented a modified version of this cocktail party paradigm as a 'game' featured in a smartphone app (The Great Brain Experiment and obtained data from a large population with diverse demographical patterns (n = 5148. Despite differences in paradigms and experimental settings, the observed target-detection performance by users of the app was robust and consistent with our previous results from the psychophysical study. Our results highlight the potential use of smartphone apps in capturing robust large-scale auditory behavioral data from normal healthy volunteers, which can also be extended to study auditory deficits in clinical populations with hearing impairments and central auditory disorders.

  2. Photorealistic large-scale urban city model reconstruction.

    Science.gov (United States)

    Poullis, Charalambos; You, Suya

    2009-01-01

    The rapid and efficient creation of virtual environments has become a crucial part of virtual reality applications. In particular, civil and defense applications often require and employ detailed models of operations areas for training, simulations of different scenarios, planning for natural or man-made events, monitoring, surveillance, games, and films. A realistic representation of the large-scale environments is therefore imperative for the success of such applications since it increases the immersive experience of its users and helps reduce the difference between physical and virtual reality. However, the task of creating such large-scale virtual environments still remains a time-consuming and manual work. In this work, we propose a novel method for the rapid reconstruction of photorealistic large-scale virtual environments. First, a novel, extendible, parameterized geometric primitive is presented for the automatic building identification and reconstruction of building structures. In addition, buildings with complex roofs containing complex linear and nonlinear surfaces are reconstructed interactively using a linear polygonal and a nonlinear primitive, respectively. Second, we present a rendering pipeline for the composition of photorealistic textures, which unlike existing techniques, can recover missing or occluded texture information by integrating multiple information captured from different optical sensors (ground, aerial, and satellite).

  3. Automatic Generation of Connectivity for Large-Scale Neuronal Network Models through Structural Plasticity.

    Science.gov (United States)

    Diaz-Pier, Sandra; Naveau, Mikaël; Butz-Ostendorf, Markus; Morrison, Abigail

    2016-01-01

    With the emergence of new high performance computation technology in the last decade, the simulation of large scale neural networks which are able to reproduce the behavior and structure of the brain has finally become an achievable target of neuroscience. Due to the number of synaptic connections between neurons and the complexity of biological networks, most contemporary models have manually defined or static connectivity. However, it is expected that modeling the dynamic generation and deletion of the links among neurons, locally and between different regions of the brain, is crucial to unravel important mechanisms associated with learning, memory and healing. Moreover, for many neural circuits that could potentially be modeled, activity data is more readily and reliably available than connectivity data. Thus, a framework that enables networks to wire themselves on the basis of specified activity targets can be of great value in specifying network models where connectivity data is incomplete or has large error margins. To address these issues, in the present work we present an implementation of a model of structural plasticity in the neural network simulator NEST. In this model, synapses consist of two parts, a pre- and a post-synaptic element. Synapses are created and deleted during the execution of the simulation following local homeostatic rules until a mean level of electrical activity is reached in the network. We assess the scalability of the implementation in order to evaluate its potential usage in the self generation of connectivity of large scale networks. We show and discuss the results of simulations on simple two population networks and more complex models of the cortical microcircuit involving 8 populations and 4 layers using the new framework.

  4. Prototype Vector Machine for Large Scale Semi-Supervised Learning

    Energy Technology Data Exchange (ETDEWEB)

    Zhang, Kai; Kwok, James T.; Parvin, Bahram

    2009-04-29

    Practical data mining rarely falls exactly into the supervised learning scenario. Rather, the growing amount of unlabeled data poses a big challenge to large-scale semi-supervised learning (SSL). We note that the computational intensiveness of graph-based SSL arises largely from the manifold or graph regularization, which in turn leads to large models that are difficult to handle. To alleviate this, we proposed the prototype vector machine (PVM), a highly scalable, graph-based algorithm for large-scale SSL. Our key innovation is the use of "prototype vectors" for efficient approximation of both the graph-based regularizer and the model representation. The choice of prototypes is grounded upon two important criteria: they not only perform effective low-rank approximation of the kernel matrix, but also span a model suffering the minimum information loss compared with the complete model. We demonstrate encouraging performance and appealing scaling properties of the PVM on a number of machine learning benchmark data sets.
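
    The sketch below is not the PVM itself but a standard graph-based SSL baseline (scikit-learn's LabelSpreading) on a toy dataset; it illustrates the graph-regularization setting whose cost the prototype approximation is designed to reduce. Dataset, kernel width and label counts are illustrative.

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.semi_supervised import LabelSpreading

# Graph-based semi-supervised learning on a toy problem with only 20 labeled points.
X, y = make_moons(n_samples=2000, noise=0.1, random_state=0)
y_train = np.full_like(y, fill_value=-1)            # -1 marks unlabeled points
labeled = np.random.default_rng(0).choice(len(y), size=20, replace=False)
y_train[labeled] = y[labeled]

model = LabelSpreading(kernel="rbf", gamma=20.0)    # graph built from an RBF affinity
model.fit(X, y_train)
print("accuracy on all points:", (model.transduction_ == y).mean())
```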

  5. Environmental degradation, global food production, and risk for large-scale migrations

    International Nuclear Information System (INIS)

    Doeoes, B.R.

    1994-01-01

    This paper attempts to estimate to what extent global food production is affected by the ongoing environmental degradation through processes, such as soil erosion, salinization, chemical contamination, ultraviolet radiation, and biotic stress. Estimates have also been made of available opportunities to improve food production efficiency by, e.g., increased use of fertilizers, irrigation, and biotechnology, as well as improved management. Expected losses and gains of agricultural land in competition with urbanization, industrial development, and forests have been taken into account. Although estimated gains in food production deliberately have been overestimated and losses underestimated, calculations indicate that during the next 30-35 years the annual net gain in food production will be significantly lower than the rate of world population growth. An attempt has also been made to identify possible scenarios for large-scale migrations, caused mainly by rapid population growth in combination with insufficient local food production and poverty. 18 refs, 7 figs, 6 tabs

  6. Large-scale network dynamics of beta-band oscillations underlie auditory perceptual decision-making

    Directory of Open Access Journals (Sweden)

    Mohsen Alavash

    2017-06-01

    Full Text Available Perceptual decisions vary in the speed at which we make them. Evidence suggests that translating sensory information into perceptual decisions relies on distributed interacting neural populations, with decision speed hinging on power modulations of the neural oscillations. Yet the dependence of perceptual decisions on the large-scale network organization of coupled neural oscillations has remained elusive. We measured magnetoencephalographic signals in human listeners who judged acoustic stimuli composed of carefully titrated clouds of tone sweeps. These stimuli were used in two task contexts, in which the participants judged either the overall pitch or the direction of the tone sweeps. We traced the large-scale network dynamics of the source-projected neural oscillations on a trial-by-trial basis using power-envelope correlations and graph-theoretical network discovery. In both tasks, faster decisions were predicted by higher segregation and lower integration of coupled beta-band (∼16–28 Hz) oscillations. We also uncovered the brain network states that promoted faster decisions in either lower-order auditory or higher-order control brain areas. Specifically, decision speed in judging the tone sweep direction critically relied on the nodal network configurations of anterior temporal, cingulate, and middle frontal cortices. Our findings suggest that global network communication during perceptual decision-making is implemented in the human brain by large-scale couplings between beta-band neural oscillations. The speed at which we make perceptual decisions varies. This translation of sensory information into perceptual decisions hinges on dynamic changes in neural oscillatory activity. However, the large-scale neural-network embodiment supporting perceptual decision-making is unclear. We addressed this question by studying two auditory perceptual decision-making tasks. Using graph-theoretical network discovery, we traced the large-scale network
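
    A minimal sketch of the kind of segregation/integration summary used in such analyses is shown below, using NetworkX on a random symmetric connectivity matrix. The specific metrics (transitivity and global efficiency), the thresholding rule and the node count are illustrative stand-ins for the graph-theoretical pipeline described above.

```python
import numpy as np
import networkx as nx

def network_segregation_integration(connectivity, density=0.1):
    """Threshold a connectivity matrix to a given edge density, then compute a simple
    segregation measure (transitivity) and integration measure (global efficiency)."""
    n = connectivity.shape[0]
    upper = connectivity[np.triu_indices(n, k=1)]
    thresh = np.quantile(upper, 1.0 - density)          # keep the strongest `density` fraction
    adj = (connectivity > thresh) & ~np.eye(n, dtype=bool)
    G = nx.from_numpy_array(adj.astype(int))
    return nx.transitivity(G), nx.global_efficiency(G)

# illustrative: a random symmetric "power-envelope correlation" matrix for 90 nodes
rng = np.random.default_rng(0)
A = rng.random((90, 90))
A = (A + A.T) / 2.0
segregation, integration = network_segregation_integration(A)
print(segregation, integration)
```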

  7. Population genetics of the Eastern Hellbender (Cryptobranchus alleganiensis alleganiensis) across multiple spatial scales.

    Directory of Open Access Journals (Sweden)

    Shem D Unger

    Full Text Available Conservation genetics is a powerful tool to assess the population structure of species and provides a framework for informing management of freshwater ecosystems. As lotic habitats become fragmented, the need to assess gene flow for species of conservation concern becomes a priority. The eastern hellbender (Cryptobranchus alleganiensis alleganiensis) is a large, fully aquatic, paedomorphic salamander. Many populations are experiencing declines throughout their geographic range, yet the genetic ramifications of these declines are currently unknown. To this end, we examined levels of genetic variation and genetic structure at both range-wide and drainage (hierarchical) scales. We collected 1,203 individuals from 77 rivers throughout nine states from June 2007 to August 2011. Levels of genetic diversity were relatively high among all sampling locations. We detected significant genetic structure across populations (Fst values ranged from 0.001 between rivers within a single watershed to 0.218 between states). We identified two genetically differentiated groups at the range-wide scale: (1) the Ohio River drainage and (2) the Tennessee River drainage. An analysis of molecular variance (AMOVA) based on landscape-scale sampling of basins within the Tennessee River drainage revealed that the majority of genetic variation (∼94-98%) occurs within rivers. Eastern hellbenders show a strong pattern of isolation by stream distance (IBSD) at the drainage level. Understanding levels of genetic variation and differentiation at multiple spatial and biological scales will enable natural resource managers to make more informed decisions and plan effective conservation strategies for cryptic, lotic species.

  8. Accelerating Relevance Vector Machine for Large-Scale Data on Spark

    Directory of Open Access Journals (Sweden)

    Liu Fang

    2017-01-01

    Full Text Available The relevance vector machine (RVM) is a machine learning algorithm based on a sparse Bayesian framework, which performs well when running classification and regression tasks on small-scale datasets. However, RVM also has certain drawbacks which restrict its practical applications, such as (1) a slow training process and (2) poor performance when training on large-scale datasets. In order to solve these problems, we first propose Discrete AdaBoost RVM (DAB-RVM), which incorporates ensemble learning into RVM. This method performs well with large-scale low-dimensional datasets. However, as the number of features increases, the training time of DAB-RVM increases as well. To avoid this phenomenon, we utilize the abundant training samples of large-scale datasets and propose all-features boosting RVM (AFB-RVM), which modifies the way weak classifiers are obtained. In our experiments we study the differences between various boosting techniques with RVM, demonstrating the performance of the proposed approaches on Spark. As a result of this paper, two proposed approaches on Spark for different types of large-scale datasets are available.

  9. Bayesian hierarchical model for large-scale covariance matrix estimation.

    Science.gov (United States)

    Zhu, Dongxiao; Hero, Alfred O

    2007-12-01

    Many bioinformatics problems implicitly depend on estimating a large-scale covariance matrix. Traditional approaches tend to give rise to high variance and low accuracy due to "overfitting." We cast the large-scale covariance matrix estimation problem into the Bayesian hierarchical model framework and introduce dependency between covariance parameters. We demonstrate the advantages of our approaches over the traditional approaches using simulations and OMICS data analysis.
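
    The sketch below is not the Bayesian hierarchical estimator of the record above; it shows the closely related frequentist shrinkage idea (Ledoit-Wolf), which likewise trades a little bias for a large variance reduction when features outnumber samples. Dimensions and data are illustrative.

```python
import numpy as np
from sklearn.covariance import LedoitWolf, empirical_covariance

rng = np.random.default_rng(0)
n, p = 50, 200                                     # few samples, many genes/features
X = rng.standard_normal((n, p))

sample_cov = empirical_covariance(X)               # high-variance, rank-deficient estimate
lw = LedoitWolf().fit(X)                           # shrinks toward a scaled identity matrix
print("shrinkage intensity:", lw.shrinkage_)
```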

  10. Creating Large Scale Database Servers

    International Nuclear Information System (INIS)

    Becla, Jacek

    2001-01-01

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region

  11. Creating Large Scale Database Servers

    Energy Technology Data Exchange (ETDEWEB)

    Becla, Jacek

    2001-12-14

    The BaBar experiment at the Stanford Linear Accelerator Center (SLAC) is designed to perform a high precision investigation of the decays of the B-meson produced from electron-positron interactions. The experiment, started in May 1999, will generate approximately 300TB/year of data for 10 years. All of the data will reside in Objectivity databases accessible via the Advanced Multi-threaded Server (AMS). To date, over 70TB of data have been placed in Objectivity/DB, making it one of the largest databases in the world. Providing access to such a large quantity of data through a database server is a daunting task. A full-scale testbed environment had to be developed to tune various software parameters and a fundamental change had to occur in the AMS architecture to allow it to scale past several hundred terabytes of data. Additionally, several protocol extensions had to be implemented to provide practical access to large quantities of data. This paper will describe the design of the database and the changes that we needed to make in the AMS for scalability reasons and how the lessons we learned would be applicable to virtually any kind of database server seeking to operate in the Petabyte region.

  12. Large-scale pool fires

    Directory of Open Access Journals (Sweden)

    Steinhaus Thomas

    2007-01-01

    Full Text Available A review of research into the burning behavior of large pool fires and fuel spill fires is presented. The features which distinguish such fires from smaller pool fires are mainly associated with the fire dynamics at low source Froude numbers and the radiative interaction with the fire source. In hydrocarbon fires, higher soot levels at increased diameters result in radiation blockage effects around the perimeter of large fire plumes; this yields lower emissive powers and a drastic reduction in the radiative loss fraction; whilst there are simplifying factors with these phenomena, arising from the fact that soot yield can saturate, there are other complications deriving from the intermittency of the behavior, with luminous regions of efficient combustion appearing randomly in the outer surface of the fire according to the turbulent fluctuations in the fire plume. Knowledge of the fluid flow instabilities, which lead to the formation of large eddies, is also key to understanding the behavior of large-scale fires. Here modeling tools can be effectively exploited in order to investigate the fluid flow phenomena, including RANS- and LES-based computational fluid dynamics codes. The latter are well-suited to representation of the turbulent motions, but a number of challenges remain with their practical application. Massively-parallel computational resources are likely to be necessary in order to be able to adequately address the complex coupled phenomena to the level of detail that is necessary.

  13. Decentralised stabilising controllers for a class of large-scale linear ...

    Indian Academy of Sciences (India)

    subsystems resulting from a new aggregation-decomposition technique. The method has been illustrated through a numerical example of a large-scale linear system consisting of three subsystems each of the fourth order. Keywords. Decentralised stabilisation; large-scale linear systems; optimal feedback control; algebraic ...

  14. Large Scale Survey Data in Career Development Research

    Science.gov (United States)

    Diemer, Matthew A.

    2008-01-01

    Large scale survey datasets have been underutilized but offer numerous advantages for career development scholars, as they contain numerous career development constructs with large and diverse samples that are followed longitudinally. Constructs such as work salience, vocational expectations, educational expectations, work satisfaction, and…

  15. Similitude and scaling of large structural elements: Case study

    Directory of Open Access Journals (Sweden)

    M. Shehadeh

    2015-06-01

    Full Text Available Scaled-down models are widely used for experimental investigations of large structures due to the limited capacities of testing facilities and the expense of experimentation. The modeling accuracy depends upon the model material properties, fabrication accuracy and loading techniques. In the present work the Buckingham π theorem is used to develop the relations (i.e., geometry, loading and properties) between the model and a large structural element such as those present in large existing petroleum oil drilling rigs. The model is to be designed, loaded and treated according to a set of similitude requirements that relate the model to the large structural element. Three independent scale factors, which represent the three fundamental dimensions of mass, length and time, need to be selected for designing the scaled-down model. Numerical prediction of the stress distribution within the model and its elastic deformation under steady loading is made, and the results are compared with those obtained from numerical computations on the full-scale structure. The effect of the scaled-down model size and material on the accuracy of the modeling technique is thoroughly examined.
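
    To make the three independent scale factors concrete (an illustrative set of relations implied by dimensional analysis, not taken from the paper): writing $\lambda_M$, $\lambda_L$ and $\lambda_T$ for the mass, length and time scale factors between model and prototype, every derived quantity scales according to its dimensions, e.g.

    $$ \lambda_F = \frac{\lambda_M \lambda_L}{\lambda_T^{2}}, \qquad \lambda_\sigma = \frac{\lambda_F}{\lambda_L^{2}} = \frac{\lambda_M}{\lambda_L \lambda_T^{2}}, \qquad \lambda_E = \lambda_\sigma, $$

    so that once the model material (and hence $\lambda_E$, the ratio of elastic moduli) and the geometric scale $\lambda_L$ are chosen, the loading scale follows from the similitude requirements rather than being free.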

  16. Large-scale preparation of hollow graphitic carbon nanospheres

    Energy Technology Data Exchange (ETDEWEB)

    Feng, Jun; Li, Fu [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Bai, Yu-Jun, E-mail: byj97@126.com [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); State Key laboratory of Crystal Materials, Shandong University, Jinan 250100 (China); Han, Fu-Dong; Qi, Yong-Xin; Lun, Ning [Key Laboratory for Liquid-Solid Structural Evolution and Processing of Materials, Ministry of Education, Shandong University, Jinan 250061 (China); Lu, Xi-Feng [Lunan Institute of Coal Chemical Engineering, Jining 272000 (China)

    2013-01-15

    Hollow graphitic carbon nanospheres (HGCNSs) were synthesized on large scale by a simple reaction between glucose and Mg at 550 °C in an autoclave. Characterization by X-ray diffraction, Raman spectroscopy and transmission electron microscopy demonstrates the formation of HGCNSs with an average diameter of about 10 nm and a wall thickness of a few graphene layers. The HGCNSs exhibit a reversible capacity of 391 mAh g−1 after 60 cycles when used as anode materials for Li-ion batteries. -- Graphical abstract: Hollow graphitic carbon nanospheres could be prepared on large scale by the simple reaction between glucose and Mg at 550 °C, which exhibit superior electrochemical performance to graphite. Highlights: ► Hollow graphitic carbon nanospheres (HGCNSs) were prepared on large scale at 550 °C. ► The preparation is simple, effective and eco-friendly. ► The in situ yielded MgO nanocrystals promote the graphitization. ► The HGCNSs exhibit superior electrochemical performance to graphite.

  17. Large-scale impact cratering on the terrestrial planets

    International Nuclear Information System (INIS)

    Grieve, R.A.F.

    1982-01-01

    The crater densities on the earth and moon form the basis for a standard flux-time curve that can be used in dating unsampled planetary surfaces and constraining the temporal history of endogenic geologic processes. Abundant evidence is seen not only that impact cratering was an important surface process in planetary history but also that large impact events produced effects that were crucial in scale. By way of example, it is noted that the formation of multiring basins on the early moon was as important in defining the planetary tectonic framework as plate tectonics is on the earth. Evidence from several planets suggests that the effects of very-large-scale impacts go beyond the simple formation of an impact structure and serve to localize increased endogenic activity over an extended period of geologic time. Even though no longer occurring with the frequency and magnitude of early solar system history, it is noted that large-scale impact events continue to affect the local geology of the planets. 92 references

  18. Optical interconnect for large-scale systems

    Science.gov (United States)

    Dress, William

    2013-02-01

    This paper presents a switchless, optical interconnect module that serves as a node in a network of identical distribution modules for large-scale systems. Thousands to millions of hosts or endpoints may be interconnected by a network of such modules, avoiding the need for multi-level switches. Several common network topologies are reviewed and their scaling properties assessed. The concept of message-flow routing is discussed in conjunction with the unique properties enabled by the optical distribution module where it is shown how top-down software control (global routing tables, spanning-tree algorithms) may be avoided.

  19. A multi-scale assessment of population connectivity in African lions (Panthera leo) in response to landscape change

    Science.gov (United States)

    Samuel A. Cushman; Nicholas B. Elliot; David W. Macdonald; Andrew J. Loveridge

    2015-01-01

    Habitat loss and fragmentation are among the major drivers of population declines and extinction, particularly in large carnivores. Connectivity models provide practical tools for assessing fragmentation effects and developing mitigation or conservation responses. To be useful to conservation practitioners, connectivity models need to incorporate multiple scales and...

  20. [A large-scale accident in Alpine terrain].

    Science.gov (United States)

    Wildner, M; Paal, P

    2015-02-01

    Due to the geographical conditions, large-scale accidents amounting to mass casualty incidents (MCI) in Alpine terrain regularly present rescue teams with huge challenges. Using an example incident, specific conditions and typical problems associated with such a situation are presented. The first rescue team members to arrive have the elementary tasks of qualified triage and communication to the control room, which is required to dispatch the necessary additional support. Only with a clear "concept", to which all have to adhere, can the subsequent chaos phase be limited. In this respect, the time factor, compounded by adverse weather conditions or darkness, creates enormous pressure. Additional hazards are frostbite and hypothermia. If priorities can be established in terms of urgency, then treatment and procedure algorithms have proven successful. For the evacuation of casualties, helicopter transport should be sought. Due to the low density of hospitals in Alpine regions, it is often necessary to distribute the patients over a wide area. Rescue operations in Alpine terrain have to be performed according to the particular conditions and require rescue teams to have specific knowledge and expertise. The possibility of a large-scale accident should be considered when planning events. With respect to optimization of rescue measures, regular training and exercises are rational, as is the analysis of previous large-scale Alpine accidents.

  1. Hierarchical Cantor set in the large scale structure with torus geometry

    Energy Technology Data Exchange (ETDEWEB)

    Murdzek, R. [Physics Department, ' Al. I. Cuza' University, Blvd. Carol I, Nr. 11, Iassy 700506 (Romania)], E-mail: rmurdzek@yahoo.com

    2008-12-15

The formation of large scale structures is considered within a model with a string on a toroidal space-time. Firstly, the space-time geometry is presented. In this geometry, the Universe is represented by a string describing a torus surface. Thereafter, the large scale structure of the Universe is derived from the string oscillations. The results are in agreement with the cellular structure of the large scale distribution and with the theory of a Cantorian space-time.

  2. Fine-Scale Human Population Structure in Southern Africa Reflects Ecogeographic Boundaries.

    Science.gov (United States)

    Uren, Caitlin; Kim, Minju; Martin, Alicia R; Bobo, Dean; Gignoux, Christopher R; van Helden, Paul D; Möller, Marlo; Hoal, Eileen G; Henn, Brenna M

    2016-09-01

    Recent genetic studies have established that the KhoeSan populations of southern Africa are distinct from all other African populations and have remained largely isolated during human prehistory until ∼2000 years ago. Dozens of different KhoeSan groups exist, belonging to three different language families, but very little is known about their population history. We examine new genome-wide polymorphism data and whole mitochondrial genomes for >100 South Africans from the ≠Khomani San and Nama populations of the Northern Cape, analyzed in conjunction with 19 additional southern African populations. Our analyses reveal fine-scale population structure in and around the Kalahari Desert. Surprisingly, this structure does not always correspond to linguistic or subsistence categories as previously suggested, but rather reflects the role of geographic barriers and the ecology of the greater Kalahari Basin. Regardless of subsistence strategy, the indigenous Khoe-speaking Nama pastoralists and the N|u-speaking ≠Khomani (formerly hunter-gatherers) share ancestry with other Khoe-speaking forager populations that form a rim around the Kalahari Desert. We reconstruct earlier migration patterns and estimate that the southern Kalahari populations were among the last to experience gene flow from Bantu speakers, ∼14 generations ago. We conclude that local adoption of pastoralism, at least by the Nama, appears to have been primarily a cultural process with limited genetic impact from eastern Africa. Copyright © 2016 by the Genetics Society of America.

  3. A large-scale population-based study of the association of vitamin D receptor gene polymorphisms with bone mineral density.

    NARCIS (Netherlands)

    P.L.A. van Dalen; C.M. van Duijn (Cornelia); J.C. Birkenhäger (Jan); J.P.T.M. van Leeuwen (Hans); H.A.P. Pols (Huib); A.G. Uitterlinden (André); A. Hofman (Albert)

    1996-01-01

    textabstractConflicting results have been reported on the association between restriction fragment length polymorphisms (RFLPs) at the vitamin D receptor (VDR) gene locus (i.e., for BsmI, ApaI, and TaqI) and bone mineral density (BMD). We analyzed this association in a large population-based sample

  4. Large-scale Motion of Solar Filaments

    Indian Academy of Sciences (India)

    tribpo

    Large-scale Motion of Solar Filaments. Pavel Ambrož, Astronomical Institute of the Acad. Sci. of the Czech Republic, CZ-25165. Ondrejov, The Czech Republic. e-mail: pambroz@asu.cas.cz. Alfred Schroll, Kanzelhöehe Solar Observatory of the University of Graz, A-9521 Treffen,. Austria. e-mail: schroll@solobskh.ac.at.

  5. Sensitivity analysis for large-scale problems

    Science.gov (United States)

    Noor, Ahmed K.; Whitworth, Sandra L.

    1987-01-01

    The development of efficient techniques for calculating sensitivity derivatives is studied. The objective is to present a computational procedure for calculating sensitivity derivatives as part of performing structural reanalysis for large-scale problems. The scope is limited to framed type structures. Both linear static analysis and free-vibration eigenvalue problems are considered.

  6. Topology Optimization of Large Scale Stokes Flow Problems

    DEFF Research Database (Denmark)

    Aage, Niels; Poulsen, Thomas Harpsøe; Gersborg-Hansen, Allan

    2008-01-01

This note considers topology optimization of large scale 2D and 3D Stokes flow problems using parallel computations. We solve problems with up to 1.125.000 elements in 2D and 128.000 elements in 3D on a shared memory computer consisting of Sun UltraSparc IV CPUs.

  7. The Cosmology Large Angular Scale Surveyor

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Ali, Aamir; Appel, John; Bennett, Charles; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; hide

    2016-01-01

The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  8. Depression among family caregivers of community-dwelling older people who used services under the Long Term Care Insurance program: a large-scale population-based study in Japan.

    Science.gov (United States)

    Arai, Yumiko; Kumamoto, Keigo; Mizuno, Yoko; Washio, Masakazu

    2014-01-01

    To identify predictors for depression among family caregivers of community-dwelling older people under the Long Term Care Insurance (LTCI) program in Japan through a large-scale population-based survey. All 5938 older people with disabilities, using domiciliary services under the LTCI in the city of Toyama, and their family caregivers participated in this study. Caregiver depression was defined as scores of ≥16 on the Center for Epidemiological Studies Depression Scale (CES-D). Other caregiver measures included age, sex, hours spent caregiving, relationship to the care recipient, income adequacy, living arrangement, self-rated health, and work status. Care recipient measures included age, sex, level of functional disability, and severity of dementia. The data from 4128 pairs of the care recipients and their family caregivers were eligible for further analyses. A multiple logistic regression analysis was used to examine the predictors associated with being at risk of clinical depression (CES-D of ≥16). Overall, 34.2% of caregivers scored ≥16 on the CES-D. The independent predictors for depression by logistic regression analysis were six caregiver characteristics (female, income inadequacy, longer hours spent caregiving, worse subjective health, and co-residence with the care recipient) and one care-recipient characteristic (moderate dementia). This is one of the first population-based examinations of caregivers of older people who are enrolled in a national service system that provides affordable access to services. The results highlighted the importance of monitoring caregivers who manifest the identified predictors to attenuate caregiver depression at the population level under the LTCI.
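
    As an illustration of the kind of analysis described above (a CES-D cut-off of ≥16 combined with a logistic model over caregiver characteristics), the sketch below uses synthetic data and hypothetical binary predictors; it is not the authors' analysis code, and the variable names are assumptions.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

CESD_CUTOFF = 16  # caregivers scoring >= 16 on the CES-D are treated as at risk of clinical depression

def at_risk(cesd_total: int) -> bool:
    """Binary depression-risk flag, as defined in the study."""
    return cesd_total >= CESD_CUTOFF

print(at_risk(21))  # -> True

# Hypothetical binary predictors mirroring the reported caregiver/care-recipient characteristics.
predictor_names = ["female", "income_inadequacy", "long_caregiving_hours",
                   "poor_self_rated_health", "co_residence", "moderate_dementia"]
rng = np.random.default_rng(0)
X = rng.integers(0, 2, size=(500, len(predictor_names)))  # synthetic predictor matrix
y = rng.integers(0, 2, size=500)                          # synthetic at-risk labels (CES-D >= 16)

model = LogisticRegression().fit(X, y)
odds_ratios = np.exp(model.coef_[0])                      # odds ratio per predictor
print(dict(zip(predictor_names, odds_ratios.round(2))))
```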

  9. Prehospital Acute Stroke Severity Scale to Predict Large Artery Occlusion: Design and Comparison With Other Scales.

    Science.gov (United States)

    Hastrup, Sidsel; Damgaard, Dorte; Johnsen, Søren Paaske; Andersen, Grethe

    2016-07-01

We designed and validated a simple prehospital stroke scale to identify emergent large vessel occlusion (ELVO) in patients with acute ischemic stroke and compared the scale to other published scales for prediction of ELVO. A national historical test cohort of 3127 patients with information on intracranial vessel status (angiography) before reperfusion therapy was identified. National Institutes of Health Stroke Scale (NIHSS) items with the highest predictive value of occlusion of a large intracranial artery were identified, and the most optimal combination meeting predefined criteria to ensure usefulness in the prehospital phase was determined. The predictive performance of the Prehospital Acute Stroke Severity (PASS) scale was compared with other published scales for ELVO. The PASS scale was composed of 3 NIHSS scores: level of consciousness (month/age), gaze palsy/deviation, and arm weakness. In the derivation of PASS, 2/3 of the test cohort was used and showed an accuracy (area under the curve) of 0.76 for detecting large arterial occlusion. The optimal cut point of ≥2 abnormal scores showed: sensitivity=0.66 (95% CI, 0.62-0.69), specificity=0.83 (0.81-0.85), and area under the curve=0.74 (0.72-0.76). Validation on 1/3 of the test cohort showed similar performance. Patients with a large artery occlusion on angiography with PASS ≥2 had a median NIHSS score of 17 (interquartile range=6) as opposed to PASS <2 with a median NIHSS score of 6 (interquartile range=5). The PASS scale showed equal performance, although it is simpler, when compared with other scales predicting ELVO. The PASS scale is simple and has promising accuracy for prediction of ELVO in the field. © 2016 American Heart Association, Inc.
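
    The three-item scale lends itself to a very small scoring helper. The sketch below counts abnormal items and applies the reported cut point of ≥2; the exact item definitions and scoring conventions of the published instrument are not reproduced here, so treat this as an assumption-laden illustration.

```python
def pass_score(loc_questions_abnormal: bool,
               gaze_palsy_or_deviation: bool,
               arm_weakness: bool) -> int:
    """Prehospital Acute Stroke Severity (PASS): one point per abnormal item (0-3)."""
    return sum([loc_questions_abnormal, gaze_palsy_or_deviation, arm_weakness])

def suspected_elvo(score: int, cut_point: int = 2) -> bool:
    """A score at or above the cut point (>=2 abnormal items) flags suspected ELVO."""
    return score >= cut_point

# Example: patient cannot state month/age correctly and has arm weakness, but no gaze deviation.
s = pass_score(loc_questions_abnormal=True, gaze_palsy_or_deviation=False, arm_weakness=True)
print(s, suspected_elvo(s))  # -> 2 True
```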

  10. Large Scale Demand Response of Thermostatic Loads

    DEFF Research Database (Denmark)

    Totu, Luminita Cristiana

This study is concerned with large populations of residential thermostatic loads (e.g. refrigerators, air conditioning or heat pumps). The purpose is to gain control over the aggregate power consumption in order to provide balancing services for the electrical grid. Without affecting the temperat... The control architecture is defined by parsimonious communication requirements that also have a high level of data privacy, and it furthermore guarantees a robust and secure local operation. Mathematical models are put forward, and the effectiveness is shown by numerical simulations. A case study of 10000...
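
    The aggregate-demand idea can be illustrated with a toy simulation of a refrigerator population. The first-order thermal model and every parameter value below are generic textbook assumptions for a thermostatically controlled load, not the models or case-study values developed in the study.

```python
import numpy as np

# Generic first-order thermostatic load model (illustrative only):
# T[k+1] = a*T[k] + (1-a)*(T_ambient - R*COP*P_elec*on[k]), thermostat with deadband.
N, steps, dt = 10000, 720, 60.0              # 10000 units, 12 hours at 1-minute steps
R, C = 30.0, 0.1                             # thermal resistance [degC/kW], capacitance [kWh/degC]
P_elec, cop = 0.2, 3.0                       # compressor electric power [kW], coefficient of performance
T_amb, T_set, db = 20.0, 5.0, 1.0            # ambient, set-point and deadband [degC]
a = np.exp(-dt / (3600.0 * R * C))

rng = np.random.default_rng(1)
T = rng.uniform(T_set - db, T_set + db, N)   # randomized initial cabinet temperatures
on = rng.random(N) < 0.5                     # randomized initial compressor states

aggregate = []
for _ in range(steps):
    T = a * T + (1 - a) * (T_amb - R * cop * P_elec * on)   # cabinet cools only while running
    on = np.where(T > T_set + db, True,
                  np.where(T < T_set - db, False, on))      # hysteresis thermostat
    aggregate.append(P_elec * on.sum())                     # aggregate power consumption [kW]

print(f"mean aggregate demand of the population: {np.mean(aggregate):.0f} kW")
```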

  11. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    OpenAIRE

    Qiang Liu; Yi Qin; Guodong Li

    2018-01-01

    Computing speed is a significant issue of large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most of the large-scale flood simulations are generally run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal...

  12. Managing Risk and Uncertainty in Large-Scale University Research Projects

    Science.gov (United States)

    Moore, Sharlissa; Shangraw, R. F., Jr.

    2011-01-01

    Both publicly and privately funded research projects managed by universities are growing in size and scope. Complex, large-scale projects (over $50 million) pose new management challenges and risks for universities. This paper explores the relationship between project success and a variety of factors in large-scale university projects. First, we…

  13. Parallel clustering algorithm for large-scale biological data sets.

    Science.gov (United States)

    Wang, Minchao; Zhang, Wu; Ding, Wang; Dai, Dongbo; Zhang, Huiran; Xie, Hao; Chen, Luonan; Guo, Yike; Xie, Jiang

    2014-01-01

The recent explosion of biological data brings a great challenge for traditional clustering algorithms. With the increasing scale of data sets, much larger memory and longer runtime are required for the cluster identification problems. The affinity propagation algorithm outperforms many other classical clustering algorithms and is widely applied in biological research. However, the time and space complexity become a great bottleneck when handling large-scale data sets. Moreover, the similarity matrix, whose construction takes a long runtime, is required before running the affinity propagation algorithm, since the algorithm clusters data sets based on the similarities between data pairs. Two types of parallel architectures are proposed in this paper to accelerate the similarity matrix construction and the affinity propagation algorithm. The memory-shared architecture is used to construct the similarity matrix, and the distributed system is taken for the affinity propagation algorithm, because of its large memory size and great computing capacity. An appropriate way of data partitioning and reduction is designed in our method, in order to minimize the global communication cost among processes. A speedup of 100 is gained with 128 cores. The runtime is reduced from several hours to a few seconds, which indicates that the parallel algorithm is capable of handling large-scale data sets effectively. The parallel affinity propagation also achieves a good performance when clustering large-scale gene data (microarray) and detecting families in large protein superfamilies.
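
    The two-stage structure described above (parallel similarity matrix construction, then affinity propagation on the precomputed similarities) can be sketched on a single machine with Python's multiprocessing and scikit-learn; the distributed-memory implementation of the paper is not reproduced here, and the toy data are synthetic.

```python
import numpy as np
from multiprocessing import Pool
from sklearn.cluster import AffinityPropagation

def row_similarities(args):
    """Negative squared Euclidean distances from one sample to all samples."""
    i, X = args
    return i, -np.sum((X - X[i]) ** 2, axis=1)

def parallel_similarity_matrix(X, processes=4):
    """Build the full similarity matrix row by row across worker processes."""
    S = np.empty((len(X), len(X)))
    with Pool(processes) as pool:
        for i, row in pool.imap_unordered(row_similarities, ((i, X) for i in range(len(X)))):
            S[i] = row
    return S

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(0, 1, (100, 5)), rng.normal(5, 1, (100, 5))])  # toy two-cluster data
    S = parallel_similarity_matrix(X)
    labels = AffinityPropagation(affinity="precomputed", random_state=0).fit_predict(S)
    print("clusters found:", len(set(labels)))
```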

  14. Efficient population-scale variant analysis and prioritization with VAPr.

    Science.gov (United States)

    Birmingham, Amanda; Mark, Adam M; Mazzaferro, Carlo; Xu, Guorong; Fisch, Kathleen M

    2018-04-06

    With the growing availability of population-scale whole-exome and whole-genome sequencing, demand for reproducible, scalable variant analysis has spread within genomic research communities. To address this need, we introduce the Python package VAPr (Variant Analysis and Prioritization). VAPr leverages existing annotation tools ANNOVAR and MyVariant.info with MongoDB-based flexible storage and filtering functionality. It offers biologists and bioinformatics generalists easy-to-use and scalable analysis and prioritization of genomic variants from large cohort studies. VAPr is developed in Python and is available for free use and extension under the MIT License. An install package is available on PyPi at https://pypi.python.org/pypi/VAPr, while source code and extensive documentation are on GitHub at https://github.com/ucsd-ccbb/VAPr. kfisch@ucsd.edu.

  15. Large-scale effects of migration and conflict in pre-agricultural groups: Insights from a dynamic model.

    Directory of Open Access Journals (Sweden)

    Francesco Gargano

    Full Text Available The debate on the causes of conflict in human societies has deep roots. In particular, the extent of conflict in hunter-gatherer groups remains unclear. Some authors suggest that large-scale violence only arose with the spreading of agriculture and the building of complex societies. To shed light on this issue, we developed a model based on operatorial techniques simulating population-resource dynamics within a two-dimensional lattice, with humans and natural resources interacting in each cell of the lattice. The model outcomes under different conditions were compared with recently available demographic data for prehistoric South America. Only under conditions that include migration among cells and conflict was the model able to consistently reproduce the empirical data at a continental scale. We argue that the interplay between resource competition, migration, and conflict drove the population dynamics of South America after the colonization phase and before the introduction of agriculture. The relation between population and resources indeed emerged as a key factor leading to migration and conflict once the carrying capacity of the environment has been reached.
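
    A deliberately simplified lattice sketch of the coupled population-resource-migration-conflict dynamics discussed above is given below; the authors' model is formulated with operatorial techniques, so the update rules, growth rates and conflict term here are purely illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)
L, steps = 20, 200
pop = rng.uniform(0.0, 1.0, (L, L))       # population density per cell
res = np.ones((L, L))                     # resource density per cell
r_pop, r_res, K, m, conflict = 0.05, 0.1, 1.0, 0.02, 0.1   # illustrative rates

def neighbours_mean(A):
    """Mean of the four von Neumann neighbours (periodic boundaries)."""
    return (np.roll(A, 1, 0) + np.roll(A, -1, 0) +
            np.roll(A, 1, 1) + np.roll(A, -1, 1)) / 4.0

for _ in range(steps):
    res += r_res * res * (1 - res / K) - 0.05 * pop * res    # logistic renewal minus consumption
    res = np.clip(res, 0.0, K)
    pop += r_pop * pop * res                                  # growth tracks local resources
    pop += m * (neighbours_mean(pop) - pop)                   # diffusive migration between cells
    pop -= conflict * pop * np.maximum(pop - res, 0.0)        # conflict losses where resources are scarce
    pop = np.clip(pop, 0.0, None)

print(f"final continental population index: {pop.sum():.1f}")
```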

  16. Adaptive visualization for large-scale graph

    International Nuclear Information System (INIS)

    Nakamura, Hiroko; Shinano, Yuji; Ohzahata, Satoshi

    2010-01-01

We propose an adaptive visualization technique for representing a large-scale hierarchical dataset within limited display space. A hierarchical dataset has nodes and links showing the parent-child relationship between the nodes. These nodes and links are described using graphics primitives. When the number of these primitives is large, it is difficult to recognize the structure of the hierarchical data because many primitives are overlapped within a limited region. To overcome this difficulty, we propose an adaptive visualization technique for hierarchical datasets. The proposed technique selects an appropriate graph style according to the nodal density in each area. (author)
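
    The density-driven switch of graph style can be captured in a few lines; the thresholds and the three rendering styles below are placeholders chosen for illustration, not the rules used by the authors.

```python
def pick_style(node_count: int, region_area_px: float,
               dense: float = 0.01, very_dense: float = 0.1) -> str:
    """Choose a rendering style for one screen region from its nodal density
    (nodes per pixel); the thresholds are illustrative assumptions."""
    density = node_count / region_area_px
    if density < dense:
        return "node-link"       # sparse region: draw every node and edge
    if density < very_dense:
        return "aggregated"      # medium density: collapse subtrees into glyphs
    return "density-map"         # very dense region: show only a shaded density map

print(pick_style(node_count=150, region_area_px=400 * 400))  # -> node-link
print(pick_style(node_count=150, region_area_px=30 * 30))    # -> density-map
```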

  17. Stabilization Algorithms for Large-Scale Problems

    DEFF Research Database (Denmark)

    Jensen, Toke Koldborg

    2006-01-01

    The focus of the project is on stabilization of large-scale inverse problems where structured models and iterative algorithms are necessary for computing approximate solutions. For this purpose, we study various iterative Krylov methods and their abilities to produce regularized solutions. Some......-curve. This heuristic is implemented as a part of a larger algorithm which is developed in collaboration with G. Rodriguez and P. C. Hansen. Last, but not least, a large part of the project has, in different ways, revolved around the object-oriented Matlab toolbox MOORe Tools developed by PhD Michael Jacobsen. New...

  18. Linear time algorithms to construct populations fitting multiple constraint distributions at genomic scales.

    Science.gov (United States)

    Siragusa, Enrico; Haiminen, Niina; Utro, Filippo; Parida, Laxmi

    2017-10-09

Computer simulations can be used to study population genetic methods, models and parameters, as well as to predict potential outcomes. For example, in plant populations, predicting the outcome of breeding operations can be studied using simulations. In-silico construction of populations with pre-specified characteristics is an important task in breeding optimization and other population genetic studies. We present two linear time Simulation using Best-fit Algorithms (SimBA) for two classes of problems where each co-fits two distributions: SimBA-LD fits linkage disequilibrium and minimum allele frequency distributions, while SimBA-hap fits founder-haplotype and polyploid allele dosage distributions. An incremental gap-filling version of the previously introduced SimBA-LD is here demonstrated to accurately fit the target distributions, allowing efficient large-scale simulations. The accuracy and efficiency of SimBA-hap are demonstrated by simulating tetraploid populations with varying numbers of founder haplotypes; we evaluate both a linear time greedy algorithm and an optimal solution based on mixed-integer programming. SimBA is available on http://researcher.watson.ibm.com/project/5669.

  19. Design study on sodium cooled large-scale reactor

    International Nuclear Information System (INIS)

    Murakami, Tsutomu; Hishida, Masahiko; Kisohara, Naoyuki

    2004-07-01

In Phase 1 of the 'Feasibility Studies on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor has been selected as a promising concept of sodium-cooled large-scale reactor, which has a possibility to fulfill the design requirements of the F/S. In Phase 2, design improvement for further cost reduction and establishment of the plant concept has been performed. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2003, which is the third year of Phase 2. In the JFY2003 design study, critical subjects related to safety, structural integrity and thermal hydraulics which were found in the last fiscal year have been examined and the plant concept has been modified. Furthermore, fundamental specifications of main systems and components have been set and economy has been evaluated. In addition, as the interim evaluation of the candidate concept of the FBR fuel cycle is to be conducted, cost effectiveness and achievability for the development goal were evaluated and the data of the three large-scale reactor candidate concepts were prepared. As a result of this study, the plant concept of the sodium-cooled large-scale reactor has been constructed, which has a prospect to satisfy the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and has a prospect to solve the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  20. Design study on sodium-cooled large-scale reactor

    International Nuclear Information System (INIS)

    Shimakawa, Yoshio; Nibe, Nobuaki; Hori, Toru

    2002-05-01

In Phase 1 of the 'Feasibility Study on Commercialized Fast Reactor Cycle Systems (F/S)', an advanced loop type reactor has been selected as a promising concept of sodium-cooled large-scale reactor, which has a possibility to fulfill the design requirements of the F/S. In Phase 2 of the F/S, it is planned to proceed with a preliminary conceptual design of a sodium-cooled large-scale reactor based on the design of the advanced loop type reactor. Through the design study, it is intended to construct a plant concept that can show its attractiveness and competitiveness as a commercialized reactor. This report summarizes the results of the design study on the sodium-cooled large-scale reactor performed in JFY2001, which is the first year of Phase 2. In the JFY2001 design study, a plant concept has been constructed based on the design of the advanced loop type reactor, and fundamental specifications of main systems and components have been set. Furthermore, critical subjects related to safety, structural integrity, thermal hydraulics, operability, maintainability and economy have been examined and evaluated. As a result of this study, the plant concept of the sodium-cooled large-scale reactor has been constructed, which has a prospect to satisfy the economic goal (construction cost: less than 200,000 yen/kWe, etc.) and has a prospect to solve the critical subjects. From now on, reflecting the results of elemental experiments, the preliminary conceptual design of this plant will proceed toward the selection for narrowing down candidate concepts at the end of Phase 2. (author)

  1. Large scale CMB anomalies from thawing cosmic strings

    Energy Technology Data Exchange (ETDEWEB)

    Ringeval, Christophe [Centre for Cosmology, Particle Physics and Phenomenology, Institute of Mathematics and Physics, Louvain University, 2 Chemin du Cyclotron, 1348 Louvain-la-Neuve (Belgium); Yamauchi, Daisuke; Yokoyama, Jun' ichi [Research Center for the Early Universe (RESCEU), Graduate School of Science, The University of Tokyo, Tokyo 113-0033 (Japan); Bouchet, François R., E-mail: christophe.ringeval@uclouvain.be, E-mail: yamauchi@resceu.s.u-tokyo.ac.jp, E-mail: yokoyama@resceu.s.u-tokyo.ac.jp, E-mail: bouchet@iap.fr [Institut d' Astrophysique de Paris, UMR 7095-CNRS, Université Pierre et Marie Curie, 98bis boulevard Arago, 75014 Paris (France)

    2016-02-01

Cosmic strings formed during inflation are expected to be either diluted over super-Hubble distances, i.e., invisible today, or to have crossed our past light cone very recently. We discuss the latter situation in which a few strings imprint their signature in the Cosmic Microwave Background (CMB) anisotropies after recombination. Being almost frozen in the Hubble flow, these strings are quasi static and evade almost all of the previously derived constraints on their tension while being able to source large scale anisotropies in the CMB sky. Using a local variance estimator on thousands of numerically simulated Nambu-Goto all sky maps, we compute the expected signal and show that it can mimic a dipole modulation at large angular scales while being negligible at small angles. Interestingly, such a scenario generically produces one cold spot from the thawing of a cosmic string loop. Mixed with anisotropies of inflationary origin, we find that a few strings of tension GU = O(1) × 10⁻⁶ match the amplitude of the dipole modulation reported in the Planck satellite measurements and could be at the origin of other large scale anomalies.

  2. Large population center and core melt accident considerations in siting

    International Nuclear Information System (INIS)

    Camarinopoulos, L.; Yadigaroglu, G.

    1983-01-01

    The problem of providing suitable demographic siting criteria in the presence of a very large population center in an otherwise sparsely populated region is addressed. Simple calculations were performed making maximum use of pretabulated results of studies where core melt accidents are considered. These show that taking into consideration the air flow patterns in the region can lower the expected population doses from core melt accidents more effectively than distance alone. Expected doses are compared to the annual background radiation dose. A simple siting criterion combining geographical considerations with the probability of a release reaching the large population center is proposed

  3. Large-scale movements in European badgers: has the tail of the movement kernel been underestimated?

    Science.gov (United States)

    Byrne, Andrew W; Quinn, John L; O'Keeffe, James J; Green, Stuart; Sleeman, D Paddy; Martin, S Wayne; Davenport, John

    2014-07-01

Characterizing patterns of animal movement is a major aim in population ecology, and yet doing so at an appropriate spatial scale remains a major challenge. Estimating the frequency and distances of movements is of particular importance when species are implicated in the transmission of zoonotic diseases. European badgers (Meles meles) are classically viewed as exhibiting limited dispersal, and yet their movements bring them into conflict with farmers due to their potential to spread bovine tuberculosis in parts of their range. Considerable uncertainty surrounds the movement potential of badgers, and this may be related to the spatial scale of previous empirical studies. We conducted a large-scale mark-recapture study (755 km²; 2008-2012; 1935 capture events; 963 badgers) to investigate movement patterns in badgers, and undertook a comparative meta-analysis using published data from 15 European populations. The dispersal movement (>1 km) kernel followed an inverse power-law function, with a substantial 'tail' indicating the occurrence of rare long-distance dispersal attempts during the study period. The mean recorded distance from this distribution was 2.6 km, the 95th percentile was 7.3 km and the longest recorded was 22.1 km. Dispersal frequency distributions were significantly different between genders; males dispersed more frequently than females, but females made proportionally more long-distance dispersal attempts than males. We used a subsampling approach to demonstrate that the appropriate minimum spatial scale to characterize badger movements in our study population was 80 km², substantially larger than many previous badger studies. Furthermore, the meta-analysis indicated a significant association between maximum movement distance and study area size, while controlling for population density. Maximum long-distance movements were often only recorded by chance beyond the boundaries of study areas. These findings suggest that the tail of the badger
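
    One generic way to quantify the "tail" of an inverse power-law dispersal kernel is the standard continuous maximum-likelihood estimator for the exponent; the snippet below applies it to synthetic distances above the study's 1 km dispersal threshold and is not necessarily the estimation procedure used in the paper.

```python
import numpy as np

def power_law_exponent(distances_km, x_min=1.0):
    """MLE for the exponent of p(x) ~ x**(-alpha) for x >= x_min (continuous case)."""
    x = np.asarray([d for d in distances_km if d >= x_min], dtype=float)
    return 1.0 + len(x) / np.sum(np.log(x / x_min))

# Hypothetical dispersal distances (km), drawn from a power law with alpha = 2.8 by inverse transform.
rng = np.random.default_rng(3)
sample = 1.0 * (1.0 - rng.random(500)) ** (-1.0 / 1.8)
alpha_hat = power_law_exponent(sample, x_min=1.0)
print(f"estimated kernel exponent: {alpha_hat:.2f}")
print(f"mean: {sample.mean():.1f} km, 95th percentile: {np.percentile(sample, 95):.1f} km")
```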

  4. Exploiting multi-scale parallelism for large scale numerical modelling of laser wakefield accelerators

    International Nuclear Information System (INIS)

    Fonseca, R A; Vieira, J; Silva, L O; Fiuza, F; Davidson, A; Tsung, F S; Mori, W B

    2013-01-01

A new generation of laser wakefield accelerators (LWFA), supported by the extreme accelerating fields generated in the interaction of PW-Class lasers and underdense targets, promises the production of high quality electron beams in short distances for multiple applications. Achieving this goal will rely heavily on numerical modelling to further understand the underlying physics and identify optimal regimes, but large scale modelling of these scenarios is computationally heavy and requires the efficient use of state-of-the-art petascale supercomputing systems. We discuss the main difficulties involved in running these simulations and the new developments implemented in the OSIRIS framework to address these issues, ranging from multi-dimensional dynamic load balancing and hybrid distributed/shared memory parallelism to the vectorization of the PIC algorithm. We present the results of the OASCR Joule Metric program on the issue of large scale modelling of LWFA, demonstrating speedups of over 1 order of magnitude on the same hardware. Finally, scalability to over ∼10⁶ cores and sustained performance over ∼2 PFlops is demonstrated, opening the way for large scale modelling of LWFA scenarios. (paper)

  5. Thermal activation of dislocations in large scale obstacle bypass

    Science.gov (United States)

    Sobie, Cameron; Capolungo, Laurent; McDowell, David L.; Martinez, Enrique

    2017-08-01

Dislocation dynamics simulations have been used extensively to predict hardening caused by dislocation-obstacle interactions, including irradiation defect hardening in the athermal case. Incorporating the role of thermal energy in these interactions is possible with a framework provided by harmonic transition state theory (HTST) enabling direct access to thermally activated reaction rates using the Arrhenius equation, including rates of dislocation-obstacle bypass processes. Moving beyond unit dislocation-defect reactions to a representative environment containing a large number of defects requires coarse-graining the activation energy barriers of a population of obstacles into an effective energy barrier that accurately represents the large scale collective process. The work presented here investigates the relationship between unit dislocation-defect bypass processes and the distribution of activation energy barriers calculated for ensemble bypass processes. A significant difference between these cases is observed, which is attributed to the inherent cooperative nature of dislocation bypass processes. In addition to the dislocation-defect interaction, the morphology of the dislocation segments pinned to the defects plays an important role in the activation energies for bypass. A phenomenological model for activation energy stress dependence is shown to describe well the effect of a distribution of activation energies, and a probabilistic activation energy model incorporating the stress distribution in a material is presented.

  6. Balancing modern Power System with large scale of wind power

    DEFF Research Database (Denmark)

    Basit, Abdul; Altin, Müfit; Hansen, Anca Daniela

    2014-01-01

Power system operators must ensure robust, secure and reliable power system operation even with a large scale integration of wind power. Electricity generated from intermittent wind in large proportion may impact the control of the power system balance and thus cause deviations in the power system frequency in small or islanded power systems, or in tie line power flows in interconnected power systems. Therefore, the large scale integration of wind power into the power system strongly concerns secure and stable grid operation. To ensure stable power system operation, the evolving power system has to be analysed with improved analytical tools and techniques. This paper proposes techniques for the active power balance control in future power systems with large scale wind power integration, where the power balancing model provides the hour-ahead dispatch plan with reduced planning horizon and the real time...

  7. The Role of Remote Sensing for Understanding Large-Scale Rubber Concession Expansion in Southern Laos

    Directory of Open Access Journals (Sweden)

    Mutlu Özdoğan

    2018-04-01

    Full Text Available Increasing global demand for natural rubber began in the mid-2000s and led to large-scale expansion of plantations in Laos until rubber latex prices declined greatly beginning in 2011. The expansion of rubber did not, however, occur uniformly across the country. While the north and central Laos experienced mostly local and smallholder plantations, rubber expansion in the south was dominated by transnational companies from Vietnam, China and Thailand through large-scale land concessions, often causing conflicts with local communities. In this study we use satellite remote sensing to identify and map the expansion of large-scale rubber plantations in Champasak Province—the first area in southern Laos to host large-scale rubber development—and document the biophysical impacts on the local landscape, which of course is linked to social impacts on local people. Our study demonstrates that the expansion of rubber in the province was rapid and did not always conform to approved concession area locations. The mono-culture nature of rubber plantations also had the effect of homogenizing the landscape, eclipsing the changes caused by local populations. We argue that by providing a relatively inexpensive way to track the expansion of rubber plantations over space and time, remote sensing has the potential to provide advocates and other civil society groups with data that might otherwise remain limited to the restricted domains of state regulation and private sector reporting. However, we also caution that while remote sensing has the potential to provide strong public evidence about plantation expansion, access to and control of this information ultimately determines its value.

  8. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif

    2017-01-07

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  9. Large-Scale Graph Processing Using Apache Giraph

    KAUST Repository

    Sakr, Sherif; Orakzai, Faisal Moeen; Abdelaziz, Ibrahim; Khayyat, Zuhair

    2017-01-01

    This book takes its reader on a journey through Apache Giraph, a popular distributed graph processing platform designed to bring the power of big data processing to graph data. Designed as a step-by-step self-study guide for everyone interested in large-scale graph processing, it describes the fundamental abstractions of the system, its programming models and various techniques for using the system to process graph data at scale, including the implementation of several popular and advanced graph analytics algorithms.

  10. An interactive display system for large-scale 3D models

    Science.gov (United States)

    Liu, Zijian; Sun, Kun; Tao, Wenbing; Liu, Liman

    2018-04-01

With the improvement of 3D reconstruction theory and the rapid development of computer hardware technology, the reconstructed 3D models are enlarging in scale and increasing in complexity. Models with tens of thousands of 3D points or triangular meshes are common in practical applications. Due to storage and computing power limitations, it is difficult to achieve real-time display of and interaction with large scale 3D models in some common 3D display software, such as MeshLab. In this paper, we propose a display system for large-scale 3D scene models. We construct the LOD (Levels of Detail) model of the reconstructed 3D scene in advance, and then use an out-of-core view-dependent multi-resolution rendering scheme to realize the real-time display of the large-scale 3D model. With the proposed method, our display system is able to render in real time while roaming in the reconstructed scene, and 3D camera poses can also be displayed. Furthermore, the memory consumption can be significantly decreased via an internal and external memory exchange mechanism, so that it is possible to display a large scale reconstructed scene with millions of 3D points or triangular meshes on a regular PC with only 4GB RAM.
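
    The core of an out-of-core, view-dependent LOD scheme is the per-frame decision of how deep to refine the precomputed hierarchy. The sketch below uses a simple projected-error test; the node layout and error metric are assumptions for illustration, not the paper's implementation.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class LODNode:
    geometric_error: float                 # world-space simplification error of this node
    center_dist: float                     # distance from the current viewpoint (updated per frame)
    children: List["LODNode"] = field(default_factory=list)

def screen_space_error(node: LODNode, fov_scale: float = 1000.0) -> float:
    """Approximate projected error in pixels; error shrinks with viewing distance."""
    return fov_scale * node.geometric_error / max(node.center_dist, 1e-6)

def select_lod(node: LODNode, pixel_tolerance: float, out: list) -> None:
    """Render this node if it is accurate enough (or is a leaf); otherwise refine its children."""
    if not node.children or screen_space_error(node) <= pixel_tolerance:
        out.append(node)
    else:
        for child in node.children:
            select_lod(child, pixel_tolerance, out)

# Tiny two-level hierarchy: one coarse root node with two finer children.
root = LODNode(geometric_error=2.0, center_dist=50.0,
               children=[LODNode(0.5, 45.0), LODNode(0.5, 60.0)])
drawn = []
select_lod(root, pixel_tolerance=4.0, out=drawn)
print(f"{len(drawn)} node(s) selected for rendering")
```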

  11. Large-scale hydrology in Europe : observed patterns and model performance

    Energy Technology Data Exchange (ETDEWEB)

    Gudmundsson, Lukas

    2011-06-15

In a changing climate, terrestrial water storages are of great interest as water availability impacts key aspects of ecosystem functioning. Thus, a better understanding of the variations of wet and dry periods will contribute to fully grasping processes of the earth system such as nutrient cycling and vegetation dynamics. Currently, river runoff from small, nearly natural, catchments is one of the few variables of the terrestrial water balance that is regularly monitored with detailed spatial and temporal coverage on large scales. River runoff, therefore, provides a foundation to approach European hydrology with respect to observed patterns on large scales, with regard to the ability of models to capture these. The analysis of observed river flow from small catchments focused on the identification and description of spatial patterns of simultaneous temporal variations of runoff. These are dominated by large-scale variations of climatic variables but also altered by catchment processes. It was shown that time series of annual low, mean and high flows follow the same atmospheric drivers. The observation that high flows are more closely coupled to large scale atmospheric drivers than low flows indicates the increasing influence of catchment properties on runoff under dry conditions. Further, it was shown that the low-frequency variability of European runoff is dominated by two opposing centres of simultaneous variations, such that dry years in the north are accompanied by wet years in the south. Large-scale hydrological models are simplified representations of our current perception of the terrestrial water balance on large scales. Quantification of the models' strengths and weaknesses is the prerequisite for a reliable interpretation of simulation results. Model evaluations may also enable the detection of shortcomings in model assumptions and thus enable a refinement of the current perception of hydrological systems. The ability of a multi model ensemble of nine large-scale

  12. Large-scale perturbations from the waterfall field in hybrid inflation

    International Nuclear Information System (INIS)

    Fonseca, José; Wands, David; Sasaki, Misao

    2010-01-01

We estimate large-scale curvature perturbations from isocurvature fluctuations in the waterfall field during hybrid inflation, in addition to the usual inflaton field perturbations. The tachyonic instability at the end of inflation leads to an explosive growth of super-Hubble scale perturbations, but they retain the steep blue spectrum characteristic of vacuum fluctuations in a massive field during inflation. The power spectrum thus peaks around the Hubble-horizon scale at the end of inflation. We extend the usual δN formalism to include the essential role of these small fluctuations when estimating the large-scale curvature perturbation. The resulting curvature perturbation due to fluctuations in the waterfall field is second-order and the spectrum is expected to be of order 10⁻⁵⁴ on cosmological scales.

  13. A cloud-based framework for large-scale traditional Chinese medical record retrieval.

    Science.gov (United States)

    Liu, Lijun; Liu, Li; Fu, Xiaodong; Huang, Qingsong; Zhang, Xianwen; Zhang, Yin

    2018-01-01

Electronic medical records are increasingly common in medical practice. The secondary use of medical records has become increasingly important. It relies on the ability to retrieve the complete information about desired patient populations. How to effectively and accurately retrieve relevant medical records from large-scale medical big data is becoming a big challenge. Therefore, we propose an efficient and robust framework based on cloud for large-scale Traditional Chinese Medical Records (TCMRs) retrieval. We propose a parallel index building method and build a distributed search cluster; the former is used to improve the performance of index building, and the latter is used to provide highly concurrent online TCMRs retrieval. Then, a real-time multi-indexing model is proposed to ensure the latest relevant TCMRs are indexed and retrieved in real-time, and a semantics-based query expansion method and a multi-factor ranking model are proposed to improve retrieval quality. Third, we implement a template-based visualization method for displaying medical reports. The proposed parallel indexing method and distributed search cluster can improve the performance of index building and provide highly concurrent online TCMRs retrieval. The multi-indexing model can ensure the latest relevant TCMRs are indexed and retrieved in real-time. The semantics expansion method and the multi-factor ranking model can enhance retrieval quality. The template-based visualization method can enhance the availability and universality, where the medical reports are displayed via a friendly web interface. In conclusion, compared with the current medical record retrieval systems, our system provides some advantages that are useful in improving the secondary use of large-scale traditional Chinese medical records in a cloud environment. The proposed system is more easily integrated with existing clinical systems and can be used in various scenarios. Copyright © 2017. Published by Elsevier Inc.
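
    As an illustration of what a multi-factor ranking model for retrieved records could look like, the sketch below blends text relevance with recency and record completeness; the factor names, weights and normalisations are assumptions and do not reproduce the ranking model of the paper.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RecordHit:
    doc_id: str
    text_relevance: float    # e.g. a relevance score from the search cluster, assumed normalised to [0, 1]
    record_date: date
    completeness: float      # fraction of structured fields filled in the medical record

def multi_factor_score(hit: RecordHit, today: date,
                       w_rel: float = 0.6, w_rec: float = 0.25, w_comp: float = 0.15) -> float:
    """Weighted combination of relevance, recency and completeness (illustrative weights)."""
    age_years = (today - hit.record_date).days / 365.25
    recency = 1.0 / (1.0 + age_years)        # newer records score closer to 1
    return w_rel * hit.text_relevance + w_rec * recency + w_comp * hit.completeness

hits = [RecordHit("tcmr-001", 0.82, date(2016, 5, 1), 0.9),
        RecordHit("tcmr-002", 0.90, date(2009, 3, 7), 0.4)]
ranked = sorted(hits, key=lambda h: multi_factor_score(h, today=date(2017, 1, 1)), reverse=True)
print([h.doc_id for h in ranked])  # the recent, complete record ranks first despite lower text relevance
```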

  14. Decoupling local mechanics from large-scale structure in modular metamaterials

    Science.gov (United States)

    Yang, Nan; Silverberg, Jesse L.

    2017-04-01

    A defining feature of mechanical metamaterials is that their properties are determined by the organization of internal structure instead of the raw fabrication materials. This shift of attention to engineering internal degrees of freedom has coaxed relatively simple materials into exhibiting a wide range of remarkable mechanical properties. For practical applications to be realized, however, this nascent understanding of metamaterial design must be translated into a capacity for engineering large-scale structures with prescribed mechanical functionality. Thus, the challenge is to systematically map desired functionality of large-scale structures backward into a design scheme while using finite parameter domains. Such “inverse design” is often complicated by the deep coupling between large-scale structure and local mechanical function, which limits the available design space. Here, we introduce a design strategy for constructing 1D, 2D, and 3D mechanical metamaterials inspired by modular origami and kirigami. Our approach is to assemble a number of modules into a voxelized large-scale structure, where the module’s design has a greater number of mechanical design parameters than the number of constraints imposed by bulk assembly. This inequality allows each voxel in the bulk structure to be uniquely assigned mechanical properties independent from its ability to connect and deform with its neighbors. In studying specific examples of large-scale metamaterial structures we show that a decoupling of global structure from local mechanical function allows for a variety of mechanically and topologically complex designs.

  15. Large-Scale Habitat Corridors for Biodiversity Conservation: A Forest Corridor in Madagascar.

    Directory of Open Access Journals (Sweden)

    Tanjona Ramiadantsoa

Full Text Available In biodiversity conservation, habitat corridors are assumed to increase landscape-level connectivity and to enhance the viability of otherwise isolated populations. While the role of corridors is supported by empirical evidence, studies have typically been conducted at small spatial scales. Here, we assess the quality and the functionality of a large 95-km long forest corridor connecting two large national parks (416 and 311 km²) in the southeastern escarpment of Madagascar. We analyze the occurrence of 300 species in 5 taxonomic groups in the parks and in the corridor, and combine high-resolution forest cover data with a simulation model to examine various scenarios of corridor destruction. At present, the corridor contains essentially the same communities as the national parks, reflecting its breadth, which on average matches that of the parks. In the simulation model, we consider three types of dispersers: passive dispersers, which settle randomly around the source population; active dispersers, which settle only in favorable habitat; and gap-avoiding active dispersers, which avoid dispersing across non-habitat. Our results suggest that long-distance passive dispersers are most sensitive to ongoing degradation of the corridor, because increasing numbers of propagules are lost outside the forest habitat. For a wide range of dispersal parameters, the national parks are large enough to sustain stable populations until the corridor becomes severely broken, which will happen around 2065 if the current rate of forest loss continues. A significant decrease in gene flow along the corridor is expected after 2040, and this will exacerbate the adverse consequences of isolation. Our results demonstrate that simulation studies assessing the role of habitat corridors should pay close attention to the mode of dispersal and the effects of regional stochasticity.

  16. Large scale air pollution estimation method combining land use regression and chemical transport modeling in a geostatistical framework.

    Science.gov (United States)

    Akita, Yasuyuki; Baldasano, Jose M; Beelen, Rob; Cirach, Marta; de Hoogh, Kees; Hoek, Gerard; Nieuwenhuijsen, Mark; Serre, Marc L; de Nazelle, Audrey

    2014-04-15

    In recognition that intraurban exposure gradients may be as large as between-city variations, recent air pollution epidemiologic studies have become increasingly interested in capturing within-city exposure gradients. In addition, because of the rapidly accumulating health data, recent studies also need to handle large study populations distributed over large geographic domains. Even though several modeling approaches have been introduced, a consistent modeling framework capturing within-city exposure variability and applicable to large geographic domains is still missing. To address these needs, we proposed a modeling framework based on the Bayesian Maximum Entropy method that integrates monitoring data and outputs from existing air quality models based on Land Use Regression (LUR) and Chemical Transport Models (CTM). The framework was applied to estimate the yearly average NO2 concentrations over the region of Catalunya in Spain. By jointly accounting for the global scale variability in the concentration from the output of CTM and the intraurban scale variability through LUR model output, the proposed framework outperformed more conventional approaches.

  17. The origin of large scale cosmic structure

    International Nuclear Information System (INIS)

    Jones, B.J.T.; Palmer, P.L.

    1985-01-01

    The paper concerns the origin of large scale cosmic structure. The evolution of density perturbations, the nonlinear regime (Zel'dovich's solution and others), the Gott and Rees clustering hierarchy, the spectrum of condensations, and biassed galaxy formation, are all discussed. (UK)

  18. A practical process for light-water detritiation at large scales

    Energy Technology Data Exchange (ETDEWEB)

    Boniface, H.A. [Atomic Energy of Canada Limited, Chalk River, ON (Canada); Robinson, J., E-mail: jr@tyne-engineering.com [Tyne Engineering, Burlington, ON (Canada); Gnanapragasam, N.V.; Castillo, I.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    AECL and Tyne Engineering have recently completed a preliminary engineering design for a modest-scale tritium removal plant for light water, intended for installation at AECL's Chalk River Laboratories (CRL). This plant design was based on the Combined Electrolysis and Catalytic Exchange (CECE) technology developed at CRL over many years and demonstrated there and elsewhere. The general features and capabilities of this design have been reported as well as the versatility of the design for separating any pair of the three hydrogen isotopes. The same CECE technology could be applied directly to very large-scale wastewater detritiation, such as the case at Fukushima Daiichi Nuclear Power Station. However, since the CECE process scales linearly with throughput, the required capital and operating costs are substantial for such large-scale applications. This paper discusses some options for reducing the costs of very large-scale detritiation. Options include: Reducing tritium removal effectiveness; Energy recovery; Improving the tolerance of impurities; Use of less expensive or more efficient equipment. A brief comparison with alternative processes is also presented. (author)

  19. Network Partitioning Domain Knowledge Multiobjective Application Mapping for Large-Scale Network-on-Chip

    Directory of Open Access Journals (Sweden)

    Yin Zhen Tei

    2014-01-01

Full Text Available This paper proposes a multiobjective application mapping technique targeted for large-scale network-on-chip (NoC). As the number of intellectual property (IP) cores in multiprocessor system-on-chip (MPSoC) increases, NoC application mapping to find optimum core-to-topology mapping becomes more challenging. Besides, the conflicting cost and performance trade-off makes multiobjective application mapping techniques even more complex. This paper proposes an application mapping technique that incorporates domain knowledge into genetic algorithm (GA). The initial population of GA is initialized with network partitioning (NP) while the crossover operator is guided with knowledge on communication demands. NP reduces the large-scale application mapping complexity and provides GA with a potential mapping search space. The proposed genetic operator is compared with state-of-the-art genetic operators in terms of solution quality. In this work, multiobjective optimization of energy and thermal-balance is considered. Through simulation, knowledge-based initial mapping shows significant improvement in Pareto front compared to random initial mapping that is widely used. The proposed knowledge-based crossover also shows better Pareto front compared to state-of-the-art knowledge-based crossover.

  20. OffshoreDC DC grids for integration of large scale wind power

    DEFF Research Database (Denmark)

    Zeni, Lorenzo; Endegnanew, Atsede Gualu; Stamatiou, Georgios

The present report summarizes the main findings of the Nordic Energy Research project “DC grids for large scale integration of offshore wind power – OffshoreDC”. The project was funded by Nordic Energy Research through the TFI programme and was active between 2011 and 2016. The overall objective of the project was to drive the development of the VSC based HVDC technology for future large scale offshore grids, supporting a standardised and commercial development of the technology, and improving the opportunities for the technology to support power system integration of large scale offshore...

  1. Low-Complexity Transmit Antenna Selection and Beamforming for Large-Scale MIMO Communications

    Directory of Open Access Journals (Sweden)

    Kun Qian

    2014-01-01

Full Text Available Transmit antenna selection plays an important role in large-scale multiple-input multiple-output (MIMO) communications, but optimal large-scale MIMO antenna selection is a technical challenge. Exhaustive search is often employed in antenna selection, but it cannot be efficiently implemented in large-scale MIMO communication systems due to its prohibitively high computation complexity. This paper proposes a low-complexity interactive multiple-parameter optimization method for joint transmit antenna selection and beamforming in large-scale MIMO communication systems. The objective is to jointly maximize the channel outage capacity and signal-to-noise ratio (SNR) performance and minimize the mean square error in transmit antenna selection and minimum variance distortionless response (MVDR) beamforming without exhaustive search. The effectiveness of all the proposed methods is verified by extensive simulation results. It is shown that the required antenna selection processing time of the proposed method does not increase with the number of selected antennas, but the computation complexity of the conventional exhaustive search method will significantly increase when large-scale antennas are employed in the system. This is particularly useful in antenna selection for large-scale MIMO communication systems.
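
    The MVDR beamformer mentioned above has a standard closed form, w = R⁻¹a / (aᴴR⁻¹a); the numpy sketch below applies it to a uniform linear array with one interferer. The array geometry and covariance construction are illustrative assumptions, and the paper's antenna-selection heuristics are not reproduced here.

```python
import numpy as np

def steering_vector(n_antennas: int, theta_deg: float, spacing: float = 0.5) -> np.ndarray:
    """Steering vector of a uniform linear array (element spacing in wavelengths)."""
    k = np.arange(n_antennas)
    return np.exp(-2j * np.pi * spacing * k * np.sin(np.deg2rad(theta_deg)))

def mvdr_weights(R: np.ndarray, a: np.ndarray) -> np.ndarray:
    """Minimum variance distortionless response weights: w = R^-1 a / (a^H R^-1 a)."""
    Ri_a = np.linalg.solve(R, a)
    return Ri_a / (a.conj() @ Ri_a)

n = 16                                                # number of (selected) antennas
a_des = steering_vector(n, 0.0)                       # desired user at broadside
a_int = steering_vector(n, 30.0)                      # interferer at 30 degrees
# Idealised covariance: desired signal + strong interferer + unit-power noise.
R = 10 * np.outer(a_des, a_des.conj()) + 20 * np.outer(a_int, a_int.conj()) + np.eye(n)

w = mvdr_weights(R, a_des)
print("gain toward desired user:", abs(w.conj() @ a_des))   # distortionless constraint: ~1
print("gain toward interferer:  ", abs(w.conj() @ a_int))   # strongly suppressed
```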

  2. Automatic initialization and quality control of large-scale cardiac MRI segmentations.

    Science.gov (United States)

    Albà, Xènia; Lekadir, Karim; Pereañez, Marco; Medrano-Gracia, Pau; Young, Alistair A; Frangi, Alejandro F

    2018-01-01

    Continuous advances in imaging technologies enable ever more comprehensive phenotyping of human anatomy and physiology. Concomitant reduction of imaging costs has resulted in widespread use of imaging in large clinical trials and population imaging studies. Magnetic Resonance Imaging (MRI), in particular, offers one-stop-shop multidimensional biomarkers of cardiovascular physiology and pathology. A wide range of analysis methods offer sophisticated cardiac image assessment and quantification for clinical and research studies. However, most methods have only been evaluated on relatively small databases often not accessible for open and fair benchmarking. Consequently, published performance indices are not directly comparable across studies and their translation and scalability to large clinical trials or population imaging cohorts is uncertain. Most existing techniques still rely on considerable manual intervention for the initialization and quality control of the segmentation process, becoming prohibitive when dealing with thousands of images. The contributions of this paper are three-fold. First, we propose a fully automatic method for initializing cardiac MRI segmentation, by using image features and random forests regression to predict an initial position of the heart and key anatomical landmarks in an MRI volume. In processing a full imaging database, the technique predicts the optimal corrective displacements and positions in relation to the initial rough intersections of the long and short axis images. Second, we introduce for the first time a quality control measure capable of identifying incorrect cardiac segmentations with no visual assessment. The method uses statistical, pattern and fractal descriptors in a random forest classifier to detect failures to be corrected or removed from subsequent statistical analysis. Finally, we validate these new techniques within a full pipeline for cardiac segmentation applicable to large-scale cardiac MRI databases. The
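
    The first contribution (predicting a rough heart/landmark position from image features with random-forest regression) can be mimicked on synthetic data; the blob-in-a-volume training set and the coarse block-mean features below are stand-ins for the real images and features, purely to keep the sketch self-contained.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(0)

def image_features(volume: np.ndarray, blocks: int = 4) -> np.ndarray:
    """Mean intensity over a coarse grid of blocks (a crude stand-in for real image features)."""
    b = volume.shape[0] // blocks
    return np.array([volume[i*b:(i+1)*b, j*b:(j+1)*b, k*b:(k+1)*b].mean()
                     for i in range(blocks) for j in range(blocks) for k in range(blocks)])

# Synthetic training set: each volume contains a bright blob whose centre plays the role of a landmark.
X, y = [], []
grid = np.indices((32, 32, 32)).transpose(1, 2, 3, 0)
for _ in range(200):
    centre = rng.uniform(10, 22, size=3)
    vol = np.exp(-np.sum((grid - centre) ** 2, axis=-1) / 40.0) \
          + 0.05 * rng.standard_normal((32, 32, 32))
    X.append(image_features(vol))
    y.append(centre)                      # regression target: landmark coordinates

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(np.array(X), np.array(y))
pred = model.predict(np.array(X[:1]))
print("predicted vs true landmark:", pred[0].round(1), np.array(y[0]).round(1))
```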

  3. The effective field theory of cosmological large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Carrasco, John Joseph M. [Stanford Univ., Stanford, CA (United States); Hertzberg, Mark P. [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States); Senatore, Leonardo [Stanford Univ., Stanford, CA (United States); SLAC National Accelerator Lab., Menlo Park, CA (United States)

    2012-09-20

    Large scale structure surveys will likely become the next leading cosmological probe. In our universe, matter perturbations are large on short distances and small at long scales, i.e. strongly coupled in the UV and weakly coupled in the IR. To make precise analytical predictions on large scales, we develop an effective field theory formulated in terms of an IR effective fluid characterized by several parameters, such as speed of sound and viscosity. These parameters, determined by the UV physics described by the Boltzmann equation, are measured from N-body simulations. We find that the speed of sound of the effective fluid is c_s^2 ≈ 10^-6 c^2 and that the viscosity contributions are of the same order. The fluid describes all the relevant physics at long scales k and permits a manifestly convergent perturbative expansion in the size of the matter perturbations δ(k) for all the observables. As an example, we calculate the correction to the power spectrum at order δ(k)^4. As a result, the predictions of the effective field theory are found to be in much better agreement with observation than standard cosmological perturbation theory, already reaching percent precision at this order up to a relatively short scale k ≃ 0.24 h Mpc^-1.

  4. Temporal flexibility and careers: The role of large-scale organizations for physicians

    OpenAIRE

    Forrest Briscoe

    2006-01-01

    This study investigates how employment in large-scale organizations affects the work lives of practicing physicians. Well-established theory associates larger organizations with bureaucratic constraint, loss of workplace control, and dissatisfaction, but this author finds that large scale is also associated with greater schedule and career flexibility. Ironically, the bureaucratic p...

  5. Factor solutions of the Social Phobia Scale (SPS) and the Social Interaction Anxiety Scale (SIAS) in a Swedish population.

    Science.gov (United States)

    Mörtberg, Ewa; Reuterskiöld, Lena; Tillfors, Maria; Furmark, Tomas; Öst, Lars-Göran

    2017-06-01

    Culturally validated rating scales for social anxiety disorder (SAD) are of significant importance when screening for the disorder, as well as for evaluating treatment efficacy. This study examined construct validity and additional psychometric properties of two commonly used scales, the Social Phobia Scale and the Social Interaction Anxiety Scale, in a clinical SAD population (n = 180) and in a normal population (n = 614) in Sweden. Confirmatory factor analyses of previously reported factor solutions were tested but did not reveal acceptable fit. Exploratory factor analyses (EFA) of the joint structure of the scales in the total population yielded a two-factor model (performance anxiety and social interaction anxiety), whereas EFA in the clinical sample revealed a three-factor solution, a social interaction anxiety factor and two performance anxiety factors. The SPS and SIAS showed good to excellent internal consistency, and discriminated well between patients with SAD and a normal population sample. Both scales showed good convergent validity with an established measure of SAD, whereas the discriminant validity of symptoms of social anxiety and depression could not be confirmed. The optimal cut-off scores for the SPS and SIAS were 18 and 22 points, respectively. It is concluded that the factor structure and the additional psychometric properties of the SPS and SIAS support the use of the scales for assessment in a Swedish population.
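
    For readers who want to run a comparable analysis on their own item-level data, the hedged sketch below fits a two-factor exploratory solution and computes Cronbach's alpha. It uses scikit-learn's FactorAnalysis (the rotation="varimax" option assumes scikit-learn >= 0.24) on simulated Likert-type responses, not the SPS/SIAS data.

      import numpy as np
      from sklearn.decomposition import FactorAnalysis

      def cronbach_alpha(items):
          """items: (n_respondents, n_items) matrix of item scores."""
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1)
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars.sum() / total_var)

      # Placeholder responses: 600 respondents, 20 items driven by two latent factors.
      rng = np.random.default_rng(2)
      latent = rng.standard_normal((600, 2))
      true_loadings = rng.uniform(0.4, 0.9, size=(20, 2))
      items = latent @ true_loadings.T + 0.5 * rng.standard_normal((600, 20))

      fa = FactorAnalysis(n_components=2, rotation="varimax", random_state=0)
      fa.fit(items)
      loadings = fa.components_.T        # (n_items, n_factors) loading matrix

      print("Cronbach's alpha:", round(cronbach_alpha(items), 3))
      print("items loading mainly on factor 1:",
            np.where(np.abs(loadings[:, 0]) > np.abs(loadings[:, 1]))[0])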

  6. The role of large scale motions on passive scalar transport

    Science.gov (United States)

    Dharmarathne, Suranga; Araya, Guillermo; Tutkun, Murat; Leonardi, Stefano; Castillo, Luciano

    2014-11-01

    We study direct numerical simulation (DNS) of turbulent channel flow at Reτ = 394 to investigate the effect of large scale motions on the fluctuating temperature field, which forms a passive scalar field. A statistical description of the large scale features of the turbulent channel flow is obtained using two-point correlations of velocity components. Two-point correlations of the fluctuating temperature field are also examined in order to identify possible similarities between velocity and temperature fields. The two-point cross-correlations between the velocity and temperature fluctuations are further analyzed to establish connections between these two fields. In addition, we use proper orthogonal decomposition (POD) to extract the most dominant modes of the fields and discuss the coupling of large scale features of turbulence and the temperature field.
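
    Snapshot POD reduces to a singular value decomposition of the mean-subtracted snapshot matrix. The sketch below illustrates this on a synthetic one-dimensional field; it is a generic POD implementation, not the authors' channel-flow analysis.

      import numpy as np

      def pod_modes(snapshots, n_modes):
          """Snapshot POD via SVD.

          snapshots: (n_points, n_snapshots) matrix, one flow field per column.
          Returns spatial modes, relative modal energies and temporal coefficients."""
          fluctuations = snapshots - snapshots.mean(axis=1, keepdims=True)
          U, s, Vt = np.linalg.svd(fluctuations, full_matrices=False)
          energy = s**2 / np.sum(s**2)        # relative energy per mode
          coeffs = np.diag(s) @ Vt            # temporal coefficients
          return U[:, :n_modes], energy[:n_modes], coeffs[:n_modes]

      # Synthetic example: 2000 grid points, 300 snapshots with two dominant structures.
      rng = np.random.default_rng(3)
      x = np.linspace(0, 2 * np.pi, 2000)[:, None]
      t = np.linspace(0, 10, 300)[None, :]
      field = np.sin(x) * np.cos(2 * t) + 0.3 * np.sin(3 * x) * np.sin(5 * t)
      field = field + 0.05 * rng.standard_normal(field.shape)

      modes, energy, coeffs = pod_modes(field, n_modes=4)
      print("energy captured by the first 4 modes:", energy.round(3))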

  7. Signatures of non-universal large scales in conditional structure functions from various turbulent flows

    International Nuclear Information System (INIS)

    Blum, Daniel B; Voth, Greg A; Bewley, Gregory P; Bodenschatz, Eberhard; Gibert, Mathieu; Xu Haitao; Gylfason, Ármann; Mydlarski, Laurent; Yeung, P K

    2011-01-01

    We present a systematic comparison of conditional structure functions in nine turbulent flows. The flows studied include forced isotropic turbulence simulated on a periodic domain, passive grid wind tunnel turbulence in air and in pressurized SF6, active grid wind tunnel turbulence (in both synchronous and random driving modes), the flow between counter-rotating discs, oscillating grid turbulence and the flow in the Lagrangian exploration module (in both constant and random driving modes). We compare longitudinal Eulerian second-order structure functions conditioned on the instantaneous large-scale velocity in each flow to assess the ways in which the large scales affect the small scales in a variety of turbulent flows. Structure functions are shown to have larger values when the large-scale velocity significantly deviates from the mean in most flows, suggesting that dependence on the large scales is typical in many turbulent flows. The effects of the large-scale velocity on the structure functions can be quite strong, with the structure function varying by up to a factor of 2 when the large-scale velocity deviates from the mean by ±2 standard deviations. In several flows, the effects of the large-scale velocity are similar at all the length scales we measured, indicating that the large-scale effects are scale independent. In a few flows, the effects of the large-scale velocity are larger on the smallest length scales. (paper)
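
    A conditional second-order structure function of this kind can be estimated by binning velocity increments according to the instantaneous large-scale velocity. The sketch below does this for a synthetic one-dimensional velocity record (a smoothed large-scale component plus small-scale noise); the record, the separation r and the bin edges are placeholders.

      import numpy as np

      def conditional_structure_function(u, r, large_scale, bin_edges):
          """<(u(x+r) - u(x))^2> conditioned on the large-scale velocity at x."""
          du2 = (u[r:] - u[:-r]) ** 2
          uL = large_scale[:-r]
          S2 = []
          for lo, hi in zip(bin_edges[:-1], bin_edges[1:]):
              mask = (uL >= lo) & (uL < hi)
              S2.append(du2[mask].mean() if mask.any() else np.nan)
          return np.array(S2)

      # Synthetic record: a slow large-scale component plus small-scale fluctuations.
      rng = np.random.default_rng(4)
      n = 50_000
      large_scale = np.convolve(rng.standard_normal(n), np.ones(2000) / 2000, mode="same")
      u = large_scale + 0.2 * rng.standard_normal(n)

      sigma = large_scale.std()
      bins = np.array([-2, -1, 0, 1, 2]) * sigma   # condition on uL in units of its std
      S2 = conditional_structure_function(u, r=50, large_scale=large_scale, bin_edges=bins)
      print("conditional S2 per large-scale velocity bin:", S2.round(4))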

  8. Cytology of DNA Replication Reveals Dynamic Plasticity of Large-Scale Chromatin Fibers.

    Science.gov (United States)

    Deng, Xiang; Zhironkina, Oxana A; Cherepanynets, Varvara D; Strelkova, Olga S; Kireev, Igor I; Belmont, Andrew S

    2016-09-26

    In higher eukaryotic interphase nuclei, the 100- to >1,000-fold linear compaction of chromatin is difficult to reconcile with its function as a template for transcription, replication, and repair. It is challenging to imagine how DNA and RNA polymerases with their associated molecular machinery would move along the DNA template without transient decondensation of observed large-scale chromatin "chromonema" fibers [1]. Transcription or "replication factory" models [2], in which polymerases remain fixed while DNA is reeled through, are similarly difficult to conceptualize without transient decondensation of these chromonema fibers. Here, we show how a dynamic plasticity of chromatin folding within large-scale chromatin fibers allows DNA replication to take place without significant changes in the global large-scale chromatin compaction or shape of these large-scale chromatin fibers. Time-lapse imaging of lac-operator-tagged chromosome regions shows no major change in the overall compaction of these chromosome regions during their DNA replication. Improved pulse-chase labeling of endogenous interphase chromosomes yields a model in which the global compaction and shape of large-Mbp chromatin domains remains largely invariant during DNA replication, with DNA within these domains undergoing significant movements and redistribution as they move into and then out of adjacent replication foci. In contrast to hierarchical folding models, this dynamic plasticity of large-scale chromatin organization explains how localized changes in DNA topology allow DNA replication to take place without an accompanying global unfolding of large-scale chromatin fibers while suggesting a possible mechanism for maintaining epigenetic programming of large-scale chromatin domains throughout DNA replication. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    NARCIS (Netherlands)

    Loon, van A.F.; Huijgevoort, van M.H.J.; Lanen, van H.A.J.

    2012-01-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not sure whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well do large-scale models simulate the propagation from meteorological to hydrological

  10. Configuration management in large scale infrastructure development

    NARCIS (Netherlands)

    Rijn, T.P.J. van; Belt, H. van de; Los, R.H.

    2000-01-01

    Large Scale Infrastructure (LSI) development projects such as the construction of roads, railways and other civil engineering (water)works is tendered differently today than a decade ago. Traditional workflow requested quotes from construction companies for construction works where the works to be

  11. Dual Decomposition for Large-Scale Power Balancing

    DEFF Research Database (Denmark)

    Halvgaard, Rasmus; Jørgensen, John Bagterp; Vandenberghe, Lieven

    2013-01-01

    Dual decomposition is applied to power balancing of flexible thermal storage units. The centralized large-scale problem is decomposed into smaller subproblems and solved locally by each unit in the Smart Grid. Convergence is achieved by coordinating the units' consumption through a negotiation...
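
    The general dual-decomposition pattern behind this kind of power balancing can be sketched as a price-coordination loop: each unit solves a small local problem given a price, and a coordinator updates the price from the remaining power imbalance (the dual subgradient). The example below assumes a simple quadratic local cost and box constraints; it is a generic sketch, not the authors' Smart Grid formulation.

      import numpy as np

      def local_response(price, preferred, p_min, p_max, alpha=1.0):
          """Each unit minimizes alpha/2*(p - preferred)^2 + price*p subject to bounds;
          the local subproblem has the closed-form solution below."""
          return np.clip(preferred - price / alpha, p_min, p_max)

      def dual_decomposition(target, preferred, p_min, p_max, step=0.005, iters=200):
          """Coordinate the units so that their total consumption tracks the target."""
          price = 0.0
          for _ in range(iters):
              p = local_response(price, preferred, p_min, p_max)
              imbalance = p.sum() - target     # subgradient of the dual function
              price += step * imbalance        # raise the price if consumption is too high
          return p, price

      rng = np.random.default_rng(5)
      n_units = 100
      preferred = rng.uniform(0.5, 1.5, n_units)     # each unit's preferred consumption
      p_min, p_max = np.zeros(n_units), 2.0 * np.ones(n_units)

      p, price = dual_decomposition(target=80.0, preferred=preferred, p_min=p_min, p_max=p_max)
      print("total consumption:", round(float(p.sum()), 2), " final price:", round(price, 3))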

  12. Generation of large-scale vortices in compressible helical turbulence

    International Nuclear Information System (INIS)

    Chkhetiani, O.G.; Gvaramadze, V.V.

    1989-01-01

    We consider the generation of large-scale vortices in a compressible self-gravitating turbulent medium. The closed equation describing the evolution of the large-scale vortices in helical turbulence with finite correlation time is obtained. This equation has a form similar to the hydromagnetic dynamo equation, which allows us to call the vortex generation effect the vortex dynamo. It is possible that principally the same mechanism is responsible both for amplification and maintenance of density waves and magnetic fields in gaseous disks of spiral galaxies. (author). 29 refs

  13. Dipolar modulation of Large-Scale Structure

    Science.gov (United States)

    Yoon, Mijin

    For the last two decades, we have seen a drastic development of modern cosmology based on various observations such as the cosmic microwave background (CMB), type Ia supernovae, and baryonic acoustic oscillations (BAO). These observational evidences have led us to a great deal of consensus on the cosmological model, the so-called LambdaCDM model, and to tight constraints on the cosmological parameters constituting the model. On the other hand, the advancement of cosmology relies on the cosmological principle: the universe is isotropic and homogeneous on large scales. Testing these fundamental assumptions is crucial and will soon become possible given the planned observations ahead. Dipolar modulation is the largest angular anisotropy of the sky, which is quantified by its direction and amplitude. We measured a large dipolar modulation in the CMB, which mainly originated from our solar system's motion relative to the CMB rest frame. However, we have not yet acquired consistent measurements of dipolar modulations in large-scale structure (LSS), as they require large sky coverage and a large number of well-identified objects. In this thesis, we explore measurement of the dipolar modulation in number counts of LSS objects as a test of statistical isotropy. This thesis is based on two papers that were published in peer-reviewed journals. In Chapter 2 [Yoon et al., 2014], we measured a dipolar modulation in number counts of WISE matched with 2MASS sources. In Chapter 3 [Yoon & Huterer, 2015], we investigated requirements for detection of the kinematic dipole in future surveys.

  14. Large-scale geographical variation in eggshell metal and calcium content in a passerine bird (Ficedula hypoleuca).

    Science.gov (United States)

    Ruuskanen, Suvi; Laaksonen, Toni; Morales, Judith; Moreno, Juan; Mateo, Rafael; Belskii, Eugen; Bushuev, Andrey; Järvinen, Antero; Kerimov, Anvar; Krams, Indrikis; Morosinotto, Chiara; Mänd, Raivo; Orell, Markku; Qvarnström, Anna; Slate, Fred; Tilgar, Vallo; Visser, Marcel E; Winkel, Wolfgang; Zang, Herwig; Eeva, Tapio

    2014-03-01

    Birds have been used as bioindicators of pollution, such as toxic metals. Levels of pollutants in eggs are especially interesting, as developing birds are more sensitive to detrimental effects of pollutants than adults. Only very few studies have monitored intraspecific, large-scale variation in metal pollution across a species' breeding range. We studied large-scale geographic variation in metal levels in the eggs of a small passerine, the pied flycatcher (Ficedula hypoleuca), sampled from 15 populations across Europe. We measured 10 eggshell elements (As, Cd, Cr, Cu, Ni, Pb, Zn, Se, Sr, and Ca) and several shell characteristics (mass, thickness, porosity, and color). We found significant variation among populations in eggshell metal levels for all metals except copper. Eggshell lead, zinc, and chromium levels decreased from central Europe to the north, in line with the gradient in pollution levels over Europe, thus suggesting that eggshell can be used as an indicator of pollution levels. Eggshell lead levels were also correlated with soil lead levels and pH. Most of the metals were not correlated with eggshell characteristics, with the exception of shell mass, or with breeding success, which may suggest that birds can cope well with the current background exposure levels across Europe.

  15. Large-scale academic achievement testing of deaf and hard-of-hearing students: past, present, and future.

    Science.gov (United States)

    Qi, Sen; Mitchell, Ross E

    2012-01-01

    The first large-scale, nationwide academic achievement testing program using Stanford Achievement Test (Stanford) for deaf and hard-of-hearing children in the United States started in 1969. Over the past three decades, the Stanford has served as a benchmark in the field of deaf education for assessing student academic achievement. However, the validity and reliability of using the Stanford for this special student population still require extensive scrutiny. Recent shifts in educational policy environment, which require that schools enable all children to achieve proficiency through accountability testing, warrants a close examination of the adequacy and relevance of the current large-scale testing of deaf and hard-of-hearing students. This study has three objectives: (a) it will summarize the historical data over the last three decades to indicate trends in academic achievement for this special population, (b) it will analyze the current federal laws and regulations related to educational testing and special education, thereby identifying gaps between policy and practice in the field, especially identifying the limitations of current testing programs in assessing what deaf and hard-of-hearing students know, and (c) it will offer some insights and suggestions for future testing programs for deaf and hard-of-hearing students.

  16. Impact of large-scale tides on cosmological distortions via redshift-space power spectrum

    Science.gov (United States)

    Akitsu, Kazuyuki; Takada, Masahiro

    2018-03-01

    Although large-scale perturbations beyond a finite-volume survey region are not direct observables, these affect measurements of clustering statistics of small-scale (subsurvey) perturbations in large-scale structure, compared with the ensemble average, via the mode-coupling effect. In this paper we show that a large-scale tide induced by scalar perturbations causes apparent anisotropic distortions in the redshift-space power spectrum of galaxies in a way that depends on the alignment between the tide, the wave vector of the small-scale modes and the line-of-sight direction. Using the perturbation theory of structure formation, we derive a response function of the redshift-space power spectrum to the large-scale tide. We then investigate the impact of the large-scale tide on the estimation of cosmological distances and the redshift-space distortion parameter via the measured redshift-space power spectrum for a hypothetical large-volume survey, based on the Fisher matrix formalism. To do this, we treat the large-scale tide as a signal, rather than an additional source of statistical error, and show that the degradation in these parameters is recovered if we can employ the prior on the rms amplitude expected for the standard cold dark matter (CDM) model. We also discuss whether the large-scale tide can be constrained at an accuracy better than the CDM prediction, if the effects up to a larger wave number in the nonlinear regime can be included.

  17. Large-scale Intelligent Transportation Systems simulation

    Energy Technology Data Exchange (ETDEWEB)

    Ewing, T.; Canfield, T.; Hannebutte, U.; Levine, D.; Tentner, A.

    1995-06-01

    A prototype computer system has been developed which defines a high-level architecture for a large-scale, comprehensive, scalable simulation of an Intelligent Transportation System (ITS) capable of running on massively parallel computers and distributed (networked) computer systems. The prototype includes the modelling of instrumented "smart" vehicles with in-vehicle navigation units capable of optimal route planning and Traffic Management Centers (TMC). The TMC has probe vehicle tracking capabilities (display position and attributes of instrumented vehicles), and can provide 2-way interaction with traffic to provide advisories and link times. Both the in-vehicle navigation module and the TMC feature detailed graphical user interfaces to support human-factors studies. The prototype has been developed on a distributed system of networked UNIX computers but is designed to run on ANL's IBM SP-X parallel computer system for large scale problems. A novel feature of our design is that vehicles will be represented by autonomous computer processes, each with a behavior model which performs independent route selection and reacts to external traffic events much like real vehicles. With this approach, one will be able to take advantage of emerging massively parallel processor (MPP) systems.

  18. The Hamburg large scale geostrophic ocean general circulation model. Cycle 1

    International Nuclear Information System (INIS)

    Maier-Reimer, E.; Mikolajewicz, U.

    1992-02-01

    The rationale for the Large Scale Geostrophic ocean circulation model (LSG-OGCM) is based on the observations that for a large scale ocean circulation model designed for climate studies, the relevant characteristic spatial scales are large compared with the internal Rossby radius throughout most of the ocean, while the characteristic time scales are large compared with the periods of gravity modes and barotropic Rossby wave modes. In the present version of the model, the fast modes have been filtered out by a conventional technique of integrating the full primitive equations, including all terms except the nonlinear advection of momentum, by an implicit time integration method. The free surface is also treated prognostically, without invoking a rigid lid approximation. The numerical scheme is unconditionally stable and has the additional advantage that it can be applied uniformly to the entire globe, including the equatorial and coastal current regions. (orig.)
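
    The stability argument behind this implicit treatment can be illustrated on a toy stiff linear system: an explicit step is limited by the fastest (filtered) mode, while an implicit backward-Euler step stays stable at time steps far beyond that limit. The sketch below is a generic illustration under these assumptions, not the LSG model equations.

      import numpy as np

      # Stiff linear test system dx/dt = A x with one fast (gravity-wave-like) mode
      # and one slow mode; the explicit stability limit is set by the fast eigenvalue.
      A = np.array([[-1000.0, 0.0],
                    [0.0, -0.01]])
      x0 = np.array([1.0, 1.0])
      dt = 0.1                       # far larger than the explicit limit 2/1000 = 0.002

      def explicit_euler(x, n_steps):
          for _ in range(n_steps):
              x = x + dt * A @ x
          return x

      def implicit_euler(x, n_steps):
          M = np.eye(2) - dt * A     # solve (I - dt*A) x_{n+1} = x_n at every step
          for _ in range(n_steps):
              x = np.linalg.solve(M, x)
          return x

      print("explicit:", explicit_euler(x0.copy(), 50))   # fast mode blows up
      print("implicit:", implicit_euler(x0.copy(), 50))   # fast mode damped, slow mode accurate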

  19. Soft X-ray Emission from Large-Scale Galactic Outflows in Seyfert Galaxies

    Science.gov (United States)

    Colbert, E. J. M.; Baum, S.; O'Dea, C.; Veilleux, S.

    1998-01-01

    Kiloparsec-scale soft X-ray nebulae extend along the galaxy minor axes in several Seyfert galaxies, including NGC 2992, NGC 4388 and NGC 5506. In these three galaxies, the extended X-ray emission observed in ROSAT HRI images has 0.2-2.4 keV X-ray luminosities of 0.4-3.5 x 10^40 erg s^-1. The X-ray nebulae are roughly co-spatial with the large-scale radio emission, suggesting that both are produced by large-scale galactic outflows. Assuming pressure balance between the radio and X-ray plasmas, the X-ray filling factor is >~ 10^4 times as large as the radio plasma filling factor, suggesting that large-scale outflows in Seyfert galaxies are predominantly winds of thermal X-ray emitting gas. We favor an interpretation in which large-scale outflows originate as AGN-driven jets that entrain and heat gas on kpc scales as they make their way out of the galaxy. AGN- and starburst-driven winds are also possible explanations if the winds are oriented along the rotation axis of the galaxy disk. Since large-scale outflows are present in at least 50 percent of Seyfert galaxies, the soft X-ray emission from the outflowing gas may, in many cases, explain the "soft excess" X-ray feature observed below 2 keV in X-ray spectra of many Seyfert 2 galaxies.

  20. Pro website development and operations streamlining DevOps for large-scale websites

    CERN Document Server

    Sacks, Matthew

    2012-01-01

    Pro Website Development and Operations gives you the experience you need to create and operate a large-scale production website. Large-scale websites have their own unique set of problems regarding their design, problems that can get worse when agile methodologies are adopted for rapid results. Managing large-scale websites, deploying applications, and ensuring they are performing well often requires a full scale team involving the development and operations sides of the company, two departments that don't always see eye to eye. When departments struggle with each other, it adds unnecessary comp

  1. Hierarchical population monitoring of greater sage-grouse (Centrocercus urophasianus) in Nevada and California—Identifying populations for management at the appropriate spatial scale

    Science.gov (United States)

    Coates, Peter S.; Prochazka, Brian G.; Ricca, Mark A.; Wann, Gregory T.; Aldridge, Cameron L.; Hanser, Steven E.; Doherty, Kevin E.; O'Donnell, Michael S.; Edmunds, David R.; Espinosa, Shawn P.

    2017-08-10

    Population ecologists have long recognized the importance of ecological scale in understanding processes that guide observed demographic patterns for wildlife species. However, directly incorporating spatial and temporal scale into monitoring strategies that detect whether trajectories are driven by local or regional factors is challenging and rarely implemented. Identifying the appropriate scale is critical to the development of management actions that can attenuate or reverse population declines. We describe a novel example of a monitoring framework for estimating annual rates of population change for greater sage-grouse (Centrocercus urophasianus) within a hierarchical and spatially nested structure. Specifically, we conducted Bayesian analyses on a 17-year dataset (2000–2016) of lek counts in Nevada and northeastern California to estimate annual rates of population change, and compared trends across nested spatial scales. We identified leks and larger scale populations in immediate need of management, based on the occurrence of two criteria: (1) crossing of a destabilizing threshold designed to identify significant rates of population decline at a particular nested scale; and (2) crossing of decoupling thresholds designed to identify rates of population decline at smaller scales that decouple from rates of population change at a larger spatial scale. This approach establishes how declines affected by local disturbances can be separated from those operating at larger scales (for example, broad-scale wildfire and region-wide drought). Given the threshold output from our analysis, this adaptive management framework can be implemented readily and annually to facilitate responsive and effective actions for sage-grouse populations in the Great Basin. The rules of the framework can also be modified to identify populations responding positively to management action or demonstrating strong resilience to disturbance. Similar hierarchical approaches might be beneficial

  2. Evaluation of Large-scale Public Sector Reforms

    DEFF Research Database (Denmark)

    Breidahl, Karen Nielsen; Gjelstrup, Gunnar; Hansen, Hanne Foss

    2017-01-01

    and more delimited policy areas take place. In our analysis we apply four governance perspectives (rational-instrumental, rational-interest based, institutional-cultural and a chaos perspective) in a comparative analysis of the evaluations of two large-scale public sector reforms in Denmark and Norway. We...

  3. Highly Scalable Trip Grouping for Large Scale Collective Transportation Systems

    DEFF Research Database (Denmark)

    Gidofalvi, Gyozo; Pedersen, Torben Bach; Risch, Tore

    2008-01-01

    Transportation-related problems, like road congestion, parking, and pollution, are increasing in most cities. In order to reduce traffic, recent work has proposed methods for vehicle sharing, for example for sharing cabs by grouping "closeby" cab requests and thus minimizing transportation cost...... and utilizing cab space. However, the methods published so far do not scale to large data volumes, which is necessary to facilitate large-scale collective transportation systems, e.g., ride-sharing systems for large cities. This paper presents highly scalable trip grouping algorithms, which generalize previous...

  4. Penalized Estimation in Large-Scale Generalized Linear Array Models

    DEFF Research Database (Denmark)

    Lund, Adam; Vincent, Martin; Hansen, Niels Richard

    2017-01-01

    Large-scale generalized linear array models (GLAMs) can be challenging to fit. Computation and storage of its tensor product design matrix can be impossible due to time and memory constraints, and previously considered design matrix free algorithms do not scale well with the dimension...
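
    The key idea behind design-matrix-free computation in array models is that the tensor-product design matrix never needs to be formed, because (A ⊗ B) vec(X) = vec(B X A^T). The sketch below checks this identity numerically for a two-dimensional array model with invented marginal design matrices; it only illustrates the memory-saving trick, not the penalized fitting algorithm.

      import numpy as np

      rng = np.random.default_rng(7)
      A = rng.standard_normal((50, 10))     # marginal design matrix, dimension 1
      B = rng.standard_normal((40, 8))      # marginal design matrix, dimension 2
      Theta = rng.standard_normal((8, 10))  # coefficient array (dim-2 basis x dim-1 basis)

      # Naive linear predictor: materialize the full (50*40) x (10*8) Kronecker design matrix.
      eta_naive = np.kron(A, B) @ Theta.reshape(-1, order="F")   # vec(Theta), column-major

      # Matrix-free equivalent: (A kron B) vec(Theta) = vec(B @ Theta @ A.T).
      eta_arraywise = (B @ Theta @ A.T).reshape(-1, order="F")

      print("max abs difference:", np.abs(eta_naive - eta_arraywise).max())   # ~1e-13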

  5. Large-scale coastal impact induced by a catastrophic storm

    DEFF Research Database (Denmark)

    Fruergaard, Mikkel; Andersen, Thorbjørn Joest; Johannessen, Peter N

    breaching. Our results demonstrate that violent, millennial-scale storms can trigger significant large-scale and long-term changes on barrier coasts, and that coastal changes assumed to take place over centuries or even millennia may occur in association with a single extreme storm event....

  6. Large-eddy simulation with accurate implicit subgrid-scale diffusion

    NARCIS (Netherlands)

    B. Koren (Barry); C. Beets

    1996-01-01

    A method for large-eddy simulation is presented that does not use an explicit subgrid-scale diffusion term. Subgrid-scale effects are modelled implicitly through an appropriate monotone (in the sense of Spekreijse 1987) discretization method for the advective terms. Special attention is

  7. Challenges for Large Scale Structure Theory

    CERN Multimedia

    CERN. Geneva

    2018-01-01

    I will describe some of the outstanding questions in Cosmology where answers could be provided by observations of the Large Scale Structure of the Universe at late times. I will discuss some of the theoretical challenges which will have to be overcome to extract this information from the observations. I will describe some of the theoretical tools that might be useful to achieve this goal.

  8. Broad-scale Population Genetics of the Host Sea Anemone, Heteractis magnifica

    KAUST Repository

    Emms, Madeleine

    2015-12-01

    Broad-scale population genetics can reveal population structure across an organism's entire range, which can enable us to determine the most efficient population-wide management strategy depending on levels of connectivity. Genetic variation and differences in genetic diversity on small scales have been reported in anemones, but nothing is known about their broad-scale population structure, including that of "host" anemone species, which are increasingly being targeted in the aquarium trade. In this study, microsatellite markers were used as a tool to determine the population structure of a sessile, host anemone species, Heteractis magnifica, across the Indo-Pacific region. In addition, two rDNA markers were used to identify Symbiodinium from the samples, and phylogenetic analyses were used to measure diversity and geographic distribution of Symbiodinium across the region. Significant population structure was identified in H. magnifica across the Indo-Pacific, with at least three genetic breaks, possibly the result of factors such as geographic distance, geographic isolation and environmental variation. Symbiodinium associations were also affected by environmental variation and supported the geographic isolation of some regions. These results suggest that management of H. magnifica must be implemented on a local scale, due to the lack of connectivity between clusters. This study also provides further evidence for the combined effects of geographic distance and environmental distance in explaining genetic variance.

  9. Large Scale Investments in Infrastructure : Competing Policy regimes to Control Connections

    NARCIS (Netherlands)

    Otsuki, K.; Read, M.L.; Zoomers, E.B.

    2016-01-01

    This paper proposes to analyse implications of large-scale investments in physical infrastructure for social and environmental justice. While case studies on the global land rush and climate change have advanced our understanding of how large-scale investments in land, forests and water affect

  10. Rotation invariant fast features for large-scale recognition

    Science.gov (United States)

    Takacs, Gabriel; Chandrasekhar, Vijay; Tsai, Sam; Chen, David; Grzeszczuk, Radek; Girod, Bernd

    2012-10-01

    We present an end-to-end feature description pipeline which uses a novel interest point detector and Rotation-Invariant Fast Feature (RIFF) descriptors. The proposed RIFF algorithm is 15× faster than SURF while producing large-scale retrieval results that are comparable to SIFT. Such high-speed features benefit a range of applications from Mobile Augmented Reality (MAR) to web-scale image retrieval and analysis.

  11. Large-scale bioenergy production: how to resolve sustainability trade-offs?

    Science.gov (United States)

    Humpenöder, Florian; Popp, Alexander; Bodirsky, Benjamin Leon; Weindl, Isabelle; Biewald, Anne; Lotze-Campen, Hermann; Dietrich, Jan Philipp; Klein, David; Kreidenweis, Ulrich; Müller, Christoph; Rolinski, Susanne; Stevanovic, Miodrag

    2018-02-01

    Large-scale 2nd generation bioenergy deployment is a key element of 1.5 °C and 2 °C transformation pathways. However, large-scale bioenergy production might have negative sustainability implications and thus may conflict with the Sustainable Development Goal (SDG) agenda. Here, we carry out a multi-criteria sustainability assessment of large-scale bioenergy crop production throughout the 21st century (300 EJ in 2100) using a global land-use model. Our analysis indicates that large-scale bioenergy production without complementary measures results in negative effects on the following sustainability indicators: deforestation, CO2 emissions from land-use change, nitrogen losses, unsustainable water withdrawals and food prices. One of our main findings is that single-sector environmental protection measures next to large-scale bioenergy production are prone to involve trade-offs among these sustainability indicators—at least in the absence of more efficient land or water resource use. For instance, if bioenergy production is accompanied by forest protection, deforestation and associated emissions (SDGs 13 and 15) decline substantially whereas food prices (SDG 2) increase. However, our study also shows that this trade-off strongly depends on the development of future food demand. In contrast to environmental protection measures, we find that agricultural intensification lowers some side-effects of bioenergy production substantially (SDGs 13 and 15) without generating new trade-offs—at least among the sustainability indicators considered here. Moreover, our results indicate that a combination of forest and water protection schemes, improved fertilization efficiency, and agricultural intensification would reduce the side-effects of bioenergy production most comprehensively. However, although our study includes more sustainability indicators than previous studies on bioenergy side-effects, our study represents only a small subset of all indicators relevant for the

  12. Large-scale structure in the universe: Theory vs observations

    International Nuclear Information System (INIS)

    Kashlinsky, A.; Jones, B.J.T.

    1990-01-01

    A variety of observations constrain models of the origin of large scale cosmic structures. We review here the elements of current theories and comment in detail on which of the current observational data provide the principal constraints. We point out that enough observational data have accumulated to constrain (and perhaps determine) the power spectrum of primordial density fluctuations over a very large range of scales. We discuss the theories in the light of observational data and focus on the potential of future observations in providing even (and ever) tighter constraints. (orig.)

  13. Evaluation of drought propagation in an ensemble mean of large-scale hydrological models

    Directory of Open Access Journals (Sweden)

    A. F. Van Loon

    2012-11-01

    Hydrological drought is increasingly studied using large-scale models. It is, however, not certain whether large-scale models reproduce the development of hydrological drought correctly. The pressing question is how well do large-scale models simulate the propagation from meteorological to hydrological drought? To answer this question, we evaluated the simulation of drought propagation in an ensemble mean of ten large-scale models, both land-surface models and global hydrological models, that participated in the model intercomparison project of WATCH (WaterMIP). For a selection of case study areas, we studied drought characteristics (number of droughts, duration, severity), drought propagation features (pooling, attenuation, lag, lengthening), and hydrological drought typology (classical rainfall deficit drought, rain-to-snow-season drought, wet-to-dry-season drought, cold snow season drought, warm snow season drought, composite drought).

    Drought characteristics simulated by large-scale models clearly reflected drought propagation; i.e. drought events became fewer and longer when moving through the hydrological cycle. However, more differentiation was expected between fast and slowly responding systems, with slowly responding systems having fewer and longer droughts in runoff than fast responding systems. This was not found using large-scale models. Drought propagation features were poorly reproduced by the large-scale models, because runoff reacted immediately to precipitation, in all case study areas. This fast reaction to precipitation, even in cold climates in winter and in semi-arid climates in summer, also greatly influenced the hydrological drought typology as identified by the large-scale models. In general, the large-scale models had the correct representation of drought types, but the percentages of occurrence had some important mismatches, e.g. an overestimation of classical rainfall deficit droughts, and an

  14. Finite-time and finite-size scalings in the evaluation of large-deviation functions: Numerical approach in continuous time.

    Science.gov (United States)

    Guevara Hidalgo, Esteban; Nemoto, Takahiro; Lecomte, Vivien

    2017-06-01

    Rare trajectories of stochastic systems are important to understand because of their potential impact. However, their properties are by definition difficult to sample directly. Population dynamics provides a numerical tool allowing their study, by means of simulating a large number of copies of the system, which are subjected to selection rules that favor the rare trajectories of interest. Such algorithms are plagued by finite simulation time and finite population size, effects that can render their use delicate. In this paper, we present a numerical approach which uses the finite-time and finite-size scalings of estimators of the large deviation functions associated with the distribution of rare trajectories. The method we propose allows one to extract the infinite-time and infinite-size limit of these estimators, which, as shown on the contact process, provides a significant improvement of the large deviation function estimators compared to the standard one.
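
    The extrapolation idea can be illustrated by fitting estimator values obtained at several simulation times T and population sizes N to the form psi(T, N) ≈ psi_inf + a/T + b/N and reading off the intercept as the infinite-time, infinite-size estimate. The data below are synthetic stand-ins for population-dynamics (cloning) output, and the 1/T, 1/N form is an assumed leading-order scaling.

      import numpy as np

      rng = np.random.default_rng(8)

      # Synthetic estimates of a scaled cumulant generating function psi at several (T, N).
      psi_inf_true, a_true, b_true = -0.35, 2.0, 15.0
      T_vals = np.array([50.0, 100.0, 200.0, 400.0, 800.0])
      N_vals = np.array([100.0, 200.0, 400.0, 800.0])
      T, N = np.meshgrid(T_vals, N_vals)
      psi_est = psi_inf_true + a_true / T + b_true / N + 0.002 * rng.standard_normal(T.shape)

      # Least-squares fit of psi(T, N) = psi_inf + a/T + b/N.
      X = np.column_stack([np.ones(T.size), 1.0 / T.ravel(), 1.0 / N.ravel()])
      coef, *_ = np.linalg.lstsq(X, psi_est.ravel(), rcond=None)
      psi_inf_fit, a_fit, b_fit = coef

      print("naive estimate at the largest (T, N):", round(float(psi_est[-1, -1]), 4))
      print("extrapolated infinite-time/size estimate:", round(float(psi_inf_fit), 4))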

  16. Large scale treatment of total petroleum-hydrocarbon contaminated groundwater using bioaugmentation.

    Science.gov (United States)

    Poi, Gregory; Shahsavari, Esmaeil; Aburto-Medina, Arturo; Mok, Puah Chum; Ball, Andrew S

    2018-05-15

    Bioaugmentation or the addition of microbes to contaminated sites has been widely used to treat contaminated soil or water; however this approach is often limited to laboratory based studies. In the present study, large scale bioaugmentation has been applied to total petroleum hydrocarbons (TPH)-contaminated groundwater at a petroleum facility. Initial TPH concentrations of 1564 mg L-1 in the field were reduced to 89 mg L-1 over 32 days. This reduction was accompanied by improved ecotoxicity, as shown by Brassica rapa germination numbers that increased from 52 at day 0 to 82% by the end of the treatment. Metagenomic analysis indicated that there was a shift in the microbial community when compared to the beginning of the treatment. The microbial community was dominated by Proteobacteria and Bacteroidetes from day 0 to day 32, although differences at the genus level were observed. The predominant genera at the beginning of the treatment (day 0 just after inoculation) were Cloacibacterium, Sediminibacterium and Brevundimonas while at the end of the treatment members of Flavobacterium dominated, reaching almost half the population (41%), followed by Pseudomonas (6%) and Limnobacter (5.8%). To the author's knowledge, this is among the first studies to report the successful large scale biodegradation of TPH-contaminated groundwater (18,000 L per treatment session) at an offshore petrochemical facility. Copyright © 2018 Elsevier Ltd. All rights reserved.

  17. Development of electric road vehicles in France. Political measures, large-scale tests, and strategy of PSA Peugeot Citroen

    International Nuclear Information System (INIS)

    Beau, J.C.

    1993-01-01

    France offers particularly favourable conditions for the further development and market introduction of electric vehicles: electricity production with almost no exhaust emissions, a concentrated population structure rooted historically in densely populated towns, innovative electrochemical and electrotechnical industries, and not least the automotive industry itself. The article is structured as follows: A) Political measures and large-scale experiments in France; B) Strategy of PSA Peugeot Citroen; C) Activities by Peugeot in Germany. (orig.) [de]

  18. Multiresolution comparison of precipitation datasets for large-scale models

    Science.gov (United States)

    Chun, K. P.; Sapriza Azuri, G.; Davison, B.; DeBeer, C. M.; Wheater, H. S.

    2014-12-01

    Gridded precipitation datasets are crucial for driving large-scale models used in weather forecasting and climate research. However, the quality of precipitation products is usually validated individually. Comparisons between gridded precipitation products along with ground observations provide another avenue for investigating how precipitation uncertainty affects the performance of large-scale models. In this study, using data from a set of precipitation gauges over British Columbia and Alberta, we evaluate several widely used North American gridded products, including the Canadian Gridded Precipitation Anomalies (CANGRD), the National Center for Environmental Prediction (NCEP) reanalysis, the Water and Global Change (WATCH) project, the thin-plate spline smoothing algorithm (ANUSPLIN) and the Canadian Precipitation Analysis (CaPA). Based on verification criteria for various temporal and spatial scales, the results provide an assessment of possible applications for the various precipitation datasets. For long-term climate variation studies (~100 years), CANGRD, NCEP, WATCH and ANUSPLIN have different comparative advantages in terms of their resolution and accuracy. For synoptic and mesoscale precipitation patterns, CaPA provides appealing spatial coherence. In addition to the products comparison, various downscaling methods are also surveyed to explore new verification and bias-reduction methods for improving gridded precipitation outputs for large-scale models.

  19. Toward Instructional Leadership: Principals' Perceptions of Large-Scale Assessment in Schools

    Science.gov (United States)

    Prytula, Michelle; Noonan, Brian; Hellsten, Laurie

    2013-01-01

    This paper describes a study of the perceptions that Saskatchewan school principals have regarding large-scale assessment reform and their perceptions of how assessment reform has affected their roles as principals. The findings revealed that large-scale assessments, especially provincial assessments, have affected the principal in Saskatchewan…

  20. A large scale field experiment in the Amazon basin (LAMBADA/BATERISTA)

    NARCIS (Netherlands)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C.

    1995-01-01

    A description is given of a large-scale field experiment planned in the Amazon basin, aimed at assessing the large-scale balances of energy, water and carbon dioxide. The embedding of this experiment in global change programmes is described, viz. the Biospheric Aspects of the Hydrological Cycle

  1. Large-scale derived flood frequency analysis based on continuous simulation

    Science.gov (United States)

    Dung Nguyen, Viet; Hundecha, Yeshewatesfa; Guse, Björn; Vorogushyn, Sergiy; Merz, Bruno

    2016-04-01

    There is an increasing need for spatially consistent flood risk assessments at the regional scale (several 100,000 km2), in particular in the insurance industry and for national risk reduction strategies. However, most large-scale flood risk assessments are composed of smaller-scale assessments and show spatial inconsistencies. To overcome this deficit, a large-scale flood model composed of a weather generator and catchment models was developed, reflecting the spatially inherent heterogeneity. The weather generator is a multisite and multivariate stochastic model capable of generating synthetic meteorological fields (precipitation, temperature, etc.) at daily resolution for the regional scale. These fields respect the observed autocorrelation, spatial correlation and covariance between the variables. They are used as input into catchment models. A long-term simulation of this combined system enables very long discharge series to be derived at many catchment locations, serving as a basis for spatially consistent flood risk estimates at the regional scale. This combined model was set up and validated for major river catchments in Germany. The weather generator was trained on 53 years of observation data at 528 stations covering not only Germany but also parts of France, Switzerland, the Czech Republic and Austria, with an aggregated spatial scale of 443,931 km2. 10,000 years of daily meteorological fields for the study area were generated. Likewise, rainfall-runoff simulations with SWIM were performed for the entire Elbe, Rhine, Weser, Donau and Ems catchments. The validation results illustrate a good performance of the combined system, as the simulated flood magnitudes and frequencies agree well with the observed flood data. Based on continuous simulation this model chain is then used to estimate flood quantiles for the whole of Germany, including upstream headwater catchments in neighbouring countries. This continuous large scale approach overcomes the several
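
    Once a long synthetic discharge series is available, flood quantiles follow directly from the simulated annual maxima. The sketch below estimates return levels from 10,000 years of annual maxima using empirical (Weibull) plotting positions; the Gumbel-distributed maxima are placeholders for model output.

      import numpy as np

      def empirical_return_levels(annual_maxima, return_periods):
          """Return levels from ranked annual maxima using Weibull plotting positions."""
          x = np.sort(annual_maxima)
          n = x.size
          T_emp = (n + 1) / (n + 1 - np.arange(1, n + 1))   # return period of each ranked value
          return np.interp(return_periods, T_emp, x)

      # Stand-in for 10,000 years of simulated annual maximum discharge (m3/s).
      rng = np.random.default_rng(9)
      annual_maxima = rng.gumbel(loc=800.0, scale=250.0, size=10_000)

      for T in (10, 100, 1000):
          q = empirical_return_levels(annual_maxima, T)
          print(f"{T:>5}-year flood: {q:7.0f} m3/s")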

  2. Distributed parallel cooperative coevolutionary multi-objective large-scale immune algorithm for deployment of wireless sensor networks

    DEFF Research Database (Denmark)

    Cao, Bin; Zhao, Jianwei; Yang, Po

    2018-01-01

    Using immune algorithms is generally a time-intensive process, especially for problems with a large number of variables. In this paper, we propose a distributed parallel cooperative coevolutionary multi-objective large-scale immune algorithm that is implemented using the message passing interface (MPI). The proposed algorithm is composed of three layers: objective, group and individual layers. First, for each objective in the multi-objective problem to be addressed, a subpopulation is used for optimization, and an archive population is used to optimize all the objectives. Second, the large... Compared with state-of-the-art multi-objective evolutionary algorithms, the Cooperative Coevolutionary Generalized Differential Evolution 3, the Cooperative Multi-objective Differential Evolution and the Nondominated Sorting Genetic Algorithm III, the proposed algorithm addresses the deployment optimization problem efficiently and effectively.
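
    A much-reduced sketch of the objective-layer decomposition is given below using mpi4py: each MPI rank evolves its own subpopulation against one objective of a toy bi-objective problem, and the ranks periodically exchange candidates to maintain a joint nondominated archive. This only illustrates the decomposition pattern; the test problem, mutation scheme and archive handling are invented and far simpler than the proposed algorithm.

      # Run with, e.g.:  mpiexec -n 2 python coevolve_sketch.py
      import numpy as np
      from mpi4py import MPI

      def objectives(x):
          """Toy bi-objective problem (Schaffer-like, on a vector decision variable)."""
          return (float(np.sum(x ** 2)), float(np.sum((x - 2.0) ** 2)))

      def dominates(a, b):
          return all(ai <= bi for ai, bi in zip(a, b)) and any(ai < bi for ai, bi in zip(a, b))

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      rng = np.random.default_rng(rank)

      my_obj = rank % 2                         # each rank specializes in one objective
      pop = rng.uniform(-5, 5, size=(30, 4))    # this rank's subpopulation

      for generation in range(200):
          # Mutate and keep the child when it improves this rank's objective.
          children = pop + 0.3 * rng.standard_normal(pop.shape)
          better = np.array([objectives(c)[my_obj] < objectives(p)[my_obj]
                             for c, p in zip(children, pop)])
          pop[better] = children[better]

          if generation % 50 == 0:              # exchange candidates, update the archive
              gathered = comm.allgather([tuple(ind) for ind in pop[:5]])
              candidates = [np.array(ind) for part in gathered for ind in part]
              archive = [c for c in candidates
                         if not any(dominates(objectives(o), objectives(c))
                                    for o in candidates if o is not c)]

      if rank == 0:
          print("archive size:", len(archive), "example point:", archive[0].round(2))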

  3. GAIA: A WINDOW TO LARGE-SCALE MOTIONS

    Energy Technology Data Exchange (ETDEWEB)

    Nusser, Adi [Physics Department and the Asher Space Science Institute-Technion, Haifa 32000 (Israel); Branchini, Enzo [Department of Physics, Universita Roma Tre, Via della Vasca Navale 84, 00146 Rome (Italy); Davis, Marc, E-mail: adi@physics.technion.ac.il, E-mail: branchin@fis.uniroma3.it, E-mail: mdavis@berkeley.edu [Departments of Astronomy and Physics, University of California, Berkeley, CA 94720 (United States)

    2012-08-10

    Using redshifts as a proxy for galaxy distances, estimates of the two-dimensional (2D) transverse peculiar velocities of distant galaxies could be obtained from future measurements of proper motions. We provide the mathematical framework for analyzing 2D transverse motions and show that they offer several advantages over traditional probes of large-scale motions. They are completely independent of any intrinsic relations between galaxy properties; hence, they are essentially free of selection biases. They are free from homogeneous and inhomogeneous Malmquist biases that typically plague distance indicator catalogs. They provide additional information to traditional probes that yield line-of-sight peculiar velocities only. Further, because of their 2D nature, fundamental questions regarding vorticity of large-scale flows can be addressed. Gaia, for example, is expected to provide proper motions of at least bright galaxies with high central surface brightness, making proper motions a likely contender for traditional probes based on current and future distance indicator measurements.

  4. Large-scale hydrogen production using nuclear reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ryland, D.; Stolberg, L.; Kettner, A.; Gnanapragasam, N.; Suppiah, S. [Atomic Energy of Canada Limited, Chalk River, ON (Canada)

    2014-07-01

    For many years, Atomic Energy of Canada Limited (AECL) has been studying the feasibility of using nuclear reactors, such as the Supercritical Water-cooled Reactor, as an energy source for large scale hydrogen production processes such as High Temperature Steam Electrolysis and the Copper-Chlorine thermochemical cycle. Recent progress includes the augmentation of AECL's experimental capabilities by the construction of experimental systems to test high temperature steam electrolysis button cells at ambient pressure and temperatures up to 850 °C and CuCl/HCl electrolysis cells at pressures up to 7 bar and temperatures up to 100 °C. In parallel, detailed models of solid oxide electrolysis cells and the CuCl/HCl electrolysis cell are being refined and validated using experimental data. Process models are also under development to assess options for economic integration of these hydrogen production processes with nuclear reactors. Options for large-scale energy storage, including hydrogen storage, are also under study. (author)

  5. Planck intermediate results XLII. Large-scale Galactic magnetic fields

    DEFF Research Database (Denmark)

    Adam, R.; Ade, P. A. R.; Alves, M. I. R.

    2016-01-01

    Recent models for the large-scale Galactic magnetic fields in the literature have been largely constrained by synchrotron emission and Faraday rotation measures. We use three different but representative models to compare their predicted polarized synchrotron and dust emission with that measured ...

  6. A Topology Visualization Early Warning Distribution Algorithm for Large-Scale Network Security Incidents

    Directory of Open Access Journals (Sweden)

    Hui He

    2013-01-01

    It is of great significance to research early warning systems for large-scale network security incidents. Such a system can improve the network's emergency response capabilities, alleviate the damage of cyber attacks, and strengthen the system's counterattack ability. A comprehensive early warning system is presented in this paper, which combines active measurement and anomaly detection. The key visualization algorithm and technology of the system are mainly discussed. Plane visualization of the large-scale network system is realized based on a divide-and-conquer approach. First, the topology of the large-scale network is divided into several small-scale networks by the MLkP/CR algorithm. Second, a subgraph plane visualization algorithm is applied to each small-scale network. Finally, the small-scale networks' topologies are combined into a single topology based on an automatic distribution algorithm using force analysis. As the algorithm transforms the large-scale network topology plane visualization problem into a series of small-scale network topology plane visualization and distribution problems, it has higher parallelism and is able to handle the display of ultra-large-scale network topologies.
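
    The divide-and-conquer layout strategy can be prototyped with networkx as below: partition the topology (greedy modularity communities are used here as a stand-in for the MLkP/CR partitioning), lay out each small-scale subgraph independently with a force-directed method, and place the partial layouts on a coarse grid as a simple substitute for the force-analysis distribution step.

      import math
      import networkx as nx
      from networkx.algorithms import community

      def divide_and_conquer_layout(G, scale=1.0, spacing=4.0):
          """Partition G, lay out each part separately, then offset the parts on a grid."""
          parts = list(community.greedy_modularity_communities(G))
          cols = math.ceil(math.sqrt(len(parts)))
          pos = {}
          for i, nodes in enumerate(parts):
              sub = G.subgraph(nodes)
              sub_pos = nx.spring_layout(sub, scale=scale, seed=42)   # local force-directed layout
              dx, dy = spacing * (i % cols), spacing * (i // cols)    # grid offset for this part
              for node, (x, y) in sub_pos.items():
                  pos[node] = (x + dx, y + dy)
          return pos

      # Example: a synthetic topology with community structure (12 clusters of 20 nodes).
      G = nx.relaxed_caveman_graph(12, 20, 0.05, seed=1)
      pos = divide_and_conquer_layout(G)
      print("placed", len(pos), "of", G.number_of_nodes(), "nodes")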

  7. No Large Scale Curvature Perturbations during Waterfall of Hybrid Inflation

    OpenAIRE

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2010-01-01

    In this paper the possibility of generating large scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical back-reactions to terminate inflation. If one considers only the clas...

  8. Large Scale Emerging Properties from Non Hamiltonian Complex Systems

    Directory of Open Access Journals (Sweden)

    Marco Bianucci

    2017-06-01

    The concept of "large scale" depends obviously on the phenomenon we are interested in. For example, in the field of foundations of Thermodynamics from microscopic dynamics, the relevant large spatial and time scales are of the order of fractions of millimetres and microseconds, respectively, or less, and are defined in relation to the spatial and time scales of the microscopic systems. In large scale oceanography or global climate dynamics problems the scales of interest are of the order of thousands of kilometres in space and many years in time, and are compared to the local and daily/monthly scales of atmosphere and ocean dynamics. In all these cases a Zwanzig projection approach is, at least in principle, an effective tool to obtain classes of universal smooth "large scale" dynamics for the few degrees of freedom of interest, starting from the complex dynamics of the whole (usually many degrees of freedom) system. The projection approach leads to a very complex calculus with differential operators, which is drastically simplified when the basic dynamics of the system of interest is Hamiltonian, as happens in foundations of Thermodynamics problems. However, in geophysical fluid dynamics, biology, and most physical problems the fundamental building-block equations of motion have a non-Hamiltonian structure. Thus, to continue to apply the useful projection approach also in these cases, we exploit the generalization of the Hamiltonian formalism given by the Lie algebra of dissipative differential operators. In this way, we are able to analytically deal with the series of differential operators stemming from the projection approach applied to these general cases. We then apply this formalism to obtain some relevant results concerning the statistical properties of the El Niño Southern Oscillation (ENSO).

  9. Fine-Scale Population Estimation by 3D Reconstruction of Urban Residential Buildings

    Science.gov (United States)

    Wang, Shixin; Tian, Ye; Zhou, Yi; Liu, Wenliang; Lin, Chenxi

    2016-01-01

    Fine-scale population estimation is essential in emergency response and epidemiological applications as well as urban planning and management. However, representing populations in heterogeneous urban regions with a finer resolution is a challenge. This study aims to obtain fine-scale population distribution based on 3D reconstruction of urban residential buildings with morphological operations using optical high-resolution (HR) images from the Chinese No. 3 Resources Satellite (ZY-3). Specifically, the research area was first divided into three categories when dasymetric mapping was taken into consideration. The results demonstrate that the morphological building index (MBI) yielded better results than built-up presence index (PanTex) in building detection, and the morphological shadow index (MSI) outperformed color invariant indices (CIIT) in shadow extraction and height retrieval. Building extraction and height retrieval were then combined to reconstruct 3D models and to estimate population. Final results show that this approach is effective in fine-scale population estimation, with a mean relative error of 16.46% and an overall Relative Total Absolute Error (RATE) of 0.158. This study gives significant insights into fine-scale population estimation in complicated urban landscapes, when detailed 3D information of buildings is unavailable. PMID:27775670
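
    The dasymetric step described here, distributing a census unit's population over its residential buildings, can be sketched as a volume-weighted split (footprint area times retrieved height). The buildings and counts below are hypothetical placeholders, not values from the study.

      def allocate_population(unit_population, buildings):
          """Distribute a census unit's population over its residential buildings
          in proportion to building volume = footprint_area * height."""
          volumes = [b["footprint_m2"] * b["height_m"] for b in buildings]
          total = sum(volumes)
          return [unit_population * v / total for v in volumes]

      # Hypothetical buildings extracted from imagery (footprints from MBI-style detection,
      # heights from shadow length as in MSI-style retrieval).
      buildings = [
          {"id": "A", "footprint_m2": 400, "height_m": 9},    # 3-storey block
          {"id": "B", "footprint_m2": 250, "height_m": 30},   # 10-storey tower
          {"id": "C", "footprint_m2": 600, "height_m": 6},    # low-rise row
      ]

      for b, pop in zip(buildings, allocate_population(1200, buildings)):
          print(f"building {b['id']}: ~{pop:.0f} residents")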

  10. A study of rotor and platform design trade-offs for large-scale floating vertical axis wind turbines

    Science.gov (United States)

    Griffith, D. Todd; Paquette, Joshua; Barone, Matthew; Goupee, Andrew J.; Fowler, Matthew J.; Bull, Diana; Owens, Brian

    2016-09-01

    Vertical axis wind turbines are receiving significant attention for offshore siting. In general, offshore wind offers proximity to large population centers, a vast and more consistent wind resource, and a scale-up opportunity, to name a few beneficial characteristics. On the other hand, offshore wind suffers from a high levelized cost of energy (LCOE) and in particular high balance of system (BoS) costs owing to accessibility challenges and limited project experience. To address these challenges associated with offshore wind, Sandia National Laboratories is researching large-scale (MW class) offshore floating vertical axis wind turbines (VAWTs). The motivation for this work is that floating VAWTs are a potential transformative technology solution to reduce offshore wind LCOE in deep-water locations. This paper explores performance and cost trade-offs within the design space for floating VAWTs between the configurations for the rotor and platform.

  11. A new system of labour management in African large-scale agriculture?

    DEFF Research Database (Denmark)

    Gibbon, Peter; Riisgaard, Lone

    2014-01-01

    This paper applies a convention theory (CT) approach to the analysis of labour management systems in African large-scale farming. The reconstruction of previous analyses of high-value crop production on large-scale farms in Africa in terms of CT suggests that, since 1980–95, labour management has...

  12. Pseudoscalar-photon mixing and the large scale alignment of QSO ...

    Indian Academy of Sciences (India)

    Pseudoscalar-photon mixing and the large scale alignment of QSO optical polarizations. PANKAJ JAIN, SUKANTA PANDA and S SARALA. Physics Department, Indian Institute of Technology, Kanpur 208 016, India. Abstract. We review the observation of large scale alignment of QSO optical polarizations.

  13. On the universal character of the large scale structure of the universe

    International Nuclear Information System (INIS)

    Demianski, M.; International Center for Relativistic Astrophysics; Rome Univ.; Doroshkevich, A.G.

    1991-01-01

    We review different theories of the formation of the large scale structure of the Universe. Special emphasis is put on the theory of inertial instability. We show that for a large class of initial spectra the resulting two-point correlation functions are similar. We also discuss the adhesion theory, which uses the Burgers equation, the Navier-Stokes equation or a coagulation process. We review the Zeldovich theory of gravitational instability and discuss the internal structure of pancakes. Finally we discuss the role of the velocity potential in determining the global characteristics of large scale structures (distribution of caustics, scale of voids, etc.). In the last chapter we list the main unsolved problems and the main successes of the theory of formation of large scale structure. (orig.)

  14. The spatial scale of genetic subdivision in populations of Ifremeria nautilei, a hydrothermal-vent gastropod from the southwest Pacific

    Directory of Open Access Journals (Sweden)

    Thaler Andrew D

    2011-12-01

    Full Text Available Abstract Background Deep-sea hydrothermal vents provide patchy, ephemeral habitats for specialized communities of animals that depend on chemoautotrophic primary production. Unlike eastern Pacific hydrothermal vents, where population structure has been studied at large (thousands of kilometres and small (hundreds of meters spatial scales, population structure of western Pacific vents has received limited attention. This study addresses the scale at which genetic differentiation occurs among populations of a western Pacific vent-restricted gastropod, Ifremeria nautilei. Results We used mitochondrial and DNA microsatellite markers to infer patterns of gene flow and population subdivision. A nested sampling strategy was employed to compare genetic diversity in discrete patches of Ifremeria nautilei separated by a few meters within a single vent field to distances as great as several thousand kilometres between back-arc basins that encompass the known range of the species. No genetic subdivisions were detected among patches, mounds, or sites within Manus Basin. Although I. nautilei from Lau and North Fiji Basins (~1000 km apart also exhibited no evidence for genetic subdivision, these populations were genetically distinct from the Manus Basin population. Conclusions An unknown process that restricts contemporary gene flow isolates the Manus Basin population of Ifremeria nautilei from widespread populations that occupy the North Fiji and Lau Basins. A robust understanding of the genetic structure of hydrothermal vent populations at multiple spatial scales defines natural conservation units and can help minimize loss of genetic diversity in situations where human activities are proposed and managed.

  15. LAVA: Large scale Automated Vulnerability Addition

    Science.gov (United States)

    2016-05-23

    LAVA: Large-scale Automated Vulnerability Addition. Brendan Dolan-Gavitt, Patrick Hulin, Tim Leek, Fredrich Ulrich, Ryan Whelan (Authors listed ... released, and thus rapidly become stale. We can expect tools to have been trained to detect bugs that have been released. Given the commercial price tag ... low TCN) and dead (low liveness) program data is a powerful one for vulnerability injection. The DUAs it identifies are internal program quantities

  16. Large-Scale Transit Signal Priority Implementation

    OpenAIRE

    Lee, Kevin S.; Lozner, Bailey

    2018-01-01

    In 2016, the District Department of Transportation (DDOT) deployed Transit Signal Priority (TSP) at 195 intersections in highly urbanized areas of Washington, DC. In collaboration with a broader regional implementation, and in partnership with the Washington Metropolitan Area Transit Authority (WMATA), DDOT set out to apply a systems engineering–driven process to identify, design, test, and accept a large-scale TSP system. This presentation will highlight project successes and lessons learned.

  17. Probing cosmology with the homogeneity scale of the Universe through large scale structure surveys

    International Nuclear Information System (INIS)

    Ntelis, Pierros

    2017-01-01

    This thesis presents my contribution to the measurement of the homogeneity scale using galaxies, together with the cosmological interpretation of the results. In physics, any model is characterized by a set of principles. Most models in cosmology are based on the Cosmological Principle, which states that the universe is statistically homogeneous and isotropic on large scales. Today, this principle is considered to be true since it is respected by those cosmological models that accurately describe the observations. However, while the isotropy of the universe is now confirmed by many experiments, this is not the case for homogeneity. To study cosmic homogeneity, we propose not only to test a model but to test directly one of the postulates of modern cosmology. Since the 1998 measurements of cosmic distances using type Ia supernovae, we have known that the universe is now in a phase of accelerated expansion. This phenomenon can be explained by the addition of an unknown energy component, which is called dark energy. Since dark energy is responsible for this accelerated expansion, we can study this mysterious fluid by measuring the rate of expansion of the universe. The universe has imprinted in its matter distribution a standard ruler, the Baryon Acoustic Oscillation (BAO) scale. By measuring this scale at different times during the evolution of our universe, it is possible to measure the rate of expansion of the universe and thus characterize this dark energy. Alternatively, we can use the homogeneity scale to study this dark energy. Studying the homogeneity and the BAO scale requires the statistical study of the matter distribution of the universe on large scales, larger than tens of megaparsecs. Galaxies and quasars form in the vast overdensities of matter and are very luminous: these sources trace the distribution of matter. By measuring the emission spectra of these sources using large spectroscopic surveys, such as BOSS and eBOSS, we can measure their positions
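
    For reference, the homogeneity scale is commonly quantified with scaled counts-in-spheres and the fractal correlation dimension; the sketch below states the conventional definition (the 1% threshold is a convention used in this literature, not a derived value):

```latex
% Counts-in-spheres and fractal correlation dimension (sketch).
\mathcal{N}(<r) \propto r^{D_2(r)}, \qquad
D_2(r) \equiv \frac{\mathrm{d}\ln \mathcal{N}(<r)}{\mathrm{d}\ln r},
% Conventional homogeneity scale: the radius where D_2 reaches 3 to within 1%.
D_2(R_H) = 2.97 .
```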

  18. Large-Scale Optimization for Bayesian Inference in Complex Systems

    Energy Technology Data Exchange (ETDEWEB)

    Willcox, Karen [MIT]; Marzouk, Youssef [MIT]

    2013-11-12

    The SAGUARO (Scalable Algorithms for Groundwater Uncertainty Analysis and Robust Optimization) Project focused on the development of scalable numerical algorithms for large-scale Bayesian inversion in complex systems that capitalize on advances in large-scale simulation-based optimization and inversion methods. The project was a collaborative effort among MIT, the University of Texas at Austin, Georgia Institute of Technology, and Sandia National Laboratories. The research was directed in three complementary areas: efficient approximations of the Hessian operator, reductions in complexity of forward simulations via stochastic spectral approximations and model reduction, and employing large-scale optimization concepts to accelerate sampling. The MIT--Sandia component of the SAGUARO Project addressed the intractability of conventional sampling methods for large-scale statistical inverse problems by devising reduced-order models that are faithful to the full-order model over a wide range of parameter values; sampling then employs the reduced model rather than the full model, resulting in very large computational savings. Results indicate little effect on the computed posterior distribution. On the other hand, in the Texas--Georgia Tech component of the project, we retain the full-order model, but exploit inverse problem structure (adjoint-based gradients and partial Hessian information of the parameter-to-observation map) to implicitly extract lower dimensional information on the posterior distribution; this greatly speeds up sampling methods, so that fewer sampling points are needed. We can think of these two approaches as ``reduce then sample'' and ``sample then reduce.'' In fact, these two approaches are complementary, and can be used in conjunction with each other. Moreover, they both exploit deterministic inverse problem structure, in the form of adjoint-based gradient and Hessian information of the underlying parameter-to-observation map, to
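
    To make the "reduce then sample" strategy concrete, here is a generic sketch of Metropolis sampling in which a cheap surrogate replaces the expensive forward model inside the likelihood; the toy forward models, prior and data below are hypothetical stand-ins, not the SAGUARO codes:

```python
"""Sketch of 'reduce then sample': run Metropolis-Hastings against a cheap
surrogate of the forward model instead of the expensive full simulation."""
import numpy as np

rng = np.random.default_rng(0)

def full_forward(theta):
    # Stand-in for an expensive forward simulation (hypothetical).
    return np.array([theta[0] ** 2 + theta[1], np.sin(theta[0]) * theta[1]])

def surrogate_forward(theta):
    # Stand-in reduced-order model: a crude, cheap approximation (hypothetical).
    return np.array([theta[1], theta[0] * theta[1]])

data = np.array([1.2, 0.4])
noise_sigma = 0.1

def log_post(theta, forward):
    misfit = (forward(theta) - data) / noise_sigma
    return -0.5 * np.dot(misfit, misfit) - 0.5 * np.dot(theta, theta)  # Gaussian prior

def metropolis(forward, n_steps=5000, step=0.2):
    theta = np.zeros(2)
    lp = log_post(theta, forward)
    samples = []
    for _ in range(n_steps):
        prop = theta + step * rng.standard_normal(2)
        lp_prop = log_post(prop, forward)
        if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
            theta, lp = prop, lp_prop
        samples.append(theta.copy())
    return np.array(samples)

chain = metropolis(surrogate_forward)        # cheap chain driven by the surrogate
theta_mean = chain.mean(axis=0)
print("posterior mean (surrogate):", theta_mean)
print("full-model log-posterior at that point:", log_post(theta_mean, full_forward))
```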

  19. Response of deep and shallow tropical maritime cumuli to large-scale processes

    Science.gov (United States)

    Yanai, M.; Chu, J.-H.; Stark, T. E.; Nitta, T.

    1976-01-01

    The bulk diagnostic method of Yanai et al. (1973) and a simplified version of the spectral diagnostic method of Nitta (1975) are used for a more quantitative evaluation of the response of various types of cumuliform clouds to large-scale processes, using the same data set in the Marshall Islands area for a 100-day period in 1956. The dependence of the cloud mass flux distribution on radiative cooling, large-scale vertical motion, and evaporation from the sea is examined. It is shown that typical radiative cooling rates in the tropics tend to produce a bimodal distribution of mass spectrum exhibiting deep and shallow clouds. The bimodal distribution is further enhanced when the large-scale vertical motion is upward, and a nearly unimodal distribution of shallow clouds prevails when the relative cooling is compensated by the heating due to the large-scale subsidence. Both deep and shallow clouds are modulated by large-scale disturbances. The primary role of surface evaporation is to maintain the moisture flux at the cloud base.

  20. Accuracy assessment of planimetric large-scale map data for decision-making

    Directory of Open Access Journals (Sweden)

    Doskocz Adam

    2016-06-01

    Full Text Available This paper presents decision-making risk estimation based on planimetric large-scale map data, i.e. data sets or databases useful for creating planimetric maps at scales of 1:5,000 or larger. The studies were conducted on four sets of large-scale map data. Errors in the map data were used for a risk assessment of decisions about the localization of objects, e.g. for land-use planning in the realization of investments. An analysis was performed on a large statistical sample of shift vectors of control points, which were identified with the position errors of these points (errors of map data).
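
    As a minimal illustration of how positional accuracy can be summarized from control-point shift vectors (the coordinates and tolerance below are hypothetical; the paper's risk-estimation procedure is more elaborate):

```python
"""Summarize planimetric accuracy from control-point shift vectors
(differences between mapped and reference coordinates, in metres)."""
import math

# Hypothetical shift vectors (dx, dy) for a handful of control points.
shifts = [(0.21, -0.15), (-0.08, 0.33), (0.40, 0.05), (-0.27, -0.19), (0.12, 0.22)]

errors = [math.hypot(dx, dy) for dx, dy in shifts]          # radial position errors
rmse = math.sqrt(sum(e * e for e in errors) / len(errors))  # root-mean-square error
print(f"mean error = {sum(errors)/len(errors):.2f} m, RMSE = {rmse:.2f} m")

# A simple exceedance rate: share of points whose error exceeds a tolerance
# relevant to a siting decision (the tolerance value is hypothetical).
tolerance = 0.30
print(f"P(error > {tolerance} m) = {sum(e > tolerance for e in errors) / len(errors):.2f}")
```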

  1. Reviving large-scale projects

    International Nuclear Information System (INIS)

    Desiront, A.

    2003-01-01

    For the past decade, most large-scale hydro development projects in northern Quebec have been put on hold due to land disputes with First Nations. Hydroelectric projects have recently been revived following an agreement signed with Aboriginal communities in the province who recognized the need to find new sources of revenue for future generations. Many Cree are working on the project to harness the waters of the Eastmain River located in the middle of their territory. The work involves building an 890 foot long dam, 30 dikes enclosing a 603 square-km reservoir, a spillway, and a power house with 3 generating units with a total capacity of 480 MW of power for start-up in 2007. The project will require the use of 2,400 workers in total. The Cree Construction and Development Company is working on relations between Quebec's 14,000 Crees and the James Bay Energy Corporation, the subsidiary of Hydro-Quebec which is developing the project. Approximately 10 per cent of the $735-million project has been designated for the environmental component. Inspectors ensure that the project complies fully with environmental protection guidelines. Total development costs for Eastmain-1 are in the order of $2 billion of which $735 million will cover work on site and the remainder will cover generating units, transportation and financial charges. Under the treaty known as the Peace of the Braves, signed in February 2002, the Quebec government and Hydro-Quebec will pay the Cree $70 million annually for 50 years for the right to exploit hydro, mining and forest resources within their territory. The project comes at a time when electricity export volumes to the New England states are down due to growth in Quebec's domestic demand. Hydropower is a renewable and non-polluting source of energy that is one of the most acceptable forms of energy where the Kyoto Protocol is concerned. It was emphasized that large-scale hydro-electric projects are needed to provide sufficient energy to meet both

  2. Variability in population abundance is associated with thresholds between scaling regimes

    Science.gov (United States)

    Wardwell, D.; Allen, Craig R.

    2009-01-01

    Discontinuous structure in landscapes may result in discontinuous, aggregated species body-mass patterns, reflecting the scales of structure available to animal communities within a landscape. The edges of these body-mass aggregations reflect transitions between available scales of landscape structure. Such transitions, or scale breaks, are theoretically associated with increased biological variability. We hypothesized that variability in population abundance is greater in animal species near the edge of body-mass aggregations than it is in species that are situated in the interior of body-mass aggregations. We tested this hypothesis by examining both temporal and spatial variability in the abundance of species in the bird community of the Florida Everglades sub-ecoregion, USA. Analyses of both temporal and spatial variability in population abundance supported our hypothesis. Our results indicate that variability within complex systems may be non-random, and is heightened where transitions in scales of process and structure occur. This is the first explicit test of the hypothetical relationship between increased population variability and scale breaks. © 2009 by the author(s).

  3. Large-scale Flow and Transport of Magnetic Flux in the Solar ...

    Indian Academy of Sciences (India)


    Abstract. The horizontal large-scale velocity field describes the horizontal displacement of the photospheric magnetic flux in the zonal and meridional directions. The flow systems of solar plasma, constructed according to the velocity field, create large-scale cellular-like patterns with up-flow in the center and down-flow on the ...

  4. A Large-scale Plume in an X-class Solar Flare

    Energy Technology Data Exchange (ETDEWEB)

    Fleishman, Gregory D.; Nita, Gelu M.; Gary, Dale E. [Physics Department, Center for Solar-Terrestrial Research, New Jersey Institute of Technology, Newark, NJ 07102-1982 (United States)]

    2017-08-20

    Ever-increasing multi-frequency imaging of solar observations suggests that solar flares often involve more than one magnetic fluxtube. Some of the fluxtubes are closed, while others can contain open fields. The relative proportion of nonthermal electrons among those distinct loops is highly important for understanding energy release, particle acceleration, and transport. The access of nonthermal electrons to the open field is also important because the open field facilitates the solar energetic particle (SEP) escape from the flaring site, and thus controls the SEP fluxes in the solar system, both directly and as seed particles for further acceleration. The large-scale fluxtubes are often filled with a tenuous plasma, which is difficult to detect in either EUV or X-ray wavelengths; however, they can dominate at low radio frequencies, where a modest component of nonthermal electrons can render the source optically thick and, thus, bright enough to be observed. Here we report the detection of a large-scale “plume” at the impulsive phase of an X-class solar flare, SOL2001-08-25T16:23, using multi-frequency radio data from the Owens Valley Solar Array. To quantify the flare’s spatial structure, we employ 3D modeling utilizing force-free-field extrapolations from line-of-sight SOHO/MDI magnetograms with our modeling tool GX-Simulator. We found that a significant fraction of the nonthermal electrons accelerated at the flare site low in the corona escapes to the plume, which contains both closed and open fields. We propose that the proportion between the closed and open fields at the plume is what determines the SEP population escaping into interplanetary space.

  5. Utilization of Large Scale Surface Models for Detailed Visibility Analyses

    Science.gov (United States)

    Caha, J.; Kačmařík, M.

    2017-11-01

    This article demonstrates the utilization of large scale surface models with small spatial resolution and high accuracy, acquired from Unmanned Aerial Vehicle scanning, for visibility analyses. The importance of large scale data for visibility analyses on the local scale, where the detail of the surface model is the most defining factor, is described. The focus is not only on the classic Boolean visibility that is usually determined within GIS, but also on so-called extended viewsheds that aim to provide more information about visibility. The case study with examples of visibility analyses was performed on the river Opava, near the city of Ostrava (Czech Republic). The multiple Boolean viewshed analysis and the global horizon viewshed were calculated to determine the most prominent features and visibility barriers of the surface. Besides that, the extended viewshed showing the angle difference above the local horizon, which describes the angular height of the target area above the barrier, is shown. The case study proved that large scale models are an appropriate data source for visibility analyses at the local level. The discussion summarizes possible future applications and further development directions of visibility analyses.
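
    As a toy illustration of the Boolean visibility mentioned above, the sketch below runs a line-of-sight test on a small gridded surface model; the grid values and observer height are hypothetical and this is not the authors' workflow:

```python
"""Toy Boolean line-of-sight (viewshed) test on a small gridded surface model."""
import numpy as np

dsm = np.array([                      # hypothetical surface heights in metres
    [10, 10, 10, 10, 10],
    [10, 12, 15, 12, 10],
    [10, 12, 22, 12, 10],             # a tall obstacle in the middle
    [10, 12, 15, 12, 10],
    [10, 10, 10, 10, 10],
], dtype=float)

def visible(obs, tgt, observer_height=1.7, samples=50):
    """Return True if the target cell is visible from the observer cell."""
    (r0, c0), (r1, c1) = obs, tgt
    z0 = dsm[r0, c0] + observer_height          # eye level above the surface
    z1 = dsm[r1, c1]
    for t in np.linspace(0.0, 1.0, samples)[1:-1]:
        r, c = r0 + t * (r1 - r0), c0 + t * (c1 - c0)
        ground = dsm[int(round(r)), int(round(c))]   # nearest-cell sampling
        sight = z0 + t * (z1 - z0)                   # height of the sight line
        if ground > sight:
            return False
    return True

print(visible((0, 0), (4, 4)))   # blocked by the central obstacle -> False
print(visible((0, 0), (0, 4)))   # along the flat edge -> True
```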

  6. Sleep patterns and predictors of disturbed sleep in a large population of college students.

    Science.gov (United States)

    Lund, Hannah G; Reider, Brian D; Whiting, Annie B; Prichard, J Roxanne

    2010-02-01

    To characterize sleep patterns and predictors of poor sleep quality in a large population of college students. This study extends the 2006 National Sleep Foundation examination of sleep in early adolescence by examining sleep in older adolescents. One thousand one hundred twenty-five students aged 17 to 24 years from an urban Midwestern university completed a cross-sectional online survey about sleep habits that included the Pittsburgh Sleep Quality Index (PSQI), the Epworth Sleepiness Scale, the Horne-Ostberg Morningness-Eveningness Scale, the Profile of Mood States, the Subjective Units of Distress Scale, and questions about academic performance, physical health, and psychoactive drug use. Students reported disturbed sleep; over 60% were categorized as poor-quality sleepers by the PSQI, bedtimes and risetimes were delayed during weekends, and students reported frequently taking prescription, over the counter, and recreational psychoactive drugs to alter sleep/wakefulness. Students classified as poor-quality sleepers reported significantly more problems with physical and psychological health than did good-quality sleepers. Students overwhelmingly stated that emotional and academic stress negatively impacted sleep. Multiple regression analyses revealed that tension and stress accounted for 24% of the variance in the PSQI score, whereas exercise, alcohol and caffeine consumption, and consistency of sleep schedule were not significant predictors of sleep quality. These results demonstrate that insufficient sleep and irregular sleep-wake patterns, which have been extensively documented in younger adolescents, are also present at alarming levels in the college student population. Given the close relationships between sleep quality and physical and mental health, intervention programs for sleep disturbance in this population should be considered. Copyright 2010 Society for Adolescent Medicine. Published by Elsevier Inc. All rights reserved.
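
    To illustrate the kind of variance decomposition reported above (tension and stress explaining 24% of PSQI variance), here is a generic least-squares sketch on simulated data; the predictor names mirror the abstract but the numbers are synthetic, not the study's data:

```python
"""Sketch of a multiple regression of a sleep-quality score on several
predictors, reporting the share of variance explained (R^2)."""
import numpy as np

rng = np.random.default_rng(42)
n = 500

# Simulated predictors loosely named after those in the abstract.
tension_stress = rng.normal(0, 1, n)
caffeine = rng.normal(0, 1, n)
exercise = rng.normal(0, 1, n)
schedule_consistency = rng.normal(0, 1, n)

# Simulated outcome: mostly driven by tension/stress, plus noise.
psqi = 6 + 1.5 * tension_stress + 0.1 * caffeine + rng.normal(0, 2.5, n)

X = np.column_stack([np.ones(n), tension_stress, caffeine, exercise, schedule_consistency])
beta, *_ = np.linalg.lstsq(X, psqi, rcond=None)

fitted = X @ beta
r_squared = 1 - np.sum((psqi - fitted) ** 2) / np.sum((psqi - psqi.mean()) ** 2)
print("coefficients:", np.round(beta, 2))
print("R^2:", round(float(r_squared), 3))
```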

  7. Large-scale modeling of rain fields from a rain cell deterministic model

    Science.gov (United States)

    Féral, Laurent; Sauvageot, Henri; Castanet, Laurent; Lemorton, Joël; Cornet, Frédéric; Leconte, Katia

    2006-04-01

    A methodology to simulate two-dimensional rain rate fields at large scale (1000 × 1000 km2, the scale of a satellite telecommunication beam or a terrestrial fixed broadband wireless access network) is proposed. It relies on a rain rate field cellular decomposition. At small scale (~20 × 20 km2), the rain field is split up into its macroscopic components, the rain cells, described by the Hybrid Cell (HYCELL) cellular model. At midscale (~150 × 150 km2), the rain field results from the conglomeration of rain cells modeled by HYCELL. To account for the rain cell spatial distribution at midscale, the latter is modeled by a doubly aggregative isotropic random walk, the optimal parameterization of which is derived from radar observations at midscale. The extension of the simulation area from the midscale to the large scale (1000 × 1000 km2) requires the modeling of the weather frontal area. The latter is first modeled by a Gaussian field with anisotropic covariance function. The Gaussian field is then turned into a binary field, giving the large-scale locations over which it is raining. This transformation requires the definition of the rain occupation rate over large-scale areas. Its probability distribution is determined from observations by the French operational radar network ARAMIS. The coupling with the rain field modeling at midscale is immediate whenever the large-scale field is split up into midscale subareas. The rain field thus generated accounts for the local CDF at each point, defining a structure spatially correlated at small scale, midscale, and large scale. It is then suggested that this approach be used by system designers to evaluate diversity gain, terrestrial path attenuation, or slant path attenuation for different azimuth and elevation angle directions.
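
    A compact sketch of the final step described above, i.e. turning a spatially correlated Gaussian field into a binary raining/not-raining mask with a prescribed occupation rate; the grid size, correlation lengths and occupation rate are hypothetical, and this is not the HYCELL code:

```python
"""Generate an anisotropically correlated Gaussian field on a grid and
threshold it so that a chosen fraction of the domain is 'raining'."""
import numpy as np

rng = np.random.default_rng(1)
n = 256                          # grid cells per side (hypothetical)

# White noise smoothed in Fourier space with an anisotropic Gaussian kernel.
white = rng.standard_normal((n, n))
ky = np.fft.fftfreq(n)[:, None]
kx = np.fft.fftfreq(n)[None, :]
corr_x, corr_y = 30.0, 10.0      # correlation lengths in cells (hypothetical anisotropy)
kernel = np.exp(-2 * np.pi**2 * ((kx * corr_x) ** 2 + (ky * corr_y) ** 2))
field = np.real(np.fft.ifft2(np.fft.fft2(white) * kernel))

# Threshold at the quantile matching the desired rain occupation rate.
occupation_rate = 0.15           # fraction of the area where it rains (hypothetical)
threshold = np.quantile(field, 1 - occupation_rate)
rain_mask = field > threshold    # True over 'raining' sub-areas

print("raining fraction:", rain_mask.mean())
```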

  8. Facile Large-scale synthesis of stable CuO nanoparticles

    Science.gov (United States)

    Nazari, P.; Abdollahi-Nejand, B.; Eskandari, M.; Kohnehpoushi, S.

    2018-04-01

    In this work, a novel approach to synthesizing CuO nanoparticles is introduced. A sequential corrosion and detaching process is proposed for the growth and dispersion of CuO nanoparticles at the optimum pH value of eight. The produced CuO nanoparticles were six nm (±2 nm) in diameter and spherical in shape, with high crystallinity and uniformity in size. With this method, large-scale production of CuO nanoparticles (120 grams in an experimental batch) from Cu micro-particles was achieved, which may meet the market criteria for large-scale production of CuO nanoparticles.

  9. Large-Scale Cooperative Task Distribution on Peer-to-Peer Networks

    Science.gov (United States)

    2012-01-01

    Large-scale cooperative task distribution on peer-to-peer networks ... disadvantages of ML-Chord are its fixed size (two layers) and limited scalability for large-scale systems. RC-Chord extends ML-Chord ... (D. Karrels et al.) ... configurable before runtime. This can be improved by incorporating a distributed learning algorithm to tune the number and range of the DLoE tracking

  10. Comparative Analysis of Different Protocols to Manage Large Scale Networks

    OpenAIRE

    Anil Rao Pimplapure; Dr Jayant Dubey; Prashant Sen

    2013-01-01

    In recent years the number, complexity and size of large-scale networks have increased. The best example of a large-scale network is the Internet, and more recent ones are data centers in cloud environments. In this context, management tasks such as traffic monitoring, security and performance optimization are a major undertaking for the network administrator. This research report studies different protocols, i.e. conventional protocols like the Simple Network Management Protocol and the newer gossip-bas...

  11. Puzzles of large scale structure and gravitation

    International Nuclear Information System (INIS)

    Sidharth, B.G.

    2006-01-01

    We consider the puzzle of cosmic voids bounded by two-dimensional structures of galactic clusters, as well as a puzzle pointed out by Weinberg: How can the mass of a typical elementary particle depend on a cosmic parameter like the Hubble constant? An answer to the first puzzle is proposed in terms of 'Scaled' Quantum Mechanical-like behaviour which appears at large scales. The second puzzle can be answered by showing that the gravitational mass of an elementary particle has a Machian character (see Ahmed N. Cantorian small world, Mach's principle and the universal mass network. Chaos, Solitons and Fractals 2004;21(4))

  12. Personalized Opportunistic Computing for CMS at Large Scale

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    **Douglas Thain** is an Associate Professor of Computer Science and Engineering at the University of Notre Dame, where he designs large scale distributed computing systems to power the needs of advanced science and...

  13. Stability of large scale interconnected dynamical systems

    International Nuclear Information System (INIS)

    Akpan, E.P.

    1993-07-01

    Large scale systems modelled by a system of ordinary differential equations are considered and necessary and sufficient conditions are obtained for the uniform asymptotic connective stability of the systems using the method of cone-valued Lyapunov functions. It is shown that this model significantly improves the existing models. (author). 9 refs

  14. Large scale cross hole testing

    International Nuclear Information System (INIS)

    Ball, J.K.; Black, J.H.; Doe, T.

    1991-05-01

    As part of the Site Characterisation and Validation programme, the results of the large scale cross hole testing have been used to document hydraulic connections across the SCV block, to test conceptual models of fracture zones and to obtain hydrogeological properties of the major hydrogeological features. The SCV block is highly heterogeneous. This heterogeneity is not smoothed out even over scales of hundreds of meters. Results of the interpretation validate the hypothesis of the major fracture zones, A, B and H; not much evidence of minor fracture zones is found. The uncertainty in the flow path, through the fractured rock, causes severe problems in interpretation. Derived values of hydraulic conductivity were found to be in a narrow range of two to three orders of magnitude. Test design did not allow fracture zones to be tested individually. This could be improved by testing the high hydraulic conductivity regions specifically. The Piezomac and single hole equipment worked well. Few, if any, of the tests ran long enough to approach equilibrium. Many observation boreholes showed no response. This could either be because there is no hydraulic connection, or there is a connection but a response is not seen within the time scale of the pumping test. The fractional dimension analysis yielded credible results, and the sinusoidal testing procedure provided an effective means of identifying the dominant hydraulic connections. (10 refs.) (au)

  15. Large transverse momentum processes in a non-scaling parton model

    International Nuclear Information System (INIS)

    Stirling, W.J.

    1977-01-01

    The production of large transverse momentum mesons in hadronic collisions by the quark fusion mechanism is discussed in a parton model which gives logarithmic corrections to Bjorken scaling. It is found that the moments of the large transverse momentum structure function exhibit a simple scale breaking behaviour similar to the behaviour of the Drell-Yan and deep inelastic structure functions of the model. An estimate of corresponding experimental consequences is made and the extent to which analogous results can be expected in an asymptotically free gauge theory is discussed. A simple set of rules is presented for incorporating the logarithmic corrections to scaling into all covariant parton model calculations. (Auth.)

  16. On the Renormalization of the Effective Field Theory of Large Scale Structures

    OpenAIRE

    Pajer, Enrico; Zaldarriaga, Matias

    2013-01-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory o...

  17. Quantitative Missense Variant Effect Prediction Using Large-Scale Mutagenesis Data.

    Science.gov (United States)

    Gray, Vanessa E; Hause, Ronald J; Luebeck, Jens; Shendure, Jay; Fowler, Douglas M

    2018-01-24

    Large datasets describing the quantitative effects of mutations on protein function are becoming increasingly available. Here, we leverage these datasets to develop Envision, which predicts the magnitude of a missense variant's molecular effect. Envision combines 21,026 variant effect measurements from nine large-scale experimental mutagenesis datasets, a hitherto untapped training resource, with a supervised, stochastic gradient boosting learning algorithm. Envision outperforms other missense variant effect predictors both on large-scale mutagenesis data and on an independent test dataset comprising 2,312 TP53 variants whose effects were measured using a low-throughput approach. This dataset was never used for hyperparameter tuning or model training and thus serves as an independent validation set. Envision prediction accuracy is also more consistent across amino acids than other predictors. Finally, we demonstrate that Envision's performance improves as more large-scale mutagenesis data are incorporated. We precompute Envision predictions for every possible single amino acid variant in human, mouse, frog, zebrafish, fruit fly, worm, and yeast proteomes (https://envision.gs.washington.edu/). Copyright © 2017 Elsevier Inc. All rights reserved.
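
    As a generic sketch of the modelling approach described above, the snippet below trains a gradient-boosting regressor on variant-level features to predict a quantitative effect score; the features and labels are synthetic, not the Envision feature set or its trained model:

```python
"""Sketch: gradient boosting regression of quantitative variant effects."""
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_variants, n_features = 2000, 12          # synthetic stand-ins for variant features

X = rng.normal(size=(n_variants, n_features))
# Synthetic "measured effect": a nonlinear function of a few features plus noise.
y = np.tanh(X[:, 0]) + 0.5 * X[:, 1] * X[:, 2] + 0.1 * rng.normal(size=n_variants)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = GradientBoostingRegressor(n_estimators=300, learning_rate=0.05, max_depth=3)
model.fit(X_train, y_train)

print("held-out R^2:", round(r2_score(y_test, model.predict(X_test)), 3))
```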

  18. Methods for Large-Scale Nonlinear Optimization.

    Science.gov (United States)

    1980-05-01

    STANFORD, CALIFORNIA 94305. METHODS FOR LARGE-SCALE NONLINEAR OPTIMIZATION by Philip E. Gill, Walter Murray, Michael A. Saunders, and Margaret H. Wright ... typical iteration can be partitioned so that ... where B is an m × m basis matrix. This partition effectively divides the variables into three classes ... attention is given to the standard of the coding or the documentation. A much better way of obtaining mathematical software is from a software library

  19. Generation of large-scale vorticity in rotating stratified turbulence with inhomogeneous helicity: mean-field theory

    Science.gov (United States)

    Kleeorin, N.

    2018-06-01

    We discuss a mean-field theory of the generation of large-scale vorticity in a rotating density stratified developed turbulence with inhomogeneous kinetic helicity. We show that the large-scale non-uniform flow is produced due to either a combined action of a density stratified rotating turbulence and uniform kinetic helicity or a combined effect of a rotating incompressible turbulence and inhomogeneous kinetic helicity. These effects result in the formation of a large-scale shear, and in turn its interaction with the small-scale turbulence causes an excitation of the large-scale instability (known as a vorticity dynamo) due to a combined effect of the large-scale shear and Reynolds stress-induced generation of the mean vorticity. The latter is due to the effect of large-scale shear on the Reynolds stress. A fast rotation suppresses this large-scale instability.

  20. Recent Advances in Understanding Large Scale Vapour Explosions

    International Nuclear Information System (INIS)

    Board, S.J.; Hall, R.W.

    1976-01-01

    In foundries, violent explosions occur occasionally when molten metal comes into contact with water. If similar explosions can occur with other materials, hazardous situations may arise for example in LNG marine transportation accidents, or in liquid cooled reactor incidents when molten UO2 contacts water or sodium coolant. Over the last 10 years a large body of experimental data has been obtained on the behaviour of small quantities of hot material in contact with a vaporisable coolant. Such experiments generally give low energy yields, despite producing fine fragmentation of the molten material. These events have been interpreted in terms of a wide range of phenomena such as violent boiling, liquid entrainment, bubble collapse, superheat, surface cracking and many others. Many of these studies have been aimed at understanding the small scale behaviour of the particular materials of interest. However, understanding the nature of the energetic events which were the original cause for concern may also be necessary to give confidence that violent events cannot occur for these materials in large scale situations. More recently, there has been a trend towards larger experiments and some of these have produced explosions of moderately high efficiency. Although occurrence of such large scale explosions can depend rather critically on initial conditions in a way which is not fully understood, there are signs that the interpretation of these events may be more straightforward than that of the single drop experiments. In the last two years several theoretical models for large scale explosions have appeared which attempt a self contained explanation of at least some stages of such high yield events: these have as their common feature a description of how a propagating breakdown of an initially quasi-stable distribution of materials is induced by the pressure and flow field caused by the energy release in adjacent regions. These models have led to the idea that for a full

  1. Robust large-scale parallel nonlinear solvers for simulations.

    Energy Technology Data Exchange (ETDEWEB)

    Bader, Brett William; Pawlowski, Roger Patrick; Kolda, Tamara Gibson (Sandia National Laboratories, Livermore, CA)

    2005-11-01

    This report documents research to develop robust and efficient solution techniques for solving large-scale systems of nonlinear equations. The most widely used method for solving systems of nonlinear equations is Newton's method. While much research has been devoted to augmenting Newton-based solvers (usually with globalization techniques), little has been devoted to exploring the application of different models. Our research has been directed at evaluating techniques using different models than Newton's method: a lower order model, Broyden's method, and a higher order model, the tensor method. We have developed large-scale versions of each of these models and have demonstrated their use in important applications at Sandia. Broyden's method replaces the Jacobian with an approximation, allowing codes that cannot evaluate a Jacobian or have an inaccurate Jacobian to converge to a solution. Limited-memory methods, which have been successful in optimization, allow us to extend this approach to large-scale problems. We compare the robustness and efficiency of Newton's method, modified Newton's method, Jacobian-free Newton-Krylov method, and our limited-memory Broyden method. Comparisons are carried out for large-scale applications of fluid flow simulations and electronic circuit simulations. Results show that, in cases where the Jacobian was inaccurate or could not be computed, Broyden's method converged in some cases where Newton's method failed to converge. We identify conditions where Broyden's method can be more efficient than Newton's method. We also present modifications to a large-scale tensor method, originally proposed by Bouaricha, for greater efficiency, better robustness, and wider applicability. Tensor methods are an alternative to Newton-based methods and are based on computing a step based on a local quadratic model rather than a linear model. The advantage of Bouaricha's method is that it can use any
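
    For reference, a bare-bones version of the Broyden update that replaces an exact Jacobian with a rank-one-corrected approximation; this is a textbook sketch, not Sandia's limited-memory implementation, and the 2x2 test system is hypothetical:

```python
"""Minimal 'good Broyden' quasi-Newton iteration for F(x) = 0."""
import numpy as np

def finite_difference_jacobian(F, x, eps=1e-6):
    """One-time Jacobian estimate used to seed the Broyden iteration."""
    f0 = F(x)
    J = np.empty((len(f0), len(x)))
    for j in range(len(x)):
        xp = x.copy()
        xp[j] += eps
        J[:, j] = (F(xp) - f0) / eps
    return J

def broyden(F, x0, tol=1e-10, max_iter=50):
    x = np.asarray(x0, dtype=float)
    f = F(x)
    B = finite_difference_jacobian(F, x)      # initial Jacobian approximation
    for _ in range(max_iter):
        if np.linalg.norm(f) < tol:
            break
        dx = np.linalg.solve(B, -f)           # quasi-Newton step
        x_new = x + dx
        f_new = F(x_new)
        # Rank-one update: B <- B + (df - B dx) dx^T / (dx^T dx)
        B += np.outer(f_new - f - B @ dx, dx) / np.dot(dx, dx)
        x, f = x_new, f_new
    return x

# Hypothetical 2x2 nonlinear test system: circle of radius 2 and the line x = y.
def F(x):
    return np.array([x[0] ** 2 + x[1] ** 2 - 4.0, x[0] - x[1]])

print(broyden(F, [1.0, 0.5]))   # approaches (sqrt(2), sqrt(2)) ~ (1.414, 1.414)
```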

  2. Large Scale GW Calculations on the Cori System

    Science.gov (United States)

    Deslippe, Jack; Del Ben, Mauro; da Jornada, Felipe; Canning, Andrew; Louie, Steven

    The NERSC Cori system, powered by 9000+ Intel Xeon-Phi processors, represents one of the largest HPC systems for open-science in the United States and the world. We discuss the optimization of the GW methodology for this system, including both node level and system-scale optimizations. We highlight multiple large scale (thousands of atoms) case studies and discuss both absolute application performance and comparison to calculations on more traditional HPC architectures. We find that the GW method is particularly well suited for many-core architectures due to the ability to exploit a large amount of parallelism across many layers of the system. This work was supported by the U.S. Department of Energy, Office of Science, Basic Energy Sciences, Materials Sciences and Engineering Division, as part of the Computational Materials Sciences Program.

  3. Cosmic ray acceleration by large scale galactic shocks

    International Nuclear Information System (INIS)

    Cesarsky, C.J.; Lagage, P.O.

    1987-01-01

    The mechanism of diffusive shock acceleration may account for the existence of galactic cosmic rays; detailed applications to stellar wind shocks and especially to supernova shocks have been developed. Existing models can usually deal with the energetics or the spectral slope, but the observed energy range of cosmic rays is not explained. Therefore it seems worthwhile to examine the effect that large scale, long-lived galactic shocks may have on galactic cosmic rays, in the framework of the diffusive shock acceleration mechanism. Large scale fast shocks can only be expected to exist in the galactic halo. We consider three situations where they may arise: expansion of a supernova shock in the halo, galactic wind, and galactic infall; and we discuss the possible existence of these shocks and their role in accelerating cosmic rays

  4. Lagrangian space consistency relation for large scale structure

    International Nuclear Information System (INIS)

    Horn, Bart; Hui, Lam; Xiao, Xiao

    2015-01-01

    Consistency relations, which relate the squeezed limit of an (N+1)-point correlation function to an N-point function, are non-perturbative symmetry statements that hold even if the associated high momentum modes are deep in the nonlinear regime and astrophysically complex. Recently, Kehagias and Riotto and Peloso and Pietroni discovered a consistency relation applicable to large scale structure. We show that this can be recast into a simple physical statement in Lagrangian space: that the squeezed correlation function (suitably normalized) vanishes. This holds regardless of whether the correlation observables are at the same time or not, and regardless of whether multiple-streaming is present. The simplicity of this statement suggests that an analytic understanding of large scale structure in the nonlinear regime may be particularly promising in Lagrangian space

  5. Electron drift in a large scale solid xenon

    International Nuclear Information System (INIS)

    Yoo, J.; Jaskierny, W.F.

    2015-01-01

    A study of charge drift in large scale optically transparent solid xenon is reported. A pulsed high power xenon light source is used to liberate electrons from a photocathode. The drift speeds of the electrons are measured using an 8.7 cm long electrode in both the liquid and solid phases of xenon. In the liquid phase (163 K), the drift speed is 0.193 ± 0.003 cm/μs, while the drift speed in the solid phase (157 K) is 0.397 ± 0.006 cm/μs at 900 V/cm over 8.0 cm of uniform electric field. It is thus demonstrated that the electron drift speed in solid phase xenon is a factor of two faster than that in the liquid phase, in large scale solid xenon.
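
    For a quick sanity check on these numbers, the implied electron transit times across the 8.0 cm drift region follow from simple arithmetic on the quoted speeds (assuming they hold over the full drift length):

```latex
% Transit time t = L / v_d for the 8.0 cm drift region.
t_{\mathrm{liquid}} \approx \frac{8.0\ \mathrm{cm}}{0.193\ \mathrm{cm}/\mu\mathrm{s}} \approx 41\ \mu\mathrm{s},
\qquad
t_{\mathrm{solid}} \approx \frac{8.0\ \mathrm{cm}}{0.397\ \mathrm{cm}/\mu\mathrm{s}} \approx 20\ \mu\mathrm{s}.
```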

  6. Wind and Photovoltaic Large-Scale Regional Models for hourly production evaluation

    DEFF Research Database (Denmark)

    Marinelli, Mattia; Maule, Petr; Hahmann, Andrea N.

    2015-01-01

    This work presents two large-scale regional models used for the evaluation of normalized power output from wind turbines and photovoltaic power plants on a European regional scale. The models give an estimate of renewable production on a regional scale with 1 h resolution, starting from a mesoscale ... of the transmission system, especially regarding the cross-border power flows. The tuning of these regional models is done using historical meteorological data acquired on a per-country basis and using publicly available data of installed capacity.

  7. Energy efficiency supervision strategy selection of Chinese large-scale public buildings

    International Nuclear Information System (INIS)

    Jin Zhenxing; Wu Yong; Li Baizhan; Gao Yafeng

    2009-01-01

    This paper discusses energy consumption, building development and building energy consumption in China, and points out that energy efficiency management and maintenance of large-scale public buildings is the breakthrough point for building energy saving in China. Three obstacles, namely the lack of basic statistical data, the lack of a service market for building energy saving, and the lack of effective management measures, account for the necessity of energy efficiency supervision for large-scale public buildings. The paper then introduces the supervision aims, the supervision system and the role of its five basic systems, and analyzes the working mechanism of these five systems. The energy efficiency supervision system for large-scale public buildings takes energy consumption statistics as its data basis, energy auditing as technical support, the energy consumption ration as a benchmark for energy saving, price increases beyond the ration as a price lever, and energy efficiency public notices as an amplifier. The supervision system promotes energy-efficient operation and maintenance of large-scale public buildings, and drives comprehensive building energy saving in China.

  8. Energy efficiency supervision strategy selection of Chinese large-scale public buildings

    Energy Technology Data Exchange (ETDEWEB)

    Jin, Zhenxing; Li, Baizhan; Gao, Yafeng [The Faculty of Urban Construction and Environmental Engineering, Chongqing University, Chongqing (China); Key Laboratory of the Three Gorges Reservoir Region's Eco-Environment, Ministry of Education, Chongqing 400045 (China)]; Wu, Yong [The Department of Science and Technology, Ministry of Construction, Beijing 100835 (China)]

    2009-06-15

    This paper discusses energy consumption, building development and building energy consumption in China, and points out that energy efficiency management and maintenance of large-scale public buildings is the breakthrough point for building energy saving in China. Three obstacles, namely the lack of basic statistical data, the lack of a service market for building energy saving, and the lack of effective management measures, account for the necessity of energy efficiency supervision for large-scale public buildings. The paper then introduces the supervision aims, the supervision system and the role of its five basic systems, and analyzes the working mechanism of these five systems. The energy efficiency supervision system for large-scale public buildings takes energy consumption statistics as its data basis, energy auditing as technical support, the energy consumption ration as a benchmark for energy saving, price increases beyond the ration as a price lever, and energy efficiency public notices as an amplifier. The supervision system promotes energy-efficient operation and maintenance of large-scale public buildings, and drives comprehensive building energy saving in China. (author)

  9. Energy efficiency supervision strategy selection of Chinese large-scale public buildings

    Energy Technology Data Exchange (ETDEWEB)

    Jin Zhenxing [Faculty of Urban Construction and Environmental Engineering, Chongqing University, Chongqing (China); Key Laboratory of the Three Gorges Reservoir Region's Eco-Environment, Ministry of Education, Chongqing 400045 (China)], E-mail: jinzhenxing33@sina.com; Wu Yong [Department of Science and Technology, Ministry of Construction, Beijing 100835 (China)]; Li Baizhan; Gao Yafeng [Faculty of Urban Construction and Environmental Engineering, Chongqing University, Chongqing (China); Key Laboratory of the Three Gorges Reservoir Region's Eco-Environment, Ministry of Education, Chongqing 400045 (China)]

    2009-06-15

    This paper discusses energy consumption, building development and building energy consumption in China, and points out that energy efficiency management and maintenance of large-scale public buildings is the breakthrough point for building energy saving in China. Three obstacles, namely the lack of basic statistical data, the lack of a service market for building energy saving, and the lack of effective management measures, account for the necessity of energy efficiency supervision for large-scale public buildings. The paper then introduces the supervision aims, the supervision system and the role of its five basic systems, and analyzes the working mechanism of these five systems. The energy efficiency supervision system for large-scale public buildings takes energy consumption statistics as its data basis, energy auditing as technical support, the energy consumption ration as a benchmark for energy saving, price increases beyond the ration as a price lever, and energy efficiency public notices as an amplifier. The supervision system promotes energy-efficient operation and maintenance of large-scale public buildings, and drives comprehensive building energy saving in China.

  10. Mirror dark matter and large scale structure

    International Nuclear Information System (INIS)

    Ignatiev, A.Yu.; Volkas, R.R.

    2003-01-01

    Mirror matter is a dark matter candidate. In this paper, we reexamine the linear regime of density perturbation growth in a universe containing mirror dark matter. Taking adiabatic scale-invariant perturbations as the input, we confirm that the resulting processed power spectrum is richer than for the more familiar cases of cold, warm and hot dark matter. The new features include a maximum at a certain scale λ_max, collisional damping below a smaller characteristic scale λ_S', with oscillatory perturbations between the two. These scales are functions of the fundamental parameters of the theory. In particular, they decrease for decreasing x, the ratio of the mirror plasma temperature to that of the ordinary. For x ∼ 0.2, the scale λ_max becomes galactic. Mirror dark matter therefore leads to bottom-up large scale structure formation, similar to conventional cold dark matter, for x ≲ 0.2. Indeed, the smaller the value of x, the closer mirror dark matter resembles standard cold dark matter during the linear regime. The differences pertain to scales smaller than λ_S' in the linear regime, and generally in the nonlinear regime because mirror dark matter is chemically complex and to some extent dissipative. Lyman-α forest data and the early reionization epoch established by WMAP may hold the key to distinguishing mirror dark matter from WIMP-style cold dark matter

  11. The Large-Scale Structure of Scientific Method

    Science.gov (United States)

    Kosso, Peter

    2009-01-01

    The standard textbook description of the nature of science describes the proposal, testing, and acceptance of a theoretical idea almost entirely in isolation from other theories. The resulting model of science is a kind of piecemeal empiricism that misses the important network structure of scientific knowledge. Only the large-scale description of…

  12. Detection of large-scale concentric gravity waves from a Chinese airglow imager network

    Science.gov (United States)

    Lai, Chang; Yue, Jia; Xu, Jiyao; Yuan, Wei; Li, Qinzeng; Liu, Xiao

    2018-06-01

    Concentric gravity waves (CGWs) contain a broad spectrum of horizontal wavelengths and periods due to their instantaneous localized sources (e.g., deep convection, volcanic eruptions, or earthquakes). However, it is difficult to observe large-scale gravity waves of >100 km wavelength from the ground because of the limited field of view of a single camera and local bad weather. Previously, complete large-scale CGW imagery could only be captured by satellite observations. In the present study, we developed a novel method that assembles separate images and applies low-pass filtering to obtain temporal and spatial information about complete large-scale CGWs from a network of all-sky airglow imagers. Coordinated observations from five all-sky airglow imagers in Northern China were assembled and processed to study large-scale CGWs over a wide area (1800 km × 1400 km), focusing on the same two CGW events as Xu et al. (2015). Our algorithms yielded images of large-scale CGWs by filtering out the small-scale CGWs. The wavelengths, wave speeds, and periods of the CGWs were measured from a sequence of consecutive assembled images. Overall, the assembling and low-pass filtering algorithms can expand the airglow imager network to its full capacity regarding the detection of large-scale gravity waves.
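
    A minimal sketch of the mosaic-then-low-pass-filter step described above, using a Gaussian blur as the low-pass filter; the tile layout, image sizes and cutoff are hypothetical and this is not the authors' pipeline:

```python
"""Sketch: assemble airglow image tiles into one mosaic, then low-pass
filter to suppress small-scale waves and keep the large-scale pattern."""
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(0)

# Hypothetical 2x2 grid of co-registered image tiles (already projected to
# a common geographic grid), each 200x200 pixels.
tiles = [[rng.normal(size=(200, 200)) for _ in range(2)] for _ in range(2)]
mosaic = np.block(tiles)                 # 400x400 assembled image

# Low-pass filter: a Gaussian blur whose sigma (in pixels) corresponds to the
# smallest horizontal wavelength we want to keep (value is hypothetical).
sigma_pixels = 15
large_scale = gaussian_filter(mosaic, sigma=sigma_pixels)

# The small-scale component, if needed, is simply the residual.
small_scale = mosaic - large_scale
print(mosaic.shape, large_scale.std(), small_scale.std())
```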

  13. Fine-Scale Population Estimation by 3D Reconstruction of Urban Residential Buildings

    Directory of Open Access Journals (Sweden)

    Shixin Wang

    2016-10-01

    Full Text Available Fine-scale population estimation is essential in emergency response and epidemiological applications as well as urban planning and management. However, representing populations in heterogeneous urban regions with a finer resolution is a challenge. This study aims to obtain fine-scale population distribution based on 3D reconstruction of urban residential buildings with morphological operations using optical high-resolution (HR images from the Chinese No. 3 Resources Satellite (ZY-3. Specifically, the research area was first divided into three categories when dasymetric mapping was taken into consideration. The results demonstrate that the morphological building index (MBI yielded better results than built-up presence index (PanTex in building detection, and the morphological shadow index (MSI outperformed color invariant indices (CIIT in shadow extraction and height retrieval. Building extraction and height retrieval were then combined to reconstruct 3D models and to estimate population. Final results show that this approach is effective in fine-scale population estimation, with a mean relative error of 16.46% and an overall Relative Total Absolute Error (RATE of 0.158. This study gives significant insights into fine-scale population estimation in complicated urban landscapes, when detailed 3D information of buildings is unavailable.

  14. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... Security can help increase accountability for large-scale land acquisitions in ... to build decent economic livelihoods and participate meaningfully in decisions ... its 2017 call for proposals to establish Cyber Policy Centres in the Global South.

  15. Quantifying the Impacts of Large Scale Integration of Renewables in Indian Power Sector

    Science.gov (United States)

    Kumar, P.; Mishra, T.; Banerjee, R.

    2017-12-01

    India's power sector is responsible for nearly 37 percent of India's greenhouse gas emissions. For a fast emerging economy like India, whose population and energy consumption are poised to rise rapidly in the coming decades, renewable energy can play a vital role in decarbonizing the power sector. In this context, India has targeted a 33-35 percent reduction in emission intensity (with respect to 2005 levels) along with large scale renewable energy targets (100 GW solar, 60 GW wind, and 10 GW biomass energy by 2022) in the INDCs submitted under the Paris agreement. But large scale integration of renewable energy is a complex process which faces a number of problems, such as capital intensiveness, matching intermittent generation to load with limited storage capacity, and reliability. In this context, this study attempts to assess the technical feasibility of integrating renewables into the Indian electricity mix by 2022 and analyze its implications for power sector operations. This study uses TIMES, a bottom-up energy optimization model with unit commitment and dispatch features. We model coal and gas fired units discretely, with region-wise representation of wind and solar resources. The dispatch features are used for operational analysis of power plant units under ramp rate and minimum generation constraints. The study analyzes India's electricity sector transition for the year 2022 with three scenarios. The base case scenario (no RE addition), along with an INDC scenario (100 GW solar, 60 GW wind, 10 GW biomass) and a low RE scenario (50 GW solar, 30 GW wind), has been created to analyze the implications of large scale integration of variable renewable energy. The results provide insights into the trade-offs involved in achieving mitigation targets and the associated investment decisions. The study also examines the operational reliability and flexibility requirements of the system for integrating renewables.

  16. The Cosmology Large Angular Scale Surveyor (CLASS)

    Science.gov (United States)

    Harrington, Kathleen; Marriage, Tobias; Aamir, Ali; Appel, John W.; Bennett, Charles L.; Boone, Fletcher; Brewer, Michael; Chan, Manwei; Chuss, David T.; Colazo, Felipe; et al.

    2016-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) is a four telescope array designed to characterize relic primordial gravitational waves from inflation and the optical depth to reionization through a measurement of the polarized cosmic microwave background (CMB) on the largest angular scales. The frequencies of the four CLASS telescopes, one at 38 GHz, two at 93 GHz, and one dichroic system at 145/217 GHz, are chosen to avoid spectral regions of high atmospheric emission and span the minimum of the polarized Galactic foregrounds: synchrotron emission at lower frequencies and dust emission at higher frequencies. Low-noise transition edge sensor detectors and a rapid front-end polarization modulator provide a unique combination of high sensitivity, stability, and control of systematics. The CLASS site, at 5200 m in the Chilean Atacama desert, allows for daily mapping of up to 70% of the sky and enables the characterization of CMB polarization at the largest angular scales. Using this combination of a broad frequency range, large sky coverage, control over systematics, and high sensitivity, CLASS will observe the reionization and recombination peaks of the CMB E- and B-mode power spectra. CLASS will make a cosmic variance limited measurement of the optical depth to reionization and will measure or place upper limits on the tensor-to-scalar ratio, r, down to a level of 0.01 (95% C.L.).

  17. Large herbivore population performance and climate in a South African semi-arid savanna

    Directory of Open Access Journals (Sweden)

    Armin H. Seydack

    2012-02-01

    Interpretation according to a climate–vegetation response model suggested that acclimation of forage plants to increasing temperature had resulted in temperature-enhanced plant productivity, initially increasing food availability and supporting transient synchronous increases in population abundance of both blue wildebeest and zebra, and selective grazers. As acclimation of plants to concurrently rising minimum (nocturnal) temperature (Tmin) took effect, adjustments in metabolic functionality occurred involving accelerated growth activity at the cost of storage-based metabolism. Growth-linked nitrogen dilution and reduced carbon-nutrient quality of forage then resulted in phases of subsequently declining herbivore populations. Over the long term (1910–2010), progressive plant functionality shifts towards accelerated metabolic growth rather than storage priority occurred in response to Tmin rising faster than maximum temperature (Tmax), thereby cumulatively compromising the carbon-nutrient quality of forage, a key resource for selective grazers. The results of analyses thus revealed consistency between herbivore population trends and levels of forage quantity and quality congruent with expected plant metabolic responses to climate effects. Thus, according to the climate-vegetation response model, climate effects were implicated as the ultimate cause of large herbivore population performance in space and over time. Conservation implications: In its broadest sense, the objective of this study was to contribute towards the enhanced understanding of landscape-scale functioning of savanna systems with regard to the interplay between climate, vegetation and herbivore population dynamics.

  18. Measuring the topology of large-scale structure in the universe

    Science.gov (United States)

    Gott, J. Richard, III

    1988-11-01

    An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by wall of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data.

  19. Measuring the topology of large-scale structure in the universe

    International Nuclear Information System (INIS)

    Gott, J.R. III

    1988-01-01

    An algorithm for quantitatively measuring the topology of large-scale structure has now been applied to a large number of observational data sets. The present paper summarizes and provides an overview of some of these observational results. On scales significantly larger than the correlation length, larger than about 1200 km/s, the cluster and galaxy data are fully consistent with a sponge-like random phase topology. At a smoothing length of about 600 km/s, however, the observed genus curves show a small shift in the direction of a meatball topology. Cold dark matter (CDM) models show similar shifts at these scales but not generally as large as those seen in the data. Bubble models, with voids completely surrounded on all sides by wall of galaxies, show shifts in the opposite direction. The CDM model is overall the most successful in explaining the data. 45 references

  20. Contribution of large scale coherence to wind turbine power: A large eddy simulation study in periodic wind farms

    Science.gov (United States)

    Chatterjee, Tanmoy; Peet, Yulia T.

    2018-03-01

    Length scales of eddies involved in the power generation of infinite wind farms are studied by analyzing the spectra of the turbulent flux of mean kinetic energy (MKE) from large eddy simulations (LES). Large-scale structures an order of magnitude bigger than the turbine rotor diameter (D) are shown to contribute substantially to wind power. Varying dynamics in the intermediate scales (D-10D) are also observed from a parametric study involving interturbine distances and hub height of the turbines. Further insight about the eddies responsible for the power generation has been provided by the scaling analysis of two-dimensional premultiplied spectra of MKE flux. The LES code is developed in a high Reynolds number near-wall modeling framework, using the open-source spectral element code Nek5000, and the wind turbines have been modelled using a state-of-the-art actuator line model. The LES of infinite wind farms has been validated against statistical results from the previous literature. The study is expected to improve our understanding of the complex multiscale dynamics in the domain of large wind farms and identify the length scales that contribute to the power. This information can be useful for the design of wind farm layout and turbine placement that takes advantage of the large-scale structures contributing to wind turbine power.

  1. Bottom-Up Accountability Initiatives and Large-Scale Land ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    fuel/energy, climate, and finance has occurred and one of the most ... this wave of large-scale land acquisitions. In fact, esti- ... Environmental Rights Action/Friends of the Earth, Nigeria ... map the differentiated impacts (gender, ethnicity, ...).

  2. Subgrid-scale models for large-eddy simulation of rotating turbulent channel flows

    Science.gov (United States)

    Silvis, Maurits H.; Bae, Hyunji Jane; Trias, F. Xavier; Abkar, Mahdi; Moin, Parviz; Verstappen, Roel

    2017-11-01

    We aim to design subgrid-scale models for large-eddy simulation of rotating turbulent flows. Rotating turbulent flows form a challenging test case for large-eddy simulation due to the presence of the Coriolis force. The Coriolis force conserves the total kinetic energy while transporting it from small to large scales of motion, leading to the formation of large-scale anisotropic flow structures. The Coriolis force may also cause partial flow laminarization and the occurrence of turbulent bursts. Many subgrid-scale models for large-eddy simulation are, however, primarily designed to parametrize the dissipative nature of turbulent flows, ignoring the specific characteristics of transport processes. We, therefore, propose a new subgrid-scale model that, in addition to the usual dissipative eddy viscosity term, contains a nondissipative nonlinear model term designed to capture transport processes, such as those due to rotation. We show that the addition of this nonlinear model term leads to improved predictions of the energy spectra of rotating homogeneous isotropic turbulence as well as of the Reynolds stress anisotropy in spanwise-rotating plane-channel flows. This work is financed by the Netherlands Organisation for Scientific Research (NWO) under Project Number 613.001.212.
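    A schematic sketch of the modelling idea summarized above follows: a dissipative eddy-viscosity term plus a nondissipative nonlinear term built from the rate-of-strain and rate-of-rotation tensors. This is not the authors' exact formulation; the function name and the coefficients nu_e and c_nl are placeholders chosen only for the example.

```python
import numpy as np

# Schematic subgrid-scale stress: dissipative eddy-viscosity part plus a
# nondissipative nonlinear part. Not the authors' exact model; nu_e and c_nl
# are placeholder coefficients used only for this illustration.

def sgs_stress(grad_u, nu_e=1e-3, c_nl=1e-2):
    """grad_u: 3x3 resolved velocity-gradient tensor dU_i/dx_j."""
    S = 0.5 * (grad_u + grad_u.T)      # rate-of-strain tensor (symmetric)
    W = 0.5 * (grad_u - grad_u.T)      # rate-of-rotation tensor (antisymmetric)
    tau_eddy = -2.0 * nu_e * S         # dissipative eddy-viscosity term
    tau_nl = c_nl * (S @ W - W @ S)    # nondissipative (transport-like) term
    return tau_eddy + tau_nl           # S@W - W@S is symmetric, as a stress should be

grad_u = np.array([[0.0, 1.0, 0.0],
                   [0.0, 0.0, 0.0],
                   [0.0, 0.0, 0.0]])   # simple shear as a test input
print(sgs_stress(grad_u))
```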

  3. Large-scale water projects in the developing world: Revisiting the past and looking to the future

    Science.gov (United States)

    Sivakumar, Bellie; Chen, Ji

    2014-05-01

    During the past half a century or so, the developing world has been witnessing a significant increase in freshwater demands due to a combination of factors, including population growth, increased food demand, improved living standards, and water quality degradation. Since there exists significant variability in rainfall and river flow in both space and time, large-scale storage and distribution of water has become a key means to meet these increasing demands. In this regard, large dams and water transfer schemes (including river-linking schemes and virtual water trades) have been playing a key role. While the benefits of such large-scale projects in supplying water for domestic, irrigation, industrial, hydropower, recreational, and other uses both in the countries of their development and in other countries are undeniable, concerns on their negative impacts, such as high initial costs and damages to our ecosystems (e.g. river environment and species) and socio-economic fabric (e.g. relocation and socio-economic changes of affected people) have also been increasing in recent years. These have led to serious debates on the role of large-scale water projects in the developing world and on their future, but the often one-sided nature of such debates have inevitably failed to yield fruitful outcomes thus far. The present study aims to offer a far more balanced perspective on this issue. First, it recognizes and emphasizes the need for still additional large-scale water structures in the developing world in the future, due to the continuing increase in water demands, inefficiency in water use (especially in the agricultural sector), and absence of equivalent and reliable alternatives. Next, it reviews a few important success and failure stories of large-scale water projects in the developing world (and in the developed world), in an effort to arrive at a balanced view on the future role of such projects. Then, it discusses some major challenges in future water planning

  4. Origin of the large scale structures of the universe

    International Nuclear Information System (INIS)

    Oaknin, David H.

    2004-01-01

    We revise the statistical properties of the primordial cosmological density anisotropies that, at the time of matter-radiation equality, seeded the gravitational development of large scale structures in the otherwise homogeneous and isotropic Friedmann-Robertson-Walker flat universe. Our analysis shows that random fluctuations of the density field at the same instant of equality and with comoving wavelength shorter than the causal horizon at that time can naturally account, when globally constrained to conserve the total mass (energy) of the system, for the observed scale invariance of the anisotropies over cosmologically large comoving volumes. Statistical systems with similar features are generically known as glasslike or latticelike. Obviously, these conclusions conflict with the widely accepted understanding of the primordial structures reported in the literature, which requires an epoch of inflationary cosmology to precede the standard expansion of the universe. The origin of the conflict must be found in the widespread, but unjustified, claim that scale invariant mass (energy) anisotropies at the instant of equality over comoving volumes of cosmological size, larger than the causal horizon at the time, must be generated by fluctuations in the density field with comparably large comoving wavelength

  5. Technologies and challenges in large-scale phosphoproteomics

    DEFF Research Database (Denmark)

    Engholm-Keller, Kasper; Larsen, Martin Røssel

    2013-01-01

    become the main technique for discovery and characterization of phosphoproteins in a nonhypothesis driven fashion. In this review, we describe methods for state-of-the-art MS-based analysis of protein phosphorylation as well as the strategies employed in large-scale phosphoproteomic experiments...... with focus on the various challenges and limitations this field currently faces....

  6. Large-scale building energy efficiency retrofit: Concept, model and control

    International Nuclear Information System (INIS)

    Wu, Zhou; Wang, Bo; Xia, Xiaohua

    2016-01-01

    BEER (Building energy efficiency retrofit) projects are initiated in many nations and regions over the world. Existing studies of BEER focus on modeling and planning based on one building and one year period of retrofitting, which cannot be applied to certain large BEER projects with multiple buildings and multi-year retrofit. In this paper, the large-scale BEER problem is defined in a general TBT (time-building-technology) framework, which fits essential requirements of real-world projects. The large-scale BEER is newly studied in the control approach rather than the optimization approach commonly used before. Optimal control is proposed to design optimal retrofitting strategy in terms of maximal energy savings and maximal NPV (net present value). The designed strategy is dynamically changing on dimensions of time, building and technology. The TBT framework and the optimal control approach are verified in a large BEER project, and results indicate that promising performance of energy and cost savings can be achieved in the general TBT framework. - Highlights: • Energy efficiency retrofit of many buildings is studied. • A TBT (time-building-technology) framework is proposed. • The control system of the large-scale BEER is modeled. • The optimal retrofitting strategy is obtained.

  7. Large-scale linear programs in planning and prediction.

    Science.gov (United States)

    2017-06-01

    Large-scale linear programs are at the core of many traffic-related optimization problems in both planning and prediction. Moreover, many of these involve significant uncertainty, and hence are modeled using either chance constraints, or robust optim...

  8. Large Scale Solar Heating

    DEFF Research Database (Denmark)

    Heller, Alfred

    2001-01-01

    The main objective of the research was to evaluate large-scale solar heating connected to district heating (CSDHP), to build up a simulation tool and to demonstrate the application of the simulation tool for design studies and on a local energy planning case. The evaluation was mainly carried out...... model is designed and validated on the Marstal case. Applying the Danish Reference Year, a design tool is presented. The simulation tool is used for proposals for application of alternative designs, including high-performance solar collector types (trough solar collectors, vacuum pipe collectors......). Simulation programs are proposed as a control-supporting tool for daily operation and performance prediction of central solar heating plants. Finally the CSHP technology is put into perspective with respect to alternatives and a short discussion on the barriers and breakthrough of the technology is given....

  9. Large-scale exact diagonalizations reveal low-momentum scales of nuclei

    Science.gov (United States)

    Forssén, C.; Carlsson, B. D.; Johansson, H. T.; Sääf, D.; Bansal, A.; Hagen, G.; Papenbrock, T.

    2018-03-01

    Ab initio methods aim to solve the nuclear many-body problem with controlled approximations. Virtually exact numerical solutions for realistic interactions can only be obtained for certain special cases such as few-nucleon systems. Here we extend the reach of exact diagonalization methods to handle model spaces with dimension exceeding 10{sup 10} on a single compute node. This allows us to perform no-core shell model (NCSM) calculations for 6Li in model spaces up to Nmax=22 and to reveal the 4He+d halo structure of this nucleus. Still, the use of a finite harmonic-oscillator basis implies truncations in both infrared (IR) and ultraviolet (UV) length scales. These truncations impose finite-size corrections on observables computed in this basis. We perform IR extrapolations of energies and radii computed in the NCSM and with the coupled-cluster method at several fixed UV cutoffs. It is shown that this strategy enables information gain also from data that is not fully UV converged. IR extrapolations improve the accuracy of relevant bound-state observables for a range of UV cutoffs, thus making them profitable tools. We relate the momentum scale that governs the exponential IR convergence to the threshold energy for the first open decay channel. Using large-scale NCSM calculations we numerically verify this small-momentum scale of finite nuclei.

  10. Use of two population metrics clarifies biodiversity dynamics in large-scale monitoring: the case of trees in Japanese old-growth forests: the need for multiple population metrics in large-scale monitoring.

    Science.gov (United States)

    Ogawa, Mifuyu; Yamaura, Yuichi; Abe, Shin; Hoshino, Daisuke; Hoshizaki, Kazuhiko; Iida, Shigeo; Katsuki, Toshio; Masaki, Takashi; Niiyama, Kaoru; Saito, Satoshi; Sakai, Takeshi; Sugita, Hisashi; Tanouchi, Hiroyuki; Amano, Tatsuya; Taki, Hisatomo; Okabe, Kimiko

    2011-07-01

    Many indicators/indices provide information on whether the 2010 biodiversity target of reducing declines in biodiversity has been achieved. The strengths and limitations of the various measures used to assess progress toward this target are now being discussed. Biodiversity dynamics are often evaluated by a single biological population metric, such as the abundance of each species. Here we examined tree population dynamics of 52 families (192 species) at 11 research sites (three vegetation zones) of Japanese old-growth forests using two population metrics: number of stems and basal area. We calculated indices that track the rate of change in all species of tree by taking the geometric mean of changes in population metrics between the 1990s and the 2000s at the national level and at the levels of the vegetation zone and family. We specifically focused on whether indices based on these two metrics behaved similarly. The indices showed that (1) the number of stems declined, whereas basal area did not change at the national level and (2) the degree of change in the indices varied by vegetation zone and family. These results suggest that Japanese old-growth forests have not degraded and may even be developing in some vegetation zones, and indicate that the use of a single population metric (or indicator/index) may be insufficient to precisely understand the state of biodiversity. It is therefore important to incorporate more metrics into monitoring schemes to overcome the risk of misunderstanding or misrepresenting biodiversity dynamics.
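    As a rough illustration of the index construction described above, the sketch below takes the geometric mean, across species, of the change ratio of a population metric between two census periods, computed once for number of stems and once for basal area. The function name and all numbers are invented and serve only to show the calculation.

```python
import numpy as np

# Minimal sketch of a geometric-mean change index across species for two
# population metrics. All data values below are hypothetical.

def geometric_mean_index(metric_t1, metric_t2):
    """metric_t1, metric_t2: per-species values of one metric (e.g. number of
    stems or basal area) in two census periods. Returns the geometric mean of
    the per-species change ratios (1.0 = no net change)."""
    ratios = np.asarray(metric_t2, float) / np.asarray(metric_t1, float)
    return np.exp(np.mean(np.log(ratios)))

stems_1990s = [120, 45, 300, 80]
stems_2000s = [110, 40, 310, 70]
basal_1990s = [5.2, 1.1, 9.8, 2.3]
basal_2000s = [5.3, 1.2, 9.9, 2.3]

print("stem index:      ", geometric_mean_index(stems_1990s, stems_2000s))
print("basal-area index:", geometric_mean_index(basal_1990s, basal_2000s))
```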

  11. Large Scale Visual Recommendations From Street Fashion Images

    OpenAIRE

    Jagadeesh, Vignesh; Piramuthu, Robinson; Bhardwaj, Anurag; Di, Wei; Sundaresan, Neel

    2014-01-01

    We describe a completely automated large scale visual recommendation system for fashion. Our focus is to efficiently harness the availability of large quantities of online fashion images and their rich meta-data. Specifically, we propose four data driven models in the form of Complementary Nearest Neighbor Consensus, Gaussian Mixture Models, Texture Agnostic Retrieval and Markov Chain LDA for solving this problem. We analyze relative merits and pitfalls of these algorithms through extensive e...

  12. "Large"- vs Small-scale friction control in turbulent channel flow

    Science.gov (United States)

    Canton, Jacopo; Örlü, Ramis; Chin, Cheng; Schlatter, Philipp

    2017-11-01

    We reconsider the "large-scale" control scheme proposed by Hussain and co-workers (Phys. Fluids 10, 1049-1051, 1998, and Phys. Rev. Fluids 2, 62601, 2017), using new direct numerical simulations (DNS). The DNS are performed in a turbulent channel at friction Reynolds number Reτ of up to 550 in order to eliminate low-Reynolds-number effects. The purpose of the present contribution is to re-assess this control method in the light of more modern developments in the field, in particular also related to the discovery of (very) large-scale motions. The goals of the paper are as follows: First, we want to better characterise the physics of the control, and assess what external contributions (vortices, forcing, wall motion) are actually needed. Then, we investigate the optimal parameters and, finally, determine which aspects of this control technique actually scale in outer units and can therefore be of use in practical applications. In addition to discussing the mentioned drag-reduction effects, the present contribution will also address the potential effect of the naturally occurring large-scale motions on frictional drag, and give indications on the physical processes for potential drag reduction possible at all Reynolds numbers.

  13. Power suppression at large scales in string inflation

    Energy Technology Data Exchange (ETDEWEB)

    Cicoli, Michele [Dipartimento di Fisica ed Astronomia, Università di Bologna, via Irnerio 46, Bologna, 40126 (Italy); Downes, Sean; Dutta, Bhaskar, E-mail: mcicoli@ictp.it, E-mail: sddownes@physics.tamu.edu, E-mail: dutta@physics.tamu.edu [Mitchell Institute for Fundamental Physics and Astronomy, Department of Physics and Astronomy, Texas A and M University, College Station, TX, 77843-4242 (United States)

    2013-12-01

    We study a possible origin of the anomalous suppression of the power spectrum at large angular scales in the cosmic microwave background within the framework of explicit string inflationary models where inflation is driven by a closed string modulus parameterizing the size of the extra dimensions. In this class of models the apparent power loss at large scales is caused by the background dynamics which involves a sharp transition from a fast-roll power law phase to a period of Starobinsky-like slow-roll inflation. An interesting feature of this class of string inflationary models is that the number of e-foldings of inflation is inversely proportional to the string coupling to a positive power. Therefore once the string coupling is tuned to small values in order to trust string perturbation theory, enough e-foldings of inflation are automatically obtained without the need of extra tuning. Moreover, in the less tuned cases the sharp transition responsible for the power loss takes place just before the last 50-60 e-foldings of inflation. We illustrate these general claims in the case of Fibre Inflation where we study the strength of this transition in terms of the attractor dynamics, finding that it induces a pivot from a blue to a redshifted power spectrum which can explain the apparent large scale power loss. We compute the effects of this pivot for example cases and demonstrate how magnitude and duration of this effect depend on model parameters.

  14. Single-field consistency relations of large scale structure part III: test of the equivalence principle

    Energy Technology Data Exchange (ETDEWEB)

    Creminelli, Paolo [Abdus Salam International Centre for Theoretical Physics, Strada Costiera 11, Trieste, 34151 (Italy); Gleyzes, Jérôme; Vernizzi, Filippo [CEA, Institut de Physique Théorique, Gif-sur-Yvette cédex, F-91191 France (France); Hui, Lam [Physics Department and Institute for Strings, Cosmology and Astroparticle Physics, Columbia University, New York, NY, 10027 (United States); Simonović, Marko, E-mail: creminel@ictp.it, E-mail: jerome.gleyzes@cea.fr, E-mail: lhui@astro.columbia.edu, E-mail: msimonov@sissa.it, E-mail: filippo.vernizzi@cea.fr [SISSA, via Bonomea 265, Trieste, 34136 (Italy)

    2014-06-01

    The recently derived consistency relations for Large Scale Structure do not hold if the Equivalence Principle (EP) is violated. We show it explicitly in a toy model with two fluids, one of which is coupled to a fifth force. We explore the constraints that galaxy surveys can set on EP violation looking at the squeezed limit of the 3-point function involving two populations of objects. We find that one can explore EP violations of order 10{sup −3}÷10{sup −4} on cosmological scales. Chameleon models are already very constrained by the requirement of screening within the Solar System and only a very tiny region of the parameter space can be explored with this method. We show that no violation of the consistency relations is expected in Galileon models.

  15. Imprint of thawing scalar fields on the large scale galaxy overdensity

    Science.gov (United States)

    Dinda, Bikash R.; Sen, Anjan A.

    2018-04-01

    We investigate the observed galaxy power spectrum for the thawing class of scalar field models taking into account various general relativistic corrections that occur on very large scales. We consider the full general relativistic perturbation equations for the matter as well as the dark energy fluid. We form a single autonomous system of equations containing both the background and the perturbed equations of motion which we subsequently solve for different scalar field potentials. First we study the percentage deviation from the ΛCDM model for different cosmological parameters as well as in the observed galaxy power spectra on different scales in scalar field models for various choices of scalar field potentials. Interestingly, the difference in background expansion results in an enhancement of power relative to ΛCDM on small scales, whereas the inclusion of general relativistic (GR) corrections results in a suppression of power relative to ΛCDM on large scales. This can be useful to distinguish scalar field models from ΛCDM with future optical/radio surveys. We also compare the observed galaxy power spectra for tracking and thawing types of scalar field using some particular choices for the scalar field potentials. We show that thawing and tracking models can have large differences in observed galaxy power spectra on large scales and for smaller redshifts due to different GR effects. But on smaller scales and for larger redshifts, the difference is small and is mainly due to the difference in background expansion.

  16. Optimization of large-scale heterogeneous system-of-systems models.

    Energy Technology Data Exchange (ETDEWEB)

    Parekh, Ojas; Watson, Jean-Paul; Phillips, Cynthia Ann; Siirola, John; Swiler, Laura Painton; Hough, Patricia Diane (Sandia National Laboratories, Livermore, CA); Lee, Herbert K. H. (University of California, Santa Cruz, Santa Cruz, CA); Hart, William Eugene; Gray, Genetha Anne (Sandia National Laboratories, Livermore, CA); Woodruff, David L. (University of California, Davis, Davis, CA)

    2012-01-01

    Decision makers increasingly rely on large-scale computational models to simulate and analyze complex man-made systems. For example, computational models of national infrastructures are being used to inform government policy, assess economic and national security risks, evaluate infrastructure interdependencies, and plan for the growth and evolution of infrastructure capabilities. A major challenge for decision makers is the analysis of national-scale models that are composed of interacting systems: effective integration of system models is difficult, there are many parameters to analyze in these systems, and fundamental modeling uncertainties complicate analysis. This project is developing optimization methods to effectively represent and analyze large-scale heterogeneous system of systems (HSoS) models, which have emerged as a promising approach for describing such complex man-made systems. These optimization methods enable decision makers to predict future system behavior, manage system risk, assess tradeoffs between system criteria, and identify critical modeling uncertainties.

  17. Mining Together : Large-Scale Mining Meets Artisanal Mining, A Guide for Action

    OpenAIRE

    World Bank

    2009-01-01

    The present guide, Mining Together: When Large-Scale Mining Meets Artisanal Mining, is an important step toward better understanding the conflict dynamics and underlying issues between large-scale and small-scale mining. This guide for action not only points to some of the challenges that both parties need to deal with in order to build a more constructive relationship, but most importantly it sh...

  18. Large transverse momenta in inclusive hadronic reactions and asymptotic scale invariance

    International Nuclear Information System (INIS)

    Miralles, F.; Sala, C.

    1976-01-01

    The inclusive reaction among scalar particles is considered, assuming that in the large-transverse-momentum limit, scale invariance becomes important. Predictions are made of the asymptotic scale invariance for large four transverse momentum in hadron-hadron interactions, and they are compared with previous predictions. Photoproduction is also studied and the predictions that follow from different assumptions about the compositeness of hadrons are compared.

  19. Chirping for large-scale maritime archaeological survey

    DEFF Research Database (Denmark)

    Grøn, Ole; Boldreel, Lars Ole

    2014-01-01

    Archaeological wrecks exposed on the sea floor are mapped using side-scan and multibeam techniques, whereas the detection of submerged archaeological sites, such as Stone Age settlements, and wrecks, partially or wholly embedded in sea-floor sediments, requires the application of high-resolution ...... the present state of this technology, it appears well suited to large-scale maritime archaeological mapping....

  20. LARGE-SCALE COMMERCIAL INVESTMENTS IN LAND: SEEKING ...

    African Journals Online (AJOL)

    extent of large-scale investment in land or to assess its impact on the people in recipient countries. .... favorable lease terms, apparently based on a belief that this is necessary to .... Harm to the rights of local occupiers of land can result from a dearth ..... applies to a self-identified group based on the group's traditions.

  1. Active power reserves evaluation in large scale PVPPs

    DEFF Research Database (Denmark)

    Crăciun, Bogdan-Ionut; Kerekes, Tamas; Sera, Dezso

    2013-01-01

    The present trend of investing in renewable ways of producing electricity to the detriment of conventional fossil fuel-based plants will lead to a point where renewable plants have to provide ancillary services and contribute to overall grid stability. Photovoltaic (PV) power has the fastest...... growth among all renewable energies and managed to reach high penetration levels creating instabilities which at the moment are corrected by the conventional generation. This paradigm will change in the future scenarios where most of the power is supplied by large scale renewable plants and parts...... of the ancillary services have to be shared by the renewable plants. The main focus of the proposed paper is to technically and economically analyze the possibility of having active power reserves in large scale PV power plants (PVPPs) without any auxiliary storage equipment. The provided reserves should...

  2. On the relationship between calcified neurocysticercosis and epilepsy in an endemic village: A large-scale, computed tomography-based population study in rural Ecuador.

    Science.gov (United States)

    Del Brutto, Oscar H; Arroyo, Gianfranco; Del Brutto, Victor J; Zambrano, Mauricio; García, Héctor H

    2017-11-01

    Using a large-scale population-based study, we aimed to assess the prevalence and patterns of presentation of neurocysticercosis (NCC) and its relationship with epilepsy in community-dwellers aged ≥20 years living in Atahualpa (rural Ecuador). In a three-phase epidemiological study, individuals with suspected seizures were identified during a door-to-door survey and an interview (phase I). Then, neurologists evaluated suspected cases and randomly selected negative persons to estimate epilepsy prevalence (phase II). In phase III, all participants were offered noncontrast computed tomography (CT) for identifying NCC cases. The independent association between NCC (exposure) and epilepsy (outcome) was assessed by the use of multivariate logistic regression models adjusted for age, sex, level of education, and alcohol intake. CT findings were subsequently compared to archived brain magnetic resonance imaging in a sizable subgroup of participants. Of 1,604 villagers aged ≥20 years, 1,462 (91%) were enrolled. Forty-one persons with epilepsy (PWE) were identified, for a crude prevalence of epilepsy of 28 per 1,000 population (95% confidence interval [CI] = 20.7-38.2). A head CT was performed in 1,228 (84%) of 1,462 participants, including 39 of 41 PWE. CT showed lesions consistent with calcified parenchymal brain cysticerci in 118 (9.6%) cases (95% CI = 8.1-11.4%). No patient had other forms of NCC. Nine of 39 PWE, as opposed to 109 of 1,189 participants without epilepsy, had NCC (23.1% vs. 9.2%, p = 0.004). This difference persisted in the adjusted logistic regression model (odds ratio = 3.04, 95% CI = 1.35-6.81, p = 0.007). This large CT-based study demonstrates that PWE had three times the odds of having NCC than those without epilepsy, providing robust epidemiological evidence favoring the relationship between NCC and epilepsy. Wiley Periodicals, Inc. © 2017 International League Against Epilepsy.
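    The crude (unadjusted) association can be reproduced directly from the counts quoted above; the sketch below computes the 2x2-table odds ratio with a Woolf-type 95% confidence interval. The paper's headline estimate (OR = 3.04, 95% CI 1.35-6.81) comes from a multivariate logistic regression with covariate adjustment, which this simple calculation does not attempt.

```python
from math import exp, log, sqrt

# Crude odds ratio for the NCC-epilepsy association from the counts quoted
# in the abstract (9/39 PWE with NCC vs. 109/1189 without epilepsy).
a, b = 9, 39 - 9          # persons with epilepsy: with NCC, without NCC
c, d = 109, 1189 - 109    # persons without epilepsy: with NCC, without NCC

odds_ratio = (a * d) / (b * c)
se_log_or = sqrt(1/a + 1/b + 1/c + 1/d)          # Woolf standard error of ln(OR)
ci_low = exp(log(odds_ratio) - 1.96 * se_log_or)
ci_high = exp(log(odds_ratio) + 1.96 * se_log_or)

print(f"crude OR = {odds_ratio:.2f}  (95% CI {ci_low:.2f}-{ci_high:.2f})")
```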

  3. Analysis for Large Scale Integration of Electric Vehicles into Power Grids

    DEFF Research Database (Denmark)

    Hu, Weihao; Chen, Zhe; Wang, Xiaoru

    2011-01-01

    Electric Vehicles (EVs) provide a significant opportunity for reducing the consumption of fossil energies and the emission of carbon dioxide. With more and more electric vehicles integrated into the power systems, it becomes important to study the effects of EV integration on the power systems......, especially the low and middle voltage level networks. In the paper, the basic structure and characteristics of electric vehicles are introduced. The possible impacts of large-scale integration of electric vehicles on the power systems, especially the advantage for the integration of renewable energies......, are discussed. Finally, the research projects related to the large-scale integration of electric vehicles into the power systems are introduced, which will provide a reference for the large-scale integration of electric vehicles into power grids....

  4. Evolution favors protein mutational robustness in sufficiently large populations

    Directory of Open Access Journals (Sweden)

    Venturelli Ophelia S

    2007-07-01

    Background: An important question is whether evolution favors properties such as mutational robustness or evolvability that do not directly benefit any individual, but can influence the course of future evolution. Functionally similar proteins can differ substantially in their robustness to mutations and capacity to evolve new functions, but it has remained unclear whether any of these differences might be due to evolutionary selection for these properties. Results: Here we use laboratory experiments to demonstrate that evolution favors protein mutational robustness if the evolving population is sufficiently large. We neutrally evolve cytochrome P450 proteins under identical selection pressures and mutation rates in populations of different sizes, and show that proteins from the larger and thus more polymorphic population tend towards higher mutational robustness. Proteins from the larger population also evolve greater stability, a biophysical property that is known to enhance both mutational robustness and evolvability. The excess mutational robustness and stability is well described by mathematical theory, and can be quantitatively related to the way that the proteins occupy their neutral network. Conclusion: Our work is the first experimental demonstration of the general tendency of evolution to favor mutational robustness and protein stability in highly polymorphic populations. We suggest that this phenomenon could contribute to the mutational robustness and evolvability of viruses and bacteria that exist in large populations.

  5. Scale of habitat connectivity and colonization in fragmented nuthatch populations

    NARCIS (Netherlands)

    Langevelde, van F.

    2000-01-01

    Studies of effects of landscape pattern on population dynamics should consider the spatial scale at which habitat connectivity varies relative to the spatial scale of the species' behavioral response. In this paper, I investigate the relationship between the degree of connectivity of wooded patches

  6. Synthesizing large-scale pyroclastic flows: Experimental design, scaling, and first results from PELE

    Science.gov (United States)

    Lube, G.; Breard, E. C. P.; Cronin, S. J.; Jones, J.

    2015-03-01

    Pyroclastic flow eruption large-scale experiment (PELE) is a large-scale facility for experimental studies of pyroclastic density currents (PDCs). It is used to generate high-energy currents involving 500-6500 m{sup 3} natural volcanic material and air that achieve velocities of 7-30 m s{sup -1}, flow thicknesses of 2-4.5 m, and runouts of >35 m. The experimental PDCs are synthesized by a controlled "eruption column collapse" of ash-lapilli suspensions onto an instrumented channel. The first set of experiments are documented here and used to elucidate the main flow regimes that influence PDC dynamic structure. Four phases are identified: (1) mixture acceleration during eruption column collapse, (2) column-slope impact, (3) PDC generation, and (4) ash cloud diffusion. The currents produced are fully turbulent flows and scale well to natural PDCs including small to large scales of turbulent transport. PELE is capable of generating short, pulsed, and sustained currents over periods of several tens of seconds, and dilute surge-like PDCs through to highly concentrated pyroclastic flow-like currents. The surge-like variants develop a basal <0.05 m thick regime of saltating/rolling particles and shifting sand waves, capped by a 2.5-4.5 m thick, turbulent suspension that grades upward to lower particle concentrations. Resulting deposits include stratified dunes, wavy and planar laminated beds, and thin ash cloud fall layers. Concentrated currents segregate into a dense basal underflow of <0.6 m thickness that remains aerated. This is capped by an upper ash cloud surge (1.5-3 m thick) with 10{sup 0} to 10{sup -4} vol % particles. Their deposits include stratified, massive, normally and reversely graded beds, lobate fronts, and laterally extensive veneer facies beyond channel margins.

  7. Novel algorithm of large-scale simultaneous linear equations

    International Nuclear Information System (INIS)

    Fujiwara, T; Hoshi, T; Yamamoto, S; Sogabe, T; Zhang, S-L

    2010-01-01

    We review our recently developed methods of solving large-scale simultaneous linear equations and their applications to electronic structure calculations in both one-electron theory and many-electron theory. The core method is the shifted COCG (conjugate orthogonal conjugate gradient) method based on the Krylov subspace; the most important issues for applications are the shift equation and the seed-switching method, which greatly reduce the computational cost. Applications to nano-scale Si crystals and the double-orbital extended Hubbard model are presented.
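    For readers unfamiliar with Krylov-subspace solvers, the sketch below shows a plain conjugate-gradient iteration for a small symmetric positive-definite system. It only illustrates the basic Krylov iteration pattern; the shifted COCG method of the record additionally treats complex symmetric shifted systems (A + σI)x = b and reuses a single Krylov subspace across all shifts, which this sketch does not implement.

```python
import numpy as np

# Plain conjugate gradient for a symmetric positive-definite system Ax = b.
# Generic Krylov-subspace illustration only, not the shifted COCG method.

def conjugate_gradient(A, b, tol=1e-10, max_iter=1000):
    x = np.zeros_like(b)
    r = b - A @ x          # initial residual
    p = r.copy()           # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p   # update search direction
        rs_old = rs_new
    return x

A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
print(conjugate_gradient(A, b))   # approximately [0.0909, 0.6364]
```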

  8. Rank Order Coding: a Retinal Information Decoding Strategy Revealed by Large-Scale Multielectrode Array Retinal Recordings.

    Science.gov (United States)

    Portelli, Geoffrey; Barrett, John M; Hilgen, Gerrit; Masquelier, Timothée; Maccione, Alessandro; Di Marco, Stefano; Berdondini, Luca; Kornprobst, Pierre; Sernagor, Evelyne

    2016-01-01

    How a population of retinal ganglion cells (RGCs) encodes the visual scene remains an open question. Going beyond individual RGC coding strategies, results in salamander suggest that the relative latencies of an RGC pair encode spatial information. Thus, a population code based on this concerted spiking could be a powerful mechanism to transmit visual information rapidly and efficiently. Here, we tested this hypothesis in mouse by recording simultaneous light-evoked responses from hundreds of RGCs, at pan-retinal level, using a new-generation large-scale, high-density multielectrode array consisting of 4096 electrodes. Interestingly, we did not find any RGCs exhibiting a clear latency tuning to the stimuli, suggesting that in mouse, individual RGC pairs may not provide sufficient information. We show that a significant amount of information is encoded synergistically in the concerted spiking of large RGC populations. Thus, the RGC population response described with relative activities, or ranks, provides more relevant information than classical independent spike-count- or latency-based codes. In particular, we report for the first time that when considering the relative activities across the whole population, the wave of first stimulus-evoked spikes is an accurate indicator of stimulus content. We show that this coding strategy coexists with classical neural codes, and that it is more efficient and faster. Overall, these novel observations suggest that already at the level of the retina, concerted spiking provides a reliable and fast strategy to rapidly transmit new visual scenes.
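    A minimal sketch of the rank-based readout idea: stimuli are compared through the rank order of first-spike latencies across a toy RGC population rather than through absolute latencies or spike counts. The helper functions, template stimuli, and latency values are invented for illustration and do not reflect the recorded data.

```python
import numpy as np

# Toy rank-order readout: decode a stimulus from the rank order of first-spike
# latencies across a small population. All latency values are invented.

def ranks(latencies):
    """Convert first-spike latencies to ranks (0 = earliest-firing cell)."""
    order = np.argsort(latencies)
    r = np.empty_like(order)
    r[order] = np.arange(len(latencies))
    return r

def decode(test_latencies, templates):
    """Pick the template whose rank order best matches the test response
    (highest correlation between rank vectors)."""
    test_r = ranks(test_latencies)
    scores = {name: np.corrcoef(test_r, ranks(t))[0, 1]
              for name, t in templates.items()}
    return max(scores, key=scores.get), scores

templates = {
    "stimA": np.array([12.0, 35.0, 20.0, 50.0, 28.0]),   # ms, 5 cells
    "stimB": np.array([40.0, 15.0, 55.0, 22.0, 30.0]),
}
test = np.array([13.0, 37.0, 19.0, 48.0, 30.0])           # noisy version of "stimA"
print(decode(test, templates))
```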

  9. Large-Scale Systems Control Design via LMI Optimization

    Czech Academy of Sciences Publication Activity Database

    Rehák, Branislav

    2015-01-01

    Vol. 44, No. 3 (2015), p. 247-253. ISSN 1392-124X. Institutional support: RVO:67985556. Keywords: combinatorial linear matrix inequalities; large-scale system; decentralized control. Subject RIV: BC - Control Systems Theory. Impact factor: 0.633, year: 2015

  10. Unraveling The Connectome: Visualizing and Abstracting Large-Scale Connectomics Data

    KAUST Repository

    Al-Awami, Ali K.

    2017-01-01

    -user system seamlessly integrates a diverse set of tools. Our system provides support for the management, provenance, accountability, and auditing of large-scale segmentations. Finally, we present a novel architecture to render very large volumes interactively

  11. Report of the Workshop on Petascale Systems Integration for Large Scale Facilities

    Energy Technology Data Exchange (ETDEWEB)

    Kramer, William T.C.; Walter, Howard; New, Gary; Engle, Tom; Pennington, Rob; Comes, Brad; Bland, Buddy; Tomlison, Bob; Kasdorf, Jim; Skinner, David; Regimbal, Kevin

    2007-10-01

    There are significant issues regarding large-scale system integration that are not being addressed in other forums such as current research portfolios or vendor user groups. Unfortunately, the issues in the area of large-scale system integration often fall into a netherworld: not research, not facilities, not procurement, not operations, not user services. Taken together, these issues along with the impact of sub-optimal integration technology mean that the time required to deploy, integrate, and stabilize a large-scale system may consume up to 20 percent of the useful life of such systems. Improving the state of the art for large-scale systems integration has the potential to increase the scientific productivity of these systems. Sites have significant expertise, but there are no easy ways to leverage this expertise among them. Many issues inhibit the sharing of information, including available time and effort, as well as issues with sharing proprietary information. Vendors also benefit in the long run from the solutions to issues detected during site testing and integration. There is a great deal of enthusiasm for making large-scale system integration a full-fledged partner along with the other major thrusts supported by funding agencies in the definition, design, and use of petascale systems. Integration technology and issues should have a full 'seat at the table' as petascale and exascale initiatives and programs are planned. The workshop attendees identified a wide range of issues and suggested paths forward. Pursuing these with funding opportunities and innovation offers the opportunity to dramatically improve the state of large-scale system integration.

  12. Promoting Handwashing Behavior: The Effects of Large-scale Community and School-level Interventions.

    Science.gov (United States)

    Galiani, Sebastian; Gertler, Paul; Ajzenman, Nicolas; Orsola-Vidal, Alexandra

    2016-12-01

    This paper analyzes a randomized experiment that uses novel strategies to promote handwashing with soap at critical points in time in Peru. It evaluates a large-scale comprehensive initiative that involved both community and school activities in addition to communication campaigns. The analysis indicates that the initiative was successful in reaching the target audience and in increasing the treated population's knowledge about appropriate handwashing behavior. These improvements translated into higher self-reported and observed handwashing with soap at critical junctures. However, no significant improvements in the health of children under the age of 5 years were observed. Copyright © 2015 John Wiley & Sons, Ltd.

  13. Nuclear-pumped lasers for large-scale applications

    International Nuclear Information System (INIS)

    Anderson, R.E.; Leonard, E.M.; Shea, R.F.; Berggren, R.R.

    1989-05-01

    Efficient initiation of large-volume chemical lasers may be achieved by neutron induced reactions which produce charged particles in the final state. When a burst mode nuclear reactor is used as the neutron source, both a sufficiently intense neutron flux and a sufficiently short initiation pulse may be possible. Proof-of-principle experiments are planned to demonstrate lasing in a direct nuclear-pumped large-volume system; to study the effects of various neutron absorbing materials on laser performance; to study the effects of long initiation pulse lengths; to demonstrate the performance of large-scale optics and the beam quality that may be obtained; and to assess the performance of alternative designs of burst systems that increase the neutron output and burst repetition rate. 21 refs., 8 figs., 5 tabs

  14. Large scale sodium-water reaction tests for Monju steam generators

    International Nuclear Information System (INIS)

    Sato, M.; Hiroi, H.; Hori, M.

    1976-01-01

    To demonstrate the safe design of the steam generator system of the prototype fast reactor Monju against the postulated large leak sodium-water reaction, a large scale test facility SWAT-3 was constructed. SWAT-3 is a 1/2.5 scale model of the Monju secondary loop on the basis of the iso-velocity modeling. Two tests have been conducted in SWAT-3 since its construction. The test items using SWAT-3 are discussed, and the description of the facility and the test results are presented

  15. Workplace bullying and sleep disturbances: findings from a large scale cross-sectional survey in the French working population.

    Science.gov (United States)

    Niedhammer, Isabelle; David, Simone; Degioanni, Stephanie; Drummond, Anne; Philip, Pierre; Acquarone, D; Aicardi, F; André-Mazeaud, P; Arsento, M; Astier, R; Baille, H; Bajon-Thery, F; Barre, E; Basire, C; Battu, J L; Baudry, S; Beatini, C; Beaud'huin, N; Becker, C; Bellezza, D; Beque, C; Bernstein, O; Beyssier, C; Blanc-Cascio, F; Blanchet, N; Blondel, C; Boisselot, R; Bordes-Dupuy, G; Borrelly, N; Bouhnik, D; Boulanger, M F; Boulard, J; Borreau, P; Bourret, D; Boustière, A M; Breton, C; Bugeon, G; Buono-Michel, M; Canonne, J F; Capella, D; Cavin-Rey, M; Cervoni, C; Charreton, D; Charrier, D; Chauvin, M A; Chazal, B; Cougnot, C; Cuvelier, G; Dalivoust, G; Daumas, R; Debaille, A; De Bretteville, L; Delaforge, G; Delchambre, A; Domeny, L; Donati, Y; Ducord-Chapelet, J; Duran, C; Durand-Bruguerolle, D; Fabre, D; Faivre, A; Falleri, R; Ferrando, G; Ferrari-Galano, J; Flutet, M; Fouché, J P; Fournier, F; Freyder, E; Galy, M; Garcia, A; Gazazian, G; Gérard, C; Girard, F; Giuge, M; Goyer, C; Gravier, C; Guyomard, A; Hacquin, M C; Halimi, E; Ibagnes, T; Icart, P; Jacquin, M C; Jaubert, B; Joret, J P; Julien, J P; Kacel, M; Kesmedjian, E; Lacroix, P; Lafon-Borelli, M; Lallai, S; Laudicina, J; Leclercq, X; Ledieu, S; Leroy, J; Leroyer, L; Loesche, F; Londi, D; Longueville, J M; Lotte, M C; Louvain, S; Lozé, M; Maculet-Simon, M; Magallon, G; Marcelot, V; Mareel, M C; Martin, P; Masse, A M; Méric, M; Milliet, C; Mokhtari, R; Monville, A M; Muller, B; Obadia, G; Pelser, M; Peres, L; Perez, E; Peyron, M; Peyronnin, F; Postel, S; Presseq, P; Pyronnet, E; Quinsat, C; Raulot-Lapointe, H; Rigaud, P; Robert, F; Robert, O; Roger, K; Roussel, A; Roux, J P; Rubini-Remigy, D; Sabate, N; Saccomano-Pertus, C; Salengro, B; Salengro-Trouillez, P; Samson, E; Sendra-Gille, L; Seyrig, C; Stoll, G; Tarpinian, N; Tavernier, M; Tempesta, S; Terracol, H; Torresani, F; Triglia, M F; Vandomme, V; Vieillard, F; Vilmot, K; Vital, N

    2009-09-01

    The purpose of this study was to explore the associations between workplace bullying, the characteristics of workplace bullying, and sleep disturbances in a large sample of employees of the French working population. Workplace bullying, evaluated using the validated instrument developed by Leymann, and sleep disturbances, as well as covariates, were measured using a self-administered questionnaire. Covariates included age, marital status, presence of children, education, occupation, working hours, night work, physical and chemical exposures at work, self-reported health, and depressive symptoms. Statistical analysis was performed using logistic regression and was carried out separately for men and women. Setting: general working population. The study population consisted of a random sample of 3132 men and 4562 women of the working population in the southeast of France. Workplace bullying was strongly associated with sleep disturbances. Past exposure to bullying also increased the risk for this outcome. The more frequent the exposure to bullying, the higher the risk of experiencing sleep disturbances. Observing someone else being bullied in the workplace was also associated with the outcome. Adjustment for covariates did not modify the results. Additional adjustment for self-reported health and depressive symptoms diminished the magnitude of the associations, which nevertheless remained significant. The prevalence of workplace bullying (around 10%) was found to be high in this study, as was the impact of this major job-related stressor on sleep disturbances. Although no conclusion about causality could be drawn from this cross-sectional study, the findings suggest that the contribution of workplace bullying to the burden of sleep disturbances may be substantial.

  16. A Chain Perspective on Large-scale Number Systems

    NARCIS (Netherlands)

    Grijpink, J.H.A.M.

    2012-01-01

    As large-scale number systems gain significance in social and economic life (electronic communication, remote electronic authentication), the correct functioning and the integrity of public number systems take on crucial importance. They are needed to uniquely indicate people, objects or phenomena

  17. Image-based Exploration of Large-Scale Pathline Fields

    KAUST Repository

    Nagoor, Omniah H.

    2014-01-01

    structure in which each pixel contains a list of pathlines segments. With this view-dependent method it is possible to filter, color-code and explore large-scale flow data in real-time. In addition, optimization techniques such as early-ray termination

  18. Research on the Construction Management and Sustainable Development of Large-Scale Scientific Facilities in China

    Science.gov (United States)

    Guiquan, Xi; Lin, Cong; Xuehui, Jin

    2018-05-01

    As an important platform for scientific and technological development, large-scale scientific facilities are the cornerstone of technological innovation and a guarantee for economic and social development. Research on the management of large-scale scientific facilities can play a key role in scientific research, sociology, and national strategy. This paper reviews the characteristics of large-scale scientific facilities and summarizes the development status of China's large-scale scientific facilities. Finally, the construction, management, operation and evaluation of large-scale scientific facilities are analyzed from the perspective of sustainable development.

  19. Development and analysis of prognostic equations for mesoscale kinetic energy and mesoscale (subgrid scale) fluxes for large-scale atmospheric models

    Science.gov (United States)

    Avissar, Roni; Chen, Fei

    1993-01-01

    Mesoscale circulation processes generated by landscape discontinuities (e.g., sea breezes) are not represented in large-scale atmospheric models (e.g., general circulation models), which have an inappropriate grid-scale resolution. With the assumption that atmospheric variables can be separated into large-scale, mesoscale, and turbulent-scale components, a set of prognostic equations applicable in large-scale atmospheric models for momentum, temperature, moisture, and any other gaseous or aerosol material, which includes both mesoscale and turbulent fluxes, is developed. Prognostic equations are also developed for these mesoscale fluxes, which indicate a closure problem and, therefore, require a parameterization. For this purpose, the mean mesoscale kinetic energy (MKE) per unit mass is used, defined as Ẽ = 0.5⟨u′(sub i){sup 2}⟩, where u′(sub i) represents the three Cartesian components of a mesoscale circulation, the angle brackets denote the grid-scale, horizontal averaging operator in the large-scale model, and a tilde indicates a corresponding large-scale mean value. A prognostic equation is developed for Ẽ, and an analysis of the different terms of this equation indicates that the mesoscale vertical heat flux, the mesoscale pressure correlation, and the interaction between turbulence and mesoscale perturbations are the major terms that affect the time tendency of Ẽ. A state-of-the-art mesoscale atmospheric model is used to investigate the relationship between MKE, landscape discontinuities (as characterized by the spatial distribution of heat fluxes at the earth's surface), and mesoscale sensible and latent heat fluxes in the atmosphere. MKE is compared with turbulence kinetic energy to illustrate the importance of mesoscale processes as compared to turbulent processes. This analysis emphasizes the potential use of MKE to bridge between landscape discontinuities and mesoscale fluxes and, therefore, to parameterize mesoscale fluxes.
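    The MKE diagnostic defined above can be sketched numerically: split a fine-resolution velocity field into its large-scale (grid-box mean) part and a mesoscale perturbation, then average half the squared perturbation over each large-scale grid box. The function below is a minimal sketch; the random fields are placeholders standing in for mesoscale-model output.

```python
import numpy as np

# Minimal MKE sketch: E = 0.5 * <u'_i u'_i> per large-scale grid box, where
# the perturbation u' is the departure from the grid-box mean. The input
# fields here are random placeholders, not real model output.

def mesoscale_kinetic_energy(u, v, w, box):
    """u, v, w: 2-D slices of the three velocity components on a fine mesh;
    box: number of fine cells per side of one large-scale grid box."""
    ny, nx = u.shape
    nby, nbx = ny // box, nx // box
    E = np.zeros((nby, nbx))
    for comp in (u, v, w):
        # Reshape into (coarse_y, box, coarse_x, box) blocks.
        blocks = comp[:nby*box, :nbx*box].reshape(nby, box, nbx, box)
        mean = blocks.mean(axis=(1, 3), keepdims=True)   # large-scale part
        pert = blocks - mean                             # mesoscale perturbation
        E += 0.5 * (pert**2).mean(axis=(1, 3))
    return E

rng = np.random.default_rng(0)
u, v, w = (rng.normal(size=(64, 64)) for _ in range(3))
print(mesoscale_kinetic_energy(u, v, w, box=16).shape)   # -> (4, 4)
```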

  20. Solving large scale structure in ten easy steps with COLA

    Energy Technology Data Exchange (ETDEWEB)

    Tassev, Svetlin [Department of Astrophysical Sciences, Princeton University, 4 Ivy Lane, Princeton, NJ 08544 (United States); Zaldarriaga, Matias [School of Natural Sciences, Institute for Advanced Study, Olden Lane, Princeton, NJ 08540 (United States); Eisenstein, Daniel J., E-mail: stassev@cfa.harvard.edu, E-mail: matiasz@ias.edu, E-mail: deisenstein@cfa.harvard.edu [Center for Astrophysics, Harvard University, 60 Garden Street, Cambridge, MA 02138 (United States)

    2013-06-01

    We present the COmoving Lagrangian Acceleration (COLA) method: an N-body method for solving for Large Scale Structure (LSS) in a frame that is comoving with observers following trajectories calculated in Lagrangian Perturbation Theory (LPT). Unlike standard N-body methods, the COLA method can straightforwardly trade accuracy at small-scales in order to gain computational speed without sacrificing accuracy at large scales. This is especially useful for cheaply generating large ensembles of accurate mock halo catalogs required to study galaxy clustering and weak lensing, as those catalogs are essential for performing detailed error analysis for ongoing and future surveys of LSS. As an illustration, we ran a COLA-based N-body code on a box of size 100 Mpc/h with particles of mass ≈ 5 × 10{sup 9} M{sub sun}/h. Running the code with only 10 timesteps was sufficient to obtain an accurate description of halo statistics down to halo masses of at least 10{sup 11} M{sub sun}/h. This is only at a modest speed penalty when compared to mocks obtained with LPT. A standard detailed N-body run is orders of magnitude slower than our COLA-based code. The speed-up we obtain with COLA is due to the fact that we calculate the large-scale dynamics exactly using LPT, while letting the N-body code solve for the small scales, without requiring it to capture exactly the internal dynamics of halos. Achieving a similar level of accuracy in halo statistics without the COLA method requires at least 3 times more timesteps than when COLA is employed.

  1. Concepts for Large Scale Hydrogen Production

    OpenAIRE

    Jakobsen, Daniel; Åtland, Vegar

    2016-01-01

    The objective of this thesis is to perform a techno-economic analysis of large-scale, carbon-lean hydrogen production in Norway, in order to evaluate various production methods and estimate a breakeven price level. Norway possesses vast energy resources and the export of oil and gas is vital to the country's economy. The results of this thesis indicate that hydrogen represents a viable, carbon-lean opportunity to utilize these resources, which can prove key in the future of Norwegian energy e...

  2. Measuring Cosmic Expansion and Large Scale Structure with Destiny

    Science.gov (United States)

    Benford, Dominic J.; Lauer, Tod R.

    2007-01-01

    Destiny is a simple, direct, low cost mission to determine the properties of dark energy by obtaining a cosmologically deep supernova (SN) type Ia Hubble diagram and by measuring the large-scale mass power spectrum over time. Its science instrument is a 1.65m space telescope, featuring a near-infrared survey camera/spectrometer with a large field of view. During its first two years, Destiny will detect, observe, and characterize 23000 SN Ia events over the redshift interval 0.4 < z < 1.7. Destiny will be used in its third year as a high resolution, wide-field imager to conduct a weak lensing survey covering >1000 square degrees to measure the large-scale mass power spectrum. The combination of surveys is much more powerful than either technique on its own, and will have over an order of magnitude greater sensitivity than will be provided by ongoing ground-based projects.

  3. A framework for comparative evaluation of dosimetric methods to triage a large population following a radiological event

    International Nuclear Information System (INIS)

    Flood, Ann Barry; Nicolalde, Roberto J.; Demidenko, Eugene; Williams, Benjamin B.; Shapiro, Alla; Wiley, Albert L.; Swartz, Harold M.

    2011-01-01

    Background: To prepare for a possible major radiation disaster involving large numbers of potentially exposed people, it is important to be able to rapidly and accurately triage people for treatment or not, factoring in the likely conditions and available resources. To date, planners have had to create guidelines for triage based on methods for estimating dose that are clinically available and which use evidence extrapolated from unrelated conditions. Current guidelines consequently focus on measuring clinical symptoms (e.g., time-to-vomiting), which may not be subject to the same verification of standard methods and validation processes required for governmental approval processes of new and modified procedures. Biodosimeters under development have not yet been formally approved for this use. Neither set of methods has been tested in settings involving large-scale populations at risk for exposure. Objective: To propose a framework for comparative evaluation of methods for such triage and to evaluate biodosimetric methods that are currently recommended and new methods as they are developed. Methods: We adapt the NIH model of scientific evaluations and sciences needed for effective translational research to apply to biodosimetry for triaging very large populations following a radiation event. We detail criteria for translating basic science about dosimetry into effective multi-stage triage of large populations and illustrate it by analyzing 3 current guidelines and 3 advanced methods for biodosimetry. Conclusions: This framework for evaluating dosimetry in large populations is a useful technique to compare the strengths and weaknesses of different dosimetry methods. It can help policy-makers and planners not only to compare the methods' strengths and weaknesses for their intended use but also to develop an integrated approach to maximize their effectiveness. It also reveals weaknesses in methods that would benefit from further research and evaluation.

  4. A framework for comparative evaluation of dosimetric methods to triage a large population following a radiological event

    Energy Technology Data Exchange (ETDEWEB)

    Flood, Ann Barry, E-mail: Ann.B.Flood@Dartmouth.Edu [Dartmouth Physically Based Biodosimetry Center for Medical Countermeasures Against Radiation (Dart-Dose CMCR), Dartmouth Medical School, Hanover, NH 03768 (United States); Nicolalde, Roberto J., E-mail: Roberto.J.Nicolalde@Dartmouth.Edu [Dartmouth Physically Based Biodosimetry Center for Medical Countermeasures Against Radiation (Dart-Dose CMCR), Dartmouth Medical School, Hanover, NH 03768 (United States); Demidenko, Eugene, E-mail: Eugene.Demidenko@Dartmouth.Edu [Dartmouth Physically Based Biodosimetry Center for Medical Countermeasures Against Radiation (Dart-Dose CMCR), Dartmouth Medical School, Hanover, NH 03768 (United States); Williams, Benjamin B., E-mail: Benjamin.B.Williams@Dartmouth.Edu [Dartmouth Physically Based Biodosimetry Center for Medical Countermeasures Against Radiation (Dart-Dose CMCR), Dartmouth Medical School, Hanover, NH 03768 (United States); Shapiro, Alla, E-mail: Alla.Shapiro@fda.hhs.gov [Food and Drug Administration (FDA), Rockville, MD (United States); Wiley, Albert L., E-mail: Albert.Wiley@orise.orau.gov [Oak Ridge Institute for Science and Education (ORISE), Oak Ridge, TN (United States); Swartz, Harold M., E-mail: Harold.M.Swartz@Dartmouth.Edu [Dartmouth Physically Based Biodosimetry Center for Medical Countermeasures Against Radiation (Dart-Dose CMCR), Dartmouth Medical School, Hanover, NH 03768 (United States)

    2011-09-15

    Background: To prepare for a possible major radiation disaster involving large numbers of potentially exposed people, it is important to be able to rapidly and accurately triage people for treatment or not, factoring in the likely conditions and available resources. To date, planners have had to create guidelines for triage based on methods for estimating dose that are clinically available and which use evidence extrapolated from unrelated conditions. Current guidelines consequently focus on measuring clinical symptoms (e.g., time-to-vomiting), which may not be subject to the same verification of standard methods and validation processes required for governmental approval processes of new and modified procedures. Biodosimeters under development have not yet been formally approved for this use. Neither set of methods has been tested in settings involving large-scale populations at risk for exposure. Objective: To propose a framework for comparative evaluation of methods for such triage and to evaluate biodosimetric methods that are currently recommended and new methods as they are developed. Methods: We adapt the NIH model of scientific evaluations and sciences needed for effective translational research to apply to biodosimetry for triaging very large populations following a radiation event. We detail criteria for translating basic science about dosimetry into effective multi-stage triage of large populations and illustrate it by analyzing 3 current guidelines and 3 advanced methods for biodosimetry. Conclusions: This framework for evaluating dosimetry in large populations is a useful technique to compare the strengths and weaknesses of different dosimetry methods. It can help policy-makers and planners not only to compare the methods' strengths and weaknesses for their intended use but also to develop an integrated approach to maximize their effectiveness. It also reveals weaknesses in methods that would benefit from further research and evaluation.

  5. Test-particle simulations of SEP propagation in IMF with large-scale fluctuations

    Science.gov (United States)

    Kelly, J.; Dalla, S.; Laitinen, T.

    2012-11-01

    The results of full-orbit test-particle simulations of SEPs propagating through an IMF which exhibits large-scale fluctuations are presented. A variety of propagation conditions are simulated - scatter-free, and scattering with mean free path, λ, of 0.3 and 2.0 AU - and the cross-field transport of SEPs is investigated. When calculating cross-field displacements the Parker spiral geometry is accounted for and the role of magnetic field expansion is taken into account. It is found that transport across the magnetic field is enhanced in the λ = 0.3 AU and λ = 2 AU cases, compared to the scatter-free case, with the λ = 2 AU case in particular containing outlying particles that had strayed a large distance across the IMF. Outliers are categorized by means of Chauvenet's criterion and it is found that typically between 1 and 2% of the population falls within this category. The ratio of latitudinal to longitudinal diffusion coefficient perpendicular to the magnetic field is typically 0.2, suggesting that transport in latitude is less efficient.
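
    Chauvenet's criterion, used above to flag outlying particles, rejects a point if the expected number of equally extreme points in the sample is below one half. A minimal sketch (the test data and normal fit are assumptions for illustration):

```python
import numpy as np
from scipy.stats import norm

def chauvenet_outliers(x):
    """Flag values whose expected count under a fitted normal is < 0.5 (Chauvenet's criterion)."""
    x = np.asarray(x, dtype=float)
    n, mean, std = x.size, x.mean(), x.std(ddof=1)
    tail_prob = 2.0 * norm.sf(np.abs(x - mean) / std)   # two-sided tail probability
    return n * tail_prob < 0.5                          # boolean mask of outliers

# Toy example: cross-field displacements with a few extreme stragglers
rng = np.random.default_rng(0)
displacements = np.concatenate([rng.normal(0.0, 0.05, 1000), [0.8, -0.9]])
mask = chauvenet_outliers(displacements)
print(f"outlier fraction: {mask.mean():.2%}")           # of order a percent, as in the study
```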

  6. Inferring infection hazard in wildlife populations by linking data across individual and population scales

    Science.gov (United States)

    Pepin, Kim M.; Kay, Shannon L.; Golas, Ben D.; Shriner, Susan A.; Gilbert, Amy T.; Miller, Ryan S.; Graham, Andrea L.; Riley, Steven; Cross, Paul C.; Samuel, Michael D.; Hooten, Mevin B.; Hoeting, Jennifer A.; Lloyd-Smith, James O.; Webb, Colleen T.; Buhnerkempe, Michael G.

    2017-01-01

    Our ability to infer unobservable disease-dynamic processes such as force of infection (FOI; the infection hazard for susceptible hosts) has transformed our understanding of disease transmission mechanisms and capacity to predict disease dynamics. Conventional methods for inferring FOI estimate a time-averaged value and are based on population-level processes. Because many pathogens exhibit epidemic cycling and FOI is the result of processes acting across the scales of individuals and populations, a flexible framework that extends to epidemic dynamics and links within-host processes to FOI is needed. Specifically, within-host antibody kinetics in wildlife hosts can be short-lived and produce patterns that are repeatable across individuals, suggesting individual-level antibody concentrations could be used to infer time since infection and hence FOI. Using simulations and case studies (influenza A in lesser snow geese and Yersinia pestis in coyotes), we argue that with careful experimental and surveillance design, the population-level FOI signal can be recovered from individual-level antibody kinetics, despite substantial individual-level variation. In addition to improving inference, the cross-scale quantitative antibody approach we describe can reveal insights into drivers of individual-based variation in disease response, and the role of poorly understood processes such as secondary infections, in population-level dynamics of disease.
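
    As a rough illustration of the cross-scale idea (a toy model with assumed antibody kinetics and parameter values, not the hierarchical model used in the study), time since infection can be back-calculated from individual titers and converted into a crude force-of-infection estimate:

```python
import numpy as np

# Toy sketch: antibody titer decays exponentially after infection, titer(t) = A0 * exp(-k*t).
# Inverting observed titers gives each seropositive individual's time since infection, from
# which a crude force of infection (recent infections per susceptible per day) is estimated.
A0, k = 1000.0, 0.05                                         # assumed peak titer and decay rate (per day)
rng = np.random.default_rng(1)

true_ages = rng.exponential(100.0, 200)                      # days since infection (simulated)
titers = A0 * np.exp(-k * true_ages) * rng.lognormal(0.0, 0.2, 200)

est_ages = -np.log(np.clip(titers / A0, 1e-6, None)) / k     # invert the decay model
window, n_susceptible = 30.0, 800                            # assumed recency window and susceptible pool
foi = np.sum(est_ages < window) / (n_susceptible * window)
print(f"estimated FOI ≈ {foi:.4f} infections per susceptible per day")
```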

  7. Testing, development and demonstration of large scale solar district heating systems

    DEFF Research Database (Denmark)

    Furbo, Simon; Fan, Jianhua; Perers, Bengt

    2015-01-01

    In 2013-2014 the project “Testing, development and demonstration of large scale solar district heating systems” was carried out within the Sino-Danish Renewable Energy Development Programme, the so-called RED programme, jointly developed by the Chinese and Danish governments. In the project Danish...... know-how on solar heating plants and solar heating test technology have been transferred from Denmark to China, large solar heating systems have been promoted in China, test capabilities on solar collectors and large scale solar heating systems have been improved in China and Danish-Chinese cooperation...

  8. Optimal Selection of AC Cables for Large Scale Offshore Wind Farms

    DEFF Research Database (Denmark)

    Hou, Peng; Hu, Weihao; Chen, Zhe

    2014-01-01

    The investment in large scale offshore wind farms is high, and the electrical system makes a significant contribution to the total cost. As one of the key components, the cost of the connection cables strongly affects the initial investment. The development of cable manufacturing provides a vast...... and systematic way for the optimal selection of cables in large scale offshore wind farms....

  9. Temporal fluctuation scaling in populations and communities

    Science.gov (United States)

    Michael Kalyuzhny; Yishai Schreiber; Rachel Chocron; Curtis H. Flather; David A. Kessler; Nadav M. Shnerb

    2014-01-01

    Taylor's law, one of the most widely accepted generalizations in ecology, states that the variance of a population abundance time series scales as a power law of its mean. Here we reexamine this law and the empirical evidence presented in support of it. Specifically, we show that the exponent generally depends on the length of the time series, and its value...
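
    Taylor's law itself is the simple power-law relation below (standard form, given here for orientation; the record's point is that the fitted exponent b is not as universal as often assumed):

```latex
% Taylor's law: the variance of an abundance time series scales as a power of its mean,
\operatorname{Var}(N) = a\,\bar{N}^{\,b},
\qquad
\log \operatorname{Var}(N) = \log a + b \log \bar{N}
% so b is estimated as the slope of a log-log regression of variance on mean abundance.
```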

  10. Large-scale river regulation

    International Nuclear Information System (INIS)

    Petts, G.

    1994-01-01

    Recent concern over human impacts on the environment has tended to focus on climatic change, desertification, destruction of tropical rain forests, and pollution. Yet large-scale water projects such as dams, reservoirs, and inter-basin transfers are among the most dramatic and extensive ways in which our environment has been, and continues to be, transformed by human action. Water running to the sea is perceived as a lost resource, floods are viewed as major hazards, and wetlands are seen as wastelands. River regulation, involving the redistribution of water in time and space, is a key concept in socio-economic development. To achieve water and food security, to develop drylands, and to prevent desertification and drought are primary aims for many countries. A second key concept is ecological sustainability. Yet the ecology of rivers and their floodplains is dependent on the natural hydrological regime, and its related biochemical and geomorphological dynamics. (Author)

  11. The Large-scale Effect of Environment on Galactic Conformity

    Science.gov (United States)

    Sun, Shuangpeng; Guo, Qi; Wang, Lan; Wang, Jie; Gao, Liang; Lacey, Cedric G.; Pan, Jun

    2018-04-01

    We use a volume-limited galaxy sample from the SDSS Data Release 7 to explore the dependence of galactic conformity on the large-scale environment, measured on ˜ 4 Mpc scales. We find that the star formation activity of neighbour galaxies depends more strongly on the environment than on the activity of their primary galaxies. In under-dense regions most neighbour galaxies tend to be active, while in over-dense regions neighbour galaxies are mostly passive, regardless of the activity of their primary galaxies. At a given stellar mass, passive primary galaxies reside in higher density regions than active primary galaxies, leading to the apparently strong conformity signal. The dependence of the activity of neighbour galaxies on environment can be explained by the corresponding dependence of the fraction of satellite galaxies. Similar results are found for galaxies in a semi-analytical model, suggesting that no new physics is required to explain the observed large-scale conformity.

  12. Investigating the dependence of SCM simulated precipitation and clouds on the spatial scale of large-scale forcing at SGP

    Science.gov (United States)

    Tang, Shuaiqi; Zhang, Minghua; Xie, Shaocheng

    2017-08-01

    Large-scale forcing data, such as vertical velocity and advective tendencies, are required to drive single-column models (SCMs), cloud-resolving models, and large-eddy simulations. Previous studies suggest that some errors of these model simulations could be attributed to the lack of spatial variability in the specified domain-mean large-scale forcing. This study investigates the spatial variability of the forcing and explores its impact on SCM simulated precipitation and clouds. A gridded large-scale forcing data during the March 2000 Cloud Intensive Operational Period at the Atmospheric Radiation Measurement program's Southern Great Plains site is used for analysis and to drive the single-column version of the Community Atmospheric Model Version 5 (SCAM5). When the gridded forcing data show large spatial variability, such as during a frontal passage, SCAM5 with the domain-mean forcing is not able to capture the convective systems that are partly located in the domain or that only occupy part of the domain. This problem has been largely reduced by using the gridded forcing data, which allows running SCAM5 in each subcolumn and then averaging the results within the domain. This is because the subcolumns have a better chance to capture the timing of the frontal propagation and the small-scale systems. Other potential uses of the gridded forcing data, such as understanding and testing scale-aware parameterizations, are also discussed.
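
    The benefit of gridded (subcolumn) forcing over domain-mean forcing comes from the nonlinearity of moist processes: averaging the forcing before applying a nonlinear response is not the same as averaging the responses. A toy sketch of that effect (the response function and forcing values are assumptions, not SCAM5):

```python
import numpy as np

def toy_precip(forcing):
    """Nonlinear 'convective' response: precipitation only above a forcing threshold."""
    return np.maximum(forcing - 1.0, 0.0) ** 1.5

rng = np.random.default_rng(0)
subcolumn_forcing = rng.gamma(shape=2.0, scale=0.8, size=16)   # spatially variable forcing

mean_of_runs = toy_precip(subcolumn_forcing).mean()   # gridded forcing: run each subcolumn, then average
run_of_mean  = toy_precip(subcolumn_forcing.mean())   # domain-mean forcing: single run

print(f"average over subcolumn runs: {mean_of_runs:.3f}")
print(f"single run on mean forcing : {run_of_mean:.3f}")
```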

  13. Dynamic Reactive Power Compensation of Large Scale Wind Integrated Power System

    DEFF Research Database (Denmark)

    Rather, Zakir Hussain; Chen, Zhe; Thøgersen, Paul

    2015-01-01

    Due to progressive displacement of conventional power plants by wind turbines, dynamic security of large scale wind integrated power systems gets significantly compromised. In this paper we first highlight the importance of dynamic reactive power support/voltage security in large scale wind integrated power systems with least presence of conventional power plants. Then we propose a mixed integer dynamic optimization based method for optimal dynamic reactive power allocation in large scale wind integrated power systems. One of the important aspects of the proposed methodology is that unlike...... wind turbines, especially wind farms with additional grid support functionalities like dynamic support (e.g. dynamic reactive power support etc.) and ii) refurbishment of existing conventional central power plants to synchronous condensers could be one of the efficient, reliable and cost effective options......

  14. Constructing sites on a large scale

    DEFF Research Database (Denmark)

    Braae, Ellen Marie; Tietjen, Anne

    2011-01-01

    Since the 1990s, the regional scale has regained importance in urban and landscape design. In parallel, the focus in design tasks has shifted from master plans for urban extension to strategic urban transformation projects. A prominent example of a contemporary spatial development approach is the IBA Emscher Park in the Ruhr area in Germany. Over a 10-year period (1988-1998), more than 100 local transformation projects contributed to the transformation from an industrial to a post-industrial region. The current paradigm of planning by projects reinforces the role of the design disciplines...... for setting the design brief in a large scale urban landscape in Norway, the Jaeren region around the city of Stavanger. In this paper, we first outline the methodological challenges and then present and discuss the proposed method based on our teaching experiences. On this basis, we discuss aspects......

  15. Report of the LASCAR forum: Large scale reprocessing plant safeguards

    International Nuclear Information System (INIS)

    1992-01-01

    This report has been prepared to provide information on the studies which were carried out from 1988 to 1992 under the auspices of the multinational forum known as Large Scale Reprocessing Plant Safeguards (LASCAR) on safeguards for four large scale reprocessing plants operated or planned to be operated in the 1990s. The report summarizes all of the essential results of these studies. The participants in LASCAR were from France, Germany, Japan, the United Kingdom, the United States of America, the Commission of the European Communities - Euratom, and the International Atomic Energy Agency

  16. Eight attention points when evaluating large-scale public sector reforms

    DEFF Research Database (Denmark)

    Hansen, Morten Balle; Breidahl, Karen Nielsen; Furubo, Jan-Eric

    2017-01-01

    This chapter analyses the challenges related to evaluations of large-scale public sector reforms. It is based on a meta-evaluation of the evaluation of the reform of the Norwegian Labour Market and Welfare Administration (the NAV-reform) in Norway, which entailed both a significant reorganization...... sector reforms. Based on the analysis, eight crucial points of attention when evaluating large-scale public sector reforms are elaborated. We discuss their reasons and argue that other countries will face the same challenges and thus can learn from the experiences of Norway....

  17. Large-scale melting and impact mixing on early-formed asteroids

    DEFF Research Database (Denmark)

    Greenwood, Richard; Barrat, J.-A.; Scott, Edward Robert Dalton

    Large-scale melting of asteroids and planetesimals is now known to have taken place extremely early in solar system history [1]. The first-generation bodies produced by this process would have been subject to rapid collisional reprocessing, leading in most cases to fragmentation and/or accretion...... the relationship between the different groups of achondrites [3, 4]. Here we present new oxygen isotope evidence concerning the role of large-scale melting and subsequent impact mixing in the evolution of three important achondrite groups: the main-group pallasites, mesosiderites and HEDs.

  18. About Ganoderma boninense in oil palm plantations of Sumatra and peninsular Malaysia: Ancient population expansion, extensive gene flow and large scale dispersion ability.

    Science.gov (United States)

    Mercière, Maxime; Boulord, Romain; Carasco-Lacombe, Catherine; Klopp, Christophe; Lee, Yang-Ping; Tan, Joon-Sheong; Syed Alwee, Sharifah S R; Zaremski, Alba; De Franqueville, Hubert; Breton, Frédéric; Camus-Kulandaivelu, Létizia

    Wood rot fungi form one of the main classes of phytopathogenic fungus. The group includes many species, but has remained poorly studied. Many species belonging to the Ganoderma genus are well known for causing decay in a wide range of tree species around the world. Ganoderma boninense, causal agent of oil palm basal stem rot, is responsible for considerable yield losses in Southeast Asian oil palm plantations. In a large-scale sampling operation, 357 sporophores were collected from oil palm plantations spread over peninsular Malaysia and Sumatra and genotyped using 11 SSR markers. The genotyping of these samples made it possible to investigate the population structure and demographic history of G. boninense across the oldest known area of interaction between oil palm and G. boninense. Results show that G. boninense possesses a high degree of genetic diversity and no detectable genetic structure at the scale of Sumatra and peninsular Malaysia. The fact that few duplicate genotypes were found in several studies including this one supports the hypothesis of spore dispersal in the spread of G. boninense. Meanwhile, spatial autocorrelation analysis shows that G. boninense is able to disperse across both short and long distances. These results bring new insight into mechanisms by which G. boninense spreads in oil palm plantations. Finally, the use of approximate Bayesian computation (ABC) modelling indicates that G. boninense has undergone a demographic expansion in the past, probably before the oil palm was introduced into Southeast Asia. Copyright © 2017 British Mycological Society. Published by Elsevier Ltd. All rights reserved.

  19. Updating Geospatial Data from Large Scale Data Sources

    Science.gov (United States)

    Zhao, R.; Chen, J.; Wang, D.; Shang, Y.; Wang, Z.; Li, X.; Ai, T.

    2011-08-01

    In the past decades, many geospatial databases have been established at national, regional and municipal levels over the world. Nowadays, it is widely recognized that how to update these established geospatial databases and keep them up to date is most critical for their value. So, more and more efforts have been devoted to the continuous updating of these geospatial databases. Currently, there exist two main types of methods for geospatial database updating: directly updating with remote sensing images or field surveying materials, and indirectly updating with other updated data results such as newly updated larger-scale data. The former method is the basis, because the update data sources in both methods ultimately derive from field surveying and remote sensing. The latter method is often more economical and faster than the former. Therefore, after the larger-scale database is updated, the smaller-scale database should be updated correspondingly in order to keep the consistency of the multi-scale geospatial database. In this situation, it is very reasonable to apply map generalization technology to the process of geospatial database updating. The latter is recognized as one of the most promising methods of geospatial database updating, especially in a collaborative updating environment in terms of map scale, i.e., where different-scale databases are produced and maintained separately by organizations at different levels, such as in China. This paper is focused on applying digital map generalization to the updating of geospatial databases from large scale in the collaborative updating environment for SDI. The requirements of the application of map generalization to spatial database updating are analyzed firstly. A brief review on geospatial data updating based on digital map generalization is then given. Based on the requirements analysis and review, we analyze the key factors for implementing updating geospatial data from large scale including technical

  20. Large Deviations for Two-Time-Scale Diffusions, with Delays

    International Nuclear Information System (INIS)

    Kushner, Harold J.

    2010-01-01

    We consider the problem of large deviations for a two-time-scale reflected diffusion process, possibly with delays in the dynamical terms. The Dupuis-Ellis weak convergence approach is used. It is perhaps the most intuitive and simplest for the problems of concern. The results have applications to the problem of approximating optimal controls for two-time-scale systems via use of the averaged equation.

  1. The (in)effectiveness of Global Land Policies on Large-Scale Land Acquisition

    NARCIS (Netherlands)

    Verhoog, S.M.

    2014-01-01

    Due to current crises, large-scale land acquisition (LSLA) is becoming a topic of growing concern. Public data from the ‘Land Matrix Global Observatory’ project (Land Matrix 2014a) demonstrates that since 2000, 1,664 large-scale land transactions in low- and middle-income countries were reported,

  2. Large Scale Landslide Database System Established for the Reservoirs in Southern Taiwan

    Science.gov (United States)

    Tsai, Tsai-Tsung; Tsai, Kuang-Jung; Shieh, Chjeng-Lun

    2017-04-01

    Typhoon Morakot's severe attack on southern Taiwan awakened public awareness of large scale landslide disasters. Large scale landslide disasters produce large quantities of sediment, with negative effects on the operating functions of reservoirs. In order to reduce the risk of these disasters within the study area, the establishment of a database for hazard mitigation / disaster prevention is necessary. Real time data and numerous archives of engineering data, environment information, photos, and video will not only help people make appropriate decisions, but also allow this material to be processed and given added value. The study tried to define some basic data formats / standards for the various types of data collected about these reservoirs and then provide a management platform based on these formats / standards. Meanwhile, in order to satisfy practicality and convenience, the large scale landslide disaster database system is built with the ability both to provide and to receive information, so that users can work with it on different types of devices. IT technology progresses extremely quickly, so even the most modern system might be out of date at any time. In order to provide long term service, the system reserves the possibility of user-defined data formats / standards and a user-defined system structure. The system established by this study was based on the HTML5 standard language and uses responsive web design technology. This makes it easy for users to handle and further develop this large scale landslide disaster database system.

  3. A review of large-scale solar heating systems in Europe

    International Nuclear Information System (INIS)

    Fisch, M.N.; Guigas, M.; Dalenback, J.O.

    1998-01-01

    Large-scale solar applications benefit from the effect of scale. Compared to small solar domestic hot water (DHW) systems for single-family houses, the solar heat cost can be cut by at least a third. The most interesting projects for replacing fossil fuels and the reduction of CO 2 -emissions are solar systems with seasonal storage in combination with gas or biomass boilers. In the framework of the EU-APAS project Large-scale Solar Heating Systems, thirteen existing plants in six European countries have been evaluated. The yearly solar gains of the systems are between 300 and 550 kWh per m 2 collector area. The investment cost of solar plants with short-term storage varies from 300 up to 600 ECU per m 2 . Systems with seasonal storage show investment costs twice as high. Results of studies concerning the market potential for solar heating plants, taking new collector concepts and industrial production into account, are presented. Site specific studies and predesign of large-scale solar heating plants in six European countries for housing developments show a 50% cost reduction compared to existing projects. The cost-benefit-ratio for the planned systems with long-term storage is between 0.7 and 1.5 ECU per kWh per year. (author)

  4. The Modified HZ Conjugate Gradient Algorithm for Large-Scale Nonsmooth Optimization.

    Science.gov (United States)

    Yuan, Gonglin; Sheng, Zhou; Liu, Wenjie

    2016-01-01

    In this paper, the Hager and Zhang (HZ) conjugate gradient (CG) method and the modified HZ (MHZ) CG method are presented for large-scale nonsmooth convex minimization. Under some mild conditions, convergence results for the proposed methods are established. Numerical results show that the presented methods are more efficient for large-scale nonsmooth problems; several problems are tested (with dimensions of up to 100,000 variables).

  5. The Modified HZ Conjugate Gradient Algorithm for Large-Scale Nonsmooth Optimization.

    Directory of Open Access Journals (Sweden)

    Gonglin Yuan

    Full Text Available In this paper, the Hager and Zhang (HZ) conjugate gradient (CG) method and the modified HZ (MHZ) CG method are presented for large-scale nonsmooth convex minimization. Under some mild conditions, convergence results for the proposed methods are established. Numerical results show that the presented methods are more efficient for large-scale nonsmooth problems; several problems are tested (with dimensions of up to 100,000 variables).
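
    For orientation, the HZ search direction at the core of both methods can be sketched as below on a smooth test problem (the Armijo backtracking line search and restart safeguard are assumptions made for this sketch; the paper's MHZ modification and its handling of nonsmooth objectives are not reproduced here):

```python
import numpy as np

def rosenbrock(x):
    return np.sum(100.0 * (x[1:] - x[:-1] ** 2) ** 2 + (1.0 - x[:-1]) ** 2)

def rosenbrock_grad(x):
    g = np.zeros_like(x)
    g[:-1] = -400.0 * x[:-1] * (x[1:] - x[:-1] ** 2) - 2.0 * (1.0 - x[:-1])
    g[1:] += 200.0 * (x[1:] - x[:-1] ** 2)
    return g

def hz_beta(g_new, g_old, d):
    """Hager-Zhang beta: ((y - 2*d*||y||^2/(d'y))' g_new) / (d'y)."""
    y = g_new - g_old
    dy = d @ y
    return (y - 2.0 * d * (y @ y) / dy) @ g_new / dy

def hz_cg(x, f, grad, iters=5000, c1=1e-4, tol=1e-6):
    g = grad(x)
    d = -g
    for _ in range(iters):
        if g @ d >= 0.0:                  # safeguard: restart if not a descent direction
            d = -g
        t = 1.0
        for _ in range(60):               # bounded Armijo backtracking
            if f(x + t * d) <= f(x) + c1 * t * (g @ d):
                break
            t *= 0.5
        x_new = x + t * d
        g_new = grad(x_new)
        d = -g_new + hz_beta(g_new, g, d) * d
        x, g = x_new, g_new
        if np.linalg.norm(g) < tol:
            break
    return x

print(hz_cg(np.zeros(10), rosenbrock, rosenbrock_grad))   # should approach the all-ones minimizer
```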

  6. Validating Bayesian truth serum in large-scale online human experiments.

    Science.gov (United States)

    Frank, Morgan R; Cebrian, Manuel; Pickard, Galen; Rahwan, Iyad

    2017-01-01

    Bayesian truth serum (BTS) is an exciting new method for improving honesty and information quality in multiple-choice surveys, but, despite the method's mathematical reliance on large sample sizes, the existing literature about BTS only covers small experiments. Given the prevalence of online survey platforms, such as Amazon's Mechanical Turk, which facilitate surveys with hundreds or thousands of participants, BTS must be shown to be effective in large-scale experiments before it can become a readily accepted tool in real-world applications. We demonstrate that BTS quantifiably improves honesty in large-scale online surveys where the "honest" distribution of answers is known in expectation on aggregate. Furthermore, we explore a marketing application where "honest" answers cannot be known, but find that BTS treatment impacts the resulting distributions of answers.
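
    A compact version of the BTS score (following Prelec's published formula; the variable names and the epsilon regularization are assumptions) is sketched below: each respondent earns an information score for giving a "surprisingly common" answer plus a prediction score for forecasting the crowd well.

```python
import numpy as np

def bts_scores(answers, predictions, alpha=1.0, eps=1e-9):
    """answers: (n,) ints in [0, K); predictions: (n, K) rows of predicted answer frequencies."""
    answers = np.asarray(answers)
    predictions = np.clip(np.asarray(predictions, dtype=float), eps, None)
    n, K = predictions.shape
    xbar = np.bincount(answers, minlength=K) / n + eps   # empirical answer frequencies
    ybar = np.exp(np.log(predictions).mean(axis=0))      # geometric mean of predicted frequencies
    info = np.log(xbar[answers] / ybar[answers])         # "surprisingly common" information score
    pred = alpha * (xbar * np.log(predictions / xbar)).sum(axis=1)  # prediction score
    return info + pred

# Tiny example: three respondents answering a binary question
print(bts_scores(answers=[0, 0, 1],
                 predictions=[[0.6, 0.4], [0.7, 0.3], [0.5, 0.5]]))
```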

  7. Large scale modulation of high frequency acoustic waves in periodic porous media.

    Science.gov (United States)

    Boutin, Claude; Rallu, Antoine; Hans, Stephane

    2012-12-01

    This paper deals with the description of the modulation at large scale of high frequency acoustic waves in gas saturated periodic porous media. High frequencies mean local dynamics at the pore scale and therefore absence of scale separation in the usual sense of homogenization. However, although the pressure is spatially varying in the pores (according to periodic eigenmodes), the mode amplitude can present a large scale modulation, thereby introducing another type of scale separation to which the asymptotic multi-scale procedure applies. The approach is first presented on a periodic network of inter-connected Helmholtz resonators. The equations governing the modulations carried by periodic eigenmodes, at frequencies close to their eigenfrequency, are derived. The number of cells on which the carrying periodic mode is defined is therefore a parameter of the modeling. In a second part, the asymptotic approach is developed for periodic porous media saturated by a perfect gas. Using the "multicells" periodic condition, one obtains the family of equations governing the amplitude modulation at large scale of high frequency waves. The significant difference between modulations of simple and multiple mode are evidenced and discussed. The features of the modulation (anisotropy, width of frequency band) are also analyzed.
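
    For orientation, the eigenfrequency around which such modulations are studied is, for a single Helmholtz resonator, the textbook expression (background material, not derived in the record):

```latex
% Single Helmholtz resonator: cavity volume V, neck cross-section S, effective neck length L,
% sound speed c in the saturating gas.
\omega_0 = c\,\sqrt{\frac{S}{L\,V}}
```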

  8. On the renormalization of the effective field theory of large scale structures

    International Nuclear Information System (INIS)

    Pajer, Enrico; Zaldarriaga, Matias

    2013-01-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory of Large Scale Structures successfully addresses all three issues. Here we focus on the third one and show explicitly that the terms induced by integrating out short scales, neglected in SPT, have exactly the right scale dependence to cancel all UV-divergences at one loop, and this should hold at all loops. A particularly clear example is an Einstein deSitter universe with no-scale initial conditions P_in ∼ k^n. After renormalizing the theory, we use self-similarity to derive a very simple result for the final power spectrum for any n, excluding two-loop corrections and higher. We show how the relative importance of different corrections depends on n. For n ∼ −1.5, relevant for our universe, pressure and dissipative corrections are more important than the two-loop corrections.

  9. On the renormalization of the effective field theory of large scale structures

    Energy Technology Data Exchange (ETDEWEB)

    Pajer, Enrico [Department of Physics, Princeton University, Princeton, NJ 08544 (United States); Zaldarriaga, Matias, E-mail: enrico.pajer@gmail.com, E-mail: matiasz@ias.edu [Institute for Advanced Study, Princeton, NJ 08544 (United States)

    2013-08-01

    Standard perturbation theory (SPT) for large-scale matter inhomogeneities is unsatisfactory for at least three reasons: there is no clear expansion parameter since the density contrast is not small on all scales; it does not fully account for deviations at large scales from a perfect pressureless fluid induced by short-scale non-linearities; for generic initial conditions, loop corrections are UV-divergent, making predictions cutoff dependent and hence unphysical. The Effective Field Theory of Large Scale Structures successfully addresses all three issues. Here we focus on the third one and show explicitly that the terms induced by integrating out short scales, neglected in SPT, have exactly the right scale dependence to cancel all UV-divergences at one loop, and this should hold at all loops. A particularly clear example is an Einstein deSitter universe with no-scale initial conditions P{sub in} ∼ k{sup n}. After renormalizing the theory, we use self-similarity to derive a very simple result for the final power spectrum for any n, excluding two-loop corrections and higher. We show how the relative importance of different corrections depends on n. For n ∼ −1.5, relevant for our universe, pressure and dissipative corrections are more important than the two-loop corrections.
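
    The self-similarity argument invoked above can be stated compactly (a standard result for scale-free Einstein-de Sitter cosmologies, quoted here as background rather than from the record):

```latex
% Scale-free initial conditions P_in(k) ∝ k^n in an Einstein-de Sitter universe evolve
% self-similarly: the dimensionless power spectrum depends only on k / k_NL(a),
\Delta^2(k,a) \equiv \frac{k^3 P(k,a)}{2\pi^2} = F\!\left(\frac{k}{k_{\rm NL}(a)}\right),
\qquad
k_{\rm NL}(a) \propto a^{-2/(n+3)}
```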

  10. Stability and Control of Large-Scale Dynamical Systems A Vector Dissipative Systems Approach

    CERN Document Server

    Haddad, Wassim M

    2011-01-01

    Modern complex large-scale dynamical systems exist in virtually every aspect of science and engineering, and are associated with a wide variety of physical, technological, environmental, and social phenomena, including aerospace, power, communications, and network systems, to name just a few. This book develops a general stability analysis and control design framework for nonlinear large-scale interconnected dynamical systems, and presents the most complete treatment on vector Lyapunov function methods, vector dissipativity theory, and decentralized control architectures. Large-scale dynami

  11. A Combined Eulerian-Lagrangian Data Representation for Large-Scale Applications.

    Science.gov (United States)

    Sauer, Franz; Xie, Jinrong; Ma, Kwan-Liu

    2017-10-01

    The Eulerian and Lagrangian reference frames each provide a unique perspective when studying and visualizing results from scientific systems. As a result, many large-scale simulations produce data in both formats, and analysis tasks that simultaneously utilize information from both representations are becoming increasingly popular. However, due to their fundamentally different nature, drawing correlations between these data formats is a computationally difficult task, especially in a large-scale setting. In this work, we present a new data representation which combines both reference frames into a joint Eulerian-Lagrangian format. By reorganizing Lagrangian information according to the Eulerian simulation grid into a "unit cell" based approach, we can provide an efficient out-of-core means of sampling, querying, and operating with both representations simultaneously. We also extend this design to generate multi-resolution subsets of the full data to suit the viewer's needs and provide a fast flow-aware trajectory construction scheme. We demonstrate the effectiveness of our method using three large-scale real world scientific datasets and provide insight into the types of performance gains that can be achieved.
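
    The unit-cell organisation described above can be prototyped in a few lines (names, grid size and the in-memory dictionary are assumptions; the paper's out-of-core layout is not reproduced):

```python
import numpy as np

def build_unit_cells(positions, grid_shape, box_size):
    """Map each Lagrangian particle to a flat Eulerian cell index; return cell -> particle ids."""
    cell_size = np.asarray(box_size) / np.asarray(grid_shape)
    ijk = np.floor(positions / cell_size).astype(int) % np.asarray(grid_shape)
    flat = np.ravel_multi_index(ijk.T, grid_shape)
    order = np.argsort(flat)
    cells, starts = np.unique(flat[order], return_index=True)
    ends = np.append(starts[1:], flat.size)
    return {c: order[s:e] for c, s, e in zip(cells, starts, ends)}

rng = np.random.default_rng(0)
pos = rng.uniform(0.0, 1.0, size=(10_000, 3))            # Lagrangian particle positions
cells = build_unit_cells(pos, grid_shape=(8, 8, 8), box_size=(1.0, 1.0, 1.0))
print(len(cells), "occupied cells;", len(cells[0]), "particles in cell 0")
```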

  12. Use of large-scale acoustic monitoring to assess anthropogenic pressures on Orthoptera communities.

    Science.gov (United States)

    Penone, Caterina; Le Viol, Isabelle; Pellissier, Vincent; Julien, Jean-François; Bas, Yves; Kerbiriou, Christian

    2013-10-01

    Biodiversity monitoring at large spatial and temporal scales is greatly needed in the context of global changes. Although insects are a species-rich group and are important for ecosystem functioning, they have been largely neglected in conservation studies and policies, mainly due to technical and methodological constraints. Sound detection, a nondestructive method, is easily applied within a citizen-science framework and could be an interesting solution for insect monitoring. However, it has not yet been tested at a large scale. We assessed the value of a citizen-science program in which Orthoptera species (Tettigoniidae) were monitored acoustically along roads. We used Bayesian model-averaging analyses to test whether we could detect widely known patterns of anthropogenic effects on insects, such as the negative effects of urbanization or intensive agriculture on Orthoptera populations and communities. We also examined site-abundance correlations between years and estimated the biases in species detection to evaluate and improve the protocol. Urbanization and intensive agricultural landscapes negatively affected Orthoptera species richness, diversity, and abundance. This finding is consistent with results of previous studies of Orthoptera, vertebrates, carabids, and butterflies. The average mass of communities decreased as urbanization increased. The dispersal ability of communities increased as the percentage of agricultural land and, to a lesser extent, urban area increased. Despite changes in abundances over time, we found significant correlations between yearly abundances. We identified biases linked to the protocol (e.g., car speed or temperature) that can easily be accounted for in analyses. We argue that acoustic monitoring of Orthoptera along roads offers several advantages for assessing Orthoptera biodiversity at large spatial and temporal extents, particularly in a citizen science framework. © 2013 Society for Conservation Biology.

  13. Cosmological streaming velocities and large-scale density maxima

    International Nuclear Information System (INIS)

    Peacock, J.A.; Lumsden, S.L.; Heavens, A.F.

    1987-01-01

    The statistical testing of models for galaxy formation against the observed peculiar velocities on 10-100 Mpc scales is considered. If it is assumed that observers are likely to be sited near maxima in the primordial field of density perturbations, then the observed filtered velocity field will be biased to low values by comparison with a point selected at random. This helps to explain how the peculiar velocities (relative to the microwave background) of the local supercluster and the Rubin-Ford shell can be so similar in magnitude. Using this assumption to predict peculiar velocities on two scales, we test models with large-scale damping (i.e. adiabatic perturbations). Allowed models have a damping length close to the Rubin-Ford scale and are mildly non-linear. Both purely baryonic universes and universes dominated by massive neutrinos can account for the observed velocities, provided 0.1 ≤ Ω ≤ 1. (author)
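
    The filtered velocity field referred to above is usually summarized, in linear theory, by the rms bulk flow within a window of radius R (a standard expression given for orientation, not quoted from the record; W is the window function and f the linear growth rate):

```latex
% rms peculiar (bulk) velocity smoothed on scale R in linear theory:
\sigma_v^2(R) = \frac{H_0^2 f^2}{2\pi^2} \int_0^{\infty} P(k)\, W^2(kR)\, \mathrm{d}k
```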

  14. Selective vulnerability related to aging in large-scale resting brain networks.

    Science.gov (United States)

    Zhang, Hong-Ying; Chen, Wen-Xin; Jiao, Yun; Xu, Yao; Zhang, Xiang-Rong; Wu, Jing-Tao

    2014-01-01

    Normal aging is associated with cognitive decline. Evidence indicates that large-scale brain networks are affected by aging; however, it has not been established whether aging has equivalent effects on specific large-scale networks. In the present study, 40 healthy subjects including 22 older (aged 60-80 years) and 18 younger (aged 22-33 years) adults underwent resting-state functional MRI scanning. Four canonical resting-state networks, including the default mode network (DMN), executive control network (ECN), dorsal attention network (DAN) and salience network, were extracted, and the functional connectivities in these canonical networks were compared between the younger and older groups. We found distinct, disruptive alterations present in the large-scale aging-related resting brain networks: the ECN was affected the most, followed by the DAN. However, the DMN and salience networks showed limited functional connectivity disruption. The visual network served as a control and was similarly preserved in both groups. Our findings suggest that the aged brain is characterized by selective vulnerability in large-scale brain networks. These results could help improve our understanding of the mechanism of degeneration in the aging brain. Additional work is warranted to determine whether selective alterations in the intrinsic networks are related to impairments in behavioral performance.

  15. Large Vessel Occlusion Scales Increase Delivery to Endovascular Centers Without Excessive Harm From Misclassifications.

    Science.gov (United States)

    Zhao, Henry; Coote, Skye; Pesavento, Lauren; Churilov, Leonid; Dewey, Helen M; Davis, Stephen M; Campbell, Bruce C V

    2017-03-01

    Clinical large vessel occlusion (LVO) triage scales were developed to identify and bypass LVO to endovascular centers. However, there are concerns that scale misclassification of patients may cause excessive harm. We studied the settings where misclassifications were likely to occur and the consequences of these misclassifications in a representative stroke population. Prospective data were collected from consecutive ambulance-initiated stroke alerts at 2 stroke centers, with patients stratified into typical (LVO with predefined severe syndrome and non-LVO without) or atypical presentations (opposite situations). Five scales (Rapid Arterial Occlusion Evaluation [RACE], Los Angeles Motor Scale [LAMS], Field Assessment Stroke Triage for Emergency Destination [FAST-ED], Prehospital Acute Stroke Severity scale [PASS], and Cincinnati Prehospital Stroke Severity Scale [CPSSS]) were derived from the baseline National Institutes of Health Stroke Scale scored by doctors and analyzed for diagnostic performance compared with imaging. Of a total of 565 patients, atypical presentations occurred in 31 LVO (38% of LVO) and 50 non-LVO cases (10%). Most scales correctly identified >95% of typical presentations but <20% of atypical presentations. Misclassification attributable to atypical presentations would have resulted in 4 M1/internal carotid artery occlusions, with National Institutes of Health Stroke Scale score ≥6 (5% of LVO) being missed and 9 non-LVO infarcts (5%) bypassing the nearest thrombolysis center. Atypical presentations accounted for the bulk of scale misclassifications, but the majority of these misclassifications were not detrimental, and use of LVO scales would significantly increase timely delivery to endovascular centers, with only a small proportion of non-LVO infarcts bypassing the nearest thrombolysis center. Our findings, however, would require paramedics to score as accurately as doctors, and this translation is made difficult by weaknesses in current

  16. The relationship between large-scale and convective states in the tropics - Towards an improved representation of convection in large-scale models

    Energy Technology Data Exchange (ETDEWEB)

    Jakob, Christian [Monash Univ., Melbourne, VIC (Australia)

    2015-02-26

    This report summarises an investigation into the relationship of tropical thunderstorms to the atmospheric conditions they are embedded in. The study is based on the use of radar observations at the Atmospheric Radiation Measurement site in Darwin run under the auspices of the DOE Atmospheric Systems Research program. Linking the larger scales of the atmosphere with the smaller scales of thunderstorms is crucial for the development of the representation of thunderstorms in weather and climate models, which is carried out by a process termed parametrisation. Through the analysis of radar and wind profiler observations the project made several fundamental discoveries about tropical storms and quantified the relationship of the occurrence and intensity of these storms to the large-scale atmosphere. We were able to show that the rainfall averaged over an area the size of a typical climate model grid-box is largely controlled by the number of storms in the area, and less so by the storm intensity. This allows us to completely rethink the way we represent such storms in climate models. We also found that storms occur in three distinct categories based on their depth and that the transition between these categories is strongly related to the larger scale dynamical features of the atmosphere more so than its thermodynamic state. Finally, we used our observational findings to test and refine a new approach to cumulus parametrisation which relies on the stochastic modelling of the area covered by different convective cloud types.

  17. Large scale inhomogeneities and the cosmological principle

    International Nuclear Information System (INIS)

    Lukacs, B.; Meszaros, A.

    1984-12-01

    The compatibility of cosmological principles with possible large scale inhomogeneities of the Universe is discussed. It seems that the strongest symmetry principle which is still compatible with reasonable inhomogeneities is a full conformal symmetry in the 3-space defined by the cosmological velocity field, but even in such a case, the standard model is isolated from the inhomogeneous ones when the whole evolution is considered. (author)

  18. Large-scale circulation departures related to wet episodes in northeast Brazil

    Science.gov (United States)

    Sikdar, D. N.; Elsner, J. B.

    1985-01-01

    Large scale circulation features are presented as related to wet spells over northeast Brazil (Nordeste) during the rainy season (March and April) of 1979. The rainy season is divided into dry and wet periods; the FGGE and geostationary satellite data were averaged, and mean and departure fields of basic variables and cloudiness were studied. Analysis of seasonal mean circulation features shows: lowest sea level easterlies beneath upper level westerlies; weak meridional winds; high relative humidity over the Amazon basin and relatively dry conditions over the South Atlantic Ocean. A fluctuation was found in the large scale circulation features on time scales of a few weeks or so over Nordeste and the South Atlantic sector. Even the subtropical High SLPs have large departures during wet episodes, implying a short period oscillation in the Southern Hemisphere Hadley circulation.

  19. The three-point function as a probe of models for large-scale structure

    International Nuclear Information System (INIS)

    Frieman, J.A.; Gaztanaga, E.

    1993-01-01

    The authors analyze the consequences of models of structure formation for higher-order (n-point) galaxy correlation functions in the mildly non-linear regime. Several variations of the standard Ω = 1 cold dark matter model with scale-invariant primordial perturbations have recently been introduced to obtain more power on large scales, R_p ∼ 20 h^-1 Mpc, e.g., low-matter-density (non-zero cosmological constant) models, 'tilted' primordial spectra, and scenarios with a mixture of cold and hot dark matter. They also include models with an effective scale-dependent bias, such as the cooperative galaxy formation scenario of Bower, et al. The authors show that higher-order (n-point) galaxy correlation functions can provide a useful test of such models and can discriminate between models with true large-scale power in the density field and those where the galaxy power arises from scale-dependent bias: a bias with rapid scale-dependence leads to a dramatic decrease of the hierarchical amplitudes Q_J at large scales, r ≳ R_p. Current observational constraints on the three-point amplitudes Q_3 and S_3 can place limits on the bias parameter(s) and appear to disfavor, but not yet rule out, the hypothesis that scale-dependent bias is responsible for the extra power observed on large scales.
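
    For reference, the hierarchical amplitudes discussed above are conventionally defined as follows (standard definitions, not quoted from the record):

```latex
% Reduced three-point amplitude and skewness parameter for the density contrast \delta:
Q_3 = \frac{\zeta_{123}}{\xi_{12}\xi_{23} + \xi_{23}\xi_{31} + \xi_{31}\xi_{12}},
\qquad
S_3 = \frac{\langle \delta^3 \rangle}{\langle \delta^2 \rangle^{2}}
% with \xi_{ij} the two-point and \zeta_{123} the three-point correlation function.
```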

  20. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir

    2018-02-24

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.

  1. Exploiting Data Sparsity for Large-Scale Matrix Computations

    KAUST Repository

    Akbudak, Kadir; Ltaief, Hatem; Mikhalev, Aleksandr; Charara, Ali; Keyes, David E.

    2018-01-01

    Exploiting data sparsity in dense matrices is an algorithmic bridge between architectures that are increasingly memory-austere on a per-core basis and extreme-scale applications. The Hierarchical matrix Computations on Manycore Architectures (HiCMA) library tackles this challenging problem by achieving significant reductions in time to solution and memory footprint, while preserving a specified accuracy requirement of the application. HiCMA provides a high-performance implementation on distributed-memory systems of one of the most widely used matrix factorizations in large-scale scientific applications, i.e., the Cholesky factorization. It employs the tile low-rank data format to compress the dense data-sparse off-diagonal tiles of the matrix. It then decomposes the matrix computations into interdependent tasks and relies on the dynamic runtime system StarPU for asynchronous out-of-order scheduling, while allowing high user-productivity. Performance comparisons and memory footprint on matrix dimensions up to eleven million show a performance gain and memory saving of more than an order of magnitude for both metrics on thousands of cores, against state-of-the-art open-source and vendor optimized numerical libraries. This represents an important milestone in enabling large-scale matrix computations toward solving big data problems in geospatial statistics for climate/weather forecasting applications.
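
    The tile low-rank idea at the heart of HiCMA can be illustrated with a small NumPy sketch (tile size, tolerance and the kernel generating the data-sparse matrix are assumptions; HiCMA's actual distributed, task-based implementation is not reproduced here):

```python
import numpy as np

def compress_tile(tile, tol):
    """Return (U, V) with tile ≈ U @ V.T, keeping singular values above tol * s_max."""
    U, s, Vt = np.linalg.svd(tile, full_matrices=False)
    k = max(1, int(np.sum(s > tol * s[0])))
    return U[:, :k] * s[:k], Vt[:k].T

# Data-sparse test matrix: a smooth kernel evaluated on a 1-D point set,
# so off-diagonal tiles are numerically low rank.
n, nb, tol = 1024, 128, 1e-6
x = np.linspace(0.0, 1.0, n)
A = 1.0 / (1.0 + np.abs(x[:, None] - x[None, :]))

dense_entries, tlr_entries = 0, 0
for i in range(0, n, nb):
    for j in range(0, n, nb):
        tile = A[i:i + nb, j:j + nb]
        dense_entries += tile.size
        if i == j:
            tlr_entries += tile.size                 # diagonal tiles kept dense
        else:
            U, V = compress_tile(tile, tol)
            tlr_entries += U.size + V.size           # compressed off-diagonal tiles
print(f"compression ratio ≈ {dense_entries / tlr_entries:.1f}x")
```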

  2. Divergence of perturbation theory in large scale structures

    Science.gov (United States)

    Pajer, Enrico; van der Woude, Drian

    2018-05-01

    We make progress towards an analytical understanding of the regime of validity of perturbation theory for large scale structures and the nature of some non-perturbative corrections. We restrict ourselves to 1D gravitational collapse, for which exact solutions before shell crossing are known. We review the convergence of perturbation theory for the power spectrum, recently proven by McQuinn and White [1], and extend it to non-Gaussian initial conditions and the bispectrum. In contrast, we prove that perturbation theory diverges for the real space two-point correlation function and for the probability density function (PDF) of the density averaged in cells and all the cumulants derived from it. We attribute these divergences to the statistical averaging intrinsic to cosmological observables, which, even on very large and "perturbative" scales, gives non-vanishing weight to all extreme fluctuations. Finally, we discuss some general properties of non-perturbative effects in real space and Fourier space.
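
    The exact 1D solution before shell crossing that the analysis relies on is the Zel'dovich map (a standard result, stated here for orientation):

```latex
% In 1D the Zel'dovich displacement is exact up to shell crossing:
x(q,\tau) = q + D(\tau)\,\psi(q),
\qquad
1 + \delta(x,\tau) = \Bigl[\,1 + D(\tau)\,\partial_q \psi(q)\,\Bigr]^{-1}
% q: Lagrangian coordinate, D: linear growth factor, \psi: initial displacement field;
% shell crossing occurs where the bracket vanishes.
```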

  3. Dose monitoring in large-scale flowing aqueous media

    International Nuclear Information System (INIS)

    Kuruca, C.N.

    1995-01-01

    The Miami Electron Beam Research Facility (EBRF) has been in operation for six years. The EBRF houses a 1.5 MV, 75 KW DC scanned electron beam. Experiments have been conducted to evaluate the effectiveness of high-energy electron irradiation in the removal of toxic organic chemicals from contaminated water and the disinfection of various wastewater streams. The large-scale plant operates at approximately 450 L/min (120 gal/min). The radiation dose absorbed by the flowing aqueous streams is estimated by measuring the difference in water temperature before and after it passes in front of the beam. Temperature measurements are made using resistance temperature devices (RTDs) and recorded by computer along with other operating parameters. Estimated dose is obtained from the measured temperature differences using the specific heat of water. This presentation will discuss experience with this measurement system, its application to different water presentation devices, sources of error, and the advantages and disadvantages of its use in large-scale process applications
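
    The calorimetric dose estimate described above follows directly from the specific heat of water (standard values; the assumption is negligible heat loss between the upstream and downstream temperature measurements):

```latex
% Absorbed dose from the temperature rise of the flowing water:
D\,[\mathrm{Gy}] = c_p\,\Delta T \approx 4186\ \mathrm{J\,kg^{-1}\,K^{-1}} \times \Delta T\,[\mathrm{K}]
% so a measured rise of 1 K corresponds to roughly 4.2 kGy.
```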

  4. Using landscape ecology to test hypotheses about large-scale abundance patterns in migratory birds

    Science.gov (United States)

    Flather, C.H.; Sauer, J.R.

    1996-01-01

    The hypothesis that Neotropical migrant birds may be undergoing widespread declines due to land use activities on the breeding grounds has been examined primarily by synthesizing results from local studies. Growing concern for the cumulative influence of land use activities on ecological systems has heightened the need for large-scale studies to complement what has been observed at local scales. We investigated possible landscape effects on Neotropical migrant bird populations for the eastern United States by linking two large-scale inventories designed to monitor breeding-bird abundances and land use patterns. The null hypothesis of no relation between landscape structure and Neotropical migrant abundance was tested by correlating measures of landscape structure with bird abundance, while controlling for the geographic distance among samples. Neotropical migrants as a group were more 'sensitive' to landscape structure than either temperate migrants or permanent residents. Neotropical migrants tended to be more abundant in landscapes with a greater proportion of forest and wetland habitats, fewer edge habitats, large forest patches, and with forest habitats well dispersed throughout the scene. Permanent residents showed few correlations with landscape structure and temperate migrants were associated with habitat diversity and edge attributes rather than with the amount, size, and dispersion of forest habitats. The association between Neotropical migrant abundance and forest fragmentation differed among physiographic strata, suggesting that land-scape context affects observed relations between bird abundance and landscape structure. Finally, associations between landscape structure and temporal trends in Neotropical migrant abundance were negatively correlated with forest habitats. These results suggest that extrapolation of patterns observed in some landscapes is not likely to hold regionally, and that conservation policies must consider the variation in landscape

  5. Disinformative data in large-scale hydrological modelling

    Directory of Open Access Journals (Sweden)

    A. Kauffeldt

    2013-07-01

    Full Text Available Large-scale hydrological modelling has become an important tool for the study of global and regional water resources, climate impacts, and water-resources management. However, modelling efforts over large spatial domains are fraught with problems of data scarcity, uncertainties and inconsistencies between model forcing and evaluation data. Model-independent methods to screen and analyse data for such problems are needed. This study aimed at identifying data inconsistencies in global datasets using a pre-modelling analysis, inconsistencies that can be disinformative for subsequent modelling. The consistency between (i) basin areas for different hydrographic datasets, and (ii) between climate data (precipitation and potential evaporation) and discharge data, was examined in terms of how well basin areas were represented in the flow networks and the possibility of water-balance closure. It was found that (i) most basins could be well represented in both gridded basin delineations and polygon-based ones, but some basins exhibited large area discrepancies between flow-network datasets and archived basin areas, (ii) basins exhibiting too-high runoff coefficients were abundant in areas where precipitation data were likely affected by snow undercatch, and (iii) the occurrence of basins exhibiting losses exceeding the potential-evaporation limit was strongly dependent on the potential-evaporation data, both in terms of numbers and geographical distribution. Some inconsistencies may be resolved by considering sub-grid variability in climate data, surface-dependent potential-evaporation estimates, etc., but further studies are needed to determine the reasons for the inconsistencies found. Our results emphasise the need for pre-modelling data analysis to identify dataset inconsistencies as an important first step in any large-scale study. Applying data-screening methods before modelling should also increase our chances to draw robust conclusions from subsequent
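
    A minimal sketch of the kind of pre-modelling water-balance screening described above (the thresholds, variable names and flag wording are illustrative assumptions, not the study's actual criteria):

        # Sketch: flag physically inconsistent basins from long-term water-balance terms.
        # P, Q, PET are mean annual precipitation, discharge and potential evaporation (mm/yr).
        def screen_basin(P, Q, PET):
            flags = []
            if P <= 0:
                return ["non-positive precipitation"]
            if Q / P > 1.0:
                flags.append("runoff coefficient > 1 (possible precipitation undercatch)")
            if (P - Q) > PET:
                flags.append("losses exceed the potential-evaporation limit")
            return flags

        print(screen_basin(P=600.0, Q=700.0, PET=900.0))   # runoff coefficient > 1
        print(screen_basin(P=1200.0, Q=100.0, PET=800.0))  # losses exceed PET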

  6. A comparison of working in small-scale and large-scale nursing homes: A systematic review of quantitative and qualitative evidence.

    Science.gov (United States)

    Vermeerbergen, Lander; Van Hootegem, Geert; Benders, Jos

    2017-02-01

    Ongoing shortages of care workers, together with an ageing population, make it of utmost importance to increase the quality of working life in nursing homes. Since the 1970s, normalised and small-scale nursing homes have been increasingly introduced to provide care in a family and homelike environment, potentially providing a richer work life for care workers as well as improved living conditions for residents. 'Normalised' refers to the opportunities given to residents to live in a manner as close as possible to the everyday life of persons not needing care. The study purpose is to provide a synthesis and overview of empirical research comparing the quality of working life - together with related work and health outcomes - of professional care workers in normalised small-scale nursing homes as compared to conventional large-scale ones. A systematic review of qualitative and quantitative studies. A systematic literature search (April 2015) was performed using the electronic databases Pubmed, Embase, PsycInfo, CINAHL and Web of Science. References and citations were tracked to identify additional, relevant studies. We identified 825 studies in the selected databases. After checking the inclusion and exclusion criteria, nine studies were selected for review. Two additional studies were selected after reference and citation tracking. Three studies were excluded after requesting more information on the research setting. The findings from the individual studies suggest that levels of job control and job demands (all but "time pressure") are higher in normalised small-scale homes than in conventional large-scale nursing homes. Additionally, some studies suggested that social support and work motivation are higher, while risks of burnout and mental strain are lower, in normalised small-scale nursing homes. Other studies found no differences or even opposing findings. The studies reviewed showed that these inconclusive findings can be attributed to care workers in some

  7. Results of Large-Scale Spacecraft Flammability Tests

    Science.gov (United States)

    Ferkul, Paul; Olson, Sandra; Urban, David L.; Ruff, Gary A.; Easton, John; T'ien, James S.; Liao, Ta-Ting T.; Fernandez-Pello, A. Carlos; Torero, Jose L.; Eigenbrand, Christian

    2017-01-01

    For the first time, a large-scale fire was intentionally set inside a spacecraft while in orbit. Testing in low gravity aboard spacecraft had been limited to samples of modest size: for thin fuels the longest samples burned were around 15 cm in length and thick fuel samples have been even smaller. This is despite the fact that fire is a catastrophic hazard for spaceflight and the spread and growth of a fire, combined with its interactions with the vehicle, cannot be expected to scale linearly. While every type of occupied structure on earth has been the subject of full scale fire testing, this had never been attempted in space owing to the complexity, cost, risk and absence of a safe location. Thus, there is a gap in knowledge of fire behavior in spacecraft. The recent utilization of large, unmanned, resupply craft has provided the needed capability: a habitable but unoccupied spacecraft in low earth orbit. One such vehicle was used to study the flame spread over a 94 x 40.6 cm thin charring solid (fiberglass-cotton fabric). The sample was an order of magnitude larger than anything studied to date in microgravity and was of sufficient scale that it consumed 1.5% of the available oxygen. The experiment, which is called Saffire, consisted of two tests, forward or concurrent flame spread (with the direction of flow) and opposed flame spread (against the direction of flow). The average forced air speed was 20 cm/s. For the concurrent flame spread test, the flame size remained constrained after the ignition transient, which is not the case in 1-g. These results were qualitatively different from those on earth where an upward-spreading flame on a sample of this size accelerates and grows. In addition, a curious effect of the chamber size is noted. Compared to previous microgravity work in smaller tunnels, the flame in the larger tunnel spread more slowly, even for a wider sample. This is attributed to the effect of flow acceleration in the smaller tunnels as a result of hot

  8. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    Science.gov (United States)

    Dednam, W.; Botha, A. E.

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique, called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method, employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions, without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution
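
    For reference, the particle-number-fluctuation route to the Kirkwood-Buff integrals mentioned above can be written in a generic form (V is an open sub-volume and N_i the number of particles of species i observed inside it):

        \[
        G_{ij}(V) \;=\; V\left[\frac{\langle N_i N_j\rangle - \langle N_i\rangle\langle N_j\rangle}{\langle N_i\rangle\langle N_j\rangle} \;-\; \frac{\delta_{ij}}{\langle N_i\rangle}\right],
        \]

    which the finite-size-scaling approach extrapolates to the thermodynamic limit, in place of the traditional running integral G_ij = 4π ∫ (g_ij(r) − 1) r² dr over the radial distribution function.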

  9. Evaluation of Kirkwood-Buff integrals via finite size scaling: a large scale molecular dynamics study

    International Nuclear Information System (INIS)

    Dednam, W; Botha, A E

    2015-01-01

    Solvation of bio-molecules in water is severely affected by the presence of co-solvent within the hydration shell of the solute structure. Furthermore, since solute molecules can range from small molecules, such as methane, to very large protein structures, it is imperative to understand the detailed structure-function relationship on the microscopic level. For example, it is useful to know the conformational transitions that occur in protein structures. Although such an understanding can be obtained through large-scale molecular dynamics simulations, it is often the case that such simulations would require excessively large simulation times. In this context, Kirkwood-Buff theory, which connects the microscopic pair-wise molecular distributions to global thermodynamic properties, together with the recently developed technique, called finite size scaling, may provide a better method to reduce system sizes, and hence also the computational times. In this paper, we present molecular dynamics trial simulations of biologically relevant low-concentration solvents, solvated by aqueous co-solvent solutions. In particular we compare two different methods of calculating the relevant Kirkwood-Buff integrals. The first (traditional) method computes running integrals over the radial distribution functions, which must be obtained from large system-size NVT or NpT simulations. The second, newer method, employs finite size scaling to obtain the Kirkwood-Buff integrals directly by counting the particle number fluctuations in small, open sub-volumes embedded within a larger reservoir that can be well approximated by a much smaller simulation cell. In agreement with previous studies, which made a similar comparison for aqueous co-solvent solutions, without the additional solvent, we conclude that the finite size scaling method is also applicable to the present case, since it can produce computationally more efficient results which are equivalent to the more costly radial distribution

  10. Large Scale Landform Mapping Using Lidar DEM

    Directory of Open Access Journals (Sweden)

    Türkay Gökgöz

    2015-08-01

    Full Text Available In this study, LIDAR DEM data was used to obtain a primary landform map in accordance with a well-known methodology. This primary landform map was generalized using the Focal Statistics tool (Majority), considering the minimum area condition in cartographic generalization in order to obtain landform maps at 1:1000 and 1:5000 scales. Both the primary and the generalized landform maps were verified visually with hillshaded DEM and an orthophoto. As a result, these maps provide satisfactory visuals of the landforms. In order to show the effect of generalization, the area of each landform in both the primary and the generalized maps was computed. Consequently, landform maps at large scales could be obtained with the proposed methodology, including generalization using LIDAR DEM.
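
    A minimal sketch of a focal majority (Focal Statistics) generalization step of the kind described above (a toy example; the window size and class codes are illustrative, not the study's settings):

        # Sketch: generalize a categorical landform raster with a focal majority filter.
        import numpy as np
        from scipy import ndimage

        def focal_majority(landform, size=3):
            """Replace each cell by the most frequent class in its size x size window."""
            def majority(window):
                return np.bincount(window.astype(int)).argmax()
            return ndimage.generic_filter(landform, majority, size=size, mode="nearest")

        classes = np.random.randint(0, 4, size=(20, 20))   # toy landform classes 0..3
        print(focal_majority(classes, size=3))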

  11. Monitoring and Information Fusion for Search and Rescue Operations in Large-Scale Disasters

    National Research Council Canada - National Science Library

    Nardi, Daniele

    2002-01-01

    ... for information fusion with application to search-and-rescue and large scale disaster relief. The objective is to develop and to deploy tools to support the monitoring activities in an intervention caused by a large-scale disaster...

  12. Advances in a framework to compare bio-dosimetry methods for triage in large-scale radiation events

    International Nuclear Information System (INIS)

    Flood, Ann Barry; Boyle, Holly K.; Du, Gaixin; Demidenko, Eugene; Williams, Benjamin B.; Swartz, Harold M.; Nicolalde, Roberto J.

    2014-01-01

    Planning and preparation for a large-scale nuclear event would be advanced by assessing the applicability of potentially available bio-dosimetry methods. Using an updated comparative framework the performance of six bio-dosimetry methods was compared for five different population sizes (100-1,000,000) and two rates for initiating processing of the marker (15 or 15,000 people per hour) with four additional time windows. These updated factors are extrinsic to the bio-dosimetry methods themselves but have direct effects on each method's ability to begin processing individuals and the size of the population that can be accommodated. The results indicate that increased population size, along with severely compromised infrastructure, increases the time needed to triage, which decreases the usefulness of many time-intensive dosimetry methods. This framework and model for evaluating bio-dosimetry provides important information for policy-makers and response planners to facilitate evaluation of each method and should advance coordination of these methods into effective triage plans. (authors)
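
    A minimal sketch of the throughput arithmetic that drives such comparisons (the linear model and the parameter values below are assumptions for illustration, not the published framework): the time to triage a population is roughly the delay before marker processing can start plus the population size divided by the processing rate.

        # Sketch: crude time-to-triage estimate for a bio-dosimetry method.
        def time_to_triage(population, rate_per_hour, startup_delay_h=0.0):
            """Hours needed to process everyone, assuming a constant processing rate."""
            return startup_delay_h + population / rate_per_hour

        for pop in (100, 10_000, 1_000_000):
            for rate in (15, 15_000):
                print(pop, rate, round(time_to_triage(pop, rate, startup_delay_h=24.0), 1))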

  13. Building Participation in Large-scale Conservation: Lessons from Belize and Panama

    Directory of Open Access Journals (Sweden)

    Jesse Guite Hastings

    2015-01-01

    Full Text Available Motivated by biogeography and a desire for alignment with the funding priorities of donors, the twenty-first century has seen big international NGOs shifting towards a large-scale conservation approach. This shift has meant that even before stakeholders at the national and local scale are involved, conservation programmes often have their objectives defined and funding allocated. This paper uses the experiences of Conservation International's Marine Management Area Science (MMAS) programme in Belize and Panama to explore how to build participation at the national and local scale while working within the bounds of the current conservation paradigm. Qualitative data about MMAS was gathered through a multi-sited ethnographic research process, utilising document review, direct observation, and semi-structured interviews with 82 informants in Belize, Panama, and the United States of America. Results indicate that while a large-scale approach to conservation disadvantages early national and local stakeholder participation, this effect can be mediated through focusing engagement efforts, paying attention to context, building horizontal and vertical partnerships, and using deliberative processes that promote learning. While explicit consideration of geopolitics and local complexity alongside biogeography in the planning phase of a large-scale conservation programme is ideal, actions taken by programme managers during implementation can still have a substantial impact on conservation outcomes.

  14. Cosmology Large Angular Scale Surveyor (CLASS) Focal Plane Development

    Science.gov (United States)

    Chuss, D. T.; Ali, A.; Amiri, M.; Appel, J.; Bennett, C. L.; Colazo, F.; Denis, K. L.; Dunner, R.; Essinger-Hileman, T.; Eimer, J.

    2015-01-01

    The Cosmology Large Angular Scale Surveyor (CLASS) will measure the polarization of the Cosmic Microwave Background to search for and characterize the polarized signature of inflation. CLASS will operate from the Atacama Desert and observe approximately 70% of the sky. A variable-delay polarization modulator provides modulation of the polarization at approximately 10 Hz to suppress the 1/f noise of the atmosphere and enable the measurement of the large angular scale polarization modes. The measurement of the inflationary signal across angular scales that spans both the recombination and reionization features allows a test of the predicted shape of the polarized angular power spectra in addition to a measurement of the energy scale of inflation. CLASS is an array of telescopes covering frequencies of 38, 93, 148, and 217 GHz. These frequencies straddle the foreground minimum and thus allow the extraction of foregrounds from the primordial signal. Each focal plane contains feedhorn-coupled transition-edge sensors that simultaneously detect two orthogonal linear polarizations. The use of single-crystal silicon as the dielectric for the on-chip transmission lines enables both high efficiency and uniformity in fabrication. Integrated band definition has been implemented that both controls the bandpass of the single-mode transmission on the chip and prevents stray light from coupling to the detectors.

  15. Mapping spatial patterns of denitrifiers at large scales (Invited)

    Science.gov (United States)

    Philippot, L.; Ramette, A.; Saby, N.; Bru, D.; Dequiedt, S.; Ranjard, L.; Jolivet, C.; Arrouays, D.

    2010-12-01

    Little information is available regarding the landscape-scale distribution of microbial communities and its environmental determinants. Here we combined molecular approaches and geostatistical modeling to explore spatial patterns of the denitrifying community at large scales. The distribution of the denitrifying community was investigated over 107 sites in Burgundy, a 31,500 km² region of France, using a 16 × 16 km sampling grid. At each sampling site, the abundances of denitrifiers and 42 soil physico-chemical properties were measured. The relative contributions of land use, spatial distance, climatic conditions, time and soil physico-chemical properties to the denitrifier spatial distribution were analyzed by canonical variation partitioning. Our results indicate that 43% to 85% of the spatial variation in community abundances could be explained by the measured environmental parameters, with soil chemical properties (mostly pH) being the main driver. We found spatial autocorrelation up to 739 km and used geostatistical modelling to generate predictive maps of the distribution of denitrifiers at the landscape scale. Studying the distribution of the denitrifiers at large scale can help close the artificial gap between the investigation of microbial processes and microbial community ecology, therefore facilitating our understanding of the relationships between the ecology of denitrifiers and N-fluxes by denitrification.

  16. ability in Large Scale Land Acquisitions in Kenya

    International Development Research Centre (IDRC) Digital Library (Canada)

    Corey Piccioni

    Kenya's national planning strategy, Vision 2030. Agriculture, natural resource exploitation, and infrastruc- ... sitions due to high levels of poverty and unclear or insecure land tenure rights in Kenya. Inadequate social ... lease to a private company over the expansive Yala Swamp to undertake large-scale irrigation farming.

  17. An industrial perspective on bioreactor scale-down: what we can learn from combined large-scale bioprocess and model fluid studies.

    Science.gov (United States)

    Noorman, Henk

    2011-08-01

    For industrial bioreactor design, operation, control and optimization, the scale-down approach is often advocated to efficiently generate data on a small scale, and effectively apply suggested improvements to the industrial scale. In all cases it is important to ensure that the scale-down conditions are representative of the real large-scale bioprocess. Progress is hampered by limited detailed and local information from large-scale bioprocesses. Complementary to real fermentation studies, physical aspects of model fluids such as air-water in large bioreactors provide useful information with limited effort and cost. Still, in industrial practice, investments of time, capital and resources often prohibit systematic work, although, in the end, savings obtained in this way are trivial compared to the expenses that result from real process disturbances, batch failures, and non-flyers with loss of business opportunity. Here we try to highlight what can be learned from real large-scale bioprocess in combination with model fluid studies, and to provide suitable computation tools to overcome data restrictions. Focus is on a specific well-documented case for a 30-m³ bioreactor. Areas for further research from an industrial perspective are also indicated. Copyright © 2011 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Human visual system automatically represents large-scale sequential regularities.

    Science.gov (United States)

    Kimura, Motohiro; Widmann, Andreas; Schröger, Erich

    2010-03-04

    Our brain recordings reveal that large-scale sequential regularities defined across non-adjacent stimuli can be automatically represented in visual sensory memory. To show that, we adopted an auditory paradigm developed by Sussman, E., Ritter, W., and Vaughan, H. G. Jr. (1998). Predictability of stimulus deviance and the mismatch negativity. NeuroReport, 9, 4167-4170, Sussman, E., and Gumenyuk, V. (2005). Organization of sequential sounds in auditory memory. NeuroReport, 16, 1519-1523 to the visual domain by presenting task-irrelevant infrequent luminance-deviant stimuli (D, 20%) inserted among task-irrelevant frequent stimuli being of standard luminance (S, 80%) in randomized (randomized condition, SSSDSSSSSDSSSSD...) and fixed manners (fixed condition, SSSSDSSSSDSSSSD...). Comparing the visual mismatch negativity (visual MMN), an event-related brain potential (ERP) index of memory-mismatch processes in human visual sensory system, revealed that visual MMN elicited by deviant stimuli was reduced in the fixed compared to the randomized condition. Thus, the large-scale sequential regularity being present in the fixed condition (SSSSD) must have been represented in visual sensory memory. Interestingly, this effect did not occur in conditions with stimulus-onset asynchronies (SOAs) of 480 and 800 ms but was confined to the 160-ms SOA condition supporting the hypothesis that large-scale regularity extraction was based on perceptual grouping of the five successive stimuli defining the regularity. 2010 Elsevier B.V. All rights reserved.

  19. Large-scale climate variation modifies the winter grouping behavior of endangered Indiana bats

    Science.gov (United States)

    Thogmartin, Wayne E.; McKann, Patrick C.

    2014-01-01

    Power laws describe the functional relationship between 2 quantities, such as the frequency of a group as the multiplicative power of group size. We examined whether the annual size of well-surveyed wintering populations of endangered Indiana bats (Myotis sodalis) followed a power law, and then leveraged this relationship to predict whether the aggregation of Indiana bats in winter was influenced by global climate processes. We determined that Indiana bat wintering populations were distributed according to a power law (mean scaling coefficient α = −0.44 [95% confidence interval {95% CI} = −0.61, −0.28]). The antilog of these annual scaling coefficients ranged between 0.67 and 0.81, coincident with the three-fourths power found in many other biological phenomena. We associated temporal patterns in the annual (1983–2011) scaling coefficient with the North Atlantic Oscillation (NAO) index in August (β_NAO,August = −0.017 [90% CI = −0.032, −0.002]), when Indiana bats are deciding when and where to hibernate. After accounting for the strong effect of philopatry to habitual wintering locations, Indiana bats aggregated in larger wintering populations during periods of severe winter and in smaller populations in milder winters. The association with August values of the NAO indicates that bats anticipate future winter weather conditions when deciding where to roost, a heretofore unrecognized role for prehibernation swarming behavior. Future research is needed to understand whether the three-fourths–scaling patterns we observed are related to scaling in metabolism.
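
    A minimal sketch of estimating a scaling coefficient such as the α reported above (illustrative only; a simple log-log least-squares fit on synthetic counts, not the authors' estimation procedure):

        # Sketch: fit frequency ~ size**alpha by linear regression in log-log space.
        import numpy as np

        rng = np.random.default_rng(0)
        sizes = np.array([10, 50, 100, 500, 1000, 5000, 10000], dtype=float)
        true_alpha = -0.44
        freqs = 100.0 * sizes**true_alpha * rng.lognormal(0.0, 0.1, size=sizes.size)

        slope, intercept = np.polyfit(np.log(sizes), np.log(freqs), 1)
        print("estimated scaling coefficient:", round(slope, 2))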

  20. Large-Scale Spray Releases: Additional Aerosol Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Daniel, Richard C.; Gauglitz, Phillip A.; Burns, Carolyn A.; Fountain, Matthew S.; Shimskey, Rick W.; Billing, Justin M.; Bontha, Jagannadha R.; Kurath, Dean E.; Jenks, Jeromy WJ; MacFarlan, Paul J.; Mahoney, Lenna A.

    2013-08-01

    One of the events postulated in the hazard analysis for the Waste Treatment and Immobilization Plant (WTP) and other U.S. Department of Energy (DOE) nuclear facilities is a breach in process piping that produces aerosols with droplet sizes in the respirable range. The current approach for predicting the size and concentration of aerosols produced in a spray leak event involves extrapolating from correlations reported in the literature. These correlations are based on results obtained from small engineered spray nozzles using pure liquids that behave as a Newtonian fluid. The narrow ranges of physical properties on which the correlations are based do not cover the wide range of slurries and viscous materials that will be processed in the WTP and in processing facilities across the DOE complex. To expand the data set upon which the WTP accident and safety analyses were based, an aerosol spray leak testing program was conducted by Pacific Northwest National Laboratory (PNNL). PNNL’s test program addressed two key technical areas to improve the WTP methodology (Larson and Allen 2010). The first technical area was to quantify the role of slurry particles in small breaches where slurry particles may plug the hole and prevent high-pressure sprays. The results from an effort to address this first technical area can be found in Mahoney et al. (2012a). The second technical area was to determine aerosol droplet size distribution and total droplet volume from prototypic breaches and fluids, including sprays from larger breaches and sprays of slurries for which literature data are mostly absent. To address the second technical area, the testing program collected aerosol generation data at two scales, commonly referred to as small-scale and large-scale testing. The small-scale testing and resultant data are described in Mahoney et al. (2012b), and the large-scale testing and resultant data are presented in Schonewill et al. (2012). In tests at both scales, simulants were used

  1. COMBINED EFFECTS OF GALAXY INTERACTIONS AND LARGE-SCALE ENVIRONMENT ON GALAXY PROPERTIES

    International Nuclear Information System (INIS)

    Park, Changbom; Choi, Yun-Young

    2009-01-01

    We inspect the coupled dependence of physical parameters of the Sloan Digital Sky Survey galaxies on the small-scale (distance to and morphology of the nearest neighbor galaxy) and the large-scale (background density smoothed over 20 nearby galaxies) environments. The impacts of interaction on galaxy properties are detected at least out to the neighbor separation corresponding to the virial radius of galaxies, which is typically between 200 and 400 h^-1 kpc for the galaxies in our sample. To detect these long-range interaction effects, it is crucial to divide galaxy interactions into four cases dividing the morphology of target and neighbor galaxies into early and late types. We show that there are two characteristic neighbor-separation scales where the galaxy interactions cause abrupt changes in the properties of galaxies. The first scale is the virial radius of the nearest neighbor galaxy, r_vir,nei. Many physical parameters start to deviate from those of extremely isolated galaxies at the projected neighbor separation r_p of about r_vir,nei. The second scale is at r_p ∼ 0.05 r_vir,nei = 10-20 h^-1 kpc, and is the scale at which the galaxies in pairs start to merge. We find that late-type neighbors enhance the star formation activity of galaxies while early-type neighbors reduce it, and that these effects occur within r_vir,nei. The hot halo gas and cold disk gas must be participating in the interactions at separations less than the virial radius of the galaxy plus dark halo system. Our results also show that the role of the large-scale density in determining galaxy properties is minimal once luminosity and morphology are fixed. We propose that the weak residual dependence of galaxy properties on the large-scale density is due to the dependence of the halo gas property on the large-scale density.

  2. Just enough inflation. Power spectrum modifications at large scales

    International Nuclear Information System (INIS)

    Cicoli, Michele; Downes, Sean

    2014-07-01

    We show that models of 'just enough' inflation, where the slow-roll evolution lasted only 50-60 e-foldings, feature modifications of the CMB power spectrum at large angular scales. We perform a systematic and model-independent analysis of any possible non-slow-roll background evolution prior to the final stage of slow-roll inflation. We find a high degree of universality since most common backgrounds like fast-roll evolution, matter or radiation-dominance give rise to a power loss at large angular scales and a peak together with an oscillatory behaviour at scales around the value of the Hubble parameter at the beginning of slow-roll inflation. Depending on the value of the equation of state parameter, different pre-inflationary epochs lead instead to an enhancement of power at low-l, and so seem disfavoured by recent observational hints for a lack of CMB power at l ≲ 40. We also comment on the importance of initial conditions and the possibility to have multiple pre-inflationary stages.

  3. Large-scale ocean connectivity and planktonic body size

    KAUST Repository

    Villarino, Ernesto; Watson, James R.; Jönsson, Bror; Gasol, Josep M.; Salazar, Guillem; Acinas, Silvia G.; Estrada, Marta; Massana, Ramón; Logares, Ramiro; Giner, Caterina R.; Pernice, Massimo C.; Olivar, M. Pilar; Citores, Leire; Corell, Jon; Rodríguez-Ezpeleta, Naiara; Acuña, José Luis; Molina-Ramírez, Axayacatl; González-Gordillo, J. Ignacio; Cózar, Andrés; Martí, Elisa; Cuesta, José A.; Agusti, Susana; Fraile-Nuez, Eugenio; Duarte, Carlos M.; Irigoien, Xabier; Chust, Guillem

    2018-01-01

    Global patterns of planktonic diversity are mainly determined by the dispersal of propagules with ocean currents. However, the role that abundance and body size play in determining spatial patterns of diversity remains unclear. Here we analyse spatial community structure - β-diversity - for several planktonic and nektonic organisms from prokaryotes to small mesopelagic fishes collected during the Malaspina 2010 Expedition. β-diversity was compared to surface ocean transit times derived from a global circulation model, revealing a significant negative relationship that is stronger than environmental differences. Estimated dispersal scales for different groups show a negative correlation with body size, where less abundant large-bodied communities have significantly shorter dispersal scales and larger species spatial turnover rates than more abundant small-bodied plankton. Our results confirm that the dispersal scale of planktonic and micro-nektonic organisms is determined by local abundance, which scales with body size, ultimately setting global spatial patterns of diversity.

  4. Large-scale ocean connectivity and planktonic body size

    KAUST Repository

    Villarino, Ernesto

    2018-01-04

    Global patterns of planktonic diversity are mainly determined by the dispersal of propagules with ocean currents. However, the role that abundance and body size play in determining spatial patterns of diversity remains unclear. Here we analyse spatial community structure - β-diversity - for several planktonic and nektonic organisms from prokaryotes to small mesopelagic fishes collected during the Malaspina 2010 Expedition. β-diversity was compared to surface ocean transit times derived from a global circulation model, revealing a significant negative relationship that is stronger than environmental differences. Estimated dispersal scales for different groups show a negative correlation with body size, where less abundant large-bodied communities have significantly shorter dispersal scales and larger species spatial turnover rates than more abundant small-bodied plankton. Our results confirm that the dispersal scale of planktonic and micro-nektonic organisms is determined by local abundance, which scales with body size, ultimately setting global spatial patterns of diversity.

  5. Multi-level discriminative dictionary learning with application to large scale image classification.

    Science.gov (United States)

    Shen, Li; Sun, Gang; Huang, Qingming; Wang, Shuhui; Lin, Zhouchen; Wu, Enhua

    2015-10-01

    The sparse coding technique has shown flexibility and capability in image representation and analysis. It is a powerful tool in many visual applications. Some recent work has shown that incorporating the properties of the task (such as discrimination for a classification task) into dictionary learning is effective for improving accuracy. However, traditional supervised dictionary learning methods suffer from high computational complexity when dealing with a large number of categories, making them less satisfactory in large-scale applications. In this paper, we propose a novel multi-level discriminative dictionary learning method and apply it to large-scale image classification. Our method takes advantage of hierarchical category correlation to encode multi-level discriminative information. Each internal node of the category hierarchy is associated with a discriminative dictionary and a classification model. The dictionaries at different layers are learnt to capture information at different scales. Moreover, each node at lower layers also inherits the dictionary of its parent, so that the categories at lower layers can be described with multi-scale information. The learning of dictionaries and associated classification models is jointly conducted by minimizing an overall tree loss. The experimental results on challenging data sets demonstrate that our approach achieves excellent accuracy and competitive computation cost compared with other sparse coding methods for large-scale image classification.
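
    The per-node objective in supervised dictionary learning of this kind can be sketched in a generic form (not the exact tree loss of the paper): for data x with label y, each node learns a dictionary D and a classifier W by minimizing reconstruction error, a sparsity penalty and a classification loss over the sparse codes α,

        \[
        \min_{D,\,W,\,\alpha}\;\; \|x - D\alpha\|_2^2 \;+\; \lambda\,\|\alpha\|_1 \;+\; \gamma\,\ell\!\left(y,\,W\alpha\right),
        \]

    with the node-level terms summed over the category hierarchy to give the overall tree loss.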

  6. NASA: Assessments of Selected Large-Scale Projects

    Science.gov (United States)

    2011-03-01

    Report date: March 2011. Title: Assessments of Selected Large-Scale Projects. ... probes designed to explore the Martian surface, to satellites equipped with advanced sensors to study the Earth, to telescopes intended to explore the

  7. Homogenization techniques for population dynamics in strongly heterogeneous landscapes.

    Science.gov (United States)

    Yurk, Brian P; Cobbold, Christina A

    2018-12-01

    An important problem in spatial ecology is to understand how population-scale patterns emerge from individual-level birth, death, and movement processes. These processes, which depend on local landscape characteristics, vary spatially and may exhibit sharp transitions through behavioural responses to habitat edges, leading to discontinuous population densities. Such systems can be modelled using reaction-diffusion equations with interface conditions that capture local behaviour at patch boundaries. In this work we develop a novel homogenization technique to approximate the large-scale dynamics of the system. We illustrate our approach, which also generalizes to multiple species, with an example of logistic growth within a periodic environment. We find that population persistence and the large-scale population carrying capacity are influenced by patch residence times that depend on patch preference, as well as movement rates in adjacent patches.
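
    A worked sketch of the setting described above, in generic notation rather than the authors' exact formulation: with patch-dependent diffusion D(x), growth r(x) and carrying capacity K(x) varying periodically across the landscape, the fine-scale model and its homogenized large-scale approximation take the form

        \[
        u_t = \nabla\!\cdot\!\big(D(x)\nabla u\big) + r(x)\,u\left(1-\frac{u}{K(x)}\right)
        \quad\longrightarrow\quad
        \bar{u}_t = \bar{D}\,\Delta\bar{u} + \bar{r}\,\bar{u}\left(1-\frac{\bar{u}}{\bar{K}}\right),
        \]

    where the effective (barred) coefficients are suitable averages over one period that also absorb the interface (patch-preference) conditions at habitat edges.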

  8. Presenting an Approach for Conducting Knowledge Architecture within Large-Scale Organizations.

    Science.gov (United States)

    Varaee, Touraj; Habibi, Jafar; Mohaghar, Ali

    2015-01-01

    Knowledge architecture (KA) establishes the basic groundwork for the successful implementation of a short-term or long-term knowledge management (KM) program. An example of KA is the design of a prototype before a new vehicle is manufactured. Due to a transformation to large-scale organizations, the traditional architecture of organizations is undergoing fundamental changes. This paper explores the main strengths and weaknesses in the field of KA within large-scale organizations and provides a suitable methodology and supervising framework to overcome specific limitations. This objective was achieved by applying and updating the concepts from the Zachman information architectural framework and the information architectural methodology of enterprise architecture planning (EAP). The proposed solution may be beneficial for architects in knowledge-related areas to successfully accomplish KM within large-scale organizations. The research method is descriptive; its validity is confirmed by performing a case study and polling the opinions of KA experts.

  9. Evaluating the use of local ecological knowledge to monitor hunted tropical-forest wildlife over large spatial scales

    Directory of Open Access Journals (Sweden)

    Luke Parry

    2015-09-01

    Full Text Available Monitoring the distribution and abundance of hunted wildlife is critical to achieving sustainable resource use, yet adequate data are sparse for most tropical regions. Conventional methods for monitoring hunted forest-vertebrate species require intensive in situ survey effort, which severely constrains spatial and temporal replication. Integrating local ecological knowledge (LEK) into monitoring and management is appealing because it can be cost-effective, enhance community participation, and provide novel insights into sustainable resource use. We develop a technique to monitor population depletion of hunted forest wildlife in the Brazilian Amazon, based on the local ecological knowledge of rural hunters. We performed rapid interview surveys to estimate the landscape-scale depletion of ten large-bodied vertebrate species around 161 Amazonian riverine settlements. We assessed the explanatory and predictive power of settlement and landscape characteristics and were able to develop robust estimates of local faunal depletion. By identifying species-specific drivers of depletion and using secondary data on human population density, land form, and physical accessibility, we then estimated landscape- and regional-scale depletion. White-lipped peccary (Tayassu pecari), for example, were estimated to be absent from 17% of their putative range in Brazil's largest state (Amazonas), despite 98% of the original forest cover remaining intact. We found evidence that bushmeat consumption in small urban centers has far-reaching impacts on some forest species, including severe depletion well over 100 km from urban centers. We conclude that LEK-based approaches require further field validation, but have significant potential for community-based participatory monitoring as well as cost-effective, large-scale monitoring of threatened forest species.

  10. A Grouping Particle Swarm Optimizer with Personal-Best-Position Guidance for Large Scale Optimization.

    Science.gov (United States)

    Guo, Weian; Si, Chengyong; Xue, Yu; Mao, Yanfen; Wang, Lei; Wu, Qidi

    2017-05-04

    Particle Swarm Optimization (PSO) is a popular algorithm that is widely investigated and implemented in many areas. However, the canonical PSO does not maintain population diversity well, which usually leads to premature convergence or local optima. To address this issue, we propose a variant of PSO named Grouping PSO with Personal-Best-Position (Pbest) Guidance (GPSO-PG), which maintains population diversity by preserving the diversity of exemplars. On the one hand, we adopt a uniform random allocation strategy to assign particles to different groups, and in each group the losers learn from the winner. On the other hand, we employ the personal historical best position of each particle in social learning rather than the current global best particle. In this way, exemplar diversity increases and the effect of the global best particle is eliminated. We test the proposed algorithm on the benchmarks in CEC 2008 and CEC 2010, which concern large-scale optimization problems (LSOPs). Compared with several current peer algorithms, GPSO-PG exhibits competitive performance in maintaining population diversity and obtains satisfactory performance on the problems.
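
    A minimal sketch of the group-and-learn update described above (an interpretation under stated assumptions; the group size, inertia weight and exact update rule are guesses for illustration, not the published GPSO-PG algorithm): particles are randomly grouped, the particle with the best personal best in each group acts as the winner, and the other group members move toward their own personal best and the winner's personal best.

        # Sketch: grouping PSO where losers learn from the group winner's personal best.
        import numpy as np

        def sphere(x):                      # toy objective
            return float(np.sum(x**2))

        rng = np.random.default_rng(1)
        n_particles, dim, group_size, iters = 40, 10, 4, 200
        X = rng.uniform(-5, 5, (n_particles, dim))
        V = np.zeros_like(X)
        pbest = X.copy()
        pbest_val = np.array([sphere(x) for x in X])

        for _ in range(iters):
            order = rng.permutation(n_particles)
            for g in range(0, n_particles, group_size):
                group = order[g:g + group_size]
                winner = group[np.argmin(pbest_val[group])]
                for i in group:
                    if i == winner:
                        continue            # the winner keeps its position this round
                    r1, r2 = rng.random(dim), rng.random(dim)
                    V[i] = 0.7 * V[i] + r1 * (pbest[i] - X[i]) + r2 * (pbest[winner] - X[i])
                    X[i] = X[i] + V[i]
                    val = sphere(X[i])
                    if val < pbest_val[i]:
                        pbest[i], pbest_val[i] = X[i].copy(), val

        print("best value found:", pbest_val.min())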

  11. Large-Scale Traveling Weather Systems in Mars’ Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-10-01

    Between late fall and early spring, Mars’ middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone in late winter and early spring presents in the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences amongst the simulations and these are presented.

  12. Large-Scale Traveling Weather Systems in Mars Southern Extratropics

    Science.gov (United States)

    Hollingsworth, Jeffery L.; Kahre, Melinda A.

    2017-01-01

    Between late fall and early spring, Mars' middle- and high-latitude atmosphere supports strong mean equator-to-pole temperature contrasts and an accompanying mean westerly polar vortex. Observations from both the MGS Thermal Emission Spectrometer (TES) and the MRO Mars Climate Sounder (MCS) indicate that a mean baroclinicity-barotropicity supports intense, large-scale eastward traveling weather systems (i.e., transient synoptic-period waves). Such extratropical weather disturbances are critical components of the global circulation as they serve as agents in the transport of heat and momentum, and generalized scalar/tracer quantities (e.g., atmospheric dust, water-vapor and ice clouds). The character of such traveling extratropical synoptic disturbances in Mars' southern hemisphere during late winter through early spring is investigated using a moderately high-resolution Mars global climate model (Mars GCM). This Mars GCM imposes interactively-lifted and radiatively-active dust based on a threshold value of the surface stress. The model exhibits a reasonable "dust cycle" (i.e., globally averaged, a dustier atmosphere during southern spring and summer occurs). Compared to the northern-hemisphere counterparts, the southern synoptic-period weather disturbances and accompanying frontal waves have smaller meridional and zonal scales, and are far less intense. Influences of the zonally asymmetric (i.e., east-west varying) topography on southern large-scale weather are investigated, in addition to large-scale up-slope/down-slope flows and the diurnal cycle. A southern storm zone in late winter and early spring presents in the western hemisphere via orographic influences from the Tharsis highlands, and the Argyre and Hellas impact basins. Geographically localized transient-wave activity diagnostics are constructed that illuminate dynamical differences amongst the simulations and these are presented.

  13. Nonlinear evolution of large-scale structure in the universe

    International Nuclear Information System (INIS)

    Frenk, C.S.; White, S.D.M.; Davis, M.

    1983-01-01

    Using N-body simulations we study the nonlinear development of primordial density perturbations in an Einstein-de Sitter universe. We compare the evolution of an initial distribution without small-scale density fluctuations to evolution from a random Poisson distribution. These initial conditions mimic the assumptions of the adiabatic and isothermal theories of galaxy formation. The large-scale structures which form in the two cases are markedly dissimilar. In particular, the correlation function ξ(r) and the visual appearance of our adiabatic (or "pancake") models match better the observed distribution of galaxies. This distribution is characterized by large-scale filamentary structure. Because the pancake models do not evolve in a self-similar fashion, the slope of ξ(r) steepens with time; as a result there is a unique epoch at which these models fit the galaxy observations. We find the ratio of cutoff length to correlation length at this time to be λ_min/r_0 = 5.1; its expected value in a neutrino-dominated universe is 4(Ωh)^-1 (H_0 = 100 h km s^-1 Mpc^-1). At early epochs these models predict a negligible amplitude for ξ(r) and could explain the lack of measurable clustering in the Lyα absorption lines of high-redshift quasars. However, large-scale structure in our models collapses after z = 2. If this collapse precedes galaxy formation as in the usual pancake theory, galaxies formed uncomfortably recently. The extent of this problem may depend on the cosmological model used; the present series of experiments should be extended in the future to include models with Ω < 1.
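
    For reference, the two-point correlation function discussed here is the ensemble average of the density contrast at two points separated by r,

        \[
        \xi(r) \;=\; \big\langle\, \delta(\mathbf{x})\,\delta(\mathbf{x}+\mathbf{r}) \,\big\rangle,
        \]

    i.e. the excess probability, relative to a random distribution, of finding a pair of galaxies at separation r.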

  14. TOPOLOGY OF A LARGE-SCALE STRUCTURE AS A TEST OF MODIFIED GRAVITY

    International Nuclear Information System (INIS)

    Wang Xin; Chen Xuelei; Park, Changbom

    2012-01-01

    The genus of the isodensity contours is a robust measure of the topology of a large-scale structure, and it is relatively insensitive to nonlinear gravitational evolution, galaxy bias, and redshift-space distortion. We show that the growth of density fluctuations is scale dependent even in the linear regime in some modified gravity theories, which opens a new possibility of testing the theories observationally. We propose to use the genus of the isodensity contours, an intrinsic measure of the topology of the large-scale structure, as a statistic to be used in such tests. In Einstein's general theory of relativity, density fluctuations grow at the same rate on all scales in the linear regime, and the genus per comoving volume is almost conserved as structures grow homologously, so we expect that the genus-smoothing-scale relation is basically time independent. However, in some modified gravity models where structures grow with different rates on different scales, the genus-smoothing-scale relation should change over time. This can be used to test the gravity models with large-scale structure observations. We study the cases of the f(R) theory, DGP braneworld theory as well as the parameterized post-Friedmann models. We also forecast how the modified gravity models can be constrained with optical/IR or redshifted 21 cm radio surveys in the near future.
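
    For context, the genus curve whose near-invariance underlies this test has, for a Gaussian random field smoothed on a given scale, the well-known form (ν is the density threshold in units of the standard deviation and A is an amplitude set by the shape of the power spectrum):

        \[
        g(\nu) \;=\; A\,\big(1-\nu^{2}\big)\,e^{-\nu^{2}/2}.
        \]

    Scale-independent growth, as in general relativity, leaves A at each smoothing scale essentially unchanged in the linear regime, whereas scale-dependent growth alters the shape of the power spectrum and hence the genus-smoothing-scale relation over time.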

  15. Computing the universe: how large-scale simulations illuminate galaxies and dark energy

    Science.gov (United States)

    O'Shea, Brian

    2015-04-01

    High-performance and large-scale computing is absolutely essential to understanding astronomical objects such as stars, galaxies, and the cosmic web. This is because these are structures that operate on physical, temporal, and energy scales that cannot be reasonably approximated in the laboratory, and whose complexity and nonlinearity often defy analytic modeling. In this talk, I show how the growth of computing platforms over time has facilitated our understanding of astrophysical and cosmological phenomena, focusing primarily on galaxies and large-scale structure in the Universe.

  16. Prospects for investment in large-scale, grid-connected solar power in Africa

    DEFF Research Database (Denmark)

    Hansen, Ulrich Elmer; Nygaard, Ivan; Pedersen, Mathilde Brix

    ... since the 1990s have changed the competitiveness of solar PV in all markets, ranging from individual households via institutions to mini-grids and grid-connected installations. In volume and investment, the market for large-scale grid-connected solar power plants is by far the most important ... -scale investments in grid-connected solar power plants and local assembly facilities for PV panels have exceeded even optimistic scenarios. Finally, therefore, there seem to be bright prospects for investment in large-scale grid-connected solar power in Africa.

  17. Argentine Population Genetic Structure: Large Variance in Amerindian Contribution

    Science.gov (United States)

    Seldin, Michael F.; Tian, Chao; Shigeta, Russell; Scherbarth, Hugo R.; Silva, Gabriel; Belmont, John W.; Kittles, Rick; Gamron, Susana; Allevi, Alberto; Palatnik, Simon A.; Alvarellos, Alejandro; Paira, Sergio; Caprarulo, Cesar; Guillerón, Carolina; Catoggio, Luis J.; Prigione, Cristina; Berbotto, Guillermo A.; García, Mercedes A.; Perandones, Carlos E.; Pons-Estel, Bernardo A.; Alarcon-Riquelme, Marta E.

    2011-01-01

    Argentine population genetic structure was examined using a set of 78 ancestry informative markers (AIMs) to assess the contributions of European, Amerindian, and African ancestry in 94 individuals who are members of this population. Using the Bayesian clustering algorithm STRUCTURE, the mean European contribution was 78%, the Amerindian contribution was 19.4%, and the African contribution was 2.5%. Similar results were found using the weighted least mean square method: European, 80.2%; Amerindian, 18.1%; and African, 1.7%. Consistent with previous studies the current results showed very few individuals (four of 94) with greater than 10% African admixture. Notably, when individual admixture was examined, the Amerindian and European admixture showed a very large variance and individual Amerindian contribution ranged from 1.5 to 84.5% in the 94 individual Argentine subjects. These results indicate that admixture must be considered when clinical epidemiology or case control genetic analyses are studied in this population. Moreover, the current study provides a set of informative SNPs that can be used to ascertain or control for this potentially hidden stratification. In addition, the large variance in admixture proportions in individual Argentine subjects shown by this study suggests that this population is appropriate for future admixture mapping studies. PMID:17177183

  18. Large-scale land transformations in Indonesia: The role of ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    ... enable timely responses to the impacts of large-scale land transformations in Central Kalimantan ... In partnership with UNESCO's Organization for Women in Science for the ... New funding opportunity for gender equality and climate change.

  19. Solving Large Scale Crew Scheduling Problems in Practice

    NARCIS (Netherlands)

    E.J.W. Abbink (Erwin); L. Albino; T.A.B. Dollevoet (Twan); D. Huisman (Dennis); J. Roussado; R.L. Saldanha

    2010-01-01

    textabstractThis paper deals with large-scale crew scheduling problems arising at the Dutch railway operator, Netherlands Railways (NS). NS operates about 30,000 trains a week. All these trains need a driver and a certain number of guards. Some labor rules restrict the duties of a certain crew base

  20. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Directory of Open Access Journals (Sweden)

    Xiangyun Xiao

    Full Text Available The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.
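
    A minimal sketch of the split-and-fit idea described above (heavily simplified and hypothetical: a linear ODE surrogate fitted per module by least squares on finite differences, with a process pool standing in for the asynchronous communication scheme of the paper):

        # Sketch: split a large network into modules and fit each module's dynamics in parallel.
        import numpy as np
        from multiprocessing import Pool

        def fit_module(args):
            """Fit dx/dt ~ A x for one module by least squares on finite differences."""
            genes, expr, dt = args                         # expr: time x genes (module columns only)
            dxdt = np.diff(expr, axis=0) / dt
            X = expr[:-1]
            A, *_ = np.linalg.lstsq(X, dxdt, rcond=None)   # per-module interaction matrix
            return genes, A.T

        if __name__ == "__main__":
            rng = np.random.default_rng(0)
            n_genes, n_times, dt = 40, 30, 1.0
            expr = rng.random((n_times, n_genes))
            modules = np.array_split(np.arange(n_genes), 4)   # crude stand-in for a modularity split
            tasks = [(m, expr[:, m], dt) for m in modules]
            with Pool(4) as pool:
                results = pool.map(fit_module, tasks)
            full_A = np.zeros((n_genes, n_genes))
            for genes, A in results:
                full_A[np.ix_(genes, genes)] = A              # reassemble block-diagonal estimate
            print(full_A.shape)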

  1. A new asynchronous parallel algorithm for inferring large-scale gene regulatory networks.

    Science.gov (United States)

    Xiao, Xiangyun; Zhang, Wei; Zou, Xiufen

    2015-01-01

    The reconstruction of gene regulatory networks (GRNs) from high-throughput experimental data has been considered one of the most important issues in systems biology research. With the development of high-throughput technology and the complexity of biological problems, we need to reconstruct GRNs that contain thousands of genes. However, when many existing algorithms are used to handle these large-scale problems, they will encounter two important issues: low accuracy and high computational cost. To overcome these difficulties, the main goal of this study is to design an effective parallel algorithm to infer large-scale GRNs based on high-performance parallel computing environments. In this study, we proposed a novel asynchronous parallel framework to improve the accuracy and lower the time complexity of large-scale GRN inference by combining splitting technology and ordinary differential equation (ODE)-based optimization. The presented algorithm uses the sparsity and modularity of GRNs to split whole large-scale GRNs into many small-scale modular subnetworks. Through the ODE-based optimization of all subnetworks in parallel and their asynchronous communications, we can easily obtain the parameters of the whole network. To test the performance of the proposed approach, we used well-known benchmark datasets from Dialogue for Reverse Engineering Assessments and Methods challenge (DREAM), experimentally determined GRN of Escherichia coli and one published dataset that contains more than 10 thousand genes to compare the proposed approach with several popular algorithms on the same high-performance computing environments in terms of both accuracy and time complexity. The numerical results demonstrate that our parallel algorithm exhibits obvious superiority in inferring large-scale GRNs.

  2. Sizing and scaling requirements of a large-scale physical model for code validation

    International Nuclear Information System (INIS)

    Khaleel, R.; Legore, T.

    1990-01-01

    Model validation is an important consideration in application of a code for performance assessment and therefore in assessing the long-term behavior of the engineered and natural barriers of a geologic repository. Scaling considerations relevant to porous media flow are reviewed. An analysis approach is presented for determining the sizing requirements of a large-scale hydrology physical model. The physical model will be used to validate performance assessment codes that evaluate the long-term behavior of the repository isolation system. Numerical simulation results for sizing requirements are presented for a porous medium model in which the media properties are spatially uncorrelated.

  3. Fast Simulation of Large-Scale Floods Based on GPU Parallel Computing

    Directory of Open Access Journals (Sweden)

    Qiang Liu

    2018-05-01

    Full Text Available Computing speed is a significant issue in large-scale flood simulations for real-time response to disaster prevention and mitigation. Even today, most large-scale flood simulations are run on supercomputers due to the massive amounts of data and computations necessary. In this work, a two-dimensional shallow water model based on an unstructured Godunov-type finite volume scheme was proposed for flood simulation. To realize a fast simulation of large-scale floods on a personal computer, a Graphics Processing Unit (GPU)-based high-performance computing method using OpenACC was adopted to parallelize the shallow water model. An unstructured data management method was presented to control the data transfer between the GPU and CPU (Central Processing Unit) with minimum overhead, and then both computation and data were offloaded from the CPU to the GPU, which exploited the computational capability of the GPU as much as possible. The parallel model was validated using various benchmarks and real-world case studies. The results demonstrate that speed-ups of up to one order of magnitude can be achieved in comparison with the serial model. The proposed parallel model provides a fast and reliable tool with which to quickly assess flood hazards in large-scale areas and thus holds promise for dynamic inundation risk identification and disaster assessment.
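
    The core computational load in such models is an explicit, cell-by-cell finite-volume update, which is exactly the kind of loop that OpenACC directives (e.g. `#pragma acc parallel loop` on the equivalent C code) offload to the GPU. The sketch below is a toy one-dimensional shallow-water step with a Lax-Friedrichs flux, not the paper's unstructured two-dimensional Godunov scheme; all parameters are made up.

```python
"""Toy 1-D shallow-water step with a Lax-Friedrichs flux, standing in for
the cell-update kernel that a GPU port would offload."""
import numpy as np

G = 9.81  # gravitational acceleration (m/s^2)

def flux(h, hu):
    """Physical flux of the 1-D shallow-water equations."""
    u = hu / np.maximum(h, 1e-8)
    return np.stack([hu, hu * u + 0.5 * G * h**2])

def step(h, hu, dx, dt):
    """One explicit finite-volume update with a Lax-Friedrichs numerical flux."""
    U, F = np.stack([h, hu]), flux(h, hu)
    a = np.max(np.abs(hu / np.maximum(h, 1e-8)) + np.sqrt(G * h))     # max wave speed
    Fi = 0.5 * (F[:, :-1] + F[:, 1:]) - 0.5 * a * (U[:, 1:] - U[:, :-1])
    Un = U.copy()
    Un[:, 1:-1] -= dt / dx * (Fi[:, 1:] - Fi[:, :-1])                 # boundary cells held fixed
    return Un[0], Un[1]

# Dam-break test case: deep water on the left, shallow on the right.
x = np.linspace(0.0, 1.0, 400)
h, hu = np.where(x < 0.5, 2.0, 1.0), np.zeros_like(x)
for _ in range(200):                                  # this loop is the GPU-offload target
    h, hu = step(h, hu, dx=x[1] - x[0], dt=4e-4)
print(f"h range after 0.08 s: [{h.min():.3f}, {h.max():.3f}]")
```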

  4. An efficient and novel computation method for simulating diffraction patterns from large-scale coded apertures on large-scale focal plane arrays

    Science.gov (United States)

    Shrekenhamer, Abraham; Gottesman, Stephen R.

    2012-10-01

    A novel and memory-efficient method for computing diffraction patterns produced on large-scale focal planes by large-scale coded apertures at wavelengths where diffraction effects are significant has been developed and tested. The scheme, readily implementable on portable computers, overcomes the memory limitations of present state-of-the-art simulation codes such as Zemax. The method consists of first calculating a set of reference complex field (amplitude and phase) patterns on the focal plane produced by a single (reference) central hole, extending to twice the focal plane array size, with one such pattern for each Line-of-Sight (LOS) direction and wavelength in the scene, and with the pattern amplitude corresponding to the square-root of the spectral irradiance from each such LOS direction in the scene at selected wavelengths. Next, the set of reference patterns is transformed to generate pattern sets for other holes. The transformation consists of a translational pattern shift corresponding to each hole's position offset and an electrical phase shift corresponding to each hole's position offset and incoming radiance's direction and wavelength. The set of complex patterns for each direction and wavelength is then summed coherently and squared for each detector to yield a set of power patterns unique for each direction and wavelength. Finally, the set of power patterns is summed to produce the full waveband diffraction pattern from the scene. With this tool, researchers can now efficiently simulate diffraction patterns produced from scenes by large-scale coded apertures onto large-scale focal plane arrays to support the development and optimization of coded aperture masks and image reconstruction algorithms.
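
    The shift-and-phase bookkeeping described here can be sketched as follows: an oversized reference complex field for the central hole is translated to each hole position, multiplied by a geometric phase factor, summed coherently over holes, and squared to give the power pattern for one direction and wavelength. The reference field, phase model, and dimensions below are placeholders, not the authors' implementation.

```python
"""Sketch of the shift-and-phase coherent summation for one look direction
and wavelength; summing such power patterns over directions and wavelengths
would give the full-waveband result."""
import numpy as np

def power_pattern(ref_field, holes_xy, pixel_pitch, wavelength, los_dir):
    """ref_field: oversized (2N x 2N) complex field of the central hole.
    holes_xy: hole offsets in metres; los_dir: (dx, dy) direction cosines."""
    n = ref_field.shape[0] // 2                       # focal plane is n x n
    total = np.zeros((n, n), dtype=complex)
    for (hx, hy) in holes_xy:
        # Translational shift of the reference pattern by the hole offset
        # (the oversized reference keeps the wrap-around out of the crop).
        sx, sy = int(round(hx / pixel_pitch)), int(round(hy / pixel_pitch))
        shifted = np.roll(np.roll(ref_field, sy, axis=0), sx, axis=1)
        # Placeholder "electrical" phase from hole offset, direction, wavelength.
        phase = np.exp(2j * np.pi * (hx * los_dir[0] + hy * los_dir[1]) / wavelength)
        total += phase * shifted[n // 2 : n // 2 + n, n // 2 : n // 2 + n]
    return np.abs(total) ** 2                         # power pattern for this direction/wavelength

# Toy usage with a random "reference" field and a 3-hole aperture.
rng = np.random.default_rng(1)
ref = rng.standard_normal((256, 256)) + 1j * rng.standard_normal((256, 256))
holes = [(0.0, 0.0), (1.5e-3, 0.0), (0.0, -1.5e-3)]
p = power_pattern(ref, holes, pixel_pitch=1e-4, wavelength=10e-6, los_dir=(0.01, 0.0))
print(p.shape)
```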

  5. Evolutionary leap in large-scale flood risk assessment needed

    OpenAIRE

    Vorogushyn, Sergiy; Bates, Paul D.; de Bruijn, Karin; Castellarin, Attilio; Kreibich, Heidi; Priest, Sally J.; Schröter, Kai; Bagli, Stefano; Blöschl, Günter; Domeneghetti, Alessio; Gouldby, Ben; Klijn, Frans; Lammersen, Rita; Neal, Jeffrey C.; Ridder, Nina

    2018-01-01

    Current approaches for assessing large-scale flood risks contravene the fundamental principles of the flood risk system functioning because they largely ignore basic interactions and feedbacks between atmosphere, catchments, river-floodplain systems and socio-economic processes. As a consequence, risk analyses are uncertain and might be biased. However, reliable risk estimates are required for prioritizing national investments in flood risk mitigation or for appraisal and management of insura...

  6. Food security through large scale investments in agriculture

    Science.gov (United States)

    Rulli, M.; D'Odorico, P.

    2013-12-01

    Most of the human appropriation of freshwater resources is for food production. There is some concern that in the near future the finite freshwater resources available on Earth might not be sufficient to meet the increasing human demand for agricultural products. In the late 1700s Malthus argued that in the long run humanity would not have enough resources to feed itself. Malthus' analysis, however, did not account for the emergence of technological innovations that could increase the rate of food production. Modern and contemporary history has seen at least three major technological advances that have increased humans' access to food, namely, the industrial revolution, the green revolution, and the intensification of global trade. Here we argue that a fourth revolution has just started to happen. It involves foreign direct investments in agriculture, which intensify the crop yields of potentially highly productive agricultural lands by introducing the use of more modern technologies. The increasing demand for agricultural products and the uncertainty of international food markets have recently drawn the attention of governments and agribusiness firms toward investments in productive agricultural land, mostly in the developing world. The targeted countries are typically located in regions that have remained only marginally utilized because of lack of modern technology. It is expected that in the long run large scale land acquisitions for commercial farming will bring the technology required to close the existing yield gaps. While the extent of the acquired land and the associated appropriation of freshwater resources have been investigated in detail, the amount of food this land can produce and the number of people it could feed still need to be quantified. Here we use a unique dataset of verified land deals to provide a global quantitative assessment of the rates of crop and food appropriation potentially associated with large scale land acquisitions. We

  7. No large scale curvature perturbations during the waterfall phase transition of hybrid inflation

    International Nuclear Information System (INIS)

    Abolhasani, Ali Akbar; Firouzjahi, Hassan

    2011-01-01

    In this paper the possibility of generating large scale curvature perturbations induced from the entropic perturbations during the waterfall phase transition of the standard hybrid inflation model is studied. We show that whether or not appreciable amounts of large scale curvature perturbations are produced during the waterfall phase transition depends crucially on the competition between the classical and the quantum mechanical backreactions to terminate inflation. If one considers only the classical evolution of the system, we show that the highly blue-tilted entropy perturbations induce highly blue-tilted large scale curvature perturbations during the waterfall phase transition which dominate over the original adiabatic curvature perturbations. However, we show that the quantum backreactions of the waterfall field inhomogeneities produced during the phase transition dominate completely over the classical backreactions. The cumulative quantum backreactions of very small scale tachyonic modes terminate inflation very efficiently and shut off the curvature perturbation evolution during the waterfall phase transition. This indicates that the standard hybrid inflation model is safe under large scale curvature perturbations during the waterfall phase transition.

  8. Applications of random forest feature selection for fine-scale genetic population assignment.

    Science.gov (United States)

    Sylvester, Emma V A; Bentzen, Paul; Bradbury, Ian R; Clément, Marie; Pearce, Jon; Horne, John; Beiko, Robert G

    2018-02-01

    Genetic population assignment used to inform wildlife management and conservation efforts requires panels of highly informative genetic markers and sensitive assignment tests. We explored the utility of machine-learning algorithms (random forest, regularized random forest and guided regularized random forest) compared with F_ST ranking for selection of single nucleotide polymorphisms (SNPs) for fine-scale population assignment. We applied these methods to an unpublished SNP data set for Atlantic salmon (Salmo salar) and a published SNP data set for Alaskan Chinook salmon (Oncorhynchus tshawytscha). In each species, we identified the minimum panel size required to obtain a self-assignment accuracy of at least 90%, using each method to create panels of 50-700 markers. Panels of SNPs identified using random forest-based methods performed up to 7.8 and 11.2 percentage points better than F_ST-selected panels of similar size for the Atlantic salmon and Chinook salmon data, respectively. Self-assignment accuracy ≥90% was obtained with panels of 670 and 384 SNPs for each data set, respectively, a level of accuracy never reached for these species using F_ST-selected panels. Our results demonstrate a role for machine-learning approaches in marker selection across large genomic data sets to improve assignment for management and conservation of exploited populations.
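
    A minimal scikit-learn sketch of the basic workflow, ranking SNPs by random-forest importance and checking the self-assignment accuracy of the top-k panel by cross-validation, is given below. It uses synthetic genotypes and plain random forests only (not the regularized or guided variants), so it illustrates the idea rather than reproduces the study.

```python
"""Toy sketch of random-forest marker ranking for population assignment,
on synthetic genotype data."""
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_pops, n_per_pop, n_snps = 4, 60, 500
# Synthetic genotypes (0/1/2) with population-specific allele frequencies.
freqs = rng.uniform(0.05, 0.95, size=(n_pops, n_snps))
X = np.vstack([rng.binomial(2, freqs[p], size=(n_per_pop, n_snps)) for p in range(n_pops)])
y = np.repeat(np.arange(n_pops), n_per_pop)

# Rank all SNPs by random-forest importance on the full panel
# (in practice the ranking should be redone inside each training fold
# to avoid selection bias; this is simplified for brevity).
rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
ranking = np.argsort(rf.feature_importances_)[::-1]

# Self-assignment accuracy of the top-k panel (cf. the 90% threshold in the study).
for k in (50, 100, 200):
    panel = ranking[:k]
    acc = cross_val_score(RandomForestClassifier(n_estimators=500, random_state=0),
                          X[:, panel], y, cv=5).mean()
    print(f"top-{k} SNP panel: self-assignment accuracy ~ {acc:.2f}")
```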

  9. Why are large cities faster? Universal scaling and self-similarity in urban organization and dynamics

    Science.gov (United States)

    Bettencourt, L. M. A.; Lobo, J.; West, G. B.

    2008-06-01

    Cities have existed since the beginning of civilization and have always been intimately connected with humanity's cultural and technological development. Much about the human and social dynamics that take place in cities is intuitively recognizable across time, space and culture; yet we still do not have a clear-cut answer as to why cities exist or what factors are critical to make them thrive or collapse. Here, we construct an extensive quantitative characterization of the variation of many urban indicators with city size, using large data sets for American, European and Chinese cities. We show that social and economic quantities, characterizing the creation of wealth and new ideas, show increasing returns to population scale, which appear quantitatively as a power law of city size with an exponent β≃ 1.15 > 1. Concurrently, quantities characterizing material infrastructure typically show economies of scale, namely a power law with exponent β≃ 0.8 < 1. Superlinear scaling of wealth creation drives faster-than-exponential growth, which inexorably leads to crises of urban organization. To avoid them, we show that growth may proceed in cycles, separated by major urban adaptations, with the unintended consequence that the duration of such cycles decreases with larger urban population size and is now estimated to be shorter than a human lifetime.
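
    The scaling exponent β in a relation of the form Y = Y0 N^β is conventionally estimated by ordinary least squares on log-transformed data; a minimal sketch with synthetic city data follows (the numbers are made up).

```python
"""Estimate an urban scaling exponent beta in Y = Y0 * N**beta by
ordinary least squares in log-log space (synthetic city data)."""
import numpy as np

rng = np.random.default_rng(0)
N = rng.uniform(5e4, 5e6, size=300)                   # city populations
beta_true, Y0 = 1.15, 2.0
Y = Y0 * N**beta_true * rng.lognormal(0.0, 0.2, 300)  # e.g., wages or patents, with scatter

slope, intercept = np.polyfit(np.log(N), np.log(Y), 1)
print(f"estimated beta = {slope:.3f} (superlinear if > 1), Y0 = {np.exp(intercept):.2f}")
```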

  10. A large scale field experiment in the Amazon Basin (Lambada/Bateristca)

    Energy Technology Data Exchange (ETDEWEB)

    Dolman, A.J.; Kabat, P.; Gash, J.H.C.; Noilhan, J.; Jochum, A.M.; Nobre, C. [Winand Staring Centre, Wageningen (Netherlands)

    1994-12-31

    A description is given of a large scale field experiment planned in the Amazon Basin, aiming to assess the large scale balances of energy, water and CO₂. The background for this experiment, the embedding in global change programmes of IGBP/BAHC and WCRP/GEWEX is described. A proposal by four European groups aimed at designing the experiment with the help of mesoscale models is described and a possible European input to this experiment is suggested. 24 refs., 1 app.

  11. Large-Scale medical image analytics: Recent methodologies, applications and Future directions.

    Science.gov (United States)

    Zhang, Shaoting; Metaxas, Dimitris

    2016-10-01

    Despite the ever-increasing amount and complexity of annotated medical image data, the development of large-scale medical image analysis algorithms has not kept pace with the need for methods that bridge the semantic gap between images and diagnoses. The goal of this position paper is to discuss and explore innovative and large-scale data science techniques in medical image analytics, which will benefit clinical decision-making and facilitate efficient medical data management. Particularly, we advocate that the scale of image retrieval systems should be significantly increased, to the point at which interactive systems can be effective for knowledge discovery in potentially large databases of medical images. For clinical relevance, such systems should return results in real time, incorporate expert feedback, and be able to cope with the size, quality, and variety of the medical images and their associated metadata for a particular domain. The design, development, and testing of such a framework can significantly impact interactive mining in medical image databases that are growing rapidly in size and complexity and enable novel methods of analysis at much larger scales in an efficient, integrated fashion. Copyright © 2016. Published by Elsevier B.V.

  12. Large-scale galaxy bias

    Science.gov (United States)

    Desjacques, Vincent; Jeong, Donghui; Schmidt, Fabian

    2018-02-01

    This review presents a comprehensive overview of galaxy bias, that is, the statistical relation between the distribution of galaxies and matter. We focus on large scales where cosmic density fields are quasi-linear. On these scales, the clustering of galaxies can be described by a perturbative bias expansion, and the complicated physics of galaxy formation is absorbed by a finite set of coefficients of the expansion, called bias parameters. The review begins with a detailed derivation of this very important result, which forms the basis of the rigorous perturbative description of galaxy clustering, under the assumptions of General Relativity and Gaussian, adiabatic initial conditions. Key components of the bias expansion are all leading local gravitational observables, which include the matter density but also tidal fields and their time derivatives. We hence expand the definition of local bias to encompass all these contributions. This derivation is followed by a presentation of the peak-background split in its general form, which elucidates the physical meaning of the bias parameters, and a detailed description of the connection between bias parameters and galaxy statistics. We then review the excursion-set formalism and peak theory which provide predictions for the values of the bias parameters. In the remainder of the review, we consider the generalizations of galaxy bias required in the presence of various types of cosmological physics that go beyond pressureless matter with adiabatic, Gaussian initial conditions: primordial non-Gaussianity, massive neutrinos, baryon-CDM isocurvature perturbations, dark energy, and modified gravity. Finally, we discuss how the description of galaxy bias in the galaxies' rest frame is related to clustering statistics measured from the observed angular positions and redshifts in actual galaxy catalogs.
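
    Schematically, the perturbative bias expansion discussed in the review writes the galaxy overdensity as a sum over local gravitational observables with bias coefficients; at the lowest orders it takes roughly the following simplified form (see the review for the complete operator basis and its renormalization):

```latex
\delta_g(\mathbf{x},\tau) \;=\; \sum_O b_O(\tau)\, O(\mathbf{x},\tau)
  \;\simeq\; b_1\,\delta \;+\; \frac{b_2}{2}\,\delta^2 \;+\; b_{K^2}\,K_{ij}K^{ij} \;+\; \dots \;+\; \epsilon
```

    Here δ is the matter overdensity, K_ij the tidal field, and ε a stochastic contribution uncorrelated with the large-scale operators.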

  13. Large-Scale Machine Learning for Classification and Search

    Science.gov (United States)

    Liu, Wei

    2012-01-01

    With the rapid development of the Internet, nowadays tremendous amounts of data including images and videos, up to millions or billions, can be collected for training machine learning models. Inspired by this trend, this thesis is dedicated to developing large-scale machine learning techniques for the purpose of making classification and nearest…

  14. Reproducible, large-scale production of thallium-based high-temperature superconductors

    International Nuclear Information System (INIS)

    Gay, R.L.; Stelman, D.; Newcomb, J.C.; Grantham, L.F.; Schnittgrund, G.D.

    1990-01-01

    This paper reports on the development of a large scale spray-calcination technique generic to the preparation of ceramic high-temperature superconductor (HTSC) powders. Among the advantages of the technique is that of producing uniformly mixed metal oxides on a fine scale. Production of both yttrium and thallium-based HTSCs has been demonstrated using this technique. In the spray calciner, solutions of the desired composition are atomized as a fine mist into a hot gas. Evaporation and calcination are instantaneous, yielding an extremely fine, uniform oxide powder. The calciner is 76 cm in diameter and can produce metal oxide powder at relatively large rates (approximately 100 g/h) without contamination.

  15. Inflationary tensor fossils in large-scale structure

    Energy Technology Data Exchange (ETDEWEB)

    Dimastrogiovanni, Emanuela [School of Physics and Astronomy, University of Minnesota, Minneapolis, MN 55455 (United States); Fasiello, Matteo [Department of Physics, Case Western Reserve University, Cleveland, OH 44106 (United States); Jeong, Donghui [Department of Astronomy and Astrophysics, The Pennsylvania State University, University Park, PA 16802 (United States); Kamionkowski, Marc, E-mail: ema@physics.umn.edu, E-mail: mrf65@case.edu, E-mail: duj13@psu.edu, E-mail: kamion@jhu.edu [Department of Physics and Astronomy, 3400 N. Charles St., Johns Hopkins University, Baltimore, MD 21218 (United States)

    2014-12-01

    Inflation models make specific predictions for a tensor-scalar-scalar three-point correlation, or bispectrum, between one gravitational-wave (tensor) mode and two density-perturbation (scalar) modes. This tensor-scalar-scalar correlation leads to a local power quadrupole, an apparent departure from statistical isotropy in our Universe, as well as characteristic four-point correlations in the current mass distribution in the Universe. So far, the predictions for these observables have been worked out only for single-clock models in which certain consistency conditions between the tensor-scalar-scalar correlation and tensor and scalar power spectra are satisfied. Here we review the requirements on inflation models for these consistency conditions to be satisfied. We then consider several examples of inflation models, such as non-attractor and solid-inflation models, in which these conditions are put to the test. In solid inflation the simplest consistency conditions are already violated, whilst in the non-attractor model we find that, contrary to the standard scenario, the tensor-scalar-scalar correlator probes directly relevant model-dependent information. We work out the predictions for observables in these models. For non-attractor inflation we find an apparent local quadrupolar departure from statistical isotropy in large-scale structure but that this power quadrupole decreases very rapidly at smaller scales. The consistency of the CMB quadrupole with statistical isotropy then constrains the distance scale that corresponds to the transition from the non-attractor to attractor phase of inflation to be larger than the currently observable horizon. Solid inflation predicts clustering-fossil signatures in the current galaxy distribution that may be large enough to be detectable with forthcoming, and possibly even current, galaxy surveys.

  16. Sampling strategy for a large scale indoor radiation survey - a pilot project

    International Nuclear Information System (INIS)

    Strand, T.; Stranden, E.

    1986-01-01

    Optimisation of a stratified random sampling strategy for large scale indoor radiation surveys is discussed. It is based on the results from a small scale pilot project where variances in dose rates within different categories of houses were assessed. By selecting a predetermined precision level for the mean dose rate in a given region, the number of measurements needed can be optimised. The results of a pilot project in Norway are presented together with the development of the final sampling strategy for a planned large scale survey. (author)
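
    The step from a predetermined precision level to a required number of measurements can be sketched with the standard stratified-sampling formulas (Neyman allocation, as in Cochran's textbook treatment); the per-stratum standard deviations would come from the pilot project, and all numbers below are hypothetical.

```python
"""Sketch of stratified sample-size planning: given pilot-survey dose-rate
standard deviations per house category and a target precision for the
regional mean, compute the total sample size and its Neyman allocation."""
import numpy as np

# Hypothetical pilot-survey inputs per house category (stratum).
N_h = np.array([12000, 8000, 5000])      # houses per category in the region
S_h = np.array([40.0, 25.0, 60.0])       # dose-rate std. dev. per category (nGy/h)
margin, z = 5.0, 1.96                    # target: +/- 5 nGy/h at ~95% confidence

N = N_h.sum()
W_h = N_h / N
V = (margin / z) ** 2                    # target variance of the estimated mean
# Required total n under Neyman allocation, with finite-population correction.
n = (W_h * S_h).sum() ** 2 / (V + (W_h * S_h**2).sum() / N)
n_h = np.ceil(n * W_h * S_h / (W_h * S_h).sum()).astype(int)
print(f"total measurements ~ {int(np.ceil(n))}, per category: {n_h.tolist()}")
```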

  17. Large-scale methanol plants. [Based on Japanese-developed process

    Energy Technology Data Exchange (ETDEWEB)

    Tado, Y

    1978-02-01

    A study was made of how to produce methanol economically; methanol is expected to grow in use as a feedstock for pollution-free energy and for chemical applications. The study centered on the following subjects: (1) improvement of thermal economy, (2) improvement of the process, and (3) hardware problems accompanying the expansion of scale. The results of this study have already been adopted in actual plants with good results, and large-scale methanol plants are expected to be realized.

  18. Dynamical links between small- and large-scale mantle heterogeneity: Seismological evidence

    Science.gov (United States)

    Frost, Daniel A.; Garnero, Edward J.; Rost, Sebastian

    2018-01-01

    We identify PKP•PKP scattered waves (also known as P′•P′) from earthquakes recorded at small-aperture seismic arrays at distances less than 65°. P′•P′ energy travels as a PKP wave through the core, up into the mantle, then scatters back down through the core to the receiver as a second PKP. P′•P′ waves are unique in that they allow scattering heterogeneities throughout the mantle to be imaged. We use array-processing methods to amplify low amplitude, coherent scattered energy signals and resolve their incoming direction. We deterministically map scattering heterogeneity locations from the core-mantle boundary to the surface. We use an extensive dataset with sensitivity to a large volume of the mantle and a location method allowing us to resolve and map more heterogeneities than have previously been possible, representing a significant increase in our understanding of small-scale structure within the mantle. Our results demonstrate that the distribution of scattering heterogeneities varies both radially and laterally. Scattering is most abundant in the uppermost and lowermost mantle, and a minimum in the mid-mantle, resembling the radial distribution of tomographically derived whole-mantle velocity heterogeneity. We investigate the spatial correlation of scattering heterogeneities with large-scale tomographic velocities, lateral velocity gradients, the locations of deep-seated hotspots and subducted slabs. In the lowermost 1500 km of the mantle, small-scale heterogeneities correlate with regions of low seismic velocity, high lateral seismic gradient, and proximity to hotspots. In the upper 1000 km of the mantle there is no significant correlation between scattering heterogeneity location and subducted slabs. Between 600 and 900 km depth, scattering heterogeneities are more common in the regions most remote from slabs, and close to hotspots. Scattering heterogeneities show an affinity for regions close to slabs within the upper 200 km of the
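
    The array processing referred to here is, at its simplest, delay-and-sum beamforming: traces are time-shifted according to a trial horizontal slowness vector and stacked, and the slowness that maximizes stack power gives the incoming direction. The sketch below demonstrates this on synthetic data and is not the authors' processing chain.

```python
"""Minimal delay-and-sum (slowness-stack) beamforming sketch on synthetic data."""
import numpy as np

def beam_power(traces, coords, dt, slowness_grid):
    """traces: (n_sta, n_samp); coords: (n_sta, 2) in km; slowness in s/km."""
    n_sta, n_samp = traces.shape
    power = np.zeros(len(slowness_grid))
    for k, s in enumerate(slowness_grid):
        shifts = np.round((coords @ s) / dt).astype(int)   # per-station delays in samples
        beam = np.zeros(n_samp)
        for i in range(n_sta):
            beam += np.roll(traces[i], -shifts[i])          # align, then stack
        power[k] = np.mean((beam / n_sta) ** 2)
    return power

# Synthetic plane wave crossing a small-aperture array.
rng = np.random.default_rng(0)
coords = rng.uniform(-2.0, 2.0, size=(8, 2))               # station offsets (km)
dt, n_samp = 0.05, 800
true_s = np.array([0.08, 0.03])                            # true slowness (s/km)
t = np.arange(n_samp) * dt
traces = np.array([np.exp(-((t - 20.0 - coords[i] @ true_s) / 0.5) ** 2)
                   + 0.2 * rng.standard_normal(n_samp) for i in range(8)])

sx = sy = np.linspace(-0.15, 0.15, 31)
grid = np.array([(a, b) for a in sx for b in sy])
best = grid[np.argmax(beam_power(traces, coords, dt, grid))]
print("recovered slowness (s/km):", best)
```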

  19. Does environmental policy affect scaling laws between population and pollution? Evidence from American metropolitan areas.

    Science.gov (United States)

    Muller, Nicholas Z; Jha, Akshaya

    2017-01-01

    Modern cities are engines of production, innovation, and growth. However, urbanization also increases both local and global pollution from household consumption and firms' production. Do emissions change proportionately to city size or does pollution tend to outpace or lag urbanization? Do emissions scale differently with population versus economic growth or are emissions, population, and economic growth inextricably linked? How are the scaling relationships between emissions, population, and economic growth affected by environmental regulation? This paper examines the link between urbanization, economic growth and pollution using data from Metropolitan Statistical Areas (MSAs) in the United States between 1999 and 2011. We find that the emissions of local air pollution in these MSAs scale according to a ¾ power law with both population size and gross domestic product (GDP). However, the monetary damages from these local emissions scale linearly with both population and GDP. Counties that have previously been out of attainment with the local air quality standards set by the Clean Air Act show an entirely different relationship: local emissions scale according to the square root of population, while the monetary damages from local air pollution follow a 2/3rds power law with population. Counties out of attainment are subject to more stringent emission controls; we argue based on this that enforcement of the Clean Air Act induces sublinear scaling between emissions, damages, and city size. In contrast, we find that metropolitan GDP scales super-linearly with population in all MSAs regardless of attainment status. Summarizing, our findings suggest that environmental policy limits the adverse effects of urbanization without interfering with the productivity benefits that manifest in cities.

  20. PKI security in large-scale healthcare networks

    OpenAIRE

    Mantas, G.; Lymberopoulos, D.; Komninos, N.

    2012-01-01

    During the past few years, a number of public key infrastructure (PKI) schemes have been proposed for healthcare networks in order to ensure secure communication services and exchange of data among healthcare professionals. However, these healthcare PKIs face a plethora of challenges, particularly when deployed over large-scale healthcare networks. In this paper, we propose a PKI infrastructure to ensure security in a ...